Sample records for detailed process-level understanding

  1. Growth in Mathematical Understanding: How Can We Characterise It and How Can We Represent It?

    ERIC Educational Resources Information Center

    Pirie, Susan; Kieren, Thomas

    1994-01-01

    Proposes a model for the growth of mathematical understanding based on the consideration of understanding as a whole, dynamic, leveled but nonlinear process. Illustrates the model using the concept of fractions. How to map the growth of understanding is explained in detail. (Contains 26 references.) (MKR)

  2. Assessing Threat Detection Scenarios through Hypothesis Generation and Testing

    DTIC Science & Technology

    2015-12-01

    experience as they evaluated scenarios with varying levels of uncertainty. This research focused on understanding the interaction of experience and ... details more often than irrelevant details. Those findings provide insight into the cognitive processes Soldiers with varying levels of experience ... like to thank the Soldiers and leaders who participated in this research, especially for providing their time and for sharing their experiences

  3. Neural Representations of Location Outside the Hippocampus

    ERIC Educational Resources Information Center

    Knierim, James J.

    2006-01-01

    Place cells of the rat hippocampus are a dominant model system for understanding the role of the hippocampus in learning and memory at the level of single-unit and neural ensemble responses. A complete understanding of the information processing and computations performed by the hippocampus requires detailed knowledge about the properties of the…

  4. Understanding a basic biological process: Expert and novice models of meiosis

    NASA Astrophysics Data System (ADS)

    Kindfield, Ann C. H.

    Central to secondary and college-level biology instruction is the development of student understanding of a number of subcellular processes. Yet some of the most crucial are consistently cited as the most difficult components of biology to learn. Among these is meiosis. In this article I report on the meiosis models utilized by five individuals at each of three levels of expertise in genetics as each reasoned about this process in an individual interview setting. Detailed characterization of individual meiosis models and comparison among models revealed a set of biologically correct features common to all individuals' models as well as a variety of model flaws (i.e., meiosis misunderstandings) which are categorized according to type and level of expertise. These results are suggestive of both sources of various misunderstandings and factors that might contribute to the construction of a sound understanding of meiosis. Each of these is addressed in relation to their respective implications for instruction.

  5. Understanding product cost vs. performance through an in-depth system Monte Carlo analysis

    NASA Astrophysics Data System (ADS)

    Sanson, Mark C.

    2017-08-01

    The manner in which an optical system is toleranced and compensated greatly affects the cost to build it. By having a detailed understanding of different tolerance and compensation methods, the end user can decide on the balance of cost and performance. A detailed, phased Monte Carlo analysis can be used to demonstrate the tradeoffs between cost and performance. In complex high-performance optical systems, performance is fine-tuned by making adjustments to the optical systems after they are initially built. This process enables the best overall system performance without the need for fabricating components to stringent tolerance levels that are often outside a fabricator's manufacturing capabilities. A good simulation of as-built performance can interrogate different steps of the fabrication and build process. Such a simulation may aid the evaluation of whether the measured parameters are within the acceptable range of system performance at that stage of the build process. Finding errors before an optical system progresses further into the build process saves both time and money. Having the appropriate tolerances and compensation strategy tied to a specific performance level will optimize the overall product cost.
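    The cost-versus-performance tradeoff described in this record can be sketched numerically. The merit function, tolerance bands, and spec limit below are illustrative assumptions, not figures from the article; the point is only to show how a post-build compensator can let looser (cheaper) tolerances still meet a performance spec.

```python
import random
import statistics

# Toy merit function: RMS wavefront error (waves) grows quadratically with
# element decenter (mm) and tilt (deg); a focus compensator is assumed to
# remove a fixed fraction of the induced error. All values are illustrative.
def as_built_rms(decenter, tilt, compensated=True):
    raw = 0.05 + 4.0 * decenter**2 + 0.8 * tilt**2
    return raw * (0.3 if compensated else 1.0)

def monte_carlo_yield(n_trials, decenter_tol, tilt_tol, spec, compensated):
    random.seed(42)  # fixed seed for a reproducible sketch
    passed = 0
    errors = []
    for _ in range(n_trials):
        # Draw each perturbation uniformly within its tolerance band
        d = random.uniform(-decenter_tol, decenter_tol)
        t = random.uniform(-tilt_tol, tilt_tol)
        rms = as_built_rms(d, t, compensated)
        errors.append(rms)
        if rms <= spec:
            passed += 1
    return passed / n_trials, statistics.mean(errors)

# Loose tolerances with compensation vs. tight tolerances without it
y_comp, _ = monte_carlo_yield(10_000, 0.10, 0.50, spec=0.072, compensated=True)
y_tight, _ = monte_carlo_yield(10_000, 0.02, 0.10, spec=0.072, compensated=False)
print(f"yield, loose tolerances + compensator: {y_comp:.2%}")
print(f"yield, tight tolerances, no compensator: {y_tight:.2%}")
```

A phased analysis would repeat this at each build stage, tightening or relaxing tolerances where the simulated yield shows the cost is or is not justified.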

  6. A Novel Method for the In-Depth Multimodal Analysis of Student Learning Trajectories in Intelligent Tutoring Systems

    ERIC Educational Resources Information Center

    Liu, Ran; Stamper, John; Davenport, Jodi

    2018-01-01

    Temporal analyses are critical to understanding learning processes, yet understudied in education research. Data from different sources are often collected at different grain sizes, which are difficult to integrate. Making sense of data at many levels of analysis, including the most detailed levels, is highly time-consuming. In this paper, we…

  7. Evidence from mixed hydrate nucleation for a funnel model of crystallization.

    PubMed

    Hall, Kyle Wm; Carpendale, Sheelagh; Kusalik, Peter G

    2016-10-25

    The molecular-level details of crystallization remain unclear for many systems. Previous work has speculated on the phenomenological similarities between molecular crystallization and protein folding. Here we demonstrate that molecular crystallization can involve funnel-shaped potential energy landscapes through a detailed analysis of mixed gas hydrate nucleation, a prototypical multicomponent crystallization process. Through this, we contribute both: (i) a powerful conceptual framework for exploring and rationalizing molecular crystallization, and (ii) an explanation of phenomenological similarities between protein folding and crystallization. Such funnel-shaped potential energy landscapes may be typical of broad classes of molecular ordering processes, and can provide a new perspective for both studying and understanding these processes.

  8. Evidence from mixed hydrate nucleation for a funnel model of crystallization

    PubMed Central

    Hall, Kyle Wm.; Carpendale, Sheelagh; Kusalik, Peter G.

    2016-01-01

    The molecular-level details of crystallization remain unclear for many systems. Previous work has speculated on the phenomenological similarities between molecular crystallization and protein folding. Here we demonstrate that molecular crystallization can involve funnel-shaped potential energy landscapes through a detailed analysis of mixed gas hydrate nucleation, a prototypical multicomponent crystallization process. Through this, we contribute both: (i) a powerful conceptual framework for exploring and rationalizing molecular crystallization, and (ii) an explanation of phenomenological similarities between protein folding and crystallization. Such funnel-shaped potential energy landscapes may be typical of broad classes of molecular ordering processes, and can provide a new perspective for both studying and understanding these processes. PMID:27790987

  9. The role of prefrontal catecholamines in attention and working memory

    PubMed Central

    Clark, Kelsey L.; Noudoost, Behrad

    2014-01-01

    While much progress has been made in identifying the brain regions and neurochemical systems involved in the cognitive processes disrupted in mental illnesses, to date, the level of detail at which neurobiologists can describe the chain of events giving rise to cognitive functions is very rudimentary. Much of the intense interest in understanding cognitive functions is motivated by the hope that it might be possible to understand these complex functions at the level of neurons and neural circuits. Here, we review the current state of the literature regarding how modulations in catecholamine levels within the prefrontal cortex (PFC) alter the neuronal and behavioral correlates of cognitive functions, particularly attention and working memory. PMID:24782714

  10. Approach to Mars Field Geology

    NASA Technical Reports Server (NTRS)

    Muehlberger, William; Rice, James W.; Parker, Timothy; Lipps, Jere H.; Hoffman, Paul; Burchfiel, Clark; Brasier, Martin

    1998-01-01

    The goals of field study on Mars are nothing less than to understand the processes and history of the planet at whatever level of detail is necessary. A manned mission gives us an unprecedented opportunity to use the immense power of the human mind to comprehend Mars in extraordinary detail. To take advantage of this opportunity, it is important to examine how we should approach the field study of Mars. In this effort, we are guided by over 200 years of field exploration experience on Earth as well as six manned missions exploring the Moon.

  11. A time-driven, activity-based costing methodology for determining the costs of red blood cell transfusion in patients with beta thalassaemia major.

    PubMed

    Burns, K E; Haysom, H E; Higgins, A M; Waters, N; Tahiri, R; Rushford, K; Dunstan, T; Saxby, K; Kaplan, Z; Chunilal, S; McQuilten, Z K; Wood, E M

    2018-04-10

    To describe the methodology to estimate the total cost of administration of a single unit of red blood cells (RBC) in adults with beta thalassaemia major in an Australian specialist haemoglobinopathy centre. Beta thalassaemia major is a genetic disorder of haemoglobin associated with multiple end-organ complications and typically requiring lifelong RBC transfusion therapy. New therapeutic agents are becoming available based on advances in understanding of the disorder and its consequences. Assessment of the true total cost of transfusion, incorporating both product and activity costs, is required in order to evaluate the benefits and costs of these new therapies. We describe the bottom-up, time-driven, activity-based costing methodology used to develop process maps to provide a step-by-step outline of the entire transfusion pathway. Detailed flowcharts for each process are described. Direct observations and timing of the process maps document all activities, resources, staff, equipment and consumables in detail. The analysis will include costs associated with performing these processes, including resources and consumables. Sensitivity analyses will be performed to determine the impact of different staffing levels, timings and probabilities associated with performing different tasks. Thirty-one process maps have been developed, with over 600 individual activities requiring multiple timings. These will be used for future detailed cost analyses. Detailed process maps using bottom-up, time-driven, activity-based costing for determining the cost of RBC transfusion in thalassaemia major have been developed. These could be adapted for wider use to understand and compare the costs and complexities of transfusion in other settings. © 2018 British Blood Transfusion Society.
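    The time-driven, activity-based costing idea can be illustrated with a toy calculation. The activities, times, rates, and consumable prices below are invented placeholders, not data from the study; a real analysis would draw them from the timed process maps.

```python
# Each activity from a hypothetical process map: minutes taken and the
# per-minute cost of the staff member performing it (illustrative values).
activities = [
    ("pre-transfusion blood sample",   10, 1.20),  # nurse
    ("crossmatch in laboratory",       25, 1.50),  # scientist
    ("unit collection and checks",     15, 1.20),
    ("administration and monitoring", 150, 1.20),
]
consumables = {"RBC unit": 380.0, "giving set": 6.5, "sample tubes": 2.0}

# Time-driven ABC: cost of each activity is time spent x resource rate
activity_cost = sum(minutes * rate for _, minutes, rate in activities)
total = activity_cost + sum(consumables.values())
print(f"activity cost: {activity_cost:.2f}, total per unit: {total:.2f}")
```

Sensitivity analysis then amounts to re-running the same sum with different timings, rates, or task probabilities, as the abstract describes.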

  12. Representations and processes of human spatial competence.

    PubMed

    Gunzelmann, Glenn; Lyon, Don R

    2011-10-01

    This article presents an approach to understanding human spatial competence that focuses on the representations and processes of spatial cognition and how they are integrated with cognition more generally. The foundational theoretical argument for this research is that spatial information processing is central to cognition more generally, in the sense that it is brought to bear ubiquitously to improve the adaptivity and effectiveness of perception, cognitive processing, and motor action. We describe research spanning multiple levels of complexity to understand both the detailed mechanisms of spatial cognition, and how they are utilized in complex, naturalistic tasks. In the process, we discuss the critical role of cognitive architectures in developing a consistent account that spans this breadth, and we note some areas in which the current version of a popular architecture, ACT-R, may need to be augmented. Finally, we suggest a framework for understanding the representations and processes of spatial competence and their role in human cognition generally. Copyright © 2011 Cognitive Science Society, Inc.

  13. Nonlinear Growth Curves in Developmental Research

    PubMed Central

    Grimm, Kevin J.; Ram, Nilam; Hamagami, Fumiaki

    2011-01-01

    Developmentalists are often interested in understanding change processes, and growth models are the most common analytic tool for examining such processes. Nonlinear growth curves are especially valuable to developmentalists because the defining characteristics of the growth process such as initial levels, rates of change during growth spurts, and asymptotic levels can be estimated. A variety of growth models are described beginning with the linear growth model and moving to nonlinear models of varying complexity. A detailed discussion of nonlinear models is provided, highlighting the added insights into complex developmental processes associated with their use. A collection of growth models is fit to repeated measures of height from participants of the Berkeley Growth and Guidance Studies from early childhood through adulthood. PMID:21824131
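    A minimal sketch of fitting one such nonlinear growth curve, assuming a logistic model with asymptotic level K, growth rate r, and inflection age t0. The height data are synthetic, not the Berkeley measurements, and a crude grid search stands in for a proper nonlinear least-squares routine.

```python
import math

def logistic(t, K, r, t0):
    # K: asymptotic level, r: growth rate, t0: age of fastest growth
    return K / (1.0 + math.exp(-r * (t - t0)))

# Synthetic "height by age" data generated from known parameters
true_K, true_r, true_t0 = 170.0, 0.35, 12.0
ages = range(2, 21)
heights = [logistic(t, true_K, true_r, true_t0) for t in ages]

def sse(K, r, t0):
    # Sum of squared errors between data and candidate curve
    return sum((h - logistic(t, K, r, t0)) ** 2 for t, h in zip(ages, heights))

# Coarse grid search stands in for proper nonlinear least squares
best = min(
    ((K, r, t0) for K in range(160, 181)
                for r in (0.25, 0.30, 0.35, 0.40)
                for t0 in range(10, 15)),
    key=lambda p: sse(*p),
)
print("estimated (K, r, t0):", best)
```

The estimated parameters recover the asymptotic level and growth-spurt timing, which is exactly the kind of substantively meaningful quantity the abstract highlights.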

  14. Beyond the Shadow of a Trait: Understanding Discounting through Item-Level Analysis of Personality Scales

    ERIC Educational Resources Information Center

    Charlton, Shawn R.; Gossett, Bradley D.; Charlton, Veda A.

    2011-01-01

    Temporal discounting, the loss in perceived value associated with delayed outcomes, correlates with a number of personality measures, suggesting that an item-level analysis of trait measures might provide a more detailed understanding of discounting. The current report details two studies that investigate the utility of such an item-level…

  15. Modeling microcirculatory blood flow: current state and future perspectives.

    PubMed

    Gompper, Gerhard; Fedosov, Dmitry A

    2016-01-01

    Microvascular blood flow determines a number of important physiological processes of an organism in health and disease. Therefore, a detailed understanding of microvascular blood flow would significantly advance biophysical and biomedical research and its applications. Current developments in the modeling of microcirculatory blood flow already make it possible to go beyond available experimental measurements and have a large potential to elucidate blood flow behavior in normal and diseased microvascular networks. There exist detailed models of blood flow at the single-cell level as well as simplified models of the flow through microcirculatory networks, which are reviewed and discussed here. The combination of these models provides promising prospects for a better understanding of blood flow behavior and transport properties, locally as well as globally, within large microvascular networks. © 2015 Wiley Periodicals, Inc.

  16. Snapshots of a solid-state transformation: coexistence of three phases trapped in one crystal

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aromí, G.; Beavers, C. M.; Sánchez Costa, J.

    Crystal-to-crystal transformations have been crucial in the understanding of solid-state processes, since these may be studied in detail by means of single crystal X-ray diffraction (SCXRD) techniques. The description of the mechanisms and potential intermediates of those processes remains very challenging. In fact, solid-state transient states have rarely been observed, at least to a sufficient level of detail. We have investigated the process of guest extrusion from the non-porous molecular material [Fe(bpp)(H2L)](ClO4)2·1.5C3H6O (bpp = 2,6-bis(pyrazol-3-yl)pyridine; H2L = 2,6-bis(5-(2-methoxyphenyl)-pyrazol-3-yl)pyridine; C3H6O = acetone), which occurs through ordered diffusion of acetone in a crystal-to-crystal manner, leading to dramatic structural changes. The slow kinetics of the transition allows thermal trapping of the system at various intermediate stages. The transiting single crystal can then be examined at these points through synchrotron SCXRD, offering a window upon the mechanism of the transformation at the molecular scale. These experiments have unveiled the development of an ordered intermediate phase, distinct from the initial and the final states, coexisting as the process advances with either of these two phases or, at a certain moment, with both of them. The new intermediate phase has been structurally characterized in full detail by SCXRD, providing insights into the mechanism of this diffusion-triggered solid-state phenomenon. Lastly, the process has also been followed by calorimetry, optical microscopy, local Raman spectroscopy and powder X-ray diffraction. The discovery and description of an intermediate ordered state in a molecular solid-state transformation is of great interest and will help to understand the mechanistic details and reaction pathways underlying these transformations.

  17. No question about exciting questions in cell biology.

    PubMed

    Pollard, Thomas D

    2013-12-01

    Although we have a good grasp of many important processes in cell biology, including knowledge of many molecules involved and how they interact with each other, we still do not understand most of the dynamical features that are the essence of living systems. Fortunately, we now have the ability to dissect biological systems in enough detail to understand their dynamics, including the use of mathematical models to account for past observations and predict future experiments. This deep level of mechanistic understanding should be our goal—not simply to satisfy our scientific curiosity, but also to understand the causes of disease well enough to predict risks, make early diagnoses, and treat effectively. Many big questions remain to be answered before we reach this goal of understanding cellular dynamics.

  18. 'Geo'chemical research: a key building block for nuclear waste disposal safety cases.

    PubMed

    Altmann, Scott

    2008-12-12

    Disposal of high-level radioactive waste in deep underground repositories has been chosen as the solution by several countries. Because of the special status this type of waste has in the public mind, national implementation programs typically mobilize massive R&D efforts, last decades, and are subject to extremely detailed and critical social-political scrutiny. The culminating argument of each program is a 'Safety Case' for a specific disposal concept containing, among other elements, the results of performance assessment simulations whose objective is to model the release of radionuclides to the biosphere. Public and political confidence in performance assessment results (which generally show that radionuclide release will always be at acceptable levels) is based on their confidence in the quality of the scientific understanding of the processes included in the performance assessment model, in particular those governing radionuclide speciation and mass transport in the geological host formation. Geochemistry constitutes a core area of research in this regard. Clay-mineral-rich formations are the subjects of advanced radwaste programs in several countries (France, Belgium, Switzerland...), principally because of their very low permeabilities and demonstrated capacities to retard most radionuclides by sorption. Among the key processes which must be represented in performance assessment models are (i) radioelement speciation (redox state, speciation, reactions determining radionuclide solid-solution partitioning) and (ii) diffusion-driven transport. The safety case must therefore demonstrate a detailed understanding of the physical-chemical phenomena governing the effects of these two aspects, for each radionuclide, within the geological barrier system.
A wide range of coordinated (and internationally collaborated) research has been, and is being, carried out in order to gain the detailed scientific understanding needed for constructing those parts of the Safety Case supporting how radionuclide transfer is represented in the performance assessment model. The objective here is to illustrate how geochemical research contributes to this process and, above all, to identify a certain number of subjects which should be treated in priority.

  19. Computer-Aided TRIZ Ideality and Level of Invention Estimation Using Natural Language Processing and Machine Learning

    NASA Astrophysics Data System (ADS)

    Adams, Christopher; Tate, Derrick

    Patent textual descriptions provide a wealth of information that can be used to understand the underlying design approaches that result in the generation of novel and innovative technology. This article will discuss a new approach for estimating Degree of Ideality and Level of Invention metrics from the theory of inventive problem solving (TRIZ) using patent textual information. Patent text includes information that can be used to model both the functions performed by a design and the associated costs and problems that affect a design’s value. The motivation of this research is to use patent data with calculation of TRIZ metrics to help designers understand which combinations of system components and functions result in creative and innovative design solutions. This article will discuss in detail methods to estimate these TRIZ metrics using natural language processing and machine learning with the use of neural networks.
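    As a loose illustration of the kind of text-derived metric this record describes, the toy function below scores a snippet by the ratio of benefit terms to cost/harm terms. The word lists and the formula are invented stand-ins, not the article's actual Degree of Ideality estimator, which uses trained neural networks.

```python
import re

# Toy proxy for TRIZ Degree of Ideality: ratio of useful-function terms to
# cost/harm terms in a patent abstract (word lists are illustrative only).
USEFUL = {"improve", "improves", "enable", "enables", "enhance", "provides"}
HARMFUL = {"cost", "weight", "complexity", "loss", "waste"}

def ideality_proxy(text):
    words = re.findall(r"[a-z]+", text.lower())
    useful = sum(w in USEFUL for w in words)
    harmful = sum(w in HARMFUL for w in words)
    return useful / max(harmful, 1)  # avoid division by zero

abstract = ("The device improves throughput and enables faster assembly "
            "while reducing cost and material waste.")
print(f"ideality proxy: {ideality_proxy(abstract):.1f}")
```

A real pipeline would replace the keyword lists with learned features, but the input/output shape (patent text in, scalar metric out) is the same.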

  20. Environmental research program. 1995 Annual report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, N.J.

    1996-06-01

    The objective of the Environmental Research Program is to enhance the understanding of, and mitigate the effects of, pollutants on health, ecological systems, global and regional climate, and air quality. The program is multidisciplinary and includes fundamental research and development in efficient and environmentally benign combustion, pollutant abatement and destruction, and novel methods of detection and analysis of criteria and noncriteria pollutants. This diverse group conducts investigations in combustion, atmospheric and marine processes, flue-gas chemistry, and ecological systems. Combustion chemistry research emphasizes modeling at microscopic and macroscopic scales. At the microscopic scale, functional sensitivity analysis is used to explore the nature of the potential-to-dynamics relationships for reacting systems. Rate coefficients are estimated using quantum dynamics and path integral approaches. At the macroscopic level, combustion processes are modelled using chemical mechanisms at the appropriate level of detail dictated by the requirements of predicting particular aspects of combustion behavior. Parallel computing has facilitated the efforts to use detailed chemistry in models of turbulent reacting flow to predict minor species concentrations.

  1. Integrating metabolic modeling and population heterogeneity analysis into optimizing recombinant protein production by Komagataella (Pichia) pastoris.

    PubMed

    Theron, Chrispian W; Berrios, Julio; Delvigne, Frank; Fickers, Patrick

    2018-01-01

    The methylotrophic yeast Komagataella (Pichia) pastoris has become one of the most utilized cell factories for the production of recombinant proteins over the last three decades. This success story is linked to its specific physiological traits, i.e., the ability to grow at high cell density in inexpensive culture medium and to secrete proteins at high yield. Exploiting methanol metabolism is at the core of most P. pastoris-based processes but comes with its own challenges. Co-feeding cultures with glycerol/sorbitol and methanol is a promising approach, which can benefit from improved understanding and prediction of metabolic response. The development of profitable processes relies on the construction and selection of efficient producing strains from less efficient ones, but also depends on the ability to master the bioreactor process itself; more specifically, on how bioreactor processes can be monitored and controlled to obtain high production yields. In this review, new perspectives are detailed regarding a multi-faceted approach to recombinant protein production processes by P. pastoris, including gaining improved understanding of the metabolic pathways involved, accounting for variations in transcriptional and translational efficiency at the single-cell level, and efficient monitoring and control of methanol levels at the bioreactor level.

  2. Innovative technologies to understand hydrogeomorphic impacts of climate change scenarios on gully development in drylands: case study from Ethiopia

    NASA Astrophysics Data System (ADS)

    Frankl, Amaury; Stal, Cornelis; Abraha, Amanuel; De Wulf, Alain; Poesen, Jean

    2014-05-01

    Taking climate change scenarios into account, rainfall patterns are likely to change over the coming decades in eastern Africa. In brief, large parts of eastern Africa are expected to experience a wetting, including seasonality changes. Gullies are threshold phenomena that accomplish most of their geomorphic change during short periods of strong rainfall. Understanding the links between geomorphic change and rainfall characteristics in detail is thus crucial to ensure the sustainability of future land management. In this study, we present image-based 3D modelling as a low-cost, flexible and rapid method to quantify gully morphology from terrestrial photographs. The methodology was tested on two gully heads in Northern Ethiopia. Ground photographs (n = 88-235) were taken during days with cloud cover. The photographs were processed in PhotoScan software using a semi-automated Structure from Motion-Multi View Stereo (SfM-MVS) workflow. As a result, full 3D models were created, accurate at the cm level. These models make it possible to quantify gully morphology in detail, including information on undercut walls and soil pipe inlets. Such information is crucial for understanding the hydrogeomorphic processes involved. Producing accurate 3D models after each rainfall event makes it possible to model the interrelations between rainfall, land management, runoff and erosion. Expected outcomes are the production of detailed vulnerability maps that support the cost-effective design of soil and water conservation measures. Keywords: 3D model, Ethiopia, Image-based 3D modelling, Gully, PhotoScan, Rainfall.
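    The change-quantification step that such repeat 3D surveys enable is often a DEM of Difference: subtract the post-event elevation model from the pre-event one and sum the loss. The grids below are made-up values, not the Ethiopian survey data, and serve only to show the arithmetic.

```python
# Two hypothetical 1 m-resolution elevation grids (m) of a gully head,
# surveyed before and after a rainfall event (e.g. derived from SfM-MVS
# models). Values are illustrative.
before = [
    [102.0, 101.5, 101.0],
    [101.8, 101.2, 100.6],
    [101.5, 100.9, 100.2],
]
after = [
    [102.0, 101.3, 100.8],
    [101.6, 100.7, 100.1],
    [101.5, 100.6, 99.9],
]
cell_area = 1.0  # m^2 per grid cell

# Eroded volume: elevation loss summed over all cells that lowered
eroded = sum(
    (b - a) * cell_area
    for row_b, row_a in zip(before, after)
    for b, a in zip(row_b, row_a)
    if b > a
)
print(f"eroded volume: {eroded:.2f} m^3")
```

Relating such per-event volumes to the rainfall record is what links geomorphic change to rainfall characteristics.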

  3. Characterizing the multiexciton fission intermediate in pentacene through 2D spectral modeling

    NASA Astrophysics Data System (ADS)

    Tempelaar, Roel; Reichman, David

    Singlet fission, the molecular process in which a singlet excitation splits into two triplet excitons, holds promise to enhance the photoconversion efficiency of solar cells. Despite advances in both experiments and theory, a detailed understanding of this process remains lacking. In particular, the nature of the correlated triplet pair state (TT), which acts as a fission intermediate, remains obscure. Recently, 2D spectroscopy was shown to allow for the direct detection of the extremely weak optical transition between TT and the ground state through coherently prepared vibrational wavepackets in the associated electronic potentials. Here, we present a microscopic model of singlet fission which includes an exact quantum treatment of such vibrational modes. Our model reproduces the reported 2D spectra of pentacene, while providing a detailed insight into the anatomy of TT. As such, our results form a stepping stone towards understanding singlet fission at a molecular level, while bridging the gap between the wealth of recent theoretical works on one side and experimental measurements on the other. R.T. acknowledges The Netherlands Organisation for Scientific Research NWO for support through a Rubicon Grant.

  4. Inferring diffusion in single live cells at the single-molecule level

    PubMed Central

    Robson, Alex; Burrage, Kevin; Leake, Mark C.

    2013-01-01

    The movement of molecules inside living cells is a fundamental feature of biological processes. The ability to both observe and analyse the details of molecular diffusion in vivo at the single-molecule and single-cell level can add significant insight into understanding the molecular architectures of diffusing molecules and the nanoscale environment in which they diffuse. The tool of choice for monitoring dynamic molecular localization in live cells is fluorescence microscopy; in particular, combining total internal reflection fluorescence with fluorescent protein (FP) reporters offers exceptional imaging contrast for dynamic processes in the cell membrane under relatively physiological conditions compared with competing single-molecule techniques. There exist several different complex modes of diffusion, and discriminating these from each other is challenging at the molecular level owing to underlying stochastic behaviour. Analysis is traditionally performed using mean square displacements of tracked particles; however, this generally requires more data points than is typical for single FP tracks owing to photophysical instability. Presented here is a novel approach allowing robust Bayesian ranking of diffusion processes to discriminate multiple complex modes probabilistically. It is a computational approach that biologists can use to understand single-molecule features in live cells. PMID:23267182
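    The traditional mean-square-displacement analysis that this record contrasts with its Bayesian approach can be sketched on a simulated track. All parameter values are illustrative, and the long, clean track here is exactly what short photobleaching-limited FP tracks do not provide, which motivates the article's method.

```python
import math
import random

# Simulate a 2D Brownian track (normal diffusion), then estimate the
# anomalous exponent alpha from the log-log slope of the MSD curve:
# MSD(tau) ~ tau^alpha (alpha ~ 1 for free diffusion, < 1 for confined).
random.seed(7)
D, dt, n = 0.05, 0.02, 2000       # um^2/s, s, steps (illustrative values)
sigma = math.sqrt(2 * D * dt)     # per-step displacement scale per axis
x = y = 0.0
track = [(0.0, 0.0)]
for _ in range(n):
    x += random.gauss(0, sigma)
    y += random.gauss(0, sigma)
    track.append((x, y))

def msd(lag):
    # Time-averaged MSD over all overlapping windows of length `lag`
    disp = [(track[i + lag][0] - track[i][0]) ** 2 +
            (track[i + lag][1] - track[i][1]) ** 2
            for i in range(len(track) - lag)]
    return sum(disp) / len(disp)

lags = [1, 2, 4, 8, 16, 32]
logs = [(math.log(l * dt), math.log(msd(l))) for l in lags]
# Least-squares slope of log(MSD) vs log(tau) gives alpha
mx = sum(a for a, _ in logs) / len(logs)
my = sum(b for _, b in logs) / len(logs)
alpha = sum((a - mx) * (b - my) for a, b in logs) / \
        sum((a - mx) ** 2 for a, _ in logs)
print(f"estimated anomalous exponent alpha = {alpha:.2f}")
```

A Bayesian treatment instead compares the likelihood of competing diffusion models on the raw displacements, which remains informative even for tracks far too short for a stable MSD slope.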

  5. Counterion Size and Nature Control Structural and Mechanical Response in Cellulose Nanofibril Nanopapers.

    PubMed

    Benítez, Alejandro J; Walther, Andreas

    2017-05-08

    Nanopapers formed from aqueous dispersions of cellulose nanofibrils (CNFs) combine stiffness, strength, and toughness. Yet, delicate interactions operate between the CNFs during nanopaper formation and mechanical deformation. We unravel in detail how counterions, being either of the organic alkyl ammonium kind (NR4+) or of the alkali metal series (Li+, Na+, Cs+), need to be chosen to achieve outstanding combinations of stiffness, strength, and toughness, extending to previously unreached territories. We relate structure formation processes in terms of colloidal stabilization to nanostructural details such as porosity and ability for swelling, as well as to interfibrillar interactions in bulk and macroscale mechanical properties. We demonstrate that our understanding also leads to new levels of ductility in bioinspired CNF/polymer nanocomposites at high levels of reinforcements. These results contribute to future rational design of CNF-based high-performance materials.

  6. Unraveling Mixed Hydrate Formation: Microscopic Insights into Early Stage Behavior.

    PubMed

    Hall, Kyle Wm; Zhang, Zhengcai; Kusalik, Peter G

    2016-12-29

    The molecular-level details of mixed hydrate nucleation remain unclear despite the broad implications of this process for a variety of scientific domains. Through analysis of mixed hydrate nucleation in a prototypical CH4/H2S/H2O system, we demonstrate that high-level kinetic similarities between mixed hydrate systems and corresponding pure hydrate systems are not a reliable basis for estimating the composition of early stage mixed hydrate nuclei. Moreover, we show that solution compositions prior to and during nucleation are not necessarily effective proxies for the composition of early stage mixed hydrate nuclei. Rather, microscopic details (e.g., guest-host interactions and previously neglected cage types) apparently play key roles in determining early stage behavior of mixed hydrates. This work thus provides key foundational concepts and insights for understanding mixed hydrate nucleation.

  7. Snapshots of a solid-state transformation: coexistence of three phases trapped in one crystal

    DOE PAGES

    Aromí, G.; Beavers, C. M.; Sánchez Costa, J.; ...

    2016-01-05

    Crystal-to-crystal transformations have been crucial in the understanding of solid-state processes, since these may be studied in detail by means of single crystal X-ray diffraction (SCXRD) techniques. The description of the mechanisms and potential intermediates of those processes remains very challenging. In fact, solid-state transient states have rarely been observed, at least to a sufficient level of detail. We have investigated the process of guest extrusion from the non-porous molecular material [Fe(bpp)(H2L)](ClO4)2·1.5C3H6O (bpp = 2,6-bis(pyrazol-3-yl)pyridine; H2L = 2,6-bis(5-(2-methoxyphenyl)-pyrazol-3-yl)pyridine; C3H6O = acetone), which occurs through ordered diffusion of acetone in a crystal-to-crystal manner, leading to dramatic structural changes. The slow kinetics of the transition allows thermal trapping of the system at various intermediate stages. The transiting single crystal can then be examined at these points through synchrotron SCXRD, offering a window upon the mechanism of the transformation at the molecular scale. These experiments have unveiled the development of an ordered intermediate phase, distinct from the initial and the final states, coexisting as the process advances with either of these two phases or, at a certain moment, with both of them. The new intermediate phase has been structurally characterized in full detail by SCXRD, providing insights into the mechanism of this diffusion-triggered solid-state phenomenon. Lastly, the process has also been followed by calorimetry, optical microscopy, local Raman spectroscopy and powder X-ray diffraction. The discovery and description of an intermediate ordered state in a molecular solid-state transformation is of great interest and will help to understand the mechanistic details and reaction pathways underlying these transformations.

  8. Real-Time Visualization of Network Behaviors for Situational Awareness

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Best, Daniel M.; Bohn, Shawn J.; Love, Douglas V.

    Plentiful, complex, and dynamic data make understanding the state of an enterprise network difficult. Although visualization can help analysts understand baseline behaviors in network traffic and identify off-normal events, visual analysis systems often do not scale well to operational data volumes (in the hundreds of millions to billions of transactions per day) nor to analysis of emergent trends in real-time data. We present a system that combines multiple, complementary visualization techniques coupled with in-stream analytics, behavioral modeling of network actors, and a high-throughput processing platform called MeDICi. This system provides situational understanding of real-time network activity to help analysts take proactive response steps. We have developed these techniques using requirements gathered from the government users for which the tools are being developed. By linking multiple visualization tools to a streaming analytic pipeline, and designing each tool to support a particular kind of analysis (from high-level awareness to detailed investigation), analysts can understand the behavior of a network across multiple levels of abstraction.
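
    The record names in-stream analytics and behavioral modeling of network actors without giving detail. As a hedged sketch (not the MeDICi implementation, and with illustrative thresholds), one common in-stream analytic keeps an exponentially weighted baseline per actor and flags off-normal values as they arrive:

```python
# Minimal sketch of a streaming off-normal detector: one EWMA baseline
# (mean and variance) per network actor, updated per observation.

def make_detector(alpha=0.1, threshold=3.0):
    """Return a stateful observe(actor, value) -> bool anomaly flag."""
    state = {}  # actor -> (EWMA mean, EWMA variance)

    def observe(actor, value):
        mean, var = state.get(actor, (value, 0.0))
        resid = value - mean
        # flag values far outside the actor's learned baseline
        is_anomaly = abs(resid) > threshold * max(var, 1e-8) ** 0.5
        mean += alpha * resid                          # update baseline
        var = (1 - alpha) * (var + alpha * resid * resid)
        state[actor] = (mean, var)
        return is_anomaly

    return observe

observe = make_detector()
for _ in range(200):
    observe("host-a", 100.0)        # steady baseline traffic
assert observe("host-a", 100.0) is False
assert observe("host-a", 10000.0) is True   # sudden spike flagged
```

    Keeping only a running mean and variance per actor is what makes such analytics feasible at hundreds of millions of transactions per day: state is constant-size and each update is O(1).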

  9. Fundamental insights into interfacial catalysis.

    PubMed

    Gong, Jinlong; Bao, Xinhe

    2017-04-03

    Surface and interfacial catalysis plays a vital role in chemical industries, electrochemistry and photochemical reactions. The challenges of modern chemistry are to optimize chemical reaction processes and understand the detailed mechanisms of chemical reactions. Since the early 1960s, the foundation of surface science systems has allowed the study of surface and interfacial phenomena at the atomic/molecular level, and has thus brought a number of significant developments to fundamental and technological processes such as catalysis, materials science and biochemistry, to name a few. This themed issue describes the recent advances and developments in the fundamental understanding of surface and interfacial catalysis, encompassing areas of knowledge from metal to metal oxide, carbide, graphene, hexagonal boron nitride, and transition metal dichalcogenides under ultrahigh vacuum conditions, as well as under realistic reaction conditions.

  10. Intelligence Preparation of the Battlefield (IPB): One Size Fits All

    DTIC Science & Technology

    1991-12-10

    higher levels because it was too detailed for that level commander. Battalion S2s must understand the level of detail needed for their product... The initiative offers an excellent opportunity to involve commanders in the IPB process and clarify the inputs and outputs of the system...

  11. JPRS Report, Near East and South Asia.

    DTIC Science & Technology

    1989-02-23

    Modernization of Armed Forces Surveyed in Detail ['UMAN 18 Nov] 52 Textile Industry Celebrates Inauguration of Massive Mill ['UMAN 18 Nov] 54 SUDAN... procession, its horizons, and its needs. They also reflect an understanding of the tasks required at the various levels to bolster the intifadah... needed to extend the cracks to the heart. Until then, we must brace for further escalation in the brutal oppression. As for the U.S. elections, they

  12. NOSS Altimeter Detailed Algorithm specifications

    NASA Technical Reports Server (NTRS)

    Hancock, D. W.; Mcmillan, J. D.

    1982-01-01

    The details of the algorithms and data sets required for satellite radar altimeter data processing are documented in a form suitable for (1) development of the benchmark software and (2) coding the operational software. The algorithms reported in detail are those established for altimeter processing. The algorithms which required some additional development before documenting for production were only scoped. The algorithms are divided into two levels of processing. The first level converts the data to engineering units and applies corrections for instrument variations. The second level provides geophysical measurements derived from altimeter parameters for oceanographic users.
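
    The two processing levels described above can be sketched as a tiny pipeline. This is a hedged illustration only: the scale factors, bias term, and the sea-surface-height derivation are placeholders, not the documented NOSS algorithms.

```python
# Illustrative two-level altimeter processing chain (constants are
# placeholders, not NOSS values).

def level1(raw_counts, scale=0.01, offset=-5.0, instrument_bias=0.02):
    """Level 1: convert telemetry counts to engineering units (metres)
    and apply a correction for a known instrument variation."""
    return [c * scale + offset - instrument_bias for c in raw_counts]

def level2(ranges_m, orbit_height_m):
    """Level 2: derive a geophysical measurement (here, sea-surface
    height) from the corrected altimeter range."""
    return [orbit_height_m - r for r in ranges_m]

ranges = level1([80000, 80010])          # counts -> metres
ssh = level2(ranges, orbit_height_m=800.0)
```

    Separating the levels this way mirrors the document's split: Level 1 output depends only on the instrument, while Level 2 products are the ones oceanographic users consume.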

  13. The influence of text cohesion and picture detail on young readers' knowledge of science topics.

    PubMed

    Désiron, Juliette C; de Vries, Erica; Bartel, Anna N; Varahamurti, Nalini

    2017-10-16

    The effects of text cohesion and of added pictures on acquired knowledge have each been heavily studied in isolation; studies of the effects of specific picture characteristics, whether facilitating or hindering, are scarce. Schnotz's ITPC Model (2014) allows hypotheses to be formulated regarding the combined effect of text cohesion and the presence and level of detail of a picture. This study investigates these hypotheses in the case of children reading scientific texts. Participants were 101 second-, third-, and fourth-grade pupils, with a mean age of 9 years, in the western United States. Data were collected over three sessions to capture each pupil's knowledge relative to prior sessions. Results showed a significant increase in pupils' knowledge between pre-test and immediate post-test but, as hypothesized, no significant difference between levels of cohesion. No significant difference between types of pictures was detected. After 1 week, knowledge built with a highly cohesive text had dropped significantly when paired with a low-detail picture, whereas with a high-detail picture, or no picture, there was no significant difference. The results suggest that when participants were given a low-detail picture with a low-cohesion text, integration of the material was more restricted than with a high-cohesion text. © 2017 The British Psychological Society.

  14. To acquire more detailed radiation drive by use of ``quasi-steady'' approximation in atomic kinetics

    NASA Astrophysics Data System (ADS)

    Ren, Guoli; Pei, Wenbing; Lan, Ke; Gu, Peijun; Li, Xin

    2012-10-01

    In current routine 2D simulations of hohlraum physics, we adopt a principal-quantum-number (n-level) average atom model (AAM) for the NLTE plasma description. However, the detailed experimental frequency-dependent radiative drive differs from our n-level simulated drive, which points to the need for a more detailed atomic-kinetics description. The orbital-quantum-number (nl-level) average atom model is a natural candidate, but an in-line nl-level calculation requires far more computational resources. By distinguishing the rapid bound-bound atomic processes from the relatively slow bound-free atomic processes, we found a method to build a more detailed bound-electron distribution (nl-level, or even nlm-level) from the plasma conditions (temperature, density, and average ionization degree) of the in-line n-level calculation. We name this method the ``quasi-steady approximation'' in atomic kinetics. Using it, we rebuild the nl-level bound-electron distribution (Pnl) and obtain a new hohlraum radiative drive by post-processing. Comparison with the n-level post-processed hohlraum drive shows an almost identical radiation flux but with finer frequency-dependent spectral structure, which appears only in nl-level transitions within the same n shell.
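
    The splitting of an n-level population into nl sublevels can be illustrated with a much-simplified sketch. Here the sublevels are populated by statistical weight g_nl = 2(2l+1) alone, which is an assumption for illustration; the quasi-steady method described above instead balances the fast bound-bound rates to fix the sublevel distribution.

```python
# Toy redistribution of an n-level population P_n over its l sublevels
# by degeneracy alone (NOT the paper's rate-balanced quasi-steady fit).

def split_population(P_n, n):
    g = [2 * (2 * l + 1) for l in range(n)]   # sublevel degeneracies
    g_total = 2 * n * n                        # sum of g over l = 2n^2
    return [P_n * gl / g_total for gl in g]

P3 = split_population(6.0, 3)   # n = 3 shell holding 6 bound electrons
assert abs(sum(P3) - 6.0) < 1e-9   # total population is conserved
```

    Any more realistic sublevel distribution must satisfy the same constraint checked here: summing Pnl over l recovers the in-line n-level population.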

  15. Engineered metal based nanomaterials in aqueous environments: Interactions, transformations and implications

    NASA Astrophysics Data System (ADS)

    Mudunkotuwa, Imali Ama

    Nanoscience and nanotechnology offer potential routes towards addressing critical issues such as clean and sustainable energy, environmental protection and human health. Metal and metal oxide nanomaterials in particular are found in a wide range of applications and therefore carry a greater potential for release into the environment and for human exposure. Understanding the aqueous-phase behavior of metal and metal oxide nanomaterials is a key factor in the safe design of these materials, because their interactions with living systems are always mediated through the aqueous phase. Broadly, the transformations in the aqueous phase can be classified as dissolution, aggregation and adsorption, which are interdependent, linked processes. The complexity of these processes at the liquid-solid interface has therefore been one of the grand challenges that has persisted since the beginning of nanotechnology. Although classical models provide guidance for understanding dissolution and aggregation of nanoparticles in water, many uncertainties are associated with recent findings, often due to a lack of fundamental knowledge of the surface structure and surface energetics of very small particles. Environmental health and safety studies of nanomaterials are therefore now focused on understanding the surface chemistry that governs the overall processes in the liquid-solid interfacial region at the molecular level. The metal-based nanomaterials examined in this dissertation include TiO2, ZnO, Cu and CuO. These are among the most heavily used in applications ranging from the construction industry to cosmetic formulations; they are therefore produced at large scale and have been detected in the environment.
There is debate within the scientific community about their safety, owing to the limited understanding of the surface interactions that arise from the detailed nature of the surfaces. Specifically, the interactions of these metal and metal oxide nanoparticles with environmental and biological ligands in solution dramatically alter their aqueous-phase behavior in terms of dissolution and aggregation, which are among the determining factors of nanoparticle uptake and toxicity. Furthermore, solution conditions such as ionic strength and pH can act as controlling parameters for surface ligand adsorption, while adsorbed ligands themselves undergo surface-induced structural and conformational changes. Because nanomaterials in both the environment and biological systems are subjected to a wide range of matrix conditions, they are dynamic rather than static entities. Monitoring and tracking these nanomaterials in real systems can thus be extremely challenging and requires a thorough understanding of the surface chemistry governing their transformations. The work presented in this dissertation attempts to bridge the gap between the dynamic processing of these nanomaterials, the details of the molecular-level processes that occur in the liquid-solid interfacial region, and potential environmental and biological interactions. Extensive nanomaterial characterization is an integral part of these investigations, and all the materials presented here are thoroughly analyzed for particle size, shape, surface area, and bulk and surface compositions. Detailed spectroscopic analysis was used to acquire molecular information on the processes in the liquid-solid interfacial region, and the outcomes are linked with macroscopic analysis with the aid of dynamic and static light-scattering techniques.
Furthermore, emphasis is given to size-dependent behavior, and theoretical modeling is applied with careful consideration of the physicochemical characterization and molecular information unique to the nanomaterials.

  16. MO-E-BRD-01: Adapt-A-Thon - Texas Hold’em Invitational

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kessler, M; Brock, K; Pouliot, J

    2014-06-15

    Software tools for image-based adaptive radiotherapy such as deformable image registration, contour propagation and dose mapping have progressed beyond the research setting and are now commercial products available as part of both treatment planning systems and stand-alone applications. These software tools are used together to create clinical workflows to detect, track and evaluate changes in the patient and to accumulate dose. Deviations uncovered in this process are used to guide decisions about replanning/adaptation, with the goal of keeping the delivery of prescribed dose “on target” throughout the entire course of radiotherapy. Since the output from one step of the adaptive process is used as an input for another, it is essential to understand and document the uncertainty associated with each step and how these uncertainties propagate. This in turn requires an understanding of how the underlying tools work. Unfortunately, important details about the algorithms used to implement these tools are scarce or incomplete, too often for competitive reasons. This is in contrast to the situation involving other basic treatment planning algorithms such as dose calculations, where the medical physics community essentially requires vendors to provide physically important details about their underlying theory and clinical implementation. Vendors should adopt this same level of information sharing when it comes to the tools and techniques for image-guided adaptive radiotherapy. The goal of this session is to start this process by inviting vendors and medical physicists to discuss and demonstrate the available tools and describe how they are intended to be used in clinical practice. The format of the session will involve a combination of formal presentations, interactive demonstrations, audience participation and some friendly “Texas style” competition. Learning Objectives: Understand the components of the image-based adaptive radiotherapy process.
Understand how these components are implemented in various commercial systems. Understand the different use cases and workflows currently supported by these tools.

  17. Confluence and Contours: Reflexive Management of Environmental Risk

    PubMed Central

    Schubert, Iljana; Pollard, Simon; Rocks, Sophie; Black, Edgar

    2015-01-01

    Government institutions have responsibilities to distribute risk management funds meaningfully and to be accountable for their choices. We took a macro‐level sociological approach to understanding the role of government in managing environmental risks, and insights from micro‐level psychology to examine individual‐level risk‐related perceptions and beliefs. Survey data from 2,068 U.K. citizens showed that lay people's funding preferences were associated positively with beliefs about responsibility and trust, yet associations with perception varied depending on risk type. Moreover, there were risk‐specific differences in the funding preferences of the lay sample and 29 policymakers. A laboratory‐based study of 109 participants examined funding allocation in more detail through iterative presentation of expert information. Quantitative and qualitative data revealed a meso‐level framework comprising three types of decisionmakers who varied in their willingness to change funding allocation preferences following expert information: adaptors, responders, and resistors. This research highlights the relevance of integrated theoretical approaches to understanding the policy process, and the benefits of reflexive dialogue to managing environmental risks. PMID:26720858

  18. Concepts and algorithms in digital photogrammetry

    NASA Technical Reports Server (NTRS)

    Schenk, T.

    1994-01-01

    Despite much progress in digital photogrammetry, there is still a considerable lack of understanding of theories and methods which would allow a substantial increase in the automation of photogrammetric processes. The purpose of this paper is to raise awareness that the automation problem is one that cannot be solved in a bottom-up fashion by a trial-and-error approach. We present a short overview of concepts and algorithms used in digital photogrammetry. This is followed by a more detailed presentation of perceptual organization, a typical middle-level task.

  19. Biosurfactant production by Aureobasidium pullulans in stirred tank bioreactor: New approach to understand the influence of important variables in the process.

    PubMed

    Brumano, Larissa Pereira; Antunes, Felipe Antonio Fernandes; Souto, Sara Galeno; Dos Santos, Júlio Cesar; Venus, Joachim; Schneider, Roland; da Silva, Silvio Silvério

    2017-11-01

    Surfactants are amphiphilic molecules with broad industrial applications, currently produced by chemical routes derived mainly from the oil industry. Biotechnological processes, which aim to develop new, sustainable process configurations using favorable microorganisms, therefore require investigation in more detail. Here we present a novel approach for biosurfactant production using the promising yeast Aureobasidium pullulans LB 83 in a stirred tank reactor. A central composite face-centered design was carried out to evaluate the effect of aeration rate (0.1-1.1 min-1) and sucrose concentration (20-80 g.L-1) on maximum biosurfactant tensoactivity and productivity. Statistical analysis showed that using both variables at their high levels enhanced tensoactivity, giving 8.05 cm in the oil spread test and a productivity of 0.0838 cm.h-1. This unprecedented investigation of the relevance of aeration rate and sucrose concentration to biosurfactant production by A. pullulans in a stirred tank reactor demonstrates the importance of establishing adequate bioreactor conditions for process scale-up. Copyright © 2017 Elsevier Ltd. All rights reserved.
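
    A central composite face-centered (CCF) design for two factors, as used in this record, has a fixed structure: factorial corners, axial points on the cube faces (alpha = 1), and replicated center points. The sketch below is illustrative; the number of center points is an assumption, and the decoding simply maps coded levels back to the stated factor ranges.

```python
# Hedged sketch of a two-factor CCF design in coded units [-1, +1]:
# factor 1 = aeration rate (0.1-1.1 min-1),
# factor 2 = sucrose concentration (20-80 g/L).
from itertools import product

def ccf_design(n_center=3):
    factorial = list(product([-1, 1], repeat=2))      # 4 corner runs
    axial = [(-1, 0), (1, 0), (0, -1), (0, 1)]        # face points, alpha = 1
    center = [(0, 0)] * n_center                      # replicated centers
    return factorial + axial + center

def decode(coded, lo, hi):
    """Map a coded level in [-1, 1] back to the real factor range."""
    return lo + (coded + 1) * (hi - lo) / 2

runs = ccf_design()
assert len(runs) == 11                       # 4 + 4 + 3 runs
assert abs(decode(0, 0.1, 1.1) - 0.6) < 1e-9  # center of aeration range
assert decode(1, 20, 80) == 80               # high sucrose level
```

    Face-centered axial points keep every run inside the original factor ranges, which matters when, as here, the extremes of aeration and sucrose are hard physical limits of the bioreactor setup.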

  20. Analysis of Particle Image Velocimetry (PIV) Data for Application to Subsonic Jet Noise Studies

    NASA Technical Reports Server (NTRS)

    Blackshire, James L.

    1997-01-01

    Global velocimetry measurements were taken using Particle Image Velocimetry (PIV) in the subsonic flow exiting a 1-inch circular nozzle in an attempt to better understand the turbulence characteristics of its shear layer region. This report presents the results of the PIV analysis and data reduction portions of the test and details the processing that was done. Custom data analysis and data validation algorithms were developed and applied to a data ensemble consisting of over 750 PIV 70 mm photographs taken in the Mach 0.85 flow facility. Results are presented detailing spatial characteristics of the flow, including ensemble mean and standard deviation, turbulence intensities, Reynolds stress levels, and two-point spatial correlations.
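
    The ensemble statistics named in this record are standard reductions over a stack of instantaneous vector fields. The following is a hedged numpy sketch on synthetic data (the grid size, velocities, and noise levels are invented), not the report's custom algorithms:

```python
# Ensemble PIV statistics: mean, fluctuation, turbulence intensity,
# Reynolds shear stress, and a two-point spatial correlation.
import numpy as np

rng = np.random.default_rng(0)
# 750 synthetic velocity fields (u, v) on a 32x32 grid, m/s
u = 10.0 + rng.normal(0, 1.0, (750, 32, 32))
v = rng.normal(0, 0.5, (750, 32, 32))

u_mean = u.mean(axis=0)                        # ensemble mean field
u_fluc, v_fluc = u - u_mean, v - v.mean(axis=0)  # fluctuating parts
ti = u.std(axis=0) / u_mean                    # local turbulence intensity
reynolds_uv = (u_fluc * v_fluc).mean(axis=0)   # <u'v'> (stress per unit density)

def two_point_corr(f, dx):
    """Correlation of fluctuations between points dx columns apart."""
    a, b = f[..., :-dx], f[..., dx:]
    return (a * b).mean() / (a.std() * b.std())

r1 = two_point_corr(u_fluc, 1)
```

    On real shear-layer data the two-point correlation decays with separation and sets the integral length scale; on the uncorrelated synthetic noise above it is near zero at any separation.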

  1. “Using Statistical Comparisons between SPartICus Cirrus Microphysical Measurements, Detailed Cloud Models, and GCM Cloud Parameterizations to Understand Physical Processes Controlling Cirrus Properties and to Improve the Cloud Parameterizations”

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Woods, Sarah

    2015-12-01

    The dual objectives of this project were to improve our basic understanding of the processes that control cirrus microphysical properties and to improve the representation of these processes in the parameterizations. A major effort in the proposed research was to integrate, calibrate, and better understand the uncertainties in all of these measurements.

  2. Intent Specifications

    NASA Technical Reports Server (NTRS)

    Leveson, Nancy G.

    1995-01-01

    We have been investigating the implications of using abstractions based on intent rather than the aggregation and information-hiding abstractions commonly used in software engineering. Cognitive psychologists have shown that intent abstraction is consistent with human problem-solving processes. We believe that new types of specifications and designs based on this concept can assist in understanding and specifying requirements, capturing the most important design rationale information in an efficient and economical way, and supporting the process of identifying and analyzing required changes to minimize the introduction of errors. The goal of hierarchical abstraction is to allow both top-down and bottom-up reasoning about a complex system. In computer science, we have made much use of (1) part-whole abstractions, where each level of a hierarchy represents an aggregation of the components at a lower level, and (2) information-hiding abstractions, where each level contains the same conceptual information but hides some details about the concepts; that is, each level is a refinement of the information at a higher level.

  3. A Challenging Pie to Splice: Drugging the Spliceosome.

    PubMed

    León, Brian; Kashyap, Manoj K; Chan, Warren C; Krug, Kelsey A; Castro, Januario E; La Clair, James J; Burkart, Michael D

    2017-09-25

    Since its discovery in 1977, the study of alternative RNA splicing has revealed a plethora of mechanisms that had never before been documented in nature. Understanding these transitions and their outcome at the level of the cell and organism has become one of the great frontiers of modern chemical biology. Until 2007, this field remained in the hands of RNA biologists. However, the recent identification of natural product and synthetic modulators of RNA splicing has opened new access to this field, allowing for the first time a chemical-based interrogation of RNA splicing processes. Simultaneously, we have begun to understand the vital importance of splicing in disease, which offers a new platform for molecular discovery and therapy. As with many natural systems, gaining clear mechanistic detail at the molecular level is key towards understanding the operation of any biological machine. This minireview presents recent lessons learned in this emerging field of RNA splicing chemistry and chemical biology. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. recount workflow: Accessing over 70,000 human RNA-seq samples with Bioconductor

    PubMed Central

    Collado-Torres, Leonardo; Nellore, Abhinav; Jaffe, Andrew E.

    2017-01-01

    The recount2 resource is composed of over 70,000 uniformly processed human RNA-seq samples spanning TCGA and SRA, including GTEx. The processed data can be accessed via the recount2 website and the recount Bioconductor package. This workflow explains in detail how to use the recount package and how to integrate it with other Bioconductor packages for several analyses that can be carried out with the recount2 resource. In particular, we describe how the coverage count matrices were computed in recount2 as well as different ways of obtaining public metadata, which can facilitate downstream analyses. Step-by-step directions show how to do a gene-level differential expression analysis, visualize base-level genome coverage data, and perform analyses at multiple feature levels. This workflow thus provides further information to understand the data in recount2 and a compendium of R code to use the data. PMID:29043067
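
    The recount workflow itself is R/Bioconductor; as a language-neutral, hedged sketch of the idea behind its count-scaling step (rescaling each sample's coverage counts to a common target depth using the sample's total coverage, or AUC), the target of 40 million 100-bp reads mirrors the package's documented default, while the function name and numbers below are illustrative:

```python
# Rescale coverage counts so samples sequenced at different depths
# become comparable (sketch of the recount scale_counts idea).

def scale_counts(counts, auc, target_reads=40e6, read_len=100):
    factor = target_reads * read_len / auc   # target coverage / observed
    return [round(c * factor) for c in counts]

# a sample sequenced twice as deep yields the same scaled gene counts
shallow = scale_counts([500, 1500], auc=2.0e9)
deep = scale_counts([1000, 3000], auc=4.0e9)
assert shallow == deep == [1000, 3000]
```

    Scaling by total coverage rather than read count is what lets recount2 treat samples with different read lengths and depths uniformly.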

  5. Large Terrain Continuous Level of Detail 3D Visualization Tool

    NASA Technical Reports Server (NTRS)

    Myint, Steven; Jain, Abhinandan

    2012-01-01

    This software solved the problem of displaying terrains that are usually too large to be displayed on standard workstations in real time. The software can visualize terrain data sets composed of billions of vertices, and can display these data sets at greater than 30 frames per second. The Large Terrain Continuous Level of Detail 3D Visualization Tool allows large terrains, which can be composed of billions of vertices, to be visualized in real time. It utilizes a continuous level of detail technique called clipmapping to support this. It offloads much of the work involved in breaking up the terrain into levels of details onto the GPU (graphics processing unit) for faster processing.
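
    In a clipmap, each level of detail covers roughly twice the extent of the previous one at half the resolution, so the level used for a vertex can be chosen from its distance to the viewer. A hedged sketch of that selection rule follows (the extent and level count are illustrative, not values from this tool):

```python
# Clipmap level selection: level 0 is the finest ring around the
# viewer; each coarser level doubles the covered extent.
import math

def clip_level(distance, finest_extent=64.0, n_levels=10):
    """Pick the LOD level for a terrain point at `distance` from the
    viewer, clamped to the coarsest available level."""
    if distance <= finest_extent:
        return 0
    level = int(math.log2(distance / finest_extent)) + 1
    return min(level, n_levels - 1)

assert clip_level(10.0) == 0
assert clip_level(100.0) == 1     # within the second, coarser ring
assert clip_level(1e9) == 9       # clamped to the coarsest level
```

    Because the per-vertex selection is this cheap and uniform, it is a natural candidate to offload to the GPU, as the record describes.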

  6. An Overview of Hydrologic Studies at Center for Forested Wetlands Research, USDA Forest Service

    Treesearch

    Devendra M. Amatya; Carl C. Trettin; R. Wayne Skaggs; Timothy J. Callahan; Ge Sun; Masato Miwa; John E. Parsons

    2004-01-01

    Managing forested wetland landscapes for water quality improvement and productivity requires a detailed understanding of functional linkages between ecohydrological processes and management practices. Studies are being conducted at the Center for Forested Wetlands Research (CFWR), USDA Forest Service, to understand the fundamental hydrologic and biogeochemical processes...

  7. Intracellular response to process optimization and impact on productivity and product aggregates for a high-titer CHO cell process.

    PubMed

    Handlogten, Michael W; Lee-O'Brien, Allison; Roy, Gargi; Levitskaya, Sophia V; Venkat, Raghavan; Singh, Shailendra; Ahuja, Sanjeev

    2018-01-01

    A key goal in process development for antibodies is to increase productivity while maintaining or improving product quality. During process development of an antibody, titers were increased from 4 to 10 g/L while simultaneously decreasing aggregates. Process development involved optimization of media and feed formulations, feed strategy, and process parameters including pH and temperature. To better understand how CHO cells respond to process changes, the changes were implemented in a stepwise manner. The first change was an optimization of the feed formulation, the second was an optimization of the medium, and the third was an optimization of process parameters. Multiple process outputs were evaluated including cell growth, osmolality, lactate production, ammonium concentration, antibody production, and aggregate levels. Additionally, detailed assessment of oxygen uptake, nutrient and amino acid consumption, extracellular and intracellular redox environment, oxidative stress, activation of the unfolded protein response (UPR) pathway, protein disulfide isomerase (PDI) expression, and heavy and light chain mRNA expression provided an in-depth understanding of the cellular response to process changes. The results demonstrate that mRNA expression and UPR activation were unaffected by process changes, and that increased PDI expression and optimized nutrient supplementation are required for higher productivity processes. Furthermore, our findings demonstrate the role of extra- and intracellular redox environment on productivity and antibody aggregation. Processes using the optimized medium, with increased concentrations of redox modifying agents, had the highest overall specific productivity, reduced aggregate levels, and helped cells better withstand the high levels of oxidative stress associated with increased productivity. Specific productivities of different processes positively correlated to average intracellular values of total glutathione. 
Additionally, processes with the optimized media maintained an oxidizing intracellular environment, important for correct disulfide bond pairing, which likely contributed to reduced aggregate formation. These findings provide important insight into how cells respond to process changes and can guide future development efforts to enhance productivity and improve product quality. © 2017 Wiley Periodicals, Inc.

  8. Quantitative imaging of mammalian transcriptional dynamics: from single cells to whole embryos.

    PubMed

    Zhao, Ziqing W; White, Melanie D; Bissiere, Stephanie; Levi, Valeria; Plachta, Nicolas

    2016-12-23

    Probing dynamic processes occurring within the cell nucleus at the quantitative level has long been a challenge in mammalian biology. Advances in bio-imaging techniques over the past decade have enabled us to directly visualize nuclear processes in situ with unprecedented spatial and temporal resolution and single-molecule sensitivity. Here, using transcription as our primary focus, we survey recent imaging studies that specifically emphasize the quantitative understanding of nuclear dynamics in both time and space. These analyses not only inform on previously hidden physical parameters and mechanistic details, but also reveal a hierarchical organizational landscape for coordinating a wide range of transcriptional processes shared by mammalian systems of varying complexity, from single cells to whole embryos.

  9. Relationship between population of the fibril-prone conformation in the monomeric state and oligomer formation times of peptides: Insights from all-atom simulations

    NASA Astrophysics Data System (ADS)

    Nam, Hoang Bao; Kouza, Maksim; Zung, Hoang; Li, Mai Suan

    2010-04-01

    Despite much progress in understanding the aggregation of biomolecules, the factors that govern its rates are not fully understood. This problem is of particular importance since many conformational diseases, such as Alzheimer's, Parkinson's, and type-II diabetes, are associated with protein oligomerization. Having performed all-atom simulations with explicit water and various force fields for the two short peptides KFFE and NNQQ, we show that their oligomer formation times are strongly correlated with the population of the fibril-prone conformation in the monomeric state: the larger the population, the faster the aggregation process. Our result not only suggests that this quantity plays a key role in the self-assembly of polypeptide chains but also opens a new way to understand the fibrillogenesis of biomolecules at the monomeric level. The nature of oligomer ordering of NNQQ is studied in detail.

  10. Determination of the core promoter regions of the Saccharomyces cerevisiae RPS3 gene.

    PubMed

    Joo, Yoo Jin; Kim, Jin-Ha; Baek, Joung Hee; Seong, Ki Moon; Lee, Jae Yung; Kim, Joon

    2009-01-01

    Ribosomal protein genes (RPG), which are scattered throughout the genomes of all eukaryotes, are subjected to coordinated expression. In yeast, the expression of RPGs is highly regulated, mainly at the transcriptional level. Recent research has found that many ribosomal proteins (RPs) function in multiple processes in addition to protein synthesis. Therefore, detailed knowledge of promoter architecture as well as gene regulation is important in understanding the multiple cellular processes mediated by RPGs. In this study, we investigated the functional architecture of the yeast RPS3 promoter and identified many putative cis-elements. Using beta-galactosidase reporter analysis and EMSA, the core promoter of RPS3 containing UASrpg and T-rich regions was corroborated. Moreover, the promoter occupancy of RPS3 by three transcription factors was confirmed. Taken together, our results further the current understanding of the promoter architecture and trans-elements of the Saccharomyces cerevisiae RPS3 gene.

  11. Multiscale Systems Analysis of Root Growth and Development: Modeling Beyond the Network and Cellular Scales

    PubMed Central

    Band, Leah R.; Fozard, John A.; Godin, Christophe; Jensen, Oliver E.; Pridmore, Tony; Bennett, Malcolm J.; King, John R.

    2012-01-01

    Over recent decades, we have gained detailed knowledge of many processes involved in root growth and development. However, with this knowledge come increasing complexity and an increasing need for mechanistic modeling to understand how those individual processes interact. One major challenge is in relating genotypes to phenotypes, requiring us to move beyond the network and cellular scales, to use multiscale modeling to predict emergent dynamics at the tissue and organ levels. In this review, we highlight recent developments in multiscale modeling, illustrating how these are generating new mechanistic insights into the regulation of root growth and development. We consider how these models are motivating new biological data analysis and explore directions for future research. This modeling progress will be crucial as we move from a qualitative to an increasingly quantitative understanding of root biology, generating predictive tools that accelerate the development of improved crop varieties. PMID:23110897

  12. 2016 Cost of Wind Energy Review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stehly, Tyler J.; Heimiller, Donna M.; Scott, George N.

    This report uses representative utility-scale projects to estimate the levelized cost of energy (LCOE) for land-based and offshore wind power plants in the United States. Data and results detailed here are derived from 2016 commissioned plants. More specifically, analysis detailed here relies on recent market data and state-of-the-art modeling capabilities to maintain an up-to-date understanding of wind energy cost trends and drivers. This report is intended to provide insight into current component-level costs as well as a basis for understanding variability in LCOE across the country. This publication represents the sixth installment of this annual report.
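    The levelized cost of energy at the center of this report can be illustrated with a minimal sketch using the simple fixed-charge-rate form of LCOE; the formula is a standard simplification, and every number below is a hypothetical illustration, not a figure from the review.

```python
# Minimal LCOE sketch (simple fixed-charge-rate form). All inputs are
# hypothetical illustrations, not values from this report.

def lcoe(capex, fixed_charge_rate, annual_opex, annual_energy_mwh):
    """LCOE in $/MWh: annualized capital cost plus annual operating
    cost, divided by annual energy production."""
    return (capex * fixed_charge_rate + annual_opex) / annual_energy_mwh

# Hypothetical 100-MW land-based plant:
capex = 1_600 * 100_000        # $1,600/kW * 100,000 kW
annual_opex = 50 * 100_000     # $50/kW-yr
aep = 100 * 8760 * 0.4         # MWh/yr at a 40% net capacity factor

print(round(lcoe(capex, 0.08, annual_opex, aep), 2))  # $/MWh
```

    Component-level cost differences (turbine, balance of system, financing) enter through the capex and fixed-charge-rate terms, which is why modest changes in financing assumptions move LCOE noticeably.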

  13. 2015 Cost of Wind Energy Review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moné, Christopher; Hand, Maureen; Bolinger, Mark

    This report uses representative utility-scale projects to estimate the levelized cost of energy (LCOE) for land-based and offshore wind plants in the United States. Data and results detailed here are derived from 2015 commissioned plants. More specifically, analysis detailed here relies on recent market data and state-of-the-art modeling capabilities to maintain an up-to-date understanding of wind energy cost trends and drivers. It is intended to provide insight into current component-level costs as well as a basis for understanding variability in LCOE across the industry. This publication reflects the fifth installment of this annual report.

  14. Understanding International Partnerships: A Theoretical and Practical Approach

    ERIC Educational Resources Information Center

    Taylor, John

    2016-01-01

    Internationalisation is now a key strategic priority for many universities. As part of this process, universities are increasingly looking to build a number of key strategic partnerships with a small number of like-minded institutions. This paper, based on a detailed study of three such partnerships, seeks to understand and theorise the process by…

  15. Detailed Physiologic Characterization Reveals Diverse Mechanisms for Novel Genetic Loci Regulating Glucose and Insulin Metabolism in Humans

    PubMed Central

    Ingelsson, Erik; Langenberg, Claudia; Hivert, Marie-France; Prokopenko, Inga; Lyssenko, Valeriya; Dupuis, Josée; Mägi, Reedik; Sharp, Stephen; Jackson, Anne U.; Assimes, Themistocles L.; Shrader, Peter; Knowles, Joshua W.; Zethelius, Björn; Abbasi, Fahim A.; Bergman, Richard N.; Bergmann, Antje; Berne, Christian; Boehnke, Michael; Bonnycastle, Lori L.; Bornstein, Stefan R.; Buchanan, Thomas A.; Bumpstead, Suzannah J.; Böttcher, Yvonne; Chines, Peter; Collins, Francis S.; Cooper, Cyrus C.; Dennison, Elaine M.; Erdos, Michael R.; Ferrannini, Ele; Fox, Caroline S.; Graessler, Jürgen; Hao, Ke; Isomaa, Bo; Jameson, Karen A.; Kovacs, Peter; Kuusisto, Johanna; Laakso, Markku; Ladenvall, Claes; Mohlke, Karen L.; Morken, Mario A.; Narisu, Narisu; Nathan, David M.; Pascoe, Laura; Payne, Felicity; Petrie, John R.; Sayer, Avan A.; Schwarz, Peter E. H.; Scott, Laura J.; Stringham, Heather M.; Stumvoll, Michael; Swift, Amy J.; Syvänen, Ann-Christine; Tuomi, Tiinamaija; Tuomilehto, Jaakko; Tönjes, Anke; Valle, Timo T.; Williams, Gordon H.; Lind, Lars; Barroso, Inês; Quertermous, Thomas; Walker, Mark; Wareham, Nicholas J.; Meigs, James B.; McCarthy, Mark I.; Groop, Leif; Watanabe, Richard M.; Florez, Jose C.

    2010-01-01

    OBJECTIVE Recent genome-wide association studies have revealed loci associated with glucose and insulin-related traits. We aimed to characterize 19 such loci using detailed measures of insulin processing, secretion, and sensitivity to help elucidate their role in regulation of glucose control, insulin secretion and/or action. RESEARCH DESIGN AND METHODS We investigated associations of loci identified by the Meta-Analyses of Glucose and Insulin-related traits Consortium (MAGIC) with circulating proinsulin, measures of insulin secretion and sensitivity from oral glucose tolerance tests (OGTTs), euglycemic clamps, insulin suppression tests, or frequently sampled intravenous glucose tolerance tests in nondiabetic humans (n = 29,084). RESULTS The glucose-raising allele in MADD was associated with abnormal insulin processing (a dramatic effect on higher proinsulin levels, but no association with insulinogenic index) at extremely persuasive levels of statistical significance (P = 2.1 × 10−71). Defects in insulin processing and insulin secretion were seen in glucose-raising allele carriers at TCF7L2, SLC30A8, GIPR, and C2CD4B. Abnormalities in early insulin secretion were suggested in glucose-raising allele carriers at MTNR1B, GCK, FADS1, DGKB, and PROX1 (lower insulinogenic index; no association with proinsulin or insulin sensitivity). Two loci previously associated with fasting insulin (GCKR and IGF1) were associated with OGTT-derived insulin sensitivity indices in a consistent direction. CONCLUSIONS Genetic loci identified through their effect on hyperglycemia and/or hyperinsulinemia demonstrate considerable heterogeneity in associations with measures of insulin processing, secretion, and sensitivity. Our findings emphasize the importance of detailed physiological characterization of such loci for improved understanding of pathways associated with alterations in glucose homeostasis and eventually type 2 diabetes. PMID:20185807

  16. Modeling greenhouse gas emissions from dairy farms.

    PubMed

    Rotz, C Alan

    2017-11-15

    Dairy farms have been identified as an important source of greenhouse gas emissions. Within the farm, important emissions include enteric CH4 from the animals, CH4 and N2O from manure in housing facilities during long-term storage and during field application, and N2O from nitrification and denitrification processes in the soil used to produce feed crops and pasture. Models using a wide range in level of detail have been developed to represent or predict these emissions. They include constant emission factors, variable process-related emission factors, empirical or statistical models, mechanistic process simulations, and life cycle assessment. To fully represent farm emissions, models representing the various emission sources must be integrated to capture the combined effects and interactions of all important components. Farm models have been developed using relationships across the full scale of detail, from constant emission factors to detailed mechanistic simulations. Simpler models, based upon emission factors and empirical relationships, tend to provide better tools for decision support, whereas more complex farm simulations provide better tools for research and education. To look beyond the farm boundaries, life cycle assessment provides an environmental accounting tool for quantifying and evaluating emissions over the full cycle, from producing the resources used on the farm through processing, distribution, consumption, and waste handling of the milk and dairy products produced. Models are useful for improving our understanding of farm processes and their interacting effects on greenhouse gas emissions. Through better understanding, they assist in the development and evaluation of mitigation strategies for reducing emissions and improving overall sustainability of dairy farms. The Authors. Published by the Federation of Animal Science Societies and Elsevier Inc. on behalf of the American Dairy Science Association®.
This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/3.0/).
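    The contrast drawn above between constant emission factors and process-related factors can be sketched for enteric CH4; the structure follows the general form of the IPCC Tier 1/Tier 2 approaches, but the function names and all numeric values here are illustrative assumptions, not figures from the article.

```python
# Hedged sketch: constant vs. process-related emission factors for
# enteric CH4. All numbers are illustrative placeholders only.

def tier1_ch4(n_cows, ef_kg_per_head_yr=128.0):
    """Constant emission factor: herd size times a fixed kg CH4/head/yr."""
    return n_cows * ef_kg_per_head_yr

def tier2_ch4(n_cows, gross_energy_mj_day, ym_percent=6.5):
    """Process-related factor: CH4 scales with gross energy intake;
    55.65 MJ/kg is the energy content of methane."""
    per_head = gross_energy_mj_day * (ym_percent / 100.0) * 365 / 55.65
    return n_cows * per_head

herd = 100
fixed_estimate = tier1_ch4(herd)        # kg CH4/yr, diet-independent
diet_estimate = tier2_ch4(herd, 300.0)  # kg CH4/yr, responds to feeding
```

    The process-related form responds to management changes (feed intake, diet quality), which mirrors the abstract's point that simple factors suit decision support while more mechanistic simulations suit research.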

  17. Students' Understanding of Quadratic Equations

    ERIC Educational Resources Information Center

    López, Jonathan; Robles, Izraim; Martínez-Planell, Rafael

    2016-01-01

    Action-Process-Object-Schema theory (APOS) was applied to study student understanding of quadratic equations in one variable. This required proposing a detailed conjecture (called a genetic decomposition) of mental constructions students may do to understand quadratic equations. The genetic decomposition which was proposed can contribute to help…

  18. The General Aggression Model.

    PubMed

    Allen, Johnie J; Anderson, Craig A; Bushman, Brad J

    2018-02-01

    The General Aggression Model (GAM) is a comprehensive, integrative framework for understanding aggression. It considers the role of social, cognitive, personality, developmental, and biological factors on aggression. Proximate processes of GAM detail how person and situation factors influence cognitions, feelings, and arousal, which in turn affect appraisal and decision processes, which in turn influence aggressive or nonaggressive behavioral outcomes. Each cycle of the proximate processes serves as a learning trial that affects the development and accessibility of aggressive knowledge structures. Distal processes of GAM detail how biological and persistent environmental factors can influence personality through changes in knowledge structures. GAM has been applied to understand aggression in many contexts, including media violence effects, domestic violence, intergroup violence, temperature effects, pain effects, and the effects of global climate change. Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. Applications of systems biology towards microbial fuel production.

    PubMed

    Gowen, Christopher M; Fong, Stephen S

    2011-10-01

    Harnessing the immense natural diversity of biological functions for economical production of fuel has enormous potential benefits. Inevitably, however, the native capabilities of any given organism must be modified to increase the productivity or efficiency of a biofuel bioprocess. From a broad perspective, the challenge is to understand the details of cellular functionality well enough to prospectively predict and modify the cellular function of a microorganism. Recent advances in experimental and computational systems biology approaches can be used to better understand cellular-level function and guide future experiments. With pressure to quickly develop viable, renewable biofuel processes, a balance must be maintained between obtaining depth of biological knowledge and applying that knowledge. Copyright © 2011 Elsevier Ltd. All rights reserved.

  20. Hydrogen Exchange and Mass Spectrometry: A Historical Perspective

    PubMed Central

    Englander, S. Walter

    2012-01-01

    Protein molecules naturally emit streams of information-rich signals in the language of hydrogen exchange concerning the intimate details of their stability, dynamics, function, changes therein, and effects thereon, all resolved to the level of their individual amino acids. The effort to measure protein hydrogen exchange behavior, understand the underlying chemistry and structural physics of hydrogen exchange processes, and use this information to learn about protein properties and function has continued for 50 years. Recent work uses mass spectrometric analysis together with an earlier proteolytic fragmentation method to extend the hydrogen exchange capability to large biologically interesting proteins. This article briefly reviews the advances that have led us to this point and the understanding that has so far been achieved. PMID:16876429

  1. Total Mercury, Methylmercury, Methylmercury Production Potential, and Ancillary Streambed-Sediment and Pore-Water Data for Selected Streams in Oregon, Wisconsin, and Florida, 2003-04

    USGS Publications Warehouse

    Marvin-DiPasquale, Mark C.; Lutz, Michelle A.; Krabbenhoft, David P.; Aiken, George R.; Orem, William H.; Hall, Britt D.; DeWild, John F.; Brigham, Mark E.

    2008-01-01

    Mercury contamination of aquatic ecosystems is an issue of national concern, affecting both wildlife and human health. Detailed information on mercury cycling and food-web bioaccumulation in stream settings and the factors that control these processes is currently limited. In response, the U.S. Geological Survey (USGS) National Water-Quality Assessment Program (NAWQA) conducted detailed studies from 2002 to 2006 on various media to enhance process-level understanding of mercury contamination, biogeochemical cycling, and trophic transfer. Eight streams were sampled for this study: two streams in Oregon, and three streams each in Wisconsin and Florida. Streambed-sediment and pore-water samples were collected between February 2003 and September 2004. This report summarizes the suite of geochemical and microbial constituents measured, the analytical methods used, and provides the raw data in electronic form for both bed-sediment and pore-water media associated with this study.

  2. Multiscale simulations of defect dipole-enhanced electromechanical coupling at dilute defect concentrations

    NASA Astrophysics Data System (ADS)

    Liu, Shi; Cohen, R. E.

    2017-08-01

    The role of defects in solids with mixed ionic-covalent bonds, such as ferroelectric oxides, is complex. Current understanding of how defects affect ferroelectric properties at the single-defect level remains mostly empirical, and the detailed atomistic mechanisms for many defect-mediated polarization-switching processes have not been convincingly revealed quantum mechanically. We simulate the polarization-electric field (P-E) and strain-electric field (ɛ-E) hysteresis loops for BaTiO3 in the presence of generic defect dipoles with large-scale molecular dynamics and provide a detailed atomistic picture of the defect dipole-enhanced electromechanical coupling. We develop a general first-principles-based atomistic model, enabling a quantitative understanding of the relationship between macroscopic ferroelectric properties and dipolar impurities of different orientations, concentrations, and dipole moments. We find that the collective orientation of dipolar defects relative to the external field is the key microscopic structural feature that strongly affects materials hardening/softening and electromechanical coupling. We show that a small concentration (≈0.1 at. %) of defect dipoles dramatically improves electromechanical responses. This offers the opportunity to improve the performance of inexpensive polycrystalline ferroelectric ceramics through defect dipole engineering for a range of applications, including piezoelectric sensors, actuators, and transducers.

  3. Confluence and Contours: Reflexive Management of Environmental Risk.

    PubMed

    Soane, Emma; Schubert, Iljana; Pollard, Simon; Rocks, Sophie; Black, Edgar

    2016-06-01

    Government institutions have responsibilities to distribute risk management funds meaningfully and to be accountable for their choices. We took a macro-level sociological approach to understanding the role of government in managing environmental risks, and insights from micro-level psychology to examine individual-level risk-related perceptions and beliefs. Survey data from 2,068 U.K. citizens showed that lay people's funding preferences were associated positively with beliefs about responsibility and trust, yet associations with perception varied depending on risk type. Moreover, there were risk-specific differences in the funding preferences of the lay sample and 29 policymakers. A laboratory-based study of 109 participants examined funding allocation in more detail through iterative presentation of expert information. Quantitative and qualitative data revealed a meso-level framework comprising three types of decisionmakers who varied in their willingness to change funding allocation preferences following expert information: adaptors, responders, and resistors. This research highlights the relevance of integrated theoretical approaches to understanding the policy process, and the benefits of reflexive dialogue to managing environmental risks. © 2015 Society for Risk Analysis.

  4. Understanding pictorial information in biology: students' cognitive activities and visual reading strategies

    NASA Astrophysics Data System (ADS)

    Brandstetter, Miriam; Sandmann, Angela; Florian, Christine

    2017-06-01

    In the classroom, scientific content is increasingly communicated through visual forms of representation. Students' learning outcomes rely on their ability to read and understand pictorial information. Understanding pictorial information in biology requires cognitive effort and can be challenging for students. Yet evidence-based knowledge about students' visual reading strategies during the process of understanding pictorial information is still lacking. Therefore, 42 students aged 14-15 were asked to think aloud while trying to understand visual representations of the blood circulatory system and the patellar reflex. A category system was developed differentiating 16 categories of cognitive activities. A Principal Component Analysis revealed two underlying patterns of activities that can be interpreted as visual reading strategies: 1. inferences predominated by use of a problem-solving schema; 2. inferences predominated by recall of prior content knowledge. Each pattern consists of a specific set of cognitive activities that reflect selection, organisation and integration of pictorial information, as well as different levels of expertise. The results give detailed insight into the cognitive activities of students who were required to understand the pictorial information of complex organ systems. They provide an evidence-based foundation for deriving instructional aids that can promote students' pictorial-information-based learning at different levels of expertise.

  5. A robust automated system elucidates mouse home cage behavioral structure

    PubMed Central

    Goulding, Evan H.; Schenk, A. Katrin; Juneja, Punita; MacKay, Adrienne W.; Wade, Jennifer M.; Tecott, Laurence H.

    2008-01-01

    Patterns of behavior exhibited by mice in their home cages reflect the function and interaction of numerous behavioral and physiological systems. Detailed assessment of these patterns thus has the potential to provide a powerful tool for understanding basic aspects of behavioral regulation and their perturbation by disease processes. However, the capacity to identify and examine these patterns in terms of their discrete levels of organization across diverse behaviors has been difficult to achieve and automate. Here, we describe an automated approach for the quantitative characterization of fundamental behavioral elements and their patterns in the freely behaving mouse. We demonstrate the utility of this approach by identifying unique features of home cage behavioral structure and changes in distinct levels of behavioral organization in mice with single gene mutations altering energy balance. The robust, automated, reproducible quantification of mouse home cage behavioral structure detailed here should have wide applicability for the study of mammalian physiology, behavior, and disease. PMID:19106295

  6. Chromatin Challenges during DNA Replication: A Systems Representation

    PubMed Central

    Aladjem, Mirit I.; Weinstein, John N.; Pommier, Yves

    2008-01-01

    In a recent review, A. Groth and coworkers presented a comprehensive account of nucleosome disassembly in front of a DNA replication fork, assembly behind the replication fork, and the copying of epigenetic information onto the replicated chromatin. Understanding those processes, however, would be enhanced by a comprehensive graphical depiction analogous to a circuit diagram. Accordingly, we have constructed a molecular interaction map (MIM) that preserves in essentially complete detail the processes described by Groth et al. The MIM organizes and elucidates the information presented by Groth et al. on the complexities of chromatin replication, thereby providing a tool for system-level comprehension of the effects of genetic mutations, altered gene expression, and pharmacologic intervention. PMID:17959828

  7. The role of remote sensing in process‐scaling studies of managed forest ecosystems

    Treesearch

    Jeffrey G. Masek; Daniel J. Hayes; M. Joseph Hughes; Sean P. Healey; David P. Turner

    2015-01-01

    Sustaining forest resources requires a better understanding of forest ecosystem processes, and how management decisions and climate change may affect these processes in the future. While plot and inventory data provide our most detailed information on forest carbon, energy, and water cycling, applying this understanding to broader spatial and temporal domains...

  8. Detailed numerical simulations of laser cooling processes

    NASA Technical Reports Server (NTRS)

    Ramirez-Serrano, J.; Kohel, J.; Thompson, R.; Yu, N.

    2001-01-01

    We developed a detailed semiclassical numerical code modeling the forces applied to atoms in optical and magnetic fields, to increase the understanding of the different roles that light, atomic collisions, background pressure, and number of particles play in experiments with laser cooled and trapped atoms.

  9. Considerations for Reporting Finite Element Analysis Studies in Biomechanics

    PubMed Central

    Erdemir, Ahmet; Guess, Trent M.; Halloran, Jason; Tadepalli, Srinivas C.; Morrison, Tina M.

    2012-01-01

    Simulation-based medicine and the development of complex computer models of biological structures is becoming ubiquitous for advancing biomedical engineering and clinical research. Finite element analysis (FEA) has been widely used in the last few decades to understand and predict biomechanical phenomena. Modeling and simulation approaches in biomechanics are highly interdisciplinary, involving novice and skilled developers in all areas of biomedical engineering and biology. While recent advances in model development and simulation platforms offer a wide range of tools to investigators, the decision making process during modeling and simulation has become more opaque. Hence, reliability of such models used for medical decision making and for driving multiscale analysis comes into question. Establishing guidelines for model development and dissemination is a daunting task, particularly with the complex and convoluted models used in FEA. Nonetheless, if better reporting can be established, researchers will have a better understanding of a model’s value and the potential for reusability through sharing will be bolstered. Thus, the goal of this document is to identify resources and considerate reporting parameters for FEA studies in biomechanics. These entail various levels of reporting parameters for model identification, model structure, simulation structure, verification, validation, and availability. While we recognize that it may not be possible to provide and detail all of the reporting considerations presented, it is possible to establish a level of confidence with selective use of these parameters. More detailed reporting, however, can establish an explicit outline of the decision-making process in simulation-based analysis for enhanced reproducibility, reusability, and sharing. PMID:22236526

  10. Formation of homophily in academic performance: Students change their friends rather than performance

    PubMed Central

    Smirnov, Ivan

    2017-01-01

    Homophily, the tendency of individuals to associate with others who share similar traits, has been identified as a major driving force in the formation and evolution of social ties. In many cases, it is not clear if homophily is the result of a socialization process, where individuals change their traits according to the dominance of that trait in their local social networks, or if it results from a selection process, in which individuals reshape their social networks so that their traits match those in the new environment. Here we demonstrate the detailed temporal formation of strong homophily in academic achievements of high school and university students. We analyze a unique dataset that contains information about the detailed time evolution of a friendship network of 6,000 students across 42 months. Combining the evolving social network data with the time series of the academic performance (GPA) of individual students, we show that academic homophily is a result of selection: students prefer to gradually reorganize their social networks according to their performance levels, rather than adapting their performance to the level of their local group. We find no signs for a pull effect, where a social environment of good performers motivates bad students to improve their performance. We are able to understand the underlying dynamics of grades and networks with a simple model. The lack of a social pull effect in classical educational settings could have important implications for the understanding of the observed persistence of segregation, inequality and social immobility in societies. PMID:28854202
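    The selection mechanism the study identifies (students rewiring ties toward similar peers while performance stays fixed) can be demonstrated with a toy simulation; this is an illustrative sketch, not the authors' model, and every parameter below is an arbitrary assumption.

```python
# Toy selection model: agents keep a fixed GPA and rewire friendships
# toward similar peers. Homophily emerges without anyone changing
# performance. All parameters are arbitrary illustrations.
import random

random.seed(0)
n = 200
gpa = [random.uniform(2.0, 4.0) for _ in range(n)]
friends = {i: set(random.sample([j for j in range(n) if j != i], 5))
           for i in range(n)}

def homophily_gap(friends, gpa):
    """Mean absolute GPA difference across ties (lower = more homophily)."""
    gaps = [abs(gpa[i] - gpa[j]) for i in friends for j in friends[i]]
    return sum(gaps) / len(gaps)

before = homophily_gap(friends, gpa)
for _ in range(20):
    for i in range(n):
        # drop the most dissimilar friend if a random candidate is closer
        worst = max(friends[i], key=lambda j: abs(gpa[i] - gpa[j]))
        cand = random.randrange(n)
        if cand != i and cand not in friends[i] and \
                abs(gpa[i] - gpa[cand]) < abs(gpa[i] - gpa[worst]):
            friends[i].discard(worst)
            friends[i].add(cand)
after = homophily_gap(friends, gpa)
print(before, after)  # the gap shrinks: selection alone yields homophily
```

    Because every accepted rewire strictly reduces a tie gap and no GPA ever changes, the average gap falls over time, matching the paper's point that selection alone can generate academic homophily.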

  11. Microscopic Structure and Solubility Predictions of Multifunctional Solids in Supercritical Carbon Dioxide: A Molecular Simulation Study.

    PubMed

    Noroozi, Javad; Paluch, Andrew S

    2017-02-23

    Molecular dynamics simulations were employed to both estimate the solubility of nonelectrolyte solids, such as acetanilide, acetaminophen, phenacetin, methylparaben, and lidocaine, in supercritical carbon dioxide and understand the underlying molecular-level driving forces. The solubility calculations involve the estimation of the solute's limiting activity coefficient, which may be computed using conventional staged free-energy calculations. For the case of lidocaine, wherein the infinite dilution approximation is not appropriate, we demonstrate how the activity coefficient at finite concentrations may be estimated without additional effort using the dilute solution approximation and how this may be used to further understand the solvation process. Combining with experimental pure-solid properties, namely, the normal melting point and enthalpy of fusion, solubilities were estimated. The results are in good quantitative agreement with available experimental data, suggesting that molecular simulations may be a powerful tool for understanding supercritical processes and the design of carbon dioxide-philic molecular systems. Structural analyses were performed to shed light on the microscopic details of the solvation of different functional groups by carbon dioxide and the observed solubility trends.
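    The final step the abstract describes, combining a computed activity coefficient with the experimental melting point and enthalpy of fusion, commonly uses the classic ideal-solubility relation; the sketch below assumes that form, and the input values are hypothetical rather than taken from the study.

```python
# Solubility from melting properties plus an activity coefficient,
# via the standard ideal-solubility relation:
#   ln x_ideal = -(dHfus / R) * (1/T - 1/Tm),   x = x_ideal / gamma
# All input values are hypothetical illustrations.
import math

R = 8.314  # J/(mol K)

def mole_fraction_solubility(T, Tm, dHfus, gamma):
    """Ideal solubility from fusion properties, corrected by the
    solute's activity coefficient in the solvent."""
    ln_x_ideal = -(dHfus / R) * (1.0 / T - 1.0 / Tm)
    return math.exp(ln_x_ideal) / gamma

# Hypothetical solute: Tm = 443 K, dHfus = 27.1 kJ/mol, and a
# simulated activity coefficient gamma = 50 in scCO2 at 318 K
x = mole_fraction_solubility(T=318.0, Tm=443.0, dHfus=27100.0, gamma=50.0)
print(f"{x:.2e}")  # mole-fraction solubility
```

    The split makes the division of labor explicit: simulation supplies gamma (the molecular-level driving forces), while the pure-solid terms come from experiment.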

  12. CERES Product Level Details

    Atmospheric Science Data Center

    2013-02-28

    Level 1B: Data products are processed to sensor units. The BDS product contains CERES ... position and velocity, and all raw engineering and status data from the instrument.
    Level 2: Data products are derived ... between average global net TOA flux imbalance and ocean heat storage ...

  13. Making ecological models adequate

    USGS Publications Warehouse

    Getz, Wayne M.; Marshall, Charles R.; Carlson, Colin J.; Giuggioli, Luca; Ryan, Sadie J.; Romañach, Stephanie; Boettiger, Carl; Chamberlain, Samuel D.; Larsen, Laurel; D'Odorico, Paolo; O'Sullivan, David

    2018-01-01

    Critical evaluation of the adequacy of ecological models is urgently needed to enhance their utility in developing theory and enabling environmental managers and policymakers to make informed decisions. Poorly supported management can have detrimental, costly or irreversible impacts on the environment and society. Here, we examine common issues in ecological modelling and suggest criteria for improving modelling frameworks. An appropriate level of process description is crucial to constructing the best possible model, given the available data and understanding of ecological structures. Model details unsupported by data typically lead to over-parameterisation and poor model performance. Conversely, a lack of mechanistic detail may limit a model's ability to predict ecological systems' responses to management. Ecological studies that employ models should follow a set of model adequacy assessment protocols that include asking a series of critical questions regarding state and control variable selection, the determinacy of data, and the sensitivity and validity of analyses. We also need to improve model elaboration, refinement and coarse-graining procedures to better understand the relevance and adequacy of our models and the role they play in advancing theory, improving hindcasting and forecasting, and enabling problem solving and management.

  14. Microscopic theory of cavity-enhanced single-photon emission from optical two-photon Raman processes

    NASA Astrophysics Data System (ADS)

    Breddermann, Dominik; Praschan, Tom; Heinze, Dirk; Binder, Rolf; Schumacher, Stefan

    2018-03-01

    We consider cavity-enhanced single-photon generation from stimulated two-photon Raman processes in three-level systems. We compare four fundamental system configurations: one Λ-, one V-, and two ladder (Ξ-) configurations. These can be realized as subsystems of a single quantum dot or of quantum-dot molecules. For a new microscopic understanding of the Raman process, we analyze the Heisenberg equation of motion applying the cluster-expansion scheme. Within this formalism, an exact and rigorous definition of a cavity-enhanced Raman photon via its corresponding Raman correlation is possible. This definition, for example, enables us to systematically investigate the on-demand potential of Raman-transition-based single-photon sources. The four system arrangements can be divided into two subclasses, Λ-type and V-type, which exhibit strongly different Raman-emission characteristics and Raman-emission probabilities. Moreover, our approach reveals whether the Raman path generates a single photon or merely induces destructive quantum interference with other excitation paths. Based on our findings, and as a first application, we gain a more detailed understanding of experimental data from the literature. Our analysis and results are also transferable to atomic three-level-resonator systems and can be extended to more complicated multilevel schemes.

  15. Enduring Understandings, Artistic Processes, and the New Visual Arts Standards: A Close-up Consideration for Curriculum Planning

    ERIC Educational Resources Information Center

    Stewart, Marilyn G.

    2014-01-01

    National Coalition for Core Arts Standards (NCCAS) Writing Team member Marilyn G. Stewart discusses what to expect from the new "next generation" Visual Arts Standards, detailing the 4 Artistic Processes and 15 Enduring Understandings. This invited essay addresses the instructional aspects of the standards, and looks at how they can help…

  16. WE-DE-202-02: Are Track Structure Simulations Truly Needed for Radiobiology at the Cellular and Tissue Levels?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stewart, R.

    Radiation therapy for the treatment of cancer has been established as a highly precise and effective way to eradicate a localized region of diseased tissue. To achieve further significant gains in the therapeutic ratio, we need to move towards biologically optimized treatment planning. To achieve this goal, we need to understand how the radiation-type dependent patterns of induced energy depositions within the cell (physics) connect via molecular, cellular and tissue reactions to treatment outcomes such as tumor control and undesirable effects on normal tissue. Several computational biology approaches have been developed connecting physics to biology. Monte Carlo simulations are the most accurate method to calculate physical dose distributions at the nanometer scale; however, simulations at the DNA scale are slow, and repair processes are generally not simulated. Alternative models that rely on the random formation of individual DNA lesions within one or two turns of the DNA have been shown to reproduce the clusters of DNA lesions, including single strand breaks (SSBs) and double strand breaks (DSBs), without the need for detailed track structure simulations. Efficient computational simulation of initial DNA damage induction facilitates computational modeling of DNA repair and other molecular and cellular processes. Mechanistic, multiscale models provide a useful conceptual framework to test biological hypotheses and help connect fundamental information about track structure and dosimetry at the sub-cellular level to dose-response effects on larger scales. In this symposium we will learn about the current state of the art of computational approaches for estimating radiation damage at the cellular and sub-cellular scale. How can understanding of physics interactions at the DNA level be used to predict biological outcomes?
    We will discuss if and how such calculations are relevant to advancing our understanding of radiation damage and its repair, or whether the underlying biological processes are too complex for a mechanistic approach. Can computer simulations be used to guide future biological research? We will debate the feasibility of explaining biology from a physicist's perspective. Learning Objectives: (1) Understand the potential applications and limitations of computational methods for dose-response modeling at the molecular, cellular and tissue levels. (2) Learn about the mechanisms of action underlying the induction, repair and biological processing of damage to DNA and other cellular constituents. (3) Understand how effects and processes at one biological scale impact biological processes and outcomes at other scales. Funding: J. Schuemann, NCI/NIH grants; S. McMahon, European Commission FP7 (grant EC FP7 MC-IOF-623630).

  17. A Multi-scale Computational Platform to Mechanistically Assess the Effect of Genetic Variation on Drug Responses in Human Erythrocyte Metabolism

    PubMed Central

    Bordbar, Aarash; Palsson, Bernhard O.

    2016-01-01

    Progress in systems medicine holds promise for addressing patient heterogeneity and enabling individualized therapies. Recently, genome-scale models of metabolism have been shown to provide insight into the mechanistic link between drug therapies and systems-level off-target effects while being expanded to explicitly include the three-dimensional structure of proteins. The integration of these molecular-level details, such as the physical, structural, and dynamical properties of proteins, notably expands the computational description of biochemical network-level properties and the possibility of understanding and predicting whole-cell phenotypes. In this study, we present a multi-scale modeling framework that describes biological processes ranging in scale from atomistic details to an entire metabolic network. Using this approach, we can understand how genetic variation, which impacts the structure and reactivity of a protein, influences both native and drug-induced metabolic states. As a proof of concept, we study three enzymes (catechol-O-methyltransferase, glucose-6-phosphate dehydrogenase, and glyceraldehyde-3-phosphate dehydrogenase) and their respective genetic variants which have clinically relevant associations. All-atom molecular dynamics simulations enable the sampling of long-timescale conformational dynamics of the proteins (and their mutant variants) in complex with their respective native metabolites or drug molecules. We find that changes in a protein’s structure due to a mutation influence its binding affinity to metabolites and/or drug molecules, and can inflict large-scale changes in metabolism. PMID:27467583

  18. A Multi-scale Computational Platform to Mechanistically Assess the Effect of Genetic Variation on Drug Responses in Human Erythrocyte Metabolism.

    PubMed

    Mih, Nathan; Brunk, Elizabeth; Bordbar, Aarash; Palsson, Bernhard O

    2016-07-01

    Progress in systems medicine holds promise for addressing patient heterogeneity and enabling individualized therapies. Recently, genome-scale models of metabolism have been shown to provide insight into the mechanistic link between drug therapies and systems-level off-target effects while being expanded to explicitly include the three-dimensional structure of proteins. The integration of these molecular-level details, such as the physical, structural, and dynamical properties of proteins, notably expands the computational description of biochemical network-level properties and the possibility of understanding and predicting whole-cell phenotypes. In this study, we present a multi-scale modeling framework that describes biological processes ranging in scale from atomistic details to an entire metabolic network. Using this approach, we can understand how genetic variation, which impacts the structure and reactivity of a protein, influences both native and drug-induced metabolic states. As a proof of concept, we study three enzymes (catechol-O-methyltransferase, glucose-6-phosphate dehydrogenase, and glyceraldehyde-3-phosphate dehydrogenase) and their respective genetic variants which have clinically relevant associations. All-atom molecular dynamics simulations enable the sampling of long-timescale conformational dynamics of the proteins (and their mutant variants) in complex with their respective native metabolites or drug molecules. We find that changes in a protein's structure due to a mutation influence its binding affinity to metabolites and/or drug molecules, and can inflict large-scale changes in metabolism.

  19. Better models are more effectively connected models

    NASA Astrophysics Data System (ADS)

    Nunes, João Pedro; Bielders, Charles; Darboux, Frederic; Fiener, Peter; Finger, David; Turnbull-Lloyd, Laura; Wainwright, John

    2016-04-01

    The concept of hydrologic and geomorphologic connectivity describes the processes and pathways which link sources (e.g. rainfall, snow and ice melt, springs, eroded areas and barren lands) to accumulation areas (e.g. foot slopes, streams, aquifers, reservoirs), and the spatial variations thereof. There are many examples of hydrological and sediment connectivity on a watershed scale; in consequence, a process-based understanding of connectivity is crucial to help managers understand their systems and adopt adequate measures for flood prevention, pollution mitigation and soil protection, among others. Modelling is often used as a tool to understand and predict fluxes within a catchment by complementing observations with model results. Catchment models should therefore be able to reproduce the linkages, and thus the connectivity of water and sediment fluxes within the systems under simulation. In modelling, a high level of spatial and temporal detail is desirable to ensure taking into account a maximum number of components, which then enables connectivity to emerge from the simulated structures and functions. However, computational constraints and, in many cases, lack of data prevent the representation of all relevant processes and spatial/temporal variability in most models. In most cases, therefore, the level of detail selected for modelling is too coarse to represent the system in a way in which connectivity can emerge; a problem which can be circumvented by representing fine-scale structures and processes within coarser scale models using a variety of approaches. This poster focuses on the results of ongoing discussions on modelling connectivity held during several workshops within COST Action Connecteur. It assesses the current state of the art of incorporating the concept of connectivity in hydrological and sediment models, as well as the attitudes of modellers towards this issue. 
The discussion will focus on the different approaches through which connectivity can be represented in models: either by allowing it to emerge from model behaviour or by parameterizing it inside model structures; and on the appropriate scale at which processes should be represented explicitly or implicitly. It will also explore how modellers themselves approach connectivity through the results of a community survey. Finally, it will present the outline of an international modelling exercise aimed at assessing how different modelling concepts can capture connectivity in real catchments.

  20. Multi-Resolution Indexing for Hierarchical Out-of-Core Traversal of Rectilinear Grids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pascucci, V.

    2000-07-10

    The real-time processing of very large volumetric meshes introduces specific algorithmic challenges due to the impossibility of fitting the input data in the main memory of a computer. The basic assumption of the RAM computational model (uniform, constant-time access to each memory location) is not valid because part of the data is stored out-of-core or in external memory. The performance of most algorithms does not scale well in the transition from in-core to out-of-core processing conditions. The performance degradation is due to the high frequency of I/O operations, which may start dominating the overall running time. Out-of-core computing [28] specifically addresses the issues of algorithm redesign and data layout restructuring to enable data access patterns with minimal performance degradation in out-of-core processing. Results in this area are also valuable in parallel and distributed computing, where one has to deal with the similar issue of balancing processing time with data migration time. The solution of the out-of-core processing problem is typically divided into two parts: (i) analysis of a specific algorithm to understand its data access patterns and, when possible, redesign of the algorithm to maximize their locality; and (ii) storage of the data in secondary memory with a layout consistent with the access patterns of the algorithm, to amortize the cost of each I/O operation over several memory access operations. In the case of hierarchical visualization algorithms for volumetric data, the 3D input hierarchy is traversed to build derived geometric models with adaptive levels of detail. The shape of the output models is then modified dynamically with incremental updates of their level of detail. The parameters that govern this continuous modification of the output geometry depend on runtime user interaction, making it impossible to determine a priori what levels of detail are going to be constructed.
    For example, they can depend on external parameters, such as the viewpoint of the current display window, or on internal parameters, such as the isovalue of an isocontour or the position of an orthogonal slice. The structure of the access pattern can be summarized in two main points: (i) the input hierarchy is traversed level by level, so that data in the same level of resolution, or in adjacent levels, is traversed at the same time; and (ii) within each level of resolution, the data is mostly traversed at the same time in regions that are geometrically close. In this paper I introduce a new static indexing scheme that induces a data layout satisfying both requirements (i) and (ii) for the hierarchical traversal of n-dimensional regular grids. In one particular implementation the scheme exploits, in a new way, the recursive construction of the Z-order space-filling curve. The standard indexing that maps the input nD data onto a 1D sequence along the Z-order curve is based on a simple bit-interleaving operation that merges the n input indices into one index n times longer. This helps in grouping the data for geometric proximity, but only for a specific level of detail. In this paper I show how this indexing can be transformed into an alternative index that allows grouping the data per level of resolution first, and then the data within each level per geometric proximity. This yields a data layout that is appropriate for hierarchical out-of-core processing of large grids.
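    The bit-interleaving and level-first remapping described above can be sketched in Python. The 3D Morton index is the standard construction; the `hierarchical_index` remap is shown for a 1D binary hierarchy as an illustrative simplification of the paper's nD scheme (function names are ours, not from the paper):

```python
def z_order_index(x: int, y: int, z: int, bits: int = 10) -> int:
    """Standard Z-order (Morton) index: interleave the bits of three
    grid coordinates into one index three times as long."""
    idx = 0
    for b in range(bits):
        idx |= ((x >> b) & 1) << (3 * b)      # x supplies bits 0, 3, 6, ...
        idx |= ((y >> b) & 1) << (3 * b + 1)  # y supplies bits 1, 4, 7, ...
        idx |= ((z >> b) & 1) << (3 * b + 2)  # z supplies bits 2, 5, 8, ...
    return idx


def hierarchical_index(z_idx: int, n_bits: int) -> int:
    """Remap an index so that all samples of a coarse resolution level
    precede those of finer levels, while preserving ordering (and thus
    geometric locality) within each level.  Shown for a 1D binary
    hierarchy over indices 0 .. 2**n_bits - 1, where the number of
    trailing zero bits determines the level a sample belongs to."""
    if z_idx == 0:
        return 0                              # the single coarsest sample
    t = (z_idx & -z_idx).bit_length() - 1     # count of trailing zero bits
    level = n_bits - t                        # 1 = coarsest refined level
    rank = z_idx >> (t + 1)                   # rank within that level
    return (1 << (level - 1)) + rank          # level offset + in-level rank
```

    For n_bits = 3 the remap sends the sequence 0, 4, 2, 6, 1, 3, 5, 7 to 0, 1, 2, 3, 4, 5, 6, 7: a permutation that lays the data out on disk level by level, with in-level order preserved.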

  1. High School Teachers' Understanding of Blackbody Radiation

    ERIC Educational Resources Information Center

    Balta, Nuri

    2018-01-01

    This study is a detailed look at the level of understanding of fundamental ideas about blackbody radiation (BBR) among physics teachers. The aim is to explore associations and ideas that teachers have regarding blackbody radiation: a concept used routinely in physics and chemistry, which is necessary to understand fundamentals of quantum physics.…

  2. The Interaction of Global Biochemical Cycles

    NASA Technical Reports Server (NTRS)

    Moore, B., III; Dastoor, M. N.

    1984-01-01

    The global biosphere is an exceedingly complex system. To gain an understanding of its structure and dynamic features, it is necessary not only to increase knowledge about the detailed processes but also to develop models of how global interactions take place. Attempts to analyze the detailed physical, chemical and biological processes in this context need to be guided by an advancing understanding of those global interactions. It is necessary to develop a strategy of data gathering that serves both these purposes simultaneously. The following papers deal in detail with critical aspects of the global cycles of carbon, nitrogen, phosphorus and sulfur, as well as the cycle of water and the flow of energy in the Earth's environment. The objective is to lay part of the foundation for the development of mathematical models that allow exploration of the coupled dynamics of the global cycles of carbon, nitrogen, phosphorus and sulfur, as well as energy and water fluxes.

  3. Current at Metal-Organic Interfaces

    NASA Astrophysics Data System (ADS)

    Kern, Klaus

    2012-02-01

    Charge transport through atomic and molecular constrictions greatly affects the operation and performance of organic electronic devices. Much of our understanding of the charge injection and extraction processes in these systems relies on our knowledge of the electronic structure at the metal-organic interface. Despite significant experimental and theoretical advances in studying charge transport in nanoscale junctions, a microscopic understanding at the single atom/molecule level is missing. In the present talk I will present our recent results on directly probing the nanocontact between single molecules and a metal electrode using scanning probe microscopy and spectroscopy. The experiments provide unprecedented microscopic details of single-molecule and single-atom junctions and open new avenues for studying quantum critical and many-body phenomena at the atomic scale. Implications for energy conversion devices and carbon-based nanoelectronics will also be discussed.

  4. Central role of the cell in microbial ecology.

    PubMed

    Zengler, Karsten

    2009-12-01

    Over the last few decades, advances in cultivation-independent methods have significantly contributed to our understanding of microbial diversity and community composition in the environment. At the same time, cultivation-dependent methods have thrived, and the growing number of organisms obtained thereby have allowed for detailed studies of their physiology and genetics. Still, most microorganisms are recalcitrant to cultivation. This review not only conveys current knowledge about different isolation and cultivation strategies but also discusses what implications can be drawn from pure culture work for studies in microbial ecology. Specifically, in the light of single-cell individuality and genome heterogeneity, it becomes important to evaluate population-wide measurements carefully. An overview of various approaches in microbial ecology is given, and the cell as a central unit for understanding processes on a community level is discussed.

  5. Imaging and Analytical Approaches for Characterization of Soil Mineral Weathering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dohnalkova, Alice; Arey, Bruce; Varga, Tamas

    Soil mineral weathering is the primary natural source of the nutrients necessary to sustain productivity in terrestrial ecosystems. Soil microbial communities increase soil mineral weathering and mineral-derived nutrient availability through physical and chemical processes. The rhizosphere, the zone immediately surrounding plant roots, is a biogeochemical hotspot, with microbial activity, soil organic matter production, mineral weathering, and secondary phase formation all happening in a small, temporally ephemeral zone of steep geochemical gradients. Detailed exploration of the micro-scale rhizosphere is essential to a better understanding of large-scale processes in soils, such as nutrient cycling, transport and fate of soil components, microbial-mineral interactions, soil erosion, soil organic matter turnover and its molecular-level characterization, and predictive modeling.

  6. Nanodiscs in Membrane Biochemistry and Biophysics.

    PubMed

    Denisov, Ilia G; Sligar, Stephen G

    2017-03-22

    Membrane proteins play a most important part in metabolism, signaling, cell motility, transport, development, and many other biochemical and biophysical processes which constitute fundamentals of life on the molecular level. Detailed understanding of these processes is necessary for the progress of life sciences and biomedical applications. Nanodiscs provide a new and powerful tool for a broad spectrum of biochemical and biophysical studies of membrane proteins and are commonly acknowledged as an optimal membrane mimetic system that provides control over size, composition, and specific functional modifications on the nanometer scale. In this review we attempted to combine a comprehensive list of various applications of nanodisc technology with systematic analysis of the most attractive features of this system and advantages provided by nanodiscs for structural and mechanistic studies of membrane proteins.

  7. Mass Spectrometry: A Technique of Many Faces

    PubMed Central

    Olshina, Maya A.; Sharon, Michal

    2016-01-01

    Protein complexes form the critical foundation for a wide range of biological processes; however, understanding the intricate details of their activities is often challenging. In this review we describe how mass spectrometry plays a key role in the analysis of protein assemblies and the cellular pathways in which they are involved. Specifically, we discuss how the versatility of mass spectrometric approaches provides unprecedented information on multiple levels. We demonstrate this on the ubiquitin-proteasome proteolytic pathway, a process that is responsible for protein turnover. We follow the various steps of this degradation route and illustrate the different mass spectrometry workflows that were applied for elucidating molecular information. Overall, this review aims to stimulate the integrated use of multiple mass spectrometry approaches for analyzing complex biological systems. PMID:28100928

  8. Natural selection and the evolution of reproductive effort.

    PubMed

    Hirshfield, M F; Tinkle, D W

    1975-06-01

    Reproductive effort is defined as the proportion of the total energy budget of an organism that is devoted to reproductive processes. Reproductive effort at a given age within a species will be selected to maximize reproductive value at that age. Reproductive effort is not directly affected by changes in juvenile survivorship, nor necessarily reduced by an increase in adult survivorship. Selection for high levels of reproductive effort should occur when extrinsic adult mortality is high, in environments with constant juvenile survivorship, and in good years for juvenile survivorship in a variable environment, provided that the quality of the year is predictable by adults. Data necessary to measure reproductive effort and to understand how selection results in different levels of effort between individuals and species are discussed. We make several predictions about the effect of increased resource availability on reproductive effort. The empirical bases for testing these predictions are presently inadequate, and we consider data on the energy budgets of organisms in nature to be essential for such tests. We also conclude that variance in life table parameters must be known in detail to understand the selective bases of levels of reproductive effort.

  9. Computational fluid dynamics and frequency-dependent finite-difference time-domain method coupling for the interaction between microwaves and plasma in rocket plumes

    NASA Astrophysics Data System (ADS)

    Kinefuchi, K.; Funaki, I.; Shimada, T.; Abe, T.

    2012-10-01

    Under certain conditions during rocket flights, ionized exhaust plumes from solid rocket motors may interfere with radio frequency transmissions. To understand the relevant physical processes involved in this phenomenon and to establish a procedure for predicting in-flight attenuation levels, we attempted to measure microwave attenuation caused by rocket exhaust plumes in a sea-level static firing test of a full-scale solid propellant rocket motor. The microwave attenuation level was calculated by a coupled simulation of the inviscid-frozen-flow computational fluid dynamics of the exhaust plume and a detailed analysis of microwave transmission, applying a frequency-dependent finite-difference time-domain method with the Drude dispersion model. The calculated microwave attenuation level agreed well with the experimental results, except in the case of interference downstream of the Mach disk in the exhaust plume. It was concluded that the coupled estimation method based on the physics of the frozen plasma flow with Drude dispersion is suitable for actual flight conditions, although the mixing and afterburning in the plume should be considered depending on the flow conditions.
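    For context, a Drude dispersion model of the kind named above typically represents the relative permittivity of a collisional plasma in the standard textbook form (not quoted from the abstract; symbols are ours):

```latex
\varepsilon_r(\omega) \;=\; 1 \;-\; \frac{\omega_p^2}{\omega\,(\omega + i\,\nu_c)},
\qquad
\omega_p^2 \;=\; \frac{n_e e^2}{\varepsilon_0 m_e},
```

    where $\omega_p$ is the plasma frequency set by the electron number density $n_e$ and $\nu_c$ is the electron collision frequency (the sign of the $i\nu_c$ term depends on the chosen time convention). Microwave attenuation grows as the transmission frequency approaches $\omega_p$, which is why the plume's electron density field from the CFD step drives the FDTD attenuation estimate.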

  10. A guide to writing a scientific paper: a focus on high school through graduate level student research.

    PubMed

    Hesselbach, Renee A; Petering, David H; Berg, Craig A; Tomasiewicz, Henry; Weber, Daniel

    2012-12-01

    This article presents a detailed guide for high school through graduate level instructors that leads students to write effective and well-organized scientific papers. Interesting research emerges from the ability to ask questions, define problems, design experiments, analyze and interpret data, and make critical connections. This process is incomplete unless new results are communicated to others, because science fundamentally requires peer review and criticism to validate or discard proposed new knowledge. Thus, a concise and clearly written research paper is a critical step in the scientific process and is important for young researchers as they master how to express scientific concepts and understanding. Moreover, learning to write a research paper provides a tool to improve science literacy, as indicated in the National Research Council's National Science Education Standards (1996) and A Framework for K-12 Science Education (2011), the underlying foundation for the Next Generation Science Standards currently being developed. Background information explains the importance of peer review and communicating results, along with details of each critical component: the Abstract, Introduction, Methods, Results, and Discussion. Specific steps essential to helping students write clear and coherent research papers that follow a logical format, use effective communication, and develop scientific inquiry are described.

  11. A Guide to Writing a Scientific Paper: A Focus on High School Through Graduate Level Student Research

    PubMed Central

    Petering, David H.; Berg, Craig A.; Tomasiewicz, Henry; Weber, Daniel

    2012-01-01

    This article presents a detailed guide for high school through graduate level instructors that leads students to write effective and well-organized scientific papers. Interesting research emerges from the ability to ask questions, define problems, design experiments, analyze and interpret data, and make critical connections. This process is incomplete unless new results are communicated to others, because science fundamentally requires peer review and criticism to validate or discard proposed new knowledge. Thus, a concise and clearly written research paper is a critical step in the scientific process and is important for young researchers as they master how to express scientific concepts and understanding. Moreover, learning to write a research paper provides a tool to improve science literacy, as indicated in the National Research Council's National Science Education Standards (1996) and A Framework for K–12 Science Education (2011), the underlying foundation for the Next Generation Science Standards currently being developed. Background information explains the importance of peer review and communicating results, along with details of each critical component: the Abstract, Introduction, Methods, Results, and Discussion. Specific steps essential to helping students write clear and coherent research papers that follow a logical format, use effective communication, and develop scientific inquiry are described. PMID:23094692

  12. Draft guidelines for measurement and assessment of low-level ambient noise

    DOT National Transportation Integrated Search

    1998-03-31

    This document describes an ambient noise measurement protocol, a detailed methodology for characterizing ambient noise in low-level environments such as the National Parks. It presents definitions of terminology useful for understanding the mea...

  13. [Process-oriented cost calculation in interventional radiology. A case study].

    PubMed

    Mahnken, A H; Bruners, P; Günther, R W; Rasche, C

    2012-01-01

    Currently used costing methods such as cost centre accounting do not sufficiently reflect the process-based resource utilization in medicine. The goal of this study was to establish a process-oriented cost assessment of percutaneous radiofrequency (RF) ablation of liver and lung metastases. In each of 15 patients a detailed task analysis of the primary process of hepatic and pulmonary RF ablation was performed. Based on these data a dedicated cost calculation model was developed for each primary process. The costs of each process were computed and compared with the revenue for in-patients according to the German diagnosis-related groups (DRG) system 2010. The RF ablation of liver metastases in patients without relevant comorbidities and a low patient complexity level results in a loss of EUR 588.44, whereas the treatment of patients with a higher complexity level yields an acceptable profit. The treatment of pulmonary metastases is profitable even in cases of additional expenses due to complications. Process-oriented costing provides relevant information that is needed for understanding the economic impact of treatment decisions. It is well suited as a starting point for economically driven process optimization and reengineering. Under the terms of the German DRG 2010 system percutaneous RF ablation of lung metastases is economically reasonable, while RF ablation of liver metastases in cases of low patient complexity levels does not cover the costs.

  14. Numerical analysis of quantitative measurement of hydroxyl radical concentration using laser-induced fluorescence in flame

    NASA Astrophysics Data System (ADS)

    Shuang, Chen; Tie, Su; Yao-Bang, Zheng; Li, Chen; Ting-Xu, Liu; Ren-Bing, Li; Fu-Rong, Yang

    2016-06-01

    The aim of the present work is to quantitatively measure the hydroxyl radical concentration in flame by using LIF (laser-induced fluorescence). Detailed physical models of spectral absorption lineshape broadening, collisional transitions and quenching at elevated pressure are built. The fine energy level structure of the OH molecule is illustrated in order to understand the laser-induced fluorescence emission process and the competing non-radiative processes, which include collisional quenching, rotational energy transfer (RET), and vibrational energy transfer (VET). Based on these models, numerical results are obtained by simulation in order to evaluate the fluorescence yield at elevated pressure. These results are useful for understanding the real physical processes in the OH-LIF technique and for finding a way to calibrate the signal for quantitative measurement of OH concentration in a practical combustor. Project supported by the National Natural Science Foundation of China (Grant No. 11272338) and the Fund from the Science and Technology on Scramjet Key Laboratory, China (Grant No. STSKFKT2013004).
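    As a point of reference, in the simplest two-level LIF picture (a textbook simplification of the multi-level OH model described above, in the linear excitation regime) the fluorescence quantum yield is:

```latex
\Phi \;=\; \frac{A_{21}}{A_{21} + Q_{21}},
```

    where $A_{21}$ is the spontaneous emission rate and $Q_{21}$ the collisional quenching rate. Because $Q_{21}$ grows roughly in proportion to pressure, the yield falls at elevated pressure, which is why the pressure-dependent quenching, RET and VET corrections modeled in the paper are needed for quantitative OH concentrations.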

  15. Meta-cognitive student reflections

    NASA Astrophysics Data System (ADS)

    Barquist, Britt; Stewart, Jim

    2009-05-01

    We have recently concluded a project testing the effectiveness of a weekly assignment designed to encourage awareness and improvement of meta-cognitive skills. The project is based on the idea that successful problem solvers implement a meta-cognitive process in which they identify the specific concept they are struggling with, and then identify what they understand, what they don't understand, and what they need to know in order to resolve their problem. The assignment required the students to write an email assessing the level of completion of a weekly workbook assignment and to examine in detail their experiences regarding a specific topic they struggled with. The assignment guidelines were designed to coach them through this meta-cognitive process. We responded to most emails with advice for next week's assignment. Our data follow 12 students through a quarter consisting of 11 email assignments which were scored using a rubric based on the assignment guidelines. We found no correlation between rubric scores and final grades. We do have anecdotal evidence that the assignment was beneficial.

  16. Serum cytokine levels in patients with hepatocellular carcinoma.

    PubMed

    Capone, Francesca; Costantini, Susan; Guerriero, Eliana; Calemma, Rosa; Napolitano, Maria; Scala, Stefania; Izzo, Francesco; Castello, Giuseppe

    2010-06-01

    The role played by the microenvironment in cancer induction, promotion and progression is crucial. Emerging evidence suggests that cytokines, chemokines and growth factors are major players in carcinogenesis. Therefore, a detailed understanding of factors and mechanisms associated with the processes leading from inflammation to cancer could improve the therapeutic strategies against this disease. We have used hepatocarcinoma as our model in this study. We evaluated the serum levels of 50 different cytokines, chemokines and growth factors in patients affected by HCC with chronic HCV-related hepatitis and liver cirrhosis using multiplex biometric ELISA-based immunoassay. Our data showed that some pro-inflammatory molecules were significantly up-regulated in these patients, and highlighted the complexity of the cytokine network in this disease. This work suggests the need to monitor these proteins in order to define a profile that could characterize patients with HCC or to help identify useful markers. This could lead to better definition of the disease state, and to an increased understanding of the relationships between chronic inflammation and cancer.

  17. Escape rate for nonequilibrium processes dominated by strong non-detailed balance force

    NASA Astrophysics Data System (ADS)

    Tang, Ying; Xu, Song; Ao, Ping

    2018-02-01

    Quantifying the escape rate from a meta-stable state is essential to understanding a wide range of dynamical processes. Kramers' classical rate formula is the product of an exponential function of the potential barrier height and a prefactor related to the friction coefficient. Although many applications of the rate formula have focused on the exponential term, the prefactor can have a significant effect on the escape rate in certain parameter regions, such as the overdamped limit and the underdamped limit. There has been continuing interest in understanding the effect of non-detailed balance on the escape rate; however, how the prefactor behaves under a strong non-detailed balance force remains elusive. In this work, we find that the escape rate formula has a vanishing prefactor with decreasing friction strength in the strong non-detailed balance limit. We obtain analytical solutions in specific examples and provide a derivation for more general cases. We further verify the result by simulations and propose a testable experimental system of a charged Brownian particle in an electromagnetic field. Our study demonstrates that special care is required when estimating the effect of the prefactor on the escape rate when a non-detailed balance force dominates.
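    For reference, one standard overdamped form of the Kramers rate alluded to above is (textbook expression; symbols are ours, not the paper's):

```latex
k \;\approx\; \frac{\omega_0\,\omega_b}{2\pi\,\gamma}\,
\exp\!\left(-\frac{\Delta U}{k_B T}\right),
```

    where $\Delta U$ is the barrier height, $\gamma$ the friction coefficient, and $\omega_0$, $\omega_b$ the curvature frequencies of the potential at the well bottom and barrier top. The paper's result concerns how the prefactor (here $\omega_0\omega_b/2\pi\gamma$) behaves when a strong non-detailed-balance force is present rather than the exponential term.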

  18. The devil is in the detail: Quantifying vocal variation in a complex, multi-levelled, and rapidly evolving display.

    PubMed

    Garland, Ellen C; Rendell, Luke; Lilley, Matthew S; Poole, M Michael; Allen, Jenny; Noad, Michael J

    2017-07-01

    Identifying and quantifying variation in vocalizations is fundamental to advancing our understanding of processes such as speciation, sexual selection, and cultural evolution. The song of the humpback whale (Megaptera novaeangliae) presents an extreme example of complexity and cultural evolution. It is a long, hierarchically structured vocal display that undergoes constant evolutionary change. Obtaining robust metrics to quantify song variation at multiple scales (from a sound through to population variation across the seascape) is a substantial challenge. Here, the authors present a method to quantify song similarity at multiple levels within the hierarchy. To incorporate the complexity of these multiple levels, the calculation of similarity is weighted by measurements of sound units (lower levels within the display) to bridge the gap in information between upper and lower levels. Results demonstrate that the inclusion of weighting provides a more realistic and robust representation of song similarity at multiple levels within the display. This method permits robust quantification of cultural patterns and processes that will also contribute to the conservation management of endangered humpback whale populations, and is applicable to any hierarchically structured signal sequence.
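    The weighting idea can be illustrated with a toy sketch. This is not the authors' algorithm; the use of `difflib.SequenceMatcher`, and the representation of a song as a list of phrases made of sound-unit labels, are illustrative assumptions:

```python
from difflib import SequenceMatcher

def phrase_similarity(a, b):
    """Similarity between two phrases, each a sequence of sound-unit labels."""
    return SequenceMatcher(None, a, b).ratio()

def song_similarity(song_a, song_b):
    """Upper-level similarity as a mean of phrase similarities weighted by
    sound-unit counts, so that measurements from the lowest level of the
    hierarchy inform the comparison at the top. Assumes, for simplicity,
    that the two songs contain the same number of phrases."""
    sims, weights = [], []
    for pa, pb in zip(song_a, song_b):
        sims.append(phrase_similarity(pa, pb))
        weights.append(len(pa) + len(pb))  # weight by lower-level unit counts
    return sum(s * w for s, w in zip(sims, weights)) / sum(weights)
```

    With this weighting, a mismatch in a long phrase lowers song-level similarity more than the same mismatch in a short one, which is the sense in which lower-level measurements bridge the gap between upper and lower levels.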

  19. Two are not better than one: Combining unitization and relational encoding strategies.

    PubMed

    Tu, Hsiao-Wei; Diana, Rachel A

    2016-01-01

    In recognition memory, recollection is defined as retrieval of the context associated with an event, whereas familiarity is defined as retrieval based on item strength alone. Recent studies have shown that conventional recollection-based tasks, in which context details are manipulated for source memory assessment at test, can also rely on familiarity when context information is "unitized" with the relevant item information at encoding. Unlike naturalistic episodic memories that include many context details encoded in different ways simultaneously, previous studies have focused on unitization and its effect on the recognition of a single context detail. To further understand how various encoding strategies operate on item and context representations, we independently assigned unitization and relational association to 2 context details (size and color) of each item and tested the contribution of recollection and familiarity to source recognition of each detail. The influence of familiarity on retrieval of each context detail was compared as a function of the encoding strategy used for each detail. Receiver operating characteristic curves suggested that the unitization effect was not additive and that similar levels of familiarity occurred for 1 or multiple details when unitization was the only strategy applied during encoding. On the other hand, a detrimental effect was found when relational encoding and unitization were simultaneously applied to 1 item such that a salient nonunitized context detail interfered with the effortful processing required to unitize an accompanying context detail. However, this detrimental effect was not reciprocal and possibly dependent on the nature of individual context details. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  20. Global Consultation Processes: Lessons Learned from Refugee Teacher Consultation Research in Malaysia

    ERIC Educational Resources Information Center

    O'Neal, Colleen R.; Gosnell, Nicole M.; Ng, Wai Sheng; Clement, Jennifer; Ong, Edward

    2018-01-01

    The process of global consultation has received little attention despite its potential for promoting international mutual understanding with marginalized communities. This article details theory, entry, implementation, and evaluation processes for global consultation research, including lessons learned from our refugee teacher intervention. The…

  1. Monitoring and Predicting the Export and Fate of Global Ocean Net Primary Production: The EXPORTS Field Program

    NASA Astrophysics Data System (ADS)

    Exports Science Definition Team

    2016-04-01

    Ocean ecosystems play a critical role in the Earth's carbon cycle, yet quantifying that role on global scales remains one of the greatest challenges in global ocean biogeochemistry. The goal of the EXport Processes in the Ocean from Remote Sensing (EXPORTS) science plan is to develop a predictive understanding of the export and fate of global ocean primary production and its implications for the Earth's carbon cycle in present and future climates. NASA's satellite ocean-color data record has revolutionized our understanding of global marine systems. EXPORTS is designed to advance the utility of NASA ocean-color assets to predict how changes in ocean primary production will impact the global carbon cycle. EXPORTS will create a predictive understanding of both the export of organic carbon from the euphotic zone and its fate in the underlying "twilight zone" (depths of 500 m or more), where variable fractions of exported organic carbon are respired back to CO2. Ultimately, it is the sequestration of organic carbon transported to depth that defines the impact of ocean biota on atmospheric CO2 levels and hence climate. EXPORTS will generate a new, detailed understanding of ocean carbon transport processes and pathways, linking upper-ocean phytoplankton processes to the export and fate of organic matter in the underlying twilight zone using a combination of field campaigns, remote sensing and numerical modeling. The overarching objective of EXPORTS is to ensure the success of future satellite missions by establishing mechanistic relationships between remotely sensed signals and carbon cycle processes. Through a process-oriented approach, EXPORTS will foster new insights on ocean carbon cycling that will maximize its societal relevance and be a key component of the U.S. investment to understand Earth as an integrated system.

  2. Using the Co-Production of Knowledge for Developing Realistic Natural Disaster Scenarios for Small-to-Medium Scale Emergency Management Exercises

    NASA Astrophysics Data System (ADS)

    Robinson, T.; Wilson, T. M.; Davies, T. R.; Orchiston, C.; Thompson, J.

    2014-12-01

    Disaster scenarios for Emergency Management (EM) exercises are a widely used and effective tool for communicating hazard information to policy makers, EM personnel, lifelines operators and communities in general. It is crucial that the scenarios are as realistic as possible. Major disasters, however, involve a series of cascading consequences, both environmental and social, which are difficult to model. Consequently, only recently have large-scale exercises included such processes; incorporating them in small- and medium-scale scenarios has yet to be attempted. This study details work undertaken in a recent medium-scale earthquake exercise in New Zealand to introduce such cascading processes into the disaster scenario. Given limited time, data, and funding, we show that the co-production of knowledge between natural disaster scientists, EM personnel, and governance and lifelines organisations can yield detailed, realistic scenarios. The co-production process also allowed scenario development to identify where the pre-exercise state of knowledge was insufficient, enabling a focussed research response driven by end-user needs. This revealed that, in general, seismic hazard (ground shaking) and its likely impacts were well known and understood by all parties, whereas subsequent landsliding and its associated effects were poorly known and understood, and their potential impacts had not been considered. Scenario development therefore focussed primarily on understanding these processes and their potential effects. As a result, cascading hazards were included in a medium-scale NZ exercise for the first time. Further, all participants were able to focus on the potential impacts on their specific sectors, increasing the level of knowledge of cascading processes across all parties. Using group-based discussions throughout the design process allowed a detailed scenario to be created, fostered stronger inter-disciplinary relationships, and identified areas for further research.
Consequently, further detailed research has begun, focusing specifically on the impacts of secondary effects, in an effort to further increase resilience to future events.

  3. Dynamics of biomolecular processes

    NASA Astrophysics Data System (ADS)

    Behringer, Hans; Eichhorn, Ralf; Wallin, Stefan

    2013-05-01

    The last few years have seen enormous progress in the availability of computational resources, so that the size and complexity of physical systems that can be investigated numerically have increased substantially. The physical mechanisms behind the processes creating life, such as those in a living cell, are of foremost interest in biophysical research. A main challenge here is that complexity not only emerges from interactions of many macro-molecular compounds, but is already evident at the level of a single molecule. An exciting recent development in this context is, therefore, that detailed atomistic-level characterization of large-scale dynamics of individual bio-macromolecules, such as proteins and DNA, is starting to become feasible in some cases. This has contributed to a better understanding of the molecular mechanisms of, e.g. protein folding and aggregation, as well as DNA dynamics. Nevertheless, simulations of the dynamical behaviour of complex multicomponent cellular processes at an all-atom level will remain beyond reach for the foreseeable future, and may not even be desirable. Ultimate understanding of many biological processes will require the development of methods targeting different time and length scales and, importantly, ways to bridge these in multiscale approaches. At the scientific programme Dynamics of biomolecular processes: from atomistic representations to coarse-grained models held between 27 February and 23 March 2012, and hosted by the Nordic Institute for Theoretical Physics, new modelling approaches and results for particular biological systems were presented and discussed. The programme was attended by around 30 scientists from the Nordic countries and elsewhere. It also included a PhD and postdoc 'winter school', where basic theoretical concepts and techniques of biomolecular modelling and simulations were presented.
One to two decades ago, the biomolecular modelling field was dominated by two widely different and largely independent approaches. On the one hand, computationally convenient and highly simplified lattice models were being used to elucidate the fundamental aspects of biomolecular conformational transitions, such as protein folding. On the other hand, these generic coarse-grained approaches were complemented by atomistic representations of the biomolecules. Physico-chemical all-atom models, often with an explicit representation of the surrounding solvent, were applied to specific protein structures to investigate their detailed dynamical behaviour. Today the situation is strikingly different, as was evident during the programme, where several new efforts were presented that try to combine the atomistic and the generic modelling approaches. The aim is to develop coarse-grained models at an intermediate-level resolution that are detailed enough to study specific biomolecular systems, and yet remain computationally efficient. These attempts are accompanied by the emergence of systematic coarse-graining techniques which bridge the physics of different lengths and timescales in a single simulation dynamically by applying appropriate representations of the associated degrees of freedom. Such adaptive resolution schemes represent promising candidates to tackle systems with an intrinsic multiscale nature, such as hierarchical chains and networks of biochemical reactions on a cellular level, calling for a very detailed description on an atomistic particle (or even quantum) level but simultaneously allowing the investigation of large-scale structuring and transport phenomena. The presentations and discussions during the programme also showed that the numerical evidence from (multiscale) simulations needs to be complemented by analytical and theoretical investigations to provide, eventually, a combined and deepened insight into the properties of biomolecular processes. 
The contributions from this scientific programme published in this issue of Physica Scripta highlight some of these new developments while also addressing related issues, such as the challenge of achieving efficient conformational sampling for chain molecules, and the interaction of nano-particles with biomolecules. The latter topic is especially timely as nano-particles are currently being considered for use as drug delivery devices, and present concerns about the general safety of their use might be resolved (or substantiated) by studies of this kind. This scientific programme and the contributions presented here were made possible by the financial and administrative support of the Nordic Institute for Theoretical Physics.

  4. Fingerprinting sea-level variations in response to continental ice loss: a benchmark exercise

    NASA Astrophysics Data System (ADS)

    Barletta, Valentina R.; Spada, Giorgio; Riva, Riccardo E. M.; James, Thomas S.; Simon, Karen M.; van der Wal, Wouter; Martinec, Zdenek; Klemann, Volker; Olsson, Per-Anders; Hagedoorn, Jan; Stocchi, Paolo; Vermeersen, Bert

    2013-04-01

    Understanding the response of the Earth to the waxing and waning of ice sheets is crucial in various contexts, ranging from the interpretation of modern satellite geodetic measurements to projections of future sea-level trends in response to climate change. All the processes accompanying Glacial Isostatic Adjustment (GIA) can be described by solving the so-called Sea Level Equation (SLE), an integral equation that accounts for the interactions between the ice sheets, the solid Earth, and the oceans. Modern approaches to the SLE are based on techniques that range from purely analytical formulations to fully numerical methods. Here we present the results of a benchmark exercise of independently developed codes designed to solve the SLE. The study involves predictions of current sea-level changes due to present-day ice mass loss. In spite of the differences in the methods employed, the comparison shows that a significant number of GIA modellers can reproduce each other's sea-level computations to within 2% for well-defined, large-scale present-day ice mass changes. Smaller, more detailed loads require further dedicated benchmarking and high-resolution computation. This study shows that the details of the implementation and the input specifications are an important, and often underappreciated, aspect. Hence it represents a step toward assessing the reliability of sea-level projections obtained with benchmarked SLE codes.
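    The SLE itself must be solved numerically, but its zeroth-order term is easy to state: if gravitational, rotational and deformational feedbacks are ignored, an ice-mass change simply spreads uniformly over the oceans. A hedged sketch of that eustatic term follows; the constants are round-number assumptions, not values from the benchmark:

```python
# Zeroth-order (eustatic) sea-level change from ice mass loss: uniform
# redistribution over the ocean surface. The full Sea Level Equation adds
# gravitational, rotational and solid-Earth feedbacks that make the
# actual pattern strongly non-uniform ("sea-level fingerprints").
RHO_WATER = 1000.0    # kg/m^3, fresh-water density (assumed)
OCEAN_AREA = 3.61e14  # m^2, approximate global ocean area (assumed)

def eustatic_rise_mm(ice_mass_loss_gt):
    """Globally uniform sea-level rise in mm from an ice mass loss in Gt."""
    mass_kg = ice_mass_loss_gt * 1e12  # 1 Gt = 1e12 kg
    return mass_kg / (RHO_WATER * OCEAN_AREA) * 1000.0
```

    With these constants, roughly 361 Gt of ice loss corresponds to 1 mm of uniform rise; the benchmarked codes refine this uniform value into the spatially variable fingerprints discussed above.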

  5. Seeing the Forest and the Trees: Default Local Processing in Individuals with High Autistic Traits Does Not Come at the Expense of Global Attention.

    PubMed

    Stevenson, Ryan A; Sun, Sol Z; Hazlett, Naomi; Cant, Jonathan S; Barense, Morgan D; Ferber, Susanne

    2018-04-01

    Atypical sensory perception is one of the most ubiquitous symptoms of autism, including a tendency towards a local-processing bias. We investigated whether local-processing biases were associated with global-processing impairments using a global/local attentional-scope paradigm in conjunction with a composite-face task. Behavioural results were related to individuals' levels of autistic traits, specifically the Attention to Detail subscale of the Autism Quotient, and the Sensory Profile Questionnaire. Individuals showing high levels of Attention to Detail were more susceptible to global attentional-scope manipulations, suggesting that the local-processing biases associated with Attention to Detail do not come at the cost of a global-processing deficit, but reflect a difference in the default global versus local bias. This relationship operated at the attentional/perceptual level, but not at the level of response criterion.

  6. Computational Biochemistry-Enzyme Mechanisms Explored.

    PubMed

    Culka, Martin; Gisdon, Florian J; Ullmann, G Matthias

    2017-01-01

    Understanding enzyme mechanisms is a major task in comprehending how living cells work. Recent advances in biomolecular research provide huge amounts of data on enzyme kinetics and structure. The analysis of diverse experimental results and their combination into an overall picture is, however, often challenging. Microscopic details of enzymatic processes are often anticipated based on several hints from macroscopic experimental data. Computational biochemistry aims at the creation of a computational model of an enzyme in order to explain microscopic details of the catalytic process and reproduce or predict macroscopic experimental findings. The results of such computations are in part complementary to experimental data and provide an explanation of a biochemical process at the microscopic level. In order to evaluate the mechanism of an enzyme, a structural model is constructed that can be analyzed by several theoretical approaches. Several simulation methods can and should be combined to obtain a reliable picture of the process of interest. Furthermore, abstract models of biological systems can be constructed by combining computational and experimental data. In this review, we discuss structural computational models of enzymatic systems. We first discuss various models to simulate enzyme catalysis. Furthermore, we review approaches for characterizing enzyme mechanisms both qualitatively and quantitatively using different modeling approaches. © 2017 Elsevier Inc. All rights reserved.

  7. Current understanding of structure-processing-property relationships in BaTiO3-Bi(M)O3 dielectrics

    DOE PAGES

    Beuerlein, Michaela A.; Kumar, Nitish; Usher, Tedi -Marie; ...

    2016-09-01

    Here, as part of a continued push for high-permittivity dielectrics suitable for use at elevated operating temperatures and/or large electric fields, modifications of BaTiO3 with Bi(M)O3, where M represents a net-trivalent B-site occupied by one or more species, have received a great deal of recent attention. Materials in this composition family exhibit weakly coupled relaxor behavior that is not only remarkably stable at high temperatures and under large electric fields, but is also quite similar across various identities of M. Moderate levels of Bi content (as much as 50 mol%) appear to be crucial to the stability of the dielectric response. In addition, the presence of significant Bi reduces the processing temperatures required for densification and increases the required oxygen content in processing atmospheres relative to traditional X7R-type BaTiO3-based dielectrics. Although detailed understanding of the structure-processing-property relationships in this class of materials is still in its infancy, this article reviews the current state of understanding of the mechanisms underlying the high and stable values of both relative permittivity and resistivity that are characteristic of BaTiO3-Bi(M)O3 dielectrics, as well as the processing challenges and opportunities associated with these materials.

  8. Image and emotion: from outcomes to brain behavior.

    PubMed

    Nanda, Upali; Zhu, Xi; Jansen, Ben H

    2012-01-01

    This article presents a systematic review of neuroscience articles on the emotional states of fear, anxiety, and pain, to understand how emotional response is linked to the visual characteristics of an image at the level of brain behavior. A number of outcome studies link exposure to visual images (with nature content) to improvements in stress, anxiety, and pain perception; however, an understanding of the underlying perceptual mechanisms has been lacking. In this article, neuroscience studies that use visual images to induce fear, anxiety, or pain are reviewed to gain an understanding of how the brain processes visual images in this context and to explore whether this processing can be linked to specific visual characteristics. The amygdala was identified as one of the key regions of the brain involved in the processing of fear, anxiety, and pain (induced by visual images). Other key areas included the thalamus, insula, and hippocampus. Characteristics of visual images such as emotional dimension (valence/arousal), subject matter (familiarity, ambiguity, novelty, realism, and facial expressions), and form (sharp and curved contours) were identified as key factors influencing emotional processing. The broad structural properties of an image and its overall content were found to play a more pivotal role in the emotional response than its specific details. Insights on specific visual properties were translated into recommendations for what should be incorporated, and avoided, in healthcare environments.

  9. The Process of Student Cognition in Constructing Mathematical Conjecture

    ERIC Educational Resources Information Center

    Astawa, I. Wayan Puja; Budayasa, I. Ketut; Juniati, Dwi

    2018-01-01

    This research aims to describe the process of student cognition in constructing mathematical conjecture. Many researchers have studied this process but without giving a detailed explanation of how students understand the information to construct a mathematical conjecture. The researchers focus their analysis on how to construct and prove the…

  10. Procedural Due Process Rights in Student Discipline.

    ERIC Educational Resources Information Center

    Pressman, Robert; Weinstein, Susan

    To assist administrators in understanding procedural due process rights in student discipline, this manual draws together hundreds of citations and case summaries of federal and state court decisions and provides detailed commentary as well. Chapter 1 outlines the general principles of procedural due process rights in student discipline, such as…

  11. Knowledge-Production-and Utilization: A General Model. Third Approximation. Iowa Agricultural and Home Economics Experiment Station Project No. 2218. Sociology Report No. 138.

    ERIC Educational Resources Information Center

    Meehan, Peter M.; Beal, George M.

    The objective of this monograph is to contribute to the further understanding of the knowledge-production-and-utilization process. Its primary focus is on a model both general and detailed enough to provide a comprehensive overview of the diverse functions, roles, and processes required to understand the flow of knowledge from its point of origin…

  12. One Teacher's Identity, Emotions, and Commitment to Change: A Case Study into the Cognitive-Affective Processes of a Secondary School Teacher in the Context of Reforms

    ERIC Educational Resources Information Center

    van Veen, Klaas; Sleegers, Peter; van de Ven, Piet-Hein

    2005-01-01

    This paper presents a cognitive social-psychological theoretical framework on emotions, derived from Richard Lazarus, to understand how teachers' identity can be affected in a context of reforms. The emphasis of this approach is on the cognitive-affective processes of individual teachers, enabling us to gain a detailed understanding of what…

  13. Language learning impairments: integrating basic science, technology, and remediation.

    PubMed

    Tallal, P; Merzenich, M M; Miller, S; Jenkins, W

    1998-11-01

    One of the fundamental goals of the modern field of neuroscience is to understand how neuronal activity gives rise to higher cortical function. However, to bridge the gap between neurobiology and behavior, we must understand higher cortical functions at the behavioral level at least as well as we have come to understand neurobiological processes at the cellular and molecular levels. This is certainly the case in the study of speech processing, where critical studies of behavioral dysfunction have provided key insights into the basic neurobiological mechanisms relevant to speech perception and production. Much of this progress derives from a detailed analysis of the sensory, perceptual, cognitive, and motor abilities of children who fail to acquire speech, language, and reading skills normally within the context of otherwise normal development. Current research shows that a dysfunction in normal phonological processing, which is critical to the development of oral and written language, may derive, at least in part, from difficulties in perceiving and producing basic sensory-motor information in rapid succession, within tens of milliseconds (see Tallal et al. 1993a for a review). There is now substantial evidence supporting the hypothesis that basic temporal integration processes play a fundamental role in establishing neural representations for the units of speech (phonemes), which must be segmented from the (continuous) speech stream and combined to form words in order for the normal development of oral and written language to proceed. Results from magnetic resonance imaging (MRI) and positron emission tomography (PET) studies, as well as studies of behavioral performance in normal and language-impaired children and adults, will be reviewed to support the view that the integration of rapidly changing successive acoustic events plays a primary role in phonological development and disorders.
Finally, remediation studies based on this research, coupled with neuroplasticity research, will be presented.

  14. USAREUR Command Challenges

    DTIC Science & Technology

    1993-04-15

    philosophy, the level of detail and leader involvement, and the standards of the process will assist future commanders and staff officers prepare for their...and the threat diminished in size but grew in scope, the national strategy, as well as USAREUR's mission and focus, and staff officers is to...

  15. Comparative study between single core model and detail core model of CFD modelling on reactor core cooling behaviour

    NASA Astrophysics Data System (ADS)

    Darmawan, R.

    2018-01-01

    The nuclear power industry has faced uncertainties since the occurrence of the unfortunate accident at the Fukushima Daiichi Nuclear Power Plant. The issue of nuclear power plant safety has become a major hindrance in the planning of nuclear power programs for new-build countries. Thus, understanding the behaviour of the reactor system is very important to ensure the continuous development and improvement of reactor safety. Throughout the development of nuclear reactor technology, investigation and analysis of reactor safety have gone through several phases. In the early days, analytical and experimental methods were employed. For the last four decades, 1D system-level codes have been widely used. The continuous development of nuclear reactor technology has brought about more complex systems and processes of nuclear reactor operation. More detailed, multi-dimensional simulation codes are needed to assess these new reactors. Recently, 2D and 3D system-level codes such as CFD have been explored. This paper discusses a comparative study of two different approaches to CFD modelling of reactor core cooling behaviour.

  16. A Biopsychosocial Formulation of Pain Communication

    ERIC Educational Resources Information Center

    Hadjistavropoulos, Thomas; Craig, Kenneth D.; Duck, Steve; Cano, Annmarie; Goubert, Liesbet; Jackson, Philip L.; Mogil, Jeffrey S.; Rainville, Pierre; Sullivan, Michael J. L.; de C. Williams, Amanda C.; Vervoort, Tine; Fitzgerald, Theresa Dever

    2011-01-01

    We present a detailed framework for understanding the numerous and complicated interactions among psychological and social determinants of pain through examination of the process of pain communication. The focus is on an improved understanding of immediate dyadic transactions during painful events in the context of broader social phenomena.…

  17. Signatures of Heavy Element Production in Neutron Star Mergers

    NASA Astrophysics Data System (ADS)

    Barnes, Jennifer

    2018-06-01

    Compact object mergers involving at least one neutron star have long been theorized to be sites of astrophysical nucleosynthesis via rapid neutron capture (the r-process). The observation in light and gravitational waves of the first neutron star merger (GW170817) this past summer provided a stunning confirmation of this theory. Electromagnetic emission powered by the radioactive decay of freshly synthesized nuclei from mergers encodes information about the composition burned by the r-process, including whether a particular merger event synthesized the heaviest nuclei along the r-process path or froze out at lower mass number. However, efforts to model the emission in detail must still contend with many uncertainties. For instance, uncertain nuclear masses far from the valley of stability influence the final composition burned by the r-process, as will weak interactions operating in the merger's immediate aftermath. This in turn can affect the color of the electromagnetic emission. Understanding the details of these transients' spectra will also require a detailed accounting of the electronic transitions of r-process elements and ions, in order to identify the strong transitions that underlie spectral formation. This talk will provide an overview of our current understanding of radioactive transients from mergers, with an emphasis on the role of experiment in providing critical inputs for models and reducing uncertainty.
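    One commonly used model input of this kind is the bulk radioactive heating rate from the aggregate decay of r-process nuclei, often approximated as a single power law in time. A hedged sketch follows; the normalization and exponent are commonly quoted round numbers, assumed here rather than taken from this talk:

```python
def rprocess_heating_rate(t_days, eps_1day=1e10, alpha=1.3):
    """Approximate specific heating rate (erg/s/g) from r-process decay,
    modeled as a power law in time; eps_1day (the rate at t = 1 day)
    and the exponent alpha are assumed round-number values."""
    return eps_1day * t_days ** (-alpha)
```

    The smooth power law emerges from summing many individual decay chains; uncertain nuclear masses and weak interactions determine which chains dominate and therefore enter through eps_1day and alpha.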

  18. Problematic effects of antibiotics on anaerobic treatment of swine wastewater.

    PubMed

    Cheng, D L; Ngo, H H; Guo, W S; Chang, S W; Nguyen, D D; Kumar, S Mathava; Du, B; Wei, Q; Wei, D

    2018-05-04

    Swine wastewaters with high levels of organic pollutants and antibiotics have become a serious environmental concern. Anaerobic technology is a feasible option for swine wastewater treatment due to its advantages of low cost and bioenergy production. However, antibiotics in swine wastewater have problematic effects on micro-organisms and on the stability and performance of anaerobic processes. Thus, this paper critically reviews the impacts of antibiotics on pH, COD removal efficiencies, biogas and methane production, as well as the accumulation of volatile fatty acids (VFAs) in anaerobic processes. In addition, impacts on the structure of bacterial and methanogen communities in anaerobic processes are discussed comprehensively. Furthermore, to better understand the effect of antibiotics on anaerobic processes, detailed information about the antimicrobial mechanisms of antibiotics and about microbial functions in anaerobic processes is also summarized. Future research to deepen knowledge of the effects of antibiotics on anaerobic processes is suggested in order to reduce their adverse environmental impacts. Copyright © 2018 Elsevier Ltd. All rights reserved.

  19. From psychosocial equilibrium to catastrophic breakdown: Cyprus 1955-1974.

    PubMed

    Galatariotou, Catia

    2008-08-01

    The recent history of Cyprus presents an example of a society in which a centuries-long peaceful coexistence of diverse populations gave way to violence and murderous hate, with devastating end results. This paper tries to understand and describe the process by which Cypriot society slid from a position of psychosocial equilibrium and integration towards one of disintegration, fragmentation and catastrophic breakdown. This paper draws from work by social anthropologists, sociologists, historians and others, and from my own personal experience. To these I applied insights afforded by psychoanalysis to identify and explore the psychic processes and states of mind that characterized a psychosocial disintegrative process. I came to see external political events and internal psychological processes as inseparably intertwined and dynamically interdependent, each emanating from and catalysing the other. The factual details of the process described are of course unique in their local specificity, but the psychic phenomena that characterized it are not: at both the individual and group levels they are replicated in other societies undergoing similar processes of self-destruction.

  20. Metacognitive Analysis of Pre-Service Teachers of Chemistry in Posting Questions

    NASA Astrophysics Data System (ADS)

    Santoso, T.; Yuanita, L.

    2017-04-01

    Questions directed at a topic can engage metacognitive functions that monitor a person's thinking process. This study aims to describe the structure of student questions by thinking level and by level of chemistry understanding, and to describe how students use their metacognitive knowledge when posing questions. The research is a case study in chemistry learning involving 87 students. Analysis revealed that, by thinking level, student questions comprise knowledge questions, understanding and application questions, and higher-order thinking questions; by level of chemistry understanding, they comprise symbol, macro, macro-micro, macro-process, micro-process, and macro-micro-process questions. Students' questions about scientific articles were of higher quality than their questions about the teaching materials. Interviews with six students showed that their questions demonstrate metacognitive processes in three categories: (1) low-level metacognitive processes, in which questions focus on a particular phrase or merely change the wording; (2) intermediate-level metacognitive processes, in which posing a question requires knowledge and understanding; and (3) high-level metacognitive processes, in which questions are posed by identifying the central topic or abstracting the essence of a scientific article.

  1. Pervasive randomness in physics: an introduction to its modelling and spectral characterisation

    NASA Astrophysics Data System (ADS)

    Howard, Roy

    2017-10-01

    An introduction to the modelling and spectral characterisation of random phenomena is given at a level suited to a first undergraduate exposure to the subject. A signal framework for defining a random process is provided, and this underpins an introduction to common random processes including the Poisson point process, the random walk, the random telegraph signal, shot noise, information signalling random processes, jittered pulse trains, birth-death random processes and Markov chains. An introduction to the spectral characterisation of signals and random processes, via either an energy spectral density or a power spectral density, is detailed. The important case of defining a white noise random process concludes the paper.
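    The paper's own derivations are not reproduced in this record; as a hedged illustration of two of the processes it introduces, the sketch below simulates a symmetric random walk and estimates the power spectral density of its step process with a simple periodogram (for i.i.d. ±1 steps the spectrum should be approximately flat). The variable names and the use of NumPy are assumptions of this sketch, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a symmetric random walk: i.i.d. steps of +/-1.
n = 4096
steps = rng.choice([-1, 1], size=n)
walk = np.cumsum(steps)

# Estimate the power spectral density of the step process via a periodogram.
# The +/-1 step sequence is white noise, so the PSD should be flat with a
# mean close to 1 (the per-sample power of the steps).
spectrum = np.abs(np.fft.rfft(steps)) ** 2 / n
mean_psd = spectrum[1:].mean()  # exclude the DC bin
```

Averaging the periodogram bins (or averaging over segments, as in Welch's method) is the usual way to reduce the large per-bin variance of a raw periodogram.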

  2. Understanding a Pakistani Science Teacher's Practice through a Life History Study

    ERIC Educational Resources Information Center

    Halai, Nelofer

    2011-01-01

    The purpose of the single case life history study was to understand a female science teacher's conceptions of the nature of science as explicit in her practice. While this paper highlights these understandings, an additional purpose is to give a detailed account of the process of creating a life history account through more than 13 in-depth…

  3. Antigen processing and presentation: evolution from a bird's eye view.

    PubMed

    Kaufman, Jim

    2013-09-01

    Most detailed knowledge of the MHC outside of mammals has come from studies of chickens, originally due to the economic importance of the poultry industry. We have used our discoveries about the chicken MHC to develop a framework for understanding the evolution of the MHC, based on the importance of genomic organisation for gene co-evolution. In humans, MHC class I molecules are polymorphic and determine the specificity of peptide presentation, while the molecules involved in antigen processing are functionally monomorphic. The genes for tapasin, transporters associated with antigen presentation (TAPs) and inducible proteasome components (LMPs) are located in and beyond the class II region, far away from the class I genes in the class I region. In contrast, chickens express only one class I locus at high levels, which can result in strong MHC associations with resistance to particular infectious pathogens. The chicken TAP and tapasin genes are located very close to the class I genes, and have high levels of allelic polymorphism and moderate sequence diversity, co-evolving their specificities to work optimally with the dominantly expressed class I molecule. The salient features of the chicken MHC are found in many if not most non-mammalian species examined, and are likely to represent the ancestral organisation of the MHC. Comparison with the MHC organisation of humans and typical mammals suggests that a large inversion brought the class III region into the middle of the MHC, separating the antigen processing genes from the class I gene, breaking the co-evolutionary relationships and allowing a multigene family of well-expressed class I genes. Such co-evolution in the primordial MHC was likely responsible for the appearance of the antigen presentation pathways and receptor-ligand interactions at the birth of the adaptive immune system. Of course, much further work is required to understand this evolutionary framework in more detail. 
Copyright © 2012 Elsevier Ltd. All rights reserved.

  4. Actualizing culture change: The Promoting Excellent Alternatives in Kansas Nursing Homes (PEAK 2.0) program.

    PubMed

    Doll, Gayle A; Cornelison, Laci J; Rath, Heath; Syme, Maggie L

    2017-08-01

    Nursing homes have been challenged in their attempts to achieve deep, organizational change (i.e., culture change) aimed at providing quality of care and quality of life for nursing home residents through person-centered care. To attain deep change, 2 well-defined components must be in place: a shared understanding of (a) the what, or content goals, and (b) the how, or process of change. However, there are few examples of this at a macro or micro level in long-term care. In an effort to enact true culture change in nursing homes statewide, the Kansas Department for Aging and Disability Services implemented the Promoting Excellent Alternatives in Kansas Nursing Homes program. This program is a Medicaid, pay-for-performance program that formalizes the content and process of achieving culture change through person-centered care principles. This article aims to detail the content (what) and process (how) of a model macro-level program of culture change throughout the State of Kansas. Applications to the micro level (individual homes) are presented, and implications for psychologists' roles in facilitating culture change are discussed. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  5. Development of watershed hydrologic research at Santee Experimental Forest, coastal South Carolina

    Treesearch

    Devendra Amatya; Carl Trettin

    2007-01-01

    Managing forested wetland landscapes for water quality improvement and productivity requires a detailed understanding of functional linkages between ecohydrological processes and management practices. Watershed studies are being conducted at USDA Forest Service Santee Experimental Forest, South Carolina, to understand the fundamental hydrologic and biogeochemical...

  6. Landscape genetics: combining landscape ecology and population genetics

    Treesearch

    Stephanie Manel; Michael K. Schwartz; Gordon Luikart; Pierre Taberlet

    2003-01-01

    Understanding the processes and patterns of gene flow and local adaptation requires a detailed knowledge of how landscape characteristics structure populations. This understanding is crucial, not only for improving ecological knowledge, but also for managing properly the genetic diversity of threatened and endangered populations. For nearly 80 years, population...

  7. Towards understanding what contributes to forming an opinion

    NASA Astrophysics Data System (ADS)

    Wang, Peng; Song, Jia; Huo, Jie; Hao, Rui; Wang, Xu-Ming

    The mechanism of opinion evolution can be captured by physical modelling. In this paper, a kinetic equation is established by defining a generalized displacement (cognitive level), a driving force, and related factors such as generalized potential, information quantity and attitude. It is shown that the details of opinion evolution depend on the type of driving force: self-dominated or environment-dominated. In the former case, participants can change their attitudes through competition between the self-driving force and the environment-driving force. In the latter case, all participants are pulled by the environment. Some regularities behind the dynamics of opinion are also revealed; for instance, the information entropy decays with time in a characteristic way. The results may help us gain a deeper understanding of how public opinion forms.

  8. Synthesis, Delivery and Regulation of Eukaryotic Heme and Fe-S Cluster Cofactors

    PubMed Central

    Barupala, Dulmini P.; Dzul, Stephen P.; Riggs-Gelasco, Pamela Jo; Stemmler, Timothy L.

    2016-01-01

    In humans, the bulk of iron in the body (over 75%) is directed towards heme- or Fe-S cluster cofactor synthesis, and the complex, highly regulated pathways in place to accomplish biosynthesis have evolved to safely assemble and load these cofactors into apoprotein partners. In eukaryotes, heme biosynthesis is both initiated and finalized within the mitochondria, while cellular Fe-S cluster assembly is controlled by correlated pathways both within the mitochondria and within the cytosol. Iron plays a vital role in a wide array of metabolic processes, and defects in iron cofactor assembly lead to human diseases. This review describes progress towards our molecular-level understanding of cellular heme and Fe-S cluster biosynthesis, focusing on the regulation and mechanistic details that are essential for understanding human disorders related to the breakdown of these essential pathways. PMID:26785297

  9. Building the bridge between animal movement and population dynamics.

    PubMed

    Morales, Juan M; Moorcroft, Paul R; Matthiopoulos, Jason; Frair, Jacqueline L; Kie, John G; Powell, Roger A; Merrill, Evelyn H; Haydon, Daniel T

    2010-07-27

    While the mechanistic links between animal movement and population dynamics are ecologically obvious, it is much less clear when knowledge of animal movement is a prerequisite for understanding and predicting population dynamics. GPS and other technologies enable detailed tracking of animal location concurrently with acquisition of landscape data and information on individual physiology. These tools can be used to refine our understanding of the mechanistic links between behaviour and individual condition through 'spatially informed' movement models where time allocation to different behaviours affects individual survival and reproduction. For some species, socially informed models that address the movements and average fitness of differently sized groups and how they are affected by fission-fusion processes at relevant temporal scales are required. Furthermore, as most animals revisit some places and avoid others based on their previous experiences, we foresee the incorporation of long-term memory and intention in movement models. The way animals move has important consequences for the degree of mixing that we expect to find both within a population and between individuals of different species. The mixing rate dictates the level of detail required by models to capture the influence of heterogeneity and the dynamics of intra- and interspecific interaction.

  10. Communicating patient-reported outcome scores using graphic formats: results from a mixed-methods evaluation.

    PubMed

    Brundage, Michael D; Smith, Katherine C; Little, Emily A; Bantug, Elissa T; Snyder, Claire F

    2015-10-01

    Patient-reported outcomes (PROs) promote patient-centered care by using PRO research results ("group-level data") to inform decision making and by monitoring individual patient's PROs ("individual-level data") to inform care. We investigated the interpretability of current PRO data presentation formats. This cross-sectional mixed-methods study randomized purposively sampled cancer patients and clinicians to evaluate six group-data or four individual-data formats. A self-directed exercise assessed participants' interpretation accuracy and ratings of ease-of-understanding and usefulness (0 = least to 10 = most) of each format. Semi-structured qualitative interviews explored helpful and confusing format attributes. We reached thematic saturation with 50 patients (44 % < college graduate) and 20 clinicians. For group-level data, patients rated simple line graphs highest for ease-of-understanding and usefulness (median 8.0; 33 % selected for easiest to understand/most useful) and clinicians rated simple line graphs highest for ease-of-understanding and usefulness (median 9.0, 8.5) but most often selected line graphs with confidence limits or norms (30 % for each format for easiest to understand/most useful). Qualitative results support that clinicians value confidence intervals, norms, and p values, but patients find them confusing. For individual-level data, both patients and clinicians rated line graphs highest for ease-of-understanding (median 8.0 patients, 8.5 clinicians) and usefulness (median 8.0, 9.0) and selected them as easiest to understand (50, 70 %) and most useful (62, 80 %). The qualitative interviews supported highlighting scores requiring clinical attention and providing reference values. This study has identified preferences and opportunities for improving on current formats for PRO presentation and will inform development of best practices for PRO presentation. 
Both patients and clinicians prefer line graphs across group-level data and individual-level data formats, but clinicians prefer greater detail (e.g., statistical details) for group-level data.

  11. Learning from catchments to understand hydrological drought (HS Division Outstanding ECS Award Lecture)

    NASA Astrophysics Data System (ADS)

    Van Loon, Anne

    2017-04-01

    Drought is a global challenge. To be able to manage drought effectively on global or national scales without losing smaller scale variability and local context, we need to understand what the important hydrological drought processes are at different scales. Global scale models and satellite data are providing a global overview and catchment scale studies provide detailed site-specific information. I am interested in bridging these two scale levels by learning from catchments from around the world. Much information from local case studies is currently underused on larger scales because there is too much complexity. However, some of this complexity might be crucial on the level where people are facing the consequences of drought. In this talk, I will take you on a journey around the world to unlock catchment scale information and see if the comparison of many catchments gives us additional understanding of hydrological drought processes on the global scale. I will focus on the role of storage in different compartments of the terrestrial hydrological cycle, and how we as humans interact with that storage. I will discuss aspects of spatial and temporal variability in storage that are crucial for hydrological drought development and persistence, drawing from examples of catchments with storage in groundwater, lakes and wetlands, and snow and ice. The added complexity of human activities shifts the focus from natural catchments to catchments with anthropogenic increases in storage (reservoirs), decreases in storage (groundwater abstraction), and changes in hydrological processes (urbanisation). We learn how local information is providing valuable insights, in some cases challenging theoretical understanding or model outcomes. Despite the challenges of working across countries, with a high number of collaborators, in a multitude of languages, under data-scarce conditions, the scientific advantages of bridging scales are substantial. 
The comparison of catchments around the world can inform global scale models, give the needed spatial variability to satellite data, and help us make steps in understanding and managing the complex challenge of drought, now and in the future.

  12. FILTSoft: A computational tool for microstrip planar filter design

    NASA Astrophysics Data System (ADS)

    Elsayed, M. H.; Abidin, Z. Z.; Dahlan, S. H.; Cholan N., A.; Ngu, Xavier T. I.; Majid, H. A.

    2017-09-01

    Filters are key components of any communication system, used to control spectrum and suppress interference. Designing a filter involves a long process as well as a good understanding of the underlying hardware technology. Hence this paper introduces an automated design tool based on a Matlab GUI, called FILTSoft (an acronym for Filter Design Software), to ease the process. FILTSoft is a user-friendly filter design tool that aids, guides and expedites calculations from the lumped-element level to the microstrip structure. Users only have to provide the required filter specifications and the material description. FILTSoft then calculates and displays the lumped-element details, the planar filter structure, and the expected filter response. As an example, a lowpass filter design was calculated using FILTSoft and the results were validated against prototype measurements for comparison purposes.
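    FILTSoft's internal calculations are not given in this record; as a hedged sketch of the kind of lumped-element computation such a tool performs, the example below computes normalized Butterworth lowpass prototype element values and scales them to inductors and capacitors for a chosen cutoff frequency and system impedance. The function names and the choice of a Butterworth prototype are assumptions for illustration, not details of FILTSoft itself.

```python
import math

def butterworth_prototype(n):
    """Normalized Butterworth lowpass prototype element values g_1..g_n,
    from the standard closed form g_k = 2*sin((2k-1)*pi/(2n))."""
    return [2 * math.sin((2 * k - 1) * math.pi / (2 * n)) for k in range(1, n + 1)]

def scale_lowpass(g, f_cutoff_hz, z0=50.0):
    """Scale prototype values to physical inductors (H) and capacitors (F).

    Assumes a ladder network starting with a series element: odd-indexed
    prototype values become series inductors, even-indexed become shunt
    capacitors.
    """
    w_c = 2 * math.pi * f_cutoff_hz
    elements = []
    for i, gk in enumerate(g, start=1):
        if i % 2:  # series inductor: L = g * Z0 / w_c
            elements.append(("L", gk * z0 / w_c))
        else:      # shunt capacitor: C = g / (Z0 * w_c)
            elements.append(("C", gk / (z0 * w_c)))
    return elements

# Example: 3rd-order lowpass, 1 GHz cutoff in a 50-ohm system.
g_values = butterworth_prototype(3)
ladder = scale_lowpass(g_values, 1e9)
```

A microstrip tool would then convert each L and C into a transmission-line section; that geometric step depends on the substrate description and is omitted here.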

  13. WE-DE-202-00: Connecting Radiation Physics with Computational Biology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    Radiation therapy for the treatment of cancer has been established as a highly precise and effective way to eradicate a localized region of diseased tissue. To achieve further significant gains in the therapeutic ratio, we need to move towards biologically optimized treatment planning. To achieve this goal, we need to understand how the radiation-type dependent patterns of induced energy depositions within the cell (physics) connect via molecular, cellular and tissue reactions to treatment outcomes such as tumor control and undesirable effects on normal tissue. Several computational biology approaches have been developed connecting physics to biology. Monte Carlo simulations are the most accurate method to calculate physical dose distributions at the nanometer scale; however, simulations at the DNA scale are slow, and repair processes are generally not simulated. Alternative models that rely on the random formation of individual DNA lesions within one or two turns of the DNA have been shown to reproduce the clusters of DNA lesions, including single strand breaks (SSBs) and double strand breaks (DSBs), without the need for detailed track structure simulations. Efficient computational simulations of initial DNA damage induction facilitate computational modeling of DNA repair and other molecular and cellular processes. Mechanistic, multiscale models provide a useful conceptual framework to test biological hypotheses and help connect fundamental information about track structure and dosimetry at the sub-cellular level to dose-response effects on larger scales. In this symposium we will learn about the current state of the art of computational approaches estimating radiation damage at the cellular and sub-cellular scale. How can understanding the physics interactions at the DNA level be used to predict biological outcome? 
We will discuss whether and how such calculations are relevant to advancing our understanding of radiation damage and its repair, or whether the underlying biological processes are too complex for a mechanistic approach. Can computer simulations be used to guide future biological research? We will debate the feasibility of explaining biology from a physicist's perspective. Learning Objectives: Understand the potential applications and limitations of computational methods for dose-response modeling at the molecular, cellular and tissue levels. Learn about the mechanisms of action underlying the induction, repair and biological processing of damage to DNA and other constituents. Understand how effects and processes at one biological scale impact biological processes and outcomes at other scales. J. Schuemann: NCI/NIH grants. S. McMahon: Funding from the European Commission FP7 (grant EC FP7 MC-IOF-623630).

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    McMahon, S.


  15. WE-DE-202-01: Connecting Nanoscale Physics to Initial DNA Damage Through Track Structure Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schuemann, J.


  16. Human Research Program Unique Processes, Criteria, and Guidelines (UPCG). Revision C, July 28, 2011

    NASA Technical Reports Server (NTRS)

    Chin, Duane

    2011-01-01

    This document defines the processes, criteria, and guidelines exclusive to managing the Human Research Program (HRP). The intent of this document is to provide instruction to the reader in the form of processes, criteria, and guidelines. Of the three instructional categories, processes contain the most detail because of the need for a systematic series of actions directed to some end. In contrast, criteria have less detail than processes, creating a rule or principle structure for evaluating or testing something. Guidelines are a higher-level indication of a course of action, typically with the least amount of detail. The lack of detail in guidelines allows the reader flexibility when performing an action or actions.

  17. Evaluation of the lubrication mechanism at various rotation speeds and granule filling levels in a container mixer using a thermal effusivity sensor.

    PubMed

    Uchiyama, Jumpei; Aoki, Shigeru

    2015-01-01

    To study the detailed mechanism of the lubrication process using a thermal effusivity sensor, the relationships of lubrication progress with the pattern of powder flow, the rotation speed and the filling level were investigated. The thermal effusivity profile was studied as a function of the number of rotations at various rotation speeds. It was observed that at lower rotation speeds, the profiles of the lubrication progress were almost the same regardless of the rotation speed. In this region, the highest speed was defined as the critical rotation speed (CRS), which was found to be one of the important factors. The CRS was closely related to avalanche flow in the blender. Two phases were observed in the lubrication process. The first phase was influenced by the CRS and the filling level in the blender. The second phase was influenced by the rotation speed. The two-phase process is proposed to consist of macro-level progression of lubricant dispersion (first phase) and micro-level progression of the coating of powder particles with lubricant (second phase). Accurate monitoring by the thermal effusivity sensor helped provide a better understanding of the lubrication process.

  18. Temporal distance and person memory: thinking about the future changes memory for the past.

    PubMed

    Wyer, Natalie A; Perfect, Timothy J; Pahl, Sabine

    2010-06-01

    Psychological distance has been shown to influence how people construe an event such that greater distance produces high-level construal (characterized by global or holistic processing) and lesser distance produces low-level construal (characterized by detailed or feature-based processing). The present research tested the hypothesis that construal level has carryover effects on how information about an event is retrieved from memory. Two experiments manipulated temporal distance and found that greater distance (high-level construal) improves face recognition and increases retrieval of the abstract features of an event, whereas lesser distance (low-level construal) impairs face recognition and increases retrieval of the concrete details of an event. The findings have implications for transfer-inappropriate processing accounts of face recognition and event memory, and suggest potential applications in forensic settings.

  19. Tackling The Dragon: Investigating Lensed Galaxy Structure

    NASA Astrophysics Data System (ADS)

    Fortenberry, Alexander; Livermore, Rachael

    2018-01-01

    Galaxies have shown a rapid decrease in star formation beginning at a redshift of around 1-2 and continuing to the present day. To understand the processes underpinning this change, we need to observe the inner structure of galaxies and understand where and how stellar mass builds up. However, at high redshift our observable resolution is limited, which hinders the accuracy of the data. This lack of resolution at high redshift can be counteracted with gravitational lensing. The magnification provided by a gravitational lens between us and the galaxies in question enables us to see extreme detail within those galaxies. To begin fine-tuning this process, we used Hubble data of Abell 370, a galaxy cluster that lenses a galaxy known as "The Dragon" at z=0.725. With the increased detail provided by the gravitational lens, we present a detailed analysis of the galaxy's spatially resolved star formation rate, stellar ages, and masses.

  20. Magnetic field reconnection. [energy conversion in space plasma

    NASA Technical Reports Server (NTRS)

    Sonnerup, U. O.

    1979-01-01

    A reasonably detailed description is obtained of the current status of our understanding of magnetic field reconnection. The picture that emerges is of a process, simple in concept but extremely complicated and multifaceted in detail. Nonlinear MHD processes in the external flow region, governed by distant boundary conditions, are coupled to nonlinear microscopic plasma processes in the diffusion region, in a manner not clearly understood. It appears that reconnection may operate in entirely different ways for different plasma parameters and different external boundary conditions. Steady reconnection may be allowed in some cases, forbidden in others, with intermediate situations involving impulsive or pulsative events.

  1. The Cognitive Spiral: Creative Thinking and Cognitive Processing.

    ERIC Educational Resources Information Center

    Ebert, Edward S., II

    1994-01-01

    The lack of a common understanding of the construct of creative thinking is noted, and the cognitive spiral model is presented, which conceptualizes creative thinking as an integral component of all cognitive processing. This article details the synthesis of a definition and the structure of a model of cognitive processing. (Author/DB)

  2. Tactical Synthesis Of Efficient Global Search Algorithms

    NASA Technical Reports Server (NTRS)

    Nedunuri, Srinivas; Smith, Douglas R.; Cook, William R.

    2009-01-01

    Algorithm synthesis transforms a formal specification into an efficient algorithm to solve a problem. Algorithm synthesis in Specware combines the formal specification of a problem with a high-level algorithm strategy. To derive an efficient algorithm, a developer must define operators that refine the algorithm by combining the generic operators in the algorithm with the details of the problem specification. This derivation requires skill and a deep understanding of the problem and the algorithmic strategy. In this paper we introduce two tactics to ease this process. The tactics serve a purpose similar to that of the tactics used for determining indefinite integrals in calculus: they suggest possible ways to attack the problem.
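
    As an illustration of the global-search class of algorithms such tactics target (a generic sketch, not the paper's Specware derivation), the following branch-and-bound search combines generic branching with a problem-specific bound, here for a toy knapsack specification:

```python
# Illustrative sketch of generic global search with branch-and-bound pruning.
# Candidate subspaces are split in two (take / skip an item), and a
# problem-specific bound prunes subspaces that cannot beat the incumbent.

def knapsack_branch_and_bound(items, capacity):
    """items: list of (value, weight) with positive weights.
    Returns the best total value achievable within capacity."""
    items = sorted(items, key=lambda vw: vw[0] / vw[1], reverse=True)
    best = 0

    def bound(i, value, room):
        # Fractional relaxation: an upper bound on what this subspace can yield.
        for v, w in items[i:]:
            if w <= room:
                value, room = value + v, room - w
            else:
                return value + v * room / w
        return value

    def search(i, value, room):
        nonlocal best
        best = max(best, value)
        if i == len(items) or bound(i, value, room) <= best:
            return  # prune: this subspace cannot improve on the incumbent
        v, w = items[i]
        if w <= room:
            search(i + 1, value + v, room - w)  # branch: take item i
        search(i + 1, value, room)              # branch: skip item i

    search(0, 0, capacity)
    return best

print(knapsack_branch_and_bound([(60, 10), (100, 20), (120, 30)], 50))  # -> 220
```

    The derivation skill the paper automates lies in finding the `bound`-style refinement operators for a given problem; the search skeleton itself is generic.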

  3. Understanding novel language

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dejong, G.F.; Waltz, D.L.

    1983-01-01

    This paper treats in some detail the problem of designing mechanisms that will allow one to deal with two types of novel language: (1) text requiring schema learning; and (2) the understanding of novel metaphorical use of verbs. Schema learning is addressed by four types of processes: schema composition, secondary effect elevation, schema alteration, and volitionalization. The processing of novel metaphors depends on a decompositional analysis of verbs into event shape diagrams, along with a matching process that uses semantic marker-like information, to construct novel meaning structures. The examples described have been chosen to be types that occur commonly, so that the rules needed to understand them can also be used to understand a much wider range of novel language. 38 references.

  4. Characteristics of Academic Detailing: Results of a Literature Review

    PubMed Central

    Van Hoof, Thomas J.; Harrison, Lisa G.; Miller, Nicole E.; Pappas, Maryanne S.; Fischer, Michael A.

    2015-01-01

    Background Academic detailing is an evidence-based strategy to improve patient care. Efforts to understand the intervention and to use it strategically require an understanding of its important characteristics. A recent systematic review and a subsequent reporting framework call for more accurate and complete reporting of continuing medical education interventions. Objectives Building on a previously published systematic review of 69 studies, we sought to determine how an expanded set of 106 academic detailing studies, including many recently published articles, fared with respect to reporting of important data about this intervention. Methods We conducted a search of MEDLINE, the Cumulative Index to Nursing and Allied Health Literature (clinical) database, and Scopus, from which we identified 38 additional randomized controlled trials published from August 2007 through March 2013. Including the original 69 studies, we abstracted 106 available English-language studies and quantitatively analyzed information about 4 important characteristics of academic detailing: content of visits, clinicians being visited, communication process underlying visits, and outreach workers making visits. Results We found considerable variation (36.5%-100%) in the extent of reporting intervention characteristics, especially about the communication process underlying visits and the outreach workers making visits. The best overall documentation of intervention characteristics of any single study was 68%. Results also demonstrate wide variation in the approach to academic detailing. Conclusions This study demonstrates the need for a standardized approach to collecting and reporting data about academic detailing interventions. Our findings also highlight opportunities for using academic detailing more effectively in research and quality-improvement efforts. PMID:26702333

  5. Characteristics of Academic Detailing: Results of a Literature Review.

    PubMed

    Van Hoof, Thomas J; Harrison, Lisa G; Miller, Nicole E; Pappas, Maryanne S; Fischer, Michael A

    2015-11-01

    Academic detailing is an evidence-based strategy to improve patient care. Efforts to understand the intervention and to use it strategically require an understanding of its important characteristics. A recent systematic review and a subsequent reporting framework call for more accurate and complete reporting of continuing medical education interventions. Building on a previously published systematic review of 69 studies, we sought to determine how an expanded set of 106 academic detailing studies, including many recently published articles, fared with respect to reporting of important data about this intervention. We conducted a search of MEDLINE, the Cumulative Index to Nursing and Allied Health Literature (clinical) database, and Scopus, from which we identified 38 additional randomized controlled trials published from August 2007 through March 2013. Including the original 69 studies, we abstracted 106 available English-language studies and quantitatively analyzed information about 4 important characteristics of academic detailing: content of visits, clinicians being visited, communication process underlying visits, and outreach workers making visits. We found considerable variation (36.5%-100%) in the extent of reporting intervention characteristics, especially about the communication process underlying visits and the outreach workers making visits. The best overall documentation of intervention characteristics of any single study was 68%. Results also demonstrate wide variation in the approach to academic detailing. This study demonstrates the need for a standardized approach to collecting and reporting data about academic detailing interventions. Our findings also highlight opportunities for using academic detailing more effectively in research and quality-improvement efforts.

  6. LACIS-T - A moist air wind tunnel for investigating the interactions between cloud microphysics and turbulence

    NASA Astrophysics Data System (ADS)

    Niedermeier, Dennis; Voigtländer, Jens; Siebert, Holger; Desai, Neel; Shaw, Raymond; Chang, Kelken; Krueger, Steven; Schumacher, Jörg; Stratmann, Frank

    2017-11-01

    Turbulence-cloud droplet interaction processes have been investigated primarily through numerical simulation and field measurements over the last ten years. However, only in the laboratory can we be confident in our knowledge of initial and boundary conditions, and able to measure for extended times under statistically stationary and repeatable conditions. The newly built turbulent wind tunnel LACIS-T (Turbulent Leipzig Aerosol Cloud Interaction Simulator) is therefore an ideal facility for pursuing a mechanistic understanding of these processes. Within the tunnel we can adjust precisely controlled turbulent temperature and humidity fields so as to achieve supersaturation levels that allow detailed investigations of the interactions between cloud microphysical processes (e.g., cloud droplet activation) and the turbulent flow, under well-defined and reproducible laboratory conditions. We will present the fundamental operating principle, first results from ongoing characterization efforts, numerical simulations, as well as first droplet activation experiments.
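
    The supersaturation mechanism such a tunnel can exploit is sketchable with textbook thermodynamics: isobarically mixing two saturated air streams at different temperatures yields a supersaturated mixture, because saturation vapor pressure is convex in temperature. The sketch below uses the Magnus approximation, neglects latent heating, and uses an exaggerated temperature difference for clarity; none of the numbers are LACIS-T operating values.

```python
import math

def e_sat_hpa(t_c):
    """Saturation vapor pressure over water (hPa), Magnus approximation."""
    return 6.112 * math.exp(17.62 * t_c / (243.12 + t_c))

def mixing_supersaturation(t1_c, t2_c, frac=0.5):
    """Supersaturation (%) when two saturated parcels mix isobarically.

    Temperature and vapor pressure mix roughly linearly, but e_sat(T) is
    convex, so the mixture's vapor pressure can exceed saturation at the
    mixed temperature. Latent heating is neglected in this sketch.
    """
    t_mix = frac * t1_c + (1 - frac) * t2_c
    e_mix = frac * e_sat_hpa(t1_c) + (1 - frac) * e_sat_hpa(t2_c)
    return (e_mix / e_sat_hpa(t_mix) - 1.0) * 100.0

# Equal-parts mixing of saturated air at 5 C and 25 C (exaggerated contrast):
print(f"{mixing_supersaturation(5.0, 25.0):.2f}% supersaturation")
```

    With realistic, much smaller temperature contrasts the same convexity argument still produces the fraction-of-a-percent supersaturations relevant to droplet activation.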

  7. High-energy solar flare observations at the Y2K maximum

    NASA Astrophysics Data System (ADS)

    Emslie, A. Gordon

    2000-04-01

    Solar flares afford an opportunity to observe processes associated with the acceleration and propagation of high-energy particles at a level of detail not accessible in any other astrophysical source. I will review some key results from previous high-energy solar flare observations, including those from the Compton Gamma-Ray Observatory, and the problems that they pose for our understanding of energy release and particle acceleration processes in the astrophysical environment. I will then discuss a program of high-energy observations to be carried out during the upcoming 2000-2001 solar maximum that is aimed at addressing and resolving these issues. A key element in this observational program is the High Energy Solar Spectroscopic Imager (HESSI) spacecraft, which will provide imaging spectroscopic observations with spatial, temporal, and energy resolutions commensurate with the physical processes believed to be operating, and will in addition provide the first true gamma-ray spectroscopy of an astrophysical source.

  8. SeaWiFS Science Algorithm Flow Chart

    NASA Technical Reports Server (NTRS)

    Darzi, Michael

    1998-01-01

    This flow chart describes the baseline science algorithms for the Sea-viewing Wide Field-of-view Sensor (SeaWiFS) Data Processing System (SDPS). As such, it includes only processing steps used in the generation of the operational products that are archived by NASA's Goddard Space Flight Center (GSFC) Distributed Active Archive Center (DAAC). It is meant to provide the reader with a basic understanding of the scientific algorithm steps applied to SeaWiFS data. It does not include non-science steps, such as format conversions, and places the greatest emphasis on the geophysical calculations of the level-2 processing. Finally, the flow chart reflects the logic sequences and the conditional tests of the software so that it may be used to evaluate the fidelity of the implementation of the scientific algorithm. In many cases however, the chart may deviate from the details of the software implementation so as to simplify the presentation.

  9. The construction of causal schemes: learning mechanisms at the knowledge level.

    PubMed

    diSessa, Andrea A

    2014-06-01

    This work uses microgenetic study of classroom learning to illuminate (1) the role of pre-instructional student knowledge in the construction of normative scientific knowledge, and (2) the learning mechanisms that drive change. Three enactments of an instructional sequence designed to lead to a scientific understanding of thermal equilibration are used as data sources. Only data from a scaffolded student inquiry preceding introduction of a normative model were used. Hence, the study involves nearly autonomous student learning. In two classes, students developed stable and socially shared explanations ("causal schemes") for understanding thermal equilibration. One case resulted in a near-normative understanding, while the other resulted in a non-normative "alternative conception." The near-normative case seems to be a particularly clear example wherein the constructed causal scheme is a composition of previously documented naïve conceptions. Detailed prior description of these naïve elements allows a much better than usual view of the corresponding details of change during construction of the new scheme. A list of candidate mechanisms that can account for observed change is presented. The non-normative construction seems also to be a composition, albeit of a different structural form, using a different (although similar) set of naïve elements. This article provides one of very few high-resolution process analyses showing the productive use of naïve knowledge in learning. © 2014 Cognitive Science Society, Inc.

  10. A framework for understanding grocery purchasing in a low-income urban environment.

    PubMed

    Zachary, Drew A; Palmer, Anne M; Beckham, Sarah W; Surkan, Pamela J

    2013-05-01

    Research demonstrates that food desert environments limit low-income shoppers' ability to purchase healthy foods, thereby increasing their likelihood of diet-related illnesses. We sought to understand how individuals in an urban American food desert make grocery-purchasing decisions, and specifically why unhealthy purchases arise. Analysis is based on ethnographic data from participant observation, 37 in-depth interviews, and three focus groups with low-income, primarily African American shoppers with children. We found participants had detailed knowledge of and preference for healthy foods, but the obligation to consistently provide food for their families required them to apply specific decision criteria which, combined with structural qualities of the supermarket environment, increased unhealthy purchases and decreased healthy purchases. Applying situated cognition theory, we constructed an emic model explaining this widely shared grocery-purchasing decision process and its implications. This context-specific understanding of behavior suggests that multifaceted, system-level approaches to intervention are needed to increase healthy purchasing in food deserts.

  11. Mechanism of Diphtheria Toxin Catalytic Domain Delivery to the Eukaryotic Cell Cytosol and the Cellular Factors that Directly Participate in the Process

    PubMed Central

    Murphy, John R.

    2011-01-01

    Research on diphtheria and anthrax toxins over the past three decades has culminated in a detailed understanding of their structure function relationships (e.g., catalytic (C), transmembrane (T), and receptor binding (R) domains), as well as the identification of their eukaryotic cell surface receptor, an understanding of the molecular events leading to the receptor-mediated internalization of the toxin into an endosomal compartment, and the pH triggered conformational changes required for pore formation in the vesicle membrane. Recently, a major research effort has been focused on the development of a detailed understanding of the molecular interactions between each of these toxins and eukaryotic cell factors that play an essential role in the efficient translocation of their respective catalytic domains through the trans-endosomal vesicle membrane pore and delivery into the cell cytosol. In this review, I shall focus on recent findings that have led to a more detailed understanding of the mechanism by which the diphtheria toxin catalytic domain is delivered to the eukaryotic cell cytosol. While much work remains, it is becoming increasingly clear that the entry process is facilitated by specific interactions with a number of cellular factors in an ordered sequential fashion. In addition, since diphtheria, anthrax lethal factor and anthrax edema factor all carry multiple coatomer I complex binding motifs and COPI complex has been shown to play an essential role in entry process, it is likely that the initial steps in catalytic domain entry of these divergent toxins follow a common mechanism. PMID:22069710

  12. Comparison of Control Approaches in Genetic Regulatory Networks by Using Stochastic Master Equation Models, Probabilistic Boolean Network Models and Differential Equation Models and Estimated Error Analyzes

    NASA Astrophysics Data System (ADS)

    Caglar, Mehmet Umut; Pal, Ranadip

    2011-03-01

    The central dogma of molecular biology states that ``information cannot be transferred back from protein to either protein or nucleic acid''. However, this assumption is not exactly correct in most cases: there are many feedback loops and interactions between different levels of these systems. Such interactions are hard to analyze because of the scarcity of cell-level data and the probabilistic, nonlinear nature of the interactions. Several models are widely used to analyze and simulate these nonlinear interactions. Stochastic Master Equation (SME) models capture the probabilistic nature of the interactions in detail, at a high computational cost. Probabilistic Boolean Network (PBN) models give a coarser-scale picture of the stochastic processes at a lower computational cost. Differential Equation (DE) models give the time evolution of the mean values of the processes in a highly cost-effective way. Understanding the relations between the predictions of these models is important for judging the reliability of simulations of genetic regulatory networks. In this work the success of the mapping between SME, PBN and DE models is analyzed, and the accuracy and effectiveness of the control policies generated using the PBN and DE models are compared.
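
    The SME-versus-DE tradeoff can be sketched on the simplest gene-expression model, a birth-death process with constant production and linear degradation (an illustrative toy, not the networks analyzed in the work): an exact Gillespie simulation samples the master equation at per-event cost, while the differential equation cheaply gives only the mean.

```python
import random

# Toy birth-death gene-expression model: production at rate K, degradation
# at rate GAMMA * n. The stochastic simulation is exact but expensive; the
# ODE gives only the mean trajectory, cheaply. Both settle near K/GAMMA.
K, GAMMA, T_END = 10.0, 1.0, 50.0

def gillespie_endpoint(seed):
    """Copy number at T_END from one exact trajectory (Gillespie SSA)."""
    rng = random.Random(seed)
    t, n = 0.0, 0
    while True:
        total = K + GAMMA * n            # total propensity
        t += rng.expovariate(total)      # time to next reaction event
        if t > T_END:
            return n
        if rng.random() < K / total:     # choose which reaction fired
            n += 1                       # production
        else:
            n -= 1                       # degradation

def ode_mean():
    """Deterministic model dn/dt = K - GAMMA*n, forward-Euler integration."""
    n, t, dt = 0.0, 0.0, 1e-3
    while t < T_END:
        n += (K - GAMMA * n) * dt
        t += dt
    return n

samples = [gillespie_endpoint(s) for s in range(500)]
print(f"SSA mean copy number ~ {sum(samples) / len(samples):.1f}")
print(f"ODE steady state     ~ {ode_mean():.1f}")
```

    The DE model recovers the SSA mean here only because the rates are linear; with nonlinear feedback the two can diverge, which is the gap the abstract's model comparison probes.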

  13. Lean flammability limit of downward propagating hydrogen-air flames

    NASA Technical Reports Server (NTRS)

    Patnaik, G.; Kailasanath, K.

    1992-01-01

    Detailed multidimensional numerical simulations that include the effects of wall heat losses have been performed to study the dynamics of downward flame propagation and extinguishment in lean hydrogen-air mixtures. The computational results show that a downward propagating flame in an isothermal channel has a flammability limit of around 9.75 percent. This is in excellent agreement with experimental results. Also in excellent agreement are the detailed observations of the flame behavior at the point of extinguishment. The primary conclusion of this work is that detailed numerical simulations that include wall heat losses and the effect of gravity can adequately simulate the dynamics of the extinguishment process in downward-propagating hydrogen-air flames. These simulations can be examined in detail to gain understanding of the actual extinction process.

  14. A Connectionist Approach to Embodied Conceptual Metaphor

    PubMed Central

    Flusberg, Stephen J.; Thibodeau, Paul H.; Sternberg, Daniel A.; Glick, Jeremy J.

    2010-01-01

    A growing body of data has been gathered in support of the view that the mind is embodied and that cognition is grounded in sensory-motor processes. Some researchers have gone so far as to claim that this paradigm poses a serious challenge to central tenets of cognitive science, including the widely held view that the mind can be analyzed in terms of abstract computational principles. On the other hand, computational approaches to the study of mind have led to the development of specific models that help researchers understand complex cognitive processes at a level of detail that theories of embodied cognition (EC) have sometimes lacked. Here we make the case that connectionist architectures in particular can illuminate many surprising results from the EC literature. These models can learn the statistical structure in their environments, providing an ideal framework for understanding how simple sensory-motor mechanisms could give rise to higher-level cognitive behavior over the course of learning. Crucially, they form overlapping, distributed representations, which have exactly the properties required by many embodied accounts of cognition. We illustrate this idea by extending an existing connectionist model of semantic cognition in order to simulate findings from the embodied conceptual metaphor literature. Specifically, we explore how the abstract domain of time may be structured by concrete experience with space (including experience with culturally specific spatial and linguistic cues). We suggest that both EC researchers and connectionist modelers can benefit from an integrated approach to understanding these models and the empirical findings they seek to explain. PMID:21833256

  15. Toward Process-resolving Synthesis and Prediction of Arctic Climate Change Using the Regional Arctic System Model

    NASA Astrophysics Data System (ADS)

    Maslowski, W.

    2017-12-01

    The Regional Arctic System Model (RASM) has been developed to better understand the operation of the Arctic System at process scale and to improve prediction of its change at a spectrum of time scales. RASM is a pan-Arctic, fully coupled ice-ocean-atmosphere-land model with a marine biogeochemistry extension to the ocean and sea ice models. The main goal of our research is to advance a system-level understanding of critical processes and feedbacks in the Arctic and their links with the Earth System. A secondary, equally important objective is to identify model needs for new or additional observations to better understand such processes and to help constrain models. Finally, RASM has been used to produce sea ice forecasts for September 2016 and 2017, in contribution to the Sea Ice Outlook of the Sea Ice Prediction Network. Future RASM forecasts are likely to include increased resolution for model components and ecosystem predictions. Such research is in direct support of the US environmental assessment and prediction needs, including those of the U.S. Navy, Department of Defense, and the recent IARPC Arctic Research Plan 2017-2021. In addition to an overview of RASM technical details, selected model results are presented from a hierarchy of climate models together with available observations in the region to better understand potential oceanic contributions to polar amplification. RASM simulations are analyzed to evaluate model skill in representing seasonal climatology as well as interannual and multi-decadal climate variability and predictions. Selected physical processes and resulting feedbacks are discussed to emphasize the need for fully coupled climate model simulations, high model resolution and sensitivity of simulated sea ice states to scale dependent model parameterizations controlling ice dynamics, thermodynamics and coupling with the atmosphere and ocean.

  16. Computational fluid dynamics and frequency-dependent finite-difference time-domain method coupling for the interaction between microwaves and plasma in rocket plumes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kinefuchi, K.; Funaki, I.; Shimada, T.

    Under certain conditions during rocket flights, ionized exhaust plumes from solid rocket motors may interfere with radio frequency transmissions. To understand the relevant physical processes involved in this phenomenon and establish a prediction process for in-flight attenuation levels, we attempted to measure microwave attenuation caused by rocket exhaust plumes in a sea-level static firing test for a full-scale solid propellant rocket motor. The microwave attenuation level was calculated by a coupling simulation of the inviscid-frozen-flow computational fluid dynamics of an exhaust plume and detailed analysis of microwave transmissions by applying a frequency-dependent finite-difference time-domain method with the Drude dispersion model. The calculated microwave attenuation level agreed well with the experimental results, except in the case of interference downstream of the Mach disk in the exhaust plume. It was concluded that the coupling estimation method based on the physics of the frozen plasma flow with Drude dispersion would be suitable for actual flight conditions, although the mixing and afterburning in the plume should be considered depending on the flow condition.
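
    The plasma side of such a coupling can be sketched with the textbook Drude model alone (this is not the paper's coupled CFD/FDTD code, and the electron density and collision frequency below are assumed illustrative values): the complex permittivity yields a complex wavenumber whose imaginary part sets the attenuation per meter.

```python
import cmath
import math

# Textbook Drude model of a collisional plasma; n_e and nu_c below are
# assumed illustrative values, not outputs of the paper's simulation.
E_CHARGE = 1.602e-19   # elementary charge, C
E_MASS = 9.109e-31     # electron mass, kg
EPS0 = 8.854e-12       # vacuum permittivity, F/m
C_LIGHT = 2.998e8      # speed of light, m/s

def attenuation_db_per_m(freq_hz, n_e, nu_c):
    """Plane-wave attenuation (dB/m) in a plasma with electron density n_e
    (m^-3) and electron collision frequency nu_c (s^-1)."""
    omega = 2.0 * math.pi * freq_hz
    omega_p2 = n_e * E_CHARGE**2 / (E_MASS * EPS0)          # plasma freq squared
    eps_r = 1.0 - omega_p2 / (omega * (omega - 1j * nu_c))  # Drude permittivity
    k = (omega / C_LIGHT) * cmath.sqrt(eps_r)               # complex wavenumber
    alpha_np = abs(k.imag)                                  # field decay, Np/m
    return 20.0 * alpha_np / math.log(10.0)                 # Np/m -> dB/m

# S-band telemetry through a dense patch of exhaust plasma (assumed values):
print(f"{attenuation_db_per_m(2.3e9, 1e17, 1e10):.1f} dB/m")
```

    The full simulation replaces these uniform assumed values with the CFD-computed plume fields, cell by cell, inside the FDTD grid.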

  17. Developing visualisation software for rehabilitation: investigating the requirements of patients, therapists and the rehabilitation process

    PubMed Central

    Loudon, David; Macdonald, Alastair S.; Carse, Bruce; Thikey, Heather; Jones, Lucy; Rowe, Philip J.; Uzor, Stephen; Ayoade, Mobolaji; Baillie, Lynne

    2012-01-01

    This paper describes the ongoing process of the development and evaluation of prototype visualisation software, designed to assist in the understanding and the improvement of appropriate movements during rehabilitation. The process of engaging users throughout the research project is detailed in the paper, including how the design of the visualisation software is being adapted to meet the emerging understanding of the needs of patients and professionals, and of the rehabilitation process. The value of the process for the design of the visualisation software is illustrated with a discussion of the findings of pre-pilot focus groups with stroke survivors and therapists. PMID:23011812

  18. Ionizing radiation processing and its potential in advancing biorefining and nanocellulose composite materials manufacturing

    NASA Astrophysics Data System (ADS)

    Postek, Michael T.; Poster, Dianne L.; Vládar, András E.; Driscoll, Mark S.; LaVerne, Jay A.; Tsinas, Zois; Al-Sheikhly, Mohamad I.

    2018-02-01

    Nanocellulose is a high value material that has gained increasing attention because of its high strength, stiffness, unique photonic and piezoelectric properties, high stability and uniform structure. Through utilization of a biorefinery concept, nanocellulose can be produced in large volumes from wood at relatively low cost via ionizing radiation processing. Ionizing radiation causes significant breakdown of the polysaccharide and leads to the production of potentially useful gaseous products such as H2 and CO. The application of radiation processing to the production of nanocellulose from woody and non-wood sources, such as field grasses, bio-refining by-products, industrial pulp waste, and agricultural surplus materials remains an open field, ripe for innovation and application. Elucidating the mechanisms of the radiolytic decomposition of cellulose and the mass generation of nanocellulose by radiation processing is key to tapping into this source of nanocellulose for the growth of nanocellulostic-product development. More importantly, understanding the structural break-up of the cell walls as a function of radiation exposure is a key goal, and only through careful, detailed characterization and dimensional metrology can this be achieved at the level of detail that is needed to further the growth of large-scale radiation processing of plant materials. This work results from strong collaborations between NIST and its academic partners who are pursuing the unique demonstration of applied ionizing radiation processing to plant materials as well as the development of manufacturing metrology for novel nanomaterials.

  19. Ionizing radiation processing and its potential in advancing biorefining and nanocellulose composite materials manufacturing.

    PubMed

    Postek, Michael T; Poster, Dianne L; Vládar, András E; Driscoll, Mark S; LaVerne, Jay A; Tsinas, Zois; Al-Sheikhly, Mohamad I

    2018-02-01

    Nanocellulose is a high value material that has gained increasing attention because of its high strength, stiffness, unique photonic and piezoelectric properties, high stability and uniform structure. Through utilization of a biorefinery concept, nanocellulose can be produced in large volumes from wood at relatively low cost via ionizing radiation processing. Ionizing radiation causes significant breakdown of the polysaccharide and leads to the production of potentially useful gaseous products such as H2 and CO. The application of radiation processing to the production of nanocellulose from woody and non-wood sources, such as field grasses, bio-refining byproducts, industrial pulp waste, and agricultural surplus materials remains an open field, ripe for innovation and application. Elucidating the mechanisms of the radiolytic decomposition of cellulose and the mass generation of nanocellulose by radiation processing is key to tapping into this source of nanocellulose for the growth of nanocellulostic-product development. More importantly, understanding the structural break-up of the cell walls as a function of radiation exposure is a key goal, and only through careful, detailed characterization and dimensional metrology can this be achieved at the level of detail that is needed to further the growth of large-scale radiation processing of plant materials. This work results from strong collaborations between NIST and its academic partners who are pursuing the unique demonstration of applied ionizing radiation processing to plant materials as well as the development of manufacturing metrology for novel nanomaterials.

  20. Maintaining consistency between planning hierarchies: Techniques and applications

    NASA Technical Reports Server (NTRS)

    Zoch, David R.

    1987-01-01

    In many planning and scheduling environments, it is desirable to be able to view and manipulate plans at different levels of abstraction, allowing users the option of viewing and manipulating either a very detailed representation of the plan or a high-level, more abstract version of it. Generating a detailed plan from a more abstract plan requires domain-specific planning/scheduling knowledge; the reverse process of generating a high-level plan from a detailed plan (Reverse Plan Maintenance, or RPM) requires having the system remember the actions it took based on its domain-specific knowledge and its reasons for taking those actions. This reverse plan maintenance process is described as implemented in a specific planning and scheduling tool, the Mission Operations Planning Assistant (MOPA), along with applications of RPM to other planning and scheduling problems, emphasizing the knowledge that is needed to maintain the correspondence between the different hierarchical planning levels.
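
    A minimal sketch of the record-keeping RPM requires might look as follows (hypothetical names and steps, not MOPA's actual representation): each expansion of an abstract step into detailed actions is remembered along with its rationale, so the abstract plan can later be recovered from the detailed one.

```python
# Hypothetical sketch of RPM-style record keeping: when an abstract step is
# expanded into detailed actions, the system remembers the mapping and the
# reason, so the high-level plan can be recovered from the detailed plan.

class PlanExpansion:
    def __init__(self):
        # detailed action -> (abstract step it came from, recorded rationale)
        self.detail_to_abstract = {}

    def expand(self, abstract_step, detailed_actions, reason):
        """Record the expansion of one abstract step into detailed actions."""
        for action in detailed_actions:
            self.detail_to_abstract[action] = (abstract_step, reason)
        return detailed_actions

    def abstract_plan(self, detailed_plan):
        """Reverse Plan Maintenance: collapse detailed actions back to the
        abstract steps they came from, preserving order without repeats."""
        seen, abstract = set(), []
        for action in detailed_plan:
            step, _reason = self.detail_to_abstract[action]
            if step not in seen:
                seen.add(step)
                abstract.append(step)
        return abstract

planner = PlanExpansion()
detailed = planner.expand(
    "downlink pass",
    ["slew antenna", "lock signal", "record telemetry"],
    reason="station visibility window",
)
print(planner.abstract_plan(detailed))  # ['downlink pass']
```

    The stored rationale is what lets the system keep the two levels consistent when the user later edits the detailed plan directly.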

  1. Metabolomics-Based Elucidation of Active Metabolic Pathways in Erythrocytes and HSC-Derived Reticulocytes.

    PubMed

    Srivastava, Anubhav; Evans, Krystal J; Sexton, Anna E; Schofield, Louis; Creek, Darren J

    2017-04-07

    A detailed analysis of the metabolic state of human-stem-cell-derived erythrocytes allowed us to characterize the existence of active metabolic pathways in younger reticulocytes and compare them to mature erythrocytes. Using high-resolution LC-MS-based untargeted metabolomics, we found that reticulocytes had a comparatively much richer repertoire of metabolites, which spanned a range of metabolite classes. An untargeted metabolomics analysis using stable-isotope-labeled glucose showed that only glycolysis and the pentose phosphate pathway actively contributed to the biosynthesis of metabolites in erythrocytes, and these pathways were upregulated in reticulocytes. Most metabolite species found to be enriched in reticulocytes were residual pools of metabolites produced by earlier erythropoietic processes, and their systematic depletion in mature erythrocytes aligns with the simplification process, which is also seen at the cellular and the structural level. Our work shows that high-resolution LC-MS-based untargeted metabolomics provides a global coverage of the biochemical species that are present in erythrocytes. However, the incorporation of stable isotope labeling provides a more accurate description of the active metabolic processes that occur in each developmental stage. To our knowledge, this is the first detailed characterization of the active metabolic pathways of the erythroid lineage, and it provides a rich database for understanding the physiology of the maturation of reticulocytes into mature erythrocytes.

  2. Developmental Implications of the Levels of Processing Memory Framework.

    ERIC Educational Resources Information Center

    Naus, Mary J.

    The levels of processing framework for understanding memory development has generated little empirical or theoretical work that furthers an understanding of the developmental memory system. Although empirical studies by those testing the levels of processing framework have demonstrated that mnemonic strategies employed by children are the critical…

  3. Explaining rISC and 100% efficient TADF (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Monkman, Andrew P.; Etherington, Marc; Graves, David; Data, Przemyslaw; Dos Santos, Paloma Lays; Nobuyasu, Roberto; Baiao Dias, Fernando M.

    2016-09-01

    Detailed photophysical measurements of intramolecular charge transfer (ICT) states have been made both in solution and in the solid state. Temperature-dependent time-resolved emission, delayed emission, and photoinduced absorption are used to map the energy levels involved in molecular decay, and through detailed kinetic modelling of the thermally activated processes observed, true electron exchange energies and other energy barriers of the systems are determined, with the real states involved in the reverse intersystem crossing mechanism elucidated. For specific donor-acceptor molecules, the CT singlet and local triplet states (of donor or acceptor) are found to be the lowest-lying excited states of the molecule, with a very small energy barrier between them, on the order of kT. In these cases the decay kinetics of the molecules become significantly different from those of normal molecules, and the effect of rapid recycling between CT singlet and local triplet states is observed, which gives rise to the true triplet harvesting mechanism in TADF. Using a series of different TADF emitters we will show how energy level ordering does or does not affect TADF, and how ultimate OLED performance is dictated by energy level ordering, from 5% to 22% external quantum efficiency. From this understanding, we are able to define three criteria for TADF in different molecules, which will be discussed.
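
    The thermally activated recycling described here is commonly analyzed with an Arrhenius-type rate for reverse intersystem crossing, k_rISC ≈ k0·exp(−ΔE_ST/kT); the sketch below uses this standard form with illustrative values for k0 and the singlet-triplet gaps, not the paper's fitted parameters.

```python
import math

K_B_EV = 8.617e-5  # Boltzmann constant, eV/K

def k_risc(delta_e_st_ev, k0=1e7, t_kelvin=300.0):
    """Arrhenius-type reverse intersystem crossing rate (s^-1), the standard
    form for analyzing thermally activated delayed fluorescence. k0 and the
    gaps below are illustrative, not fitted values from the measurements."""
    return k0 * math.exp(-delta_e_st_ev / (K_B_EV * t_kelvin))

# kT at room temperature is ~26 meV: gaps near or below kT keep the
# singlet-triplet recycling fast, while gaps >> kT shut it down.
for gap_mev in (10, 25, 100, 300):
    rate = k_risc(gap_mev / 1000.0)
    print(f"dE_ST = {gap_mev:3d} meV -> k_rISC ~ {rate:.2e} s^-1")
```

    The temperature-dependent measurements in the abstract probe exactly this exponential sensitivity, which is why the extracted barriers can discriminate between candidate rISC mechanisms.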

  4. Energy Return on Investment (EROI) for Forty Global Oilfields Using a Detailed Engineering-Based Model of Oil Production.

    PubMed

    Brandt, Adam R; Sun, Yuchi; Bharadwaj, Sharad; Livingston, David; Tan, Eugene; Gordon, Deborah

    2015-01-01

    Studies of the energy return on investment (EROI) for oil production generally rely on aggregated statistics for large regions or countries. In order to better understand the drivers of the energy productivity of oil production, we use a novel approach that applies a detailed field-level engineering model of oil and gas production to estimate energy requirements of drilling, producing, processing, and transporting crude oil. We examine 40 global oilfields, utilizing detailed data for each field from hundreds of technical and scientific data sources. Resulting net energy return (NER) ratios for studied oil fields range from ≈2 to ≈100 MJ crude oil produced per MJ of total fuels consumed. External energy return (EER) ratios, which compare energy produced to energy consumed from external sources, exceed 1000:1 for fields that are largely self-sufficient. The lowest energy returns are found to come from thermally-enhanced oil recovery technologies. Results are generally insensitive to reasonable ranges of assumptions explored in sensitivity analysis. Fields with very large associated gas production are sensitive to assumptions about surface fluids processing due to the shifts in energy consumed under different gas treatment configurations. This model does not currently include energy invested in building oilfield capital equipment (e.g., drilling rigs), nor does it include other indirect energy uses such as labor or services.
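    The two ratios reported above follow directly from their definitions; a minimal sketch, where the field figures are hypothetical and not values from the study:

```python
def net_energy_return(mj_produced, mj_internal_fuel, mj_external_fuel):
    """NER: MJ of crude oil produced per MJ of all fuels consumed."""
    return mj_produced / (mj_internal_fuel + mj_external_fuel)

def external_energy_return(mj_produced, mj_external_fuel):
    """EER: MJ produced per MJ of energy bought from external sources."""
    return mj_produced / mj_external_fuel

# Hypothetical, largely self-sufficient field: most fuel comes from the
# field's own production, so EER far exceeds NER.
ner = net_energy_return(1000.0, 49.0, 1.0)   # -> 20.0
eer = external_energy_return(1000.0, 1.0)    # -> 1000.0
```

    The spread between NER and EER is exactly the self-sufficiency effect the abstract describes for fields exceeding 1000:1 EER.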

  5. Energy Return on Investment (EROI) for Forty Global Oilfields Using a Detailed Engineering-Based Model of Oil Production

    PubMed Central

    Brandt, Adam R.; Sun, Yuchi; Bharadwaj, Sharad; Livingston, David; Tan, Eugene; Gordon, Deborah

    2015-01-01

    Studies of the energy return on investment (EROI) for oil production generally rely on aggregated statistics for large regions or countries. In order to better understand the drivers of the energy productivity of oil production, we use a novel approach that applies a detailed field-level engineering model of oil and gas production to estimate energy requirements of drilling, producing, processing, and transporting crude oil. We examine 40 global oilfields, utilizing detailed data for each field from hundreds of technical and scientific data sources. Resulting net energy return (NER) ratios for studied oil fields range from ≈2 to ≈100 MJ crude oil produced per MJ of total fuels consumed. External energy return (EER) ratios, which compare energy produced to energy consumed from external sources, exceed 1000:1 for fields that are largely self-sufficient. The lowest energy returns are found to come from thermally-enhanced oil recovery technologies. Results are generally insensitive to reasonable ranges of assumptions explored in sensitivity analysis. Fields with very large associated gas production are sensitive to assumptions about surface fluids processing due to the shifts in energy consumed under different gas treatment configurations. This model does not currently include energy invested in building oilfield capital equipment (e.g., drilling rigs), nor does it include other indirect energy uses such as labor or services. PMID:26695068

  6. Improving Analytical Characterization of Glycoconjugate Vaccines through Combined High-Resolution MS and NMR: Application to Neisseria meningitidis Serogroup B Oligosaccharide-Peptide Glycoconjugates.

    PubMed

    Yu, Huifeng; An, Yanming; Battistel, Marcos D; Cipollo, John F; Freedberg, Darón I

    2018-04-17

    Conjugate vaccines are highly heterogeneous in terms of glycosylation sites and linked oligosaccharide length. Therefore, the characterization of conjugate vaccines' glycosylation state is challenging. However, improved product characterization can lead to enhancements in product control and product quality. Here, we present a synergistic combination of high-resolution mass spectrometry (MS) and nuclear magnetic resonance spectroscopy (NMR) for the analysis of glycoconjugates. We use the power of this strategy to characterize model polysaccharide conjugates and to demonstrate a detailed level of glycoproteomic analysis. These are first steps on model compounds that will help untangle the details of complex product characterization in conjugate vaccines. Ultimately, this strategy can be applied to enhance the characterization of polysaccharide conjugate vaccines. In this study, we lay the groundwork for the analysis of conjugate vaccines. To begin this effort, oligosaccharide-peptide conjugates were synthesized by periodate oxidation of an oligosaccharide of defined length, an α-2,8-linked sialic acid trimer, followed by reductive amination to link the trimer to an immunogenic peptide from tetanus toxoid. Combined mass spectrometry and nuclear magnetic resonance were used to monitor each reaction and the conjugation products. Complete NMR peak assignment and detailed MS information on oxidized oligosialic acid and conjugates are reported. These studies provide a deeper understanding of the conjugation chemistry process and products, which can lead to a better controlled production process.

  7. Implementation of cardiovascular disease prevention in primary health care: enhancing understanding using normalisation process theory.

    PubMed

    Volker, Nerida; Williams, Lauren T; Davey, Rachel C; Cochrane, Thomas; Clancy, Tanya

    2017-02-24

    The reorientation of primary health care towards prevention is fundamental to addressing the rising burden of chronic disease. However, in Australia, cardiovascular disease prevention practice in primary health care is not generally consistent with existing guidelines. The Model for Prevention study was a whole-of-system cardiovascular disease prevention intervention, with one component being enhanced lifestyle modification support and the addition of a health coaching service in the general practice setting. To determine the feasibility of translating intervention outcomes into real-world practice, implementation work done by stakeholders was examined using Normalisation Process Theory as a framework. Data were collected through interviews with 40 intervention participants, including general practitioners, practice nurses, practice managers, lifestyle advisors and patients. Data analysis was informed by normalisation process theory constructs. Stakeholders were in agreement that, while prevention is a key function of general practice, it was not their usual work. There were varying levels of engagement with the intervention by practice staff due to staff interest, capacity and turnover, but most staff reconfigured their work for the required activities. The lifestyle advisors believed staff had varied levels of interest in, and understanding of, their service, but most staff felt their role was useful. Patients expanded their existing relationships with their general practice, and most achieved their lifestyle modification goals. While the study highlighted the complex nature of the change required, many of the new or enhanced processes implemented as part of the intervention could be scaled up to improve the systems approach to prevention. Overcoming the barriers to change, such as the perception of CVD prevention as a 'hard sell', will rely on improving the value proposition for all stakeholders.
The study provided a detailed understanding of the work required to implement a complex cardiovascular disease prevention intervention within general practice. The findings highlighted the need for multiple strategies that engage all stakeholders. Normalisation process theory was a useful framework for guiding change implementation.

  8. Understanding Wave-mean Flow Feedbacks and Tropospheric Annular Variability

    NASA Astrophysics Data System (ADS)

    Lorenz, D. J.

    2016-12-01

    The structure of internal tropospheric variability is important for determining the impact of the stratosphere on the troposphere. This study aims to better understand the fundamental dynamical mechanisms that control the feedbacks between the eddies and the mean flow, which in turn select the tropospheric annular mode. Recent work using Rossby Wave Chromatography suggests that "barotropic processes", which directly impact the meridional propagation of wave activity (specifically the reflectivity of the poleward flank of the mid-latitude jet), are more important for the positive feedback between the annular mode and the eddies than "baroclinic processes", which involve changes in the generation of wave activity by baroclinic instability. In this study, experiments with a fully nonlinear quasi-geostrophic model are discussed which provide independent confirmation of the importance of barotropic versus baroclinic processes. The experiments take advantage of the steady-state balance at upper-levels between the meridional gradient in diabatic heating and the second derivative of the upper-level EP flux divergence. Simulations with standard Newtonian heating are compared to simulations with constant-in-time heating taken from the climatology of the standard run and it is found that the forced annular mode response to changes in surface friction is very similar. Moreover, as expected from the annular mode response, the eddy momentum fluxes are also very similar. This is despite the fact that the upper-level EP flux divergence is very different between the two simulations (upper-level EP flux divergence must remain constant in the constant heating simulation while in the standard simulation there is no such constraint). The upper-level balances are maintained by a large change in the baroclinic wave source (i.e. vertical EP flux), which is accompanied by little momentum flux change. 
Therefore the eddy momentum fluxes appear to be relatively insensitive to the wave activity source. A more detailed comparison suggests a helpful rule-of-thumb relating the amplitude of the baroclinic wave source to the upper-level vorticity flux forced by this wave source.

  9. Insights into Protein–Ligand Interactions: Mechanisms, Models, and Methods

    PubMed Central

    Du, Xing; Li, Yi; Xia, Yuan-Ling; Ai, Shi-Meng; Liang, Jing; Sang, Peng; Ji, Xing-Lai; Liu, Shu-Qun

    2016-01-01

    Molecular recognition, which is the process of biological macromolecules interacting with each other or various small molecules with a high specificity and affinity to form a specific complex, constitutes the basis of all processes in living organisms. Proteins, an important class of biological macromolecules, realize their functions through binding to themselves or other molecules. A detailed understanding of the protein–ligand interactions is therefore central to understanding biology at the molecular level. Moreover, knowledge of the mechanisms responsible for the protein-ligand recognition and binding will also facilitate the discovery, design, and development of drugs. In the present review, first, the physicochemical mechanisms underlying protein–ligand binding, including the binding kinetics, thermodynamic concepts and relationships, and binding driving forces, are introduced and rationalized. Next, three currently existing protein-ligand binding models—the “lock-and-key”, “induced fit”, and “conformational selection”—are described and their underlying thermodynamic mechanisms are discussed. Finally, the methods available for investigating protein–ligand binding affinity, including experimental and theoretical/computational approaches, are introduced, and their advantages, disadvantages, and challenges are discussed. PMID:26821017
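    The thermodynamic link between the binding affinity measurements and the free-energy concepts discussed in this review is the standard relation ΔG° = RT ln(Kd/c°). A minimal sketch (temperature and Kd values are illustrative):

```python
import math

R_J = 8.314462618  # molar gas constant, J/(mol*K)

def binding_free_energy_kj(kd_molar, temp_k=298.15):
    """Standard binding free energy in kJ/mol from a dissociation constant.

    Uses dG0 = RT ln(Kd / c0) with standard concentration c0 = 1 M, so a
    tighter binder (smaller Kd) gives a more negative dG0.
    """
    return R_J * temp_k * math.log(kd_molar) / 1000.0

# A 1 nM binder is roughly -51 kJ/mol at 25 C; a 1 uM binder ~ -34 kJ/mol.
dg_nanomolar = binding_free_energy_kj(1e-9)
dg_micromolar = binding_free_energy_kj(1e-6)
```

    Each thousand-fold improvement in Kd deepens the binding free energy by about 17 kJ/mol at room temperature.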

  10. Towards validated chemistry at extreme conditions: reactive MD simulations of shocked Polyvinyl Nitrate and Nitromethane

    NASA Astrophysics Data System (ADS)

    Islam, Md Mahbubul; Strachan, Alejandro

    A detailed atomistic-level understanding of the ultrafast chemistry of detonation processes in high energy materials is crucial to understanding their performance and safety. Recent advances in laser shocks and ultrafast spectroscopy are yielding the first direct experimental evidence of chemistry at extreme conditions. At the same time, reactive molecular dynamics (MD) on current high-performance computing platforms enables an atomic description of shock-induced chemistry with length and timescales approaching those of experiments. We use MD simulations with the reactive force field ReaxFF to investigate the shock-induced chemical decomposition mechanisms of polyvinyl nitrate (PVN) and nitromethane (NM). The effect of shock pressure on the chemical reaction mechanisms and kinetics of both materials is investigated. For direct comparison of our simulation results with experimentally derived IR absorption data, we performed spectral analysis using atomic velocities at various shock conditions. The combination of reactive MD simulations and ultrafast spectroscopy enables the validation of ReaxFF at extreme conditions and contributes to the interpretation of the experimental data, relating changes in spectral features to atomic processes. Work supported by the Office of Naval Research MURI program.
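    One common route from atomic velocities to an IR-comparable spectrum is the power spectrum of the velocity trajectory (the vibrational density of states); the abstract does not specify its exact procedure, so the following is a generic sketch of that standard technique:

```python
import numpy as np

def vibrational_spectrum(velocities, dt_fs):
    """Power spectrum of atomic velocities (vibrational density of states).

    velocities: array of shape (n_steps, n_atoms, 3) from an MD trajectory.
    Returns (frequencies in 1/fs, spectral intensity). This is the generic
    velocity-power-spectrum route, not necessarily the paper's procedure.
    """
    n_steps = velocities.shape[0]
    flat = velocities.reshape(n_steps, -1)
    # Sum power spectra over all atoms and Cartesian components.
    spectrum = (np.abs(np.fft.rfft(flat, axis=0)) ** 2).sum(axis=1)
    freqs = np.fft.rfftfreq(n_steps, d=dt_fs)
    return freqs, spectrum

# Synthetic check: a single 0.05 fs^-1 oscillation peaks at that frequency.
t = np.arange(1024) * 1.0  # 1 fs timestep
v = np.cos(2 * np.pi * 0.05 * t)[:, None, None] * np.ones((1, 2, 3))
freqs, spec = vibrational_spectrum(v, 1.0)
peak = freqs[np.argmax(spec[1:]) + 1]  # skip the DC bin
```

    In practice, shifts of such spectral peaks under compression are what get compared against the ultrafast IR data.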

  11. Communicating Flood Risk with Street-Level Data

    NASA Astrophysics Data System (ADS)

    Sanders, B. F.; Matthew, R.; Houston, D.; Cheung, W. H.; Karlin, B.; Schubert, J.; Gallien, T.; Luke, A.; Contreras, S.; Goodrich, K.; Feldman, D.; Basolo, V.; Serrano, K.; Reyes, A.

    2015-12-01

    Coastal communities around the world face significant and growing flood risks that require an accelerating adaptation response, and fine-resolution urban flood models could serve a pivotal role in enabling communities to meet this need. Such models depict impacts at the level of individual buildings and land parcels or "street level" - the same spatial scale at which individuals are best able to process flood risk information - constituting a powerful tool to help communities build better understandings of flood vulnerabilities and identify cost-effective interventions. To measure understanding of flood risk within a community and the potential impact of street-level models, we carried out a household survey of flood risk awareness in Newport Beach, California, a highly urbanized coastal lowland that presently experiences nuisance flooding from high tides, waves and rainfall and is expected to experience a significant increase in flood frequency and intensity with climate change. Interviews were completed with the aid of a wireless-enabled tablet device that respondents could use to identify areas they understood to be at risk of flooding and to view either a Federal Emergency Management Agency (FEMA) flood map or a more detailed map prepared with a hydrodynamic urban coastal flood model (UCI map) built with grid cells as fine as 3 m resolution and validated with historical flood data. Results indicate differences in the effectiveness of the UCI and FEMA maps at communicating the spatial distribution of flood risk, gender differences in how the maps affect flood understanding, and spatial biases in the perception of flood vulnerabilities.

  12. On-Call Communication in Orthopaedic Trauma: "A Picture Is Worth a Thousand Words"--A Survey of OTA Members.

    PubMed

    Molina, Cesar S; Callan, Alexandra K; Burgos, Eduardo J; Mir, Hassan R

    2015-05-01

    To quantify the effects of varying clinical communication styles (verbal and pictorial) on the ability of orthopaedic trauma surgeons to understand an injury and formulate an initial management plan. A Research Electronic Data Capture survey was e-mailed to all OTA members. Respondents quantified (5-point Likert scale) how confident they felt understanding an injury and establishing an initial management plan based on the information provided for 5 common orthopaedic trauma scenarios. Three verbal descriptions were created for each scenario and categorized as limited, moderate, or detailed. The questions were repeated with the addition of a radiographic image and then repeated a third time including a clinical photograph. Statistical evaluation consisted of descriptive statistics and Kruskal-Wallis analyses using STATA (version 12.0). Of the 221 respondents, there were a total of 95 who completed the entire survey. Nearly all were currently taking call (92/95 = 96.8%) and the majority were fellowship trained (79/95 = 83.2%). Most practice at a level I trauma center (58/95 = 61.1%) and work with orthopaedic residents (62/95 = 65.3%). There was a significant increase in confidence scores between a limited, moderate, and detailed description in all clinical scenarios for understanding the injury and establishing an initial management plan (P < 0.05). There was a significant difference in confidence scores between all 3 types of evidence presented (verbal, verbal + x-ray, verbal + x-ray + photograph) in both understanding and managing the injury for limited and moderate descriptions (P < 0.001). No differences were seen when adding pictorial information to the detailed verbal description. When comparing confidence scores between a detailed description without images and a limited description that includes radiographs and a photograph, no difference in confidence levels was seen in 7 of the 10 scenarios (P > 0.05). 
The addition of images in the form of radiographs and/or clinical photographs greatly improves the confidence of orthopaedic trauma surgeons in understanding injuries and establishing initial management plans with limited verbal information (P < 0.001). The inclusion of x-rays and photographs raises the confidence for understanding and management with limited verbal information to the level of a detailed verbal description in most scenarios. Mobile technology allows for easy secure transfer of images that can make up for the lack of available information from limited verbal descriptions because of the knowledge base of communicating providers.

  13. Understanding the Federal Proposal Review Process.

    ERIC Educational Resources Information Center

    Cavin, Janis I.

    Information on the peer review process for the evaluation of federal grant proposals is presented to help college grants administrators and faculty develop good proposals. This guidebook provides an overview of the policies and conventions that govern the review and selection of proposals for funding, and details the review procedures of the…

  14. Biology Diagrams: Tools To Think With.

    ERIC Educational Resources Information Center

    Kindfield, Ann C. H.

    Subcellular processes like meiosis are frequently problematic for learners because they are complex and, except for the extent that they can be observed under a light microscope, occur outside of our direct experience. More detailed characterization of what underlies various degrees of student understanding of a process is required to more fully…

  15. X-Ray Photoelectron Spectroscopy and Ultrahigh Vacuum Contactless Capacitance-Voltage Characterization of Novel Oxide-Free InP Passivation Process Using a Silicon Surface Quantum Well

    NASA Astrophysics Data System (ADS)

    Takahashi, Hiroshi; Hashizume, Tamotsu; Hasegawa, Hideki

    1999-02-01

    In order to understand and optimize a novel oxide-free InP passivation process using a silicon surface quantum well, a detailed in situ X-ray photoelectron spectroscopy (XPS) and ultrahigh vacuum (UHV) contactless capacitance-voltage (C-V) study of the interface was carried out. Calculation of quantum levels in the silicon quantum well was performed on the basis of the band lineup of the strained Si3N4/Si/InP interface and the result indicated that the interface should become free of gap states when the silicon layer thickness is below 5 Å. Experimentally, such a delicate Si3N4/Si/InP structure was realized by partial nitridation of a molecular beam epitaxially (MBE) grown pseudomorphic silicon layer using an electron cyclotron resonance (ECR) N2 plasma. The progress of nitridation was investigated in detail by angle-resolved XPS. A newly developed UHV contactless C-V method realized in situ characterization of surface electronic properties of InP at each processing step for passivation. It was found that the interface state density decreased substantially into the 1010 cm-2 eV-1 range by optimizing the nitridation process of the silicon layer. It was concluded that both the surface bond termination and state removal by quantum confinement are responsible for the NSS reduction.
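    The confinement argument above can be illustrated with the textbook particle-in-a-box estimate, which is far cruder than the paper's band-lineup calculation for the strained Si3N4/Si/InP stack but shows how quantum levels rise sharply as the silicon layer thins; all parameter values here are illustrative:

```python
import math

H_PLANCK = 6.62607015e-34      # Planck constant, J*s
M_ELECTRON = 9.1093837015e-31  # free-electron mass, kg
EV = 1.602176634e-19           # J per eV

def infinite_well_level(n, width_m, m_eff_ratio=1.0):
    """Energy (eV) of level n in an infinite square well of given width.

    m_eff_ratio is the effective mass in units of the free-electron mass;
    a textbook estimate, not the paper's interface calculation.
    """
    e_joules = (n ** 2) * H_PLANCK ** 2 / (
        8 * m_eff_ratio * M_ELECTRON * width_m ** 2)
    return e_joules / EV

# Confinement energy scales as 1/width^2: halving the well width from
# 10 A to 5 A quadruples the ground-level energy (~0.38 eV -> ~1.5 eV).
e_5a = infinite_well_level(1, 5e-10)
e_10a = infinite_well_level(1, 10e-10)
```

    The 1/width² scaling is the qualitative reason a sub-5 Å silicon layer can push its quantum levels out of the InP gap.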

  16. A method for work modeling at complex systems: towards applying information systems in family health care units.

    PubMed

    Jatobá, Alessandro; de Carvalho, Paulo Victor R; da Cunha, Amauri Marques

    2012-01-01

    Work in organizations requires a minimum level of consensus on the understanding of the practices performed. Adopting technological devices to support activities in environments where work is complex, characterized by interdependence among a large number of variables, makes understanding how work is done not only more important but also more difficult. This study therefore presents a method for modeling work in complex systems, one that improves knowledge of how activities are performed in settings where they do not simply follow procedures. Uniting techniques of Cognitive Task Analysis with the concept of the Work Process, this work seeks to provide a method that yields a detailed and accurate view of how people perform their tasks, so that information systems can be applied to support work in organizations.

  17. Synthesis, delivery and regulation of eukaryotic heme and Fe-S cluster cofactors.

    PubMed

    Barupala, Dulmini P; Dzul, Stephen P; Riggs-Gelasco, Pamela Jo; Stemmler, Timothy L

    2016-02-15

    In humans, the bulk of iron in the body (over 75%) is directed towards heme or Fe-S cluster cofactor synthesis, and the complex, highly regulated pathways in place to accomplish biosynthesis have evolved to safely assemble and load these cofactors into apoprotein partners. In eukaryotes, heme biosynthesis is both initiated and finalized within the mitochondria, while cellular Fe-S cluster assembly is controlled by correlated pathways both within the mitochondria and within the cytosol. Iron plays a vital role in a wide array of metabolic processes, and defects in iron cofactor assembly lead to human diseases. This review describes progress towards our molecular-level understanding of cellular heme and Fe-S cluster biosynthesis, focusing on the regulation and mechanistic details that are essential for understanding human disorders related to the breakdown of these essential pathways.

  18. Automated terrestrial laser scanning with near-real-time change detection - monitoring of the Séchilienne landslide

    NASA Astrophysics Data System (ADS)

    Kromer, Ryan A.; Abellán, Antonio; Hutchinson, D. Jean; Lato, Matt; Chanut, Marie-Aurelie; Dubois, Laurent; Jaboyedoff, Michel

    2017-05-01

    We present an automated terrestrial laser scanning (ATLS) system with automatic near-real-time change detection processing. The ATLS system was tested on the Séchilienne landslide in France for a 6-week period with data collected at 30 min intervals. The purpose of developing the system was to fill the gap of high-temporal-resolution TLS monitoring studies of earth surface processes and to offer a cost-effective, light, portable alternative to ground-based interferometric synthetic aperture radar (GB-InSAR) deformation monitoring. During the study, we detected the flux of talus, displacement of the landslide and pre-failure deformation of discrete rockfall events. Additionally, we found the ATLS system to be an effective tool in monitoring landslide and rockfall processes despite missing points due to poor atmospheric conditions or rainfall. Furthermore, such a system has the potential to help us better understand a wide variety of slope processes at high levels of temporal detail.
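    At its core, change detection between successive scans reduces to comparing point clouds; a brute-force nearest-neighbour sketch of that idea (the actual pipeline also averages points in time and space to suppress noise, and production systems use spatial indexing rather than an all-pairs distance matrix):

```python
import numpy as np

def nearest_point_change(reference, compare):
    """Per-point change between two scans as nearest-neighbour distances.

    reference, compare: (n, 3) arrays of scan points. A brute-force
    illustrative stand-in for the system's change-detection processing.
    """
    diffs = compare[:, None, :] - reference[None, :, :]
    dists = np.sqrt((diffs ** 2).sum(axis=2))
    return dists.min(axis=1)

# A slope face that moved 0.05 m along x between two scans.
ref = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
new = ref + np.array([0.05, 0.0, 0.0])
change = nearest_point_change(ref, new)  # -> [0.05, 0.05]
```

    Thresholding such per-point distances over successive 30 min epochs is what turns a scan series into displacement and pre-failure deformation signals.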

  19. The Effects of Buoyancy and Dilution on the Structure and Lift-Off of Coflow Laminar Diffusion Flames

    NASA Technical Reports Server (NTRS)

    Walsh, Kevin T.; Long, Marshall B.; Smooke, Mitchell D.

    1999-01-01

    The ability to predict the coupled effects of complex transport phenomena with detailed chemical kinetics in diffusion flames is critical in the modeling of turbulent reacting flows and in understanding the processes by which soot formation and radiative transfer take place. In addition, an understanding of those factors that affect flame extinction in diffusion flames is critical in the suppression of fires and in improving engine efficiency. A goal of this work is to bring to microgravity flame studies the detailed experimental and numerical tools that have been used to study ground-based systems. This will lead to a more detailed understanding of the interaction of convection, diffusion and chemistry in a nonbuoyant environment. To better understand these phenomena, experimental and computational studies of a coflow laminar diffusion flame have been carried out. To date, these studies have focused on a single set of flow conditions, in which a nitrogen-diluted methane fuel stream (65% methane by volume) was surrounded by an air coflow, with exit velocities matched at 35 cm/s. Of particular interest is the change in flame shape due to the absence of buoyant forces, as well as the amount of diluent in the fuel stream and the coflow velocity. As a sensitive marker of changes in the flame shape, the number densities of excited-state CH (A²Δ, denoted CH*) and excited-state OH (A²Σ, denoted OH*) are measured. CH* and OH* number densities are deconvoluted from line-of-sight chemiluminescence measurements made on the NASA KC-135 reduced-gravity aircraft. Measured signal levels are calibrated, post-flight, with Rayleigh scattering. In extending the study to microgravity conditions, improvements to the computational model have been made and new calculations performed for a range of gravity conditions. 
In addition, modifications to the experimental approach were required as a consequence of the constraints imposed by existing microgravity facilities. Results from the computations and experiments are presented.

  20. Understanding Local Ecology: Syllabus for Monitoring Water Quality.

    ERIC Educational Resources Information Center

    Iowa Univ., Iowa City.

    This syllabus gives detailed information on monitoring water quality for teachers and students. It tells how to select a sample site; how to measure physical characteristics such as temperature, turbidity, and stream velocity; how to measure chemical parameters such as alkalinity, dissolved oxygen levels, phosphate levels, and ammonia nitrogen…

  1. Frances: A Tool for Understanding Computer Architecture and Assembly Language

    ERIC Educational Resources Information Center

    Sondag, Tyler; Pokorny, Kian L.; Rajan, Hridesh

    2012-01-01

    Students in all areas of computing require knowledge of the computing device including software implementation at the machine level. Several courses in computer science curricula address these low-level details such as computer architecture and assembly languages. For such courses, there are advantages to studying real architectures instead of…

  2. Very Tiny Rocks: Site-Specific, Size-Dependent Reaction Kinetics at Nanoparticle-Water Interfaces

    NASA Astrophysics Data System (ADS)

    Rustad, J. R.

    2008-12-01

    One of the most fundamental challenges in geochemistry is to understand the rates and mechanisms of elementary reactions that describe chemical processes occurring at mineral-water interfaces. One of the reasons for the primitive conceptual state of reaction kinetics in solid earth geochemistry is that it is very difficult to identify defensible elementary reactions where theoretical predictions can be made and the results can be tested experimentally at the same length and time scales as the prediction. For example, the most fundamental predictor of complexation kinetics in aqueous solution is the characteristic water exchange rate; these rates are well known for the aquo ions and vary by 20 orders of magnitude even for simple trivalent ions. In contrast, for interfacial reactions, it was not even known whether water exchange rates were faster or slower than at equivalent metal sites in solution, prohibiting any quantitative understanding of mineral reaction kinetics at the molecular level. Recent advances in the synthesis and characterization of materials at nanometer length scales have bridged the gap in scale, and nanometer-sized minerals have given us our first quantitative understanding of elementary reaction rates for fundamental processes involving water and hydroxide exchange reactions. I describe the results of molecular dynamics calculations and experimental measurements of the rates of water, hydroxide, and proton exchange reactions on nanoparticle surfaces. The calculations already show that transition state theory is completely inadequate to understand the rates of even the simplest elementary reactions. Furthermore, the mechanistic implications of rate parameters such as activation volume and activation enthalpy may be different in moving from aquo ions to interfaces. Is a molecular understanding of geochemical processes really needed? 
One might have asked a biologist at the turn of the century whether studying the structure of proteins would ever be useful for curing disease. True molecular level understanding of interfacial interactions has the potential to revolutionize geology, allowing unprecedented detail and accuracy in such important contexts as climate reconstruction and tectonic history. Geology has an inevitable molecular future.

  3. Optimization of droplets for UV-NIL using coarse-grain simulation of resist flow

    NASA Astrophysics Data System (ADS)

    Sirotkin, Vadim; Svintsov, Alexander; Zaitsev, Sergey

    2009-03-01

    A mathematical model and numerical method are described which make it possible to simulate the ultraviolet ("step and flash") nanoimprint lithography (UV-NIL) process adequately, even on standard personal computers. The model is derived from the 3D Navier-Stokes equations with the understanding that the resist motion is largely directed along the substrate surface and is characterized by ultra-low values of the Reynolds number. For the numerical approximation of the model, a special coarse-grain finite-difference method is applied. A coarse-grain modeling tool for detailed analysis of resist spreading in UV-NIL at the structure-scale level is tested. The obtained results demonstrate the tool's ability to calculate the optimal dispensing for a given stamp design and process parameters; this dispensing provides uniformly filled areas and a homogeneous residual layer thickness in UV-NIL.

  4. Interplay of multiple synaptic plasticity features in filamentary memristive devices for neuromorphic computing

    NASA Astrophysics Data System (ADS)

    La Barbera, Selina; Vincent, Adrien F.; Vuillaume, Dominique; Querlioz, Damien; Alibart, Fabien

    2016-12-01

    Bio-inspired computing represents today a major challenge at different levels, ranging from materials science, for the design of innovative devices and circuits, to computer science, for the understanding of the key features required for processing natural data. In this paper, we propose a detailed analysis of resistive switching dynamics in electrochemical metallization cells for synaptic plasticity implementation. We show how filament stability, associated with Joule heating during switching, can be used to emulate key synaptic features such as the short-term to long-term plasticity transition and spike-timing-dependent plasticity. Furthermore, an interplay between these different synaptic features is demonstrated for object motion detection in a spike-based neuromorphic circuit. System-level simulations show robust learning and promising synaptic operation, paving the way to complex bio-inspired computing systems composed of innovative memory devices.
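    Spike-timing-dependent plasticity, one of the synaptic features emulated above, is conventionally described by an exponential timing window; a minimal sketch with illustrative parameters, not values from the devices in the paper:

```python
import math

def stdp_weight_change(dt_ms, a_plus=0.01, a_minus=0.012,
                       tau_plus=20.0, tau_minus=20.0):
    """Exponential STDP window: potentiate when the presynaptic spike
    precedes the postsynaptic one (dt > 0), depress otherwise.

    All parameters are illustrative textbook values.
    """
    if dt_ms > 0:
        return a_plus * math.exp(-dt_ms / tau_plus)
    return -a_minus * math.exp(dt_ms / tau_minus)

ltp = stdp_weight_change(10.0)   # pre before post: potentiation (> 0)
ltd = stdp_weight_change(-10.0)  # post before pre: depression (< 0)
```

    In filamentary devices, the sign and magnitude of such weight updates are mapped onto conductance changes driven by programming pulses.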

  5. Upper atmospheric gravity wave details revealed in nightglow satellite imagery

    PubMed Central

    Miller, Steven D.; Straka, William C.; Yue, Jia; Smith, Steven M.; Alexander, M. Joan; Hoffmann, Lars; Setvák, Martin; Partain, Philip T.

    2015-01-01

    Gravity waves (disturbances to the density structure of the atmosphere whose restoring forces are gravity and buoyancy) comprise the principal form of energy exchange between the lower and upper atmosphere. Wave breaking drives the mean upper atmospheric circulation, determining boundary conditions to stratospheric processes, which in turn influence tropospheric weather and climate patterns on various spatial and temporal scales. Despite their recognized importance, very little is known about upper-level gravity wave characteristics. The knowledge gap is mainly due to lack of global, high-resolution observations from currently available satellite observing systems. Consequently, representations of wave-related processes in global models are crude, highly parameterized, and poorly constrained, limiting the description of various processes influenced by them. Here we highlight, through a series of examples, the unanticipated ability of the Day/Night Band (DNB) on the NOAA/NASA Suomi National Polar-orbiting Partnership environmental satellite to resolve gravity structures near the mesopause via nightglow emissions at unprecedented subkilometric detail. On moonless nights, the Day/Night Band observations provide all-weather viewing of waves as they modulate the nightglow layer located near the mesopause (∼90 km above mean sea level). These waves are launched by a variety of physical mechanisms, ranging from orography to convection, intensifying fronts, and even seismic and volcanic events. Cross-referencing the Day/Night Band imagery with conventional thermal infrared imagery also available helps to discern nightglow structures and in some cases to attribute their sources. The capability stands to advance our basic understanding of a critical yet poorly constrained driver of the atmospheric circulation. PMID:26630004

  7. 2016 winter campaigns on ClNO2 and N2O5 at a mountain top and a semi-rural site in southern China: overview and initial results

    NASA Astrophysics Data System (ADS)

    Wang, T.; Wang, W.; Yun, H.; Tham, Y. J.; Xia, M.; Yu, C.; Wang, Z.; Zhang, N.; Cui, L.; Poon, S.; Lee, S.; Ou, Y.; Yue, D.; Zhai, Y.

    2017-12-01

    In the past decade, heterogeneous uptake of dinitrogen pentoxide (N2O5) on aerosol and the subsequent production of nitryl chloride (ClNO2) have been found to affect the oxidative capacity, the fate of NOx, and the formation of aerosol nitrate and photochemical ozone. However, the detailed processes and effects are not completely understood for the diverse environments of the globe. Our previous measurements at a mountain top (957 m a.s.l.) in Hong Kong in winter 2013 revealed elevated concentrations of N2O5 (up to 7.7 ppb) and ClNO2 (up to 4.7 ppb) and showed that the polluted air masses originated from inland urban areas of the Pearl River Delta (PRD). To understand the detailed uptake processes, an intensive measurement campaign was conducted at the same site (Tai Mo Shan, TMS) during October-December 2016 and at a semi-rural site (Heshan) in the center of the PRD in January 2017. Key parameters related to N2O5 and ClNO2 processes, including aerosol ionic composition, aerosol number and size, and volatile organic compounds, as well as ozone, NOx, and NOy, were measured during the two campaigns. Elevated ClNO2 concentrations (up to 3.4 ppb) were observed at the mountain site on many nights a few hours after sunset, and extremely high levels of ClNO2 (up to 8.3 ppb) were measured at the inland site during a heavy pollution event. The meteorological conditions and variations of ClNO2 will be examined together with the concurrently measured parameters to elucidate the factors determining N2O5 uptake and ClNO2 production. The 2016 results at TMS will be compared with those from 2013.

  8. Validation study and routine control monitoring of moist heat sterilization procedures.

    PubMed

    Shintani, Hideharu

    2012-06-01

    The proposed approach to validation of steam sterilization in autoclaves follows the basic life-cycle concepts applicable to all validation programs: understand the function of the sterilization process, develop and understand the cycles used to carry out the process, and define a suitable test or series of tests to confirm that the function of the process is suitably ensured by the structure provided. Sterilization of product, and of components and parts that come in direct contact with sterilized product, is the most critical of pharmaceutical processes. Consequently, this process requires a most rigorous and detailed approach to validation. An understanding of the process requires a basic understanding of microbial death, the parameters that govern that death, the accepted definition of sterility, and the relationship between the definition and the sterilization parameters. Autoclaves and support systems need to be designed, installed, and qualified in a manner that ensures their continued reliability. Lastly, the test program must be complete and definitive. In this paper, in addition to the validation study, the documentation of installation qualification (IQ), operational qualification (OQ), and performance qualification (PQ) is described concretely.
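
    The relationship between microbial death and the sterilization parameters emphasized above is commonly quantified through accumulated lethality, F0 (equivalent minutes at 121.1 °C with z = 10 °C). The temperature profile below is a made-up example, not data from the paper.

```python
def f0_lethality(temps_c, dt_min, t_ref=121.1, z=10.0):
    """Accumulated lethality F0, in equivalent minutes at t_ref,
    for temperatures sampled every dt_min minutes:
    F0 = sum(dt * 10**((T - t_ref) / z))."""
    return sum(dt_min * 10 ** ((t - t_ref) / z) for t in temps_c)

# Hypothetical plateau: 15 one-minute samples at 121.1 C
print(f0_lethality([121.1] * 15, dt_min=1.0))  # 15.0 equivalent minutes

# Ten minutes at 111.1 C contribute only about 1 equivalent minute,
# illustrating the strong temperature dependence of lethality.
print(f0_lethality([111.1] * 10, dt_min=1.0))
```

    In a routine monitoring context, the measured chamber or load temperature profile would be fed into such an accumulation to confirm that the cycle delivered the required lethality.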

  9. An accurate method to measure alpha-emitting natural radionuclides in atmospheric filters: Application in two NORM industries

    NASA Astrophysics Data System (ADS)

    Lozano, R. L.; Bolívar, J. P.; San Miguel, E. G.; García-Tenorio, R.; Gázquez, M. J.

    2011-12-01

    In this work, an accurate method for the measurement of natural alpha-emitting radionuclides from aerosols collected on air filters is presented and discussed in detail. Knowledge of the levels of several natural alpha-emitting radionuclides (238U, 234U, 232Th, 230Th, 228Th, 226Ra and 210Po) in atmospheric aerosols is essential not only for a better understanding of several atmospheric processes and changes, but also for a proper evaluation of the potential doses that can inadvertently be received by the population via inhalation. The proposed method takes into account the presence of intrinsic amounts of these radionuclides in the matrices of the quartz filters used, as well as the possible variation in the humidity of the filters throughout the collection process. In both cases, the corrections necessary to redress these levels have been evaluated and parameterized. Furthermore, a detailed study has been performed of the optimisation of the volume of air to be sampled in order to increase the accuracy of the radionuclide determinations. The method as a whole has been applied to the determination of the activity concentrations of U- and Th-isotopes in aerosols collected at two NORM (Naturally Occurring Radioactive Material) industries located in the southwest of Spain. Based on the levels found, a conservative estimate has been made of the additional committed effective doses to which workers are potentially susceptible due to inhalation of the anthropogenic material present in the environment of these two NORM industries.
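
    A conservative inhalation-dose estimate of the kind described above multiplies an activity concentration in air by a breathing rate, an exposure time, and a radionuclide-specific dose coefficient. Every numeric value below is an illustrative assumption, not a result from the study.

```python
def committed_effective_dose(conc_bq_m3, breathing_m3_h, hours, dose_coeff_sv_bq):
    """E (Sv) = C (Bq/m^3) * B (m^3/h) * t (h) * DC (Sv/Bq)."""
    return conc_bq_m3 * breathing_m3_h * hours * dose_coeff_sv_bq

# Hypothetical worker scenario (all values assumed):
dose_sv = committed_effective_dose(
    conc_bq_m3=1e-3,          # airborne alpha-emitter concentration
    breathing_m3_h=1.2,       # light-work breathing rate
    hours=2000,               # annual working hours
    dose_coeff_sv_bq=5.7e-6,  # inhalation dose coefficient (assumed)
)
print(f"{dose_sv * 1e6:.1f} uSv per year")
```

    In practice, each radionuclide's measured activity concentration would be paired with its own dose coefficient and the contributions summed.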

  10. Regulatory-associated protein of TOR (RAPTOR) alters the hormonal and metabolic composition of Arabidopsis seeds, controlling seed morphology, viability and germination potential.

    PubMed

    Salem, Mohamed A; Li, Yan; Wiszniewski, Andrew; Giavalisco, Patrick

    2017-11-01

    Target of Rapamycin (TOR) is a positive regulator of growth and development in all eukaryotes, which positively regulates anabolic processes like protein synthesis, while repressing catabolic processes, including autophagy. To better understand TOR function we decided to analyze its role in seed development and germination. We therefore performed a detailed phenotypic analysis using mutants of the REGULATORY-ASSOCIATED PROTEIN OF TOR 1B (RAPTOR1B), a conserved TOR interactor, acting as a scaffold protein, which recruits substrates for the TOR kinase. Our results show that raptor1b plants produced seeds that were delayed in germination and less resistant to stresses, leading to decreased viability. These physiological phenotypes were accompanied by morphological changes including decreased seed-coat pigmentation and reduced production of seed-coat mucilage. A detailed molecular analysis revealed that many of these morphological changes were associated with significant changes of the metabolic content of raptor1b seeds, including elevated levels of free amino acids, as well as reduced levels of protective secondary metabolites and storage proteins. Most of these observed changes were accompanied by significantly altered phytohormone levels in the raptor1b seeds, with increases in abscisic acid, auxin and jasmonic acid, which are known to inhibit germination. Delayed germination and seedling growth, observed in the raptor1b seeds, could be partially restored by the exogenous supply of gibberellic acid, indicating that TOR is at the center of a regulatory hub controlling seed metabolism, maturation and germination. © 2017 The Authors The Plant Journal © 2017 John Wiley & Sons Ltd.

  11. Hurricane Georges' Landfall in the Dominican Republic: Detailed Airborne Doppler Radar Imagery

    NASA Technical Reports Server (NTRS)

    Geerts, B.; Heymsfield, G. M.; Tian, L.; Halverson, J. B.; Guillory, A.; Mejia, M. I.

    1999-01-01

    Current understanding of landfalling tropical cyclones is limited, especially with regard to convective-scale processes. On 22 September 1998, Hurricane Georges made landfall on the island of Hispaniola, leaving behind a trail of death and devastation, largely the result of excessive rainfall rather than sea-level surge or wind. Detailed airborne measurements were taken as part of the Third Convection and Moisture Experiment (CAMEX-3). Of particular interest are the ER-2 nadir X-band Doppler radar (EDOP) data, which provide a first high-resolution view of the precipitation and airflow changes as a hurricane interacts with mountainous terrain. The circulation of Hurricane Georges underwent an obvious transition during landfall, evident in the rapid increase in minimum sea-level pressure, the subsidence of the eyewall anvil, and a decrease in average ice concentrations in the eyewall. The eye, as seen in satellite imagery, disappeared, but contrary to current understanding, this was not due to eyewall contraction but rather to convective eruption within the eye. The main convective event within the eye, with upper-level updraft magnitudes near 20 m/s and 89 GHz brightness temperatures below 100 K, occurred when the eye moved over the Cordillera Central, the island's main mountain chain. The location, intensity, and evolution of this convection indicate that it was coupled to the surface orography. It is likely that surface rain rates increased during landfall because of effective droplet collection, both in the convection and in the more widespread stratiform rainfall areas over the island. Evidence for this is the 1-2 dB/km increase in radar reflectivity below the bright band, down to ground level, which was absent offshore. Such low-level rain enhancement, which cannot be detected in satellite images of upwelling infrared or microwave radiation, must be due to the ascent of boundary-layer air over the topography.

  12. Transitional millisecond pulsars in the low-level accretion state

    NASA Astrophysics Data System (ADS)

    Jaodand, Amruta D.; Hessels, Jason W. T.; Archibald, Anne; Bogdanov, Slavko; Deller, Adam; Hernandez Santisteban, Juan; Patruno, Alessandro; D'Angelo, Caroline; Bassa, Cees

    2018-01-01

    In the canonical pulsar recycling scenario, a slowly spinning neutron star can be rejuvenated to rapid spin rates by the transfer of angular momentum and mass from a binary companion star. Over the last decade, the discovery of three transitional millisecond pulsars (tMSPs) has allowed us to study recycling in detail. These systems transition between accretion-powered (X-ray) and rotation-powered (radio) pulsar states within just a few days, raising questions such as: what triggers the state transition, when does the recycling process truly end, and what will the radio pulsar’s final spin rate be? Systematic multi-wavelength campaigns over the last decade have provided critical insights: multi-year-long, low-level accretion states showing coherent X-ray pulsations; extremely stable, bi-modal X-ray light curves; outflows probed by radio continuum emission; a surprising gamma-ray brightening during accretion, etc. In my thesis I am trying to bring these clues together to understand the low-level accretion process that recycles a pulsar. For example, recently we timed PSR J1023+0038 in the accretion state and found it to be spinning down ~26% faster compared to the non-accreting radio pulsar state. We are currently conducting simultaneous multi-wavelength campaigns (XMM, HST, Kepler and VLA) to understand the global variability of the accretion flow, as well as high-energy Fermi-LAT observations to probe the gamma-ray emission mechanism. I will highlight these recent developments, while also presenting a broad overview of tMSPs as exciting new laboratories to test low-level accretion onto magnetized neutron stars.

  13. Mechanism of Inhibition of Cholesteryl Ester Transfer Protein by Small Molecule Inhibitors.

    PubMed

    Chirasani, Venkat R; Sankar, Revathi; Senapati, Sanjib

    2016-08-25

    Cholesteryl ester transfer protein (CETP) facilitates the bidirectional exchange of cholesteryl esters and triglycerides between high-density lipoproteins and low- or very low-density lipoproteins. Recent studies have shown that impairment of the lipid exchange processes of CETP can be an effective strategy for the treatment of cardiovascular diseases (CVDs). Understanding the molecular mechanism of CETP inhibition has therefore attracted tremendous attention in the recent past. In this study, we explored the detailed mechanism of CETP inhibition by a series of recently reported small-molecule inhibitors that are currently under preclinical testing. Our results from molecular dynamics simulations and protein-ligand docking studies suggest that hydrophobic interactions between the CETP core tunnel residues and inhibitor moieties play a pivotal role, and that physical occlusion of the CETP tunnel by these small molecules is the primary mechanism of CETP inhibition. Interestingly, bound inhibitors were found to increase the plasticity of CETP, which was explained by principal component analysis showing a larger sampling space of the CETP C-domain due to inhibitor binding. The atomic-level details presented here could help accelerate structure-based drug-discovery efforts targeting CETP for CVD therapeutics.

  14. Regulation of Plant Cellular and Organismal Development by SUMO.

    PubMed

    Elrouby, Nabil

    2017-01-01

    This chapter clearly demonstrates the breadth and spectrum of the processes that SUMO regulates during plant development. The gross phenotypes observed in mutants of the SUMO conjugation and deconjugation enzymes reflect these essential roles, and detailed analyses of these mutants under different growth conditions have revealed roles in biotic and abiotic stress responses, phosphate starvation, nitrate and sulphur metabolism, freezing and drought tolerance, and the response to excess copper. SUMO functions also intersect with those regulated by several hormones, such as salicylic acid, abscisic acid, gibberellins and auxin, and detailed studies provide mechanistic clues as to how sumoylation may regulate these processes. The regulation of COP1 and PhyB functions by sumoylation provides very strong evidence that SUMO is heavily involved in the regulation of light signaling in plants. At the cellular and subcellular levels, SUMO regulates meristem architecture, the switch from the mitotic cycle into the endocycle, meiosis, centromere decondensation and exit from mitosis, transcriptional control, and release from transcriptional silencing. Most of these advances in our understanding of SUMO functions during plant development emerged over the past 6-7 years, and they likely herald a prominent rise of SUMO as a major regulator of eukaryotic cellular and organismal growth and development.

  15. The Need for Speed: Neuroendocrine Regulation of Socially-controlled Sex Change.

    PubMed

    Lamm, Melissa S; Liu, Hui; Gemmell, Neil J; Godwin, John R

    2015-08-01

    Socially-controlled functional sex change in fishes is a dramatic example of adaptive reproductive plasticity. Functional gonadal sex change can occur within a week while behavioral sex change can begin within minutes. Significant progress has been made in understanding the neuroendocrine bases of this phenomenon at both the gonadal and the neurobiological levels, but a detailed mechanistic understanding remains elusive. We are working with sex-changing wrasses to identify evolutionarily-conserved neuroendocrine pathways underlying this reproductive adaptation. One key model is the bluehead wrasse (Thalassoma bifasciatum), in which sex change is well studied at the behavioral, ecological, and neuroendocrine levels. Bluehead wrasses show rapid increases in aggressive and courtship behaviors with sex change that do not depend on the presence of gonads. The display of male-typical behavior is correlated with the expression of arginine vasotocin, and experiments support a role for this neuropeptide. Estrogen synthesis is also critical in the process. Female bluehead wrasses have higher abundance of aromatase mRNA in the brain and gonads, and estrogen implants block behavioral sex change. While established methods have advanced our understanding of sex change, a full understanding will require new approaches and perspectives. First, contributions of other neuroendocrine systems should be better characterized, particularly glucocorticoid and thyroid signaling. Second, advances in genomics for non-traditional model species should allow conserved mechanisms to be identified with a key next-step being manipulative tests of these mechanisms. Finally, advances in genomics now also allow study of the role of epigenetic modifications and other regulatory mechanisms in the dramatic alterations across the sex-change process. © The Author 2015. Published by Oxford University Press on behalf of the Society for Integrative and Comparative Biology. All rights reserved. 

  16. Structure-function relations in physiology education: Where's the mechanism?

    PubMed

    Lira, Matthew E; Gardner, Stephanie M

    2017-06-01

    Physiology demands systems thinking: reasoning within and between levels of biological organization and across different organ systems. Many physiological mechanisms explain how structures and their properties interact at one level of organization to produce emergent functions at a higher level of organization. Current physiology principles, such as structure-function relations, selectively neglect mechanisms by not mentioning the term explicitly. We explored how students characterized mechanisms and functions to shed light on how students make sense of these terms. Students characterized mechanisms as (1) processes that occur at levels of organization lower than that of functions, and (2) detailed events with many steps involved. We also found that students produced more variability in how they characterized functions compared with mechanisms: students characterized functions in relation to multiple levels of organization and multiple definitions. We interpret these results as evidence that students see mechanisms as holding a narrower definition than that used in the biological sciences, and that students struggle to coordinate and distinguish mechanisms from functions due to cognitive processes germane to learning in many domains. We offer the instructional suggestion that student learning be scaffolded by affording students opportunities to relate, and also to distinguish between, these terms so central to understanding physiology. Copyright © 2017 the American Physiological Society.

  17. Saudi EFL Learners' Reported Reading Problems and Strategic Processing of Text Types: A Think-Aloud Case Study

    ERIC Educational Resources Information Center

    Alkhaleefah, Tarek A.

    2017-01-01

    Research in strategy use needs to provide comprehensive and detailed qualitative discussion of individual cases and their strategic processing of texts to deepen our understanding of the cognitive and metacognitive processes readers resort to when reading different texts for different purposes. Hence, the present paper aims to provide in-depth and…

  18. Critical Success Factors in the Curriculum Alignment Process: The Case of the College of Business at Abu Dhabi University

    ERIC Educational Resources Information Center

    Camba, Pitzel; Krotov, Vlad

    2015-01-01

    The main goals of this article are to (a) assist business schools in understanding the curriculum alignment process, and (b) uncover critical success factors in curriculum alignment. Based on a case study conducted at the College of Business at Abu Dhabi University, a detailed curriculum alignment process description is provided. The process…

  19. Knowledge elicitation for an operator assistant system in process control tasks

    NASA Technical Reports Server (NTRS)

    Boy, Guy A.

    1988-01-01

    A knowledge-based system (KBS) methodology designed to study human-machine interactions and levels of autonomy in the allocation of process control tasks is presented. Users are provided with operation manuals to assist them in normal and abnormal situations. Unfortunately, operation manuals usually represent only the functioning logic of the system to be controlled; the user logic is often totally different. The focus is on a method that elicits user logic to refine a KBS shell called an Operator Assistant (OA). If the OA is to help the user, it is necessary to know what level of autonomy gives the optimal performance of the overall man-machine system. For example, for diagnoses that must be carried out carefully by both the user and the OA, interactions are frequent and processing is mostly sequential. Other diagnoses can be automated, in which case the OA must be able to explain its reasoning at an appropriate level of detail. The OA structure was used to design a working KBS called HORSES (Human Orbital Refueling System Expert System). Protocol analysis of pilots interacting with this system reveals that a priori analytical knowledge becomes more structured with training and that the situation patterns become more complex and dynamic. This approach can improve the a priori understanding of human and automatic reasoning.

  20. Cathode surface effects and H.F.-behaviour of vacuum arcs

    NASA Astrophysics Data System (ADS)

    Fu, Yan Hong

    To gain a better understanding of the essential processes occurring during vacuum arc interruption, for the further development of the vacuum arc circuit breaker, cathode spot behavior, current interruption, dielectric recovery, and overvoltage generation are investigated. An experimental study of cathode spot behavior of the DC vacuum arc in relation to cathode surface roughness, together with a qualitative physical model to interpret the results, is reported. An experimental investigation of high-frequency (HF) current interruption, multiple reignitions, and voltage escalation phenomena is reported. A calculation program is developed to predict the level of overvoltages generated by the operation of a vacuum breaker in a realistic single-phase circuit. Detailed results are summarized.

  1. Modeling Physiological Systems in the Human Body as Networks of Quasi-1D Fluid Flows

    NASA Astrophysics Data System (ADS)

    Staples, Anne

    2008-11-01

    Extensive research has been done on modeling human physiology. Most of this work has been aimed at developing detailed, three-dimensional models of specific components of physiological systems, such as a cell, a vein, a molecule, or a heart valve. While efforts such as these are invaluable to our understanding of human biology, if we were to construct a global model of human physiology with this level of detail, computing even a nanosecond in this computational being's life would certainly be prohibitively expensive. With this in mind, we derive the Pulsed Flow Equations, a set of coupled one-dimensional partial differential equations, specifically designed to capture two-dimensional viscous, transport, and other effects, and aimed at providing accurate and fast-to-compute global models for physiological systems represented as networks of quasi one-dimensional fluid flows. Our goal is to be able to perform faster-than-real time simulations of global processes in the human body on desktop computers.
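
    The quasi-1D network idea can be illustrated in its simplest steady limit, where each vessel segment behaves like a hydraulic resistor obeying Poiseuille's law. This is a minimal sketch with made-up geometry and an assumed blood-like viscosity; it is not the authors' Pulsed Flow Equations, which additionally capture pulsatility, viscous effects, and transport.

```python
import math

def poiseuille_resistance(length_m, radius_m, mu=3.5e-3):
    """Hydraulic resistance R = 8*mu*L / (pi * r**4) of a straight
    segment in laminar flow; mu is an assumed viscosity in Pa*s."""
    return 8.0 * mu * length_m / (math.pi * radius_m ** 4)

# Three segments in series (illustrative lengths and radii, meters):
segments = [(0.05, 2e-3), (0.03, 1.5e-3), (0.04, 1e-3)]
r_total = sum(poiseuille_resistance(L, r) for L, r in segments)

dp = 2000.0       # assumed pressure drop across the chain, Pa
q = dp / r_total  # steady volumetric flow rate, m^3/s
print(f"Q = {q * 1e6:.2f} mL/s")
```

    In series the resistances simply add, just as in an electrical network; a branching network would instead enforce flow conservation at each junction node.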

  2. Auditory and Linguistic Processes in the Perception of Intonation Contours.

    ERIC Educational Resources Information Center

    Studdert-Kennedy, Michael; Hadding, Kerstin

    By examining the relations among sections of the fundamental frequency contour used in judging an utterance as a question or statement, the experiment described in this report seeks a more detailed understanding of auditory-linguistic interaction in the perception of intonation contours. The perceptual process may be divided into stages (auditory,…

  3. The Role of Visualization in Computer Science Education

    ERIC Educational Resources Information Center

    Fouh, Eric; Akbar, Monika; Shaffer, Clifford A.

    2012-01-01

    Computer science core instruction attempts to provide a detailed understanding of dynamic processes such as the working of an algorithm or the flow of information between computing entities. Such dynamic processes are not well explained by static media such as text and images, and are difficult to convey in lecture. The authors survey the history…

  4. Investigating Test-Taking Behaviors Using Timing and Process Data

    ERIC Educational Resources Information Center

    Lee, Yi-Hsuan; Haberman, Shelby J.

    2016-01-01

    The use of computer-based assessments makes the collection of detailed data that capture examinees' progress in the tests and time spent on individual actions possible. This article presents a study using process and timing data to aid understanding of an international language assessment and the examinees. Issues regarding test-taking strategies,…

  5. Structure and Management of an Engineering Senior Design Course.

    PubMed

    Tanaka, Martin L; Fischer, Kenneth J

    2016-07-01

    The design of products and processes is an important area in engineering. Students in engineering schools learn fundamental principles in their courses but often lack an opportunity to apply these methods to real-world problems until their senior year. This article describes important elements that should be incorporated into a senior capstone design course. It includes a description of the general principles used in engineering design and a discussion of why students often have difficulty with application and revert to trial and error methods. The structure of a properly designed capstone course is dissected and its individual components are evaluated. Major components include assessing resources, identifying projects, establishing teams, understanding requirements, developing conceptual designs, creating detailed designs, building prototypes, testing performance, and final presentations. In addition to the course design, team management and effective mentoring are critical to success. This article includes suggested guidelines and tips for effective design team leadership, attention to detail, investment of time, and managing project scope. Furthermore, the importance of understanding business culture, displaying professionalism, and considerations of different types of senior projects is discussed. Through a well-designed course and proper mentoring, students will learn to apply their engineering skills and gain basic business knowledge that will prepare them for entry-level positions in industry.

  6. Some opinions on the review process of research papers destined for publication.

    PubMed

    Roohi, Ehsan; Mahian, Omid

    2015-06-01

    The current paper discusses the peer review process in journals that publish research papers conveying new science and understanding (scientific journals). Different aspects of peer review, including the selection of reviewers, the review process, and the decision policy of the editor, are discussed in detail. Here, the pros and cons of different conventional review methods are mentioned. Finally, a suggestion is presented for the review process of scientific papers.

  7. Detail in architecture: Between arts & crafts

    NASA Astrophysics Data System (ADS)

    Dulencin, Juraj

    2016-06-01

    Architectural detail represents an important part of architecture. Not only can it serve as an identifier of a specific building, but it also enhances the experience of the realized project. Within it lie the signs of a great architect and clues to understanding his or her way of thinking. It is therefore the central topic of a seminar offered to architecture students at the Brno University of Technology. During the course of the semester-long class, the students acquaint themselves with atypical architectural details of domestic and international architects by learning to read them, understand them, and subsequently draw them by creating architectural blueprints. In other words, by general analysis of a detail the students learn the theoretical thinking of its architect, who, depending on the nature of the design, had to incorporate a variety of techniques and crafts. Students apply this analytical part to their own architectural detail design. The methodology of the seminar consists of experiential learning through project management and is complemented by a series of lectures discussing a diversity of details as well as the materials and technologies required to implement them. The architectural detail design is also part of the students' bachelor's theses; therefore, the realistic nature of their blueprints can be verified in the production process of their physical counterparts. Based on their own documentation, the students choose the most suitable manufacturing process, whether it is supplied by a specific technology or a craftsman. Students actively participate in the production and correct their design proposals at real scale with the actual material. A student, as a future architect, stands somewhere between a client and an artisan, materializing his or her idea and adjusting the manufacturing process so that the final detail achieves aesthetic consistency and is in harmony with its initial concept. 
One of the very important aspects of the design is its economic cost, the actual price of real implementation. The detail determines not only the physical expression; it becomes the characteristic feature from which the rest of the building is derived. This course motivates students to surpass mere technical calculations learned from books towards sophistication and refinement, pragmatism and experimentation, and encourages a shift from feasibility to perfection.

  8. Simulation of groundwater flow in the glacial aquifer system of northeastern Wisconsin with variable model complexity

    USGS Publications Warehouse

    Juckem, Paul F.; Clark, Brian R.; Feinstein, Daniel T.

    2017-05-04

The U.S. Geological Survey, National Water-Quality Assessment seeks to map estimated intrinsic susceptibility of the glacial aquifer system of the conterminous United States. Improved understanding of the hydrogeologic characteristics that explain spatial patterns of intrinsic susceptibility, commonly inferred from estimates of groundwater age distributions, is sought so that methods used for the estimation process are properly equipped. An important step beyond identifying relevant hydrogeologic datasets, such as glacial geology maps, is to evaluate how incorporation of these resources into process-based models using differing levels of detail could affect resulting simulations of groundwater age distributions and, thus, estimates of intrinsic susceptibility. This report describes the construction and calibration of three groundwater-flow models of northeastern Wisconsin that were developed with differing levels of complexity to provide a framework for subsequent evaluations of the effects of process-based model complexity on estimations of groundwater age distributions for withdrawal wells and streams. Preliminary assessments, which focused on the effects of model complexity on simulated water levels and base flows in the glacial aquifer system, illustrate that simulation of vertical gradients using multiple model layers improves simulated heads more in low-permeability units than in high-permeability units. Moreover, simulation of heterogeneous hydraulic conductivity fields in coarse-grained and some fine-grained glacial materials produced a larger improvement in simulated water levels in the glacial aquifer system compared with simulation of uniform hydraulic conductivity within zones. The relation between base flows and model complexity was less clear; however, it generally seemed to follow a pattern similar to that of the water levels.
Although increased model complexity resulted in improved calibrations, future application of the models using simulated particle tracking is anticipated to evaluate whether these model design considerations are similarly important for the primary modeling objective: simulating reasonable groundwater age distributions.
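The link between the hydraulic properties discussed above and groundwater age can be illustrated with a minimal advective travel-time calculation. This is a hedged sketch based on Darcy's law with purely illustrative parameter values, not the report's MODFLOW-based particle-tracking workflow:

```python
# Minimal sketch: advective groundwater travel time ("age") along a flow path.
# Illustrative values only; real age distributions come from particle tracking
# in a calibrated flow model.
def advective_age(length_m, hydraulic_conductivity_m_per_d, gradient, porosity):
    """Travel time (days) = distance / seepage velocity, with v = K*i/n (Darcy)."""
    seepage_velocity = hydraulic_conductivity_m_per_d * gradient / porosity
    return length_m / seepage_velocity

# Coarse-grained glacial outwash vs. fine-grained till (hypothetical values):
age_outwash = advective_age(1000.0, 30.0, 0.001, 0.25)   # ~8333 days
age_till = advective_age(1000.0, 0.01, 0.001, 0.35)      # far older water
```

The contrast between the two materials is the point: orders-of-magnitude differences in hydraulic conductivity translate directly into orders-of-magnitude differences in simulated age, which is why representing glacial heterogeneity matters for susceptibility estimates.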

  9. What are the Outstanding Questions in Biosphere-Atmosphere Exchange?

    NASA Astrophysics Data System (ADS)

    Katul, G.; Finnigan, J.

    2002-12-01

Our answer to this question has three parts. At the highest level, the human population is now so large (and will grow to 10-12 billion by 2050 before stabilizing and reversing) and our capacity to influence the biosphere is so profound that we must now treat human, biophysical and geophysical processes as a single interacting system: the Anthroposphere. The Anthroposphere is a complex adaptive system (CAS) in the sense that the actions of humans, individually or in groups (as large as nations), are a response to an environment that includes not just the geosphere and biosphere but the sum of human interactions as well. These actions, in turn, will change this system that provides critical ecosystem services that we are just now realizing are finite and threatened. At present we lack the tools and understanding even to frame key questions about the Anthroposphere, although the emerging discipline of Complex System Science is hinting at ways this might be done. At the next level we address the problems posed by cross-scale interactions central to the state and function of the geosphere and biosphere, where processes at the cellular level (e.g. photosynthesis and respiration) have consequences on the global level and vice versa through a sequence of non-linear upscale and downscale interactions. Currently there are large gaps in our understanding of this scale cascade that inhibit our ability to parameterize, model and predict phenomena such as the rapid shifts in climate that recent studies of the paleo record have revealed. In short, we need to quantify the risk that our current actions may cause abrupt climate changes that may then feed back upon the Anthroposphere. Unless we can reduce the uncertainty with which we describe quantitatively the non-human parts of the Anthroposphere, the injection of models of human interaction, initially tentative, may reduce our predictive ability to unusable levels.
Finally, we can detail a set of key processes that are themselves critical to geosphere-biosphere functioning but are as yet poorly understood.

  10. A microphysical pathway analysis to investigate aerosol effects on convective clouds

    NASA Astrophysics Data System (ADS)

    Heikenfeld, Max; White, Bethan; Labbouz, Laurent; Stier, Philip

    2017-04-01

    The impact of aerosols on ice- and mixed-phase processes in convective clouds remains highly uncertain, which has strong implications for estimates of the role of aerosol-cloud interactions in the climate system. The wide range of interacting microphysical processes are still poorly understood and generally not resolved in global climate models. To understand and visualise these processes and to conduct a detailed pathway analysis, we have added diagnostic output of all individual process rates for number and mass mixing ratios to two commonly-used cloud microphysics schemes (Thompson and Morrison) in WRF. This allows us to investigate the response of individual processes to changes in aerosol conditions and the propagation of perturbations throughout the development of convective clouds. Aerosol effects on cloud microphysics could strongly depend on the representation of these interactions in the model. We use different model complexities with regard to aerosol-cloud interactions ranging from simulations with different levels of fixed cloud droplet number concentration (CDNC) as a proxy for aerosol, to prognostic CDNC with fixed modal aerosol distributions. Furthermore, we have implemented the HAM aerosol model in WRF-chem to also perform simulations with a fully interactive aerosol scheme. We employ a hierarchy of simulation types to understand the evolution of cloud microphysical perturbations in atmospheric convection. Idealised supercell simulations are chosen to present and test the analysis methods for a strongly confined and well-studied case. We then extend the analysis to large case study simulations of tropical convection over the Amazon rainforest. For both cases we apply our analyses to individually tracked convective cells. Our results show the impact of model uncertainties on the understanding of aerosol-convection interactions and have implications for improving process representation in models.
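The bookkeeping behind such a process-rate pathway analysis can be sketched simply: each hydrometeor budget is closed by its process tendencies, and their signs and magnitudes identify the dominant pathways. The process names and values below are hypothetical illustrations, not actual WRF microphysics diagnostics:

```python
# Sketch of the bookkeeping behind a microphysical pathway analysis:
# each hydrometeor class has a budget closed by its process-rate tendencies.
# Names and magnitudes are hypothetical, not WRF diagnostic output.
rain_mass_tendencies = {          # kg kg^-1 s^-1, per process
    "autoconversion": +2.0e-7,    # cloud water -> rain
    "accretion":      +5.0e-7,    # rain collects cloud droplets
    "evaporation":    -1.5e-7,    # rain -> vapor
    "melting":        +1.0e-7,    # graupel/snow -> rain
}

net = sum(rain_mass_tendencies.values())            # net d(q_rain)/dt
sources = {k: v for k, v in rain_mass_tendencies.items() if v > 0}
dominant = max(sources, key=sources.get)            # leading source pathway
```

Comparing such per-process budgets between clean and polluted simulations is what allows a perturbation to be traced through the chain of warm-rain and mixed-phase processes.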

  11. Folding Back and the Dynamical Growth of Mathematical Understanding: Elaborating the Pirie-Kieren Theory

    ERIC Educational Resources Information Center

    Martin, Lyndon C.

    2008-01-01

    The study reported here extends the work of Pirie and Kieren on the nature and growth of mathematical understanding. The research examines in detail a key aspect of their theory, the process of 'folding back', and develops a theoretical framework of categories and sub-categories that more fully describe the phenomenon. This paper presents an…

  12. Implementing An Image Understanding System Architecture Using Pipe

    NASA Astrophysics Data System (ADS)

    Luck, Randall L.

    1988-03-01

This paper will describe PIPE and how it can be used to implement an image understanding system. Image understanding is the process of developing a description of an image in order to make decisions about its contents. The tasks of image understanding are generally split into low-level vision and high-level vision. Low-level vision is performed by PIPE, a high-performance parallel processor with an architecture specifically designed for processing video images at up to 60 fields per second. High-level vision is performed by one of several types of serial or parallel computers, depending on the application. An additional processor called ISMAP performs the conversion from iconic image space to symbolic feature space. ISMAP plugs into one of PIPE's slots and is memory mapped into the high-level processor. Thus it forms the high-speed link between the low- and high-level vision processors. The mechanisms for bottom-up, data-driven processing and top-down, model-driven processing are discussed.

  13. Energy transfer and radiative recombination processes in (Gd, Lu)3Ga3Al2O12:Pr3+ scintillators

    NASA Astrophysics Data System (ADS)

    Wu, Yuntao; Ren, Guohao

    2013-10-01

(GdxLu3-x)Ga3Al2O12:0.3 at.%Pr (x = 0.025, 0.05, 0.1, 0.2, 0.4, 0.6) (GLGAG:Pr) polycrystalline powders were prepared by the solid-state reaction method. To better understand the luminescence mechanism, the optical diffuse reflectance, photoluminescence emission and excitation, X-ray excited luminescence spectra and decay kinetics of GLGAG:Pr were investigated in detail, allowing the determination of the energy transfer from the 5d state of Pr3+ to the 4f state of Gd3+, and the non-radiative relaxation from the 5d to the 4f state of Pr3+. The former process plays a more negative role in the emission quenching of GLGAG:Pr than the latter. Pr3+ is thus regarded as an ineffective activator ion in Gd-based multicomponent aluminate garnets. In addition, the wavelength-resolved thermoluminescence spectra of GLGAG:Pr were studied after UV and X-ray irradiation. They reveal that localized recombination from electron traps to the lower-lying 4f levels of Pr3+ occurs without populating the higher 5d levels of Pr3+.

  14. Osmoregulation, bioenergetics and oxidative stress in coastal marine invertebrates: raising the questions for future research.

    PubMed

    Rivera-Ingraham, Georgina A; Lignot, Jehan-Hervé

    2017-05-15

    Osmoregulation is by no means an energetically cheap process, and its costs have been extensively quantified in terms of respiration and aerobic metabolism. Common products of mitochondrial activity are reactive oxygen and nitrogen species, which may cause oxidative stress by degrading key cell components, while playing essential roles in cell homeostasis. Given the delicate equilibrium between pro- and antioxidants in fueling acclimation responses, the need for a thorough understanding of the relationship between salinity-induced oxidative stress and osmoregulation arises as an important issue, especially in the context of global changes and anthropogenic impacts on coastal habitats. This is especially urgent for intertidal/estuarine organisms, which may be subject to drastic salinity and habitat changes, leading to redox imbalance. How do osmoregulation strategies determine energy expenditure, and how do these processes affect organisms in terms of oxidative stress? What mechanisms are used to cope with salinity-induced oxidative stress? This Commentary aims to highlight the main gaps in our knowledge, covering all levels of organization. From an energy-redox perspective, we discuss the link between environmental salinity changes and physiological responses at different levels of biological organization. Future studies should seek to provide a detailed understanding of the relationship between osmoregulatory strategies and redox metabolism, thereby informing conservation physiologists and allowing them to tackle the new challenges imposed by global climate change. © 2017. Published by The Company of Biologists Ltd.

  15. The Avian Proghrelin System

    USDA-ARS?s Scientific Manuscript database

    To understand how the proghrelin system functions in regulating growth hormone release and food intake as well as defining its pleiotropic roles in such diverse physiological processes as energy homeostasis, gastrointestinal tract function and reproduction requires detailed knowledge of the structur...

  16. India Energy Outlook: End Use Demand in India to 2020

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    de la Rue du Can, Stephane; McNeil, Michael; Sathaye, Jayant

Integrated economic models have been used to project both baseline and mitigation greenhouse gas emissions scenarios at the country and the global level. Results of these scenarios are typically presented at the sectoral level, such as industry, transport, and buildings, without further disaggregation. Recently, a keen interest has emerged in constructing bottom-up scenarios where technical energy saving potentials can be displayed in detail (IEA, 2006b; IPCC, 2007; McKinsey, 2007). Analysts interested in particular technologies and policies require detailed information to understand specific mitigation options in relation to business-as-usual trends. However, the limited information available for developing countries often poses a problem. In this report, we have focused on analyzing energy use in India in greater detail. Results shown for the residential and transport sectors are taken from a previous report (de la Rue du Can, 2008). A complete picture of energy use at disaggregated levels is drawn to understand how energy is used in India and to put the different sources of end-use energy consumption in perspective. For each sector, drivers of energy use and technology are identified. Trends are then analyzed and used to project future growth. Results of this report provide valuable inputs to the elaboration of realistic energy efficiency scenarios.
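Bottom-up end-use analysis of this kind typically decomposes energy demand as activity times intensity, with each driver projected separately. A minimal sketch with purely illustrative numbers (not the report's India data):

```python
# Bottom-up end-use projection sketch: energy = activity x intensity,
# with each driver projected at its own growth rate.
# All base values and growth rates are hypothetical, for illustration only.
def project(base, growth_rate, years):
    """Compound-growth projection of a single driver."""
    return base * (1.0 + growth_rate) ** years

households_2010 = 200e6          # activity driver (hypothetical)
kwh_per_household_2010 = 600.0   # energy intensity (hypothetical)

households_2020 = project(households_2010, 0.02, 10)     # 2%/yr growth
intensity_2020 = project(kwh_per_household_2010, 0.03, 10)  # 3%/yr growth
residential_twh_2020 = households_2020 * intensity_2020 / 1e9
```

Separating the activity driver from the intensity driver is what lets efficiency scenarios be constructed: a policy case simply applies a lower intensity growth rate while keeping the activity projection unchanged.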

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cuevas, F.A.; Curilef, S., E-mail: scurilef@ucn.cl; Plastino, A.R., E-mail: arplastino@ugr.es

The spread of a wave-packet (or its deformation) is a very important topic in quantum mechanics. Understanding this phenomenon is relevant in connection with the study of diverse physical systems. In this paper we apply various 'spreading measures' to characterize the evolution of an initially localized wave-packet in a tight-binding lattice, with special emphasis on information-theoretical measures. We investigate the behavior of both the probability distribution associated with the wave packet and the concomitant probability current. Complexity measures based upon Rényi entropies appear to be particularly good descriptors of the details of the delocalization process. Highlights: spread of a highly localized wave-packet in the tight-binding lattice; entropic and information-theoretical characterization of the delocalization; investigation of both the probability distribution and the concomitant probability current; Rényi entropies as good descriptors of the details of the delocalization process.

  18. Image processing and pattern recognition with CVIPtools MATLAB toolbox: automatic creation of masks for veterinary thermographic images

    NASA Astrophysics Data System (ADS)

    Mishra, Deependra K.; Umbaugh, Scott E.; Lama, Norsang; Dahal, Rohini; Marino, Dominic J.; Sackman, Joseph

    2016-09-01

CVIPtools is a software package for the exploration of computer vision and image processing developed in the Computer Vision and Image Processing Laboratory at Southern Illinois University Edwardsville. CVIPtools is available in three variants - a) CVIPtools Graphical User Interface, b) CVIPtools C library and c) CVIPtools MATLAB toolbox - which makes it accessible to a variety of different users. It offers students, faculty, researchers and any other user a free and easy way to explore computer vision and image processing techniques. Many functions have been implemented and are updated on a regular basis; the library has reached a level of sophistication that makes it suitable for both educational and research purposes. In this paper, a detailed list of the functions available in the CVIPtools MATLAB toolbox is presented, along with how these functions can be used in image analysis and computer vision applications. The CVIPtools MATLAB toolbox allows the user to gain practical experience to better understand underlying theoretical problems in image processing and pattern recognition. As an example application, an algorithm for the automatic creation of masks for veterinary thermographic images is presented.
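As a hedged illustration of automatic mask creation (the paper's own CVIPtools algorithm is not reproduced here), a generic sketch using Otsu's threshold, which picks the gray level that best separates warm foreground from cool background in a thermogram-like image:

```python
# Generic automatic-mask sketch via Otsu's method: choose the threshold
# maximizing between-class variance. Not the CVIPtools implementation.
def otsu_threshold(pixels, levels=256):
    """Return the gray level maximizing between-class variance."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    sum_all = sum(i * h for i, h in enumerate(hist))
    sum_b = 0.0
    w_b = 0
    best_t, best_var = 0, -1.0
    for t in range(levels):
        w_b += hist[t]              # background weight up to level t
        if w_b == 0:
            continue
        w_f = total - w_b           # foreground weight
        if w_f == 0:
            break
        sum_b += t * hist[t]
        m_b = sum_b / w_b           # background mean
        m_f = (sum_all - sum_b) / w_f   # foreground mean
        var_between = w_b * w_f * (m_b - m_f) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Binary mask: warm (bright) pixels of a toy "thermogram" become True.
image = [10, 12, 11, 13, 200, 205, 198, 210]
t = otsu_threshold(image)
mask = [p > t for p in image]
```

On the toy image the threshold lands between the two intensity clusters, so the mask isolates exactly the bright region.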

  19. Series: Pragmatic trials and real world evidence: Paper 8. Data collection and management.

    PubMed

    Meinecke, Anna-Katharina; Welsing, Paco; Kafatos, George; Burke, Des; Trelle, Sven; Kubin, Maria; Nachbaur, Gaelle; Egger, Matthias; Zuidgeest, Mira

    2017-11-01

    Pragmatic trials can improve our understanding of how treatments will perform in routine practice. In a series of eight papers, the GetReal Consortium has evaluated the challenges in designing and conducting pragmatic trials and their specific methodological, operational, regulatory, and ethical implications. The present final paper of the series discusses the operational and methodological challenges of data collection in pragmatic trials. A more pragmatic data collection needs to balance the delivery of highly accurate and complete data with minimizing the level of interference that data entry and verification induce with clinical practice. Furthermore, it should allow for the involvement of a representative sample of practices, physicians, and patients who prescribe/receive treatment in routine care. This paper discusses challenges that are related to the different methods of data collection and presents potential solutions where possible. No one-size-fits-all recommendation can be given for the collection of data in pragmatic trials, although in general the application of existing routinely used data-collection systems and processes seems to best suit the pragmatic approach. However, data access and privacy, the time points of data collection, the level of detail in the data, and the lack of a clear understanding of the data-collection process were identified as main challenges for the usage of routinely collected data in pragmatic trials. A first step should be to determine to what extent existing health care databases provide the necessary study data and can accommodate data collection and management. When more elaborate or detailed data collection or more structured follow-up is required, data collection in a pragmatic trial will have to be tailor-made, often using a hybrid approach using a dedicated electronic case report form (eCRF). 
In this case, the eCRF should be kept as simple as possible to reduce the burden for practitioners and minimize influence on routine clinical practice. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  20. The perception of emotion in body expressions.

    PubMed

    de Gelder, B; de Borst, A W; Watson, R

    2015-01-01

During communication, we perceive and express emotional information through many different channels, including facial expressions, prosody, body motion, and posture. Although historically the human body has been perceived primarily as a tool for actions, there is now increased understanding that the body is also an important medium for emotional expression. Indeed, research on emotional body language is rapidly emerging as a new field in cognitive and affective neuroscience. This article reviews how whole-body signals are processed and understood, at the behavioral and neural levels, with specific reference to their role in emotional communication. The first part of this review outlines brain regions and spectrotemporal dynamics underlying perception of isolated neutral and affective bodies, the second part details the contextual effects on body emotion recognition, and the final part discusses body processing on a subconscious level. More specifically, research has shown that body expressions as compared with neutral bodies draw upon a larger network of regions responsible for action observation and preparation, emotion processing, body processing, and integrative processes. Results from neurotypical populations and masking paradigms suggest that subconscious processing of affective bodies relies on a specific subset of these regions. Moreover, recent evidence has shown that emotional information from the face, voice, and body all interact, with body motion and posture often highlighting and intensifying the emotion expressed in the face and voice. © 2014 John Wiley & Sons, Ltd.

  1. Using Websites to Convey Scientific Uncertainties for Volcanic Processes and Potential Hazards

    NASA Astrophysics Data System (ADS)

    Venezky, D. Y.; Lowenstern, J. B.; Hill, D. P.

    2005-12-01

    The Yellowstone Volcano Observatory (YVO) and Long Valley Observatory (LVO) websites have greatly increased the public's awareness and access to information about scientific uncertainties for volcanic processes by communicating at multiple levels of understanding and varied levels of detail. Our websites serve a broad audience ranging from visitors unaware of the calderas, to lay volcano enthusiasts, to scientists, federal agencies, and emergency managers. Both Yellowstone and Long Valley are highly visited tourist attractions with histories of caldera-forming eruptions large enough to alter global climate temporarily. Although it is much more likely that future activity would be on a small scale at either volcano, we are constantly posed questions about low-probability, high-impact events such as the caldera-forming eruption depicted in the recent BBC/Discovery movie, "Supervolcano". YVO and LVO website objectives include: providing monitoring data, explaining the likelihood of future events, summarizing research results, helping media provide reliable information, and expanding on information presented by the media. Providing detailed current information is a crucial website component as the public often searches online to augment information gained from often cryptic pronouncements by the media. In May 2005, for example, YVO saw an order of magnitude increase in page requests on the day MSNBC ran the misleading headline, "Yellowstone eruption threat high." The headline referred not to current events but a general rating of Yellowstone as one of 37 "high threat" volcanoes in the USGS National Volcano Early Warning System report. As websites become a more dominant source of information, we continuously revise our communication plans to make the most of this evolving medium. 
Because the internet gives equal access to all information providers, we find ourselves competing with various "doomsday" websites that sensationalize and distort the current understanding of natural systems. For example, many sites highlight a miscalculated repose period for caldera-forming eruptions at Yellowstone and conclude that a catastrophic eruption is overdue. Recent revisions on the YVO website have discussed how intervals are calculated and why the commonly quoted values are incorrect. Our aim is to reduce confusion by providing clear, simple explanations that highlight the process by which scientists reach conclusions and calculate associated uncertainties.

  2. The surface elevation table and marker horizon technique: A protocol for monitoring wetland elevation dynamics

    USGS Publications Warehouse

    James C. Lynch,; Phillippe Hensel,; Cahoon, Donald R.

    2015-01-01

    The National Park Service, in response to the growing evidence and awareness of the effects of climate change on federal lands, determined that monitoring wetland elevation change is a top priority in North Atlantic Coastal parks (Stevens et al, 2010). As a result, the NPS Northeast Coastal and Barrier Network (NCBN) in collaboration with colleagues from the U.S. Geological Survey (USGS) and The National Oceanic and Atmospheric Administration (NOAA) have developed a protocol for monitoring wetland elevation change and other processes important for determining the viability of wetland communities. Although focused on North Atlantic Coastal parks, this document is applicable to all coastal and inland wetland regions. Wetlands exist within a narrow range of elevation which is influenced by local hydrologic conditions. For coastal wetlands in particular, local hydrologic conditions may be changing as sea levels continue to rise. As sea level rises, coastal wetland systems may respond by building elevation to maintain favorable hydrologic conditions for their survival. This protocol provides the reader with instructions and guidelines on designing a monitoring plan or study to: A) Quantify elevation change in wetlands with the Surface Elevation Table (SET). B) Understand the processes that influence elevation change, including vertical accretion (SET and Marker Horizon methods). C) Survey the wetland surface and SET mark to a common reference datum to allow for comparing sample stations to each other and to local tidal datums. D) Survey the SET mark to monitor its relative stability. This document is divided into two parts; the main body that presents an overview of all aspects of monitoring wetland elevation dynamics, and a collection of Standard Operating Procedures (SOP) that describes in detail how to perform or execute each step of the methodology. 
Detailed instructions on installation, data collection, data management, and analysis are provided in this report and the associated SOPs. A better understanding of these processes will help determine the present and future viability of coastal wetlands managed by the NPS and can inform measures that will ensure these communities persist into the future.
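The core arithmetic of the SET and marker-horizon methods can be sketched simply: elevation-change and accretion rates are estimated from repeated readings over time, and shallow subsidence is accretion minus surface elevation change. The readings below are hypothetical:

```python
# Sketch of SET / marker-horizon rate calculations. Data are hypothetical;
# real stations average many pins over multiple measurement directions.
def rate_mm_per_yr(years, readings_mm):
    """Least-squares slope of readings vs. time."""
    n = len(years)
    my, mr = sum(years) / n, sum(readings_mm) / n
    num = sum((y - my) * (r - mr) for y, r in zip(years, readings_mm))
    den = sum((y - my) ** 2 for y in years)
    return num / den

years = [0, 1, 2, 3, 4]
elevation = [0.0, 2.1, 3.9, 6.2, 8.0]    # SET readings, mm above baseline
accretion = [0.0, 3.0, 6.1, 8.9, 12.0]   # marker-horizon readings, mm

elev_rate = rate_mm_per_yr(years, elevation)   # surface elevation change
accr_rate = rate_mm_per_yr(years, accretion)   # vertical accretion
shallow_subsidence = accr_rate - elev_rate     # accretion not kept as elevation
```

Comparing the elevation-change rate against the local rate of sea-level rise is what indicates whether a marsh is keeping pace; the accretion-minus-elevation difference isolates shallow subsurface processes such as compaction.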

  3. Measurement and control of detailed electronic properties in a single molecule break junction.

    PubMed

    Wang, Kun; Hamill, Joseph; Zhou, Jianfeng; Guo, Cunlan; Xu, Bingqian

    2014-01-01

The lack of detailed experimental controls has been one of the major obstacles hindering progress in molecular electronics. While large fluctuations occur in the experimental data, specific details, related mechanisms, and data-analysis techniques are in high demand to promote our physical understanding at the single-molecule level. A series of modulations we recently developed, based on traditional scanning probe microscopy break junctions (SPMBJs), has helped to discover significant detailed properties hidden in the contact interfaces of a single-molecule break junction (SMBJ). For example, in the past we have shown that the correlated force and conductance changes under the sawtooth modulation and stretch-hold mode of PZT movement revealed inherent differences in the contact geometries of a molecular junction. In this paper, using a bias-modulated SPMBJ and emerging data-analysis techniques, we report on the measurement of how the alignment of the HOMO of benzene molecules is altered by changing the anchoring group that couples the molecule to the metal electrodes. Further calculations based on Landauer fitting and transition voltage spectroscopy (TVS) demonstrated the effects of the modulated bias on the location of the frontier molecular orbitals. Understanding the alignment of the molecular orbitals with the Fermi level of the electrodes is essential for understanding the behaviour of SMBJs and for the future design of more complex devices. With these modulations and analysis techniques, fruitful information has been obtained about the nature of the metal-molecule junction, providing insightful clues towards the next step of in-depth study.
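The Landauer fitting and TVS analysis mentioned above can be sketched with the textbook single-level (Breit-Wigner) model: a Lorentzian transmission centred on the frontier orbital, integrated over the bias window, with the transition voltage read off as the minimum of the Fowler-Nordheim plot. All parameters below are illustrative, not fitted values from the paper:

```python
# Toy single-level Landauer model and transition-voltage (TVS) analysis.
# eps = orbital level relative to the Fermi energy (eV); gamma = lead coupling.
# Illustrative parameters only; not the paper's fitting procedure.
import math

def transmission(E, eps=0.8, gamma=0.05):
    """Breit-Wigner (Lorentzian) transmission for symmetric coupling."""
    return gamma**2 / ((E - eps)**2 + gamma**2)

def current(V, n=2000):
    """Zero-temperature Landauer current (arb. units): midpoint-rule
    integral of T(E) over the symmetric bias window [-V/2, +V/2]."""
    dE = V / n
    return sum(transmission(-V/2 + (i + 0.5) * dE) for i in range(n)) * dE

# Fowler-Nordheim plot: ln(I/V^2) vs 1/V; its minimum marks V_trans.
voltages = [0.2 + 0.01 * i for i in range(180)]   # 0.2 .. 1.99 V
fn = [math.log(current(V) / V**2) for V in voltages]
v_trans = voltages[fn.index(min(fn))]
```

In this toy model the minimum appears well below the resonance at 2*eps, illustrating why V_trans is taken as a proxy for frontier-orbital alignment rather than a direct reading of the level position.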

  4. Pectin engineering to modify product quality in potato.

    PubMed

    Ross, Heather A; Morris, Wayne L; Ducreux, Laurence J M; Hancock, Robert D; Verrall, Susan R; Morris, Jenny A; Tucker, Gregory A; Stewart, Derek; Hedley, Pete E; McDougall, Gordon J; Taylor, Mark A

    2011-10-01

    Although processed potato tuber texture is an important trait that influences consumer preference, a detailed understanding of tuber textural properties at the molecular level is lacking. Previous work has identified tuber pectin methyl esterase (PME) activity as a potential factor impacting on textural properties, and the expression of a gene encoding an isoform of PME (PEST1) was associated with cooked tuber textural properties. In this study, a transgenic approach was undertaken to investigate further the impact of the PEST1 gene. Antisense and over-expressing potato lines were generated. In over-expressing lines, tuber PME activity was enhanced by up to 2.3-fold; whereas in antisense lines, PME activity was decreased by up to 62%. PME isoform analysis indicated that the PEST1 gene encoded one isoform of PME. Analysis of cell walls from tubers from the over-expressing lines indicated that the changes in PME activity resulted in a decrease in pectin methylation. Analysis of processed tuber texture demonstrated that the reduced level of pectin methylation in the over-expressing transgenic lines was associated with a firmer processed texture. Thus, there is a clear link between PME activity, pectin methylation and processed tuber textural properties. © 2011 The Authors. Plant Biotechnology Journal © 2011 Society for Experimental Biology, Association of Applied Biologists and Blackwell Publishing Ltd.

  5. DNAproDB: an interactive tool for structural analysis of DNA–protein complexes

    PubMed Central

    Sagendorf, Jared M.

    2017-01-01

    Abstract Many biological processes are mediated by complex interactions between DNA and proteins. Transcription factors, various polymerases, nucleases and histones recognize and bind DNA with different levels of binding specificity. To understand the physical mechanisms that allow proteins to recognize DNA and achieve their biological functions, it is important to analyze structures of DNA–protein complexes in detail. DNAproDB is a web-based interactive tool designed to help researchers study these complexes. DNAproDB provides an automated structure-processing pipeline that extracts structural features from DNA–protein complexes. The extracted features are organized in structured data files, which are easily parsed with any programming language or viewed in a browser. We processed a large number of DNA–protein complexes retrieved from the Protein Data Bank and created the DNAproDB database to store this data. Users can search the database by combining features of the DNA, protein or DNA–protein interactions at the interface. Additionally, users can upload their own structures for processing privately and securely. DNAproDB provides several interactive and customizable tools for creating visualizations of the DNA–protein interface at different levels of abstraction that can be exported as high quality figures. All functionality is documented and freely accessible at http://dnaprodb.usc.edu. PMID:28431131
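As an illustration of consuming such structured output, a minimal Python sketch follows. The field names here are hypothetical placeholders, not the actual DNAproDB schema (which is documented at http://dnaprodb.usc.edu):

```python
# Sketch: parsing a DNAproDB-style structured record and tallying interface
# contacts. The JSON layout and keys below are hypothetical, for illustration.
import json

record = json.loads("""
{
  "pdb_id": "XXXX",
  "interface": {
    "contacts": [
      {"residue": "ARG", "nucleotide": "DG", "type": "major_groove"},
      {"residue": "LYS", "nucleotide": "DT", "type": "backbone"}
    ]
  }
}
""")

# Tally interface contacts by the DNA moiety contacted.
counts = {}
for c in record["interface"]["contacts"]:
    counts[c["type"]] = counts.get(c["type"], 0) + 1
```

Because the output is plain structured data, the same few lines scale from one structure to a database-wide survey of, say, groove versus backbone readout preferences across protein families.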

  6. Respiratory factors in the uptake and excretion of anesthetics. 1965.

    PubMed

    Epstein, R M; Papper, E M

    1998-01-01

    We have considered some of the ways in which respiration can affect the gas exchange process. The simplest relationships are purely physical and relate to the speed with which the lung and tissues can be filled or emptied. More complex relationships involve a consideration of the interplay between blood and gas in the lung and the effects of gas exchange on respiratory volumes themselves. Finally, some examples of the importance of physiologic alteration produced by, and producing respiratory shifts during, gas uptake processes were presented briefly. The detailed interpretation of gas exchange phenomena demands more quantitative information of this sort, concerning not only the respiratory but the circulatory and tissue level variations affecting uptake during anesthesia. Nevertheless, understanding of the principles and application of such data as are available can go far toward removing the handicaps of empirical practice from the day-to-day administration of anesthetic agents to human beings.

  7. Identification of Upper and Lower Level Yield Strength in Materials

    PubMed Central

    Valíček, Jan; Harničárová, Marta; Kopal, Ivan; Palková, Zuzana; Kušnerová, Milena; Panda, Anton; Šepelák, Vladimír

    2017-01-01

    This work evaluates the possibility of identifying mechanical parameters, especially upper and lower yield points, by the analytical processing of specific elements of the topography of surfaces generated with abrasive waterjet technology. We developed a new system of equations, which are connected with each other in such a way that the result of a calculation is a comprehensive mathematical–physical model, which describes numerically as well as graphically the deformation process of material cutting using an abrasive waterjet. The results of our model have been successfully checked against those obtained by means of a tensile test. The main prospect for future applications of the method presented in this article concerns the identification of mechanical parameters associated with the prediction of material behavior. The findings of this study can contribute to a more detailed understanding of the relationships: material properties—tool properties—deformation properties. PMID:28832526

  8. Complete protein-protein association kinetics in atomic detail revealed by molecular dynamics simulations and Markov modelling

    NASA Astrophysics Data System (ADS)

    Plattner, Nuria; Doerr, Stefan; de Fabritiis, Gianni; Noé, Frank

    2017-10-01

    Protein-protein association is fundamental to many life processes. However, a microscopic model describing the structures and kinetics during association and dissociation is lacking on account of the long lifetimes of associated states, which have prevented efficient sampling by direct molecular dynamics (MD) simulations. Here we demonstrate protein-protein association and dissociation in atomistic resolution for the ribonuclease barnase and its inhibitor barstar by combining adaptive high-throughput MD simulations and hidden Markov modelling. The model reveals experimentally consistent intermediate structures, energetics and kinetics on timescales from microseconds to hours. A variety of flexibly attached intermediates and misbound states funnel down to a transition state and a native basin consisting of the loosely bound near-native state and the tightly bound crystallographic state. These results offer a deeper level of insight into macromolecular recognition and our approach opens the door for understanding and manipulating a wide range of macromolecular association processes.
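The hidden Markov modelling mentioned above builds on the simpler idea of a Markov state model estimated from discretized MD trajectories. A minimal sketch of that underlying idea (not the authors' actual pipeline) counts transitions at a fixed lag time, row-normalizes into a transition matrix, and extracts the stationary distribution:

```python
import numpy as np

def estimate_msm(dtraj, n_states, lag=1):
    """Estimate a Markov state model transition matrix from a
    discretized trajectory by counting transitions at a fixed lag."""
    counts = np.zeros((n_states, n_states))
    for i in range(len(dtraj) - lag):
        counts[dtraj[i], dtraj[i + lag]] += 1
    # Row-normalize counts into transition probabilities.
    return counts / counts.sum(axis=1, keepdims=True)

def stationary_distribution(T):
    """Stationary distribution = left eigenvector of T for eigenvalue 1."""
    vals, vecs = np.linalg.eig(T.T)
    pi = np.real(vecs[:, np.argmax(np.real(vals))])
    return pi / pi.sum()

# Toy two-state trajectory: 0 = dissociated, 1 = bound (illustrative only).
dtraj = [0, 0, 1, 1, 1, 0, 1, 1, 0, 0, 1, 1]
T = estimate_msm(dtraj, n_states=2)
pi = stationary_distribution(T)
```

Real applications of this idea use many metastable states and validate the lag-time choice; the two-state toy trajectory here only shows the mechanics.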

  9. Identification of Upper and Lower Level Yield Strength in Materials.

    PubMed

    Valíček, Jan; Harničárová, Marta; Kopal, Ivan; Palková, Zuzana; Kušnerová, Milena; Panda, Anton; Šepelák, Vladimír

    2017-08-23

    This work evaluates the possibility of identifying mechanical parameters, especially upper and lower yield points, by the analytical processing of specific elements of the topography of surfaces generated with abrasive waterjet technology. We developed a new system of equations, which are connected with each other in such a way that the result of a calculation is a comprehensive mathematical-physical model, which describes numerically as well as graphically the deformation process of material cutting using an abrasive waterjet. The results of our model have been successfully checked against those obtained by means of a tensile test. The main prospect for future applications of the method presented in this article concerns the identification of mechanical parameters associated with the prediction of material behavior. The findings of this study can contribute to a more detailed understanding of the relationships: material properties-tool properties-deformation properties.

  10. Melt-growth dynamics in CdTe crystals

    DOE PAGES

    Zhou, X. W.; Ward, D. K.; Wong, B. M.; ...

    2012-06-01

    We use a new, quantum-mechanics-based bond-order potential (BOP) to reveal melt growth dynamics and fine scale defect formation mechanisms in CdTe crystals. Previous molecular dynamics simulations of semiconductors have shown qualitatively incorrect behavior due to the lack of an interatomic potential capable of predicting both crystalline growth and property trends of many transitional structures encountered during the melt → crystal transformation. Here, we demonstrate successful molecular dynamics simulations of melt growth in CdTe using a BOP that significantly improves over other potentials on property trends of different phases. Our simulations result in a detailed understanding of defect formation during the melt growth process. Equally important, we show that the new BOP enables defect formation mechanisms to be studied at a scale level comparable to empirical molecular dynamics simulation methods with a fidelity level approaching quantum-mechanical methods.

  11. Fusing modeling techniques to support domain analysis for reuse opportunities identification

    NASA Technical Reports Server (NTRS)

    Hall, Susan Main; Mcguire, Eileen

    1993-01-01

    Which are more useful to someone trying to understand the general design or high-level requirements of a system: functional modeling techniques or object-oriented graphical representations? For a recent domain analysis effort, the answer was a fusion of popular modeling techniques of both types. By using both functional and object-oriented techniques, the analysts involved were able to lean on their experience in function-oriented software development while taking advantage of the descriptive power available in object-oriented models. In addition, a base of familiar modeling methods permitted the group of mostly new domain analysts to learn the details of the domain analysis process while producing a quality product. This paper describes the background of this project and then provides a high-level definition of domain analysis. The majority of this paper focuses on the modeling method developed and utilized during this analysis effort.

  12. Care erosion in hospitals: Problems in reflective nursing practice and the role of cognitive dissonance.

    PubMed

    de Vries, Jan; Timmins, Fiona

    2016-03-01

    Care erosion - gradual decline in care level - is an important problem in health care today. Unfortunately, the mechanism whereby it occurs is complex and poorly understood. This paper seeks to address this by emphasising problems in reflective nursing practice. Critical reflection on quality of care which should drive good care instead spawns justifications, denial, and trivialisation of deficient care. This perpetuates increasingly poor care levels. We argue that cognitive dissonance theory provides a highly effective understanding of this process and suggest for this approach to be incorporated in all efforts to address care erosion. The paper includes a detailed discussion of examples and implications for practice, in particular the need to restore critical reflection in nursing, the importance of embracing strong values and standards, and the need for increased awareness of signs of care erosion. Copyright © 2015 Elsevier Ltd. All rights reserved.

  13. CVD-Enabled Graphene Manufacture and Technology

    PubMed Central

    2015-01-01

    Integrated manufacturing is arguably the most challenging task in the development of technology based on graphene and other 2D materials, particularly with regard to the industrial demand for “electronic-grade” large-area films. In order to control the structure and properties of these materials at the monolayer level, their nucleation, growth and interfacing needs to be understood to a level of unprecedented detail compared to existing thin film or bulk materials. Chemical vapor deposition (CVD) has emerged as the most versatile and promising technique to develop graphene and 2D material films into industrial device materials and this Perspective outlines recent progress, trends, and emerging CVD processing pathways. A key focus is the emerging understanding of the underlying growth mechanisms, in particular on the role of the required catalytic growth substrate, which brings together the latest progress in the fields of heterogeneous catalysis and classic crystal/thin-film growth. PMID:26240694

  14. The auditory neural network in man

    NASA Technical Reports Server (NTRS)

    Galambos, R.

    1975-01-01

    The principles of anatomy and physiology necessary for understanding brain wave recordings made from the scalp are briefly discussed. Brain waves evoked by sounds are then described and certain of their features are related to the physical aspects of the stimulus and the psychological state of the listener. It is proposed that data obtained through probes located outside the head can reveal a large amount of detail about brain activity. It is argued that analysis of such records enables one to detect the response of the nervous system to an acoustic message at the moment of its inception at the ear, and to follow the progress of the acoustic message up through the various brain levels as progressively more complex operations are performed upon it. Even those brain events responsible for the highest level of signal processing - distinguishing between similar signals and making decisions about them - seem to generate characteristic and identifiable electrical waves.

  15. Ion Beam Processing.

    DTIC Science & Technology

    1987-03-13

    Applications discussed include guides, taps for plastics, orthopedic implants (hip and knee joints, etc.), extrusion spinnerettes, finishing rolls for copper rod, and extrusion nozzles, treated in detail in following sections. C. Comparison to Coating Techniques: Because ion implantation is a process that modifies surface properties, it is often compared to coating techniques. Therefore, it is important to understand the differences between ion implantation and coating techniques, especially ion plating. The result of ion

  16. Research on Interventions for Adolescents with Learning Disabilities: A Meta-Analysis of Outcomes Related to Higher-Order Processing.

    ERIC Educational Resources Information Center

    Swanson, H. Lee

    2001-01-01

    Details meta-analysis of 58 intervention studies related to higher-order processing (i.e., problem solving) for adolescents with learning disabilities. Discusses factors that increased effect sizes: (1) measures of metacognition and text understanding; (2) instruction including advanced organizers, new skills, and extended practice; and (3)…

  17. Toward a Stress Process Model of Children's Exposure to Physical Family and Community Violence

    ERIC Educational Resources Information Center

    Foster, Holly; Brooks-Gunn, Jeanne

    2009-01-01

    Theoretically informed models are required to further the comprehensive understanding of children's exposure to violence (ETV). We draw on the stress process paradigm to forward an overall conceptual model of ETV in childhood and adolescence. Around this conceptual model, we synthesize research in four dominant areas of the literature which are detailed but often…

  18. Towards an Understanding of the Business Process Analyst: An Analysis of Competencies

    ERIC Educational Resources Information Center

    Sonteya, Thembela; Seymour, Lisa

    2012-01-01

    The increase in adoption of business process management (BPM) and service oriented architecture (SOA) has created a high demand for qualified professionals with a plethora of skills. However, despite the growing amount of literature available on the topics of BPM and SOA, little research has been conducted around developing a detailed list of…

  19. Understanding the mechanisms of amorphous creep through molecular simulation

    NASA Astrophysics Data System (ADS)

    Cao, Penghui; Short, Michael P.; Yip, Sidney

    2017-12-01

    Molecular processes of creep in metallic glass thin films are simulated at experimental timescales using a metadynamics-based atomistic method. Space-time evolutions of the atomic strains and nonaffine atom displacements are analyzed to reveal details of the atomic-level deformation and flow processes of amorphous creep in response to stress and thermal activations. From the simulation results, resolved spatially on the nanoscale and temporally over time increments of fractions of a second, we derive a mechanistic explanation of the well-known variation of creep rate with stress. We also construct a deformation map delineating the predominant regimes of diffusional creep at low stress and high temperature and deformational creep at high stress. Our findings validate the relevance of two original models of the mechanisms of amorphous plasticity: one focusing on atomic diffusion via free volume and the other focusing on stress-induced shear deformation. These processes are found to be nonlinearly coupled through dynamically heterogeneous fluctuations that characterize the slow dynamics of systems out of equilibrium.

  20. The biology of deception: emotion and morphine.

    PubMed

    Stefano, G B; Fricchione, G L

    1995-01-01

    The biology of deception suggests that denial-like processes are at the core of cognitive coping. In this regard, with cognitive ability, one associates or assumes that this process occurs by way of a 'rational' mind. Such a detailed cognitive process would, counterintuitively, also lead to inactivity and/or major delays in reaching conclusions. Thus, our perceived rationality may itself be a deceptive behavioral response. Equally noteworthy, man is also 'emotional'. We surmise that emotion represents the pre-cognitive short-cut to overcome this potential for excessive rationality. In this light, we may explain certain psychiatric disorders such as obsessive-compulsive behavior as emotional extremes dealing with cognitive habits used to bind anxiety, operating most probably at the pre-cognitive level. Given recent discoveries in neuroimmunology and an understanding of naturally occurring morphine as both an immune and neurological down-regulatory substance, we hypothesize that abnormalities associated with emotional extremes may be due, in part, to morphinergic imbalances.

  1. Automated quantitative histology reveals vascular morphodynamics during Arabidopsis hypocotyl secondary growth.

    PubMed

    Sankar, Martial; Nieminen, Kaisa; Ragni, Laura; Xenarios, Ioannis; Hardtke, Christian S

    2014-02-11

    Among various advantages, their small size makes model organisms preferred subjects of investigation. Yet, even in model systems detailed analysis of numerous developmental processes at the cellular level is severely hampered by their scale. For instance, secondary growth of Arabidopsis hypocotyls creates a radial pattern of highly specialized tissues that comprises several thousand cells starting from a few dozen. This dynamic process is difficult to follow because of its scale and because it can only be investigated invasively, precluding comprehensive understanding of the cell proliferation, differentiation, and patterning events involved. To overcome this limitation, we established an automated quantitative histology approach. We acquired hypocotyl cross-sections from tiled high-resolution images and extracted their information content using custom high-throughput image processing and segmentation. Coupled with automated cell type recognition through machine learning, we could establish a cellular resolution atlas that reveals vascular morphodynamics during secondary growth, for example, equidistant phloem pole formation. DOI: http://dx.doi.org/10.7554/eLife.01567.001.
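The automated cell type recognition described above relies on supervised machine learning over features extracted from segmented cells. As a hypothetical, much-simplified stand-in for whatever classifier the authors actually used, a nearest-centroid rule over toy feature vectors illustrates the idea:

```python
import numpy as np

def nearest_centroid_fit(features, labels):
    """Train: compute one centroid per cell type from labeled features."""
    classes = sorted(set(labels))
    labels = np.array(labels)
    return {c: features[labels == c].mean(axis=0) for c in classes}

def nearest_centroid_predict(centroids, x):
    """Predict: assign the cell type whose centroid is closest to x."""
    return min(centroids, key=lambda c: np.linalg.norm(x - centroids[c]))

# Toy features per cell: [area, wall thickness] (invented for illustration).
X = np.array([[10.0, 1.0], [12.0, 1.2], [40.0, 3.0], [42.0, 2.8]])
y = ["phloem", "phloem", "xylem", "xylem"]

centroids = nearest_centroid_fit(X, y)
pred = nearest_centroid_predict(centroids, np.array([11.0, 1.1]))
```

A production pipeline would use richer morphological features and a stronger classifier, but the fit/predict structure is the same.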

  2. Pulsed Electrochemical Mass Spectrometry for Operando Tracking of Interfacial Processes in Small-Time-Constant Electrochemical Devices such as Supercapacitors.

    PubMed

    Batisse, Nicolas; Raymundo-Piñero, Encarnación

    2017-11-29

    A more detailed understanding of the electrode/electrolyte interface degradation during the charging cycle in supercapacitors is of great interest for exploring the voltage stability range and therefore the extractable energy. The evaluation of the gas evolution during the charging, discharging, and aging processes is a powerful tool toward determining the stability and energy capacity of supercapacitors. Here, we attempt to fit the gas analysis resolution to the time response of a low-gas-generation power device by adopting a modified pulsed electrochemical mass spectrometry (PEMS) method. The pertinence of the method is shown using a symmetric carbon/carbon supercapacitor operating in different aqueous electrolytes. The differences observed in the gas levels and compositions as a function of the cell voltage correlate to the evolution of the physicochemical characteristics of the carbon electrodes and to the electrochemical performance, giving a complete picture of the processes taking place at the electrode/electrolyte interface.

  3. Automated quantitative histology reveals vascular morphodynamics during Arabidopsis hypocotyl secondary growth

    PubMed Central

    Sankar, Martial; Nieminen, Kaisa; Ragni, Laura; Xenarios, Ioannis; Hardtke, Christian S

    2014-01-01

    Among various advantages, their small size makes model organisms preferred subjects of investigation. Yet, even in model systems detailed analysis of numerous developmental processes at the cellular level is severely hampered by their scale. For instance, secondary growth of Arabidopsis hypocotyls creates a radial pattern of highly specialized tissues that comprises several thousand cells starting from a few dozen. This dynamic process is difficult to follow because of its scale and because it can only be investigated invasively, precluding comprehensive understanding of the cell proliferation, differentiation, and patterning events involved. To overcome this limitation, we established an automated quantitative histology approach. We acquired hypocotyl cross-sections from tiled high-resolution images and extracted their information content using custom high-throughput image processing and segmentation. Coupled with automated cell type recognition through machine learning, we could establish a cellular resolution atlas that reveals vascular morphodynamics during secondary growth, for example, equidistant phloem pole formation. DOI: http://dx.doi.org/10.7554/eLife.01567.001 PMID:24520159

  4. Metabolomics - the complementary field in systems biology: a review on obesity and type 2 diabetes.

    PubMed

    Abu Bakar, Mohamad Hafizi; Sarmidi, Mohamad Roji; Cheng, Kian-Kai; Ali Khan, Abid; Suan, Chua Lee; Zaman Huri, Hasniza; Yaakob, Harisun

    2015-07-01

    Metabolomic studies on obesity and type 2 diabetes mellitus have led to a number of mechanistic insights into biomarker discovery and comprehension of disease progression at metabolic levels. This article reviews a series of metabolomic studies carried out in previous and recent years on obesity and type 2 diabetes, which have shown potential metabolic biomarkers for further evaluation of the diseases. Literature, including journals and books from Web of Science, Pubmed and related databases reporting on metabolomics in these particular disorders, is reviewed. We herein discuss the potential of reported metabolic biomarkers for a novel understanding of disease processes. These biomarkers include fatty acids, TCA cycle intermediates, carbohydrates, amino acids, choline and bile acids. The biological activities and aetiological pathways of metabolites of interest in driving these intricate processes are explained. The data from various publications supported metabolomics as an effective strategy in the identification of novel biomarkers for obesity and type 2 diabetes. Accelerating interest in the perspective of metabolomics to complement other fields in systems biology towards the in-depth understanding of the molecular mechanisms underlying the diseases is also well appreciated. In conclusion, metabolomics can be used as one of the alternative approaches in biomarker discovery and the novel understanding of pathophysiological mechanisms in obesity and type 2 diabetes. It can be foreseen that there will be an increasing research interest to combine metabolomics with other omics platforms towards the establishment of detailed mechanistic evidence associated with the disease processes.

  5. Holocene evolution of Apalachicola Bay, Florida

    USGS Publications Warehouse

    Osterman, Lisa E.; Twichell, David C.

    2011-01-01

    A program of geophysical mapping and vibracoring was conducted in 2007 to better understand the geologic evolution of Apalachicola Bay and its response to sea-level rise. A detailed geologic history could help better understand how this bay may respond to both short-term (for example, storm surge) and long-term sea-level rise. The results of this study were published (Osterman and others, 2009) as part of a special issue of Geo-Marine Letters that documents early results from the Northern Gulf of Mexico (NGOM) Ecosystem Change and Hazard Susceptibility Project.

  6. Consciousness as a graded and an all-or-none phenomenon: A conceptual analysis.

    PubMed

    Windey, Bert; Cleeremans, Axel

    2015-09-01

    The issue of whether consciousness is a graded or an all-or-none phenomenon has been, and continues to be, debated. Both contradictory accounts are supported by solid evidence. Starting from a level-of-processing framework allowing for states of partial awareness, here we further elaborate our view that visual experience, as it is most often investigated in the literature, is both graded and all-or-none. Low-level visual experience is graded, whereas high-level visual experience is all-or-none. We then present a conceptual analysis starting from the notion that consciousness is a general concept. We specify a number of different subconcepts present in the literature on consciousness, and outline how each of them may be seen as either graded, all-or-none, or both. We argue that such specifications are necessary to lead to a detailed and integrated understanding of how consciousness should be conceived of as graded and all-or-none. Copyright © 2015 Elsevier Inc. All rights reserved.

  7. The application of quasi-steady approximation in atomic kinetics in simulation of hohlraum radiation drive

    NASA Astrophysics Data System (ADS)

    Ren, Guoli; Pei, Wenbing; Lan, Ke; Gu, Peijun; Li, Xin; Institute of Applied Physics; Computional Mathematics Team

    2011-10-01

    In current routine 2D simulations of hohlraum physics, we adopt the principal-quantum-number (n-level) average atom model (AAM). However, the experimental frequency-dependent radiative drive differs from our n-level simulated drive, which reminds us of the need for a more detailed atomic kinetics description. The orbital-quantum-number (nl-level) AAM is a natural consideration, but the in-line calculation consumes much more resources. We use a new method to build up an nl-level bound electron distribution using the in-line n-level calculated plasma conditions (such as temperature, density, average ionization degree). We name this method ``quasi-steady approximation.'' Using the re-built nl-level bound electron distribution (Pnl), we acquire a new hohlraum radiative drive by post-processing. Comparison with the n-level post-processed hohlraum drive shows that we get an almost identical radiation flux but with more detailed frequency-dependent structures.

  8. Special issue on mercury in Canada's North: summary and recommendations for future research.

    PubMed

    Chételat, John; Braune, Birgit; Stow, Jason; Tomlinson, Scott

    2015-03-15

    Important scientific advances have been made over the last decade in identifying the environmental fate of mercury and the processes that control its cycling in the Canadian Arctic. This special issue includes a series of six detailed reviews that summarize the main findings of a scientific assessment undertaken by the Government of Canada's Northern Contaminants Program. It was the first assessment to focus exclusively on mercury pollution in the Canadian Arctic. Key findings, as detailed in the reviews, relate to sources and long-range transport of mercury to the Canadian Arctic, its cycling within marine, freshwater, and terrestrial environments, and its bioaccumulation in, and effects on, the biota that live there. While these accomplishments are significant, the complex nature of the mercury cycle continues to provide challenges in characterizing and quantifying the relationships of mercury sources and transport processes with mercury levels in biota and biological effects of mercury exposure. Of particular concern are large uncertainties in our understanding of the processes that are contributing to increasing mercury concentrations in some Arctic fish and wildlife. Specific recommendations are provided for future research and monitoring of the environmental impacts of anthropogenic mercury emissions, influences of climate change, and the effectiveness of mitigation strategies for mercury in the Canadian Arctic. Crown Copyright © 2014. Published by Elsevier B.V. All rights reserved.

  9. Downscaling SSPs in Bangladesh - Integrating Science, Modelling and Stakeholders Through Qualitative and Quantitative Scenarios

    NASA Astrophysics Data System (ADS)

    Allan, A.; Barbour, E.; Salehin, M.; Hutton, C.; Lázár, A. N.; Nicholls, R. J.; Rahman, M. M.

    2015-12-01

    A downscaled scenario development process was adopted in the context of a project seeking to understand relationships between ecosystem services and human well-being in the Ganges-Brahmaputra delta. The aim was to link the concerns and priorities of relevant stakeholders with the integrated biophysical and poverty models used in the project. A 2-stage process was used to facilitate the connection between stakeholders' concerns and available modelling capacity: the first to qualitatively describe what the future might look like in 2050; the second to translate these qualitative descriptions into the quantitative form required by the numerical models. An extended, modified SSP approach was adopted, with stakeholders downscaling issues identified through interviews as being priorities for the southwest of Bangladesh. Detailed qualitative futures were produced, before modellable elements were quantified in conjunction with an expert stakeholder cadre. Stakeholder input, using the methods adopted here, allows the top-down focus of the RCPs to be aligned with the bottom-up approach needed to make the SSPs appropriate at the more local scale, and also facilitates the translation of qualitative narrative scenarios into a quantitative form that lends itself to incorporation of biophysical and socio-economic indicators. The presentation will describe the downscaling process in detail, and conclude with findings regarding the importance of stakeholder involvement (and logistical considerations), balancing model capacity with expectations and recommendations on SSP refinement at local levels.

  10. Downscaling SSPs in the GBM Delta - Integrating Science, Modelling and Stakeholders Through Qualitative and Quantitative Scenarios

    NASA Astrophysics Data System (ADS)

    Allan, Andrew; Barbour, Emily; Salehin, Mashfiqus; Munsur Rahman, Md.; Hutton, Craig; Lazar, Attila

    2016-04-01

    A downscaled scenario development process was adopted in the context of a project seeking to understand relationships between ecosystem services and human well-being in the Ganges-Brahmaputra delta. The aim was to link the concerns and priorities of relevant stakeholders with the integrated biophysical and poverty models used in the project. A 2-stage process was used to facilitate the connection between stakeholders' concerns and available modelling capacity: the first to qualitatively describe what the future might look like in 2050; the second to translate these qualitative descriptions into the quantitative form required by the numerical models. An extended, modified SSP approach was adopted, with stakeholders downscaling issues identified through interviews as being priorities for the southwest of Bangladesh. Detailed qualitative futures were produced, before modellable elements were quantified in conjunction with an expert stakeholder cadre. Stakeholder input, using the methods adopted here, allows the top-down focus of the RCPs to be aligned with the bottom-up approach needed to make the SSPs appropriate at the more local scale, and also facilitates the translation of qualitative narrative scenarios into a quantitative form that lends itself to incorporation of biophysical and socio-economic indicators. The presentation will describe the downscaling process in detail, and conclude with findings regarding the importance of stakeholder involvement (and logistical considerations), balancing model capacity with expectations and recommendations on SSP refinement at local levels.

  11. Effect of D2O on growth properties and chemical structure of annual ryegrass (Lolium multiflorum)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Evans, Barbara R; Bali, Garima; Reeves, David T

    2014-01-01

    In this paper, we report the production and detailed structural analysis of deuterium-enriched rye grass (Lolium multiflorum) for neutron scattering experiments. An efficient method to produce deuterated biomass was developed by designing hydroponic perfusion chambers. In preliminary studies, partially deuterated rye samples were grown in increasing levels of D2O to study the seed germination and the level of deuterium incorporation as a function of D2O concentration. A solution NMR method indicated 36.9 % deuterium incorporation in 50 % D2O grown annual rye samples, and a further significant increase in the deuterium incorporation level was observed by germinating the rye seedlings in H2O and growing in 50 % D2O inside the perfusion chambers. Moreover, in an effort to compare the substrate characteristics related to enzymatic hydrolysis of deuterated and protiated versions of biomass, annual rye grown in 50 % D2O was selected for detailed biomass characterization studies. The compositional analyses, degree of polymerization and cellulose crystallinity were compared with its protiated control. The cellulose molecular weight indicated slight variation with deuteration; however, hemicellulose molecular weights and cellulose crystallinity remain unaffected by the deuteration. Besides the minor differences in biomass components, the development of deuterated biomass for neutron scattering applications is essential to understand the complex biomass conversion processes.

  12. Presentation Of Comparative Data for Transportation Planning Studies

    DOT National Transportation Integrated Search

    1997-01-01

    Clear, yet detailed, presentations of transportation planning data to lay groups as well as to technical groups is becoming more and more of a necessity in the planning process. Presentation of technical information in understandable terms has become...

  13. Birth/death process model

    NASA Technical Reports Server (NTRS)

    Solloway, C. B.; Wakeland, W.

    1976-01-01

    First-order Markov model developed on digital computer for population with specific characteristics. System is user interactive, self-documenting, and does not require user to have complete understanding of underlying model details. Contains thorough error-checking algorithms on input and default capabilities.
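A first-order birth/death process model of the kind described can be sketched as a discrete-time simulation. The per-step birth and death probabilities below are illustrative assumptions, not parameters from the original system:

```python
import random

def simulate_birth_death(birth_rate, death_rate, n0, steps, seed=0):
    """Discrete-time sketch of a first-order birth/death process:
    each step, each individual independently reproduces with
    probability birth_rate and dies with probability death_rate."""
    rng = random.Random(seed)
    n = n0
    history = [n]
    for _ in range(steps):
        births = sum(rng.random() < birth_rate for _ in range(n))
        deaths = sum(rng.random() < death_rate for _ in range(n))
        n = max(0, n + births - deaths)  # population cannot go negative
        history.append(n)
    return history

# Illustrative run: 50 individuals, balanced birth and death rates.
history = simulate_birth_death(birth_rate=0.1, death_rate=0.1,
                               n0=50, steps=20)
```

A continuous-time treatment would instead draw exponential waiting times between events (the Gillespie approach); the discrete-time form above is only the simplest sketch.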

  14. Understanding how replication processes can maintain systems away from equilibrium using Algorithmic Information Theory.

    PubMed

    Devine, Sean D

    2016-02-01

    Replication can be envisaged as a computational process that is able to generate and maintain order far-from-equilibrium. Replication processes can self-regulate, as the drive to replicate can counter degradation processes that impact on a system. The capability of replicated structures to access high quality energy and eject disorder allows Landauer's principle, in conjunction with Algorithmic Information Theory, to quantify the entropy requirements to maintain a system far-from-equilibrium. Using Landauer's principle, where destabilising processes, operating under the second law of thermodynamics, change the information content or the algorithmic entropy of a system by ΔH bits, replication processes can access order, eject disorder, and counter the change without outside interventions. Both diversity in replicated structures, and the coupling of different replicated systems, increase the ability of the system (or systems) to self-regulate in a changing environment as adaptation processes select those structures that use resources more efficiently. At the level of the structure, as selection processes minimise the information loss, the irreversibility is minimised. While each structure that emerges can be said to be more entropically efficient, as such replicating structures proliferate, the dissipation of the system as a whole is higher than would be the case for inert or simpler structures. While a detailed application to most real systems would be difficult, the approach may well be useful in understanding incremental changes to real systems and provide broad descriptions of system behaviour. Copyright © 2016 The Author. Published by Elsevier Ireland Ltd. All rights reserved.
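Landauer's principle, invoked above to quantify the entropy requirements of maintaining order, bounds the minimum energy needed to erase ΔH bits of information at k_B·T·ln 2 joules per bit. A minimal numeric sketch:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def landauer_cost(delta_h_bits, temperature_k):
    """Minimum thermodynamic cost in joules of erasing delta_h_bits
    of information at a given temperature, per Landauer's principle:
    E >= k_B * T * ln(2) per bit."""
    return delta_h_bits * K_B * temperature_k * math.log(2)

# Erasing 1 bit at room temperature (300 K): roughly 2.87e-21 J.
e = landauer_cost(1, 300.0)
```

This is a lower bound; real dissipative systems, including the replicating structures discussed above, operate well above it.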

  15. Increased interestingness of extraneous details in a multimedia science presentation leads to decreased learning.

    PubMed

    Mayer, Richard E; Griffith, Emily; Jurkowitz, Ilana T N; Rothman, Daniel

    2008-12-01

    In Experiment 1, students received an illustrated booklet, PowerPoint presentation, or narrated animation that explained 6 steps in how a cold virus infects the human body. The material included 6 high-interest details mainly about the role of viruses in sex or death (high group) or 6 low-interest details consisting of facts and health tips about viruses (low group). The low group outperformed the high group across all 3 media on a subsequent test of problem-solving transfer (d = .80) but not retention (d = .05). In Experiment 2, students who studied a PowerPoint lesson explaining the steps in how digestion works performed better on a problem-solving transfer test if the lesson contained 7 low-interest details rather than 7 high-interest details (d = .86), but the groups did not differ on retention (d = .26). In both experiments, as the interestingness of details was increased, student understanding decreased (as measured by transfer). Results are consistent with a cognitive theory of multimedia learning, in which highly interesting details sap processing capacity away from deeper cognitive processing of the core material during learning. PsycINFO Database Record (c) 2008 APA, all rights reserved.
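
    The effect sizes reported above (e.g., d = .80) are Cohen's d values: the difference in group means divided by the pooled standard deviation. A minimal sketch of the computation, using illustrative numbers rather than the study's data:

```python
import math

def cohens_d(m1, s1, n1, m2, s2, n2):
    """Cohen's d using the pooled standard deviation of two groups."""
    sp = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    return (m1 - m2) / sp

# Illustrative numbers only (not the study's data): low-interest group
# scoring 7.0 (SD 2.5) vs high-interest group 5.0 (SD 2.5), n = 30 each.
d = cohens_d(7.0, 2.5, 30, 5.0, 2.5, 30)
print(round(d, 2))  # 0.8
```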

  16. A Homegrown Design for Data Warehousing: A District Customizes Its Own Process for Generating Detailed Information about Students in Real Time

    ERIC Educational Resources Information Center

    Thompson, Terry J.; Gould, Karen J.

    2005-01-01

    In recent years the Metropolitan School District of Wayne Township in Indianapolis has been awash in data. In attempts to improve levels of student achievement, the authors collected all manner of statistical details about students and schools and attempted to perform data analysis as part of the school improvement process. The authors were never…

  17. A synchrotron-based local computed tomography combined with data-constrained modelling approach for quantitative analysis of anthracite coal microstructure

    PubMed Central

    Chen, Wen Hao; Yang, Sam Y. S.; Xiao, Ti Qiao; Mayo, Sherry C.; Wang, Yu Dan; Wang, Hai Peng

    2014-01-01

    Quantifying three-dimensional spatial distributions of pores and material compositions in samples is a key materials characterization challenge, particularly in samples where compositions are distributed across a range of length scales and where such compositions have similar X-ray absorption properties, as in coal. Consequently, obtaining detailed information within sub-regions of a multi-length-scale sample by conventional approaches may not provide the desired resolution and level of detail. Herein, an approach for quantitative high-definition determination of material compositions from X-ray local computed tomography combined with a data-constrained modelling method is proposed. The approach can dramatically improve spatial resolution, revealing finer detail within a region of interest of a sample larger than the field of view than conventional techniques can achieve. A coal sample containing distributions of porosity and several mineral compositions is employed to demonstrate the approach, and the optimal experimental parameters are pre-analyzed. The quantitative results demonstrate that the approach reveals significantly finer details of compositional distributions in the sample region of interest. The elevated spatial resolution is crucial for coal-bed methane reservoir evaluation and for understanding the transformation of minerals during coal processing. The method is generic and can be applied to three-dimensional compositional characterization of other materials. PMID:24763649

  18. Multiscale modeling of mucosal immune responses

    PubMed Central

    2015-01-01

    Computational modeling techniques are playing increasingly important roles in advancing a systems-level mechanistic understanding of biological processes. Computer simulations guide and underpin experimental and clinical efforts. This study presents the ENteric Immune Simulator (ENISI), a multiscale modeling tool for modeling mucosal immune responses. ENISI's modeling environment can simulate in silico experiments from molecular signaling pathways to tissue-level events such as tissue lesion formation. ENISI's architecture integrates multiple modeling technologies, including ABM (agent-based modeling), ODE (ordinary differential equations), SDE (stochastic differential equations), and PDE (partial differential equations). This paper focuses on the implementation and developmental challenges of ENISI. A multiscale model of mucosal immune responses during colonic inflammation, including CD4+ T cell differentiation and tissue-level cell-cell interactions, was developed to illustrate the capabilities, power, and scope of ENISI MSM. Background Computational techniques are becoming increasingly powerful, and modeling tools for biological systems are in ever greater need. Biological systems are inherently multiscale, from molecules to tissues and from nanoseconds to a lifespan of several years or decades. ENISI MSM integrates multiple modeling technologies to understand immunological processes from signaling pathways within cells to lesion formation at the tissue level. This paper examines and summarizes the technical details of ENISI, from its initial version to its latest cutting-edge implementation. Implementation An object-oriented programming approach is adopted to develop a suite of tools based on ENISI. Multiple modeling technologies are integrated to visualize tissues, cells, and proteins; furthermore, performance matching between the scales is addressed.
Conclusion We used ENISI MSM to develop predictive multiscale models of the mucosal immune system during gut inflammation. Our modeling predictions dissect the mechanisms by which effector CD4+ T cell responses contribute to tissue damage in the gut mucosa following immune dysregulation. PMID:26329787
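
    The ABM/ODE coupling described above can be caricatured in a few lines. This is a minimal hypothetical sketch, not ENISI code: each agent (a CD4+ T cell) carries an intracellular signal integrated by an ODE, while an agent-level rule triggers differentiation; all parameters are illustrative.

```python
DT = 0.1           # ODE integration step
K_PROD, K_DECAY = 1.0, 0.5
THRESHOLD = 1.5    # hypothetical differentiation threshold

class TCell:
    def __init__(self):
        self.x = 0.0          # intracellular signal level (ODE scale)
        self.state = "naive"  # agent state (ABM scale)

    def ode_step(self, stimulus):
        # Forward-Euler step of dx/dt = K_PROD*stimulus - K_DECAY*x
        self.x += DT * (K_PROD * stimulus - K_DECAY * self.x)

    def rule_step(self):
        # Agent-level rule: differentiate once the signal crosses threshold
        if self.state == "naive" and self.x > THRESHOLD:
            self.state = "effector"

cells = [TCell() for _ in range(100)]
for _ in range(200):                  # 20 time units
    for c in cells:
        c.ode_step(stimulus=1.0)      # constant inflammatory stimulus
        c.rule_step()

# With constant stimulus, x approaches K_PROD/K_DECAY = 2.0 > THRESHOLD,
# so every cell eventually differentiates.
print(sum(c.state == "effector" for c in cells))  # 100
```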

  19. Multiscale modeling of mucosal immune responses.

    PubMed

    Mei, Yongguo; Abedi, Vida; Carbo, Adria; Zhang, Xiaoying; Lu, Pinyi; Philipson, Casandra; Hontecillas, Raquel; Hoops, Stefan; Liles, Nathan; Bassaganya-Riera, Josep

    2015-01-01

    Computational techniques are becoming increasingly powerful, and modeling tools for biological systems are in ever greater need. Biological systems are inherently multiscale, from molecules to tissues and from nanoseconds to a lifespan of several years or decades. ENISI MSM integrates multiple modeling technologies to understand immunological processes from signaling pathways within cells to lesion formation at the tissue level. This paper examines and summarizes the technical details of ENISI, from its initial version to its latest cutting-edge implementation. An object-oriented programming approach is adopted to develop a suite of tools based on ENISI. Multiple modeling technologies are integrated to visualize tissues, cells, and proteins; furthermore, performance matching between the scales is addressed. We used ENISI MSM to develop predictive multiscale models of the mucosal immune system during gut inflammation. Our modeling predictions dissect the mechanisms by which effector CD4+ T cell responses contribute to tissue damage in the gut mucosa following immune dysregulation. Computational modeling techniques are playing increasingly important roles in advancing a systems-level mechanistic understanding of biological processes. Computer simulations guide and underpin experimental and clinical efforts. This study presents the ENteric Immune Simulator (ENISI), a multiscale modeling tool for modeling mucosal immune responses. ENISI's modeling environment can simulate in silico experiments from molecular signaling pathways to tissue-level events such as tissue lesion formation. ENISI's architecture integrates multiple modeling technologies, including ABM (agent-based modeling), ODE (ordinary differential equations), SDE (stochastic differential equations), and PDE (partial differential equations). This paper focuses on the implementation and developmental challenges of ENISI. A multiscale model of mucosal immune responses during colonic inflammation, including CD4+ T cell differentiation and tissue-level cell-cell interactions, was developed to illustrate the capabilities, power, and scope of ENISI MSM.

  20. [The concept of "understanding" (Verstehen) in Karl Jaspers].

    PubMed

    Villareal, Helena; Aragona, Massimiliano

    2014-01-01

    This article explores the relationship between empathy and psychopathology. It deals with the concept of "understanding" in Jaspers' General Psychopathology, 100 years after the publication of its first edition. The Jaspersian proposal has the person and his/her experience as its primary object of study, just as in Ortega's vital reason. Jaspers' understanding is not rational but empathetic, based on the co-presence of emotional content and detailed descriptions. Jaspers' methodology is essentially pluralistic, considering both explanation and understanding as necessary for psychopathology. Despite certain limits, the concept of understanding is the backbone of psychopathological reasoning and has proven useful over a century of clinical practice. However, it needs a revision that takes into account recent epistemological and clinical findings. "To be understandable" is a relational property that emerges from a semiotic process. Therefore, an effective psychology should encompass an inter-subjective process and move away from strict rationalism.

  1. Transcriptomic Crosstalk between Fungal Invasive Pathogens and Their Host Cells: Opportunities and Challenges for Next-Generation Sequencing Methods

    PubMed Central

    Enguita, Francisco J.; Costa, Marina C.; Fusco-Almeida, Ana Marisa; Mendes-Giannini, Maria José; Leitão, Ana Lúcia

    2016-01-01

    Fungal invasive infections are an increasing health problem. The intrinsic complexity of pathogenic fungi and the unmet clinical need for new and more effective treatments requires a detailed knowledge of the infection process. During infection, fungal pathogens are able to trigger a specific transcriptional program in their host cells. The detailed knowledge of this transcriptional program will allow for a better understanding of the infection process and consequently will help in the future design of more efficient therapeutic strategies. Simultaneous transcriptomic studies of pathogen and host by high-throughput sequencing (dual RNA-seq) is an unbiased protocol to understand the intricate regulatory networks underlying the infectious process. This protocol is starting to be applied to the study of the interactions between fungal pathogens and their hosts. To date, our knowledge of the molecular basis of infection for fungal pathogens is still very limited, and the putative role of regulatory players such as non-coding RNAs or epigenetic factors remains elusive. The wider application of high-throughput transcriptomics in the near future will help to understand the fungal mechanisms for colonization and survival, as well as to characterize the molecular responses of the host cell against a fungal infection. PMID:29376924

  2. Factors that affect implementation of a nurse staffing directive: results from a qualitative multi-case evaluation.

    PubMed

    Robinson, Claire H; Annis, Ann M; Forman, Jane; Krein, Sarah L; Yankey, Nicholas; Duffy, Sonia A; Taylor, Beth; Sales, Anne E

    2016-08-01

    To assess implementation of the Veterans Health Administration staffing methodology directive. In 2010 the Veterans Health Administration promulgated a staffing methodology directive for inpatient nursing units to address staffing and budget forecasting. A qualitative multi-case evaluation approach assessed staffing methodology implementation. Semi-structured telephone interviews were conducted from March - June 2014 with Nurse Executives and their teams at 21 facilities. Interviews focused on the budgeting process, implementation experiences, use of data, leadership support, and training. An implementation score was created for each facility using a 4-point rating scale. The scores were used to select three facilities (low, medium and high implementation) for more detailed case studies. After analysing interview summaries, the evaluation team developed a four domain scoring structure: (1) integration of staffing methodology into budget development; (2) implementation of the Directive elements; (3) engagement of leadership and staff; and (4) use of data to support the staffing methodology process. The high implementation facility had leadership understanding and endorsement of staffing methodology, confidence in and ability to work with data, and integration of staffing methodology results into the budgeting process. The low implementation facility reported poor leadership engagement and little understanding of data sources and interpretation. Implementation varies widely across facilities. Implementing staffing methodology in facilities with complex and changing staffing needs requires substantial commitment at all organizational levels especially for facilities that have traditionally relied on historical levels to budget for staffing. Published 2016. This article is a U.S. Government work and is in the public domain in the USA.

  3. Autonomic nervous system correlates in movement observation and motor imagery

    PubMed Central

    Collet, C.; Di Rienzo, F.; El Hoyek, N.; Guillot, A.

    2013-01-01

    The purpose of the current article is to provide a comprehensive overview of the literature offering a better understanding of the autonomic nervous system (ANS) correlates of motor imagery (MI) and movement observation. These are two high brain functions involving sensori-motor coupling, mediated by memory systems. How observing or mentally rehearsing a movement affects ANS activity has not been extensively investigated. The links between cognitive functions and ANS responses are not so obvious. We will first describe the organization of the ANS, whose main purposes are controlling vital functions by maintaining the homeostasis of the organism and providing adaptive responses when changes occur in either the external or internal milieu. We will then review how scientific knowledge evolved, integrating recent findings related to ANS functioning, and show how these are linked to mental functions. In turn, we will describe how movement observation or MI may elicit physiological responses at the peripheral level of the autonomic effectors, thus eliciting autonomic correlates of cognitive activity. A key feature of this paper is its step-by-step progression from the understanding of ANS physiology to its relationships with high mental processes such as movement observation or MI. We will further provide evidence that mental processes are co-programmed both at the somatic and autonomic levels of the central nervous system (CNS). We will thus detail how peripheral physiological responses may be analyzed to provide objective evidence that MI is actually performed. The main perspective is thus to consider that, during movement observation and MI, ANS activity is an objective witness of mental processes. PMID:23908623

  4. Atoms of recognition in human and computer vision.

    PubMed

    Ullman, Shimon; Assif, Liav; Fetaya, Ethan; Harari, Daniel

    2016-03-08

    Discovering the visual features and representations used by the brain to recognize objects is a central problem in the study of vision. Recently, neural network models of visual object recognition, including biological and deep network models, have shown remarkable progress and have begun to rival human performance in some challenging tasks. These models are trained on image examples and learn to extract features and representations and to use them for categorization. It remains unclear, however, whether the representations and learning processes discovered by current models are similar to those used by the human visual system. Here we show, by introducing and using minimal recognizable images, that the human visual system uses features and processes that are not used by current models and that are critical for recognition. We found by psychophysical studies that at the level of minimal recognizable images a minute change in the image can have a drastic effect on recognition, thus identifying features that are critical for the task. Simulations then showed that current models cannot explain this sensitivity to precise feature configurations and, more generally, do not learn to recognize minimal images at a human level. The role of the features shown here is revealed uniquely at the minimal level, where the contribution of each feature is essential. A full understanding of the learning and use of such features will extend our understanding of visual recognition and its cortical mechanisms and will enhance the capacity of computational models to learn from visual experience and to deal with recognition and detailed image interpretation.

  5. Collecting conditions usage metadata to optimize current and future ATLAS software and processing

    NASA Astrophysics Data System (ADS)

    Rinaldi, L.; Barberis, D.; Formica, A.; Gallas, E. J.; Oda, S.; Rybkin, G.; Verducci, M.; ATLAS Collaboration

    2017-10-01

    Conditions data (for example: alignment, calibration, data quality) are used extensively in the processing of real and simulated data in ATLAS. The volume and variety of the conditions data needed by different types of processing are quite diverse, so optimizing access to them requires a careful understanding of conditions usage patterns. These patterns can be quantified by mining representative log files from each type of processing and gathering detailed information about conditions usage for that type of processing into a central repository.
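
    The mining step described above can be sketched in a few lines: scan representative job logs for conditions-folder accesses and aggregate per-folder counts. The log format and folder names here are illustrative assumptions, not ATLAS's actual formats:

```python
from collections import Counter
import re

# Hypothetical log excerpt; the line format and folder names are invented
# for illustration only.
LOG = """\
INFO loading conditions folder /LAR/Align run=358031
INFO loading conditions folder /MUON/Calib run=358031
INFO loading conditions folder /LAR/Align run=358032
"""

FOLDER_RE = re.compile(r"conditions folder (\S+)")

def mine_usage(log_text):
    """Return a Counter of conditions-folder access counts."""
    return Counter(FOLDER_RE.findall(log_text))

usage = mine_usage(LOG)           # aggregate usage for this job type
print(usage["/LAR/Align"])  # 2
```

    In a real pipeline, counters from many jobs would be merged (Counter supports `+`) and uploaded to the central repository.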

  6. Image processing techniques revealing the relationship between the field-measured ambient gamma dose equivalent rate and geological conditions at a granitic area, Velence Mountains, Hungary

    NASA Astrophysics Data System (ADS)

    Beltran Torres, Silvana; Petrik, Attila; Zsuzsanna Szabó, Katalin; Jordan, Gyozo; Szabó, Csaba

    2017-04-01

    In order to estimate the annual dose that the public receives from natural radioactivity, the identification of potential risk areas is required, which, in turn, necessitates understanding the relationship between the spatial distribution of natural radioactivity and geogenic risk factors (e.g., rock types, dykes, faults, soil conditions, etc.). A detailed spatial analysis of the ambient gamma dose equivalent rate was performed on the western side of the Velence Mountains, the largest outcropped granitic area in Hungary. In order to assess the role of local geology in the spatial distribution of ambient gamma dose rates, field measurements were carried out at ground level at 300 sites along a 250 m x 250 m regular grid over a total area of 14.7 km2. Digital image processing methods were applied to identify anomalies, heterogeneities and spatial patterns in the measured gamma dose rates, including local maxima and minima determination, digital cross sections, gradient magnitude and gradient direction, second-derivative profile curvature, local variability, lineament density, 2D autocorrelation and directional variogram analyses. Statistical inference showed that different gamma dose rate levels are associated with the rock types (i.e., Carboniferous granite; Pleistocene colluvial, proluvial and deluvial sediments and talus; and Pannonian sand and pebble), with the highest level, including outlying values, on the Carboniferous granite. Moreover, digital image processing revealed that linear gamma dose rate spatial features are parallel to the SW-NE dyke system and possibly to the NW-SE main fractures. The results of this study underline the importance of understanding the role of geogenic risk factors in influencing the ambient gamma dose rate received by the public. The study also demonstrates the power of image processing techniques for the identification of spatial patterns in field-measured geogenic radiation.
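
    Two of the operations listed above, gradient magnitude and gradient direction, reduce to finite differences on the gridded field. A minimal sketch with NumPy, using a toy field rather than the survey data; the 250 m spacing matches the survey grid:

```python
import numpy as np

SPACING = 250.0  # metres between grid nodes, as in the survey grid

def gradient_maps(field, spacing=SPACING):
    """Return (magnitude, direction) maps of a 2-D gridded field."""
    gy, gx = np.gradient(field, spacing)        # per-metre differences
    magnitude = np.hypot(gx, gy)                # |grad|
    direction = np.degrees(np.arctan2(gy, gx))  # 0 deg points along +x
    return magnitude, direction

# Toy field: dose rate increasing linearly eastward at 0.01 units/m.
x = np.arange(5) * SPACING
field = np.tile(0.01 * x, (5, 1))
mag, direc = gradient_maps(field)
print(round(float(mag[2, 2]), 4))  # 0.01
```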

  7. Thermographic Measurements of the Commercial Laser Powder Bed Fusion Process at NIST

    PubMed Central

    Lane, Brandon; Moylan, Shawn; Whitenton, Eric; Ma, Li

    2016-01-01

    Measurement of the high-temperature melt pool region in the laser powder bed fusion (L-PBF) process is a primary focus of researchers to further understand the dynamic physics of the heating, melting, adhesion, and cooling which define this commercially popular additive manufacturing process. This paper will detail the design, execution, and results of high speed, high magnification in-situ thermographic measurements conducted at the National Institute of Standards and Technology (NIST) focusing on the melt pool region of a commercial L-PBF process. Multiple phenomena are observed including plasma plume and hot particle ejection from the melt region. The thermographic measurement process will be detailed with emphasis on the ‘measurability’ of observed phenomena and the sources of measurement uncertainty. Further discussion will relate these thermographic results to other efforts at NIST towards L-PBF process finite element simulation and development of in-situ sensing and control methodologies. PMID:28058036

  8. Thermographic Measurements of the Commercial Laser Powder Bed Fusion Process at NIST.

    PubMed

    Lane, Brandon; Moylan, Shawn; Whitenton, Eric; Ma, Li

    2016-01-01

    Measurement of the high-temperature melt pool region in the laser powder bed fusion (L-PBF) process is a primary focus of researchers to further understand the dynamic physics of the heating, melting, adhesion, and cooling which define this commercially popular additive manufacturing process. This paper will detail the design, execution, and results of high speed, high magnification in-situ thermographic measurements conducted at the National Institute of Standards and Technology (NIST) focusing on the melt pool region of a commercial L-PBF process. Multiple phenomena are observed including plasma plume and hot particle ejection from the melt region. The thermographic measurement process will be detailed with emphasis on the 'measurability' of observed phenomena and the sources of measurement uncertainty. Further discussion will relate these thermographic results to other efforts at NIST towards L-PBF process finite element simulation and development of in-situ sensing and control methodologies.

  9. Evolutionary cell biology: functional insight from "endless forms most beautiful".

    PubMed

    Richardson, Elisabeth; Zerr, Kelly; Tsaousis, Anastasios; Dorrell, Richard G; Dacks, Joel B

    2015-12-15

    In animal and fungal model organisms, the complexities of cell biology have been analyzed in exquisite detail and much is known about how these organisms function at the cellular level. However, the model organisms cell biologists generally use include only a tiny fraction of the true diversity of eukaryotic cellular forms. The divergent cellular processes observed in these more distant lineages are still largely unknown in the general scientific community. Despite the relative obscurity of these organisms, comparative studies of them across eukaryotic diversity have had profound implications for our understanding of fundamental cell biology in all species and have revealed the evolution and origins of previously observed cellular processes. In this Perspective, we will discuss the complexity of cell biology found across the eukaryotic tree, and three specific examples of where studies of divergent cell biology have altered our understanding of key functional aspects of mitochondria, plastids, and membrane trafficking. © 2015 Richardson et al. This article is distributed by The American Society for Cell Biology under license from the author(s). Two months after publication it is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).

  10. Dissecting cancer evolution at the macro-heterogeneity and micro-heterogeneity scale.

    PubMed

    Barber, Louise J; Davies, Matthew N; Gerlinger, Marco

    2015-02-01

    Intratumour heterogeneity complicates biomarker discovery and treatment personalization, and pervasive cancer evolution is a key mechanism leading to therapy failure and patient death. Thus, understanding subclonal heterogeneity architectures and cancer evolution processes is critical for the development of effective therapeutic approaches which can control or thwart cancer evolutionary plasticity. Current insights into heterogeneity are mainly limited to the macroheterogeneity level, established by cancer subclones that have undergone significant clonal expansion. Novel single-cell sequencing and blood-based subclonal tracking technologies are enabling detailed insights into microheterogeneity and the dynamics of clonal evolution. We assess how this starts to delineate the rules governing cancer evolution and novel angles for more effective therapeutic intervention. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.

  11. Production, properties, and industrial food application of lactic acid bacteria-derived exopolysaccharides.

    PubMed

    Zannini, Emanuele; Waters, Deborah M; Coffey, Aidan; Arendt, Elke K

    2016-02-01

    Exopolysaccharide (EPS)-producing lactic acid bacteria (LAB) are industrially important microorganisms in the development of functional food products and are used as starter cultures or coadjutants to develop fermented foods. There is large variability in EPS production by LAB in terms of chemical composition, quantity, molecular size, charge, presence of side chains, and rigidity of the molecules. The main body of the review covers practical aspects of the structural diversity of EPS, and their concrete applications in the food industry are reported in detail. To strengthen the food applications and process feasibility of LAB EPS at the industrial level, future academic research should be combined with industrial input to understand the technical shortfalls that EPS can address.

  12. Analysis of user activities on popular medical forums

    NASA Astrophysics Data System (ADS)

    Kamalov, M. V.; Dobrynin, V. Y.; Balykina, Y. E.; Martynov, R. S.

    2017-10-01

    The paper is devoted to a detailed investigation of users' behavior and level of expertise on online medical forums. Two popular forums were analyzed in terms of the presence of experts who answer health-related questions and participate in discussions. This study provides insight into the quality of medical information that one can get from web resources, and also illustrates the relationship between approved medical experts and popular authors of the considered forums. During the experiments, several machine learning and natural language processing methods were evaluated against the available web content to further understand the structure and distribution of medical information available online today. As a result of this study, the hypothesis of a correlation between approved medical experts and popular authors was rejected.
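
    One simple way to quantify the expert/popular-author relationship such a study tests is set overlap. A hypothetical sketch (the usernames are invented, and the paper does not state which overlap measure it used):

```python
# Measure overlap between the set of approved medical experts and the
# set of popular authors with the Jaccard index: |A ∩ B| / |A ∪ B|.
# All usernames below are invented for illustration.
experts = {"dr_ivanov", "dr_petrova", "md_sidorov"}
popular = {"user123", "dr_petrova", "healthfan"}

jaccard = len(experts & popular) / len(experts | popular)
print(round(jaccard, 2))  # 0.2
```

    A Jaccard index near 0 would be consistent with the study's conclusion that popular authors and approved experts are largely distinct groups.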

  13. Computational models of neuromodulation.

    PubMed

    Fellous, J M; Linster, C

    1998-05-15

    Computational modeling of neural substrates provides an excellent theoretical framework for the understanding of the computational roles of neuromodulation. In this review, we illustrate, with a large number of modeling studies, the specific computations performed by neuromodulation in the context of various neural models of invertebrate and vertebrate preparations. We base our characterization of neuromodulations on their computational and functional roles rather than on anatomical or chemical criteria. We review the main framework in which neuromodulation has been studied theoretically (central pattern generation and oscillations, sensory processing, memory and information integration). Finally, we present a detailed mathematical overview of how neuromodulation has been implemented at the single cell and network levels in modeling studies. Overall, neuromodulation is found to increase and control computational complexity.
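
    A common single-cell implementation of neuromodulation in such models is multiplicative gain control of the neuron's transfer function. A minimal hypothetical sketch (the parameters are illustrative, not drawn from the review):

```python
THETA = 0.25  # firing threshold (arbitrary units)

def firing_rate(I, m, base_slope=10.0, mod_gain=1.0):
    """Rectified-linear rate whose gain is scaled by modulator level m."""
    slope = base_slope * (1.0 + mod_gain * m)  # neuromodulated gain
    return slope * max(I - THETA, 0.0)         # threshold-linear transfer

# Raising the modulator level from 0 to 1 doubles the response gain,
# illustrating how neuromodulation "increases and controls" complexity
# without changing the underlying circuit.
print(firing_rate(0.75, 0.0))  # 5.0
print(firing_rate(0.75, 1.0))  # 10.0
```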

  14. The impact of "omic" and imaging technologies on assessing the host immune response to biodefence agents.

    PubMed

    Tree, Julia A; Flick-Smith, Helen; Elmore, Michael J; Rowland, Caroline A

    2014-01-01

    Understanding the interactions between host and pathogen is important for the development and assessment of medical countermeasures to infectious agents, including potential biodefence pathogens such as Bacillus anthracis, Ebola virus, and Francisella tularensis. This review focuses on technological advances which allow this interaction to be studied in much greater detail. Namely, the use of "omic" technologies (next generation sequencing, DNA, and protein microarrays) for dissecting the underlying host response to infection at the molecular level; optical imaging techniques (flow cytometry and fluorescence microscopy) for assessing cellular responses to infection; and biophotonic imaging for visualising the infectious disease process. All of these technologies hold great promise for important breakthroughs in the rational development of vaccines and therapeutics for biodefence agents.

  15. Investigating Magmatic Processes in the Lower Levels of Mantle-derived Magmatic Systems: The Age & Emplacement of the Kunene Anorthosite Complex (SW Angola)

    NASA Astrophysics Data System (ADS)

    Hayes, B.; Bybee, G. M.; Owen-Smith, T.; Lehmann, J.; Brower, A. M.; Ashwal, L. D.; Hill, C. M.

    2017-12-01

    Our understanding of mantle-derived magmatic systems has shifted from a notion of upper crustal, melt-dominated magma chambers that feed short-lived volcanic eruptions, to a view of more long-lived trans-crustal, mush-dominated systems. Proterozoic massif-type anorthosite systems are voluminous, plagioclase-dominated plutonic suites with ubiquitous intermediate compositions (An 50 ± 10) that represent mantle-derived magmas initially ponded at Moho depths and crystallized polybarically until emplacement at mid-crustal levels. Thus, these systems provide unique insight into magma storage and processing in the lower reaches of the magma mush column, where such interpretation has previously relied on cumulate xenoliths in lavas, geophysical data and experimental/numerical modeling. We present new CA-ID-TIMS ages and a series of detailed field observations from the largest Proterozoic anorthosite massif on Earth, the Kunene Anorthosite Complex (KAC) of SW Angola. Field structures indicate that (i) the bulk of the material was emplaced in the form of crystal mushes, as both plutons and sheet-like intrusions; (ii) prolonged magmatism led to cumulate disaggregation (block structure development) and remobilization, producing considerable textural heterogeneity; (iii) crystal-rich magmatic flow induced localized recrystallization and the development of protoclastic (mortar) textures; and (iv) late residual melts were able to migrate locally prior to complete solidification. Dating of pegmatitic pods entrained from cumulate zones at the base of the crust (1500 ± 13 Ma) and their host anorthosites (1375-1438 Ma) reveals time periods in the range of 60-120 Myr between the earliest products of the system and the final mushes emplaced at higher crustal levels. Therefore, the KAC represents a complex, mushy magmatic system that developed over a long period of time. 
Not only do these observations help in refining our understanding of Proterozoic anorthosite petrogenesis, they also allow us to place constraints on the types of magmatic processes that operate in the lower levels of other trans-crustal magmatic systems.

  16. Cause of vocal fold scar.

    PubMed

    Allen, Jacqui

    2010-12-01

    The prolonged debilitation, loss of income, and decrement in quality of life caused by vocal fold scar is exacerbated by our inability to successfully treat this difficult problem. As technology focuses on developing innovative treatments, we need to fully appreciate and understand the mechanisms giving rise to glottal scar, on both a macroscopic and microscopic level. This review examines recent literature pertaining to the gross and molecular mechanisms which give rise to vocal fold scar. Mechanisms of vocal fold scar production have been examined in both macroscopic and microscopic detail. Trauma and injury involving any aspect of the lamina propria, particularly the deeper layers, may result in epithelial tethering and scar formation. At the molecular level, early inflammatory cytokines activate and recruit fibroblasts which then drive the fibrotic cascade. Transforming growth factor-β enhances fibrosis and is balanced by tissue matrix metalloproteinases and hepatocyte growth factor activity. Molecular signaling offers novel opportunities to intervene in scar formation. New work investigating the cause of vocal fold scar identifies complex molecular processes leading to fibrosis in the lamina propria. Improved mechanistic understanding offers insight into prevention strategies and possible targets for antifibrotic therapies that may help prevent or treat this debilitating condition.

  17. Phoneme Similarity and Confusability

    ERIC Educational Resources Information Center

    Bailey, T.M.; Hahn, U.

    2005-01-01

    Similarity between component speech sounds influences language processing in numerous ways. Explanation and detailed prediction of linguistic performance consequently requires an understanding of these basic similarities. The research reported in this paper contrasts two broad classes of approach to the issue of phoneme similarity-theoretically…

  18. Process mining techniques: an application to time management

    NASA Astrophysics Data System (ADS)

    Khowaja, Ali Raza

    2018-04-01

In any environment, people must ensure that their work is completed on time and to an acceptable standard of quality. Realizing the full potential of process mining requires understanding these everyday processes in detail. Personal information and communication have long been prominent concerns on the internet; in daily life, information and communication tools capture people's schedules, location data, environmental context and, more generally, social media activity. These systems make data available for analysis through event logs, and also support process analysis that combines environmental and location information. Process mining can exploit such real-life processes using the event logs already present in these datasets, whether the data are sensed automatically or labeled by the users themselves. The resulting process models can be used to redesign a user's daily flow and to understand these processes in greater detail. Improving the quality of the processes we go through in our daily lives requires looking closely at each of them and, after analysis, making changes to obtain better results. In this study, we applied process mining techniques to a dataset combining seven subjects, collected in Korea. The paper comments on the efficiency of the processes in the event logs as they relate to time management.
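The bookkeeping at the heart of such an event-log analysis can be sketched in a few lines of Python: group events into per-subject traces, then measure how long each activity runs until the next event begins. The subjects, activities, and timestamps below are hypothetical, not taken from the Korean dataset.

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical event log: (case_id, activity, timestamp) triples, as one
# might extract from a daily-life time-management dataset.
events = [
    ("subj1", "wake_up", "2018-04-01 07:00"),
    ("subj1", "commute", "2018-04-01 08:00"),
    ("subj1", "work",    "2018-04-01 09:00"),
    ("subj1", "commute", "2018-04-01 18:00"),
    ("subj2", "wake_up", "2018-04-01 06:30"),
    ("subj2", "work",    "2018-04-01 08:30"),
]

def trace_durations(events):
    """Group events into per-case traces and total the hours spent in
    each activity (until the next event in the same trace; the last
    event of a trace has no end time and is dropped)."""
    traces = defaultdict(list)
    for case, act, ts in events:
        traces[case].append((datetime.strptime(ts, "%Y-%m-%d %H:%M"), act))
    durations = defaultdict(float)
    for trace in traces.values():
        trace.sort()
        for (t0, act), (t1, _) in zip(trace, trace[1:]):
            durations[act] += (t1 - t0).total_seconds() / 3600.0
    return dict(durations)

print(trace_durations(events))
```

From totals like these, one can ask the time-management questions the paper raises: which activities dominate the day, and where redesigning the flow would gain the most.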

  19. The levels of analysis revisited

    PubMed Central

    MacDougall-Shackleton, Scott A.

    2011-01-01

    The term levels of analysis has been used in several ways: to distinguish between ultimate and proximate levels, to categorize different kinds of research questions and to differentiate levels of reductionism. Because questions regarding ultimate function and proximate mechanisms are logically distinct, I suggest that distinguishing between these two levels is the best use of the term. Integrating across levels in research has potential risks, but many benefits. Consideration at one level can help generate novel hypotheses at the other, define categories of behaviour and set criteria that must be addressed. Taking an adaptationist stance thus strengthens research on proximate mechanisms. Similarly, it is critical for researchers studying adaptation and function to have detailed knowledge of proximate mechanisms that may constrain or modulate evolutionary processes. Despite the benefits of integrating across ultimate and proximate levels, failure to clearly identify levels of analysis, and whether or not hypotheses are exclusive alternatives, can create false debates. Such non-alternative hypotheses may occur between or within levels, and are not limited to integrative approaches. In this review, I survey different uses of the term levels of analysis and the benefits of integration, and highlight examples of false debate within and between levels. The best integrative biology reciprocally uses ultimate and proximate hypotheses to generate a more complete understanding of behaviour. PMID:21690126

  20. A detailed view on sulphur metabolism at the cellular and whole-plant level illustrates challenges in metabolite flux analyses.

    PubMed

    Rennenberg, Heinz; Herschbach, Cornelia

    2014-11-01

Understanding the dynamics of physiological processes in the systems biology era requires approaches at the genome, transcriptome, proteome, and metabolome levels. In this context, metabolite flux experiments have been used in mapping metabolite pathways and analysing metabolic control. In the present review, sulphur metabolism was taken to illustrate current challenges of metabolic flux analyses. At the cellular level, restrictions in metabolite flux analyses originate from incomplete knowledge of the compartmentation network of metabolic pathways. Transport of metabolites through membranes is usually not considered in flux experiments but may be involved in controlling the whole pathway. Hence, steady-state and snapshot readings need to be expanded to time-course studies in combination with compartment-specific metabolite analyses. Because of species-specific differences, differences between tissues, and stress-related responses, the quantitative significance of different sulphur sinks has to be elucidated; this requires the development of methods for whole-sulphur metabolome approaches. Different cell types can contribute to metabolite fluxes to different extents at the tissue and organ level. Cell type-specific analyses are needed to characterize these contributions. Based on such approaches, metabolite flux analyses can be expanded to the whole-plant level by considering long-distance transport and, thus, the interaction of roots and the shoot in metabolite fluxes. However, whole-plant studies need detailed empirical and mathematical modelling that has to be validated by experimental analyses.

  1. Safety impacts of reduced visibility in inclement weather.

    DOT National Transportation Integrated Search

    2017-04-04

    This research seeks to better understand the ramifications of inclement weather on safety from a perspective of visibility. Visibility conditions at the time of a crash are rarely documented at a high level of detail. While vision is a key component ...

  2. Planning for and surviving a BCM audit.

    PubMed

    Freestone, Mandy; Lee, Michael

    2008-01-01

    Business continuity management (BCM) is moving progressively higher up the agendas of boardroom executives due to growing regulator, insurer and investor interest in risk management and BCM activity. With increasing pressure across all sectors, BCM has become an integral part of any effective corporate governance framework. Boardroom executives and senior management are thus now expected to provide an appropriate level of business continuity preparedness to better protect shareholder, investor and other stakeholder interests. The purpose of this paper is to build a link across the 'chasm' that separates the auditee from the auditor. The paper attempts to illuminate understanding about the process undertaken by an auditor when reviewing the BCM process. It details the steps the BCM auditor typically undertakes, and provides practical guidance as to the types of documentation and other supporting evidence required during the process. Additionally, the paper attempts to dispel commonly-held misconceptions about the BCM audit process. Executives, senior management and BCM practitioners will all benefit from the practical guidance offered in this paper, to assist in planning for and surviving a BCM audit.

  3. LES ARM Symbiotic Simulation and Observation (LASSO) Implementation Strategy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gustafson Jr., WI; Vogelmann, AM

    2015-09-01

This document illustrates the design of the Large-Eddy Simulation (LES) ARM Symbiotic Simulation and Observation (LASSO) workflow to provide a routine, high-resolution modeling capability to augment the U.S. Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Climate Research Facility’s high-density observations. LASSO will create a powerful new capability for furthering ARM’s mission to advance understanding of cloud, radiation, aerosol, and land-surface processes. The combined observational and modeling elements will enable a new level of scientific inquiry by connecting processes and context to observations and providing needed statistics for details that cannot be measured. The result will be improved process understanding that facilitates concomitant improvements in climate model parameterizations. The initial LASSO implementation will be for ARM’s Southern Great Plains site in Oklahoma and will focus on shallow convection, which is poorly simulated by climate models due in part to clouds’ typically small spatial scale compared to model grid spacing, and because the convection involves complicated interactions of microphysical and boundary layer processes.

  4. The development of a hydrologic-hydraulic representation of an urbanscape: the case study of Nashville, Tennessee

    NASA Astrophysics Data System (ADS)

    Sedlar, F.; Ivanov, V. Y.; Shao, J.; Narayan, U.; Nardi, F.; Adams, T. E.; Merwade, V.; Wright, D. B.; Kim, J.; Fatichi, S.; Rakhmatulina, E.

    2013-12-01

Incorporating elevation data into coupled hydraulic and hydrologic models with the use of triangulated irregular networks (TINs) provides a detailed and highly customizable representation of the original domain. Until recently, the resolution of such digital elevation models was 1 or 1/3 arc second (10-30 meters). Aided by the use of LiDAR, digital elevation models are now available at 1/9 arc second resolution (1-3 meters). With elevation data at this resolution, watershed details that are overlooked at a 10-30 meter resolution can be resolved and incorporated into the TIN. For urban flood modeling this implies that street-level features can be resolved. However, to provide a useful picture of the flooding as a whole, these data need to be integrated at a citywide scale. To demonstrate the feasibility, process, and capabilities of generating such a detailed and large-scale TIN, we present a case study of Nashville, TN, USA, during the May 1-2, 2010 flooding, a 1,000-year storm event. With the use of ArcGIS, HEC-RAS, Triangle, and additionally developed processing methodologies, an approach is developed to generate a hydrologically relevant and detailed TIN of the entire urbanscape of Nashville. This TIN incorporates three separate aspects: the watershed, the floodplain, and the city. The watershed component contains the elevation data for the delineated watershed, roughly 1,000 km² at 1-3 meter resolution. The floodplain encompasses over 300 channel cross sections of the Cumberland River and a delineated floodplain. The city element comprises over 500,000 buildings and all major roadways within the watershed. Once generated, the resulting triangulation of the TIN is optimized with the Triangle software for input to the coupled hydraulic and hydrological model, tRIBS-OFM. Hydrologically relevant areas such as the floodplain are densified and constraints are set on the minimum triangle area for the entire TIN.
Upon running the coupled hydraulic and hydrological model with the appropriate forcings, the spatial dynamics of the flooding will then be resolved at a street level across the entire city. The analysis capabilities afforded at this resolution and across such a large area will facilitate urban flood predictions coupled with hydrologic forecasts as well as a better understanding of the spatial dynamics of urban flooding.
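The TIN pipeline above relies on Triangle and constrained refinement, which is beyond a short sketch, but the core step of triangulating scattered elevation points and sanity-checking the resulting mesh can be illustrated with SciPy's unconstrained Delaunay routine on a synthetic point cloud (all names and values here are illustrative only).

```python
import numpy as np
from scipy.spatial import Delaunay

# Synthetic stand-in for LiDAR-derived (x, y) point locations, in metres.
rng = np.random.default_rng(0)
pts = rng.uniform(0, 1000, size=(200, 2))

tri = Delaunay(pts)

def triangle_areas(points, simplices):
    """Area of each triangle in the mesh via the cross-product formula."""
    a = points[simplices[:, 0]]
    b = points[simplices[:, 1]]
    c = points[simplices[:, 2]]
    ab, ac = b - a, c - a
    return 0.5 * np.abs(ab[:, 0] * ac[:, 1] - ab[:, 1] * ac[:, 0])

areas = triangle_areas(pts, tri.simplices)
# A Delaunay mesh tiles the convex hull of the points exactly, so the
# summed triangle areas must equal the hull area -- a cheap sanity check
# before handing a TIN to a hydraulic model.
print(len(tri.simplices), round(areas.sum(), 2))
```

A constrained tool like Triangle adds what this sketch lacks: breaklines along channels and building footprints, and minimum-area/minimum-angle refinement in hydrologically important zones.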

  5. Process development for green part printing using binder jetting additive manufacturing

    NASA Astrophysics Data System (ADS)

    Miyanaji, Hadi; Orth, Morgan; Akbar, Junaid Muhammad; Yang, Li

    2018-05-01

Originally developed decades ago, the binder jetting additive manufacturing (BJ-AM) process possesses various advantages compared to other additive manufacturing (AM) technologies, such as broad material compatibility and technological expandability. However, the adoption of BJ-AM has been limited by the lack of fundamental understanding of the process principles and characteristics, as well as by the relatively few systematic design guidelines that are available. In this work, the process design considerations for BJ-AM in green part fabrication are discussed in detail in order to provide a comprehensive perspective of design for additive manufacturing for the process. Various process factors, including binder saturation, in-process drying, powder spreading, powder feedstock characteristics, binder characteristics and post-process curing, can significantly affect the printing quality of the green parts, such as geometrical accuracy and part integrity. For powder feedstock with low flowability, even though process parameters can be optimized to partially offset the printing feasibility issue, the quality of the green parts will be intrinsically limited due to the existence of large internal voids that are inaccessible to the binder. In addition, during process development, the balanced combination of saturation level and in-process drying is of critical importance in the quality control of the green parts.
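As a deliberately simplified illustration of the saturation parameter discussed above: binder saturation is commonly defined as the fraction of the powder bed's pore volume occupied by binder, so the binder volume to deposit per voxel follows directly from the bed's packing fraction. The function and all numbers here are hypothetical, not taken from the paper.

```python
# Binder saturation S = V_binder / V_pore, where V_pore is the void
# volume of the powder bed within a voxel. Values are hypothetical.
def binder_volume_per_voxel(voxel_mm3, packing_fraction, saturation):
    """Binder volume (mm^3) to deposit into one voxel of the powder bed."""
    pore_volume = voxel_mm3 * (1.0 - packing_fraction)
    return saturation * pore_volume

# Example: 100 um cubic voxel, 60% packed powder, 80% target saturation.
v = binder_volume_per_voxel(0.1 ** 3, 0.60, 0.80)
print(v)
```

At 80% saturation of a 60%-packed bed, each 0.001 mm³ voxel receives 0.00032 mm³ of binder; lowering saturation or adding in-process drying trades green-part strength against binder bleed and geometric accuracy, which is the balance the abstract highlights.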

  6. Neuroinflammation: The Devil is in the Details

    PubMed Central

    DiSabato, Damon; Quan, Ning; Godbout, Jonathan P.

    2016-01-01

    There is significant interest in understanding inflammatory responses within the brain and spinal cord. Inflammatory responses that are centralized within the brain and spinal cord are generally referred to as “neuroinflammatory”. Aspects of neuroinflammation vary within the context of disease, injury, infection or stress. The context, course, and duration of these inflammatory responses are all critical aspects in the understanding of these processes and their corresponding physiological, biochemical and behavioral consequences. Microglia, innate immune cells of the central nervous system (CNS), play key roles in mediating these neuroinflammatory responses. Because the connotation of neuroinflammation is inherently negative and maladaptive, the majority of research focus is on the pathological aspects of neuroinflammation. There are, however, several degrees of neuroinflammatory responses, some of which are positive. In many circumstances including CNS injury, there is a balance of inflammatory and intrinsic repair processes that influences functional recovery. In addition, there are several other examples where communication between the brain and immune system involves neuroinflammatory processes that are beneficial and adaptive. The purpose of this review is to distinguish different variations of neuroinflammation in a context-specific manner and detail both positive and negative aspects of neuroinflammatory processes. PMID:26990767

  7. Extreme values and the level-crossing problem: An application to the Feller process

    NASA Astrophysics Data System (ADS)

    Masoliver, Jaume

    2014-04-01

    We review the question of the extreme values attained by a random process. We relate it to level crossings to one boundary (first-passage problems) as well as to two boundaries (escape problems). The extremes studied are the maximum, the minimum, the maximum absolute value, and the range or span. We specialize in diffusion processes and present detailed results for the Wiener and Feller processes.
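For the Wiener process, the maximum is among the few extremes with a closed form: by the reflection principle, P(max_{0≤t≤T} W_t ≥ a) = 2 P(W_T ≥ a). A small Monte Carlo sketch (ours, not from the paper) checks the formula against discretized sample paths:

```python
import math
import numpy as np

def mc_max_exceed(a, T, n_steps=1000, n_paths=10000, seed=1):
    """Monte Carlo estimate of P(max_{0<=t<=T} W_t >= a) for a standard
    Wiener process, using discretized sample paths."""
    rng = np.random.default_rng(seed)
    dW = rng.normal(0.0, math.sqrt(T / n_steps), size=(n_paths, n_steps))
    running_max = np.cumsum(dW, axis=1).max(axis=1)
    return float((running_max >= a).mean())

a, T = 1.0, 1.0
# Reflection principle: P(max >= a) = 2 * (1 - Phi(a / sqrt(T)))
#                                   = erfc(a / sqrt(2 * T)).
analytic = math.erfc(a / math.sqrt(2.0 * T))
print(mc_max_exceed(a, T), round(analytic, 4))
```

The discrete-time estimate slightly underestimates the continuous-time probability (crossings between grid points are missed), a bias that shrinks as n_steps grows; the same construction extends to first-passage and escape problems by recording when a path first leaves an interval.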

  8. Cognitive Network Modeling as a Basis for Characterizing Human Communication Dynamics and Belief Contagion in Technology Adoption

    NASA Technical Reports Server (NTRS)

    Hutto, Clayton; Briscoe, Erica; Trewhitt, Ethan

    2012-01-01

    Societal level macro models of social behavior do not sufficiently capture nuances needed to adequately represent the dynamics of person-to-person interactions. Likewise, individual agent level micro models have limited scalability - even minute parameter changes can drastically affect a model's response characteristics. This work presents an approach that uses agent-based modeling to represent detailed intra- and inter-personal interactions, as well as a system dynamics model to integrate societal-level influences via reciprocating functions. A Cognitive Network Model (CNM) is proposed as a method of quantitatively characterizing cognitive mechanisms at the intra-individual level. To capture the rich dynamics of interpersonal communication for the propagation of beliefs and attitudes, a Socio-Cognitive Network Model (SCNM) is presented. The SCNM uses socio-cognitive tie strength to regulate how agents influence--and are influenced by--one another's beliefs during social interactions. We then present experimental results which support the use of this network analytical approach, and we discuss its applicability towards characterizing and understanding human information processing.

  9. Improving the appropriateness of antipsychotic prescribing in nursing homes: a mixed-methods process evaluation of an academic detailing intervention.

    PubMed

    Desveaux, L; Saragosa, M; Rogers, J; Bevan, L; Loshak, H; Moser, A; Feldman, S; Regier, L; Jeffs, L; Ivers, N M

    2017-05-26

In 2014, nursing home administration and government officials were facing increasing public and media scrutiny around the variation of antipsychotic medication (APM) prescribing across Ontario nursing homes. In response, policy makers partnered to test an academic detailing (AD) intervention to address appropriate prescribing of APM in nursing homes in a cluster-randomized trial. This mixed-methods study aimed to explore how and why the AD intervention may have resulted in changes in the nursing home context. The objectives were to understand how the intervention was implemented, explore contextual factors associated with implementation, and examine the impact of the intervention on prescribing. Administrative data for the primary outcome of the full randomized trial will not be available for a minimum of 1 year. Therefore, this paper reports the findings of a planned, quantitative interim trial analysis that assessed mean APM dose and prescribing prevalence at baseline, 3 months, and 6 months across 40 nursing homes (18 intervention, 22 control). Patient-level administrative data regarding prescribing were analyzed using generalized linear mixed effects regression. Semi-structured interviews were conducted with nursing home staff from the intervention group to explore opinions and experiences of the AD intervention. Interviews were analyzed using the framework method, with constructs from the Consolidated Framework for Implementation Research (CFIR) applied as pre-defined deductive codes. Open coding was applied when emerging themes did not align with CFIR constructs. Qualitative and quantitative findings were triangulated to examine points of divergence to understand how the intervention may work and to identify future opportunities and areas for improvement. No significant differences were observed in prescribing outcomes. A total of 22 interviews were conducted, with four academic detailers and 18 nursing home staff.
Constructs within the CFIR domains of Outer Setting, Inner Setting, and Characteristics of Individuals presented barriers to antipsychotic prescribing. Intervention Source, Evidence Strength and Quality, and Adaptability explained participant engagement in the AD intervention; nursing homes that exhibited a Tension for Change and Leadership Engagement reported positive changes in processes and communication. Participants described their experiences with the intervention against the backdrop of a range of factors that influence APM prescribing in nursing homes that exist at the system, facility, provider, and resident levels. In this context, the perceived credibility and flexibility of the intervention were critical features that explained engagement with and potential impact of the intervention. Development of a common language across the team to enable communication was reported as a proximal outcome that may eventually have an effect on APM prescribing rates. Process evaluations may be useful during early stages of evaluation to understand how the intervention is working and how it might work better. Qualitative results suggest the lack of early changes observed in prescribing may reflect the number of upstream factors that need to change for APM rates to decrease. ClinicalTrials.gov, NCT02604056.

  10. First Principles Simulations of Ice Nucleation at Metal Surfaces

    NASA Astrophysics Data System (ADS)

    Michaelides, Angelos

    2005-03-01

    Ice nucleation at solid surfaces is of relevance to countless scientific and technological processes. In particular the nucleation of ice nano-crystals on metal surfaces is often a key first step in cloud formation and corrosion [1]. Yet unfortunately this remains one of the most poorly understood natural phenomena; severely lacking in atomic level understanding. Here, we discuss detailed density functional theory studies aimed at putting our understanding of ice nucleation at metals on a much firmer footing. Specifically the properties of H2O hexamers - the smallest `building blocks' of ice - adsorbed on a number of close-packed transition metal surfaces have been examined. We find that the competing influences of substrate reactivity and hexamer-substrate epitaxial mismatch conspire to yield a rich variety of (novel) hexameric ice structures, some of which have been observed by recent scanning tunnelling microscopy experiments [2]. [1] H.R. Pruppacher and J.D. Klett, Microphysics of Clouds and Precipitation, (Kluwer, Dordrecht, 2003). [2] K. Morgenstern, et al., (To be published).

  11. Vesicular trafficking of immune mediators in human eosinophils revealed by immunoelectron microscopy.

    PubMed

    Melo, Rossana C N; Weller, Peter F

    2016-10-01

Electron microscopy (EM)-based techniques are mostly responsible for our current view of cell morphology at the subcellular level and continue to play an essential role in biological research. In cells from the immune system, such as eosinophils, EM has helped to understand how cells package and release mediators involved in immune responses. Ultrastructural investigations of human eosinophils enabled visualization of secretory processes in detail and identification of a robust, vesicular trafficking essential for the secretion of immune mediators via a non-classical secretory pathway associated with secretory (specific) granules. This vesicular system is mainly organized as large tubular-vesicular carriers (Eosinophil Sombrero Vesicles - EoSVs) actively formed in response to cell activation and provides a sophisticated structural mechanism for delivery of granule-stored mediators. In this review, we highlight the application of EM techniques to recognize pools of immune mediators at vesicular compartments and to understand the complex secretory pathway within human eosinophils involved in inflammatory and allergic responses.

  12. Plasmonic hot carrier dynamics in solid-state and chemical systems for energy conversion

    DOE PAGES

    Narang, Prineha; Sundararaman, Ravishankar; Atwater, Harry A.

    2016-06-11

Surface plasmons provide a pathway to efficiently absorb and confine light in metallic nanostructures, thereby bridging photonics to the nanoscale. The decay of surface plasmons generates energetic ‘hot’ carriers, which can drive chemical reactions or be injected into semiconductors for nano-scale photochemical or photovoltaic energy conversion. Novel plasmonic hot carrier devices and architectures continue to be demonstrated, but the complexity of the underlying processes makes a complete microscopic understanding of all the mechanisms and design considerations for such devices extremely challenging. Here, we review the theoretical and computational efforts to understand and model plasmonic hot carrier devices. We split the problem into three steps: hot carrier generation, transport and collection, and review theoretical approaches with the appropriate level of detail for each step along with their predictions. As a result, we identify the key advances necessary to complete the microscopic mechanistic picture and facilitate the design of the next generation of devices and materials for plasmonic energy conversion.

  13. First principles molecular dynamics of metal/water interfaces under bias potential

    NASA Astrophysics Data System (ADS)

    Pedroza, Luana; Brandimarte, Pedro; Rocha, Alexandre; Fernandez-Serra, Marivi

    2014-03-01

Understanding the interaction of the water-metal system at an atomic level is extremely important for electrocatalysts in fuel cells, photocatalysis, and other systems. The question of the interface energetics involves a detailed study of the nature of the interactions between water-water and water-substrate. A first-principles description of all components of the system is the most appropriate methodology to advance understanding of electrochemical processes. In this work we describe, using first-principles molecular dynamics simulations, the dynamics of a combined surface (Au and Pd)/water system both in the presence and absence of an external bias potential applied to the electrodes, as one would come across in electrochemistry. This is accomplished using a combination of density functional theory (DFT) and non-equilibrium Green's functions (NEGF) methods, thus accounting for the fact that one is dealing with an out-of-equilibrium open system, with and without van der Waals interactions. DOE Early Career Award No. DE-SC0003871.

  14. Silencing of Transposable Elements by piRNAs in Drosophila: An Evolutionary Perspective.

    PubMed

    Luo, Shiqi; Lu, Jian

    2017-06-01

Transposable elements (TEs) are DNA sequences that can move within the genome. TEs have greatly shaped the genomes, transcriptomes, and proteomes of the host organisms through a variety of mechanisms. However, TEs generally disrupt genes and destabilize the host genomes, which substantially reduces fitness of the host organisms. Understanding the genomic distribution and evolutionary dynamics of TEs will greatly deepen our understanding of TE-mediated biological processes. Most TE insertions are highly polymorphic in Drosophila melanogaster, providing us a good system to investigate the evolution of TEs at the population level. Decades of theoretical and experimental studies have well established the "transposition-selection" population genetics model, which assumes that the equilibrium between TE replication and purifying selection determines the copy number of TEs in the genome. In the last decade, P-element-induced wimpy testis (PIWI)-interacting RNAs (piRNAs) were demonstrated to be master repressors of TE activities in Drosophila. The discovery of piRNAs revolutionized our understanding of TE repression, because it reveals that the host organisms have evolved an adaptive mechanism to defend against TE invasion. Tremendous progress has been made to understand the molecular mechanisms by which piRNAs repress active TEs, although many details in this process remain to be further explored. The interaction between piRNAs and TEs well explains the molecular mechanisms underlying hybrid dysgenesis for the I-R and P-M systems in Drosophila, which have puzzled evolutionary biologists for decades. The piRNA repression pathway provides us an unparalleled system to study the co-evolutionary process between parasites and host organisms.

  15. 10. DETAIL OF CONDEMNED MATERIAL CHUTE IN NORTHWEST CORNER OF ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    10. DETAIL OF CONDEMNED MATERIAL CHUTE IN NORTHWEST CORNER OF 4TH LEVEL; GOVERNMENT INSPECTORS COULD CONDEMN DISEASED OR CONTAMINATED CARCASSES AT ANY POINT DURING THE DISASSEMBLY PROCESS; CONDEMNED MATERIAL WAS 'SIDETRACKED,' WEIGHED, AND DROPPED THROUGH THE CHUTE INTO A HUGE CHOPPER ON LEVEL 3; NOTE SCALE ON OVERHEAD CONVEYOR RAIL - Rath Packing Company, Beef Killing Building, Sycamore Street between Elm & Eighteenth Streets, Waterloo, Black Hawk County, IA

  16. Using Bayesian Learning to Classify College Algebra Students by Understanding in Real-Time

    ERIC Educational Resources Information Center

    Cousino, Andrew

    2013-01-01

The goal of this work is to provide instructors with detailed information about their classes at each assignment during the term. The information is provided both at the individual level and at the aggregate level. We used the large number of grades now available online, along with data-mining techniques, to build our models. This enabled…
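The flavor of such real-time classification can be sketched with a tiny Gaussian naive Bayes over made-up assignment scores; this illustrates the general Bayesian-learning technique only, and the labels and numbers below are invented, not from the dissertation.

```python
import math
from collections import defaultdict

# Hypothetical training data: per-student assignment scores and a label
# for the instructor's assessment of understanding.
train = [
    ([92, 88, 95], "high"),
    ([85, 90, 91], "high"),
    ([55, 62, 48], "low"),
    ([60, 51, 58], "low"),
]

def fit(data):
    """Estimate per-class priors, feature means, and feature variances."""
    by_label = defaultdict(list)
    for xs, y in data:
        by_label[y].append(xs)
    stats = {}
    for y, rows in by_label.items():
        cols = list(zip(*rows))
        means = [sum(c) / len(c) for c in cols]
        vars_ = [sum((v - m) ** 2 for v in c) / len(c) + 1e-6
                 for c, m in zip(cols, means)]
        stats[y] = (len(rows) / len(data), means, vars_)
    return stats

def predict(stats, xs):
    """Return the class with the highest log-posterior under the
    Gaussian naive Bayes assumption of independent features."""
    def log_post(y):
        prior, means, vars_ = stats[y]
        ll = math.log(prior)
        for x, m, v in zip(xs, means, vars_):
            ll += -0.5 * math.log(2 * math.pi * v) - (x - m) ** 2 / (2 * v)
        return ll
    return max(stats, key=log_post)

model = fit(train)
print(predict(model, [89, 93, 90]))
```

Because the model updates cheaply as each new assignment's grades arrive, the same idea supports classifying students "in real time" across the term.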

  17. Understanding extreme sea levels for coastal impact and adaptation analysis

    NASA Astrophysics Data System (ADS)

    Wahl, T.; Haigh, I. D.; Nicholls, R. J.; Arns, A.; Hinkel, J.; Dangendorf, S.; Slangen, A.

    2016-12-01

Coastal impact and adaptation assessments require detailed knowledge of extreme sea levels, because increasing damage from extreme events, such as storm surges and tropical cyclones, is one of the major consequences of sea level rise and climate change. Indeed, the IPCC highlighted in its AR4 report that "societal impacts of sea level change primarily occur via the extreme levels rather than as a direct consequence of mean sea level changes". Over the last few decades, substantial research effort has been directed towards improved understanding of past and future mean sea level; different scenarios were developed with process-based or semi-empirical models and used for coastal impact assessments at various spatial scales to guide coastal management and adaptation efforts. The uncertainties in future sea level rise are typically accounted for by analyzing the impacts associated with a range of scenarios, each leading to a vertical displacement of the distribution of extreme sea levels; indeed, most regional and global studies find little or no evidence for changes in storminess with climate change, although confidence in those results is still low. Much more importantly, however, there is still a limited understanding of present-day extreme sea levels, and this is largely ignored in most impact and adaptation analyses. The two key uncertainties stem from: (1) the numerical models used to generate long time series of extreme sea levels, whose bias varies spatially and can reach values much larger than the expected sea level rise, although it can be accounted for in most regions using in-situ measurements; and (2) the statistical models used for determining present-day extreme sea-level exceedance probabilities. There is no universally accepted approach to obtaining such values for flood risk assessments, and while substantial research has explored inter-model uncertainties for mean sea level, we explore here, for the first time, inter-model uncertainties for extreme sea levels at large spatial scales and compare them to the uncertainties in mean sea level projections.
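The statistical step in (2) — estimating present-day exceedance probabilities from water-level records — is commonly done by fitting an extreme-value distribution to block maxima. A minimal sketch of that idea, using synthetic annual maxima rather than the authors' data or their specific method:

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(0)
# Synthetic annual-maximum sea levels (metres), for illustration only.
annual_maxima = genextreme.rvs(c=-0.1, loc=2.0, scale=0.15,
                               size=60, random_state=rng)

# Fit a Generalized Extreme Value (GEV) distribution to the block maxima.
shape, loc, scale = genextreme.fit(annual_maxima)

def return_level(T):
    """Level exceeded on average once every T years under the fitted GEV."""
    return genextreme.ppf(1.0 - 1.0 / T, shape, loc=loc, scale=scale)

for T in (10, 100):
    print(f"{T}-year return level: {return_level(T):.2f} m")
```

Different distribution families, fitting methods, and threshold choices at this step are exactly the inter-model uncertainty the abstract describes.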

  18. An evaluation of semi-automated methods for collecting ecosystem-level data in temperate marine systems.

    PubMed

    Griffin, Kingsley J; Hedge, Luke H; González-Rivero, Manuel; Hoegh-Guldberg, Ove I; Johnston, Emma L

    2017-07-01

Historically, marine ecologists have lacked efficient tools capable of capturing detailed species distribution data over large areas. Emerging technologies such as high-resolution imaging and associated machine-learning image-scoring software are providing new tools to map species over large areas of the ocean. Here, we combine a novel diver propulsion vehicle (DPV) imaging system with free-to-use machine-learning software to semi-automatically generate dense and widespread abundance records of a habitat-forming alga over ~5,000 m² of temperate reef. We employ replicable spatial techniques to test the effectiveness of traditional diver-based sampling and to better understand the distribution and spatial arrangement of one key algal species. We found that the effectiveness of a traditional survey depended on the level of spatial structuring, and generally 10-20 transects (50 × 1 m) were required to obtain reliable results. This represents 2-20 times greater replication than has been collected in previous studies. Furthermore, we demonstrate the usefulness of fine-resolution distribution modeling for understanding patterns in canopy algae cover at multiple spatial scales, and discuss applications to other marine habitats. Our analyses demonstrate that semi-automated methods of data gathering and processing provide more accurate results than traditional methods for describing habitat structure at seascape scales, and therefore represent vastly improved techniques for understanding and managing marine seascapes.
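The dependence of survey reliability on spatial structuring can be illustrated with a simple simulation: generate a patchy cover map, then estimate mean cover from random 50 × 1 m transects. All values here (patch sizes, grid dimensions) are hypothetical, not taken from the study:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical 100 x 100 m reef with patchy (spatially structured) algal cover.
side = 100
xx, yy = np.meshgrid(np.arange(side), np.arange(side))
patches = np.zeros((side, side))
for _ in range(8):                       # a few dense patches
    cx, cy = rng.integers(0, side, 2)
    patches += np.exp(-((xx - cx) ** 2 + (yy - cy) ** 2) / 200.0)
cover = rng.random((side, side)) < np.clip(patches, 0.0, 0.9)
true_cover = cover.mean()

def survey(n_transects, length=50):
    """Mean cover estimated from n random 50 x 1 m transects."""
    estimates = []
    for _ in range(n_transects):
        r = rng.integers(0, side)
        c = rng.integers(0, side - length)
        estimates.append(cover[r, c:c + length].mean())
    return float(np.mean(estimates))

for n in (5, 10, 20):
    print(f"{n} transects: estimate {survey(n):.2f} (true {true_cover:.2f})")
```

With strongly clustered cover, small numbers of transects tend to miss or over-sample patches, which is why replication requirements rise with spatial structuring.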

  19. Analyzing extreme sea levels for broad-scale impact and adaptation studies

    NASA Astrophysics Data System (ADS)

    Wahl, T.; Haigh, I. D.; Nicholls, R. J.; Arns, A.; Dangendorf, S.; Hinkel, J.; Slangen, A.

    2017-12-01

Coastal impact and adaptation assessments require detailed knowledge of extreme sea levels (ESL), because increasing damage due to extreme events is one of the major consequences of sea-level rise (SLR) and climate change. Over the last few decades, substantial research effort has been directed towards improved understanding of past and future SLR; different scenarios were developed with process-based or semi-empirical models and used for coastal impact studies at various temporal and spatial scales to guide coastal management and adaptation efforts. Uncertainties in future SLR are typically accounted for by analyzing the impacts associated with a range of scenarios and model ensembles. ESL distributions are then displaced vertically according to the SLR scenarios, under the inherent assumption that we have perfect knowledge of the statistics of extremes. However, there is still a limited understanding of present-day ESL, which is largely ignored in most impact and adaptation analyses. The two key uncertainties stem from: (1) the numerical models used to generate long time series of storm surge water levels, and (2) the statistical models used for determining present-day ESL exceedance probabilities. There is no universally accepted approach to obtaining such values for broad-scale flood risk assessments, and while substantial research has explored SLR uncertainties, we quantify, for the first time globally, key uncertainties in ESL estimates. We find that contemporary ESL uncertainties exceed those from SLR projections and, assuming the Paris Agreement targets are met, exceed the projected SLR itself by the end of the century. Our results highlight the need to further improve our understanding of uncertainties in ESL estimates through (1) continued improvement of the numerical and statistical models used to simulate and analyze coastal water levels and (2) exploitation of the rich observational database and continued data archaeology to obtain longer time series and remove model bias. Finally, ESL uncertainties need to be integrated with SLR uncertainties; otherwise, important improvements in providing more robust SLR projections will be of less benefit for broad-scale impact and adaptation studies and decision processes.

  20. Understanding for Teaching for Understanding.

    ERIC Educational Resources Information Center

    Kieren, Thomas E.

    1990-01-01

    Outlines a model of mathematical understanding as a whole, dynamic, nonlinear, recursive growing process, entailing "folding back" for the reconstruction of inner level knowing. Presents examples from seventh graders' work. Discusses teacher awareness of student level of understanding, and implications for development of mathematics…

  1. Spatially explicit modeling in ecology: A review

    USGS Publications Warehouse

    DeAngelis, Donald L.; Yurek, Simeon

    2017-01-01

    The use of spatially explicit models (SEMs) in ecology has grown enormously in the past two decades. One major advancement has been that fine-scale details of landscapes, and of spatially dependent biological processes, such as dispersal and invasion, can now be simulated with great precision, due to improvements in computer technology. Many areas of modeling have shifted toward a focus on capturing these fine-scale details, to improve mechanistic understanding of ecosystems. However, spatially implicit models (SIMs) have played a dominant role in ecology, and arguments have been made that SIMs, which account for the effects of space without specifying spatial positions, have an advantage of being simpler and more broadly applicable, perhaps contributing more to understanding. We address this debate by comparing SEMs and SIMs in examples from the past few decades of modeling research. We argue that, although SIMs have been the dominant approach in the incorporation of space in theoretical ecology, SEMs have unique advantages for addressing pragmatic questions concerning species populations or communities in specific places, because local conditions, such as spatial heterogeneities, organism behaviors, and other contingencies, produce dynamics and patterns that usually cannot be incorporated into simpler SIMs. SEMs are also able to describe mechanisms at the local scale that can create amplifying positive feedbacks at that scale, creating emergent patterns at larger scales, and therefore are important to basic ecological theory. We review the use of SEMs at the level of populations, interacting populations, food webs, and ecosystems and argue that SEMs are not only essential in pragmatic issues, but must play a role in the understanding of causal relationships on landscapes.
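The SEM/SIM distinction the review draws can be made concrete with a toy spatially explicit model: local logistic growth plus nearest-neighbour dispersal on a grid, where spatial position matters, versus a spatially implicit model that would track only total abundance. This is a generic illustrative sketch, not a model from the review:

```python
import numpy as np

def step(N, r=0.5, K=100.0, d=0.1):
    """One time step of a minimal SEM: logistic growth, then dispersal.

    Each cell sends a fraction d of its population equally to its four
    neighbours (grid edges wrap, i.e. a torus, for simplicity)."""
    N = N + r * N * (1.0 - N / K)          # local logistic growth
    out = d * N                            # emigrants from each cell
    arrived = (np.roll(out, 1, 0) + np.roll(out, -1, 0) +
               np.roll(out, 1, 1) + np.roll(out, -1, 1)) / 4.0
    return N - out + arrived

grid = np.zeros((50, 50))
grid[25, 25] = 10.0                        # single founding population
for _ in range(100):
    grid = step(grid)
print(f"occupied cells (>1 individual): {(grid > 1.0).sum()}")
```

A spatially implicit version would collapse `grid` to a single scalar and lose the invasion front entirely — the kind of local contingency the authors argue only SEMs can capture.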

  2. Dissecting pigment architecture of individual photosynthetic antenna complexes in solution

    DOE PAGES

    Wang, Quan; Moerner, W. E.

    2015-10-05

Oligomerization plays a critical role in shaping the light-harvesting properties of many photosynthetic pigment-protein complexes, but a detailed understanding of this process at the level of individual pigments is still lacking. To study the effects of oligomerization, we designed a single-molecule approach to probe the photophysical properties of individual pigment sites as a function of protein assembly state. Our method, based on the principles of anti-Brownian electrokinetic trapping of single fluorescent proteins, step-wise photobleaching, and multiparameter spectroscopy, allows pigment-specific spectroscopic information on single multipigment antennae to be recorded in a nonperturbative aqueous environment with unprecedented detail. We focus on the monomer-to-trimer transformation of allophycocyanin (APC), an important antenna protein in cyanobacteria. Here, our data reveal that the two chemically identical pigments in APC have different roles. One (α) is the functional pigment that red-shifts its spectral properties upon trimer formation, whereas the other (β) is a "protective" pigment that persistently quenches the excited state of α in the prefunctional, monomer state of the protein. These results show how subtleties in pigment organization give rise to functionally important aspects of energy transfer and photoprotection in antenna complexes. Finally, the method developed here should find immediate application in understanding the emergent properties of other natural and artificial light-harvesting systems.

  3. Cell-based tissue engineering strategies used in the clinical repair of articular cartilage.

    PubMed

    Huang, Brian J; Hu, Jerry C; Athanasiou, Kyriacos A

    2016-08-01

    One of the most important issues facing cartilage tissue engineering is the inability to move technologies into the clinic. Despite the multitude of current research in the field, it is known that 90% of new drugs that advance past animal studies fail clinical trials. The objective of this review is to provide readers with an understanding of the scientific details of tissue engineered cartilage products that have demonstrated a certain level of efficacy in humans, so that newer technologies may be developed upon this foundation. Compared to existing treatments, such as microfracture or autologous chondrocyte implantation, a tissue engineered product can potentially provide more consistent clinical results in forming hyaline repair tissue and in filling the entirety of the defect. The various tissue engineering strategies (e.g., cell expansion, scaffold material, media formulations, biomimetic stimuli, etc.) used in forming these products, as collected from published literature, company websites, and relevant patents, are critically discussed. The authors note that many details about these products remain proprietary, not all information is made public, and that advancements to the products are continuously made. Nevertheless, by understanding the design and production processes of these emerging technologies, one can gain tremendous insight into how to best use them and also how to design the next generation of tissue engineered cartilage products. Copyright © 2016 Elsevier Ltd. All rights reserved.

  4. Cell-based tissue engineering strategies used in the clinical repair of articular cartilage

    PubMed Central

    Huang, Brian J.; Hu, Jerry C.; Athanasiou, Kyriacos A.

    2016-01-01

    One of the most important issues facing cartilage tissue engineering is the inability to move technologies into the clinic. Despite the multitude of review articles on the paradigm of biomaterials, signals, and cells, it is reported that 90% of new drugs that advance past animal studies fail clinical trials (1). The intent of this review is to provide readers with an understanding of the scientific details of tissue engineered cartilage products that have demonstrated a certain level of efficacy in humans, so that newer technologies may be developed upon this foundation. Compared to existing treatments, such as microfracture or autologous chondrocyte implantation, a tissue engineered product can potentially provide more consistent clinical results in forming hyaline repair tissue and in filling the entirety of the defect. The various tissue engineering strategies (e.g., cell expansion, scaffold material, media formulations, biomimetic stimuli, etc.) used in forming these products, as collected from published literature, company websites, and relevant patents, are critically discussed. The authors note that many details about these products remain proprietary, not all information is made public, and that advancements to the products are continuously made. Nevertheless, by fully understanding the design and production processes of these emerging technologies, one can gain tremendous insight into how to best use them and also how to design the next generation of tissue engineered cartilage products. PMID:27177218

  5. The Effect of Hole Quality on the Fatigue Life of 2024-T3 Aluminum Alloy Sheet

    NASA Technical Reports Server (NTRS)

    Everett, Richard A., Jr.

    2004-01-01

    This paper presents the results of a study whose main objective was to determine which type of fabrication process would least affect the fatigue life of an open-hole structural detail. Since the open-hole detail is often the fundamental building block for determining the stress concentration of built-up structural parts, it is important to understand any factor that can affect the fatigue life of an open hole. A test program of constant-amplitude fatigue tests was conducted on five different sets of test specimens each made using a different hole fabrication process. Three of the sets used different mechanical drilling procedures while a fourth and fifth set were mechanically drilled and then chemically polished. Two sets of specimens were also tested under spectrum loading to aid in understanding the effects of residual compressive stresses on fatigue life. Three conclusions were made from this study. One, the residual compressive stresses caused by the hole-drilling process increased the fatigue life by two to three times over specimens that were chemically polished after the holes were drilled. Second, the chemical polishing process does not appear to adversely affect the fatigue life. Third, the chemical polishing process will produce a stress-state adjacent to the hole that has insignificant machining residual stresses.

  6. Monoterpenes are the largest source of summertime organic aerosol in the southeastern United States

    EPA Science Inventory

    The chemical complexity of atmospheric organic aerosol (OA) has caused substantial uncertainties in understanding its origins and environmental impacts. Here, we provide constraints on OA origins through compositional characterization with molecular-level details. Our results sug...

  7. Implementing a Standardised Annual Programme Review Process in a Third-Level Institution

    ERIC Educational Resources Information Center

    Wickham, Sheelagh; Brady, Malcolm; Ingle, Sarah; McMullan, Caroline; Nic Giolla Mhichíl, Mairéad; Walshe, Ray

    2017-01-01

    Purpose: Ideally, quality should be, and is, an integral element of education, yet capturing and articulating quality is not simple. Programme quality reviews in third-level education can demonstrate quality and identify areas for improvement, offering many potential benefits. However, details on the process of quality programme review are limited…

  8. Examples of landscape indicators for assessing environmental conditions and problems in urban and suburban areas

    USGS Publications Warehouse

    Martin-Duque, J. F.; Godfrey, A.; Diez, A.; Cleaves, E.; Pedraza, J.; Sanz, M.A.; Carrasco, R.M.; Bodoque, J.; Brebbia, C.A.; Martin-Duque, J.F.; Wadhwa, L.C.

    2002-01-01

Geo-indicators can help to assess environmental conditions in urban and suburban areas. Such indicators should be meaningful for understanding environmental change. Drawing on examples from Spanish and American cities, geo-indicators for assessing environmental conditions and changes in urban and suburban areas are proposed. The paper explores two types of geo-indicator. The first presents general information that can indicate the presence of a broad array of geologic conditions, either favouring or limiting various uses of the land. The second type is the one most commonly used, and as a group most easily understood; these indicators are site- and problem-specific and are generally applied after a problem has been identified. Among them, watershed processes, seismicity and physiographic diversity are explained in more detail. A second dimension considered when discussing geo-indicators is scale. Broad-scale investigations covering extensive areas are only efficient at cataloguing general conditions common to much of the area, or some outstanding feature within it; this type of information is best used for policy decisions. Detailed-scale investigations can provide information about local conditions but are not efficient at cataloguing vast areas. Information gathered at the detailed level is necessary for project design and construction.

  9. Evaluation of risk and benefit in thermal effusivity sensor for monitoring lubrication process in pharmaceutical product manufacturing.

    PubMed

    Uchiyama, Jumpei; Kato, Yoshiteru; Uemoto, Yoshifumi

    2014-08-01

In the process design of tablet manufacturing, understanding and controlling the lubrication process is important from various viewpoints. A detailed analysis of thermal effusivity data in the lubrication process was conducted in this study, and the risks and benefits of thermal effusivity sensing in the lubrication process were evaluated. Monitoring thermal effusivity was found to detect mainly the physical change in bulk density, driven by dispersal of the lubricant and coating of powder particles by the lubricant. Because monitoring thermal effusivity largely amounts to monitoring bulk density, thermal effusivity can correlate strongly with tablet hardness. Moreover, as the thermal effusivity sensor can detect not only the conventional change in bulk density but also fractional changes in thermal conductivity and heat capacity, a two-phase progression of the lubrication process could be revealed. However, the individual contributions of density, thermal conductivity, and heat capacity to thermal effusivity risk fluctuating with formulation. After carefully considering which factors are liable to change with formulation, the thermal effusivity sensor can be a useful process analytical technology tool for monitoring, estimating tablet hardness, and investigating the detailed mechanism of the lubrication process.
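The three quantities the sensor responds to combine into thermal effusivity as e = √(k·ρ·c_p), which is why a change in bulk density alone shifts the reading. A small numerical illustration — the powder-bed values below are hypothetical, not from the study:

```python
import math

def thermal_effusivity(k, rho, cp):
    """Effusivity e = sqrt(k * rho * cp).

    k   thermal conductivity  [W/(m*K)]
    rho bulk density          [kg/m^3]
    cp  specific heat         [J/(kg*K)]
    Returns e in W*s^0.5/(m^2*K)."""
    return math.sqrt(k * rho * cp)

# Hypothetical powder-bed values before/after lubricant mixing densifies the bed.
before = thermal_effusivity(k=0.12, rho=550.0, cp=1100.0)   # loose blend
after  = thermal_effusivity(k=0.12, rho=600.0, cp=1100.0)   # densified blend
print(f"effusivity before: {before:.0f}, after: {after:.0f}")
```

Because e depends on the product of all three properties, a formulation change that shifts k or c_p alters the density signal the sensor appears to report — the fluctuation risk the abstract notes.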

  10. Strengthening population health interventions: developing the CollaboraKTion Framework for Community-Based Knowledge Translation.

    PubMed

    Jenkins, Emily K; Kothari, Anita; Bungay, Vicky; Johnson, Joy L; Oliffe, John L

    2016-08-30

    Much of the research and theorising in the knowledge translation (KT) field has focused on clinical settings, providing little guidance to those working in community settings. In this study, we build on previous research in community-based KT by detailing the theory driven and empirically-informed CollaboraKTion framework. A case study design and ethnographic methods were utilised to gain an in-depth understanding of the processes for conducting a community-based KT study as a means to distilling the CollaboraKTion framework. Drawing on extensive field notes describing fieldwork observations and interactions as well as evidence from the participatory research and KT literature, we detail the processes and steps undertaken in this community-based KT study as well as their rationale and the challenges encountered. In an effort to build upon existing knowledge, Kitson and colleagues' co-KT framework, which provides guidance for conducting KT aimed at addressing population-level health, was applied as a coding structure to inform the current analysis. This approach was selected because it (1) supported the application of an existing community-based KT framework to empirical data and (2) provided an opportunity to contribute to the theory and practice gaps in the community-based KT literature through an inductively derived empirical example. Analysis revealed that community-based KT is an iterative process that can be viewed as comprising five overarching processes: (1) contacting and connecting; (2) deepening understandings; (3) adapting and applying the knowledge base; (4) supporting and evaluating continued action; and (5) transitioning and embedding as well as several key elements within each of these processes (e.g. building on existing knowledge, establishing partnerships). These empirically informed theory advancements in KT and participatory research traditions are summarised in the CollaboraKTion framework. 
We suggest that community-based KT researchers place less emphasis on enhancing uptake of specific interventions and instead focus on collaboratively identifying and creating changes to the contextual factors that influence health outcomes. The CollaboraKTion framework can be used to guide the development, implementation and evaluation of contextually relevant, evidence-informed initiatives aimed at improving population health, while providing a foundation for future research and practice in this emergent KT area.

  11. Wildlife disease ecology from the individual to the population: Insights from a long-term study of a naturally infected European badger population.

    PubMed

    McDonald, Jenni L; Robertson, Andrew; Silk, Matthew J

    2018-01-01

    Long-term individual-based datasets on host-pathogen systems are a rare and valuable resource for understanding the infectious disease dynamics in wildlife. A study of European badgers (Meles meles) naturally infected with bovine tuberculosis (bTB) at Woodchester Park in Gloucestershire (UK) has produced a unique dataset, facilitating investigation of a diverse range of epidemiological and ecological questions with implications for disease management. Since the 1970s, this badger population has been monitored with a systematic mark-recapture regime yielding a dataset of >15,000 captures of >3,000 individuals, providing detailed individual life-history, morphometric, genetic, reproductive and disease data. The annual prevalence of bTB in the Woodchester Park badger population exhibits no straightforward relationship with population density, and both the incidence and prevalence of Mycobacterium bovis show marked variation in space. The study has revealed phenotypic traits that are critical for understanding the social structure of badger populations along with mechanisms vital for understanding disease spread at different spatial resolutions. Woodchester-based studies have provided key insights into how host ecology can influence infection at different spatial and temporal scales. Specifically, it has revealed heterogeneity in epidemiological parameters; intrinsic and extrinsic factors affecting population dynamics; provided insights into senescence and individual life histories; and revealed consistent individual variation in foraging patterns, refuge use and social interactions. An improved understanding of ecological and epidemiological processes is imperative for effective disease management. Woodchester Park research has provided information of direct relevance to bTB management, and a better appreciation of the role of individual heterogeneity in disease transmission can contribute further in this regard. 
The Woodchester Park study system now offers a rare opportunity to seek a dynamic understanding of how individual-, group- and population-level processes interact. The wealth of existing data makes it possible to take a more integrative approach to examining how the consequences of individual heterogeneity scale to determine population-level pathogen dynamics and help advance our understanding of the ecological drivers of host-pathogen systems. © 2017 The Authors. Journal of Animal Ecology published by John Wiley & Sons Ltd on behalf of British Ecological Society.

  12. Polymorphic phase transitions: Macroscopic theory and molecular simulation.

    PubMed

    Anwar, Jamshed; Zahn, Dirk

    2017-08-01

    Transformations in the solid state are of considerable interest, both for fundamental reasons and because they underpin important technological applications. The interest spans a wide spectrum of disciplines and application domains. For pharmaceuticals, a common issue is unexpected polymorphic transformation of the drug or excipient during processing or on storage, which can result in product failure. A more ambitious goal is that of exploiting the advantages of metastable polymorphs (e.g. higher solubility and dissolution rate) while ensuring their stability with respect to solid state transformation. To address these issues and to advance technology, there is an urgent need for significant insights that can only come from a detailed molecular level understanding of the involved processes. Whilst experimental approaches at best yield time- and space-averaged structural information, molecular simulation offers unprecedented, time-resolved molecular-level resolution of the processes taking place. This review aims to provide a comprehensive and critical account of state-of-the-art methods for modelling polymorph stability and transitions between solid phases. This is flanked by revisiting the associated macroscopic theoretical framework for phase transitions, including their classification, proposed molecular mechanisms, and kinetics. The simulation methods are presented in tutorial form, focusing on their application to phase transition phenomena. We describe molecular simulation studies for crystal structure prediction and polymorph screening, phase coexistence and phase diagrams, simulations of crystal-crystal transitions of various types (displacive/martensitic, reconstructive and diffusive), effects of defects, and phase stability and transitions at the nanoscale. 
Our selection of literature is intended to illustrate significant insights, concepts and understanding, as well as the current scope of using molecular simulations for understanding polymorphic transitions in an accessible way, rather than claiming completeness. With exciting prospects in both simulation methods development and enhancements in computer hardware, we are on the verge of accessing an unprecedented capability for designing and developing dosage forms and drug delivery systems in silico, including tackling challenges in polymorph control on a rational basis. Copyright © 2017 Elsevier B.V. All rights reserved.

  13. The role of water vapor in climate. A strategic research plan for the proposed GEWEX water vapor project (GVaP)

    NASA Technical Reports Server (NTRS)

    Starr, D. OC. (Editor); Melfi, S. Harvey (Editor)

    1991-01-01

The proposed GEWEX Water Vapor Project (GVaP) addresses fundamental deficiencies in the present understanding of moist atmospheric processes and the role of water vapor in the global hydrologic cycle and climate. Inadequate knowledge of the distribution of atmospheric water vapor and its transport is a major impediment to progress in achieving a fuller understanding of various hydrologic processes and a capability for reliable assessment of potential climatic change on global and regional scales. GVaP will promote significant improvements in knowledge of atmospheric water vapor and moist processes, as well as in present capabilities to model these processes on global and regional scales. GVaP complements a number of ongoing and planned programs focused on various aspects of the hydrologic cycle. The goal of GVaP is to improve understanding of the role of water vapor in meteorological, hydrological, and climatological processes through improved knowledge of water vapor and its variability on all scales. A detailed description of GVaP is presented.

  14. Algorithm for cellular reprogramming.

    PubMed

    Ronquist, Scott; Patterson, Geoff; Muir, Lindsey A; Lindsly, Stephen; Chen, Haiming; Brown, Markus; Wicha, Max S; Bloch, Anthony; Brockett, Roger; Rajapakse, Indika

    2017-11-07

    The day we understand the time evolution of subcellular events at a level of detail comparable to physical systems governed by Newton's laws of motion seems far away. Even so, quantitative approaches to cellular dynamics add to our understanding of cell biology. With data-guided frameworks we can develop better predictions about, and methods for, control over specific biological processes and system-wide cell behavior. Here we describe an approach for optimizing the use of transcription factors (TFs) in cellular reprogramming, based on a device commonly used in optimal control. We construct an approximate model for the natural evolution of a cell-cycle-synchronized population of human fibroblasts, based on data obtained by sampling the expression of 22,083 genes at several time points during the cell cycle. To arrive at a model of moderate complexity, we cluster gene expression based on division of the genome into topologically associating domains (TADs) and then model the dynamics of TAD expression levels. Based on this dynamical model and additional data, such as known TF binding sites and activity, we develop a methodology for identifying the top TF candidates for a specific cellular reprogramming task. Our data-guided methodology identifies a number of TFs previously validated for reprogramming and/or natural differentiation and predicts some potentially useful combinations of TFs. Our findings highlight the immense potential of dynamical models, mathematics, and data-guided methodologies for improving strategies for control over biological processes. Copyright © 2017 the Author(s). Published by PNAS.
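The core idea — a linear dynamical model of TAD-level expression with TFs as control inputs, used to rank reprogramming candidates — can be sketched as follows. The model structure, dimensions, and greedy single-TF score below are illustrative assumptions, not the authors' actual algorithm or data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical linear model of TAD-level expression dynamics:
#   x[t+1] = A @ x[t] + B @ u[t]
# where column j of B encodes the assumed effect of activating TF j.
n_tads, n_tfs = 20, 5
A = 0.9 * np.eye(n_tads) + 0.01 * rng.standard_normal((n_tads, n_tads))
B = rng.standard_normal((n_tads, n_tfs))

x0 = rng.standard_normal(n_tads)        # initial (fibroblast-like) state
x_target = rng.standard_normal(n_tads)  # desired (reprogrammed) state

def score_tf(j, steps=10):
    """How much closer constant activation of TF j brings the state to the
    target after `steps` time steps, compared to the uncontrolled dynamics."""
    x_free, x_ctrl = x0.copy(), x0.copy()
    u = np.zeros(n_tfs)
    u[j] = 1.0
    for _ in range(steps):
        x_free = A @ x_free
        x_ctrl = A @ x_ctrl + B @ u
    return (np.linalg.norm(x_free - x_target)
            - np.linalg.norm(x_ctrl - x_target))

ranking = sorted(range(n_tfs), key=score_tf, reverse=True)
print("TF ranking (best first):", ranking)
```

In the paper, A and B would be constrained by the time-course expression data and known TF binding sites; the sketch only shows how such a model turns TF selection into a control problem.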

  15. New approaches to quantifying aerosol influence on the cloud radiative effect

    DOE PAGES

    Feingold, Graham; McComiskey, Allison; Yamaguchi, Takanobu; ...

    2016-02-01

The topic of cloud radiative forcing associated with the atmospheric aerosol has been the focus of intense scrutiny for decades. The enormity of the problem is reflected in the need to understand aspects such as aerosol composition, optical properties, cloud condensation, and ice nucleation potential, along with the global distribution of these properties, controlled by emissions, transport, transformation, and sinks. Equally daunting is that clouds themselves are complex, turbulent, microphysical entities and, by their very nature, ephemeral and hard to predict. Atmospheric general circulation models represent aerosol-cloud interactions at ever-increasing levels of detail, but these models lack the resolution to represent clouds and aerosol-cloud interactions adequately. There is a dearth of observational constraints on aerosol-cloud interactions. In this paper, we develop a conceptual approach to systematically constrain the aerosol-cloud radiative effect in shallow clouds through a combination of routine process modeling and satellite and surface-based shortwave radiation measurements. Finally, we heed the call to merge Darwinian and Newtonian strategies by balancing microphysical detail with scaling and emergent properties of the aerosol-cloud radiation system.

  16. New approaches to quantifying aerosol influence on the cloud radiative effect

    PubMed Central

    Feingold, Graham; McComiskey, Allison; Yamaguchi, Takanobu; Johnson, Jill S.; Carslaw, Kenneth S.; Schmidt, K. Sebastian

    2016-01-01

The topic of cloud radiative forcing associated with the atmospheric aerosol has been the focus of intense scrutiny for decades. The enormity of the problem is reflected in the need to understand aspects such as aerosol composition, optical properties, cloud condensation, and ice nucleation potential, along with the global distribution of these properties, controlled by emissions, transport, transformation, and sinks. Equally daunting is that clouds themselves are complex, turbulent, microphysical entities and, by their very nature, ephemeral and hard to predict. Atmospheric general circulation models represent aerosol-cloud interactions at ever-increasing levels of detail, but these models lack the resolution to represent clouds and aerosol-cloud interactions adequately. There is a dearth of observational constraints on aerosol-cloud interactions. We develop a conceptual approach to systematically constrain the aerosol-cloud radiative effect in shallow clouds through a combination of routine process modeling and satellite and surface-based shortwave radiation measurements. We heed the call to merge Darwinian and Newtonian strategies by balancing microphysical detail with scaling and emergent properties of the aerosol-cloud radiation system. PMID:26831092

  17. New approaches to quantifying aerosol influence on the cloud radiative effect.

    PubMed

    Feingold, Graham; McComiskey, Allison; Yamaguchi, Takanobu; Johnson, Jill S; Carslaw, Kenneth S; Schmidt, K Sebastian

    2016-05-24

    The topic of cloud radiative forcing associated with the atmospheric aerosol has been the focus of intense scrutiny for decades. The enormity of the problem is reflected in the need to understand aspects such as aerosol composition, optical properties, cloud condensation, and ice nucleation potential, along with the global distribution of these properties, controlled by emissions, transport, transformation, and sinks. Equally daunting is that clouds themselves are complex, turbulent, microphysical entities and, by their very nature, ephemeral and hard to predict. Atmospheric general circulation models represent aerosol-cloud interactions at ever-increasing levels of detail, but these models lack the resolution to represent clouds and aerosol-cloud interactions adequately. There is a dearth of observational constraints on aerosol-cloud interactions. We develop a conceptual approach to systematically constrain the aerosol-cloud radiative effect in shallow clouds through a combination of routine process modeling and satellite and surface-based shortwave radiation measurements. We heed the call to merge Darwinian and Newtonian strategies by balancing microphysical detail with scaling and emergent properties of the aerosol-cloud radiation system.
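One widely used first-order relation behind such shortwave constraints can be sketched as follows. This is a minimal illustration, not the authors' method: the susceptibility formula is the standard Twomey approximation for cloud albedo change with droplet number at fixed liquid water path, and all numbers are invented.

```python
# Minimal sketch (illustrative values, not from the paper): cloud albedo
# susceptibility to droplet number concentration N at fixed liquid water
# path, dA/dln(N) = A(1 - A)/3 (the standard Twomey approximation).
import math

def albedo_susceptibility(albedo):
    """dA/dln(N) at fixed liquid water path."""
    return albedo * (1.0 - albedo) / 3.0

def delta_albedo(albedo, n_clean, n_polluted):
    """Linearized albedo change when droplet number rises from n_clean to n_polluted."""
    return albedo_susceptibility(albedo) * math.log(n_polluted / n_clean)

# Clouds of intermediate albedo (A ~ 0.5) are the most susceptible:
for a in (0.2, 0.5, 0.8):
    print(a, round(delta_albedo(a, 50.0, 100.0), 3))
```

The quadratic A(1 − A) term is why shallow, moderately reflective clouds dominate the aerosol–cloud radiative effect in such scaling arguments.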

  18. Medical Representatives' Intention to Use Information Technology in Pharmaceutical Marketing.

    PubMed

    Kwak, Eun-Seon; Chang, Hyejung

    2016-10-01

    Electronic detailing (e-detailing), the use of electronic devices to facilitate sales presentations to physicians, has been adopted and expanded in the pharmaceutical industry. To maximize the potential outcome of e-detailing, it is important to understand the behavior and attitudes of medical representatives (MRs) toward e-detailing. This study investigates how information technology devices such as laptop computers and tablet PCs are utilized in pharmaceutical marketing, and it analyzes the factors influencing MRs' intention to use these devices. The study adopted and modified Rogers' diffusion of innovation theory and the technology acceptance model. To test the model empirically, a questionnaire survey was conducted with 221 MRs who were working in three multinational or eleven domestic pharmaceutical companies in Korea. Overall, 28% and 35% of MRs had experience using laptop computers and tablet PCs in pharmaceutical marketing, respectively. However, the rates differed across groups of MRs categorized by age, education level, position, and career. The results showed that MRs' intention to use information technology devices was significantly influenced by perceived usefulness in general. Perceived ease of use, organizational and individual innovativeness, and several MR characteristics were also found to have significant impacts. This study provides timely information about e-detailing devices to marketing managers and policy makers in the pharmaceutical industry for successful marketing strategy development, by clarifying the factors underlying MRs' intention to use information technology. Further in-depth study should be conducted to understand obstacles and limitations and to improve the strategies for better marketing tools.

  19. Review of the Global Models Used Within Phase 1 of the Chemistry-Climate Model Initiative (CCMI)

    NASA Technical Reports Server (NTRS)

    Morgenstern, Olaf; Hegglin, Michaela I.; Rozanov, Eugene; O’Connor, Fiona M.; Abraham, N. Luke; Akiyoshi, Hideharu; Archibald, Alexander T.; Bekki, Slimane; Butchart, Neal; Chipperfield, Martyn P.; hide

    2017-01-01

    We present an overview of state-of-the-art chemistry-climate and chemistry transport models that are used within phase 1 of the Chemistry-Climate Model Initiative (CCMI-1). The CCMI aims to conduct a detailed evaluation of participating models using process-oriented diagnostics derived from observations in order to gain confidence in the models' projections of the stratospheric ozone layer, tropospheric composition, air quality, global climate change where applicable, and the interactions between them. Interpretation of these diagnostics requires detailed knowledge of the radiative, chemical, dynamical, and physical processes incorporated in the models. An understanding of the degree to which the CCMI-1 recommendations for simulations have been followed is also necessary to interpret model responses to anthropogenic and natural forcing and to explain inter-model differences. This becomes even more important given the ongoing development and the ever-growing complexity of these models. This paper also provides an overview of the available CCMI-1 simulations with the aim of informing CCMI data users.

  20. Isotope and Chemical Methods in Support of the U.S. Geological Survey Science Strategy, 2003-2008

    USGS Publications Warehouse

    Rye, R.O.; Johnson, C.A.; Landis, G.P.; Hofstra, A.H.; Emsbo, P.; Stricker, C.A.; Hunt, A.G.; Rusk, B.G.

    2008-01-01

    Principal functions of the Mineral Resources Program are providing information to decision-makers related to mineral deposits on federal lands and predicting the environmental consequences of the mining or natural weathering of those deposits. Performing these functions requires that predictions be made of the likelihood of undiscovered deposits. The predictions are based on geologic and geoenvironmental models that are constructed for the various types of mineral deposits from detailed descriptions of actual deposits and detailed understanding of the processes that formed them. Over the past three decades the understanding of ore-forming processes has benefitted greatly from the integration of laboratory-based geochemical tools with field observations and other data sources. Under the aegis of the Evolution of Ore Deposits and Technology Transfer Project (EODTTP), a five-year effort that terminated in 2008, the Mineral Resources Program provided state-of-the-art analytical capabilities to support applications of several related geochemical tools.

  1. Global biogeochemical cycles: Studies of interaction and change, some views on the strategy of approach

    NASA Technical Reports Server (NTRS)

    Bolin, B.

    1984-01-01

    The global biosphere is an exceedingly complex system. To gain an understanding of its structure and dynamic features, it is necessary to increase knowledge about the detailed processes, but also to develop models of how global interactions take place. Attempts to analyze the detailed physical, chemical, and biological processes need, in this context, to be guided by advances in understanding of these global interactions. It is necessary to develop a strategy of data gathering that serves both purposes simultaneously. Climate research during the last decade may serve as a useful example of how to approach this difficult problem in a systematic way. Large programs for data collection may easily become rigid and costly. While realizing the necessity of a systematic and long-lasting effort to observe the atmosphere, the oceans, land, and life on Earth, such a program must remain flexible enough to permit the modifications, and sometimes even improvisations, that are necessary to maintain a viable program.

  2. Satellite Test of Radiation Impact on Ramtron 512K FRAM

    NASA Technical Reports Server (NTRS)

    MacLeod, Todd C.; Sayyah, Rana; Sims, W. Herb; Varnavas, Kosta A.; Ho, Fat D.

    2009-01-01

    The Memory Test Experiment is a space test of a ferroelectric memory device on a low Earth orbit satellite. The test consists of writing and reading data with a ferroelectric-based memory device. Any errors are detected and stored on board the satellite, and the data are sent to the ground through telemetry once a day. Analysis of the data can determine the kind of error that occurred and will lead to a better understanding of the effects of space radiation on memory systems. The test will be one of the first flight demonstrations of ferroelectric memory in a near-polar orbit, which allows testing in a varied radiation environment. The device being tested is a Ramtron Inc. 512K memory device. This paper details the goals and purpose of this experiment as well as the development process. The process for analyzing the data to gain the maximum understanding of the performance of the ferroelectric memory device is also detailed.
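The write/read/compare cycle such a memory test describes can be sketched as below. This is a minimal illustration, not the flight code: the array size, test pattern, and injected fault are invented, and each mismatch is logged with its address, the value read back, and the number of flipped bits, so analysis can distinguish single-bit upsets from larger faults.

```python
# Minimal sketch of a write/read/compare memory test (not the flight code).
# Sizes, pattern, and the injected fault are illustrative.

def write_pattern(memory, pattern=0xA5):
    """Write a known test pattern to every address."""
    for addr in range(len(memory)):
        memory[addr] = pattern

def read_and_check(memory, pattern=0xA5):
    """Read back every address and log mismatches with their flipped-bit count."""
    errors = []
    for addr, readback in enumerate(memory):
        if readback != pattern:
            flipped = bin(readback ^ pattern).count("1")
            errors.append((addr, readback, flipped))
    return errors

mem = [0] * 1024              # stand-in for the FRAM array
write_pattern(mem)
mem[7] ^= 0x01                # simulate a single-event upset before readout
errs = read_and_check(mem)
print(errs)                   # → [(7, 164, 1)]
```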

  3. Cost-benefit decision circuitry: proposed modulatory role for acetylcholine.

    PubMed

    Fobbs, Wambura C; Mizumori, Sheri J Y

    2014-01-01

    In order to select which action should be taken, an animal must weigh the costs and benefits of the possible outcomes associated with each action. Such decisions, called cost-benefit decisions, likely involve several cognitive processes (including memory) and a vast neural circuitry. Rodent models have allowed research to begin to probe the neural basis of three forms of cost-benefit decision making: effort-, delay-, and risk-based decision making. In this review, we detail the current understanding of the functional circuits that subserve each form of decision making. We highlight the extensive literature by detailing the ability of dopamine to influence decisions by modulating structures within these circuits. Since acetylcholine projects to all of the same important structures, we propose several ways in which the cholinergic system may play a local modulatory role that allows it to shape these behaviors. A greater understanding of the contribution of the cholinergic system to cost-benefit decisions will permit us to better link the decision and memory processes, and this will help us to better understand and/or treat individuals with deficits in a number of higher cognitive functions including decision making, learning, memory, and language. © 2014 Elsevier Inc. All rights reserved.

  4. Loblolly pine foliar patterns and growth dynamics at age 12 in response to planting density and cultural intensity

    Treesearch

    Madison Katherine Akers; Michael Kane; Dehai Zhao; Richard F. Daniels; Robert O. Teskey

    2015-01-01

    Examining the role of foliage in stand development across a range of stand structures provides a more detailed understanding of the processes driving productivity and allows further development of process-based models for prediction. Productivity changes observed at the stand scale will be the integration of changes at the individual tree scale, but few studies have...

  5. Predicting wildfire occurrence distribution with spatial point process models and its uncertainty assessment: a case study in the Lake Tahoe Basin, USA

    Treesearch

    Jian Yang; Peter J. Weisberg; Thomas E. Dilts; E. Louise Loudermilk; Robert M. Scheller; Alison Stanton; Carl Skinner

    2015-01-01

    Strategic fire and fuel management planning benefits from detailed understanding of how wildfire occurrences are distributed spatially under current climate, and from predictive models of future wildfire occurrence given climate change scenarios. In this study, we fitted historical wildfire occurrence data from 1986 to 2009 to a suite of spatial point process (SPP)...

  6. Falcon: A Temporal Visual Analysis System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steed, Chad A.

    2016-09-05

    Flexible visual exploration of long, high-resolution time series from multiple sensor streams is a challenge in several domains. Falcon is a visual analytics approach that helps researchers acquire a deep understanding of patterns in log and imagery data. Falcon allows users to interactively explore large, time-oriented data sets from multiple linked perspectives. Falcon provides overviews, detailed views, and unique segmented time series visualizations with multiple levels of detail. These capabilities are applicable to the analysis of any quantitative time series.
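One idea named in the abstract, multiple levels of detail for a long time series, can be sketched with min/max binning, a common downsampling scheme for overview-plus-detail displays. This is a generic illustration with a synthetic signal, not Falcon's actual implementation.

```python
# Minimal sketch of overview+detail downsampling via min/max binning
# (a common scheme; not Falcon's actual code). Signal is synthetic.

def downsample_minmax(series, bins):
    """Reduce a series to `bins` (min, max) pairs, preserving extremes."""
    n = len(series)
    out = []
    for b in range(bins):
        chunk = series[b * n // bins:(b + 1) * n // bins]
        out.append((min(chunk), max(chunk)))
    return out

signal = [i % 100 for i in range(10_000)]      # stand-in sensor stream
overview = downsample_minmax(signal, 10)       # coarse overview level
detail = downsample_minmax(signal[:250], 10)   # zoomed-in segment
print(overview[0], detail[0])                  # → (0, 99) (0, 24)
```

Keeping per-bin extremes (rather than means) is what lets an overview level still show short spikes in a long sensor log.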

  7. The scientific challenges to forecasting and nowcasting the magnetospheric response to space weather (Invited)

    NASA Astrophysics Data System (ADS)

    Hesse, M.; Kuznetsova, M. M.; Birn, J.; Pulkkinen, A. A.

    2013-12-01

    Space weather is different from terrestrial weather in an essential way. Terrestrial weather has benefitted from a long history of research, which has led to a deep and detailed level of understanding. In comparison, space weather is scientifically in its infancy. Many key processes in the causal chains from processes on the Sun to space weather effects at various locations in the heliosphere remain either poorly understood or not understood at all. Space weather is therefore, and will remain for the foreseeable future, primarily a research field. Extensive further research efforts are needed before we can reasonably expect the precision and fidelity of weather forecasts. For space weather within the Earth's magnetosphere, the coupling between the solar wind and the magnetosphere is of crucial importance. While past research has provided answers, often at a qualitative level, to some of the most fundamental questions, answers to others, and the ability to predict quantitatively, remain elusive. This presentation will provide an overview of pertinent aspects of solar wind-magnetosphere coupling and its importance for space weather near the Earth, and will analyze the state of our ability to describe and predict its efficiency. It will conclude with a discussion of research activities aimed at improving our ability to quantitatively forecast coupling processes.

  8. Development of an instrument to understand the child protective services decision-making process, with a focus on placement decisions.

    PubMed

    Dettlaff, Alan J; Christopher Graham, J; Holzman, Jesse; Baumann, Donald J; Fluke, John D

    2015-11-01

    When children come to the attention of the child welfare system, they become involved in a decision-making process in which decisions are made that have a significant effect on their future and well-being. The decision to remove children from their families is particularly complex; yet surprisingly little is understood about this decision-making process. This paper presents the results of a study to develop an instrument to explore, at the caseworker level, the context of the removal decision, with the objective of understanding the influence of the individual and organizational factors on this decision, drawing from the Decision Making Ecology as the underlying rationale for obtaining the measures. The instrument was based on the development of decision-making scales used in prior decision-making studies and administered to child protection caseworkers in several states. Analyses included reliability analyses, principal components analyses, and inter-correlations among the resulting scales. For one scale regarding removal decisions, a principal components analysis resulted in the extraction of two components, jointly identified as caseworkers' decision-making orientation, described as (1) an internal reference to decision-making and (2) an external reference to decision-making. Reliability analyses demonstrated acceptable to high internal consistency for 9 of the 11 scales. Full details of the reliability analyses, principal components analyses, and inter-correlations among the seven scales are discussed, along with implications for practice and the utility of this instrument to support the understanding of decision-making in child welfare. Copyright © 2015 Elsevier Ltd. All rights reserved.
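The reliability analyses the study reports can be illustrated with one statistic commonly used for instruments like this: Cronbach's alpha for the internal consistency of a multi-item scale. The sketch below uses invented item responses, not the study's data.

```python
# Minimal sketch of an internal-consistency check (Cronbach's alpha)
# for a multi-item survey scale. Toy responses, not the study's data.

def cronbach_alpha(items):
    """items: one list of respondent answers per scale item."""
    k = len(items)
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    totals = [sum(resp) for resp in zip(*items)]      # per-respondent sums
    return k / (k - 1) * (1 - sum(var(i) for i in items) / var(totals))

# Three hypothetical items answered by five caseworkers on a 1-5 scale:
scale = [[4, 5, 3, 4, 2],
         [4, 4, 3, 5, 2],
         [5, 5, 2, 4, 3]]
print(round(cronbach_alpha(scale), 3))   # → 0.886
```

Values around 0.7 or above are conventionally read as acceptable internal consistency, matching the "acceptable to high" range the abstract reports for 9 of the 11 scales.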

  9. Prediction of Detailed Enzyme Functions and Identification of Specificity Determining Residues by Random Forests

    PubMed Central

    Nagao, Chioko; Nagano, Nozomi; Mizuguchi, Kenji

    2014-01-01

    Determining enzyme functions is essential for a thorough understanding of cellular processes. Although many prediction methods have been developed, it remains a significant challenge to predict enzyme functions at the fourth-digit level of the Enzyme Commission numbers. Functional specificity of enzymes often changes drastically by mutations of a small number of residues and therefore, information about these critical residues can potentially help discriminate detailed functions. However, because these residues must be identified by mutagenesis experiments, the available information is limited, and the lack of experimentally verified specificity determining residues (SDRs) has hindered the development of detailed function prediction methods and computational identification of SDRs. Here we present a novel method for predicting enzyme functions by random forests, EFPrf, along with a set of putative SDRs, the random forests derived SDRs (rf-SDRs). EFPrf consists of a set of binary predictors for enzymes in each CATH superfamily and the rf-SDRs are the residue positions corresponding to the most highly contributing attributes obtained from each predictor. EFPrf showed a precision of 0.98 and a recall of 0.89 in a cross-validated benchmark assessment. The rf-SDRs included many residues, whose importance for specificity had been validated experimentally. The analysis of the rf-SDRs revealed both a general tendency that functionally diverged superfamilies tend to include more active site residues in their rf-SDRs than in less diverged superfamilies, and superfamily-specific conservation patterns of each functional residue. EFPrf and the rf-SDRs will be an effective tool for annotating enzyme functions and for understanding how enzyme functions have diverged within each superfamily. PMID:24416252
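The idea of scoring alignment positions as candidate specificity determining residues can be sketched as follows. This is not EFPrf itself: mutual information is used here as a simple stand-in for the random-forest attribute contributions the paper derives, and the sequences and function labels are invented.

```python
# Minimal sketch (not EFPrf): rank alignment positions by how strongly the
# residue at each position predicts a toy function label, using mutual
# information as a stand-in for random-forest attribute contributions.
import math
from collections import Counter

def mutual_information(column, labels):
    """I(residue; label) in bits for one alignment column."""
    n = len(column)
    p_xy = Counter(zip(column, labels))
    p_x, p_y = Counter(column), Counter(labels)
    return sum(c / n * math.log2(c * n / (p_x[x] * p_y[y]))
               for (x, y), c in p_xy.items())

# Four aligned sequences, two functional classes; position 1 tracks the label:
seqs = ["ADK", "ADR", "AEK", "AER"]
labels = ["f1", "f1", "f2", "f2"]
scores = [mutual_information([s[i] for s in seqs], labels) for i in range(3)]
print([round(s, 2) for s in scores])   # → [0.0, 1.0, 0.0]
```

Position 0 is invariant and position 2 varies independently of function, so only position 1 scores, mirroring how the rf-SDRs are the positions with the highest attribute contributions.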

  10. Pre-Deployment Handbook: Timor-Leste

    DTIC Science & Technology

    2014-05-01

    events as opposed to the detail. In a community where literacy levels are low, the telling of stories in public is an important part of recording...Tempo Semanal is the main national newspaper. However, low literacy levels make this less effective as a means of sharing information. English... information that will assist in understanding the complex environment that is Timor-Leste. The research and analysis supports a range of contingencies

  11. Morphodynamic data assimilation used to understand changing coasts

    USGS Publications Warehouse

    Plant, Nathaniel G.; Long, Joseph W.

    2015-01-01

    Morphodynamic data assimilation blends observations with model predictions and comes in many forms, including linear regression, Kalman filter, brute-force parameter estimation, variational assimilation, and Bayesian analysis. Importantly, data assimilation can be used to identify sources of prediction errors that lead to improved fundamental understanding. Overall, models incorporating data assimilation yield better information to the people who must make decisions impacting safety and wellbeing in coastal regions that experience hazards due to storms, sea-level rise, and erosion. We present examples of data assimilation associated with morphologic change. We conclude that enough morphodynamic predictive capability is available now to be useful to people, and that we will increase our understanding and the level of detail of our predictions through assimilation of observations and numerical-statistical models.
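One assimilation form listed above, the Kalman filter, can be sketched in scalar form: a persistence (random-walk) forecast of a morphologic quantity such as shoreline position is blended with noisy survey observations. The noise variances and observations below are illustrative, not the authors' data.

```python
# Minimal scalar Kalman filter sketch (illustrative values, not the
# authors' data): blend a persistence forecast with noisy observations.

def kalman_step(x_est, p_est, obs, q=0.5, r=2.0):
    """One predict/update cycle; q = process variance, r = observation variance."""
    x_pred, p_pred = x_est, p_est + q     # predict: persistence (random walk)
    k = p_pred / (p_pred + r)             # Kalman gain
    x_new = x_pred + k * (obs - x_pred)   # update toward the observation
    return x_new, (1 - k) * p_pred

x, p = 100.0, 10.0                        # prior shoreline position (m), variance
for z in [102.0, 101.5, 103.0]:           # successive noisy surveys
    x, p = kalman_step(x, p, z)
print(round(x, 2), round(p, 2))
```

The shrinking error variance `p` is the point made in the abstract: each assimilated observation both corrects the estimate and quantifies how much the prediction error has been reduced.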

  12. Assessing 16-Year-Old Students' Understanding of Aqueous Solution at Submicroscopic Level

    ERIC Educational Resources Information Center

    Devetak, Iztok; Vogrinc, Janez; Glazar, Sasa Aleksij

    2009-01-01

    Submicrorepresentations (SMR) could be an important element, not only for explaining the experimental observations to students, but also in the process of evaluating students' knowledge and identifying their chemical misconceptions. This study investigated the level of students' understanding of the solution concentration and the process of…

  13. The Populus holobiont: dissecting the effects of plant niches and genotype on the microbiome

    DOE PAGES

    Cregger, M. A.; Veach, A. M.; Yang, Z. K.; ...

    2018-02-12

    Microorganisms serve important functions within numerous eukaryotic host organisms. An understanding of the variation in the plant niche-level microbiome, from rhizosphere soils to plant canopies, is imperative to gain a better understanding of how both the structural and functional processes of microbiomes impact the health of the overall plant holobiome. Using Populus trees as a model ecosystem, we characterized the archaeal/bacterial and fungal microbiome across 30 different tissue-level niches within replicated Populus deltoides and hybrid Populus trichocarpa × deltoides individuals using 16S and ITS2 rRNA gene analyses. Our analyses indicate that archaeal/bacterial and fungal microbiomes varied primarily across broader plant habitat classes (leaves, stems, roots, soils) regardless of plant genotype, except for fungal communities within leaf niches, which were greatly impacted by the host genotype. Differences between tree genotypes are evident in the elevated presence of two potential fungal pathogens, Marssonina brunnea and Septoria sp., on hybrid P. trichocarpa × deltoides trees which may in turn be contributing to divergence in overall microbiome composition. Archaeal/bacterial diversity increased from leaves, to stem, to root, and to soil habitats, whereas fungal diversity was the greatest in stems and soils. In conclusion, this study provides a holistic understanding of microbiome structure within a bioenergy relevant plant host, one of the most complete niche-level analyses of any plant. As such, it constitutes a detailed atlas or map for further hypothesis testing on the significance of individual microbial taxa within specific niches and habitats of Populus and a baseline for comparisons to other plant species.
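The habitat-level diversity comparison described above is typically quantified with an index such as Shannon diversity (H′) computed from per-habitat taxon abundance tables. The counts below are hypothetical stand-ins, not the study's 16S/ITS2 data; they merely reproduce the leaf < root < soil ordering reported for archaeal/bacterial diversity.

```python
# Minimal sketch of a per-habitat Shannon diversity comparison.
# Taxon counts are hypothetical, not the study's sequencing data.
import math

def shannon(counts):
    """H' = -sum(p_i * ln p_i) over taxon relative abundances."""
    total = sum(counts)
    return -sum(c / total * math.log(c / total) for c in counts if c)

habitats = {                # hypothetical taxon abundance tables
    "leaf": [90, 5, 5],     # one dominant taxon -> low diversity
    "root": [40, 30, 20, 10],
    "soil": [25, 25, 25, 25],  # even community -> maximal diversity
}
for name in sorted(habitats, key=lambda h: shannon(habitats[h])):
    print(name, round(shannon(habitats[name]), 2))
```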

  14. The Populus holobiont: dissecting the effects of plant niches and genotype on the microbiome

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cregger, M. A.; Veach, A. M.; Yang, Z. K.

    Microorganisms serve important functions within numerous eukaryotic host organisms. An understanding of the variation in the plant niche-level microbiome, from rhizosphere soils to plant canopies, is imperative to gain a better understanding of how both the structural and functional processes of microbiomes impact the health of the overall plant holobiome. Using Populus trees as a model ecosystem, we characterized the archaeal/bacterial and fungal microbiome across 30 different tissue-level niches within replicated Populus deltoides and hybrid Populus trichocarpa × deltoides individuals using 16S and ITS2 rRNA gene analyses. Our analyses indicate that archaeal/bacterial and fungal microbiomes varied primarily across broader plant habitat classes (leaves, stems, roots, soils) regardless of plant genotype, except for fungal communities within leaf niches, which were greatly impacted by the host genotype. Differences between tree genotypes are evident in the elevated presence of two potential fungal pathogens, Marssonina brunnea and Septoria sp., on hybrid P. trichocarpa × deltoides trees which may in turn be contributing to divergence in overall microbiome composition. Archaeal/bacterial diversity increased from leaves, to stem, to root, and to soil habitats, whereas fungal diversity was the greatest in stems and soils. In conclusion, this study provides a holistic understanding of microbiome structure within a bioenergy relevant plant host, one of the most complete niche-level analyses of any plant. As such, it constitutes a detailed atlas or map for further hypothesis testing on the significance of individual microbial taxa within specific niches and habitats of Populus and a baseline for comparisons to other plant species.

  15. Characterization of impaired processing of neuropeptides in the brains of endoprotease knockout mice.

    PubMed

    Beinfeld, Margery C

    2011-01-01

    With the development of mice in which individual proteolytic enzymes have been inactivated, it has been of great interest to see how loss of these enzymes alters the processing of neuropeptides. In the course of studying changes in the peptide cholecystokinin (CCK) and other neuropeptides in several of these knockout mice, it has become clear that neuropeptide processing is complex and regionally specific. The enzyme responsible for processing in one part of the brain may not be involved in other parts of the brain. It is essential to do a detailed dissection of the brain and analyze peptide levels in many brain regions to fully understand the role of the enzymes. Because loss of these proteases may trigger compensatory mechanisms which involve expression of the neuropeptides being studied or other proteases or accessory proteins, it is also important to examine how loss of an enzyme alters expression of the neuropeptides being studied as well as other proteins thought to be involved in neuropeptide processing. By determining how loss of an enzyme alters the molecular form(s) of the peptide that are made, additional mechanistic information can be obtained. This review will describe established methods to achieve these research goals.

  16. Hematology grants workshop.

    PubMed

    Ferrara, James L M; Schmaier, Alvin H

    2002-01-01

    The process of writing an NIH grant application is complex and difficult. Understanding critical details of the review process is a key to success. In this article the authors analyze the NIH grant application process from the reviewer's perspective. They discuss NIH review criteria and highlight the characteristics of successful grant applications. They also suggest specific strategies to improve applications in terms of timeliness, clarity, focus, and independence and cover the key elements to revising an application that is not funded initially.

  17. Process Design and Economics for the Production of Algal Biomass: Algal Biomass Production in Open Pond Systems and Processing Through Dewatering for Downstream Conversion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davis, Ryan; Markham, Jennifer; Kinchin, Christopher

    2016-02-17

    This report describes in detail a set of aspirational design and process targets to better understand the realistic economic potential for the production of algal biomass for subsequent conversion to biofuels and/or coproducts, based on the use of open pond cultivation systems and a series of dewatering operations to concentrate the biomass up to 20 wt% solids (ash-free dry weight basis).

  18. Preferential Ty1 retromobility in mother cells and nonquiescent stationary phase cells is associated with increased concentrations of total Gag or processed Gag and is inhibited by exposure to a high concentration of calcium.

    PubMed

    Peifer, Andrew C; Maxwell, Patrick H

    2018-03-21

    Retrotransposons are abundant mobile DNA elements in eukaryotic genomes that are more active with age in diverse species. Details of the regulation and consequences of retrotransposon activity during aging remain to be determined. Ty1 retromobility in Saccharomyces cerevisiae is more frequent in mother cells compared to daughter cells, and we found that Ty1 was more mobile in nonquiescent compared to quiescent subpopulations of stationary phase cells. This retromobility asymmetry was absent in mutant strains lacking BRP1 (which have reduced expression of the essential Pma1p plasma membrane proton pump), in strains lacking the mRNA decay gene LSM1, and in cells exposed to a high concentration of calcium. Mother cells had higher levels of Ty1 Gag protein than daughters. The proportion of protease-processed Gag decreased as cells transitioned to stationary phase; processed Gag was the dominant form in nonquiescent cells but was virtually absent from quiescent cells. Treatment with calcium reduced total Gag levels and the proportion of processed Gag, particularly in mother cells. We also found that Ty1 reduced the fitness of proliferating but not stationary phase cells. These findings may be relevant to understanding the regulation and consequences of retrotransposons during aging in other organisms, due to conserved impacts and regulation of retrotransposons.

  19. Preferential Ty1 retromobility in mother cells and nonquiescent stationary phase cells is associated with increased concentrations of total Gag or processed Gag and is inhibited by exposure to a high concentration of calcium

    PubMed Central

    Peifer, Andrew C.

    2018-01-01

    Retrotransposons are abundant mobile DNA elements in eukaryotic genomes that are more active with age in diverse species. Details of the regulation and consequences of retrotransposon activity during aging remain to be determined. Ty1 retromobility in Saccharomyces cerevisiae is more frequent in mother cells compared to daughter cells, and we found that Ty1 was more mobile in nonquiescent compared to quiescent subpopulations of stationary phase cells. This retromobility asymmetry was absent in mutant strains lacking BRP1 (which have reduced expression of the essential Pma1p plasma membrane proton pump), in strains lacking the mRNA decay gene LSM1, and in cells exposed to a high concentration of calcium. Mother cells had higher levels of Ty1 Gag protein than daughters. The proportion of protease-processed Gag decreased as cells transitioned to stationary phase; processed Gag was the dominant form in nonquiescent cells but was virtually absent from quiescent cells. Treatment with calcium reduced total Gag levels and the proportion of processed Gag, particularly in mother cells. We also found that Ty1 reduced the fitness of proliferating but not stationary phase cells. These findings may be relevant to understanding the regulation and consequences of retrotransposons during aging in other organisms, due to conserved impacts and regulation of retrotransposons. PMID:29562219

  20. The Comprehensive AOCMF Classification System: Mandible Fractures-Level 3 Tutorial

    PubMed Central

    Cornelius, Carl-Peter; Audigé, Laurent; Kunz, Christoph; Rudderman, Randal; Buitrago-Téllez, Carlos H.; Frodel, John; Prein, Joachim

    2014-01-01

    This tutorial outlines the details of the AOCMF image-based classification system for fractures of the mandibular arch (i.e. the non-condylar mandible) at the precision level 3. It is the logical expansion of the fracture allocation to topographic mandibular sites outlined in level 2, and is based on three-dimensional (3D) imaging techniques/computed tomography (CT)/cone beam CT). Level 3 allows an anatomical description of the individual conditions of the mandibular arch such as the preinjury dental state and the degree of alveolar atrophy. Trauma sequelae are then addressed: (1) tooth injuries and periodontal trauma, (2) fracture involvement of the alveolar process, (3) the degree of fracture fragmentation in three categories (none, minor, and major), and (4) the presence of bone loss. The grading of fragmentation needs a 3D evaluation of the fracture area, allowing visualization of the outer and inner mandibular cortices. To document these fracture features beyond topography the alphanumeric codes are supplied with distinctive appendices. This level 3 tutorial is accompanied by a brief survey of the peculiarities of the edentulous atrophic mandible. Illustrations and a few case examples serve as instruction and reference to improve the understanding and application of the presented features. PMID:25489389

  1. Understanding information synthesis in oral surgery for the design of systems for clinical information technology.

    PubMed

    Suebnukarn, Siriwan; Chanakarn, Piyawadee; Phisutphatthana, Sirada; Pongpatarat, Kanchala; Wongwaithongdee, Udom; Oupadissakoon, Chanekrid

    2015-12-01

    An understanding of the processes of clinical decision-making is essential for the development of health information technology. In this study we have analysed the acquisition of information during decision-making in oral surgery, and analysed cognitive tasks using a "think-aloud" protocol. We studied the techniques of processing information that were used by novices and experts as they completed 4 oral surgical cases modelled from data obtained from electronic hospital records. We studied 2 phases of an oral surgeon's preoperative practice: the "diagnosis and planning of treatment" and "preparing for a procedure". A framework analysis approach was used to analyse the qualitative data, and a descriptive statistical analysis was made of the quantitative data. The results showed that novice surgeons used hypothetico-deductive reasoning, whereas experts recognised patterns to diagnose and manage patients. Novices provided less detail when they prepared for a procedure. Concepts regarding "signs", "importance", "decisions", and "process" occurred most often during acquisition of information by both novices and experts. Based on these results, we formulated recommendations for the design of clinical information technology that would help to improve the acquisition of clinical information required by oral surgeons at all levels of expertise in their clinical decision-making. Copyright © 2015 The British Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.

  2. Clinical process cost analysis.

    PubMed

    Marrin, C A; Johnson, L C; Beggs, V L; Batalden, P B

    1997-09-01

    New systems of reimbursement are exerting enormous pressure on clinicians and hospitals to reduce costs. Using cheaper supplies or reducing the length of stay may be a satisfactory short-term solution, but the best strategy for long-term success is radical reduction of costs by reengineering the processes of care. However, few clinicians or institutions know the actual costs of medical care; nor do they understand, in detail, the activities involved in the delivery of care. Finally, there is no accepted method for linking the two. Clinical process cost analysis begins with the construction of a detailed flow diagram incorporating each activity in the process of care. The cost of each activity is then calculated, and the two are linked. This technique was applied to Diagnosis Related Group 75 to analyze the real costs of the operative treatment of lung cancer at one institution. Total costs varied between $6,400 and $7,700. The major driver of costs was personnel time, which accounted for 55% of the total. Forty percent of the total cost was incurred in the operating room. The cost of care decreased progressively during hospitalization. Clinical process cost analysis provides detailed information about the costs and processes of care. The insights thus obtained may be used to reduce costs by reengineering the process.
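The costing arithmetic the abstract describes — cost each activity in the flow diagram, then aggregate — can be sketched in a few lines. The activity names, times, and rates below are invented for illustration and are not figures from the study.

```python
# Illustrative sketch of clinical process cost analysis: each activity from
# a process flow diagram is costed (personnel time x hourly rate + supplies)
# and the per-activity costs are aggregated. All numbers are hypothetical.

def activity_cost(minutes, hourly_rate, supplies=0.0):
    """Cost of one activity: personnel time plus supplies."""
    return minutes / 60.0 * hourly_rate + supplies

# One pass through a (hypothetical) surgical care process
process = [
    ("pre-op assessment", activity_cost(45, 80.0)),
    ("operating room",    activity_cost(180, 300.0, supplies=900.0)),
    ("recovery",          activity_cost(120, 60.0)),
]

total = sum(cost for _, cost in process)
# Cost share of each activity, the kind of breakdown the method produces
shares = {name: cost / total for name, cost in process}
```

Linking each activity's cost back to the flow diagram makes the cost drivers visible; in the study this is how personnel time (55% of total) and the operating room (40%) were identified as dominant.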

  3. Applying Model Based Systems Engineering to NASA's Space Communications Networks

    NASA Technical Reports Server (NTRS)

    Bhasin, Kul; Barnes, Patrick; Reinert, Jessica; Golden, Bert

    2013-01-01

System engineering practices for complex systems and networks now require that requirement, architecture, and concept of operations product development teams simultaneously harmonize their activities to provide timely, useful and cost-effective products. When dealing with complex systems of systems, traditional systems engineering methodology quickly falls short of achieving project objectives. This approach is encumbered by the use of a number of disparate hardware and software tools, spreadsheets and documents to grasp the concept of the network design and operation. In the case of NASA's space communication networks, both the networks and their subject matter experts are geographically distributed, so the team is challenged to create a common language and tools to produce its products. Using Model Based Systems Engineering methods and tools allows for a unified representation of the system in a model that supports a high, interrelated level of detail. To date, the Program System Engineering (PSE) team has been able to model each network from its top-level operational activities and system functions down to the atomic level through relational modeling decomposition. These models allow for a better understanding of the relationships between NASA's stakeholders, internal organizations, and impacts to all related entities due to integration and sustainment of existing systems. Understanding the existing systems is essential to an accurate and detailed study of the integration options being considered. In this paper, we identify the challenges the PSE team faced in its quest to unify complex legacy space communications networks and their operational processes. We describe the initial approaches undertaken and the evolution toward model based systems engineering applied to produce Space Communication and Navigation (SCaN) PSE products.
We will demonstrate the practice of Model Based Systems Engineering applied to integrating space communication networks and summarize its results and impact. We will highlight the insights gained by applying Model Based Systems Engineering and provide recommendations for its application and improvement.

  4. Neuroinflammation: the devil is in the details.

    PubMed

    DiSabato, Damon J; Quan, Ning; Godbout, Jonathan P

    2016-10-01

    There is significant interest in understanding inflammatory responses within the brain and spinal cord. Inflammatory responses that are centralized within the brain and spinal cord are generally referred to as 'neuroinflammatory'. Aspects of neuroinflammation vary within the context of disease, injury, infection, or stress. The context, course, and duration of these inflammatory responses are all critical aspects in the understanding of these processes and their corresponding physiological, biochemical, and behavioral consequences. Microglia, innate immune cells of the CNS, play key roles in mediating these neuroinflammatory responses. Because the connotation of neuroinflammation is inherently negative and maladaptive, the majority of research focus is on the pathological aspects of neuroinflammation. There are, however, several degrees of neuroinflammatory responses, some of which are positive. In many circumstances including CNS injury, there is a balance of inflammatory and intrinsic repair processes that influences functional recovery. In addition, there are several other examples where communication between the brain and immune system involves neuroinflammatory processes that are beneficial and adaptive. The purpose of this review is to distinguish different variations of neuroinflammation in a context-specific manner and detail both positive and negative aspects of neuroinflammatory processes. In this review, we will use brain and spinal cord injury, stress, aging, and other inflammatory events to illustrate the potential harm and benefits inherent to neuroinflammation. Context, course, and duration of the inflammation are highly important to the interpretation of these events, and we aim to provide insight into this by detailing several commonly studied insults. This article is part of the 60th anniversary supplemental issue. © 2016 International Society for Neurochemistry.

  5. Let Them Eat Promises

    ERIC Educational Resources Information Center

    Duggan, Thomas

    1972-01-01

    Article suggests books and films, as well as newspapers, which senior high-school students who will be voting in 1972 for the first time can study to broaden their understanding of the political parties and the electoral process. School and class activities relevant to the presidential election are also detailed. (PD)

  6. Mechanism of Phosphine Dissociation on the Si(001) Surface

    NASA Astrophysics Data System (ADS)

    Warschkow, Oliver; Schofield, Steven R.; Smith, Phil V.

    2005-03-01

    The continued down-scaling of electronic devices to the atomic scale increasingly requires an atomic-level understanding of the elementary processes of semiconductor doping. We present a combined experimental and theoretical investigation into the dissociation mechanism of phosphine (PH3) on the Si(001) surface. As reported by us elsewhere in this conference, a number of prominent intermediate species of PH3 dissociation observed in STM experiments have been structurally characterized as PH2+H, PH+2H and P+3H species respectively. In this poster we present detailed quantum chemical calculations of these and other short-lived intermediates as well as the transition (kinetic) barriers between them. This leads us to formulate a step-by-step mechanism for the complete dissociation of PH3 on the Si(001) surface.

  7. Multiscale mechanobiology: computational models for integrating molecules to multicellular systems

    PubMed Central

    Mak, Michael; Kim, Taeyoon

    2015-01-01

    Mechanical signals exist throughout the biological landscape. Across all scales, these signals, in the form of force, stiffness, and deformations, are generated and processed, resulting in an active mechanobiological circuit that controls many fundamental aspects of life, from protein unfolding and cytoskeletal remodeling to collective cell motions. The multiple scales and complex feedback involved present a challenge for fully understanding the nature of this circuit, particularly in development and disease in which it has been implicated. Computational models that accurately predict and are based on experimental data enable a means to integrate basic principles and explore fine details of mechanosensing and mechanotransduction in and across all levels of biological systems. Here we review recent advances in these models along with supporting and emerging experimental findings. PMID:26019013

  8. Beyond the checklist: assessing understanding for HIV vaccine trial participation in South Africa.

    PubMed

    Lindegger, Graham; Milford, Cecilia; Slack, Catherine; Quayle, Michael; Xaba, Xolani; Vardas, Eftyhia

    2006-12-15

Informed consent and understanding are essential ethical requirements for clinical trial participation. Traditional binary measures of understanding may be limited and may not be the best measures of level of understanding. This study designed and compared 4 measures of understanding for potential participants being prepared for enrollment in South African HIV vaccine trials, using detailed operational scoring criteria. Assessment of understanding of 7 key trial components was compared via self-report, checklist, vignettes, and narrative measures. Fifty-nine participants, including members of vaccine preparedness groups and 1 HIV vaccine trial, took part. There were significant differences across the measures for understanding of 5 components and for overall understanding. Highest scores were obtained on self-report and checklist measures, and lowest scores were obtained for vignettes and narrative descriptions. The findings suggest that levels of measured understanding are dependent on the tools used. Forced-choice measures like checklists tend to yield higher scores than open-ended measures like narratives or vignettes. Consideration should be given to complementing checklists and self-reports with open-ended measures, particularly for critical trial concepts, where the consequences of misunderstanding are potentially severe.

  9. Expert Witness or Advocate: Developing Oral Argument Skills in the Marine Science Student.

    ERIC Educational Resources Information Center

    Evans, James; And Others

    1992-01-01

    This article details an undergraduate/graduate-level, experimental, marine science education course utilizing interactive student participation concerning the relationship between individual property interests and coastal protection legislation in the aftermath of hurricane Hugo in 1989. Students successfully gained an understanding of the…

  10. Extended principal component analysis - a useful tool to understand processes governing water quality at catchment scales

    NASA Astrophysics Data System (ADS)

    Selle, B.; Schwientek, M.

    2012-04-01

Water quality of ground and surface waters in catchments is typically driven by many complex and interacting processes. While small scale processes are often studied in great detail, their relevance and interplay at catchment scales often remain poorly understood. For many catchments, extensive monitoring data on water quality have been collected for different purposes. These heterogeneous data sets contain valuable information on catchment scale processes but are rarely analysed using integrated methods. Principal component analysis (PCA) has previously been applied to this kind of data set. However, a detailed analysis of scores, which are an important result of a PCA, is often missing. Mathematically, PCA expresses measured water quality variables, e.g. nitrate concentrations, as linear combinations of independent, not directly observable key processes. These computed key processes are represented by principal components. Their scores are interpretable as process intensities which vary in space and time. Subsequently, scores can be correlated with other key variables and catchment characteristics, such as water travel times and land use, that were not considered in the PCA. This detailed analysis of scores represents an extension of the commonly applied PCA which could considerably improve the understanding of processes governing water quality at catchment scales. In this study, we investigated the 170 km² Ammer catchment in SW Germany which is characterised by an above average proportion of agricultural (71%) and urban (17%) areas. The Ammer River is mainly fed by karstic springs. For PCA, we separately analysed concentrations from (a) surface waters of the Ammer River and its tributaries, (b) spring waters from the main aquifers and (c) deep groundwater from production wells. This analysis was extended by a detailed analysis of scores. We analysed measured concentrations of major ions and selected organic micropollutants.
Additionally, redox-sensitive variables and environmental tracers indicating groundwater age were analysed for deep groundwater from production wells. For deep groundwater, we found that microbial turnover was more strongly influenced by the local availability of energy sources than by travel times of groundwater to the wells. Groundwater quality primarily reflected the input of pollutants determined by land use, e.g. agrochemicals. We concluded that for water quality in the Ammer catchment, conservative mixing of waters of different origin is more important than reactive transport processes along the flow path.
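The extension described above — running PCA on water-quality variables and then correlating the component scores with a catchment characteristic that was not part of the analysis — can be sketched with synthetic data. All variable names and numbers below are invented for illustration.

```python
# Sketch of "extended PCA": PCA on synthetic water-quality data, then
# correlation of component scores with an external catchment variable
# (here a made-up agricultural land-use fraction).
import numpy as np

rng = np.random.default_rng(0)
n = 100
agri = rng.uniform(0, 1, n)                 # external variable: agricultural fraction
nitrate = 3.0 * agri + rng.normal(0, 0.1, n)   # driven by the same "key process"
chloride = 2.0 * agri + rng.normal(0, 0.1, n)
doc = rng.normal(0, 1, n)                   # unrelated variable

X = np.column_stack([nitrate, chloride, doc])
Xc = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize each variable

# PCA via singular value decomposition of the standardized data matrix
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U * s                              # principal component scores per sample

# The extension: correlate PC1 scores with land use not used in the PCA
r = np.corrcoef(scores[:, 0], agri)[0, 1]
```

Because nitrate and chloride are both driven by the synthetic land-use variable, the first component's scores correlate strongly with it (sign is arbitrary), which is exactly the kind of process interpretation the abstract advocates.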

  11. The Morphology and Uniformity of Circumstellar OH/H2O Masers around OH/IR Stars

    NASA Astrophysics Data System (ADS)

    Felli, Derek Sean

Even though low mass stars (< 8 solar masses) are far more numerous, the more massive stars drive the chemical evolution of galaxies from which the next generation of stars and planets can form. Understanding mass loss of asymptotic giant branch stars contributes to our understanding of the chemical evolution of the galaxy, stellar populations, and star formation history. Stars with mass < 8 solar masses shed their envelopes to form planetary nebulae, while stars with mass > 8 solar masses go supernova. In both cases, these stars enrich their environments with elements heavier than hydrogen and helium. While some general information about how stars die and form planetary nebulae is known, specific details are missing due to a lack of high-resolution observations and analysis of the intermediate stages. For example, we know that mass loss in stars creates morphologically diverse planetary nebulae, but we do not know the uniformity of these processes, and therefore lack detailed models to better predict how spherically symmetric stars form asymmetric nebulae. We have selected a specific group of late-stage stars and observed them at different scales to reveal the uniformity of mass loss through different layers close to the star. This includes observing nearby masers that trace the molecular shell structure around these stars. This study revealed detailed structure that was analyzed for uniformity to place constraints on how the mass loss processes behave in models. These results will feed into our ability to create more detailed models to better predict the chemical evolution of the next generation of stars and planets.

  12. Conscious Vision Proceeds from Global to Local Content in Goal-Directed Tasks and Spontaneous Vision.

    PubMed

    Campana, Florence; Rebollo, Ignacio; Urai, Anne; Wyart, Valentin; Tallon-Baudry, Catherine

    2016-05-11

    The reverse hierarchy theory (Hochstein and Ahissar, 2002) makes strong, but so far untested, predictions on conscious vision. In this theory, local details encoded in lower-order visual areas are unconsciously processed before being automatically and rapidly combined into global information in higher-order visual areas, where conscious percepts emerge. Contingent on current goals, local details can afterward be consciously retrieved. This model therefore predicts that (1) global information is perceived faster than local details, (2) global information is computed regardless of task demands during early visual processing, and (3) spontaneous vision is dominated by global percepts. We designed novel textured stimuli that are, as opposed to the classic Navon's letters, truly hierarchical (i.e., where global information is solely defined by local information but where local and global orientations can still be manipulated separately). In line with the predictions, observers were systematically faster reporting global than local properties of those stimuli. Second, global information could be decoded from magneto-encephalographic data during early visual processing regardless of task demands. Last, spontaneous subjective reports were dominated by global information and the frequency and speed of spontaneous global perception correlated with the accuracy and speed in the global task. No such correlation was observed for local information. We therefore show that information at different levels of the visual hierarchy is not equally likely to become conscious; rather, conscious percepts emerge preferentially at a global level. We further show that spontaneous reports can be reliable and are tightly linked to objective performance at the global level. Is information encoded at different levels of the visual system (local details in low-level areas vs global shapes in high-level areas) equally likely to become conscious? 
We designed new hierarchical stimuli and provide the first empirical evidence based on behavioral and MEG data that global information encoded at high levels of the visual hierarchy dominates perception. This result held both in the presence and in the absence of task demands. The preferential emergence of percepts at high levels can account for two properties of conscious vision, namely, the dominance of global percepts and the feeling of visual richness reported independently of the perception of local details. Copyright © 2016 the authors.

  13. Rational Design of Molecular Ferroelectric Materials and Nanostructures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ducharme, Stephen

    2012-09-25

The purpose of this project was to gain insight into the properties of molecular ferroelectrics through the detailed study of oligomer analogs of polyvinylidene fluoride (PVDF). By focusing on interactions at both the molecular level and the nanoscale level, we expect to gain improved understanding about the fundamental mechanism of ferroelectricity and its key properties. The research consisted of three complementary components: 1) rational synthesis of VDF oligomers by Prof. Takacs' group; 2) detailed structural and electrical studies of thin films by Prof. Ducharme's group; and 3) first-principles computational studies by DOE Lab Partner Dr. Serge Nakhmanson at Argonne National Laboratory. The main result of the work was a detailed understanding of the relationships between the molecular interactions and macroscopic phenomenology of ferroelectricity in VDF oligomers. This is valuable information supporting the development of improved electromechanical materials for, e.g., sonar, ultrasonic imaging, artificial muscles, and compliant actuators. Other potential applications include nonvolatile ferroelectric memories, heat-sensing imaging arrays, photovoltaic devices, and functional biomimetic materials. The project contributed to the training and professional development of undergraduate students, graduate students, post-doctoral assistants, and a high-school teacher. Project personnel took part in several outreach and education activities each year.

  14. Abnormalities of Object Visual Processing in Body Dysmorphic Disorder

    PubMed Central

    Feusner, Jamie D.; Hembacher, Emily; Moller, Hayley; Moody, Teena D.

    2013-01-01

    Background Individuals with body dysmorphic disorder may have perceptual distortions for their appearance. Previous studies suggest imbalances in detailed relative to configural/holistic visual processing when viewing faces. No study has investigated the neural correlates of processing non-symptom-related stimuli. The objective of this study was to determine whether individuals with body dysmorphic disorder have abnormal patterns of brain activation when viewing non-face/non-body object stimuli. Methods Fourteen medication-free participants with DSM-IV body dysmorphic disorder and 14 healthy controls participated. We performed functional magnetic resonance imaging while participants matched photographs of houses that were unaltered, contained only high spatial frequency (high detail) information, or only low spatial frequency (low detail) information. The primary outcome was group differences in blood oxygen level-dependent signal changes. Results The body dysmorphic disorder group showed lesser activity in the parahippocampal gyrus, lingual gyrus, and precuneus for low spatial frequency images. There were greater activations in medial prefrontal regions for high spatial frequency images, although no significant differences when compared to a low-level baseline. Greater symptom severity was associated with lesser activity in dorsal occipital cortex and ventrolateral prefrontal cortex for normal and high spatial frequency images. Conclusions Individuals with body dysmorphic disorder have abnormal brain activation patterns when viewing objects. Hypoactivity in visual association areas for configural and holistic (low detail) elements and abnormal allocation of prefrontal systems for details is consistent with a model of imbalances in global vs. local processing. This may occur not only for appearance but also for general stimuli unrelated to their symptoms. PMID:21557897

  15. Quasi-stationary fluid theory of the hole-boring process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pei, Zhikun; Shen, Baifei, E-mail: bfshen@mail.shcnc.ac.cn; Shi, Yin

We present a quasi-stationary fluid theory to precisely describe the hole-boring process. The corresponding distributions of the electrostatic field and the particle density are obtained theoretically, giving more detail than the previous stationary theory. The theoretical result is confirmed by one-dimensional particle-in-cell simulations. Such a quasi-stationary fluid theory may help in understanding the basic mechanisms of ion acceleration in the radiation pressure acceleration regime.

  16. An image-processing method to detect sub-optical features based on understanding noise in intensity measurements.

    PubMed

    Bhatia, Tripta

    2018-07-01

Accurate quantitative analysis of image data requires that we distinguish between fluorescence intensity (true signal) and the noise inherent to its measurement to the extent possible. We image multilamellar membrane tubes and beads that grow from defects in the fluid lamellar phase of the lipid 1,2-dioleoyl-sn-glycero-3-phosphocholine dissolved in water and water-glycerol mixtures by using a fluorescence confocal polarizing microscope. We quantify image noise and determine the noise statistics. Understanding the nature of image noise also helps in optimizing image processing to detect sub-optical features, which would otherwise remain hidden. We use an image-processing technique, "optimum smoothening", to improve the signal-to-noise ratio (SNR) of features of interest without smearing their structural details. A high SNR provides the positional accuracy with which it is possible to resolve features of interest with width below the optical resolution. Using optimum smoothening, the smallest and largest core diameters detected are of width [Formula: see text] and [Formula: see text] nm, respectively. The image-processing and analysis techniques and the noise modeling discussed in this paper can be used for detailed morphological analysis of features down to sub-optical length scales obtained by any kind of fluorescence intensity imaging in the raster mode.
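The underlying principle — averaging uncorrelated noise over a k-sample window reduces its standard deviation by roughly sqrt(k), raising the SNR — can be illustrated with a simple moving average on synthetic data. This is a generic sketch, not the paper's "optimum smoothening" algorithm.

```python
# Smoothing uncorrelated noise raises SNR: synthetic 1-D intensity profile
# with a bright feature plus Gaussian noise, smoothed with a moving average.
import numpy as np

rng = np.random.default_rng(1)
signal = np.zeros(1000)
signal[400:600] = 10.0                    # a bright feature on dark background
noisy = signal + rng.normal(0, 2.0, signal.size)

k = 9                                     # moving-average window size
kernel = np.ones(k) / k
smoothed = np.convolve(noisy, kernel, mode="same")

def snr(img):
    # Crude SNR estimate: feature amplitude over background noise std
    return img[450:550].mean() / img[:300].std()
```

With a 9-pixel window the background noise standard deviation drops by about a factor of 3, so `snr(smoothed)` comes out several times larger than `snr(noisy)` while the plateau of the feature is preserved; the trade-off, as the abstract notes, is that too much smoothing would smear structural details.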

  17. Sustainable Energy Solutions Task 3.0:Life-Cycle Database for Wind Energy Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Twomey, Janet M.

    2010-03-01

The benefits of wind energy had previously been captured in the literature at an overview level with relatively low transparency or ability to understand the basis for that information. This has limited improvement and decision-making to larger questions such as wind versus other electrical sources (such as coal-fired plants). This research project has established a substantially different approach, which is to add modular, high-granularity life cycle inventory (LCI) information that can be used by a wide range of decision-makers seeking environmental improvement. Results from this project have expanded the understanding and evaluation of the underlying factors that can improve both manufacturing processes and specifically wind generators. The use of life cycle inventory techniques has provided a uniform framework to understand and compare the full range of environmental improvement in manufacturing, hence the concept of green manufacturing. In this project, the focus is on 1. the manufacturing steps that transform materials and chemicals into functioning products, and 2. the supply chain and end-of-life influences of materials and chemicals used in industry. Results have been applied to wind generators, but also impact the larger U.S. product manufacturing base. For chemicals and materials, this project has provided a standard format for each LCI that contains an overview and description, a process flow diagram, detailed mass balances, detailed energy of unit processes, and an executive summary. This is suitable for integration into other life cycle databases (such as that at NREL), so that broad use can be achieved. The use of representative processes allows unrestricted use of project results. With the framework refined in this project, information gathering was initiated for chemicals and materials in wind generation.
Since manufacturing is one of the most significant parts of the environmental domain for wind generation improvement, this project research has developed a fundamental approach. The emphasis was placed on individual unit processes as an organizing framework to understand the life cycle of manufactured products. The rearrangement of unit processes provides an efficient and versatile means of understanding improved manufactured products such as wind generators. The taxonomy and structure of unit process LCI were developed in this project. A series of ten unit process LCIs were developed to sample the major segments of the manufacturing unit process taxonomy. Technical and economic effectiveness has been a focus of the project research in Task 3. The use of repeatable modules for the organization of information on environmental improvement has a long-term impact. The information developed can be used and reused in a variety of manufacturing plants and for a range of wind generator sizes and designs. Such a modular approach will lower the cost of life cycle analysis, which is often called on to answer questions of carbon footprint, environmental impact, and sustainability. The use of a website for dissemination, linked to NREL, adds to the economic benefit as more users have access to the LCI information. Benefit to the public has been achieved by a well-attended WSU conference, as well as presentations for the Kansas Wind Energy Commission. Attendees represented public interests, land owners, wind farm developers, those interested in green jobs, and industry. Another benefit to the public is the start of information flow from manufacturers that can inform individuals about products.
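The modular unit-process idea can be sketched as follows: each unit process carries an inventory per unit of throughput, and a product's inventory is the throughput-weighted sum along its process chain. All process names and figures below are invented for illustration and are not data from the project.

```python
# Hypothetical modular LCI aggregation: per-kg inventories for each unit
# process are combined along a product's process chain. Numbers are invented.
unit_lci = {
    "steel_rolling": {"energy_MJ": 2.1, "co2_kg": 0.15},
    "machining":     {"energy_MJ": 5.0, "co2_kg": 0.40},
    "coating":       {"energy_MJ": 1.2, "co2_kg": 0.08},
}

# Process chain for one (hypothetical) component: (unit process, kg processed)
chain = [("steel_rolling", 1000.0), ("machining", 1000.0), ("coating", 200.0)]

def aggregate(chain, lci):
    """Sum each inventory flow over the chain, weighted by throughput."""
    totals = {}
    for proc, mass in chain:
        for flow, per_kg in lci[proc].items():
            totals[flow] = totals.get(flow, 0.0) + per_kg * mass
    return totals

inventory = aggregate(chain, unit_lci)
```

Because each unit process is a self-contained module, the same entries can be rearranged to model a different generator size or design without re-deriving the inventory, which is the reuse benefit the abstract describes.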

  18. Redox Signaling Mechanisms in Nervous System Development.

    PubMed

    Olguín-Albuerne, Mauricio; Morán, Julio

    2018-06-20

    Numerous studies have demonstrated the actions of reactive oxygen species (ROS) as regulators of several physiological processes. In this study, we discuss how redox signaling mechanisms operate to control different processes such as neuronal differentiation, oligodendrocyte differentiation, dendritic growth, and axonal growth. Recent Advances: Redox homeostasis regulates the physiology of neural stem cells (NSCs). Notably, the neuronal differentiation process of NSCs is determined by a change toward oxidative metabolism, increased levels of mitochondrial ROS, increased activity of NADPH oxidase (NOX) enzymes, decreased levels of Nrf2, and differential regulation of different redoxins. Furthermore, during the neuronal maturation processes, NOX and MICAL produce ROS to regulate cytoskeletal dynamics, which control the dendritic and axonal growth, as well as the axonal guidance. The redox homeostasis changes are, in part, attributed to cell metabolism and compartmentalized production of ROS, which is regulated, sensed, and transduced by different molecules such as thioredoxins, glutaredoxins, peroxiredoxins, and nucleoredoxin to control different signaling pathways in different subcellular regions. The study of how these elements cooperatively act is essential for the understanding of nervous system development, as well as the application of regenerative therapies that recapitulate these processes. The information about these topics in the last two decades leads us to the conclusion that the role of ROS signaling in development of the nervous system is more important than it was previously believed and makes clear the importance of exploring in more detail the mechanisms of redox signaling. Antioxid. Redox Signal. 28, 1603-1625.

  19. Evaluation of the FIR Example using Xilinx Vivado High-Level Synthesis Compiler

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jin, Zheming; Finkel, Hal; Yoshii, Kazutomo

Compared to central processing units (CPUs) and graphics processing units (GPUs), field programmable gate arrays (FPGAs) have major advantages in reconfigurability and performance achieved per watt. The FPGA development flow has been augmented with a high-level synthesis (HLS) flow that can convert programs written in a high-level programming language to a Hardware Description Language (HDL). Using high-level programming languages such as C, C++, and OpenCL for FPGA-based development could allow software developers, who have little FPGA knowledge, to take advantage of FPGA-based application acceleration. This improves developer productivity and makes FPGA-based acceleration accessible to hardware and software developers. The Xilinx Vivado HLS compiler is a high-level synthesis tool that enables C, C++ and SystemC specifications to be targeted directly to Xilinx FPGAs without the need to create RTL manually. The white paper [1] published recently by Xilinx uses a finite impulse response (FIR) example to demonstrate the variable-precision features in the Vivado HLS compiler and the resource and power benefits of converting a design from floating point to fixed point. To get a better understanding of the variable-precision features in terms of resource usage and performance, this report presents the experimental results of evaluating the FIR example using Vivado HLS 2017.1 and a Kintex UltraScale FPGA. In addition, we evaluated the half-precision floating-point data type against the double-precision and single-precision data types and present the detailed results.
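The floating- vs. fixed-point trade-off the white paper demonstrates can be previewed in software before synthesis. The sketch below quantizes the taps of an arbitrary FIR filter (illustrative coefficients, not Xilinx's design) to Q15 fixed point and measures the resulting output error.

```python
# Float vs. Q15 fixed-point FIR: quantizing coefficients to 16-bit signed
# fixed point (15 fractional bits) saves hardware at a small accuracy cost.
import numpy as np

coeffs = np.array([0.05, 0.2, 0.5, 0.2, 0.05])   # illustrative low-pass taps
x = np.sin(2 * np.pi * 0.01 * np.arange(200))    # test input signal

def fir(x, h):
    """Direct-form FIR: output truncated to the input length."""
    return np.convolve(x, h, mode="full")[: len(x)]

# Q15 quantization of the coefficients
q15 = np.round(coeffs * 2**15).astype(np.int64)

y_float = fir(x, coeffs)
y_fixed = fir(x, q15) / 2**15                    # rescale back to float

err = np.max(np.abs(y_float - y_fixed))          # worst-case quantization error
```

With five taps and a unit-amplitude input, the worst-case error is bounded by the per-tap rounding error (at most 2^-16 each) summed over the taps, so it stays far below the signal level; this is the kind of precision/resource analysis the variable-precision HLS types automate.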

  20. Designing for Mathematical Abstraction

    ERIC Educational Resources Information Center

    Pratt, Dave; Noss, Richard

    2010-01-01

    Our focus is on the design of systems (pedagogical, technical, social) that encourage mathematical abstraction, a process we refer to as "designing for abstraction." In this paper, we draw on detailed design experiments from our research on children's understanding about chance and distribution to re-present this work as a case study in designing…

  1. Better Beginnings through Nurturing Touch

    ERIC Educational Resources Information Center

    Storm, Linda; Reese, Suzanne P.

    2005-01-01

    The authors of this article describe how infant massage can promote attachment and greater attunement between very young children and their parents. Infant massage instructors teach parents how to understand babies' states of arousal so they can read and respond appropriately to their cues. The authors detail the process of teaching infant…

  2. Bringing a Global Perspective to Economics. Field Test Edition.

    ERIC Educational Resources Information Center

    Woyach, Robert B.; And Others

    Eight lessons on integrated global economics provide detailed instructional materials on world food and energy systems, international cartels, and the nature and process of foreign investments. The materials are designed to help high school social studies teachers develop student understanding of key economic systems and activities and reinforce…

  3. FORMATION OF POLYCYCLIC AROMATIC HYDROCARBONS AND THEIR GROWTH TO SOOT -A REVIEW OF CHEMICAL REACTION PATHWAYS. (R824970)

    EPA Science Inventory

    The generation by combustion processes of airborne species of current health concern such as polycyclic aromatic hydrocarbons (PAH) and soot particles necessitates a detailed understanding of chemical reaction pathways responsible for their formation. The present review discus...

  4. Evaluation of stabilization techniques for ion implant processing

    NASA Astrophysics Data System (ADS)

    Ross, Matthew F.; Wong, Selmer S.; Minter, Jason P.; Marlowe, Trey; Narcy, Mark E.; Livesay, William R.

    1999-06-01

    With the integration of high-current ion implant processing into volume CMOS manufacturing, the need for photoresist stabilization to achieve a stable ion implant process is critical. This study compares electron beam stabilization, a non-thermal process, with more traditional thermal stabilization techniques such as hot plate baking and vacuum oven processing. The electron beam processing is carried out in a flood exposure system with no active heating of the wafer. These stabilization techniques are applied to typical ion implant processes that might be found in a CMOS production process flow. The stabilization processes are applied to a 1.1-micrometer-thick PFI-38A i-line photoresist film prior to ion implant processing. Post-stabilization CD variation is detailed with respect to wall slope and feature integrity. SEM photographs detail the effects of the stabilization technique on photoresist features. The thermal stability of the photoresist is shown for different levels of stabilization and post-stabilization thermal cycling. Thermal flow stability of the photoresist is detailed via SEM photographs. A significant improvement in thermal stability is achieved with the electron beam process, such that photoresist features are stable to temperatures in excess of 200 degrees C. Ion implant processing parameters are evaluated and compared for the different stabilization methods. Ion implant system end-station chamber pressure is detailed as a function of ion implant process and stabilization condition. The ion implant process conditions are detailed for varying factors such as ion current, energy, and total dose. A reduction in the ion implant system's end-station chamber pressure is achieved with the electron beam stabilization process over the other techniques considered. This reduction in end-station chamber pressure is shown to provide a reduction in total process time for a given ion implant dose. Improvements in the ion implant process are detailed across several combinations of current and energy.

  5. Glucose and Fructose to Platform Chemicals: Understanding the Thermodynamic Landscapes of Acid-Catalysed Reactions Using High-Level ab Initio Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Assary, Rajeev S.; Kim, Taijin; Low, John

    Molecular level understanding of acid-catalysed conversion of sugar molecules to platform chemicals such as hydroxy-methyl furfural (HMF), furfuryl alcohol (FAL), and levulinic acid (LA) is essential for efficient biomass conversion. In this paper, the high-level G4MP2 method along with the SMD solvation model is employed to understand detailed reaction energetics of the acid-catalysed decomposition of glucose and fructose to HMF. Based on protonation free energies of various hydroxyl groups of the sugar molecule, the relative reactivity of gluco-pyranose, fructo-pyranose and fructo-furanose are predicted. Calculations suggest that, in addition to the protonated intermediates, a solvent assisted dehydration of one of the fructo-furanosyl intermediates is a competing mechanism, indicating the possibility of multiple reaction pathways for fructose to HMF conversion in aqueous acidic medium. Two reaction pathways were explored to understand the thermodynamics of glucose to HMF; the first one is initiated by the protonation of a C2–OH group and the second one through an enolate intermediate involving acyclic intermediates. Additionally, a pathway is proposed for the formation of furfuryl alcohol from glucose initiated by the protonation of a C2–OH position, which includes a C–C bond cleavage, and the formation of formic acid. The detailed free energy landscapes predicted in this study can be used as benchmarks for further exploring the sugar decomposition reactions, prediction of possible intermediates, and finally designing improved catalysts for biomass conversion chemistry in the future.

  6. Glucose and fructose to platform chemicals: understanding the thermodynamic landscapes of acid-catalysed reactions using high-level ab initio methods.

    PubMed

    Assary, Rajeev S; Kim, Taejin; Low, John J; Greeley, Jeff; Curtiss, Larry A

    2012-12-28

    Molecular level understanding of acid-catalysed conversion of sugar molecules to platform chemicals such as hydroxy-methyl furfural (HMF), furfuryl alcohol (FAL), and levulinic acid (LA) is essential for efficient biomass conversion. In this paper, the high-level G4MP2 method along with the SMD solvation model is employed to understand detailed reaction energetics of the acid-catalysed decomposition of glucose and fructose to HMF. Based on protonation free energies of various hydroxyl groups of the sugar molecule, the relative reactivity of gluco-pyranose, fructo-pyranose and fructo-furanose are predicted. Calculations suggest that, in addition to the protonated intermediates, a solvent assisted dehydration of one of the fructo-furanosyl intermediates is a competing mechanism, indicating the possibility of multiple reaction pathways for fructose to HMF conversion in aqueous acidic medium. Two reaction pathways were explored to understand the thermodynamics of glucose to HMF; the first one is initiated by the protonation of a C2-OH group and the second one through an enolate intermediate involving acyclic intermediates. Additionally, a pathway is proposed for the formation of furfuryl alcohol from glucose initiated by the protonation of a C2-OH position, which includes a C-C bond cleavage, and the formation of formic acid. The detailed free energy landscapes predicted in this study can be used as benchmarks for further exploring the sugar decomposition reactions, prediction of possible intermediates, and finally designing improved catalysts for biomass conversion chemistry in the future.
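    The basic link between a computed free-energy landscape and predicted relative reactivity is Boltzmann weighting of the competing species or pathways. A minimal sketch, with invented free-energy differences (kcal/mol) rather than values from the paper:

```python
import math

# Boltzmann fractions of competing intermediates from their relative free
# energies. The delta-G values below are illustrative, not from the study.
R_KCAL = 0.0019872          # gas constant, kcal/(mol*K)
T = 298.15                  # temperature, K

def boltzmann_fractions(delta_g):
    """Return the equilibrium population of each species given its
    free energy (kcal/mol) relative to the most stable one."""
    w = [math.exp(-g / (R_KCAL * T)) for g in delta_g]
    total = sum(w)
    return [wi / total for wi in w]

# Two hypothetical pathways whose key intermediates differ by 1.0 kcal/mol:
f = boltzmann_fractions([0.0, 1.0])
```

    Even a ~1 kcal/mol difference shifts the population noticeably at room temperature, which is why accurate high-level energetics matter for predicting which pathway dominates.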

  7. Consciousness weaves our internal view of the outside world.

    PubMed

    Gur, Moshe

    2016-01-01

    Low-level consciousness is fundamental to our understanding of the world. Within the conscious field, the constantly changing external visual information is transformed into stable, object-based percepts. Remarkably, holistic objects are perceived while we are cognizant of all of the spatial details comprising the objects and of the relationship between individual elements. This parallel conscious association is unique to the brain. Conscious contributions to motor activity come after our understanding of the world has been established.

  8. Sugar - hormone crosstalk in seed development: Two redundant pathways of IAA biosynthesis are regulated differentially in the invertase-deficient miniature1 (mn1) seed mutant in maize

    USDA-ARS?s Scientific Manuscript database

    The miniature1 (mn1) seed phenotype is a loss-of-function mutation at the Mn1 locus that encodes a cell wall invertase; its deficiency leads to pleiotropic changes including altered sugar levels and decreased levels of IAA throughout seed development. To understand the molecular details of such suga...

  9. Ocean Basin Impact of Ambient Noise on Marine Mammal Detectability, Distribution, and Acoustic Communication - YIP

    DTIC Science & Technology

    2013-09-30

    soundscape into frequency categories and sound level percentiles allowed for detailed examination of the acoustic environment that would not have been...patterns and trends across sound level parameters and frequency at a single location, it is recommended that the soundscape of any region be...joined to better understand the contribution and variation in distant shipping noise to local soundscapes (Ainslie & Miksis-Olds, 2013) REFERENCES

  10. Size-Dependent Regulation of Intracellular Trafficking of Polystyrene Nanoparticle-Based Drug-Delivery Systems.

    PubMed

    Wang, Ting; Wang, Lu; Li, Xiaoming; Hu, Xingjie; Han, Yuping; Luo, Yao; Wang, Zejun; Li, Qian; Aldalbahi, Ali; Wang, Lihua; Song, Shiping; Fan, Chunhai; Zhao, Yun; Wang, Maolin; Chen, Nan

    2017-06-07

    Nanoparticles (NPs) have shown great promise as intracellular imaging probes or nanocarriers and are increasingly being used in biomedical applications. A detailed understanding of how NPs get "in and out" of cells is important for developing new nanomaterials with improved selectivity and less cytotoxicity. Both physical and chemical characteristics have been proven to regulate the cellular uptake of NPs. However, the exocytosis process and its regulation are less explored. Herein, we investigated the size-regulated endocytosis and exocytosis of carboxylated polystyrene (PS) NPs. PS NPs with a smaller size were endocytosed mainly through the clathrin-dependent pathway, whereas PS NPs with a larger size preferred caveolae-mediated endocytosis. Furthermore, our results revealed exocytosis of larger PS NPs and tracked the dynamic process at the single-particle level. These results indicate that particle size is a key factor for the regulation of intracellular trafficking of NPs and provide new insight into the development of more effective cellular nanocarriers.

  11. A neural model of hierarchical reinforcement learning.

    PubMed

    Rasmussen, Daniel; Voelker, Aaron; Eliasmith, Chris

    2017-01-01

    We develop a novel, biologically detailed neural model of reinforcement learning (RL) processes in the brain. This model incorporates a broad range of biological features that pose challenges to neural RL, such as temporally extended action sequences, continuous environments involving unknown time delays, and noisy/imprecise computations. Most significantly, we expand the model into the realm of hierarchical reinforcement learning (HRL), which divides the RL process into a hierarchy of actions at different levels of abstraction. Here we implement all the major components of HRL in a neural model that captures a variety of known anatomical and physiological properties of the brain. We demonstrate the performance of the model in a range of different environments, in order to emphasize the aim of understanding the brain's general reinforcement learning ability. These results show that the model compares well to previous modelling work and demonstrates improved performance as a result of its hierarchical ability. We also show that the model's behaviour is consistent with available data on human hierarchical RL, and generate several novel predictions.
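    The core idea the model builds on, a hierarchy of temporally extended actions, can be sketched as tabular SMDP Q-learning over "options" on a toy corridor task. Everything below (environment, options, parameters) is an illustrative assumption; the paper implements HRL in a biologically detailed neural model, not tabular Q-learning.

```python
import random

# SMDP Q-learning over temporally extended "options": a minimal sketch of
# hierarchical RL. The corridor environment and all parameters are invented.

N = 10                               # corridor states 0..N; reward only at N
OPTIONS = {"left": -1, "right": +1}  # each option repeats one move for 3 steps
GAMMA, ALPHA = 0.9, 0.5

def run_option(state, move, k=3):
    """Execute one primitive move k times; return (state, reward, duration)."""
    for t in range(k):
        state = max(0, min(N, state + move))
        if state == N:
            return state, 1.0, t + 1
    return state, 0.0, k

# Optimistic initialization drives systematic exploration of all options.
Q = {(s, o): 1.0 for s in range(N + 1) for o in OPTIONS}
random.seed(0)
for _ in range(200):
    s = 0
    while s != N:
        if random.random() < 0.1:                         # epsilon-greedy
            o = random.choice(list(OPTIONS))
        else:
            o = max(OPTIONS, key=lambda opt: Q[(s, opt)])
        s2, r, k = run_option(s, OPTIONS[o])
        # SMDP update: discount by the option's actual duration k.
        future = 0.0 if s2 == N else max(Q[(s2, o2)] for o2 in OPTIONS)
        Q[(s, o)] += ALPHA * (r + GAMMA**k * future - Q[(s, o)])
        s = s2
```

    The duration-dependent discount `GAMMA**k` is what distinguishes learning over options from ordinary one-step Q-learning.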

  12. Occurrence of Sudan I in paprika fruits caused by agricultural environmental contamination.

    PubMed

    Lian, Yunhe; Gao, Wei; Zhou, Li; Wu, Naiying; Lu, Qingguo; Han, Wenjie; Tie, Xiaowei

    2014-05-07

    Current research has demonstrated the presence of sub-parts-per-billion levels of Sudan dye in paprika fruits during the vegetation process, which is difficult to understand on the basis of the conventional concept of cross-contamination or malicious addition. Detailed surveys on Sudan dyes I-IV in paprika fruits, soils, and agronomic materials used from seven fields of Xinjiang (China) were conducted to investigate the natural contamination. Results revealed that Sudan dyes II-IV were never detected and that Sudan I existed in almost all samples except for the mulching film and irrigation water. The higher total amount of Sudan I in soils, pesticides, and fertilizers compared to coated seeds indicated that the combination of Sudan I-contaminated soils and the application of Sudan I-containing agronomic materials constitutes a major source of the 0.18-2.52 μg/kg levels of Sudan I in fruits during the growth period. The study offers a more reasonable explanation for the previously observed Sudan I in paprika fruits.

  13. HATCN-based charge recombination layers as effective interconnectors for tandem organic solar cells.

    PubMed

    Wang, Rong-Bin; Wang, Qian-Kun; Xie, Hao-Jun; Xu, Lu-Hai; Duhm, Steffen; Li, Yan-Qing; Tang, Jian-Xin

    2014-09-10

    A comprehensive understanding of the energy-level alignment at the organic heterojunction interfaces is of paramount importance to optimize the performance of organic solar cells (OSCs). Here, the detailed electronic structures of organic interconnectors, consisting of cesium fluoride-doped 4,7-diphenyl-1,10-phenanthroline and hexaazatriphenylene-hexacarbonitrile (HATCN), have been investigated via in situ photoemission spectroscopy, and their impact on the charge recombination process in tandem OSCs has been identified. The experimental determination shows that the HATCN interlayer plays a significant role in the interface energetics with a dramatic decrease in the reverse built-in potential for electrons and holes from stacked subcells, which is beneficial to the charge recombination between HATCN and the adjacent layer. In accordance with the energy-level alignments, the open-circuit voltage of tandem OSC incorporating a HATCN-based interconnector is almost 2 times that of a single-cell OSC, revealing the effectiveness of the HATCN-based interconnectors in tandem organic devices.

  14. A Method to Determine Lysine Acetylation Stoichiometries

    DOE PAGES

    Nakayasu, Ernesto S.; Wu, Si; Sydor, Michael A.; ...

    2014-01-01

    Lysine acetylation is a common protein posttranslational modification that regulates a variety of biological processes. A major bottleneck to fully understanding the functional aspects of lysine acetylation is the difficulty in measuring the proportion of lysine residues that are acetylated. Here we describe a mass spectrometry method using a combination of isotope labeling and detection of a diagnostic fragment ion to determine the stoichiometry of protein lysine acetylation. Using this technique, we determined the modification occupancy for ~750 acetylated peptides from mammalian cell lysates. Furthermore, the acetylation on the N-terminal tail of histone H4 was cross-validated by treating cells with sodium butyrate, a potent deacetylase inhibitor, and comparing changes in stoichiometry levels measured by our method with immunoblotting measurements. Of note, we observe that acetylation stoichiometry is high in nuclear proteins, but very low in mitochondrial and cytosolic proteins. In summary, our method opens new opportunities to study in detail the relationship of lysine acetylation levels of proteins with their biological functions.
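    Once the endogenous ("light") acetyl signal and the isotopically labelled ("heavy") acetyl signal on previously unmodified lysines are quantified, the occupancy readout reduces to a simple ratio. The helper function and intensity values below are hypothetical illustrations, not data or code from the paper:

```python
# Back-of-the-envelope acetylation stoichiometry: the fraction of a given
# lysine that carries the endogenous (light) acetyl mark. Intensities are
# invented example values, not measurements from the study.
def acetylation_stoichiometry(light_intensity, heavy_intensity):
    """Occupancy = light / (light + heavy)."""
    return light_intensity / (light_intensity + heavy_intensity)

stoich = acetylation_stoichiometry(2.0e5, 8.0e5)   # 20% of this site occupied
```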

  15. Gay father surrogacy families: relationships with surrogates and egg donors and parental disclosure of children's origins.

    PubMed

    Blake, Lucy; Carone, Nicola; Slutsky, Jenna; Raffanello, Elizabeth; Ehrhardt, Anke A; Golombok, Susan

    2016-11-01

    To study the nature and quality of relationships between gay father families and their surrogates and egg donors and parental disclosure of children's origins. Cross-sectional study. Family homes. Parents in 40 gay father families with 3-9-year-old children born through surrogacy. Administration of a semistructured interview. Relationships between parents, children, surrogates, and egg donors and parental disclosure of children's origins were examined using a semistructured interview. The majority of fathers were content with the level of contact they had with the surrogate, with those who were discontent wanting more contact. Fathers were more likely to maintain relationships with surrogates than egg donors, and almost all families had started the process of talking to their children about their origins, with the level of detail and children's understanding increasing with the age of the child. In gay father surrogacy families with young children, relationships between parents, children, surrogates, and egg donors are generally positive. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  16. The view from the tip of the iceberg.

    PubMed

    Josephs, L

    1997-01-01

    In recent years there has been a growing interest in refining the technique of ego defense analysis. All of these approaches share an attempt to work closely with the patient's free associations, to interpret at a level that is accessible to the patient's consciously observing ego, and to avoid bypassing the analysis of the patient's most surface-level resistances in an effort to understand unconscious conflict. These innovations reflect a commendable effort to work in a way that is rigorously empirical, that respects the patient's autonomy, and that minimizes the pressure of the analyst's transferential authority in the patient's acceptance of the analyst's interpretations. Despite the undeniable value of these technical innovations, such approaches to ego defense analysis may inadvertently result in certain overemphases in technique that may unnecessarily constrain the analytic process. They may result in a sort of obsessive tunnel vision that is overly focused on small details to the exclusion of the larger picture. An approach that counterbalances the microscopic and the macroscopic analysis of ego defense is recommended.

  17. A Combined Experimental and Analytical Modeling Approach to Understanding Friction Stir Welding

    NASA Technical Reports Server (NTRS)

    Nunes, Arthur C., Jr.; Stewart, Michael B.; Adams, Glynn P.; Romine, Peter

    1998-01-01

    In the Friction Stir Welding (FSW) process a rotating pin tool joins the sides of a seam by stirring them together. This solid state welding process avoids problems with melting and hot-shortness presented by some difficult-to weld high-performance light alloys. The details of the plastic flow during the process are not well understood and are currently a subject of research. Two candidate models of the FSW process, the Mixed Zone (MZ) and the Single Slip Surface (S3) model are presented and their predictions compared to experimental data.

  18. PISA 2009 Technical Report

    ERIC Educational Resources Information Center

    OECD Publishing (NJ1), 2012

    2012-01-01

    The "PISA 2009 Technical Report" describes the methodology underlying the PISA 2009 survey. It examines additional features related to the implementation of the project at a level of detail that allows researchers to understand and replicate its analyses. The reader will find a wealth of information on the test and sample design,…

  19. Music Handbook for Primary Grades.

    ERIC Educational Resources Information Center

    Bowman, Doris; And Others

    GRADES OR AGES: Primary grades (1, 2, and 3). SUBJECT MATTER: Music. ORGANIZATION AND PHYSICAL APPEARANCE: This guide contains a detailed outline of the basic music concepts for elementary grades with suggestions for activities which may develop understanding of the concepts. The pages of activities are color coded by grade level. There are three…

  20. Performance Analysis of Saturated Induction Motors by Virtual Tests

    ERIC Educational Resources Information Center

    Ojaghi, M.; Faiz, J.; Kazemi, M.; Rezaei, M.

    2012-01-01

    Many undergraduate-level electrical machines textbooks give detailed treatments of the performance of induction motors. Students can deepen this understanding of motor performance by performing the appropriate practical work in laboratories or in simulation using proper software packages. This paper considers various common and less-common tests…

  1. A global survey of per- and polyfluoroalkyl substances (PFASs) in surface soils in remote and urban environments

    EPA Science Inventory

    The heightened attention placed on perfluoroalkyl and polyfluoroalkyl substances (PFASs) over the past decades has led to their detection in many environmental and biological compartments. A detailed understanding of PFAS levels in these compartments is an important step towards ...

  2. Teaming with Opportunity: Media Programs, Community Constituencies, and Technology.

    ERIC Educational Resources Information Center

    Farmer, Lesley S. J.

    This book is intended to help library media teachers understand the nature of partnerships at both individual and group levels. It details the steps for developing and maintaining partnerships, particularly with groups and demonstrates how technology can affect these educational collaborative efforts. The chapters cover the following topics: (1)…

  3. Conceptual Knowledge of Decimal Arithmetic

    ERIC Educational Resources Information Center

    Lortie-Forgues, Hugues; Siegler, Robert S.

    2016-01-01

    In two studies (N's = 55 and 54), we examined a basic form of conceptual understanding of rational number arithmetic, the direction of effect of decimal arithmetic operations, at a level of detail useful for informing instruction. Middle school students were presented tasks examining knowledge of the direction of effects (e.g., "True or…

  4. Changes in Food Intake in Australia: Comparing the 1995 and 2011 National Nutrition Survey Results Disaggregated into Basic Foods.

    PubMed

    Ridoutt, Bradley; Baird, Danielle; Bastiaans, Kathryn; Hendrie, Gilly; Riley, Malcolm; Sanguansri, Peerasak; Syrette, Julie; Noakes, Manny

    2016-05-25

    As nations seek to address obesity and diet-related chronic disease, understanding shifts in food intake over time is an imperative. However, quantifying intake of basic foods is not straightforward because of the diversity of raw and cooked wholefoods, processed foods and mixed dishes actually consumed. In this study, data from the Australian national nutrition surveys of 1995 and 2011, each involving more than 12,000 individuals and covering more than 4500 separate foods, were coherently disaggregated into basic foods, with cooking and processing factors applied where necessary. Although Australians are generally not eating in a manner consistent with national dietary guidelines, there have been several positive changes. Australians are eating more whole fruit, a greater diversity of vegetables, more beans, peas and pulses, less refined sugar, and they have increased their preference for brown and wholegrain cereals. Adult Australians have also increased their intake of nuts and seeds. Fruit juice consumption markedly declined, especially for younger Australians. Cocoa consumption increased and shifts in dairy product intake were mixed, reflecting one of several important differences between age and gender cohorts. This study sets the context for more detailed research at the level of specific foods to understand individual and household differences.

  5. Changes in Food Intake in Australia: Comparing the 1995 and 2011 National Nutrition Survey Results Disaggregated into Basic Foods

    PubMed Central

    Ridoutt, Bradley; Baird, Danielle; Bastiaans, Kathryn; Hendrie, Gilly; Riley, Malcolm; Sanguansri, Peerasak; Syrette, Julie; Noakes, Manny

    2016-01-01

    As nations seek to address obesity and diet-related chronic disease, understanding shifts in food intake over time is an imperative. However, quantifying intake of basic foods is not straightforward because of the diversity of raw and cooked wholefoods, processed foods and mixed dishes actually consumed. In this study, data from the Australian national nutrition surveys of 1995 and 2011, each involving more than 12,000 individuals and covering more than 4500 separate foods, were coherently disaggregated into basic foods, with cooking and processing factors applied where necessary. Although Australians are generally not eating in a manner consistent with national dietary guidelines, there have been several positive changes. Australians are eating more whole fruit, a greater diversity of vegetables, more beans, peas and pulses, less refined sugar, and they have increased their preference for brown and wholegrain cereals. Adult Australians have also increased their intake of nuts and seeds. Fruit juice consumption markedly declined, especially for younger Australians. Cocoa consumption increased and shifts in dairy product intake were mixed, reflecting one of several important differences between age and gender cohorts. This study sets the context for more detailed research at the level of specific foods to understand individual and household differences. PMID:28231135

  6. Markers of tolerance development to food allergens.

    PubMed

    Ponce, M; Diesner, S C; Szépfalusi, Z; Eiwegger, T

    2016-10-01

    IgE-mediated reactions to food allergens are the most common cause of anaphylaxis in childhood. Although allergies to cow's milk, egg, or soy proteins, in contrast to peanut and tree nut allergens, resolve in up to 60% of children within the first 6 years of life due to natural tolerance development, this process is not well understood. At present, there is no cure or treatment for food allergy that would result in an induction of tolerance to the symptom-eliciting food. Avoidance, providing an emergency plan, and education are the standard of treatment. Oral immunotherapeutic approaches have shown reasonable efficacy; however, they are associated with high rates of side-effects and low numbers of patients achieving tolerance. Nevertheless, mechanisms that take place during oral immunotherapy may help to understand tolerance development. On the basis of these therapeutic interventions, events like loss of basophil activation and induction of regulatory lymphocyte subsets and of blocking antibodies have been described. Their functional importance at a clinical level, however, remains to be investigated in detail. Consequently, there is an urgent need to understand the process of tolerance development to food allergens and define biomarkers to develop and monitor new treatment strategies for food allergy. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  7. The interaction of Matrix Reasoning and Social Motivation as predictors of Separation anxiety in boys with Autism Spectrum Disorder.

    PubMed

    Bitsika, Vicki; Sharpley, Christopher F

    2018-06-01

    It has been suggested that higher cognitive functioning based in the pre-frontal cortex is implicated in the ability of people with Autism Spectrum Disorder (ASD) to understand and communicate in social situations. Low motivation to engage in social interaction may also be influential in this process. Although both of these factors have been argued to influence the levels of comorbid anxiety in young people with ASD, no detailed examination of those relationships has been reported to date. A sample of 90 boys with ASD (aged 6 to 12 years) and 29 of their non-ASD peers, matched for age and IQ, completed tests of cognitive function and anxiety. Only one form of anxiety, fear of being separated from their parents, was significantly associated with cognitive function (at the Full Scale IQ and Matrix Reasoning levels) and with motivation to engage in social interactions, and only for the ASD boys. These data represent a complex interaction between the neurobiological aspects of ASD, fluid reasoning, social motivation, and Separation Anxiety in boys with ASD. As such, they bring a new perspective to understanding and treating anxious behaviour in these boys. Copyright © 2018 ISDN. Published by Elsevier Ltd. All rights reserved.

  8. Understanding FRET as a Research Tool for Cellular Studies

    PubMed Central

    Shrestha, Dilip; Jenei, Attila; Nagy, Péter; Vereb, György; Szöllősi, János

    2015-01-01

    Communication of molecular species through dynamic association and/or dissociation at various cellular sites governs biological functions. Understanding these physiological processes requires delineation of molecular events occurring at the level of individual complexes in a living cell. Among the few non-invasive approaches with nanometer resolution are methods based on Förster Resonance Energy Transfer (FRET). FRET is effective at a distance of 1–10 nm, which is equivalent to the size of macromolecules, thus providing an unprecedented level of detail on molecular interactions. The emergence of fluorescent proteins and SNAP- and CLIP-tag proteins provided FRET with the capability to monitor changes in a molecular complex in real time, making it possible to establish the functional significance of the studied molecules in a native environment. Now, FRET is widely used in biological sciences, including the fields of proteomics, signal transduction, diagnostics, and drug development, to address questions almost unimaginable with biochemical methods and conventional microscopies. However, the underlying physics of FRET often scares biologists. Therefore, in this review, our goal is to introduce FRET to non-physicists in a lucid manner. We will also discuss our contributions to various FRET methodologies based on microscopy and flow cytometry, while describing its application for determining the molecular heterogeneity of the plasma membrane in various cell types. PMID:25815593
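    The 1–10 nm sensitivity mentioned above comes from the standard FRET efficiency relation, E = 1/(1 + (r/R0)^6), where r is the donor-acceptor distance and R0 is the Förster radius at which E = 0.5. A quick sketch with generic illustrative distances:

```python
# FRET efficiency vs donor-acceptor distance: the sixth-power dependence is
# what makes FRET a "molecular ruler". Distances below are generic examples.
def fret_efficiency(r_nm, R0_nm):
    """Standard FRET efficiency: E = 1 / (1 + (r/R0)^6)."""
    return 1.0 / (1.0 + (r_nm / R0_nm) ** 6)

E_at_R0 = fret_efficiency(5.0, 5.0)    # exactly 0.5 by definition of R0
E_close = fret_efficiency(2.5, 5.0)    # well inside R0: near-complete transfer
E_far   = fret_efficiency(10.0, 5.0)   # twice R0: transfer almost vanishes
```

    Halving or doubling the distance around R0 swings the efficiency from near 1 to near 0, which is why FRET resolves molecular-scale proximity so sharply.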

  9. Models and signal processing for an implanted ethanol bio-sensor.

    PubMed

    Han, Jae-Joon; Doerschuk, Peter C; Gelfand, Saul B; O'Connor, Sean J

    2008-02-01

    The understanding of drinking patterns leading to alcoholism has been hindered by an inability to unobtrusively measure ethanol consumption over periods of weeks to months in the community environment. An implantable ethanol sensor is under development using microelectromechanical systems technology. For reasons of safety and user acceptability, the sensor will be implanted subcutaneously and will therefore measure peripheral-tissue ethanol concentration. Determining ethanol consumption and kinetics in other compartments from the time course of peripheral-tissue ethanol concentration requires sophisticated signal processing based on detailed descriptions of the relevant physiology. A statistical signal processing system based on detailed models of the physiology, using extended Kalman filtering and dynamic programming tools, is described which can estimate the time series of ethanol concentration in blood, liver, and peripheral tissue, and the time series of ethanol consumption, from peripheral-tissue ethanol concentration measurements.
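    The estimation problem, inferring one compartment's concentration from noisy measurements of another, can be sketched with an ordinary linear Kalman filter on an invented two-compartment model. The rate constants, noise levels, and linearity are all assumptions of this sketch; the paper's system uses a detailed physiological model with an extended Kalman filter.

```python
import numpy as np

# Toy linear Kalman filter estimating blood ethanol from noisy
# peripheral-tissue measurements on a made-up two-compartment model.
dt = 0.01                        # step size (hours), illustrative
ke, kbt = 0.5, 2.0               # elimination and blood<->tissue rates (1/h)
A = np.array([[1 - dt * (ke + kbt), dt * kbt],
              [dt * kbt,            1 - dt * kbt]])
H = np.array([[0.0, 1.0]])       # only the tissue compartment is observed
Qn = 1e-6 * np.eye(2)            # process noise covariance
R = np.array([[1e-4]])           # measurement noise covariance

rng = np.random.default_rng(1)
x_true = np.array([1.0, 0.0])    # initial blood concentration, empty tissue
x_est, P = np.zeros(2), np.eye(2)

errs = []
for _ in range(2000):
    x_true = A @ x_true
    z = H @ x_true + rng.normal(0.0, 1e-2, 1)   # noisy tissue measurement
    # predict
    x_est = A @ x_est
    P = A @ P @ A.T + Qn
    # update
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x_est = x_est + (K @ (z - H @ x_est)).ravel()
    P = (np.eye(2) - K @ H) @ P
    errs.append(abs(x_est[0] - x_true[0]))

final_blood_error = errs[-1]
```

    The unobserved blood compartment is recovered through the model's coupling term, which is the same principle the implanted-sensor system relies on at far greater physiological fidelity.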

  10. Medical Representatives' Intention to Use Information Technology in Pharmaceutical Marketing

    PubMed Central

    Kwak, Eun-Seon

    2016-01-01

    Objectives Electronic detailing (e-detailing), the use of electronic devices to facilitate sales presentations to physicians, has been adopted and expanded in the pharmaceutical industry. To maximize the potential outcome of e-detailing, it is important to understand the behavior and attitudes of medical representatives (MRs) toward e-detailing. This study investigates how information technology devices such as laptop computers and tablet PCs are utilized in pharmaceutical marketing, and it analyzes the factors influencing MRs' intention to use devices. Methods This study adopted and modified Roger's diffusion of innovation model and the technology acceptance model. To test the model empirically, a questionnaire survey was conducted with 221 MRs who were working in three multinational or eleven domestic pharmaceutical companies in Korea. Results Overall, 28% and 35% of MRs had experience using laptop computers and tablet PCs in pharmaceutical marketing, respectively. However, the rates differed across groups of MRs categorized by age, education level, position, and career. The results showed that MRs' intention to use information technology devices was significantly influenced by perceived usefulness in general. Perceived ease of use, organizational and individual innovativeness, and several MR characteristics were also found to have significant impacts. Conclusions This study provides timely information about e-detailing devices to marketing managers and policy makers in the pharmaceutical industry, supporting successful marketing strategy development by clarifying the factors behind MRs' intention to use information technology. Further in-depth study should be conducted to understand obstacles and limitations and to improve strategies for better marketing tools. PMID:27895967

  11. Genomic and transcriptomic approaches to study immunology in cyprinids: What is next?

    PubMed

    Petit, Jules; David, Lior; Dirks, Ron; Wiegertjes, Geert F

    2017-10-01

    Accelerated by the introduction of Next-Generation Sequencing (NGS), a number of genomes of cyprinid fish species have been drafted, leading to a highly valuable collective resource of comparative genome information on cyprinids (Cyprinidae). In addition, NGS-based transcriptome analyses of different developmental stages, organs, or cell types increasingly contribute to the understanding of complex physiological processes, including immune responses. Cyprinids are a highly interesting family because they comprise one of the most diversified families of teleosts and because of their variation in ploidy level, with diploid, triploid, tetraploid, hexaploid and sometimes even octoploid species. The wealth of data obtained from NGS technologies provides both challenges and opportunities for immunological research, which will be discussed here. Correct interpretation of ploidy effects on immune responses requires knowledge of the degree of functional divergence between duplicated genes, which can differ even between closely-related cyprinid fish species. We summarize NGS-based progress in analysing immune responses and discuss the importance of respecting the presence of (multiple) duplicated gene sequences when performing transcriptome analyses for detailed understanding of complex physiological processes. Progressively, advances in NGS technology are providing workable methods to further elucidate the implications of gene duplication events and functional divergence of duplicated genes and proteins involved in immune responses in cyprinids. We conclude with discussing how future applications of NGS technologies and analysis methods could enhance immunological research and understanding. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  12. The Psycho-Neurology of Cross-Species Affective/Social Neuroscience: Understanding Animal Affective States as a Guide to Development of Novel Psychiatric Treatments.

    PubMed

    Panksepp, Jaak

    During the past half century of research with preclinical animal models, affective neuroscience has helped identify and illuminate the functional neuroanatomies and neurochemistries of seven primary-process, i.e., genetically provided, emotional systems of mammalian brains. All are subcortically localized, allowing animal models to guide the needed behavioral and neuroscientific analyses at levels of detail that cannot be achieved through human research, including modern brain imaging. They consist of the following neuronal processes: SEEKING/Enthusiasm, RAGE/Anger, FEAR/Anxiety, sexual LUST/Passion, maternal CARE/Nurturance, separation-distress PANIC/Grief and PLAY/Social Joy. Several of these systems figure heavily in social bonding. I will focus here especially on the genesis of depression, which is significantly influenced by (i) sustained overactivity of the separation-distress PANIC system reflecting severed social bonds and the excessive "psychological pain" of loneliness that can, if sustained, lead to a downward cascade known as psychological despair, and (ii) the despair phase that follows the acute PANIC response, which is characterized by abnormally low activity of the SEEKING system, the so-called brain reward network, leading to amotivational states that characterize depression. Depressive affect is promoted by such brain affective mechanisms of social attachments and social loss as well as diminished arousability of the SEEKING system, leading to chronic dysphoria. To understand why depression feels so bad, we must understand the neural mechanisms that mediate such social feelings.

  13. Complexity reduction of biochemical rate expressions.

    PubMed

    Schmidt, Henning; Madsen, Mads F; Danø, Sune; Cedersund, Gunnar

    2008-03-15

    The current trend in dynamical modelling of biochemical systems is to construct more and more mechanistically detailed and thus complex models. The complexity is reflected in the number of dynamic state variables and parameters, as well as in the complexity of the kinetic rate expressions. However, a greater level of complexity, or level of detail, does not necessarily imply better models, or a better understanding of the underlying processes. Data often does not contain enough information to discriminate between different model hypotheses, and such overparameterization makes it hard to establish the validity of the various parts of the model. Consequently, there is an increasing demand for model reduction methods. We present a new reduction method that reduces complex rational rate expressions, such as those often used to describe enzymatic reactions. The method is a novel term-based identifiability analysis, which is easy to use and allows for user-specified reductions of individual rate expressions in complete models. The method is one of the first methods to meet the classical engineering objective of improved parameter identifiability without losing the systems biology demand of preserved biochemical interpretation. The method has been implemented in the Systems Biology Toolbox 2 for MATLAB, which is freely available from http://www.sbtoolbox2.org. The Supplementary Material contains scripts that show how to use it by applying the method to the example models, discussed in this article.
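    The motivation for this kind of reduction can be made concrete with a hedged numerical illustration (this is not the authors' term-based identifiability algorithm, just an example of why individual terms in a rational rate expression can be practically unidentifiable). In the substrate-inhibited Michaelis-Menten rate below, the inhibition term contributes almost nothing when observed substrate levels stay well below sqrt(Km*Ki), so data from that range cannot constrain Ki and the term can be dropped. All parameter values are illustrative.

```python
# Substrate-inhibited Michaelis-Menten rate:
#     v(S) = Vmax*S / (Km + S + S**2/Ki)
# Illustrative parameter values, not from the paper.
Vmax, Km, Ki = 1.0, 0.5, 100.0

def v_full(S):
    """Full rational rate expression with a substrate-inhibition term."""
    return Vmax * S / (Km + S + S**2 / Ki)

def v_reduced(S):
    """Reduced expression: classical Michaelis-Menten kinetics."""
    return Vmax * S / (Km + S)

# Compare the two over an "observed" range 0 < S <= 2 (sqrt(Km*Ki) ~ 7.1):
# within it, dropping the S**2/Ki term changes the rate by under 2 percent.
grid = [i * 0.02 for i in range(1, 101)]
max_rel_err = max(abs(v_full(S) - v_reduced(S)) / v_full(S) for S in grid)
print(f"max relative error of reduced rate: {max_rel_err:.2%}")
```

A reduction like this keeps the biochemical interpretation (it is still a Michaelis-Menten law) while removing a parameter the data cannot identify, which is exactly the trade-off the abstract describes.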

  14. Neural tube programming and craniofacial cleft formation. I. The neuromeric organization of the head and neck.

    PubMed

    Carstens, Michael H

    2004-01-01

    This review presents a brief synopsis of neuromeric theory. Neuromeres are developmental units of the nervous system with specific anatomic content. Outlying each neuromere are tissues of ectoderm, mesoderm and endoderm that bear an anatomic relationship to the neuromere in three basic ways. This relationship is physical in that motor and sensory connections exist between a given neuromeric level and its target tissues. The relationship is also developmental because the target cells exit during gastrulation precisely at that same level. Finally the relationship is chemical because the genetic definition of a neuromere is shared with those tissues with which it interacts. The model developed by Puelles and Rubenstein is used to describe the neuroanatomy of the neuromeres. Although important details of the model are currently being refined it has immediate clinical relevance for practicing clinicians because it permits us to understand many pathologic states as relationships between the brain and the surrounding tissues. Relationships between the processes of neurulation and gastrulation have been presented to demonstrate the manner in which neuromeric anatomy is established in the embryo. We are now in a position to describe in detail the static anatomic structures that result from this system. The neuromeric 'map' of craniofacial bones, dermis, dura, muscles, and fascia will be the subject of the next part of this series.

  15. Discourse changes in early Alzheimer disease, mild cognitive impairment, and normal aging.

    PubMed

    Chapman, Sandra Bond; Zientz, Jennifer; Weiner, Myron; Rosenberg, Roger; Frawley, William; Burns, Mary Hope

    2002-01-01

    The purpose of this study was to determine the sensitivity of discourse gist measures to the early cognitive-linguistic changes in Alzheimer disease (AD) and in the preclinical stages. Differences in discourse abilities were examined in 25 cognitively normal adults, 24 adults with mild probable AD, and 20 adults with mild cognitive impairment (MCI) at gist and detail levels of discourse processing. The authors found that gist and detail levels of discourse processing were significantly impaired in persons with AD and MCI as compared with normal control subjects. Gist-level discourse processing abilities showed minimal overlap between cognitively normal control subjects and those with mild AD. Moreover, the majority of the persons with MCI performed in the range of AD on gist measures. These findings indicate that discourse gist measures hold promise as a diagnostic complement to enhance early detection of AD. Further studies are needed to determine how early the discourse gist deficits arise in AD.

  16. How to assess extreme weather impacts - case European transport network

    NASA Astrophysics Data System (ADS)

    Leviäkangas, P.

    2010-09-01

    Assessing the impacts of climate change and preparing for those impacts is a process, one we must understand and learn to apply. EWENT (Extreme Weather impacts on European Networks of Transport) will be a test bench for one prospective approach. It has the following main components: 1) identifying what is "extreme", 2) assessing the change in the probabilities, 3) constructing the causal impact models, 4) finding appropriate methods of pricing and costing, 5) finding alternative strategy options, 6) assessing the efficiency of the strategy options. This process in fact follows the steps of a standardized risk management process. Each step is challenging, but if the EWENT project succeeds in assessing the extreme weather impacts on European transport networks, it will offer one possible benchmark for carrying out similar analyses in other regions and at the country level. The EWENT approach could be particularly useful for weather and climate information service providers, offering tools for transport authorities and financiers to assess weather risks and then rationally manage them. The EWENT project is financed by the European Commission, with participation from met-service organisations and transport research institutes from different parts of Europe. The presentation will explain the EWENT approach in detail and bring forth the findings of the first work packages.

  17. Formally grounding spatio-temporal thinking.

    PubMed

    Klippel, Alexander; Wallgrün, Jan Oliver; Yang, Jinlong; Li, Rui; Dylla, Frank

    2012-08-01

    To navigate through daily life, humans use their ability to conceptualize spatio-temporal information, which ultimately leads to a system of categories. Likewise, the spatial sciences rely heavily on conceptualization and categorization as means to create knowledge when they process spatio-temporal data. In the spatial sciences and in related branches of artificial intelligence, an approach has been developed for processing spatio-temporal data on the level of coarse categories: qualitative spatio-temporal representation and reasoning (QSTR). Calculi developed in QSTR allow for the meaningful processing of and reasoning with spatio-temporal information. While qualitative calculi are widely acknowledged in the cognitive sciences, there is little behavioral assessment of whether these calculi are indeed cognitively adequate. This is an astonishing conundrum given that these calculi are ubiquitous, are often intended to improve processes at the human-machine interface, and are on several occasions claimed to be cognitively adequate. We have systematically evaluated several approaches to formally characterize spatial relations from a cognitive-behavioral perspective for both static and dynamically changing spatial relations. This contribution details our framework, which addresses the question of how formal characterizations of space can help us understand how people think with, in, and about space.
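    One concrete example of the kind of coarse categories QSTR calculi operate on is topological relations between regions, in the style of the RCC-8 calculus. The sketch below is hypothetical and simplified (it uses disks, a reduced five-label relation set rather than the full eight RCC-8 relations, and exact-equality tolerances); it is not taken from the paper, but shows the basic abstraction step from metric configurations to qualitative relations.

```python
import math

def relation(c1, r1, c2, r2):
    """Map two disks (center, radius) to a coarse RCC-8-style relation label."""
    d = math.dist(c1, c2)
    if d > r1 + r2:
        return "DC"   # disconnected
    if math.isclose(d, r1 + r2):
        return "EC"   # externally connected (touching)
    if math.isclose(d, 0) and math.isclose(r1, r2):
        return "EQ"   # equal regions
    if d + r1 < r2 or d + r2 < r1:
        return "PP"   # one disk properly inside the other
    return "PO"       # partial overlap

print(relation((0, 0), 1, (3, 0), 1))   # DC
print(relation((0, 0), 1, (2, 0), 1))   # EC
print(relation((0, 0), 1, (1, 0), 1))   # PO
print(relation((0, 0), 3, (1, 0), 1))   # PP
```

Behavioral-adequacy studies of the kind the abstract describes ask whether category boundaries like these match the distinctions people spontaneously make, e.g. whether "touching" and "slightly overlapping" are treated as different categories.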

  18. ESA Swarm Mission - Level 1b Products

    NASA Astrophysics Data System (ADS)

    Tøffner-Clausen, Lars; Floberghagen, Rune; Mecozzi, Riccardo; Menard, Yvon

    2014-05-01

    Swarm, a three-satellite constellation to study the dynamics of the Earth's magnetic field and its interactions with the Earth system, was launched in November 2013. The objective of the Swarm mission is to provide the best-ever survey of the geomagnetic field and its temporal evolution, which will bring new insights into the Earth system by improving our understanding of the Earth's interior and environment. The Level 1b Products of the Swarm mission contain time series of the quality-screened, calibrated, corrected, and fully geo-localized measurements of the magnetic field intensity, the magnetic field vector (provided in both instrument and Earth-fixed frames), and the plasma density, temperature, and velocity. Additionally, quality-screened and pre-calibrated measurements of the non-gravitational accelerations are provided. Geo-localization is performed by 24-channel GPS receivers and by means of unique three-head Advanced Stellar Compasses for high-precision satellite attitude information. The Swarm Level 1b data will be provided in daily products, separately for each of the three Swarm spacecraft. This poster will present detailed lists of the contents of the Swarm Level 1b Products and brief descriptions of the processing algorithms used in the generation of these data.

  19. Kinetics in the real world: linking molecules, processes, and systems.

    PubMed

    Kohse-Höinghaus, Katharina; Troe, Jürgen; Grabow, Jens-Uwe; Olzmann, Matthias; Friedrichs, Gernot; Hungenberg, Klaus-Dieter

    2018-04-25

    Unravelling elementary steps, reaction pathways, and kinetic mechanisms is key to understanding the behaviour of many real-world chemical systems that span from the troposphere or even interstellar media to engines and process reactors. Recent work in chemical kinetics provides detailed information on the reactive changes occurring in chemical systems, often on the atomic or molecular scale. The optimisation of practical processes, for instance in combustion, catalysis, battery technology, polymerisation, and nanoparticle production, can profit from a sound knowledge of the underlying fundamental chemical kinetics. Reaction mechanisms can combine information gained from theory and experiments to enable the predictive simulation and optimisation of the crucial process variables and influences on the system's behaviour that may be exploited for both monitoring and control. Chemical kinetics, as one of the pillars of Physical Chemistry, thus contributes importantly to understanding and describing natural environments and technical processes and is becoming increasingly relevant for interactions in and with the real world.
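    The elementary-rate layer such kinetic mechanisms are built from can be made concrete with the Arrhenius expression, k(T) = A·exp(-Ea/(R·T)), which is the standard form for the temperature dependence of an elementary rate constant. The pre-exponential factor and activation energy below are generic illustrative values, not taken from the article.

```python
import math

R = 8.314  # gas constant, J/(mol K)

def arrhenius(A, Ea, T):
    """Rate constant at temperature T (K) for pre-factor A and activation energy Ea (J/mol)."""
    return A * math.exp(-Ea / (R * T))

# Illustrative values: A in 1/s, Ea in J/mol
A, Ea = 1e13, 1.0e5
k300 = arrhenius(A, Ea, 300.0)
k600 = arrhenius(A, Ea, 600.0)
print(f"k(300 K) = {k300:.3e} 1/s, k(600 K) = {k600:.3e} 1/s")
```

The steep temperature sensitivity this expression encodes is one reason predictive mechanism simulation matters: small changes in T or Ea shift elementary rates by orders of magnitude, which propagates through the whole reaction network.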

  20. Arctic Research NASA's Cryospheric Sciences Program

    NASA Technical Reports Server (NTRS)

    Waleed, Abdalati; Zukor, Dorothy J. (Technical Monitor)

    2001-01-01

    Much of NASA's Arctic research is run through its Cryospheric Sciences Program. Arctic research efforts to date have focused primarily on investigations of the mass balance of the largest Arctic land-ice masses and the mechanisms that control it; interactions among sea ice, polar oceans, and the polar atmosphere; atmospheric processes in the polar regions; and energy exchanges in the Arctic. All of these efforts have been focused on characterizing, understanding, and predicting changes in the Arctic. NASA's unique vantage from space provides an important perspective for the study of these large-scale processes, while detailed process information is obtained through targeted in situ field and airborne campaigns and models. An overview of NASA investigations in the Arctic will be presented, demonstrating how the synthesis of space-based technology and these complementary components has advanced our understanding of physical processes in the Arctic.

  1. Systems Biology Approaches for Understanding Genome Architecture.

    PubMed

    Sewitz, Sven; Lipkow, Karen

    2016-01-01

    The linear and three-dimensional arrangement and composition of chromatin in eukaryotic genomes underlies the mechanisms directing gene regulation. Understanding this organization requires the integration of many data types and experimental results. Here we describe the approach of integrating genome-wide protein-DNA binding data to determine chromatin states. To investigate spatial aspects of genome organization, we present a detailed description of how to run stochastic simulations of protein movements within a simulated nucleus in 3D. This systems-level approach enables the development of novel questions aimed at understanding the basic mechanisms that regulate genome dynamics.
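    A minimal sketch of the kind of stochastic spatial simulation described here: a single protein modeled as a Gaussian random walker confined to a spherical nucleus, with steps that would cross the nuclear envelope rejected. The radius, step size, and rejection rule are illustrative assumptions, not the chapter's actual simulation protocol.

```python
import random
import math

RADIUS = 1.0   # nuclear radius (arbitrary units); illustrative
STEP = 0.05    # diffusive step scale per time step; illustrative

def simulate(n_steps, seed=0):
    """Random walk of one protein inside a sphere; envelope-crossing moves are rejected."""
    rng = random.Random(seed)
    pos = [0.0, 0.0, 0.0]
    trajectory = [tuple(pos)]
    for _ in range(n_steps):
        # isotropic Gaussian step in 3D
        new = [p + rng.gauss(0, STEP) for p in pos]
        if math.sqrt(sum(c * c for c in new)) <= RADIUS:
            pos = new          # accept: still inside the nucleus
        trajectory.append(tuple(pos))
    return trajectory

traj = simulate(10_000)
```

In a fuller simulation of this type, many walkers, binding sites on chromatin, and association/dissociation events would be layered on top of the same confined-diffusion core.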

  2. Investigation of hindwing folding in ladybird beetles by artificial elytron transplantation and microcomputed tomography.

    PubMed

    Saito, Kazuya; Nomura, Shuhei; Yamamoto, Shuhei; Niiyama, Ryuma; Okabe, Yoji

    2017-05-30

    Ladybird beetles are high-mobility insects and explore broad areas by switching between walking and flying. Their excellent wing transformation systems enabling this lifestyle are expected to provide large potential for engineering applications. However, the mechanism behind the folding of their hindwings remains unclear. The reason is that ladybird beetles close the elytra ahead of wing folding, preventing the observation of detailed processes occurring under the elytra. In the present study, artificial transparent elytra were transplanted on living ladybird beetles, thereby enabling us to observe the detailed wing-folding processes. The result revealed that in addition to the abdominal movements mentioned in previous studies, the edge and ventral surface of the elytra, as well as characteristic shaped veins, play important roles in wing folding. The structures of the wing frames enabling this folding process and detailed 3D shape of the hindwing were investigated using microcomputed tomography. The results showed that the tape spring-like elastic frame plays an important role in the wing transformation mechanism. Compared with other beetles, hindwings in ladybird beetles are characterized by two seemingly incompatible properties: (i) the wing rigidity with relatively thick veins and (ii) the compactness in stored shapes with complex crease patterns. The detailed wing-folding process revealed in this study is expected to facilitate understanding of the naturally optimized system in this excellent deployable structure.

  3. Investigation of hindwing folding in ladybird beetles by artificial elytron transplantation and microcomputed tomography

    PubMed Central

    Nomura, Shuhei; Yamamoto, Shuhei; Niiyama, Ryuma; Okabe, Yoji

    2017-01-01

    Ladybird beetles are high-mobility insects and explore broad areas by switching between walking and flying. Their excellent wing transformation systems enabling this lifestyle are expected to provide large potential for engineering applications. However, the mechanism behind the folding of their hindwings remains unclear. The reason is that ladybird beetles close the elytra ahead of wing folding, preventing the observation of detailed processes occurring under the elytra. In the present study, artificial transparent elytra were transplanted on living ladybird beetles, thereby enabling us to observe the detailed wing-folding processes. The result revealed that in addition to the abdominal movements mentioned in previous studies, the edge and ventral surface of the elytra, as well as characteristic shaped veins, play important roles in wing folding. The structures of the wing frames enabling this folding process and detailed 3D shape of the hindwing were investigated using microcomputed tomography. The results showed that the tape spring-like elastic frame plays an important role in the wing transformation mechanism. Compared with other beetles, hindwings in ladybird beetles are characterized by two seemingly incompatible properties: (i) the wing rigidity with relatively thick veins and (ii) the compactness in stored shapes with complex crease patterns. The detailed wing-folding process revealed in this study is expected to facilitate understanding of the naturally optimized system in this excellent deployable structure. PMID:28507159

  4. Issues Management Process Course # 38401

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Binion, Ula Marie

    The purpose of this training is to advise Issues Management Coordinators (IMCs) on the revised Contractor Assurance System (CAS) Issues Management (IM) process. Terminal Objectives: Understand the Laboratory’s IM process; Understand your role in the Laboratory’s IM process. Learning Objectives: Describe the IM process within the context of the CAS; Describe the importance of implementing an institutional IM process at LANL; Describe the process flow for the Laboratory’s IM process; Apply the definition of an issue; Use available resources to determine initial screening risk levels for issues; Describe the required major process steps for each risk level; Describe the personnel responsibilities for IM process implementation; Access available resources to support IM process implementation.

  5. From somatic pain to psychic pain: The body in the psychoanalytic field.

    PubMed

    Hartung, Thomas; Steinbrecher, Michael

    2017-03-24

    The integration of psyche and soma begins with a baby's earliest contact with his or her parents. With the help of maternal empathy and reverie, β-elements are transformed into α-elements. While we understand this to be the case, we would like to enquire what actually happens to those parts of the affect which have not been transformed? For the most part they may be dealt with by evacuation, but they can also remain within the body, subsequently contributing to psychosomatic symptoms. This paper describes how the body serves as an intermediate store between the psychic (inner) and outer reality. The authors focus on the unconscious communicative process between the analyst and the analysand, and in particular on how psychosomatic symptoms can spread to the analyst's body. The latter may become sensitive to the analysand's psychosomatic symptoms in order to better understand the psychoanalytical process. Sensory processes (visual and auditory) and psychic mechanisms such as projective identification can serve as a means for this communication. One of the first analysts to deal with this topic was Wilhelm Reich. He described one kind of psychosomatic defence, the character armour, as being like a shell, comparing the armour formed by muscle tension with another, more psychical type of armour. This concept can be linked to Winnicott's contribution of the false self and later on to Feldman's concept of compliance as a defence. The authors link further details of the clinical material with theoretical concepts from Joyce McDougall, Piera Aulagnier, Ricardo Rodulfo, and Marilia Aisenstein. With the aid of the complex concept of projective identification, as described by Heinz Weiss, the authors discuss the important question of how the analyst gets in touch with the patient's current psychosomatic state, and describe a specific communication between the body of the psychoanalyst and the body of the patient.
A vignette illustrates in greater detail the relationship between this theoretical understanding and an actual clinical example. In the session described, the analyst reacts to the patient with an intense body-countertransference, taking on the patient's symptoms for a short time. The patient, who had been unable to integrate psyche and soma (whose psyche did not indwell (Winnicott) in his body), projected the untransformed β-elements into his body, where they emerged as bodily symptoms. The body became a kind of intermediate store between inner and outer reality. By internalizing the patient's symptoms in his own body, the analyst created a bodily communication - something in between concerning the inner and the outer reality of both participants of the analytic dyad. The analyst was able to recognize his psychosomatic experience as the fear of dying, and to work through his bodily countertransference. This is described in detail. The emerging understanding of the countertransference helped the analyst to contribute to the patient's process of transforming his symptoms. The analyst was able to help the patient get in touch emotionally with many traumatic situations experienced during his life. The function of the psychosomatic symptoms was to contain the patient's fear of death. These frightening feelings could now be worked through on a psychical level; they could enter into a process of symbol formation so that the psychosomatic symptoms were no longer necessary and disappeared. Copyright © 2017 Institute of Psychoanalysis.

  6. Addressing and Presenting Quality of Satellite Data via Web-Based Services

    NASA Technical Reports Server (NTRS)

    Leptoukh, Gregory; Lynnes, C.; Ahmad, S.; Fox, P.; Zednik, S.; West, P.

    2011-01-01

    With the recent attention to climate change and the proliferation of remote-sensing data utilization, climate modeling and various environmental monitoring and protection applications have begun to rely increasingly on satellite measurements. Research users seek good-quality satellite data, with uncertainties and biases provided for each data point. However, different communities address remote sensing quality issues rather inconsistently and differently. We describe our attempt to systematically characterize, capture, and provision quality and uncertainty information as it applies to the NASA MODIS Aerosol Optical Depth data product. In particular, we note the semantic differences in quality/bias/uncertainty at the pixel, granule, product, and record levels, and we outline the various factors contributing to the uncertainty or error budget. Web-based science analysis and processing tools allow users to access, analyze, and generate visualizations of data while freeing them from directly managing complex data processing operations. These tools provide value by streamlining the data analysis process, but they usually shield users from the details of the data processing steps, algorithm assumptions, caveats, etc. Correct interpretation of the final analysis requires user understanding of how the data have been generated and processed and what potential biases, anomalies, or errors may have been introduced. By providing services that leverage data lineage provenance and domain expertise, expert systems can be built to aid the user in understanding data sources, processing, and the suitability for use of products generated by the tools. We describe our experiences developing a semantic, provenance-aware, expert-knowledge advisory system applied to the NASA Giovanni web-based Earth science data analysis tool as part of the ESTO AIST-funded Multi-sensor Data Synergy Advisor project.

  7. Detailed 3D representations for object recognition and modeling.

    PubMed

    Zia, M Zeeshan; Stark, Michael; Schiele, Bernt; Schindler, Konrad

    2013-11-01

    Geometric 3D reasoning at the level of objects has received renewed attention recently in the context of visual scene understanding. The level of geometric detail, however, is typically limited to qualitative representations or coarse boxes. This is linked to the fact that today's object class detectors are tuned toward robust 2D matching rather than accurate 3D geometry, encouraged by bounding-box-based benchmarks such as Pascal VOC. In this paper, we revisit ideas from the early days of computer vision, namely, detailed, 3D geometric object class representations for recognition. These representations can recover geometrically far more accurate object hypotheses than just bounding boxes, including continuous estimates of object pose and 3D wireframes with relative 3D positions of object parts. In combination with robust techniques for shape description and inference, we outperform state-of-the-art results in monocular 3D pose estimation. In a series of experiments, we analyze our approach in detail and demonstrate novel applications enabled by such an object class representation, such as fine-grained categorization of cars and bicycles, according to their 3D geometry, and ultrawide baseline matching.

  8. Dealing with Beam Structure in PIXIE

    NASA Technical Reports Server (NTRS)

    Fixsen, D. J.; Kogut, Alan; Hill, Robert S.; Nagler, Peter C.; Seals, Lenward T., III; Howard, Joseph M.

    2016-01-01

    Measuring the B-mode polarization of the CMB radiation requires a detailed understanding of the projection of the detector onto the sky. We show how the combination of scan strategy and processing generates a cylindrical beam for the spectrum measurement. Both the instrumental design and the scan strategy reduce the cross coupling between the temperature variations and the B-modes. As with other polarization measurements some post processing may be required to eliminate residual errors.

  9. Understanding the Relative Contributions of Lower-Level Word Processes, Higher-Level Processes, and Working Memory to Reading Comprehension Performance in Proficient Adult Readers

    ERIC Educational Resources Information Center

    Hannon, Brenda

    2012-01-01

    Although a considerable amount of evidence has been amassed regarding the contributions of lower-level word processes, higher-level processes, and working memory to reading comprehension, little is known about the relationships among these sources of individual differences or their relative contributions to reading comprehension performance. This…

  10. Making Scientific Data Usable and Useful

    NASA Astrophysics Data System (ADS)

    Satwicz, T.; Bharadwaj, A.; Evans, J.; Dirks, J.; Clark Cole, K.

    2017-12-01

    Transforming geological data into information that has broad scientific and societal impact is a process fraught with barriers. Data sets and tools are often reported to have poor user experiences (UX) that make scientific work more challenging than it needs to be. While many other technical fields have benefited from ongoing improvements to the UX of their tools (e.g., healthcare and financial services), scientists are faced with using tools that are labor intensive and not intuitive. Our research team has been involved in a multi-year effort to understand and improve the UX of scientific tools and data sets. We use a User-Centered Design (UCD) process that involves naturalistic behavioral observation and other qualitative research methods adopted from Human-Computer Interaction (HCI) and related fields. Behavioral observation involves having users complete common tasks on data sets, tools, and websites to identify usability issues and understand their severity. We measure how successfully users complete tasks and diagnose the cause of any failures. Behavioral observation is paired with in-depth interviews in which users describe their process for generating results (from initial inquiry to final results). By asking detailed questions we unpack common patterns and challenges scientists experience while working with data. We've found that tools built using the UCD process can have a large impact on scientists' workflows and greatly reduce the time it takes to process data before analysis. It is often challenging to understand the organization and nuances of data across scientific fields. By better understanding how scientists work, we can create tools that make routine tasks less labor intensive, make data easier to find, and solve common issues with discovering new data sets and engaging in interdisciplinary research.
There is a tremendous opportunity for advancing scientific knowledge and helping the public benefit from that work by creating intuitive, interactive, and powerful tools and resources for generating knowledge. The pathway to achieving that is through building a detailed understanding of users and their needs, then using this knowledge to inform the design of the data products, tools, and services scientists and non-scientists use to do their work.

  11. Neoblasts and the evolution of whole-body regeneration.

    PubMed

    Gehrke, Andrew R; Srivastava, Mansi

    2016-10-01

    The molecular mechanisms underlying whole-body regeneration are best understood in the planarian flatworm Schmidtea mediterranea, where a heterogeneous population of somatic stem cells called neoblasts provides new tissue for regeneration of essentially any missing body part. Studies on Schmidtea have provided a detailed description of neoblasts and their role in regeneration, but comparatively little is known about the evolutionary history of these cells and their underlying developmental programs. Acoels, an understudied group of aquatic worms that are also capable of extensive whole-body regeneration, have arisen as an attractive group to study the evolution of regenerative processes due to their phylogenetically distant position relative to flatworms. Here, we review the phylogenetic distribution of neoblast cells and compare their anatomical locations, transcriptional profiles, and roles during regeneration in flatworms and acoels to understand the evolution of whole-body regeneration. While the general role of neoblasts appears conserved in species separated by 550 million years of evolution, the extrinsic inputs they receive during regeneration can vary, making the distinction between homology and convergence of mechanism unclear. A more detailed understanding of the precise mechanisms behind whole-body regeneration in diverse phyla is necessary to understand the evolutionary history of this powerful process. Copyright © 2016 Elsevier Ltd. All rights reserved.

  12. Microspectroscopic Analysis of Anthropogenic- and Biogenic-Influenced Aerosol Particles during the SOAS Field Campaign

    NASA Astrophysics Data System (ADS)

    Ault, A. P.; Bondy, A. L.; Nhliziyo, M. V.; Bertman, S. B.; Pratt, K.; Shepson, P. B.

    2013-12-01

    During the summer, the southeastern United States experiences a cooling haze due to the interaction of anthropogenic and biogenic aerosol sources. An objective of the summer 2013 Southern Oxidant and Aerosol Study (SOAS) was to improve our understanding of how trace gases and aerosols contribute to this relative cooling through light scattering and absorption. Improving understanding of biogenic-anthropogenic interactions through secondary organic aerosol (SOA) formation on primary aerosol cores requires detailed physicochemical characterization of the particles after uptake and processing. Our measurements focus on single-particle analysis of aerosols in the accumulation mode (300-1000 nm) collected using a micro-orifice uniform deposit impactor (MOUDI) at the Centreville, Alabama SEARCH site. Particles were characterized using an array of microscopic and spectroscopic techniques, including scanning electron microscopy (SEM), transmission electron microscopy (TEM), energy-dispersive X-ray analysis (EDX), and Raman microspectroscopy. These analyses provide detailed information on particle size, morphology, elemental composition, and functional groups. This information is combined with mapping capabilities to explore individual-particle spatial patterns and how they impact structural characteristics. The improved understanding will be used to explore how sources and processing (such as SOA coating of soot) change particle structure (i.e., core-shell) and how the altered optical properties impact air quality/climate effects on a regional scale.

  13. Industry and Patient Perspectives on Child Participation in Clinical Trials: The Pediatric Assent Initiative Survey Report.

    PubMed

    Lombardi, Donald; Squires, Liza; Sjostedt, Philip; Eichler, Irmgard; Turner, Mark A; Thompson, Charles

    2018-01-01

    Obtaining assent from children participating in clinical trials acknowledges autonomy and developmental ability to contribute to the consent process. This critical step in pediatric drug development remains poorly understood, with significant room for improving the clarity, efficiency, and implementation of the assent process. Beyond the ethical necessity of informing children about their treatment, the assent process provides the advantages of including children in discussions about their diagnosis and treatment, allowing greater understanding of the interventions included in the study. A formalized assent process acknowledges the child as a volunteer and provides a forum for questions and feedback. Legal, cultural, and social differences have historically prevented the development of clear, concise, and accessible materials to ensure children understand the clinical trial design. Published guidelines on obtaining pediatric assent are vague, with many decisions left to local institutional review boards and ethics committees, underscoring the need for collaboratively designed standards. To address this need, 2 surveys were conducted to quantify perspectives on assent in pediatric clinical trials. Two digital surveys were circulated in the United States and internationally (October 2014 to January 2015). The first survey targeted children, parents, and/or caregivers. The second polled clinical trial professionals on their organizations' experience and policies regarding pediatric assent. Forty-five respondents completed the child and parent/caregiver survey; 57 respondents completed the industry survey. Respondents from both surveys detailed experiences with clinical trials and the impediments to securing assent, offering potential solutions to attaining assent in pediatric patients.
An important opportunity exists for standardized practices and tools to ensure pediatric patients make well-informed decisions regarding their participation in clinical trials, using materials appropriate to their level of understanding. These tools would establish a baseline standard for the assent process and be made available to researchers, improving their ability to secure assent from young patients.

  14. Quantifying irreversible movement in steep, fractured bedrock permafrost on Matterhorn (CH)

    NASA Astrophysics Data System (ADS)

    Weber, Samuel; Beutel, Jan; Faillettaz, Jérome; Hasler, Andreas; Krautblatter, Michael; Vieli, Andreas

    2017-02-01

    Understanding rock slope kinematics in steep, fractured bedrock permafrost is a challenging task. Recent laboratory studies have provided enhanced understanding of rock fatigue and fracturing in cold environments, but their findings have not been confirmed by field studies. This study presents a unique time series of fracture kinematics, rock temperatures and environmental conditions at 3500 m a.s.l. on the steep, strongly fractured Hörnligrat of the Matterhorn (Swiss Alps). Thanks to 8 years of continuous data, the longer-term evolution of fracture kinematics in permafrost can be analyzed with an unprecedented level of detail. Evidence of common trends in the spatiotemporal patterns of fracture kinematics was found: a partly reversible seasonal movement can be observed at all locations, with variable amplitudes. In the wider context of rock slope stability assessment, we propose separating the reversible (elastic) component of fracture kinematics, caused by thermoelastic strains, from the irreversible (plastic) component due to other processes. A regression analysis between temperature and fracture displacement shows that all instrumented fractures exhibit reversible displacements that dominate fracture kinematics in winter. Furthermore, removing this reversible component from the observed displacement enables us to quantify the irreversible component. From this, a new metric, termed the index of irreversibility, is proposed to quantify the relative irreversibility of fracture kinematics. This new index can identify periods when fracture displacements are dominated by irreversible processes. For many sensors, enhanced irreversible fracture displacement is observed in summer, and its initiation coincides with the onset of positive rock temperatures. This likely indicates thawing-related processes, such as meltwater percolation into fractures, as a forcing mechanism for irreversible displacements.
For a few instrumented fractures, irreversible displacements were found at the onset of the freezing period, suggesting that cryogenic processes act as a driving factor through increasing ice pressure. The proposed analysis provides a tool for investigating and better understanding processes related to irreversible kinematics.
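The analysis described in this abstract (a thermoelastic fit between temperature and displacement, a residual treated as the irreversible component, and an index built from it) can be sketched numerically. This is a minimal illustration, not the authors' implementation; the function names and the simple normalization used for the index are assumptions:

```python
import numpy as np

def split_kinematics(temperature, displacement):
    """Split a fracture-displacement series into a reversible component,
    modeled as linear in rock temperature (thermoelastic strain), and an
    irreversible residual. Minimal sketch of the regression step."""
    slope, intercept = np.polyfit(temperature, displacement, 1)
    reversible = slope * temperature + intercept
    irreversible = displacement - reversible
    return reversible, irreversible

def irreversibility_index(irreversible, window=24):
    """Hypothetical index: smoothed increments of the irreversible
    component, normalized by their typical fluctuation. Sustained large
    values flag periods dominated by irreversible processes."""
    increments = np.diff(irreversible)
    scale = np.std(increments) + 1e-12  # guard against division by zero
    kernel = np.ones(window) / window
    return np.convolve(increments, kernel, mode="valid") / scale
```

On a series combining a temperature-driven oscillation with slow creep, the residual returned by `split_kinematics` tracks the creep, which is the behavior the index is meant to expose.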

  15. Meteorology, Macrophysics, Microphysics, Microwaves, and Mesoscale Modeling of Mediterranean Mountain Storms: The M8 Laboratory

    NASA Technical Reports Server (NTRS)

    Starr, David O. (Technical Monitor); Smith, Eric A.

    2002-01-01

    Comprehensive understanding of the microphysical nature of Mediterranean storms can be accomplished by a combination of in situ meteorological data analysis and radar-passive microwave data analysis, effectively integrated with numerical modeling studies at various scales, from the synoptic scale down through the mesoscale, the cloud macrophysical scale, and ultimately the cloud microphysical scale. The microphysical properties of severe storms, and the controls on them, are intrinsically related to the meteorological processes under which the storms have evolved, processes which eventually select and control the dominant microphysical properties themselves. This involves intense convective development, stratiform decay, orographic lifting, and sloped frontal lifting processes, as well as the associated vertical motions and thermodynamical instabilities governing physical processes that affect details of the size distributions and fall rates of the various types of hydrometeors found within the storm environment. In the case of hazardous Mediterranean storms, represented in this study by three mountain storms that produced damaging floods in northern Italy between 1992 and 2000, developing a comprehensive microphysical interpretation requires an understanding of the multiple phases of storm evolution and the heterogeneous nature of precipitation fields within a storm domain. This involves convective development, stratiform transition and decay, orographic lifting, and sloped frontal lifting processes. It also involves vertical motions and thermodynamical instabilities governing physical processes that determine details of the liquid/ice water contents, size distributions, and fall rates of the various modes of hydrometeors found within hazardous storm environments.

  16. Modeling of the HiPco process for carbon nanotube production. II. Reactor-scale analysis

    NASA Technical Reports Server (NTRS)

    Gokcen, Tahir; Dateo, Christopher E.; Meyyappan, M.

    2002-01-01

    The high-pressure carbon monoxide (HiPco) process, developed at Rice University, has been reported to produce single-walled carbon nanotubes from gas-phase reactions of iron carbonyl in carbon monoxide at high pressures (10-100 atm). Computational modeling is used here to develop an understanding of the HiPco process. A detailed kinetic model of the HiPco process that includes precursor decomposition, metal cluster formation and growth, and carbon nanotube growth was developed in the previous article (Part I). Decomposition of precursor molecules is necessary to initiate metal cluster formation. The metal clusters serve as catalysts for carbon nanotube growth. The diameter of the metal clusters and the number of atoms in these clusters are essential inputs for predicting carbon nanotube formation and growth, which is then modeled by the Boudouard reaction with metal catalysts. Based on the detailed model simulations, a reduced kinetic model was also developed in Part I for use in reactor-scale flowfield calculations. Here this reduced kinetic model is integrated with a two-dimensional axisymmetric reactor flow model to predict reactor performance. Carbon nanotube growth is examined with respect to several process variables (peripheral jet temperature, reactor pressure, and Fe(CO)5 concentration) with the use of the axisymmetric model, and the computed results are compared with existing experimental data. The model yields most of the qualitative trends observed in the experiments and helps in understanding the fundamental processes in HiPco carbon nanotube production.
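The first step of any such kinetic model, thermal decomposition of the precursor, is a first-order rate process and can be written down in closed form. The sketch below is a toy picture only (the rate constant is an assumed input, and the article's model couples many further steps such as cluster nucleation, growth, and the Boudouard reaction):

```python
import numpy as np

def precursor_decay(c0, k, t):
    """Concentration of a precursor (e.g., Fe(CO)5) undergoing first-order
    thermal decomposition, dc/dt = -k*c, solved analytically:
    c(t) = c0 * exp(-k*t). k is an assumed rate constant."""
    return c0 * np.exp(-k * np.asarray(t, dtype=float))

def decomposed_fraction(k, t):
    """Fraction of precursor consumed by time t, i.e., the material
    available to seed metal-cluster formation in this toy picture."""
    return 1.0 - np.exp(-k * np.asarray(t, dtype=float))
```

At the half-life t = ln(2)/k, half of the initial precursor has decomposed, which provides a quick sanity check on the closed-form solution.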

  17. Quantification of fossil fuel CO2 at the building/street level for large US cities

    NASA Astrophysics Data System (ADS)

    Gurney, K. R.; Razlivanov, I. N.; Song, Y.

    2012-12-01

    Quantification of fossil fuel CO2 emissions from the bottom-up perspective is a critical element in emerging plans for a global, integrated carbon monitoring system (CMS). A space/time-explicit emissions data product can act as both a verification and planning system. It can verify atmospheric CO2 measurements (in situ and remote) and offer detailed mitigation information to management authorities in order to optimize the mix of mitigation efforts. Here, we present the Hestia Project, an effort aimed at building a high-resolution (e.g., building- and road-link-specific, hourly) fossil fuel CO2 emissions data product for the urban domain as a pilot effort for a CMS. A complete data product has been built for the city of Indianapolis, and preliminary quantification has been completed for Los Angeles and Phoenix (see figure). The effort in Indianapolis is now part of a larger effort aimed at a convergent top-down/bottom-up assessment of greenhouse gas emissions, called INFLUX. Our urban-level quantification relies on a mixture of data and modeling structures. We start with the sector-specific Vulcan Project estimate at the mix of geocoded and county-wide levels. The Hestia aim is to distribute the Vulcan result in space and time. Two components take the majority of the effort: buildings and onroad emissions. In collaboration with our INFLUX colleagues, we are transporting these high-resolution emissions through an atmospheric transport model for a forward comparison of the Hestia data product with atmospheric measurements collected on aircraft and cell towers. In preparation for a formal urban-scale inversion, these forward comparisons offer insights into improving both our emissions data product and our measurement strategies. A key benefit of the approach taken in this study is the tracking and archiving of fuel and process-level detail (e.g., combustion process, other pollutants), allowing for a more thorough understanding and analysis of energy throughputs in the urban environment. Quantification of fossil fuel emissions, however, is one piece in a larger conception of cities as complex, dynamic socio-technological systems, and the Hestia effort is at the very beginning stages of connecting to the large community of research approaching cities from other perspectives and utilizing other tools. Through analysis of the three cities for which we have quantified fossil fuel CO2 emissions, and recognition of the current threads emerging in urban research, we attempt to offer insight into understanding cities via the mechanistic quantification of energy and CO2 emissions.
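The distribution step described in this abstract (apportioning a sector or county total across individual buildings or road links) reduces, in its simplest form, to proportional allocation against an activity proxy such as modeled energy use or traffic volume. A minimal sketch, with hypothetical names; Hestia's actual space/time modeling is far more detailed:

```python
def downscale_emissions(sector_total, weights):
    """Allocate a sector-level fossil-fuel CO2 total across spatial units
    (buildings, road links) in proportion to an activity proxy such as
    floor area or traffic counts. Weights need not be normalized."""
    total_weight = sum(weights.values())
    if total_weight <= 0:
        raise ValueError("weights must have a positive sum")
    return {unit: sector_total * w / total_weight for unit, w in weights.items()}
```

For example, `downscale_emissions(1000.0, {"bldg_a": 2.0, "bldg_b": 3.0})` allocates 400.0 and 600.0 units respectively, and the allocations always sum back to the sector total, preserving mass balance with the parent inventory.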

  18. Toward theoretical understanding of the fertility preservation decision-making process: examining information processing among young women with cancer.

    PubMed

    Hershberger, Patricia E; Finnegan, Lorna; Altfeld, Susan; Lake, Sara; Hirshfeld-Cytron, Jennifer

    2013-01-01

    Young women with cancer now face the complex decision about whether to undergo fertility preservation. Yet little is known about how these women process information involved in making this decision. The purpose of this article is to expand theoretical understanding of the decision-making process by examining aspects of information processing among young women diagnosed with cancer. Using a grounded theory approach, 27 women with cancer participated in individual, semistructured interviews. Data were coded and analyzed using constant-comparison techniques that were guided by 5 dimensions within the Contemplate phase of the decision-making process framework. In the first dimension, young women acquired information primarily from clinicians and Internet sources. Experiential information, often obtained from peers, occurred in the second dimension. Preferences and values were constructed in the third dimension as women acquired factual, moral, and ethical information. Women desired tailored, personalized information that was specific to their situation in the fourth dimension; however, women struggled with communicating these needs to clinicians. In the fifth dimension, women offered detailed descriptions of clinician behaviors that enhance or impede decisional debriefing. Better understanding of theoretical underpinnings surrounding women's information processes can facilitate decision support and improve clinical care.

  19. Insights into DNA-mediated interparticle interactions from a coarse-grained model

    NASA Astrophysics Data System (ADS)

    Ding, Yajun; Mittal, Jeetain

    2014-11-01

    DNA-functionalized particles have great potential for the design of complex self-assembled materials. The major hurdle in realizing crystal structures from DNA-functionalized particles is expected to be kinetic barriers that trap the system in metastable amorphous states. Therefore, it is vital to explore the molecular details of particle assembly processes in order to understand the underlying mechanisms. Molecular simulations based on coarse-grained models can provide a convenient route to explore these details. Most of the currently available coarse-grained models of DNA-functionalized particles ignore key chemical and structural details of DNA behavior. These models therefore are limited in scope for studying experimental phenomena. In this paper, we present a new coarse-grained model of DNA-functionalized particles which incorporates some of the desired features of DNA behavior. The coarse-grained DNA model used here provides explicit DNA representation (at the nucleotide level) and complementary interactions between Watson-Crick base pairs, which lead to the formation of single-stranded hairpin and double-stranded DNA. Aggregation between multiple complementary strands is also prevented in our model. We study interactions between two DNA-functionalized particles as a function of DNA grafting density, lengths of the hybridizing and non-hybridizing parts of DNA, and temperature. The calculated free energies as a function of pair distance between particles qualitatively resemble experimental measurements of DNA-mediated pair interactions.
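Pair free-energy profiles of the kind computed in this work are often estimated from simulation by histogramming the sampled pair distance and applying Boltzmann inversion, F(r) = -kT ln p(r) up to an additive constant. The sketch below illustrates that generic technique, not the authors' specific method; the function name and bin count are assumptions, and the radial Jacobian correction is omitted for brevity:

```python
import numpy as np

def pmf_from_distances(distances, bins=50, kT=1.0):
    """Boltzmann-inversion estimate of a pair free-energy profile from
    sampled interparticle distances: F(r) = -kT * ln p(r) + const.
    Sketch only; production calculations typically use umbrella sampling
    with reweighting and correct for the r^2 Jacobian."""
    hist, edges = np.histogram(distances, bins=bins, density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    populated = hist > 0  # log of empty bins is undefined
    free_energy = -kT * np.log(hist[populated])
    free_energy -= free_energy.min()  # shift so the minimum is zero
    return centers[populated], free_energy
```

Distances sampled around a preferred separation yield a free-energy minimum at that separation, the qualitative shape compared against experimental DNA-mediated pair interactions.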

  20. Successful Negotiation in Schools: Management, Unions, Employees, and Citizens.

    ERIC Educational Resources Information Center

    Herman, Jerry J.; Herman, Janice L.

    This book is a how-to-do-it roadmap that presents practical details on the important aspects of collective bargaining at the local school district level. It details all of the strategies, tasks, events, and influences that bear on the collective bargaining process from the initial certification election of a union through the preparation for…

  1. Chronobiology and obesity: Interactions between circadian rhythms and energy regulation.

    PubMed

    Summa, Keith C; Turek, Fred W

    2014-05-01

    Recent advances in the understanding of the molecular, genetic, neural, and physiologic basis for the generation and organization of circadian clocks in mammals have revealed profound bidirectional interactions between the circadian clock system and pathways critical for the regulation of metabolism and energy balance. The discovery that mice harboring a mutation in the core circadian gene circadian locomotor output cycles kaput (Clock) develop obesity and evidence of the metabolic syndrome represented a seminal moment for the field, clearly establishing a link between circadian rhythms, energy balance, and metabolism at the genetic level. Subsequent studies have characterized in great detail the depth and magnitude of the circadian clock's crucial role in regulating body weight and other metabolic processes. Dietary nutrients have been shown to influence circadian rhythms at both molecular and behavioral levels; and many nuclear hormone receptors, which bind nutrients as well as other circulating ligands, have been observed to exhibit robust circadian rhythms of expression in peripheral metabolic tissues. Furthermore, the daily timing of food intake has itself been shown to affect body weight regulation in mammals, likely through, at least in part, regulation of the temporal expression patterns of metabolic genes. Taken together, these and other related findings have transformed our understanding of the important role of time, on a 24-h scale, in the complex physiologic processes of energy balance and coordinated regulation of metabolism. This research has implications for human metabolic disease and may provide unique and novel insights into the development of new therapeutic strategies to control and combat the epidemic of obesity. © 2014 American Society for Nutrition.

  2. Direct Scaling of Leaf-Resolving Biophysical Models from Leaves to Canopies

    NASA Astrophysics Data System (ADS)

    Bailey, B.; Mahaffee, W.; Hernandez Ochoa, M.

    2017-12-01

    Recent advances in the development of biophysical models and high-performance computing have enabled rapid increases in the level of detail that can be represented by simulations of plant systems. However, increasingly detailed models typically require increasingly detailed inputs, which can be a challenge to specify accurately. In this work, we explore the use of terrestrial LiDAR scanning data to accurately specify geometric inputs for high-resolution biophysical models, enabling direct up-scaling of leaf-level biophysical processes. Terrestrial LiDAR scans generate "clouds" of millions of points that map out the geometric structure of the area of interest. However, points alone are often not particularly useful in generating geometric model inputs, as additional data-processing techniques are required to provide the necessary information regarding vegetation structure. A new method was developed that directly reconstructs as many leaves as possible that are in view of the LiDAR instrument, and uses a statistical backfilling technique to ensure that the overall leaf area and orientation distribution match those of the actual vegetation being measured. This detailed structural data is used to provide inputs for leaf-resolving models of radiation, microclimate, evapotranspiration, and photosynthesis. Model complexity is afforded by utilizing graphics processing units (GPUs), which allow for simulations that resolve scales ranging from leaves to canopies. The model system was used to explore how heterogeneity in canopy architecture at various scales affects scaling of biophysical processes from leaves to canopies.

  3. Development of a high-content screening assay panel to accelerate mechanism of action studies for oncology research.

    PubMed

    Towne, Danli L; Nicholl, Emily E; Comess, Kenneth M; Galasinski, Scott C; Hajduk, Philip J; Abraham, Vivek C

    2012-09-01

    Efficient elucidation of the biological mechanism of action of novel compounds remains a major bottleneck in the drug discovery process. To address this need in the area of oncology, we report the development of a multiparametric high-content screening assay panel at the level of single cells to dramatically accelerate understanding the mechanism of action of cell growth-inhibiting compounds on a large scale. Our approach is based on measuring 10 established end points associated with mitochondrial apoptosis, cell cycle disruption, DNA damage, and cellular morphological changes in the same experiment, across three multiparametric assays. The data from all of the measurements taken together are expected to help increase our current understanding of target protein functions, constrain the list of possible targets for compounds identified using phenotypic screens, and identify off-target effects. We have also developed novel data visualization and phenotypic classification approaches for detailed interpretation of individual compound effects and navigation of large collections of multiparametric cellular responses. We expect this general approach to be valuable for drug discovery across multiple therapeutic areas.

  4. Statistical metrology—measurement and modeling of variation for advanced process development and design rule generation

    NASA Astrophysics Data System (ADS)

    Boning, Duane S.; Chung, James E.

    1998-11-01

    Advanced process technology will require more detailed understanding and tighter control of variation in devices and interconnects. The purpose of statistical metrology is to provide methods to measure and characterize variation, to model systematic and random components of that variation, and to understand the impact of variation on both yield and performance of advanced circuits. Of particular concern are spatial or pattern dependencies within individual chips; such systematic variation within the chip can have a much larger impact on performance than wafer-level random variation. Statistical metrology methods will play an important role in the creation of design rules for advanced technologies. For example, a key issue in multilayer interconnect is the uniformity of interlevel dielectric (ILD) thickness within the chip. For the case of ILD thickness, we describe phases of statistical metrology development and application to understanding and modeling thickness variation arising from chemical-mechanical polishing (CMP). These phases include screening experiments, including the design of test structures and test masks to gather electrical or optical data; techniques for statistical decomposition and analysis of the data; and approaches to calibrating empirical and physical variation models. These models can be integrated with circuit CAD tools to evaluate different process integration or design rule strategies. One focus in the generation of interconnect design rules is guidelines for the use of "dummy fill" or "metal fill" to improve the uniformity of underlying metal density and thus improve the uniformity of oxide thickness within the die. Trade-offs that can be evaluated via statistical metrology include the improvements to uniformity possible versus the effect of increased capacitance due to additional metal.
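The statistical-decomposition phase described above can be illustrated with a simple nested model: measurements indexed by (die, site) are split into a grand mean, die-level (wafer-scale) offsets, a within-die systematic pattern shared across dies, and a random residual. A sketch under that additive assumption; the published methodology is more elaborate:

```python
import numpy as np

def decompose_variation(thickness):
    """Decompose ILD-thickness measurements of shape (n_dies, n_sites)
    into a grand mean, per-die offsets (wafer-level variation), a
    within-die systematic pattern (site means across dies), and a
    random residual. Additive two-way model without interaction."""
    grand = thickness.mean()
    die_offset = thickness.mean(axis=1, keepdims=True) - grand
    pattern = thickness.mean(axis=0, keepdims=True) - grand
    residual = thickness - grand - die_offset - pattern
    return grand, die_offset, pattern, residual
```

Averaging each site across dies cancels the random component, isolating the pattern-dependent term that dummy-fill rules are meant to suppress.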

  5. Gesture and the Mathematics of Motion.

    ERIC Educational Resources Information Center

    Noble, Tracy

    This paper investigates one high school student's use of gestures in an interview context in which he worked on the problem of understanding graphical representations of motion. The goal of the investigation was to contribute a detailed analysis of the process of learning as it occurred over a short time period in order to contribute to the…

  6. Booni Valley Women's Perceptions of Schooling: Hopes and Barriers

    ERIC Educational Resources Information Center

    Pardhan, Almina

    2005-01-01

    Schooling for girls is a relatively recent process in Booni Valley, a remote mountainous village in Chitral District, Pakistan. It is impacting greatly upon the lives of the women. This study has taken an ethnographic perspective and has assumed that an understanding of women's schooling requires a detailed, in-depth account of women's actual…

  7. Student Teachers' Patterns of Reflection in the Context of Teaching Practice

    ERIC Educational Resources Information Center

    Toom, Auli; Husu, Jukka; Patrikainen, Sanna

    2015-01-01

    This study clarifies the basic structure of student teachers' reflective thinking. It presents a constructivist account of teacher knowledge through a detailed analysis of various patterns of reflection in student teacher portfolios. We aim to gain a greater understanding of the process and outcomes of portfolio writing in the context of teaching…

  8. Subterranean ventilation of allochthonous CO2 governs net CO2 exchange in a semiarid Mediterranean grassland

    USDA-ARS?s Scientific Manuscript database

    Recent research highlights the important role of (semi-) arid ecosystems in the global carbon (C) cycle. However, detailed process based investigations are still necessary in order to fully understand how drylands behave and to determine the main factors currently affecting their C balance with the ...

  9. Meadow management and treatment options [chapter 8

    Treesearch

    Jeanne C. Chambers; Jerry R. Miller

    2011-01-01

    Restoration and management objectives and approaches are most effective when based on an understanding of ecosystem processes and the long- and short-term causes of disturbance (Wohl and others 2005). As detailed in previous chapters, several factors are critical in developing effective management strategies for streams and their associated meadow ecosystems in the...

  10. An acceptable role for computers in the aircraft design process

    NASA Technical Reports Server (NTRS)

    Gregory, T. J.; Roberts, L.

    1980-01-01

    Some of the reasons why the computerization trend is not wholly accepted are explored for two typical cases: computer use in the technical specialties and computer use in aircraft synthesis. The factors that limit acceptance are traced, in part, to the large resources needed to understand the details of computer programs, the inability to include measured data as input to many of the theoretical programs, and the presentation of final results without supporting intermediate answers. Other factors are due solely to technical issues, such as limited detail in aircraft synthesis and major simplifying assumptions in the technical specialties. These factors and others can be influenced by the technical specialist and aircraft designer. Some of these factors may become less significant as the computerization process evolves, but some issues, such as understanding large integrated systems, may remain issues in the future. Suggestions for improved acceptance include publishing computer programs so that they may be reviewed, edited, and read. Other mechanisms include extensive modularization of programs and ways to include measured information as part of the input to theoretical approaches.

  11. Improving identification and management of partner violence: examining the process of academic detailing: a qualitative study

    PubMed Central

    2011-01-01

    Background Many physicians do not routinely inquire about intimate partner violence. Purpose This qualitative study explores the process of academic detailing as an intervention to change physician behavior with regard to intimate partner violence (IPV) identification and documentation. Method A non-physician academic detailer provided a seven-session modular curriculum over a two-and-a-half-month period. The detailer noted written details of each training session. Audiotapes of training sessions and semi-structured exit interviews with each physician were recorded and transcribed. Transcriptions were qualitatively and thematically coded and analyzed using ATLAS.ti®. Results All three study physicians reported increased clarity with regard to the scope of their responsibility to their patients experiencing IPV. They also reported increased levels of comfort in the effective identification and appropriate documentation of IPV and the provision of ongoing support to the patient, including referrals to specialized community services. Conclusion Academic detailing, if presented by a supportive and knowledgeable academic detailer, shows promise to improve physician attitudes and practices with regard to patients in violent relationships. PMID:21679450

  12. Use of complex adaptive systems metaphor to achieve professional and organizational change.

    PubMed

    Rowe, Ann; Hogarth, Annette

    2005-08-01

    This paper uses the experiences of a programme designed to bring about change in the performance of public health nurses (health visitors and school nurses) in an inner city primary care trust to explore the issues of professional and organizational change in health care organizations. The United Kingdom government has given increasing emphasis to programmes of modernization within the National Health Service. A central facet of this policy shift has been an expectation of behaviour and practice change by health care professionals. Change was brought about through use of a Complex Adaptive Systems approach. This enabled change to be seen as an inclusive, evolving and unpredictable process rather than one which is linear and mechanistic. The paper examines in detail how the use of concepts and metaphors associated with Complex Adaptive Systems influenced the development of the programme, its implementation and outcomes. The programme resulted in extensive change in professional behaviour and service delivery, and transformational change in the organizational structures and processes of the employing organization. This gave greater opportunities for experimentation and innovation, leading to new developments in service delivery, but also meant higher levels of uncertainty, responsibility, decision-making and risk management for practitioners. Using a Complex Adaptive Systems approach was helpful for developing alternative views of change and for understanding why and how some aspects of change were more successful than others. Its use encouraged the confrontation of some long-standing assumptions about change and service delivery patterns in the National Health Service, and the process exposed challenging tensions within the Service. The consequent destabilising of organizational and professional norms resulted in considerable emotional impacts for practitioners, an area which was found to be underplayed within the Complex Adaptive Systems literature.
A Complex Adaptive Systems approach can support change, in particular a recognition and understanding of the emergence of unexpected structures, patterns and processes. The approach can support nurses to change their behaviour and innovate, but requires high levels of accountability, individual and professional creativity.

  13. Vadose zone processes that control landslide initiation and debris flow propagation

    NASA Astrophysics Data System (ADS)

    Sidle, Roy C.

    2015-04-01

    Advances in the areas of geotechnical engineering, hydrology, mineralogy, geomorphology, geology, and biology have individually advanced our understanding of factors affecting slope stability; however, the interactions among these processes and attributes as they affect the initiation and propagation of landslides and debris flows are not well understood. Here the importance of interactive vadose zone processes is emphasized in relation to the mechanisms, initiation, mode, and timing of rainfall-initiated landslides that are triggered by positive pore water accretion, loss of soil suction and increase in overburden weight, and long-term cumulative rain water infiltration. Large- and small-scale preferential flow pathways can both contribute to and mitigate instability, by respectively concentrating and dispersing subsurface flow. These mechanisms are influenced by soil structure, lithology, landforms, and biota. Conditions conducive to landslide initiation by infiltration versus exfiltration are discussed relative to bedrock structure and joints. The effects of rhizosphere processes on slope stability are examined, including root reinforcement of soil mantles, evapotranspiration, and how root structures affect preferential flow paths. At a larger scale, the nexus between hillslope landslides and in-channel debris flows is examined with emphasis on understanding the timing of debris flows relative to chronic and episodic infilling processes, as well as the episodic nature of large rainfall and related stormflow generation in headwater streams. The hydrogeomorphic processes and conditions that determine whether or not landslides immediately mobilize into debris flows are important for predicting the timing and extent of devastating debris flow runout in steep terrain. Given the spatial footprint of individual landslides, it is necessary to assess vadose zone processes at appropriate scales to ascertain impacts on mass wasting phenomena.
Articulating the appropriate level of detail of small-scale vadose zone processes into landslide models is a particular challenge. As such, understanding flow pathways in regoliths susceptible to mass movement is critical, including distinguishing between conditions conducive to vertical recharge of water through relatively homogeneous soil mantles and conditions where preferential flow dominates - either by rapid infiltration and lateral flow through interconnected preferential flow networks or via exfiltration through bedrock fractures. These different hydrologic scenarios have major implications for the occurrence, timing, and mode of slope failures.

  14. Protein O-GlcNAcylation: a new signaling paradigm for the cardiovascular system

    PubMed Central

    Laczy, Boglarka; Hill, Bradford G.; Wang, Kai; Paterson, Andrew J.; White, C. Roger; Xing, Dongqi; Chen, Yiu-Fai; Darley-Usmar, Victor; Oparil, Suzanne; Chatham, John C.

    2009-01-01

    The posttranslational modification of serine and threonine residues of nuclear and cytoplasmic proteins by the O-linked attachment of the monosaccharide β-N-acetylglucosamine (O-GlcNAc) is a highly dynamic and ubiquitous protein modification. Protein O-GlcNAcylation is rapidly emerging as a key regulator of critical biological processes including nuclear transport, translation and transcription, signal transduction, cytoskeletal reorganization, proteasomal degradation, and apoptosis. Increased levels of O-GlcNAc have been implicated as a pathogenic contributor to glucose toxicity and insulin resistance, which are both major hallmarks of diabetes mellitus and diabetes-related cardiovascular complications. Conversely, there is a growing body of data demonstrating that the acute elevation of O-GlcNAc levels is an endogenous stress response designed to enhance cell survival. Reports on the effect of altered O-GlcNAc levels on the heart and cardiovascular system have been growing rapidly over the past few years and have implicated a role for O-GlcNAc in contributing to the adverse effects of diabetes on cardiovascular function as well as mediating the response to ischemic injury. Here, we summarize our present understanding of protein O-GlcNAcylation and its effect on the regulation of cardiovascular function. We examine the pathways regulating protein O-GlcNAcylation and discuss, in more detail, our understanding of the role of O-GlcNAc both in mediating the adverse effects of diabetes and in mediating cellular protective mechanisms in the cardiovascular system. In addition, we also explore the parallels between O-GlcNAc signaling and redox signaling, as an alternative paradigm for understanding the role of O-GlcNAcylation in regulating cell function. PMID:19028792

  15. Clinical professional governance for detailed clinical models.

    PubMed

    Goossen, William; Goossen-Baremans, Anneke

    2013-01-01

    This chapter describes the need for Detailed Clinical Models for contemporary Electronic Health Systems, data exchange and data reuse. It starts with an explanation of the components related to Detailed Clinical Models with a brief summary of knowledge representation, including terminologies representing clinically relevant "things" in the real world, and information models that abstract these in order to let computers process data about these things. Next, Detailed Clinical Models are defined and their purpose is described. The work builds on existing developments around the world and culminates in current work to create a technical specification at the level of the International Standards Organization. The core components of properly expressed Detailed Clinical Models are illustrated, including clinical knowledge and context, data element specification, code bindings to terminologies, and meta-information about authors and versioning, among others. Detailed Clinical Models to date are heavily based on user requirements and specify the conceptual and logical levels of modelling. This level of specification is not precise enough for specific implementations, which require an additional step. However, this allows Detailed Clinical Models to serve as specifications for many different kinds of implementations. Examples of Detailed Clinical Models are presented both in text and in Unified Modelling Language. Detailed Clinical Models can be positioned in health information architectures, where they serve at the most detailed granular level. The chapter ends with examples of projects that create and deploy Detailed Clinical Models. All have in common that they can often reuse materials from earlier projects, and that strict governance of these models is essential to use them safely in health care information and communication technology. Clinical validation is one point of such governance, and model testing another. The Plan Do Check Act cycle can be applied for governance of Detailed Clinical Models.
Finally, collections of Detailed Clinical Models require a repository in which they can be stored, searched, and maintained. Governance of Detailed Clinical Models is required at local, national, and international levels.
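
The core components listed above (clinical knowledge and context, data element specification, terminology bindings, and meta-information such as author and version) can be sketched informally in code. The Python dataclasses below are a hypothetical illustration only: the class and field names are invented for this sketch and are not taken from the ISO technical specification or any published Detailed Clinical Model.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class CodeBinding:
    """Binding of a data element to a code in an external terminology."""
    terminology: str  # e.g. "SNOMED CT" or "LOINC" (illustrative)
    code: str
    display: str

@dataclass
class DataElement:
    """One data element specification at the conceptual/logical level."""
    name: str
    datatype: str              # e.g. "Quantity" (illustrative type name)
    unit: str = ""
    bindings: List[CodeBinding] = field(default_factory=list)

@dataclass
class DetailedClinicalModel:
    """A minimal, hypothetical DCM: clinical concept, governance
    meta-information, and its constituent data elements."""
    concept: str
    author: str
    version: str
    elements: List[DataElement] = field(default_factory=list)

# Illustrative instance; the code value is an example, not authoritative.
weight_model = DetailedClinicalModel(
    concept="Body weight measurement",
    author="example author",
    version="1.0",
    elements=[DataElement(
        name="body_weight",
        datatype="Quantity",
        unit="kg",
        bindings=[CodeBinding("LOINC", "29463-7", "Body weight")],
    )],
)
```

A repository of such models, as the chapter suggests, would then store, version, and index these objects so that clinical validation and model testing can be tracked per version.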

  16. Generalized File Management System or Proto-DBMS?

    ERIC Educational Resources Information Center

    Braniff, Tom

    1979-01-01

    The use of a data base management system (DBMS) as opposed to traditional data processing is discussed. The generalized file concept is viewed as an entry level step to the DBMS. The transition process from one system to the other is detailed. (SF)

  17. Sustainability Research: Biofuels, Processes and Supply Chains

    EPA Science Inventory

    Presentation will talk about sustainability at the EPA, summarily covering high level efforts and focusing in more detail on research in metrics for liquid biofuels and tools to evaluate sustainable processes. The presentation will also briefly touch on a new area of research, t...

  18. Sperm as microswimmers - navigation and sensing at the physical limit

    NASA Astrophysics Data System (ADS)

    Kaupp, Ulrich B.; Alvarez, Luis

    2016-11-01

    Many cells and microorganisms have evolved a motility apparatus to explore their surroundings. For guidance, these biological microswimmers rely on physical and chemical cues that are transduced by cellular pathways into directed movement - a process called taxis. Few biological microswimmers have been studied in as much detail as sperm from sea urchins. Sperm and eggs are released into the seawater. To enhance the chances of fertilization, eggs release chemical factors - called chemoattractants - that establish a chemical gradient and, thereby, guide sperm to the egg. Sea urchin sperm constitute a unique model system for understanding cell navigation at every level: from molecules to cell behaviours. We will outline the chemotactic signalling pathway of sperm from the sea urchin Arbacia punctulata and discuss how signalling controls navigation in a chemical gradient. Finally, we discuss recent insights into sperm chemotaxis in three dimensions (3D).

  19. Study on the biological effect of cosmic radiation and the development of radiation protection technology (L-11)

    NASA Technical Reports Server (NTRS)

    Nagaoka, Shunji

    1993-01-01

    NASDA is now participating in a series of flight experiments on Spacelab missions. The first experiment was carried out on the first International Microgravity Laboratory Mission (IML-1) in January 1992, and the second experiment will be conducted on the Spacelab-J Mission, First Materials Processing Test (FMPT). The equipment, Radiation Monitoring Container Devices (RMCD), includes passive dosimeter systems and biological specimens. The experiments using this hardware are designed by NASDA to measure and investigate the radiation levels inside spacecraft such as the Space Shuttle and to look at the basic effects of the space environment from the aspect of radiation biology. The data gathered will be analyzed to understand the details of biological effects as well as the physical nature of space radiation registered in the sensitive Solid-State Track Detectors (SSTD).

  20. Humidity-dependent compression-induced glass transition of the air–water interfacial Langmuir films of poly(D,L-lactic acid-ran-glycolic acid) (PLGA)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Hyun Chang; Lee, Hoyoung; Jung, Hyunjung

    2015-08-26

    Constant rate compression isotherms of the air–water interfacial Langmuir films of poly(D,L-lactic acid-ran-glycolic acid) (PLGA) show a distinct feature of an exponential increase in surface pressure in the high surface polymer concentration regime. We have previously demonstrated that this abrupt increase in surface pressure is linked to the glass transition of the polymer film, but the detailed mechanism of this process is not understood. In order to obtain a molecular-level understanding of this behavior, we performed extensive characterizations of the surface mechanical, structural and rheological properties of Langmuir PLGA films at the air–water interface, using combined experimental techniques including the Langmuir film balance, X-ray reflectivity and double-wall-ring interfacial rheometry methods.

  1. Monoterpenes are the largest source of summertime organic aerosol in the southeastern United States

    DOE PAGES

    Zhang, Haofei; Yee, Lindsay D.; Lee, Ben H.; ...

    2018-02-12

    The chemical complexity of atmospheric organic aerosol (OA) has caused substantial uncertainties in understanding its origins and environmental impacts. Here, we provide constraints on OA origins through compositional characterization with molecular-level details. Our results suggest that secondary OA (SOA) from monoterpene oxidation accounts for approximately half of summertime fine OA in Centreville, AL, a forested area in the southeastern United States influenced by anthropogenic pollution. We find that different chemical processes involving nitrogen oxides, during days and nights, play a central role in determining the mass of monoterpene SOA produced. These findings elucidate the strong anthropogenic–biogenic interaction affecting ambient aerosol in the southeastern United States and point out the importance of reducing anthropogenic emissions, especially under a changing climate, where biogenic emissions will likely keep increasing.

  2. Never at rest: insights into the conformational dynamics of ion channels from cryo-electron microscopy.

    PubMed

    Lau, Carus; Hunter, Mark J; Stewart, Alastair; Perozo, Eduardo; Vandenberg, Jamie I

    2018-04-01

    The tightly regulated opening and closure of ion channels underlies the electrical signals that are vital for a wide range of physiological processes. Two decades ago the first atomic level view of ion channel structures led to a detailed understanding of ion selectivity and conduction. In recent years, spectacular developments in the field of cryo-electron microscopy have resulted in cryo-EM superseding crystallography as the technique of choice for determining near-atomic resolution structures of ion channels. Here, we will review the recent developments in cryo-EM and its specific application to the study of ion channel gating. We will highlight the advantages and disadvantages of the current technology and where the field is likely to head in the next few years. © 2018 The Authors. The Journal of Physiology © 2018 The Physiological Society.

  3. Radiation Damage From Mono-energetic Electrons Up to 200 keV On Biological Systems

    NASA Astrophysics Data System (ADS)

    Prilepskiy, Yuriy

    2006-03-01

    The electron gun of the CEBAF machine at Jefferson Lab (Newport News, VA) is capable of delivering electrons with energies up to 200 keV with a resolution of about 10^-5. This 1.5 GHz beam makes it possible to generate cellular radiation damage within minutes. We have performed irradiation of cancer cells with different energies and different currents to investigate their biological responses. This study will make it possible to address the physical processes involved in RBE and LET at a level that supersedes current data listed in the literature by orders of magnitude. We will discuss the experimental setup and results of the first stage of data collected with this novel system. This research is part of a global program to provide detailed information for the understanding of radiation-based cancer treatments.

  4. Soot formation and radiation in turbulent jet diffusion flames under normal and reduced gravity conditions

    NASA Technical Reports Server (NTRS)

    Ku, Jerry C.; Tong, LI; Sun, Jun; Greenberg, Paul S.; Griffin, Devon W.

    1993-01-01

    Most practical combustion processes, as well as fires and explosions, exhibit some characteristics of turbulent diffusion flames. For hydrocarbon fuels, the presence of soot particles significantly increases the level of radiative heat transfer from flames. In some cases, flame radiation can reach up to 75 percent of the heat release by combustion. Laminar diffusion flame results show that radiation becomes stronger under reduced gravity conditions. Therefore, detailed soot formation and radiation must be included in the flame structure analysis. A study of sooting turbulent diffusion flames under reduced-gravity conditions will not only provide necessary information for such practical issues as spacecraft fire safety, but also develop better understanding of fundamentals for diffusion combustion. In this paper, a summary of the work to date and of future plans is reported.

  5. Understanding the chemistry of the development of latent fingerprints by superglue fuming.

    PubMed

    Wargacki, Stephen P; Lewis, Linda A; Dadmun, Mark D

    2007-09-01

    Cyanoacrylate fuming is a widely used forensic tool for the development of latent fingerprints; however, the mechanistic details of the reaction between the fingerprint residue and the cyanoacrylate vapor are not well understood. Here the polymerization of ethyl-cyanoacrylate vapor by sodium lactate or alanine solutions, two of the major components in fingerprint residue, has been examined by monitoring the time dependence of the mass uptake and resultant polymer molecular weight characteristics. These data provide insight into the molecular-level actions in the efficient development of latent fingerprints by superglue fuming. The results show that the carboxylate moiety is the primary initiator of the polymerization process and that a basic environment inhibits chain termination while an acidic environment promotes it. The results also indicate that water cannot be the primary initiator in this forensic technique.

  6. Cataclysmic Variable Stars

    NASA Astrophysics Data System (ADS)

    Hellier, Coel

    2001-01-01

    Cataclysmic variable stars are the most variable stars in the night sky, fluctuating in brightness continually on timescales from seconds to hours to weeks to years. The changes can be recorded using amateur telescopes, yet are also the subject of intensive study by professional astronomers. That study has led to an understanding of cataclysmic variables as binary stars, orbiting so closely that material transfers from one star to the other. The resulting process of accretion is one of the most important in astrophysics. This book presents the first account of cataclysmic variables at an introductory level. Assuming no previous knowledge of the field, it explains the basic principles underlying the variability, while providing an extensive compilation of cataclysmic variable light curves. Aimed at amateur astronomers, undergraduates, and researchers, the main text is accessible to those with no mathematical background, while supplementary boxes present technical details and equations.

  7. Monoterpenes are the largest source of summertime organic aerosol in the southeastern United States

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Haofei; Yee, Lindsay D.; Lee, Ben H.

    The chemical complexity of atmospheric organic aerosol (OA) has caused substantial uncertainties in understanding its origins and environmental impacts. Here, we provide constraints on OA origins through compositional characterization with molecular-level details. Our results suggest that secondary OA (SOA) from monoterpene oxidation accounts for approximately half of summertime fine OA in Centreville, AL, a forested area in the southeastern United States influenced by anthropogenic pollution. We find that different chemical processes involving nitrogen oxides, during days and nights, play a central role in determining the mass of monoterpene SOA produced. These findings elucidate the strong anthropogenic–biogenic interaction affecting ambient aerosol in the southeastern United States and point out the importance of reducing anthropogenic emissions, especially under a changing climate, where biogenic emissions will likely keep increasing.

  8. How does undergraduate college biology students' level of understanding, in regard to the role of the seed plant root system, relate to their level of understanding of photosynthesis?

    NASA Astrophysics Data System (ADS)

    Njeng'ere, James Gicheha

    This research study investigated how undergraduate college biology students' level of understanding of the role of the seed plant root system relates to their level of understanding of photosynthesis. The research was conducted with 65 undergraduate non-majors biology students who had completed 1 year of biology at Louisiana State University in Baton Rouge and Southeastern Louisiana University in Hammond. A root probe instrument was developed from scientifically acceptable propositional statements about the root system, the process of photosynthesis, and the holistic nature of the tree. These were derived from research reviews of the science education and arboriculture literature. The instrument was administered to 65 students selected randomly from class lists of the two institutions. Most of the root probe's items were based on the Live Oak tree. An in-depth, clinical interview-based analysis was conducted with 12 of those tested students. A team of root experts participated by designing, validating and answering the same questions that the students were asked. A "systems" lens as defined by a team of college instructors, root experts (Shigo, 1991), and this researcher was used to interpret the results. A correlation coefficient relating students' level of understanding of the root system and their level of understanding of the process of photosynthesis was established by means of Pearson's r (r = 0.328) using SAS statistical analysis software (SAS, 1987). From this a coefficient of determination (r² = 0.104) was determined. Students' level of understanding of the Live Oak root system (mean score 5.94) was not statistically different from their level of understanding of the process of photosynthesis (mean score 5.54) as assessed by the root probe, t(129) = 0.137, p > 0.05, one-tailed test. This suggests that, to some degree, level of understanding of the root system limits level of understanding of photosynthesis and vice versa.
Analysis of quantitative and qualitative data revealed that students who applied principles of systems thinking performed better than those who did not. Students' understanding of the root system of the Live Oak tree was hindered by their limited understanding of plant food, the nonwoody roots, and the tree as a system.
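
The statistics used in the study above (Pearson's r and the coefficient of determination r² = r²) can be reproduced in form with a minimal sketch. The score lists below are invented for illustration; they are not the study's data, which are not reproduced in the abstract.

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical root-probe scores for a handful of students:
# understanding of the root system vs. understanding of photosynthesis.
root_scores = [4, 6, 5, 7, 8, 5]
photo_scores = [5, 5, 6, 6, 7, 4]

r = pearson_r(root_scores, photo_scores)
r_squared = r ** 2  # coefficient of determination: shared variance
```

In the study, r = 0.328 gives r² ≈ 0.104, meaning only about 10% of the variance in one understanding score is accounted for by the other.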

  9. Characterising the development of the understanding of human body systems in high-school biology students - a longitudinal study

    NASA Astrophysics Data System (ADS)

    Snapir, Zohar; Eberbach, Catherine; Ben-Zvi-Assaraf, Orit; Hmelo-Silver, Cindy; Tripto, Jaklin

    2017-10-01

    Science education today has become increasingly focused on research into complex natural, social and technological systems. In this study, we examined the development of high-school biology students' systems understanding of the human body, in a three-year longitudinal study. The development of the students' system understanding was evaluated using the Components Mechanisms Phenomena (CMP) framework for conceptual representation. We coded and analysed the repertory grid personal constructs of 67 high-school biology students at 4 points throughout the study. Our data analysis builds on the assumption that systems understanding entails a perception of all the system categories, including structures within the system (its Components), specific processes and interactions at the macro and micro levels (Mechanisms), and the Phenomena that present the macro scale of processes and patterns within a system. Our findings suggest that as the learning process progressed, the systems understanding of our students became more advanced, moving forward within each of the major CMP categories. Moreover, there was an increase in the mechanism complexity presented by the students, manifested by more students describing mechanisms at the molecular level. Thus, the 'mechanism' category and the micro level are critical components that enable students to understand system-level phenomena such as homeostasis.

  10. Assessment of microscale spatio-temporal variation of air pollution at an urban hotspot in Madrid (Spain) through an extensive field campaign

    NASA Astrophysics Data System (ADS)

    Borge, Rafael; Narros, Adolfo; Artíñano, Begoña; Yagüe, Carlos; Gómez-Moreno, Francisco Javier; de la Paz, David; Román-Cascón, Carlos; Díaz, Elías; Maqueda, Gregorio; Sastre, Mariano; Quaassdorff, Christina; Dimitroulopoulou, Chrysanthi; Vardoulakis, Sotiris

    2016-09-01

    Poor urban air quality is one of the main environmental concerns worldwide due to its implications for population exposure and health-related issues. However, the development of effective abatement strategies in cities requires a consistent and holistic assessment of air pollution processes, taking into account all the relevant scales within a city. This contribution presents the methodology and main results of an intensive experimental campaign carried out in a complex pollution hotspot in Madrid (Spain) under the TECNAIRE-CM research project, which aimed at understanding the microscale spatio-temporal variation of ambient concentration levels in areas where high pollution values are recorded. A variety of instruments were deployed during a three-week field campaign to provide detailed information on meteorological and micrometeorological parameters and spatio-temporal variations of the most relevant pollutants (NO2 and PM), along with relevant information needed to simulate pedestrian fluxes. The results show the strong dependence of ambient concentrations on local emissions and meteorology, which results in strong spatial and temporal variations, with gradients up to 2 μg m⁻³ m⁻¹ for NO2 and 55 μg m⁻³ min⁻¹ for PM10. Pedestrian exposure to these pollutants also varies strongly in time and space, and is concentrated at pedestrian crossings and bus stops. The analysis of the results shows that the high concentration levels found in urban hotspots depend on extremely complex dynamic processes that cannot be captured by routine measurements made by air quality monitoring stations used for regulatory compliance assessment. The large influence of local traffic on the concentration fields highlights the need for a detailed description of specific variables that determine emissions and dispersion at the microscale level.
This also indicates that city-scale interventions may be complemented with local control measures and exposure management, to improve air quality and reduce air pollution health effects more effectively.

  11. Urban CO2 emissions metabolism: The Hestia Project

    NASA Astrophysics Data System (ADS)

    Gurney, K. R.; Razlivanov, I.; Zhou, Y.; Song, Y.

    2011-12-01

    A central expression of urban metabolism is the consumption of energy and the resulting environmental impact, particularly the emission of CO2 and other greenhouse gases. Quantification of energy and emissions has been performed for numerous cities, but rarely has this been done in explicit space/time detail. Here, we present the Hestia Project, an effort aimed at building a high-resolution (e.g., building- and road-link-specific, hourly) fossil fuel CO2 emissions data product for the urban domain. A complete data product has been built for the city of Indianapolis and work is ongoing for the city of Los Angeles (Figure 1). The effort in Indianapolis is now part of a larger effort aimed at a convergent top-down/bottom-up assessment of greenhouse gas emissions, called INFLUX. Our urban-level quantification relies on a mixture of data and modeling structures. We start with the sector-specific Vulcan Project estimate at the mix of geocoded and county-wide levels. The Hestia aim is to distribute the Vulcan result in space and time. Two components take the majority of effort: buildings and onroad emissions. For the buildings, we utilize a building energy model which we constrain through lidar data, county assessor parcel data and GIS layers. For onroad emissions, we use a combination of traffic data and GIS road layers maintaining vehicle class information. Finally, all pointwise data in the Vulcan Project are transferred to our urban landscape and additional time distribution is performed. A key benefit of the approach taken in this study is the tracking and archiving of fuel- and process-level detail (e.g., combustion process, other pollutants), allowing for a more thorough understanding and analysis of energy throughputs in the urban environment. Next steps in this research from the metabolism perspective are to consider the carbon footprint of material goods and their lateral transfer, in addition to the connection between electricity consumption and production.
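
The core downscaling step described above, distributing a county-wide Vulcan total across individual buildings, can be sketched as a simple proportional allocation by modeled energy use. This is a hypothetical simplification: the function name and the numbers are invented for illustration and this is not the Hestia Project's actual implementation.

```python
def downscale(county_total_tC, building_energy):
    """Allocate a county-level CO2 total (tonnes C) to buildings in
    proportion to each building's modeled annual energy use."""
    total_energy = sum(building_energy.values())
    return {bid: county_total_tC * energy / total_energy
            for bid, energy in building_energy.items()}

# Invented example: three buildings sharing a 1000 tC county total.
emissions = downscale(1000.0, {"bldg_A": 50.0, "bldg_B": 30.0, "bldg_C": 20.0})
```

The same proportional-share idea extends to the time dimension, e.g. distributing annual totals to hours using load profiles, which is consistent with the hourly resolution the project targets.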

  12. Word Processing in Elementary Schools: Seven Case Studies. Education and Technology Series.

    ERIC Educational Resources Information Center

    Murray, Jack; And Others

    As a result of preliminary observations of word processing in elementary-level language arts instruction, the seven case studies presented in this report reveal the effectiveness of current word processing (WP) activities within their respective instructional contexts. Each study is presented separately, detailing the classroom context, tasks and outcomes, program…

  13. Towards Accelerated Aging Methodologies and Health Management of Power MOSFETs (Technical Brief)

    NASA Technical Reports Server (NTRS)

    Celaya, Jose R.; Patil, Nishad; Saha, Sankalita; Wysocki, Phil; Goebel, Kai

    2009-01-01

    Understanding aging mechanisms of electronic components is of extreme importance in the aerospace domain, where they are part of numerous critical subsystems including avionics. In particular, power MOSFETs are of special interest as they are involved in high voltage switching circuits such as drivers for electrical motors. With increased use of electronics in aircraft control, it becomes more important to understand the degradation of these components in aircraft-specific environments. In this paper, we present an accelerated aging methodology for power MOSFETs that subjects the devices to indirect thermal overstress during high voltage switching. During this accelerated aging process, two major modes of failure were observed - latch-up and die attach degradation. We present the details of our aging methodology along with the experiments performed and an analysis of the results.

  14. Elements of impact assessment: a case study with cyber attacks

    NASA Astrophysics Data System (ADS)

    Yang, Shanchieh Jay; Holsopple, Jared; Liu, Daniel

    2009-05-01

    Extensive discussions have taken place in recent years regarding impact assessment - what is it and how can we do it? It is especially intriguing in this modern era, where non-traditional warfare has caused either information overload or limited understanding of adversary doctrines. This work provides a methodical discussion of key elements for the broad definition of impact assessment (IA). The discussion will start with a process flow involving components related to IA. Two key functional components, impact estimation and threat projection, are compared and illustrated in detail. These details include a discussion of when to model red and blue knowledge. Algorithmic approaches will be discussed, augmented with lessons learned from our IA development for cyber situation awareness. This paper aims at providing the community with a systematic understanding of IA and its open issues with specific examples.

  15. Ion-neutral Clustering of Bile Acids in Electrospray Ionization Across UPLC Flow Regimes

    NASA Astrophysics Data System (ADS)

    Brophy, Patrick; Broeckling, Corey D.; Murphy, James; Prenni, Jessica E.

    2018-02-01

    Bile acid authentic standards were used as model compounds to quantitatively evaluate complex in-source phenomena on a UPLC-ESI-TOF-MS operated in the negative mode. Three different diameter columns and a ceramic-based microfluidic separation device were utilized, allowing for detailed descriptions of bile acid behavior across a wide range of flow regimes and instantaneous concentrations. A custom processing algorithm based on correlation analysis was developed to group together all ion signals arising from a single compound; these grouped signals produce verified compound spectra for each bile acid at each on-column mass loading. Significant adduction was observed for all bile acids investigated under all flow regimes and across a wide range of bile acid concentrations. The distribution of bile acid containing clusters was found to depend on the specific bile acid species, solvent flow rate, and bile acid concentration. Relative abundances of each cluster changed non-linearly with concentration. It was found that summing all MS level (low collisional energy) ions and ion-neutral adducts arising from a single compound improves linearity across the concentration range (0.125-5 ng on column) and increases the sensitivity of MS level quantification. The behavior of each cluster roughly follows simple equilibrium processes consistent with our understanding of electrospray ionization mechanisms and ion transport processes occurring in atmospheric pressure interfaces.
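
The correlation-based grouping the authors describe can be sketched as a greedy clustering of ion traces whose chromatographic profiles correlate above a threshold. This is an assumption-laden simplification: the function names, threshold value, and data below are invented for illustration, not the authors' actual algorithm.

```python
import math

def correlation(a, b):
    """Pearson correlation between two equal-length intensity traces."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb) if sa and sb else 0.0

def group_ions(traces, threshold=0.95):
    """Greedily group m/z traces whose elution profiles correlate above
    `threshold`; each group approximates one compound's spectrum,
    collecting its ion-neutral adducts with the base ion."""
    groups = []
    for mz, trace in traces.items():
        for group in groups:
            if correlation(trace, traces[group[0]]) >= threshold:
                group.append(mz)
                break
        else:
            groups.append([mz])
    return groups

# Invented traces: a base ion and a co-eluting adduct (a scaled copy of the
# same elution profile) group together; a differently eluting ion does not.
traces = {
    100.0: [1, 5, 10, 5, 1],
    182.1: [2, 10, 20, 10, 2],
    250.2: [10, 5, 1, 5, 10],
}
groups = group_ions(traces)
```

Summing the intensities within each group, as the authors report, then gives a single per-compound response that is more linear with concentration than any individual adduct.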

  16. Deglaciation, lake levels, and meltwater discharge in the Lake Michigan basin

    USGS Publications Warehouse

    Colman, Steven M.; Clark, J.A.; Clayton, L.; Hansel, A.K.; Larsen, C.E.

    1994-01-01

    The deglacial history of the Lake Michigan basin, including discharge and routing of meltwater, is complex because of the interaction among (1) glacial retreats and re-advances in the basin, (2) the timing of occupation and the isostatic adjustment of lake outlets, and (3) the depositional and erosional processes that left evidence of past lake levels. In the southern part of the basin, a restricted area little affected by differential isostasy, new studies of onshore and offshore areas allow refinement of a lake-level history that has evolved over 100 years. Important new data include the recognition of two periods of influx of meltwater from Lake Agassiz into the basin and details of the highstands gleaned from sedimentological evidence. Major disagreements still persist concerning the exact timing and lake-level changes associated with the Algonquin phase, approximately 11,000 BP. A wide variety of independent data suggests that the Lake Michigan Lobe was thin, unstable, and subject to rapid advances and retreats. Consequently, lake-level changes were commonly abrupt and stable shorelines were short-lived. The long-held beliefs that the southern part of the basin was stable and separated from deformed northern areas by a hinge-line discontinuity are becoming difficult to maintain. Numerical modeling of the ice-earth system and empirical modeling of shoreline deformation are both consistent with observed shoreline tilting in the north and with the amount and pattern of modern deformation shown by lake-level gauges. New studies of subaerial lacustrine features suggest the presence of deformed shorelines higher than those originally ascribed to the supposed horizontal Glenwood level. Finally, the Lake Michigan region as a whole appears to behave in a similar manner to other areas, both local (other Great Lakes) and regional (U.S. east coast), that have experienced major isostatic changes. 
Detailed sedimentological and dating studies of field sites and additional development of geophysical models offer hope for reconciling the field data with our understanding of earth rheology.

  17. Regulation and dysregulation of immunoglobulin E: a molecular and clinical perspective

    PubMed Central

    2010-01-01

    Background: Altered levels of Immunoglobulin E (IgE) represent a dysregulation of IgE synthesis and may be seen in a variety of immunological disorders. The object of this review is to summarize the historical and molecular aspects of IgE synthesis and the disorders associated with dysregulation of IgE production. Methods: Articles published in Medline/PubMed were searched with the keyword Immunoglobulin E and specific terms such as class switch recombination, deficiency and/or specific disease conditions (atopy, neoplasia, renal disease, myeloma, etc.). The selected papers included reviews, case reports, retrospective reviews and molecular mechanisms. Studies involving both sexes and all ages were included in the analysis. Results: Both very low and elevated levels of IgE may be seen in clinical practice. Major advancements have been made in our understanding of the molecular basis of IgE class switching including roles for T cells, cytokines and T regulatory (or Treg) cells in this process. Dysregulation of this process may result in either elevated IgE levels or IgE deficiency. Conclusion: Evaluation of a patient with elevated IgE must involve a detailed differential diagnosis and consideration of various immunological and non-immunological disorders. The use of appropriate tests will allow the correct diagnosis to be made. This can often assist in the development of tailored treatments. PMID:20178634

  18. Evaluating the implementation process of a participatory organizational level occupational health intervention in schools.

    PubMed

    Schelvis, Roosmarijn M C; Wiezer, Noortje M; Blatter, Birgitte M; van Genabeek, Joost A G M; Oude Hengel, Karen M; Bohlmeijer, Ernst T; van der Beek, Allard J

    2016-12-01

    The importance of process evaluations in examining how and why interventions are (un)successful is increasingly recognized. Process evaluations have mainly studied the implementation process and the quality of the implementation (fidelity). However, in adopting this approach for participatory organizational level occupational health interventions, important aspects such as context and participants' perceptions are missing. Our objective was to systematically describe the implementation process of a participatory organizational level occupational health intervention aimed at reducing work stress and increasing vitality in two schools by applying a framework that covers aspects of the intervention and its implementation as well as the context and participants' perceptions. A program theory was developed, describing the requirements for successful implementation. Each requirement was operationalized by making use of the framework, covering: initiation, communication, participation, fidelity, reach, satisfaction, management support, targeting, delivery, exposure, culture, conditions, readiness for change and perceptions. The requirements were assessed by quantitative and qualitative data, collected at 12 and 24 months after baseline in both schools (questionnaire and interviews) or continuously (logbooks). The intervention consisted of a needs assessment phase and a phase of implementing intervention activities. The needs assessment phase was implemented successfully in school A, but not in school B where participation and readiness for change were insufficient. In the second phase, several intervention activities were implemented at school A, whereas this was only partly the case in school B (delivery). In both schools, however, participants did not feel involved in the choice of intervention activities (targeting, participation, support), resulting in a negative perception of and only partial exposure to the intervention activities. 
Conditions, culture and events hindered the implementation of intervention activities in both schools. The framework helped us to understand why the implementation process was not successful. It is therefore considered of added value for the evaluation of implementation processes in participatory organizational level interventions, foremost because of the context and mental models dimensions. However, less demanding methods for doing detailed process evaluations need to be developed. This can only be done if we know more about the most important process components, and this study contributes to that knowledge base. Netherlands Trial Register NTR3284.

  19. Aircraft Structural Mass Property Prediction Using Conceptual-Level Structural Analysis

    NASA Technical Reports Server (NTRS)

    Sexstone, Matthew G.

    1998-01-01

    This paper describes a methodology that extends the use of the Equivalent LAminated Plate Solution (ELAPS) structural analysis code from conceptual-level aircraft structural analysis to conceptual-level aircraft mass property analysis. Mass property analysis in aircraft structures has historically depended upon parametric weight equations at the conceptual design level and Finite Element Analysis (FEA) at the detailed design level. ELAPS allows for the modeling of detailed geometry, metallic and composite materials, and non-structural mass coupled with analytical structural sizing to produce high-fidelity mass property analyses representing fully configured vehicles early in the design process. This capability is especially valuable for unusual configuration and advanced concept development where existing parametric weight equations are inapplicable and FEA is too time consuming for conceptual design. This paper contrasts the use of ELAPS relative to empirical weight equations and FEA. ELAPS modeling techniques are described and the ELAPS-based mass property analysis process is detailed. Examples of mass property stochastic calculations produced during a recent systems study are provided. This study involved the analysis of three remotely piloted aircraft required to carry scientific payloads to very high altitudes at subsonic speeds. Due to the extreme nature of this high-altitude flight regime, few existing vehicle designs are available for use in performance and weight prediction. ELAPS was employed within a concurrent engineering analysis process that simultaneously produces aerodynamic, structural, and static aeroelastic results for input to aircraft performance analyses. The ELAPS models produced for each concept were also used to provide stochastic analyses of wing structural mass properties. The results of this effort indicate that ELAPS is an efficient means to conduct multidisciplinary trade studies at the conceptual design level.

  1. Adaptation of global land use and management intensity to changes in climate and atmospheric carbon dioxide.

    PubMed

    Alexander, Peter; Rabin, Sam; Anthoni, Peter; Henry, Roslyn; Pugh, Thomas A M; Rounsevell, Mark D A; Arneth, Almut

    2018-02-27

    Land use contributes to environmental change, but is also influenced by such changes. Changes in climate and atmospheric carbon dioxide (CO2) levels alter agricultural crop productivity, plant water requirements and irrigation water availability. The global food system needs to respond and adapt to these changes, for example, by altering agricultural practices, including the crop types or intensity of management, or shifting cultivated areas within and between countries. As impacts and associated adaptation responses are spatially specific, understanding the land use adaptation to environmental changes requires crop productivity representations that capture spatial variations. The impact of variation in management practices, including fertiliser and irrigation rates, also needs to be considered. To date, models of global land use have selected agricultural expansion or intensification levels using relatively aggregate spatial representations, typically at a regional level, that are not able to characterise the details of these spatially differentiated responses. Here, we show results from a novel global modelling approach using more detailed biophysically derived yield responses to inputs with greater spatial specificity than previously possible. The approach couples a dynamic global vegetation model (LPJ-GUESS) with a new land use and food system model (PLUMv2), with results benchmarked against historical land use change from 1970. Land use outcomes to 2100 were explored, suggesting that increased intensity of climate forcing reduces the inputs required for food production, due to the fertilisation and enhanced water use efficiency effects of elevated atmospheric CO2 concentrations, but requiring substantial shifts in the global and local patterns of production. 
The results suggest that adaptation in the global agriculture and food system has substantial capacity to diminish the negative impacts and gain greater benefits from positive outcomes of climate change. Consequently, agricultural expansion and intensification may be lower than found in previous studies where consideration of spatial details and processes was more constrained. © 2018 The Authors. Global Change Biology Published by John Wiley & Sons Ltd.

  2. A rule-based shell to hierarchically organize HST observations

    NASA Technical Reports Server (NTRS)

    Bose, Ashim; Gerb, Andrew

    1995-01-01

    An observing program on the Hubble Space Telescope (HST) is described in terms of exposures that are obtained by one or more of the instruments onboard the HST. These exposures are organized into a hierarchy of structures for purposes of efficient scheduling of observations. The process by which exposures get organized into the higher-level structures is called merging. This process relies on rules to determine which observations can be 'merged' into the same higher level structure, and which cannot. The TRANSformation expert system converts proposals for astronomical observations with HST into detailed observing plans. The conversion process includes the task of merging. Within TRANS, we have implemented a declarative shell to facilitate merging. This shell offers the following features: (1) an easy way of specifying rules on when to merge and when not to merge, (2) a straightforward priority mechanism for resolving conflicts among rules, (3) an explanation facility for recording the merging history, (4) a report generating mechanism to help users understand the reasons for merging, and (5) a self-documenting mechanism that documents all the merging rules that have been defined in the shell, ordered by priority. The merging shell is implemented using an object-oriented paradigm in CLOS. It has been a part of operational TRANS (after extensive testing) since July 1993. It has fulfilled all performance expectations, and has considerably simplified the process of implementing new or changed requirements for merging. The users are pleased with its report-generating and self-documenting features.
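
A declarative rule shell with priority-based conflict resolution, as described above, might look like this in outline. The rule names and exposure attributes here are hypothetical, and the operational TRANS shell is written in CLOS, not Python:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class MergeRule:
    name: str
    priority: int                        # higher priority wins conflicts
    applies: Callable[[dict, dict], bool]
    allow_merge: bool                    # True: merge, False: forbid merging

class MergeShell:
    def __init__(self):
        self.rules = []
        self.history = []                # records which rule fired (explanation)

    def add_rule(self, rule):
        self.rules.append(rule)
        self.rules.sort(key=lambda r: -r.priority)

    def can_merge(self, exp_a, exp_b):
        for rule in self.rules:          # first applicable rule = highest priority
            if rule.applies(exp_a, exp_b):
                self.history.append((exp_a["id"], exp_b["id"], rule.name))
                return rule.allow_merge
        return False                     # conservative default: do not merge

shell = MergeShell()
shell.add_rule(MergeRule("same-instrument", 1,
                         lambda a, b: a["instrument"] == b["instrument"], True))
shell.add_rule(MergeRule("different-target", 5,
                         lambda a, b: a["target"] != b["target"], False))

e1 = {"id": 1, "instrument": "WFPC2", "target": "M31"}
e2 = {"id": 2, "instrument": "WFPC2", "target": "M31"}
e3 = {"id": 3, "instrument": "WFPC2", "target": "M87"}
print(shell.can_merge(e1, e2))  # True  (same-instrument fires)
print(shell.can_merge(e1, e3))  # False (different-target outranks same-instrument)
```

The recorded history doubles as the report-generating and self-documenting facility: iterating over `shell.rules` in order lists all rules by priority.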

  3. Monitoring landscape level processes using remote sensing of large plots

    Treesearch

    Raymond L. Czaplewski

    1991-01-01

    Global and regional assessments require timely information on landscape level status (e.g., areal extent of different ecosystems) and processes (e.g., changes in land use and land cover). To measure and understand these processes at the regional level, and model their impacts, remote sensing is often necessary. However, processing massive volumes of remotely sensed...

  4. Information giving in clinical trials: the views of medical researchers.

    PubMed

    Ferguson, P R

    2003-02-01

    It is both an ethical and legal requirement that patients who participate in clinical trials must generally give their consent. As part of this process, patients must be provided with adequate information to enable them to decide whether or not to take part. In the UK, the pharmaceutical companies that sponsor such research, as well as Local Research Ethics Committees, specify in detail the information that must be given to trial participants. The researchers who conduct clinical trials inevitably form views on the amount of information they are required to provide, and about patients' comprehension of that information. The literature in this area suggests that some medical researchers may be unhappy with the amount of information that they must give patient participants. There have been, however, few systematic attempts to determine their views. This paper reports a study that explored researchers' views as to (i) the amount of information provided to trial participants, and (ii) participants' understanding of that information. Researchers generally felt that they were required to give trial participants an appropriate amount of information, and that most patients had at least a reasonable understanding of key aspects of the clinical trials' process. However, there were differing views as to the level of information that they felt patients themselves wanted. The researchers did not generally feel that the patients' inability to comprehend information rendered the process of obtaining 'informed consent' a waste of time. However, some did believe that they were required to burden patients with excessive information.

  5. Analysis of the Molecular Mechanisms of Reepithelialization in Drosophila Embryos

    PubMed Central

    Matsubayashi, Yutaka; Millard, Tom H.

    2016-01-01

    Significance: The epidermis provides the main barrier function of skin, and therefore its repair following wounding is an essential component of wound healing. Repair of the epidermis, also known as reepithelialization, occurs by collective migration of epithelial cells from around the wound edge across the wound until the advancing edges meet and fuse. Therapeutic manipulation of this process could potentially be used to accelerate wound healing. Recent Advances: It is difficult to analyze the cellular and molecular mechanisms of reepithelialization in human tissue, so a variety of model organisms have been used to improve our understanding of the process. One model system that has been especially useful is the embryo of the fruit fly Drosophila, which provides a simple, accessible model of the epidermis and can be manipulated genetically, allowing detailed analysis of reepithelialization at the molecular level. This review will highlight the key insights that have been gained from studying reepithelialization in Drosophila embryos. Critical Issues: Slow reepithelialization increases the risk of wounds becoming infected and ulcerous; therefore, the development of therapies to accelerate or enhance the process would be a great clinical advance. Improving our understanding of the molecular mechanisms that underlie reepithelialization will help in the development of such therapies. Future Directions: Research in Drosophila embryos has identified a variety of genes and proteins involved in triggering and driving reepithelialization, many of which are conserved in humans. These novel reepithelialization proteins are potential therapeutic targets and therefore findings obtained in Drosophila may ultimately lead to significant clinical advances. PMID:27274434

  6. Understanding the interactions of human follicle stimulating hormone with single-walled carbon nanotubes by molecular dynamics simulation and free energy analysis.

    PubMed

    Mahmoodi, Yasaman; Mehrnejad, Faramarz; Khalifeh, Khosrow

    2018-01-01

    Interactions of carbon nanotubes (CNTs) and blood proteins are of interest for nanotoxicology and nanomedicine. It is believed that the interactions of blood proteins and glycoproteins with CNTs may have important biological effects. In spite of many experimental studies of single-walled carbon nanotubes (SWCNT) and glycoproteins with different methods, little is known about the atomistic details of their association process or of structural alterations occurring in adsorbed glycoproteins. In this study, we have applied molecular dynamics simulation to investigate the interaction of follicle stimulating hormone (hFSH) with SWCNT. The aim of this work is to investigate possible mechanisms of nanotoxicity at a molecular level. We present details of the molecular dynamics, structure, and free energy of binding of hFSH on the surface of SWCNT. We find that hFSH in aqueous solution strongly adsorbs onto SWCNT via their concave surface as evidenced by high binding free energies for residues in both protein subunits. It was found that hydrophobic, π-cation, and π-π stacking interactions are the main driving forces for the adsorption of the protein at the nanotube surface.

  7. The pathophysiology of pulmonary hypertension in left heart disease.

    PubMed

    Breitling, Siegfried; Ravindran, Krishnan; Goldenberg, Neil M; Kuebler, Wolfgang M

    2015-11-01

    Pulmonary hypertension (PH) is characterized by elevated pulmonary arterial pressure leading to right-sided heart failure and can arise from a wide range of etiologies. The most common cause of PH, termed Group 2 PH, is left-sided heart failure and is commonly known as pulmonary hypertension with left heart disease (PH-LHD). Importantly, while sharing many clinical features with pulmonary arterial hypertension (PAH), PH-LHD differs significantly at the cellular and physiological levels. These fundamental pathophysiological differences largely account for the poor response to PAH therapies experienced by PH-LHD patients. The relatively high prevalence of this disease, coupled with its unique features compared with PAH, signal the importance of an in-depth understanding of the mechanistic details of PH-LHD. The present review will focus on the current state of knowledge regarding the pathomechanisms of PH-LHD, highlighting work carried out both in human trials and in preclinical animal models. Adaptive processes at the alveolocapillary barrier and in the pulmonary circulation, including alterations in alveolar fluid transport, endothelial junctional integrity, and vasoactive mediator secretion will be discussed in detail, highlighting the aspects that impact the response to, and development of, novel therapeutics. Copyright © 2015 the American Physiological Society.

  8. Integrating psychology with interpersonal communication skills in undergraduate nursing education: addressing the challenges.

    PubMed

    McCarthy, Bridie; Trace, Anna; O'Donovan, Moira

    2014-05-01

    The inclusion of the social, behavioural and bio-sciences is acknowledged as essential to the development of the art and science of nursing. Nonetheless, the literature highlights on-going debate about the content and delivery of these subject areas in undergraduate nursing education. The bio-sciences and social sciences in particular have received much attention but more recently the inclusion of psychology in nursing curricula is gaining momentum. Studies conducted on nursing students' views of these supporting sciences have also highlighted problems with their understanding, relevance and application to nursing practice. Although broad guidelines are given as to what should be included, there is little guidance on the depth or level at which these subjects should be taught. Consequently, approved institutions are responsible for their own course content. This has resulted in inconsistent and varied approaches to integrating the sciences in undergraduate nursing curricula. Following a recent review of the undergraduate nursing curriculum in one university in the Republic of Ireland a decision was made to combine the teaching, learning and assessment of Applied Psychology with Interpersonal Communication skills. This paper will describe the developmental process and evaluation of the integrated module. Copyright © 2014 Elsevier Ltd. All rights reserved.

  9. Moving Up the CMMI Capability and Maturity Levels Using Simulation

    DTIC Science & Technology

    2008-01-01

    List-of-figures excerpt: Alternative Process Tools, Including NPV and ROI; Figure 3: Top-Level View of the Full Life-Cycle Version of the IEEE 12207 PSIM, Including IV&V Layer; Figure 4: Screenshot of the Incremental Version Model; Figure 5: IEEE 12207 PSIM Showing the Top-Level Life-Cycle Phases; Figure 6: IEEE 12207 ... Software Detailed Design for the IEEE 12207 Life-Cycle Process; Figure 8: Incremental Life Cycle PSIM Configured for a Specific Project Using SEPG

  10. Investigating Students' Understanding of the Dissolving Process

    ERIC Educational Resources Information Center

    Naah, Basil M.; Sanger, Michael J.

    2013-01-01

    In a previous study, the authors identified several student misconceptions regarding the process of dissolving ionic compounds in water. The present study used multiple-choice questions whose distractors were derived from these misconceptions to assess students' understanding of the dissolving process at the symbolic and particulate levels. The…

  11. A new laboratory facility to study the interactions of aerosols, cloud droplets/ice crystals, and trace gases in a turbulent environment: The Π Chamber

    NASA Astrophysics Data System (ADS)

    Cantrell, W. H., II; Chang, K.; Ciochetto, D.; Niedermeier, D.; Bench, J.; Shaw, R. A.

    2014-12-01

    A detailed understanding of gas-aerosol-cloud interaction within the turbulent atmosphere is of prime importance for an accurate understanding of Earth's climate system. As one example: While every cloud droplet began as an aerosol particle, not every aerosol particle becomes a cloud droplet. The particle to droplet transformation requires that the particle be exposed to some critical concentration of water vapor, which differs for different combinations of particle size and chemical composition. Similarly, the formation of ice particles in mixed phase clouds is also catalyzed by aerosol particles. Even in the simplest scenarios it is challenging to gain a full understanding of the aerosol activation and ice nucleation processes. At least two other factors contribute significantly to the complexity observed in the atmosphere. First, aerosols and cloud particles are not static entities, but are continuously interacting with their chemical environment, and therefore changing in their properties. Second, clouds are ubiquitously turbulent, so thermodynamic and compositional variables, such as water vapor or other trace gas concentrations, fluctuate in space and time. Indeed, the coupling between turbulence and microphysical processes is one of the major research challenges in cloud physics. We have developed a multiphase, turbulent reaction chamber (dubbed the Π Chamber, after the internal volume of 3.14 cubic meters) designed to address the problems outlined above. It is capable of pressures ranging from sea level to ~ 100 mbar, and can sustain temperatures of +40 to -55 °C. We can independently control the temperatures on the surfaces of three heat transfer zones. This allows us to establish a temperature gradient between the floor and ceiling, driving Rayleigh-Bénard convection and thereby a turbulent environment. Interior surfaces are electropolished stainless steel to facilitate cleaning before and after chemistry experiments. 
At present, supporting instrumentation includes a suite of aerosol generation and characterization techniques, a laser Doppler interferometer, and a holographic cloud particle imaging system. We will present detailed specifications, an overview of the supporting instrumentation, and initial characterization experiments from the Π chamber.
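
The convection such a chamber drives is governed by the Rayleigh number, Ra = gβΔT h³/(νκ). A small sketch with illustrative air properties near room temperature; the chamber's actual operating values are not given in the abstract:

```python
def rayleigh_number(delta_T, height, g=9.81, beta=3.4e-3,
                    nu=1.5e-5, kappa=2.1e-5):
    """Ra = g * beta * dT * h^3 / (nu * kappa), with thermal expansion
    coefficient beta (1/K), kinematic viscosity nu (m^2/s), and thermal
    diffusivity kappa (m^2/s) -- illustrative values for air."""
    return g * beta * delta_T * height ** 3 / (nu * kappa)

# A 10 K floor-to-ceiling difference over a 1 m deep volume
ra = rayleigh_number(10.0, 1.0)
print(f"Ra = {ra:.2e}")  # ~1e9, far above the ~1708 onset of convection
```

At Ra of this magnitude the convection is fully turbulent, consistent with the chamber's purpose of coupling turbulence to microphysics.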

  12. Process-based Cost Estimation for Ramjet/Scramjet Engines

    NASA Technical Reports Server (NTRS)

    Singh, Brijendra; Torres, Felix; Nesman, Miles; Reynolds, John

    2003-01-01

    Process-based cost estimation plays a key role in effecting cultural change that integrates distributed science, technology and engineering teams to rapidly create innovative and affordable products. Working together, NASA Glenn Research Center and Boeing Canoga Park have developed a methodology of process-based cost estimation bridging the methodologies of high-level parametric models and detailed bottom-up estimation. The NASA GRC/Boeing CP process-based cost model provides a probabilistic structure of layered cost drivers. High-level inputs characterize mission requirements, system performance, and relevant economic factors. Design alternatives are extracted from a standard, product-specific work breakdown structure to pre-load lower-level cost driver inputs and generate the cost-risk analysis. As product design progresses and matures, the lower-level, more detailed cost drivers can be re-accessed and the projected variation of input values narrowed, thereby generating a progressively more accurate estimate of cost-risk. Incorporated into the process-based cost model are techniques for decision analysis, specifically, the analytic hierarchy process (AHP) and functional utility analysis. Design alternatives may then be evaluated not just on cost-risk, but also user defined performance and schedule criteria. This implementation of full trade-study support contributes significantly to the realization of the integrated development environment. The process-based cost estimation model generates development and manufacturing cost estimates. The development team plans to expand the manufacturing process base from approximately 80 manufacturing processes to over 250 processes. Operation and support cost modeling is also envisioned. Process-based estimation considers the materials, resources, and processes in establishing cost-risk and, rather than depending on weight as an input, actually estimates weight along with cost and schedule.
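
The analytic hierarchy process (AHP) mentioned above derives criterion weights as the normalized principal eigenvector of a reciprocal pairwise comparison matrix. A generic sketch with illustrative comparison values, not taken from the paper:

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority weights = normalized principal eigenvector of the
    reciprocal pairwise comparison matrix (standard AHP procedure)."""
    vals, vecs = np.linalg.eig(pairwise)
    k = np.argmax(vals.real)             # principal eigenvalue
    w = np.abs(vecs[:, k].real)
    return w / w.sum()

# Cost-risk vs. performance vs. schedule: cost-risk judged 3x as
# important as performance and 5x as important as schedule
# (illustrative judgments only).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
w = ahp_weights(A)
print(w.round(3))  # largest weight goes to cost-risk
```

The principal eigenvalue also yields Saaty's consistency index, (λ_max − n)/(n − 1), which can be used to flag incoherent judgments before the weights feed a trade study.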

  13. Discrete-time moment closure models for epidemic spreading in populations of interacting individuals.

    PubMed

    Frasca, Mattia; Sharkey, Kieran J

    2016-06-21

    Understanding the dynamics of spread of infectious diseases between individuals is essential for forecasting the evolution of an epidemic outbreak or for defining intervention policies. The problem is addressed by many approaches including stochastic and deterministic models formulated at diverse scales (individuals, populations) and different levels of detail. Here we consider discrete-time SIR (susceptible-infectious-removed) dynamics propagated on contact networks. We derive a novel set of 'discrete-time moment equations' for the probability of the system states at the level of individual nodes and pairs of nodes. These equations form a set which we close by introducing appropriate approximations of the joint probabilities appearing in them. For the example case of SIR processes, we formulate two types of model, one assuming statistical independence at the level of individuals and one at the level of pairs. From the pair-based model we then derive a model at the level of the population which captures the behavior of epidemics on homogeneous random networks. With respect to their continuous-time counterparts, the models include a larger number of possible transitions from one state to another and joint probabilities with a larger number of individuals. The approach is validated through numerical simulation over different network topologies. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
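
For reference, the discrete-time network dynamics that such moment equations approximate can be simulated directly. A minimal stochastic sketch with assumed parameters and a toy ring network; this is not the authors' moment-closure code:

```python
import random

def sir_step(states, adj, beta, mu, rng):
    """One synchronous discrete-time update: each susceptible node with k
    infectious neighbours becomes infected with probability 1-(1-beta)^k;
    each infectious node recovers with probability mu."""
    new = dict(states)
    for node, s in states.items():
        if s == "S":
            k = sum(1 for nb in adj[node] if states[nb] == "I")
            if rng.random() < 1 - (1 - beta) ** k:
                new[node] = "I"
        elif s == "I":
            if rng.random() < mu:
                new[node] = "R"
    return new

# Ring network of 10 nodes, one initial infection at node 0
adj = {i: [(i - 1) % 10, (i + 1) % 10] for i in range(10)}
states = {i: "S" for i in range(10)}
states[0] = "I"
rng = random.Random(42)
for _ in range(30):
    states = sir_step(states, adj, beta=0.5, mu=0.2, rng=rng)
print(sorted(states.values()))  # final distribution of S/I/R states
```

Averaging many such runs gives the node-level probabilities that the individual-based and pair-based moment equations approximate deterministically.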

  14. The Vehicle Integrated Performance Analysis Experience: Reconnecting With Technical Integration

    NASA Technical Reports Server (NTRS)

    McGhee, D. S.

    2006-01-01

    Very early in the Space Launch Initiative program, a small team of engineers at MSFC proposed a process for performing system-level assessments of a launch vehicle. Aimed primarily at providing insight and making NASA a smart buyer, the Vehicle Integrated Performance Analysis (VIPA) team was created. The difference between the VIPA effort and previous integration attempts is that VIPA is a process that draws experienced people from various disciplines and focuses them on a technically integrated assessment. The foundations of VIPA's process are described. The VIPA team also recognized the need to target early detailed analysis toward identifying significant systems issues. This process is driven by the T-model for technical integration. VIPA's approach to performing system-level technical integration is discussed in detail. The VIPA process significantly enhances the development and monitoring of realizable project requirements. VIPA's assessment validates the concept's stated performance, identifies significant issues either with the concept or the requirements, and then reintegrates these issues to determine impacts. This process is discussed along with a description of how it may be integrated into a program's insight and review process. The VIPA process has gained favor with both engineering and project organizations for being responsive and insightful.

  15. Manipulating Color and Other Visual Information Influences Picture Naming at Different Levels of Processing: Evidence from Alzheimer Subjects and Normal Controls

    ERIC Educational Resources Information Center

    Zannino, Gian Daniele; Perri, Roberta; Salamone, Giovanna; Di Lorenzo, Concetta; Caltagirone, Carlo; Carlesimo, Giovanni A.

    2010-01-01

    There is now a large body of evidence suggesting that color and photographic detail exert an effect on recognition of visually presented familiar objects. However, an unresolved issue is whether these factors act at the visual, the semantic or lexical level of the recognition process. In the present study, we investigated this issue by having…

  16. Levels of Understanding of L2 Literary Texts under Repeated Readings: Factors Contributing to Readers' Processing of Second Language Literature and Their Learning Outcomes.

    ERIC Educational Resources Information Center

    Carroli, Piera

    This study investigated college students' levels of understanding of texts and reading processes, noting how they changed through a cycle of individual reading and writing followed by classroom comparison of students' responses, text re-reading, and re-writing. The study, which followed 17 students of continuing Italian over 6 weeks, involved…

  17. Developmental Dynamics of Emotion and Cognition Processes in Preschoolers

    PubMed Central

    Blankson, A. Nayena; O’Brien, Marion; Leerkes, Esther M.; Marcovitch, Stuart; Calkins, Susan D.; Weaver, Jennifer Miner

    2012-01-01

    Dynamic relations during the preschool years across processes of control and understanding in the domains of emotion and cognition were examined. Participants were 263 children (42% non-white) and their mothers, who were seen first when the children were 3 years old and again when they were 4. Results indicated dynamic dependence among the processes studied. Specifically, changes in cognitive processes of control and understanding were dependent upon initial levels of the other processes. Changes in emotion control and understanding were not predicted by earlier performance in the other processes. Findings are discussed with regard to the constructs of control and understanding and the developmental interrelations among emotion and cognitive processes. PMID:22925076

  18. Transfer of the epoxidation of soybean oil from batch to flow chemistry guided by cost and environmental issues.

    PubMed

    Kralisch, Dana; Streckmann, Ina; Ott, Denise; Krtschil, Ulich; Santacesaria, Elio; Di Serio, Martino; Russo, Vincenzo; De Carlo, Lucrezia; Linhart, Walter; Christian, Engelbert; Cortese, Bruno; de Croon, Mart H J M; Hessel, Volker

    2012-02-13

    The simple transfer of established chemical production processes from batch to flow chemistry does not automatically result in more sustainable ones. Detailed process understanding and the motivation to scrutinize known process conditions are necessary factors for success. Although the focus is usually "only" on intensifying transport phenomena to operate under intrinsic kinetics, there is also a large intensification potential in chemistry under harsh conditions and in the specific design of flow processes. Such an understanding and such proposed processes are required at an early stage of process design, because decisions on the best-suited tools and parameters required to convert green engineering concepts into practice (typically with little chance of substantial changes later) are made during this period. Herein, we present a holistic and interdisciplinary process design approach that combines the concept of novel process windows with process modeling, simulation, and simplified cost and lifecycle assessment for the deliberate development of a cost-competitive and environmentally sustainable alternative to an existing production process for epoxidized soybean oil. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. Research and Teaching: Analyzing Upper Level Undergraduate Knowledge of Evolutionary Processes-- Can Class Discussions Help?

    ERIC Educational Resources Information Center

    Tran, Mark V.; Weigel, Emily G.; Richmond, Gail

    2014-01-01

    For biologists, a proper understanding of evolutionary processes is fundamentally important. However, undergraduate biology students often struggle to understand evolutionary processes, replacing factual knowledge with misconceptions on the subject. Classroom discussions can be effective active learning tools used to address these misconceptions…

  20. Vesicular trafficking of immune mediators in human eosinophils revealed by immunoelectron microscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Melo, Rossana C.N., E-mail: rossana.melo@ufjf.edu.br; Department of Medicine, Beth Israel Deaconess Medical Center, Harvard Medical School, 330 Brookline Avenue, CLS 943, Boston, MA 02215; Weller, Peter F.

    Electron microscopy (EM)-based techniques are mostly responsible for our current view of cell morphology at the subcellular level and continue to play an essential role in biological research. In cells from the immune system, such as eosinophils, EM has helped to understand how cells package and release mediators involved in immune responses. Ultrastructural investigations of human eosinophils enabled visualization of secretory processes in detail and identification of a robust, vesicular trafficking essential for the secretion of immune mediators via a non-classical secretory pathway associated with secretory (specific) granules. This vesicular system is mainly organized as large tubular-vesicular carriers (Eosinophil Sombrero Vesicles, EoSVs) actively formed in response to cell activation and provides a sophisticated structural mechanism for delivery of granule-stored mediators. In this review, we highlight the application of EM techniques to recognize pools of immune mediators at vesicular compartments and to understand the complex secretory pathway within human eosinophils involved in inflammatory and allergic responses. - Highlights: • Application of EM to understand the complex secretory pathway in human eosinophils. • EM techniques reveal an active vesicular system associated with secretory granules. • Tubular vesicles are involved in the transport of granule-derived immune mediators.

  1. Detailed Studies on the Structure and Dynamics of Reacting Dusty Flows at Normal and Microgravity

    NASA Technical Reports Server (NTRS)

    Andac, M. Gurhan; Cracchiola, Brad; Egolfopoulos, Fokion N.; Campbell, Charles S.

    1999-01-01

    Dusty reacting flows are of particular interest for a wide range of applications. Inert particles can alter the flammability and extinction limits of a combustible mixture. Reacting particles can release substantial amounts of heat and can be used either for power generation or propulsion. Accumulation of combustible particles in air can result in explosions, which, for example, can occur in grain elevators, during lumber milling, and in mine galleries. Furthermore, inert particles are used as flow velocity markers in reacting flows, and their velocity is measured by non-intrusive laser diagnostic techniques. Despite their importance, dusty reacting flows have been less studied and understood than gas-phase flows and sprays. The addition of solid particles to a flowing gas stream can lead to strong couplings between the two phases, which can be of a dynamic, thermal, or chemical nature. The dynamic coupling between the two phases is caused by inertia, which causes the phases to move with different velocities. Furthermore, gravitational, thermophoretic, photophoretic, electrophoretic, diffusiophoretic, centrifugal, and magnetic forces can be exerted on the particles. In general, the magnetic, electrophoretic, centrifugal, photophoretic, and diffusiophoretic forces can be neglected. On the other hand, thermophoretic forces, caused by steep temperature gradients, can be important. Gravitational forces are almost always present and can affect the dynamic response of large particles. Understanding and quantifying the chemical coupling between the two phases is a challenging task. However, all reacting particles begin this process as inert particles, and they must be heated before they participate in the combustion process. Thus, one must first understand the interactions of inert particles in a combustion environment.
A detailed understanding of the dynamics and structure of dusty flows can only be advanced by considering simple flow geometries such as the opposed-jet, stagnation-type configuration. In such configurations the imposed strain rate is well characterized, and an in-depth understanding of the physico-chemical processes can be obtained systematically. A number of computational and experimental studies on spray and particle flows have been conducted in stagnation-type configurations. Numerically, the need for a hybrid Eulerian-Lagrangian approach was identified by Continillo and Sirignano, and the use of such an approach has allowed for the prediction of the phenomenon of droplet flow reversal. Gomez and Rosner have conducted a detailed study of the particle response in the opposed-jet configuration, and the particle thermophoretic diffusivities were determined experimentally. Sung, Law, and co-workers have conducted numerical studies on the effect of strain rate and temperature gradients on the dynamics of inert particles, as a way of understanding potential errors in experimental LDV data that may arise from thermophoretic forces. This investigation is a combined experimental and numerical study of the details of reacting dusty flows. The specific tasks are: (1) experimental determination of laminar flame speeds and extinction strain rates of dusty flows at normal- and micro-gravity as functions of particle type, particle initial diameter, particle initial number density, and gas-phase chemical composition; (2) detailed numerical simulation of the experiments, with results compared against experiments and the adequacy of theoretical models assessed; and (3) provision of enhanced insight into the thermo-chemical coupling between the two phases.
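    The dynamic coupling between phases described above is commonly quantified by the Stokes velocity-relaxation time, tau = rho_p d_p^2 / (18 mu), and the Stokes number tau*K for an imposed strain rate K (St << 1 means the particle faithfully tracks the gas). The sketch below computes both for an illustrative seeding particle; the numerical values are assumptions for the example, not data from this study.

```python
def stokes_response_time(particle_density, particle_diameter, gas_viscosity):
    """Velocity relaxation time of a small sphere under Stokes drag:
    tau = rho_p * d_p**2 / (18 * mu)."""
    return particle_density * particle_diameter**2 / (18.0 * gas_viscosity)

def stokes_number(tau_particle, strain_rate):
    """St = tau_p * K for a stagnation flow with imposed strain rate K [1/s]."""
    return tau_particle * strain_rate

# Illustrative case: a 1-micron alumina particle in room-temperature air.
tau = stokes_response_time(particle_density=3950.0,   # kg/m^3
                           particle_diameter=1e-6,    # m
                           gas_viscosity=1.8e-5)      # Pa*s
st = stokes_number(tau, strain_rate=100.0)
```

For these assumed values tau is on the order of ten microseconds, so at moderate strain rates St stays far below unity and the particle can serve as a reliable velocity marker.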

  2. Nuclear Astrophysics at DANCE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reifarth, R.; Bredeweg, T.A.; Esch, E.-I.

    2005-05-24

    One of the most interesting nuclear physics challenges is obtaining a detailed understanding of the nucleosynthesis processes of the elements. Knowledge of the stellar sites, and of how they are governed by stellar evolution and cosmology, is crucial to understanding the overall picture. Information on reaction rates for neutron- and charged-particle-induced reactions has a direct impact on existing stellar models. Except for the stable isotopes, very few neutron-induced reactions in the energy range of interest have been measured to date. DANCE measurements on stable and unstable isotopes will provide many of the missing key reactions that are needed to understand the nucleosynthesis of the heavy elements.

  3. A New Method to Retrieve the Data Requirements of the Remote Sensing Community – Exemplarily Demonstrated for Hyperspectral User Needs

    PubMed Central

    Nieke, Jens; Reusen, Ils

    2007-01-01

    User-driven requirements for remote sensing data are difficult to define, especially details of geometric, spectral, and radiometric parameters. Even more difficult is a decent assessment of the required degrees of processing and corresponding data quality. It is therefore a real challenge to appropriately assess the data costs and services to be provided. In 2006, the HYRESSA project was initiated within the Sixth Framework Programme of the European Commission to analyze the user needs of the hyperspectral remote sensing community. Special focus was given to finding an answer to the key question, "What are the individual user requirements for hyperspectral imagery and its related data products?". A Value-Benefit Analysis (VBA) was performed to retrieve user needs and address open items accordingly. The VBA is an established tool for systematic problem solving that supports the comparison of competing projects or solutions. It enables evaluation on the basis of a multidimensional objective model and can be augmented with experts' preferences. After the VBA, a scaling method (e.g., the Law of Comparative Judgment) was applied to achieve the desired ranking judgments. The result, which is the relative value of projects with respect to a well-defined main objective, can therefore be produced analytically using a VBA. A multidimensional objective model adhering to VBA methodology was established. Thereafter, end users and experts were requested to fill out a Questionnaire of User Needs (QUN) at the highest level of detail, the value-indicator level. End users were additionally requested to report personal preferences for their particular research fields. In the end, results from the experts' evaluation and results from a sensor data survey can be compared in order to understand user needs and the drawbacks of existing data products.
The investigation, which focused on the needs of the hyperspectral user community in Europe, showed that a VBA is a suitable method for analyzing the needs of hyperspectral data users and supporting the sensor/data specification-building process. The VBA has the advantage of being easy to handle, resulting in a comprehensive evaluation. The primary disadvantage is the large effort required to carry out such an analysis, because the level of detail is extremely high.
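    At its core, a value-benefit analysis of this kind aggregates criterion scores through importance weights drawn from the objective model. A minimal sketch, with entirely hypothetical criteria, weights, and sensor options:

```python
def value_benefit_scores(weights, ratings):
    """Weighted-sum value-benefit scoring.

    weights: dict criterion -> importance weight (should sum to 1)
    ratings: dict option -> {criterion: score}
    Returns dict option -> aggregate value.
    """
    return {
        option: sum(weights[c] * score for c, score in crit_scores.items())
        for option, crit_scores in ratings.items()
    }

# Hypothetical objective model: three requirement dimensions, two sensor options.
weights = {"spectral": 0.5, "geometric": 0.3, "radiometric": 0.2}
ratings = {
    "sensor_A": {"spectral": 8, "geometric": 6, "radiometric": 7},
    "sensor_B": {"spectral": 6, "geometric": 9, "radiometric": 8},
}
scores = value_benefit_scores(weights, ratings)
```

Ranking the options by their aggregate scores yields the relative value of each alternative with respect to the main objective, which is exactly the kind of output the VBA produces analytically.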

  4. Informed consent and the readability of the written consent form.

    PubMed

    Sivanadarajah, N; El-Daly, I; Mamarelis, G; Sohail, M Z; Bates, P

    2017-11-01

    Introduction The aim of this study was to objectively ascertain the level of readability of standardised consent forms for orthopaedic procedures. Methods Standardised consent forms (both in summary and detailed formats) endorsed by the British Orthopaedic Association (BOA) were retrieved from orthoconsent.com and assessed for readability. This involved using an online tool to calculate the validated Flesch reading ease score (FRES). This was compared with the FRES for the National Health Service (NHS) Consent Form 1. Data were analysed and interpreted according to the FRES grading table. Results The FRES for Consent Form 1 was 55.6, relating to the literacy expected of an A level student. The mean FRES for the BOA summary consent forms (n=27) was 63.6 (95% confidence interval [CI]: 61.2-66.0) while for the detailed consent forms (n=32), it was 68.9 (95% CI: 67.7-70.0). All BOA detailed forms scored >60, correlating to the literacy expected of a 13-15-year-old. The detailed forms had a higher FRES than the summary forms (p<0.001). Conclusions This study demonstrates that the BOA endorsed standardised consent forms are much easier to read and understand than the NHS Consent Form 1, with the detailed BOA forms being the easiest to read. Despite this, owing to varying literacy levels, a significant proportion of patients may struggle to give informed consent based on the written information provided to them.
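    The FRES values reported above come from the standard Flesch formula, FRES = 206.835 - 1.015 * (words/sentences) - 84.6 * (syllables/words). The sketch below implements it with a rough vowel-group syllable heuristic; production readability tools typically use pronunciation dictionaries instead.

```python
import re

def count_syllables(word):
    # Rough heuristic: count vowel groups; drop a trailing silent 'e'.
    word = word.lower()
    count = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and not word.endswith(("le", "ee")) and count > 1:
        count -= 1
    return max(count, 1)

def flesch_reading_ease(text):
    sentences = max(len(re.findall(r"[.!?]+", text)), 1)
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (206.835
            - 1.015 * (len(words) / sentences)
            - 84.6 * (syllables / len(words)))

easy = flesch_reading_ease("The cat sat on the mat.")
hard = flesch_reading_ease(
    "Comprehensive institutional requirements necessitate extraordinary deliberation.")
```

Higher scores mean easier text: scores above 60 correspond roughly to the literacy of a 13-15-year-old, which is the band the detailed BOA forms fall into, while Consent Form 1's 55.6 demands A-level literacy.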

  5. Understanding r-process nucleosynthesis with dwarf galaxies

    NASA Astrophysics Data System (ADS)

    Ji, Alexander P.

    2018-06-01

    The Milky Way's faintest dwarf galaxy satellites each sample short, independent bursts of star formation from the first 1-2 Gyr of the universe. Their simple formation history makes them ideal systems to understand how rare events like neutron star mergers contribute to early enrichment of r-process elements. I will focus on the ultra-faint galaxy Reticulum II, which experienced a single prolific r-process event that left ~80% of its stars extremely enriched in r-process elements. I will present abundances of ~40 elements derived from the highest signal-to-noise high-resolution spectrum ever taken for an ultra-faint dwarf galaxy star. Precise measurements of elements from all three r-process peaks reaffirm the universal nature of the r-process abundance pattern from Ba to Ir. The first r-process peak is significantly lower than solar but matches other r-process enhanced stars. This constrains the neutron-richness of r-process ejecta in neutron star mergers. The radioactive element thorium is detected with a somewhat low abundance. Naive application of currently predicted initial production ratios could imply an age >20 Gyr, but more likely indicates that the initial production ratios require revision. The abundances of lighter elements up to Zn are consistent with extremely metal-poor Milky Way halo stars. These elements may eventually provide a way to test for other hypothesized r-process sites, but only after a more detailed understanding of the chemical evolution in this galaxy. Reticulum II provides a clean view of early r-process enrichment that can be used to understand the increasing number of r-process measurements in other dwarf galaxies.

  6. "I Do Which the Question": Students' Innovative Use of Technology Resources in the Language Classroom

    ERIC Educational Resources Information Center

    Dooly, Melinda

    2018-01-01

    Many reports suggest that the use of education technology can have a positive effect on language education. However, most of the research indicates that there is need for more detailed understanding of the pedagogical processes that support technology-enhanced language learning. This text takes a social semiotic perspective to examine multimodal…

  7. Silicon Schottky photovoltaic diodes for solar energy conversion

    NASA Technical Reports Server (NTRS)

    Anderson, W. A.

    1975-01-01

    Various factors in Schottky barrier solar cell fabrication are evaluated in order to improve understanding of the current flow mechanism and to isolate processing variables that improve efficiency. Results of finger design, substrate resistivity, surface finishing and activation energy studies are detailed. An increased fill factor was obtained by baking of the vacuum system to remove moisture.

  8. X-ray diffraction imaging (topography) of electro-optic crystals by synchrotron radiation

    NASA Technical Reports Server (NTRS)

    Steiner, Bruce; Kuriyama, Masao; Dobbyn, Ronald C.; Laor, Uri

    1988-01-01

    Information of special interest to crystal growers and device physicists now available from monochromatic synchrotron diffraction imaging (topography) is reviewed. Illustrations are taken from a variety of electro-optic crystals. Aspects of the detailed understanding of crystal growth processes obtainable from carefully selected samples are described. Finally, new experimental opportunities now available for exploitation are indicated.

  9. Smartphone Apps on the Mobile Web: An Exploratory Case Study of Business Models

    ERIC Educational Resources Information Center

    Ford, Caroline Morgan

    2012-01-01

    The purpose of this research is to explore the business strategies of a firm seeking to develop and profitably market a mobile smartphone application to understand how small, digital entrepreneurships may build sustainable business models given substantial market barriers. Through a detailed examination of one firm's process to try to…

  10. Helping Students Understand the Role of Symmetry in Chemistry Using the Particle-in-a-Box Model

    ERIC Educational Resources Information Center

    Manae, Meghna A.; Hazra, Anirban

    2016-01-01

    In a course on chemical applications of symmetry and group theory, students learn to use several useful tools (like character tables, projection operators, and correlation tables), but in the process of learning the mathematical details, they often miss the conceptual big picture about "why" and "how" symmetry leads to the…

  11. Language or Music, Mother or Mozart? Structural and Environmental Influences on Infants' Language Networks

    ERIC Educational Resources Information Center

    Dehaene-Lambertz, G.; Montavont, A.; Jobert, A.; Allirol, L.; Dubois, J.; Hertz-Pannier, L.; Dehaene, S.

    2010-01-01

    Understanding how language emerged in our species calls for a detailed investigation of the initial specialization of the human brain for speech processing. Our earlier research demonstrated that an adult-like left-lateralized network of perisylvian areas is already active when infants listen to sentences in their native language, but did not…

  12. Starbursts and their dynamics

    NASA Technical Reports Server (NTRS)

    Norman, Colin

    1987-01-01

    Detailed mechanisms associated with dynamical processes occurring in starburst galaxies are considered, including the role of bars, waves, mergers, sinking satellites, self-gravitating gas, and bulge heating. The current understanding of starburst galaxies, both observational and theoretical, is placed in the context of theories of galaxy formation, Hubble-sequence evolution, starbursts and activity, and the nature of quasar absorption lines.

  13. Class and Home Problems. Modeling an Explosion: The Devil Is in the Details

    ERIC Educational Resources Information Center

    Hart, Peter W.; Rudie, Alan W.

    2011-01-01

    Within the past 15 years, three North American pulp mills experienced catastrophic equipment failures while using 50 wt% hydrogen peroxide. In two cases, explosions occurred when normal pulp flow was interrupted due to other process problems. To understand the accidents, a kinetic model of alkali-catalyzed decomposition of peroxide was developed.…

  14. Defining High-Risk Precursor Signaling to Advance Breast Cancer Risk Assessment and Prevention

    DTIC Science & Technology

    2016-03-01

    the incidence and lethality of breast cancer will require a detailed understanding of the earliest tissue changes that ultimately drive the process of...

  15. Telemetry downlink interfaces and level-zero processing

    NASA Technical Reports Server (NTRS)

    Horan, S.; Pfeiffer, J.; Taylor, J.

    1991-01-01

    The technical areas being investigated are as follows: (1) processing of space to ground data frames; (2) parallel architecture performance studies; and (3) parallel programming techniques. Additionally, the University administrative details and the technical liaison between New Mexico State University and Goddard Space Flight Center are addressed.

  16. The Diamond Beamline Controls and Data Acquisition Software Architecture

    NASA Astrophysics Data System (ADS)

    Rees, N.

    2010-06-01

    The software for the Diamond Light Source beamlines[1] is based on two complementary software frameworks: low-level control is provided by the Experimental Physics and Industrial Control System (EPICS) framework[2][3], and the high-level user interface is provided by the Java-based Generic Data Acquisition system, or GDA[4][5]. EPICS provides a widely used, robust, generic interface across a wide range of hardware, with user interfaces focused on serving the needs of engineers and beamline scientists who require detailed low-level views of all aspects of the beamline control systems. The GDA system provides a high-level system that combines an understanding of scientific concepts, such as reciprocal lattice coordinates, a flexible Python-syntax scripting interface for the scientific user to control their data acquisition, and graphical user interfaces where necessary. This paper describes the beamline software architecture in more detail, highlighting how these complementary frameworks provide a flexible system that can accommodate a wide range of requirements.

  17. Rovibrational bound states of SO2 isotopologues. I: Total angular momentum J = 0-10

    NASA Astrophysics Data System (ADS)

    Kumar, Praveen; Ellis, Joseph; Poirier, Bill

    2015-04-01

    Isotopic variation of the rovibrational bound states of SO2 for the four stable sulfur isotopes 32S, 33S, 34S, and 36S is investigated in comprehensive detail. In a two-part series, we compute the low-lying energy levels for all values of total angular momentum in the range J = 0-20. All rovibrational levels are computed to an extremely high level of numerical convergence. The calculations have been carried out using the ScalIT suite of parallel codes. The present study (Paper I) examines the J = 0-10 rovibrational levels, providing unambiguous symmetry and rovibrational label assignments for each computed state. The calculated vibrational energy levels exhibit very good agreement with previously reported experimental and theoretical data. Rovibrational energy levels, calculated without any Coriolis approximations, are reported here for the first time. Among other potential ramifications, these data will facilitate understanding of the origin of mass-independent fractionation of sulfur isotopes in the Archean rock record, which is of great relevance for understanding the "oxygen revolution".

  18. Lhires III High Resolution Spectrograph

    NASA Astrophysics Data System (ADS)

    Thizy, O.

    2007-05-01

    By spreading the light from celestial objects by wavelength, spectroscopists are like detectives looking for clues and identifying the guilty phenomena that shape their spectra. We will review some basic principles of spectroscopy that help, at the amateur level, to understand how spectra are shaped. We will review the Lhires III high-resolution spectrograph Mark Three, which was designed to reveal line-profile details and subtle changes. Then we will give an overview of the educational and scientific projects conducted with the Lhires III and detail the COROT Be star program and the BeSS database, for which the spectrograph is a key instrument.

  19. When parsimony is not enough: Considering dual processes and dual levels of influence in sexual decision making

    PubMed Central

    Rendina, H. Jonathon

    2015-01-01

    The literature on sexual decision making that has been used to understand behaviors relevant to HIV and STI risk has relied primarily on cognitive antecedents of behavior. In contrast, several prominent models of decision making outside of the sexual behavior literature rely on dual process models, in which both affective and cognitive processing are considered important precursors to behavior. Moreover, much of the literature on sexual behavior utilizes individual-level traits and characteristics to predict aggregated sexual behavior, despite decision making itself being a situational or event-level process. This paper proposes a framework for understanding sexual decision making as the result of dual processes (affective and cognitive) operating at dual levels of influence (individual and situational). Finally, the paper ends with a discussion of the conceptual and methodological benefits and challenges to its use and future directions for research. PMID:26168978

  20. Decoding Environmental Processes Using Radioactive Isotopes for the Post-Radioactive Contamination Recovery Assessment

    NASA Astrophysics Data System (ADS)

    Yasumiishi, Misa; Nishimura, Taku; Osawa, Kazutoshi; Renschler, Chris

    2017-04-01

    The continual monitoring of environmental radioactivity levels in Fukushima, Japan following the nuclear plant accident in March 2011 provides our society with valuable information in two ways. First, the collected data can be used as an indicator to assess the progress of decontamination efforts. Secondly, the collected data can be used to understand the behavior of radioactive isotopes in the environment, which leads to further understanding of landform processes. These two aspects are inseparable if we are to understand the effects of radioactive contamination in a dynamic environmental system. During the summer of 2016, 27 soil core samples were collected on a farmer's land (rice paddies and forest) in Fukushima, about 20 km northwest of the nuclear plant. Each core was divided into 2.0-3.0 cm slices for measurement of Cs-134, Cs-137, and I-131 levels. The collected data are being analyzed from multiple perspectives: temporal, spatial, and geophysical. In the forest area, even on the same hillslope, multiple soil types and horizon depths were observed, which indicates the challenges in assessing subsurface radioactive isotope movements. It appears that although highly humic soils show the same or higher levels of radioactivity in the surface layers, radioactivity decreased more rapidly with depth in those samples than in sandier soils. With regard to the direction a slope faces and the sampling altitudes, the correlation between those attributes and radioactivity levels is inconclusive at this moment. Altitude might have affected the fallout level on a single-hillslope basis; however, to determine the correlation, further sampling and detailed analysis of vegetation and topography might be necessary. Where the surface soil was scraped and new soil was brought in, former rice paddy surface layers showed radioactivity levels in the top layer three orders of magnitude lower than those of forest soils.
At the foot of forest slopes where the surface soil was scraped and litter was cleared, the scraping showed mixed results in radioactivity reduction. It is estimated that by the completion of soil decontamination in 2020, up to 22 million cubic meters of so-called 'contaminated soils' will have been scraped off in the affected areas and transferred to an underground storage. Understanding the radioactive isotope behaviors is crucial to assessing the financial and environmental consequences of such measures. As an example, a 30-year simulation of a 5-13 % hillslope under thick vegetation with GeoWEPP (the Geospatial interface for the Water Erosion Prediction Project) resulted in a very small soil loss on the hillslope. However, the results showed about five tons of soil loss through channels and as sediment discharge annually. On the hillslope, the radioactivity level in about the top 4.0 cm of the soil exceeded the 8,000 Bq/kg threshold which the Japanese government has set for surface soil removal. Referring to the case study data in Fukushima, this presentation will discuss how environmental decontamination measures (e.g. forest clearing) and monitoring methods should be considered and planned against dynamic environmental processes.

Top