Research accomplished at the Knowledge Based Systems Lab: IDEF3, version 1.0
NASA Technical Reports Server (NTRS)
Mayer, Richard J.; Menzel, Christopher P.; Mayer, Paula S. D.
1991-01-01
An overview is presented of the foundations and content of the evolving IDEF3 process flow and object state description capture method. This method is currently in beta test. Ongoing efforts in the formulation of formal semantics models for descriptions captured in the outlined form, and in the actual application of the method, can be expected to cause an evolution in the method language. A language is described for the representation of process- and object-state-centered system descriptions. IDEF3 is a scenario-driven process flow modeling methodology created specifically for these types of descriptive activities.
The Radiative Neutron Capture on 2H, 6Li, 7Li, 12C and 13C at Astrophysical Energies
NASA Astrophysics Data System (ADS)
Dubovichenko, Sergey; Dzhazairov-Kakhramanov, Albert; Burkova, Natalia
2013-05-01
The continued interest in the study of radiative neutron capture on atomic nuclei is due, on the one hand, to the important role played by this process in the analysis of many fundamental properties of nuclei and nuclear reactions and, on the other hand, to the wide use of capture cross-section data in various applications of nuclear physics and nuclear astrophysics, as well as to the importance of the analysis of primordial nucleosynthesis in the Universe. This paper is devoted to the description of results for the processes of radiative neutron capture on certain light atomic nuclei at thermal and astrophysical energies. These processes are considered within the framework of the potential cluster model (PCM), a general description of which was given earlier. The use of the results obtained, based on intercluster potentials derived from phase-shift analysis, is demonstrated in calculations of the radiative capture characteristics. The capture reactions considered are not part of stellar thermonuclear cycles, but they are involved in the basic reaction chain of primordial nucleosynthesis in the course of the formation of the Universe.
NASA Astrophysics Data System (ADS)
Mumpower, M. R.; Kawano, T.; Ullmann, J. L.; Krtička, M.; Sprouse, T. M.
2017-08-01
Radiative neutron capture is an important nuclear reaction whose accurate description is needed for many applications ranging from nuclear technology to nuclear astrophysics. The description of such a process relies on the Hauser-Feshbach theory which requires the nuclear optical potential, level density, and γ-strength function as model inputs. It has recently been suggested that the M1 scissors mode may explain discrepancies between theoretical calculations and evaluated data. We explore statistical model calculations with the strength of the M1 scissors mode estimated to be dependent on the nuclear deformation of the compound system. We show that the form of the M1 scissors mode improves the theoretical description of evaluated data and the match to experiment in both the fission product and actinide regions. Since the scissors mode occurs in the range of a few keV to a few MeV, it may also impact the neutron capture cross sections of neutron-rich nuclei that participate in the rapid neutron capture process of nucleosynthesis. We comment on the possible impact to nucleosynthesis by evaluating neutron capture rates for neutron-rich nuclei with the M1 scissors mode active.
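As a gloss on how a deformation-dependent scissors mode enters the γ-strength input, here is a minimal numpy sketch: a standard-Lorentzian M1 spin-flip resonance plus a low-lying scissors Lorentzian whose strength scales with the quadrupole deformation β2. The resonance energies, widths, and the proportionality constant are illustrative assumptions, not the parameterization used by the authors.

```python
import numpy as np

def lorentzian(e_gamma, e_r, gamma_r, sigma_r):
    """Standard Lorentzian strength-function shape (arbitrary units)."""
    return (sigma_r * gamma_r**2 * e_gamma /
            ((e_gamma**2 - e_r**2)**2 + e_gamma**2 * gamma_r**2))

def m1_strength(e_gamma, beta2):
    """Toy M1 strength: a spin-flip resonance plus a low-lying scissors
    mode whose strength is taken proportional to the deformation beta2."""
    f = lorentzian(e_gamma, 7.0, 4.0, 1.0)                 # spin-flip (illustrative)
    f += lorentzian(e_gamma, 3.0, 1.5, 5.0 * abs(beta2))   # scissors (assumed scaling)
    return f

e = np.linspace(0.1, 10.0, 200)
print(m1_strength(e, beta2=0.25).max())   # scissors bump grows with |beta2|
```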
Capturing Cognitive Processing Time for Active Authentication
2014-02-01
cognitive fingerprint for continuous authentication. Its effectiveness has been verified through a campus-wide experiment at Iowa State University... Cognitive Fingerprint Description... brief to capture a "cognitive fingerprint." In the current keystroke-authentication commercial market, some products combine the timing information of
Mumpower, Matthew Ryan; Kawano, Toshihiko; Ullmann, John Leonard; ...
2017-08-17
Radiative neutron capture is an important nuclear reaction whose accurate description is needed for many applications ranging from nuclear technology to nuclear astrophysics. The description of such a process relies on the Hauser-Feshbach theory which requires the nuclear optical potential, level density, and γ-strength function as model inputs. It has recently been suggested that the M1 scissors mode may explain discrepancies between theoretical calculations and evaluated data. We explore statistical model calculations with the strength of the M1 scissors mode estimated to be dependent on the nuclear deformation of the compound system. We show that the form of the M1 scissors mode improves the theoretical description of evaluated data and the match to experiment in both the fission product and actinide regions. Since the scissors mode occurs in the range of a few keV to a few MeV, it may also impact the neutron capture cross sections of neutron-rich nuclei that participate in the rapid neutron capture process of nucleosynthesis. As a result, we comment on the possible impact to nucleosynthesis by evaluating neutron capture rates for neutron-rich nuclei with the M1 scissors mode active.
NASA Technical Reports Server (NTRS)
Bonanne, Kevin H.
2011-01-01
Model-based Systems Engineering (MBSE) is an emerging methodology that can be leveraged to enhance many system development processes. MBSE allows for the centralization of an architecture description that would otherwise be stored in various locations and formats, thus simplifying communication among the project stakeholders, inducing commonality in representation, and expediting report generation. This paper outlines the MBSE approach taken to capture the processes of two different, but related, architectures by employing the Systems Modeling Language (SysML) as a standard for architecture description and the modeling tool MagicDraw. The overarching goal of this study was to demonstrate the effectiveness of MBSE as a means of capturing and designing a mission systems architecture. The first portion of the project focused on capturing the necessary system engineering activities that occur when designing, developing, and deploying a mission systems architecture for a space mission. The second part applies activities from the first to an application problem - the system engineering of the Orion Flight Test 1 (OFT-1) End-to-End Information System (EEIS). By modeling the activities required to create a space mission architecture and then implementing those activities in an application problem, the utility of MBSE as an approach to systems engineering can be demonstrated.
Typewriting: Toward Duplicating Success
ERIC Educational Resources Information Center
Orsborn, Karen J.
1977-01-01
A description of two projects (secretarial handbook and memo pad and personalized stationery) for use in teaching the duplication process that will capture the interests of students in an advanced typewriting class. (HD)
Modeling Amorphous Microporous Polymers for CO2 Capture and Separations.
Kupgan, Grit; Abbott, Lauren J; Hart, Kyle E; Colina, Coray M
2018-06-13
This review concentrates on the advances of atomistic molecular simulations to design and evaluate amorphous microporous polymeric materials for CO2 capture and separations. A description of atomistic molecular simulations is provided, including simulation techniques, structural generation approaches, relaxation and equilibration methodologies, and considerations needed for validation of simulated samples. The review provides general guidelines and a comprehensive update of the recent literature (since 2007) to promote the acceleration of the discovery and screening of amorphous microporous polymers for CO2 capture and separation processes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ruiz, Steven Adriel
The following discussion contains a high-level description of the methods used to implement software for data processing. It describes the directory structures and file handling required to use Excel's Visual Basic for Applications programming language, and how to identify shot, test, and capture types in order to process data appropriately. It also describes how to interface with the software.
1995-09-01
vital processes of a business. process, IDEF, method, methodology, modeling, knowledge acquisition, requirements definition, information systems... knowledge resources. Like manpower, materials, and machines, information and knowledge assets are recognized as vital resources that can be leveraged to... integrated enterprise. These technologies are designed to leverage information and knowledge resources as the key enablers for high quality systems
Decomposing phenotype descriptions for the human skeletal phenome.
Groza, Tudor; Hunter, Jane; Zankl, Andreas
2013-01-01
Over the course of the last few years there has been a significant amount of research performed on ontology-based formalization of phenotype descriptions. The intrinsic value and knowledge captured within such descriptions can only be expressed by taking advantage of their inner structure that implicitly combines qualities and anatomical entities. We present a meta-model (the Phenotype Fragment Ontology) and a processing pipeline that enable together the automatic decomposition and conceptualization of phenotype descriptions for the human skeletal phenome. We use this approach to showcase the usefulness of the generic concept of phenotype decomposition by performing an experimental study on all skeletal phenotype concepts defined in the Human Phenotype Ontology.
The gas heterogeneous flows cleaning technology from corona discharge field
NASA Astrophysics Data System (ADS)
Bogdanov, A.; Tokarev, A.; Judanov, V.; Vinogradov, V.
2017-11-01
A technology for nanogold capture and extraction from the combustion products of Kara-Keche coal is offered, with a description of the process: preparation of the coal for experiments, introduction of nanogold into its composition, the temperature and time performance of combustion, the design and function of the experimental apparatus, gas purification of the gas flow, and recovery of the combustion products (condensate, coke, ash, rags).
NASA Astrophysics Data System (ADS)
Baird, M. E.; Walker, S. J.; Wallace, B. B.; Webster, I. T.; Parslow, J. S.
2003-03-01
A simple model of estuarine eutrophication is built on biomechanical (or mechanistic) descriptions of a number of the key ecological processes in estuaries. Mechanistically described processes include the nutrient uptake and light capture of planktonic and benthic autotrophs, and the encounter rates of planktonic predators and prey. Other more complex processes, such as sediment biogeochemistry, detrital processes and phosphate dynamics, are modelled using empirical descriptions from the Port Phillip Bay Environmental Study (PPBES) ecological model. A comparison is made between the mechanistically determined rates of ecological processes and the analogous empirically determined rates in the PPBES ecological model. The rates generally agree, with a few significant exceptions. Model simulations were run at a range of estuarine depths and nutrient loads, with outputs presented as the annually averaged biomass of autotrophs. The simulations followed a simple conceptual model of eutrophication, suggesting a simple biomechanical understanding of estuarine processes can provide a predictive tool for ecological processes in a wide range of estuarine ecosystems.
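To illustrate the kind of biomechanical process description this abstract refers to, here is a short Python sketch using two textbook idealizations: the diffusion-limited nutrient uptake of a spherical cell (4πDrC) and a swept-volume encounter-rate kernel for planktonic predators and prey. Both forms are assumed here for illustration rather than taken from the PPBES model.

```python
import math

def diffusive_uptake(D, r, C):
    """Diffusion-limited nutrient uptake of a spherical cell (mol/s):
    the classic 4*pi*D*r*C mass-transfer limit."""
    return 4.0 * math.pi * D * r * C

def encounter_rate(R, u, prey_conc):
    """Rectilinear encounter rate of a cruising predator with prey:
    swept volume per time (pi * R^2 * u) times prey concentration."""
    return math.pi * R**2 * u * prey_conc

# Example: nitrate (D ~ 1.7e-9 m^2/s) taken up by a 5-micron-radius cell
# in 1 mmol m^-3 nitrate (illustrative values):
print(diffusive_uptake(1.7e-9, 5e-6, 1e-3))   # ~1.1e-16 mol/s
```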
NASA Technical Reports Server (NTRS)
Menzel, Christopher; Mayer, Richard J.; Edwards, Douglas D.
1991-01-01
The Process Description Capture Method (IDEF3) is one of several Integrated Computer-Aided Manufacturing (ICAM) DEFinition methods developed by the Air Force to support systems engineering activities, and in particular, to support information systems development. These methods have evolved as a distillation of 'good practice' experience by information system developers and are designed to raise the performance level of the novice practitioner to one comparable with that of an expert. IDEF3 is meant to serve as a knowledge acquisition and requirements definition tool that structures the user's understanding of how a given process, event, or system works around process descriptions. A special purpose graphical language accompanying the method serves to highlight temporal precedence and causality relationships relative to the process or event being described.
The adaptation of GDL motion recognition system to sport and rehabilitation techniques analysis.
Hachaj, Tomasz; Ogiela, Marek R
2016-06-01
The main novelty of this paper is the adaptation of the Gesture Description Language (GDL) methodology to sport and rehabilitation data analysis and classification. We show that the Lua language can be successfully used to adapt the GDL classifier to these tasks. The newly applied scripting language allows easy extension and integration of the classifier with other software technologies and applications. The execution speed obtained allows the methodology to be used in real-time motion capture data processing, where the capture frequency ranges from 100 Hz up to 500 Hz depending on the number of features or classes to be calculated and recognized. Owing to this, the proposed methodology can be used with high-end motion capture systems. We anticipate that this novel, efficient and effective method will greatly help both sports trainers and physiotherapists in their practice. The proposed approach can be directly applied to kinematic analysis of motion capture data (evaluation of motion without regard to the forces that cause that motion). The ability to apply pattern recognition methods to GDL descriptions can be utilized in virtual reality environments and used for sport training or rehabilitation treatment.
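The rule-based flavor of GDL classification can be suggested with a small sketch, written in Python rather than the Lua scripting the paper describes; the joint names, the "hands above head" rule, and the consecutive-frame filter are illustrative inventions, not part of the GDL specification.

```python
# A GDL-style rule re-expressed in Python (illustrative, not actual GDL).
def hands_above_head(frame):
    """frame: dict mapping joint name -> (x, y, z) in metres, y pointing up."""
    return (frame["left_hand"][1] > frame["head"][1] and
            frame["right_hand"][1] > frame["head"][1])

def classify(sequence, rule, min_frames=10):
    """Fire the rule only when it holds on enough consecutive frames,
    filtering single-frame jitter at 100-500 Hz capture rates."""
    run = 0
    for frame in sequence:
        run = run + 1 if rule(frame) else 0
        if run >= min_frames:
            return True
    return False
```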
NASA Technical Reports Server (NTRS)
Mayer, Richard J.; Blinn, Thomas M.; Dewitte, Paula S.; Crump, John W.; Ackley, Keith A.
1992-01-01
In the second volume of the Demonstration Framework Document, the graphical representation of the demonstration framework is given. This second document was created to facilitate the reading and comprehension of the demonstration framework. It is designed to be viewed in parallel with Section 4.2 of the first volume to help give a picture of the relationships among the UOBs (Units of Behavior) of the model. The model is quite large, and the design team felt that this form of presentation would make it easier for the reader to get a feel for the processes described in this document. The IDEF3 (Process Description Capture Method) diagrams of the processes of an Information System Development are presented. Volume 1 describes the processes and the agents involved with each process, while this volume graphically shows the precedence relationships among the processes.
Averaging, passage through resonances, and capture into resonance in two-frequency systems
NASA Astrophysics Data System (ADS)
Neishtadt, A. I.
2014-10-01
Applying small perturbations to an integrable system leads to its slow evolution. For an approximate description of this evolution the classical averaging method prescribes averaging the rate of evolution over all the phases of the unperturbed motion. This simple recipe does not always produce correct results, because of resonances arising in the process of evolution. The phenomenon of capture into resonance consists in the system starting to evolve in such a way as to preserve the resonance property once it has arisen. This paper is concerned with application of the averaging method to a description of evolution in two-frequency systems. It is assumed that the trajectories of the averaged system intersect transversally the level surfaces of the frequency ratio and that certain other conditions of general position are satisfied. The rate of evolution is characterized by a small parameter ε. The main content of the paper is a proof of the following result: outside a set of initial data with measure of order √ε the averaging method describes the evolution to within O(√ε |ln ε|) for periods of time of order 1/ε. This estimate is sharp. The exceptional set of measure √ε contains the initial data for phase points captured into resonance. A description of the motion of such phase points is given, along with a survey of related results on averaging. Examples of capture into resonance are presented for some problems in the dynamics of charged particles. Several open problems are stated. Bibliography: 65 titles.
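For readers unfamiliar with the setting, the standard two-frequency averaging setup behind the quoted estimate can be written as follows (notation assumed here for illustration, not quoted from the paper):

```latex
% Slow variables I, fast phases phi_1, phi_2, small parameter epsilon;
% J solves the averaged system.
\begin{align*}
  \dot I &= \varepsilon f(I,\varphi_1,\varphi_2), \qquad
  \dot\varphi_i = \omega_i(I) + O(\varepsilon), \quad i = 1,2, \\
  \dot J &= \varepsilon \langle f\rangle(J), \qquad
  \langle f\rangle(J) = \frac{1}{(2\pi)^2}\int_0^{2\pi}\!\!\int_0^{2\pi}
      f(J,\varphi_1,\varphi_2)\,d\varphi_1\,d\varphi_2 .
\end{align*}
% The theorem quoted in the abstract: outside an exceptional set of initial
% data of measure O(sqrt(epsilon)) (containing the captured trajectories),
\[
  |I(t) - J(t)| = O\!\left(\sqrt{\varepsilon}\,\lvert\ln\varepsilon\rvert\right)
  \quad\text{for}\quad 0 \le t \le 1/\varepsilon .
\]
```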
Connecting single cell to collective cell behavior in a unified theoretical framework
NASA Astrophysics Data System (ADS)
George, Mishel; Bullo, Francesco; Campàs, Otger
Collective cell behavior is an essential part of tissue and organ morphogenesis during embryonic development, as well as of various disease processes, such as cancer. In contrast to many in vitro studies of collective cell migration, most cases of in vivo collective cell migration involve rather small groups of cells, with large sheets of migrating cells being less common. The vast majority of theoretical descriptions of collective cell behavior focus on large numbers of cells, but fail to accurately capture the dynamics of small groups of cells. Here we introduce a low-dimensional theoretical description that successfully captures single cell migration, cell collisions, collective dynamics in small groups of cells, and force propagation during sheet expansion, all within a common theoretical framework. Our description is derived from first principles and also includes key phenomenological aspects of cell migration that control the dynamics of traction forces. Among other results, we explain the counter-intuitive observations that pairs of cells repel each other upon collision while they behave in a coordinated manner within larger clusters.
Physical Model for the Evolution of the Genetic Code
NASA Astrophysics Data System (ADS)
Yamashita, Tatsuro; Narikiyo, Osamu
2011-12-01
Using the shape space of codons and tRNAs, we give a physical description of the evolution of the genetic code on the basis of the codon capture and ambiguous intermediate scenarios in a consistent manner. In the lowest-dimensional version of our description, a physical quantity, the codon level, is introduced. In terms of codon levels, the two scenarios are classified into two different routes of the evolutionary process. In the case of the ambiguous intermediate scenario, we perform an evolutionary simulation implementing cost selection of amino acids and confirm a rapid transition of the code change. Such rapidity mitigates the weakness of the scenario, namely the discomfort of the non-unique translation of the code in the intermediate state. In the case of the codon capture scenario, survival against mutations under the mutational pressure minimizing GC content in genomes is simulated, and it is demonstrated that cells which experience only neutral mutations survive.
Dynamic CDM strategies in an EHR environment.
Bieker, Michael; Bailey, Spencer
2012-02-01
A dynamic charge description master (CDM) integrates information from clinical ancillary systems into the charge-capture process, so an organization can reduce its reliance on the patient accounting system as the sole source of billing information. By leveraging the information from electronic ancillary systems, providers can eliminate the need for paper charge-capture forms and see increased accuracy and efficiency in the maintenance of billing information. Before embarking on a dynamic CDM strategy, organizations should first determine their goals for implementing an EHR system, include revenue cycle leaders on the EHR implementation team, and carefully weigh the pros and cons of CDM design decisions.
Electron capture and excitation processes in H+-H collisions in dense quantum plasmas
NASA Astrophysics Data System (ADS)
Jakimovski, D.; Markovska, N.; Janev, R. K.
2016-10-01
Electron capture and excitation processes in proton-hydrogen atom collisions taking place in dense quantum plasmas are studied by employing the two-centre atomic orbital close-coupling (TC-AOCC) method. The Debye-Hückel cosine (DHC) potential is used to describe the plasma screening effects on the Coulomb interaction between charged particles. The properties of a hydrogen atom with DHC potential are investigated as a function of the screening strength of the potential. It is found that the decrease in binding energy of nl levels with increasing screening strength is considerably faster than in the case of the Debye-Hückel (DH) screening potential, appropriate for description of charged particle interactions in weakly coupled classical plasmas. This results in a reduction in the number of bound states in the DHC potential with respect to that in the DH potential for the same plasma screening strength, and is reflected in the dynamics of excitation and electron capture processes for the two screened potentials. The TC-AOCC cross sections for total and state-selective electron capture and excitation cross sections with the DHC potential are calculated for a number of representative screening strengths in the 1-300 keV energy range and compared with those for the DH and pure Coulomb potential. The total capture cross sections for a selected number of screening strengths are compared with the available results from classical trajectory Monte Carlo calculations.
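The reported faster loss of binding with screening can be explored numerically. Below is a minimal Python sketch that diagonalizes the s-wave radial Schrödinger equation for the DHC potential V(r) = -exp(-r/D) cos(r/D)/r in atomic units; the grid parameters and normalization are illustrative assumptions, and convergence should be checked. Replacing the cosine factor with 1 gives the ordinary Debye-Hückel potential for comparison.

```python
import numpy as np
from scipy.linalg import eigh_tridiagonal

def dhc_levels(D, l=0, rmax=80.0, n=4000, k=3):
    """Lowest k radial eigenvalues (atomic units) of hydrogen with the
    Debye-Hueckel cosine potential, via finite-difference diagonalization
    of the radial equation. Grid choices are illustrative."""
    r = np.linspace(rmax / n, rmax, n)
    h = r[1] - r[0]
    v = -np.exp(-r / D) * np.cos(r / D) / r + l * (l + 1) / (2 * r**2)
    diag = 1.0 / h**2 + v                     # -(1/2) u'' discretized
    off = -0.5 / h**2 * np.ones(n - 1)
    vals = eigh_tridiagonal(diag, off, select="i", select_range=(0, k - 1))[0]
    return vals

# Weak-screening limit should approach the hydrogen levels -0.5, -0.125, ...
print(dhc_levels(1e6))   # ~[-0.5, -0.125, -0.056]
print(dhc_levels(5.0))   # shallower levels; fewer bound states survive
```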
1992-05-01
methodology, knowledge acquisition, requirements definition, information systems, information engineering, systems engineering... and knowledge resources. Like manpower, materials, and machines, information and knowledge assets are recognized as vital resources that can be... evolve towards an information-integrated enterprise. These technologies are designed to leverage information and knowledge resources as the key
Verhulst, Sarah; Altoè, Alessandro; Vasilkov, Viacheslav
2018-03-01
Models of the human auditory periphery range from very basic functional descriptions of auditory filtering to detailed computational models of cochlear mechanics, inner-hair cell (IHC), auditory-nerve (AN) and brainstem signal processing. It is challenging to include detailed physiological descriptions of cellular components into human auditory models because single-cell data stems from invasive animal recordings while human reference data only exists in the form of population responses (e.g., otoacoustic emissions, auditory evoked potentials). To embed physiological models within a comprehensive human auditory periphery framework, it is important to capitalize on the success of basic functional models of hearing and render their descriptions more biophysical where possible. At the same time, comprehensive models should capture a variety of key auditory features, rather than fitting their parameters to a single reference dataset. In this study, we review and improve existing models of the IHC-AN complex by updating their equations and expressing their fitting parameters into biophysical quantities. The quality of the model framework for human auditory processing is evaluated using recorded auditory brainstem response (ABR) and envelope-following response (EFR) reference data from normal and hearing-impaired listeners. We present a model with 12 fitting parameters from the cochlea to the brainstem that can be rendered hearing impaired to simulate how cochlear gain loss and synaptopathy affect human population responses. The model description forms a compromise between capturing well-described single-unit IHC and AN properties and human population response features. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.
FBI Fingerprint Image Capture System High-Speed-Front-End throughput modeling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rathke, P.M.
1993-09-01
The Federal Bureau of Investigation (FBI) has undertaken a major modernization effort called the Integrated Automated Fingerprint Identification System (IAFIS). This system will provide centralized identification services using automated fingerprint, subject descriptor, mugshot, and document processing. A high-speed Fingerprint Image Capture System (FICS) is under development as part of the IAFIS program. The FICS will capture digital and microfilm images of FBI fingerprint cards for input into a central database. One FICS design supports two front-end scanning subsystems, known as the High-Speed-Front-End (HSFE) and Low-Speed-Front-End, to supply image data to a common data processing subsystem. The production rate of the HSFE is critical to meeting the FBI's fingerprint card processing schedule. A model of the HSFE has been developed to help identify the issues driving the production rate, assist in the development of component specifications, and guide the evolution of an operations plan. A description of the model development is given, the assumptions are presented, and some HSFE throughput analysis is performed.
Mott physics beyond the Brinkman-Rice scenario
NASA Astrophysics Data System (ADS)
Wysokiński, Marcin M.; Fabrizio, Michele
2017-04-01
The main flaw of the well-known Brinkman-Rice description, obtained through the Gutzwiller approximation, of the paramagnetic Mott transition in the Hubbard model is in neglecting high-energy virtual processes that generate, for instance, the antiferromagnetic exchange J ~ t²/U. Here, we propose a way to capture those processes by combining the Brinkman-Rice approach with a variational Schrieffer-Wolff transformation, and apply this method to study the single-band metal-to-insulator transition in a Bethe lattice with infinite coordination number, where the Gutzwiller approximation becomes exact. We indeed find for the Mott transition a description very close to the real one provided by the dynamical mean-field theory, an encouraging result in view of possible applications to more involved models.
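The superexchange scale mentioned in the abstract is the standard large-U result of second-order perturbation theory in the hopping; as a reminder (textbook derivation, with the conventional factor of 4):

```latex
% Second-order virtual hopping across the charge gap U (the high-energy
% process the Brinkman-Rice/Gutzwiller treatment neglects) yields the
% standard superexchange Hamiltonian, consistent with J ~ t^2/U above:
\[
  H_{\mathrm{eff}} = J \sum_{\langle ij\rangle}
      \left(\mathbf{S}_i\cdot\mathbf{S}_j - \tfrac14\, n_i n_j\right),
  \qquad J = \frac{4t^2}{U}.
\]
% In second order the inter-site singlet is lowered by 4t^2/U through the
% virtual doubly occupied state, while the triplet is Pauli-blocked.
```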
Describing content in middle school science curricula
NASA Astrophysics Data System (ADS)
Schwarz-Ballard, Jennifer A.
As researchers and designers, we intuitively recognize differences between curricula and describe them in terms of design strategy: project-based, laboratory-based, modular, traditional, and textbook, among others. We assume that practitioners recognize the differences in how each requires that students use knowledge; however, these intuitive differences have not been captured or systematically described by the existing languages for describing learning goals. In this dissertation I argue that we need new ways of capturing relationships among elements of content, and propose a theory that describes some of the important differences in how students reason in differently designed curricula and activities. Educational researchers and curriculum designers have taken a variety of approaches to laying out learning goals for science. Through an analysis of existing descriptions of learning goals I argue that to describe differences in the understanding students come away with, they need to (1) be specific about the form of knowledge, (2) incorporate both the processes through which knowledge is used and its form, and (3) capture content development across a curriculum. To show the value of inquiry curricula, learning goals need to incorporate distinctions among the variety of ways we ask students to use knowledge. Here I propose the Epistemic Structures Framework as one way to describe differences in students' reasoning that are not captured by existing descriptions of learning goals. The usefulness of the Epistemic Structures framework is demonstrated in the four curriculum case study examples in Part II of this work. The curricula in the case studies represent a range of content coverage, curriculum structure, and design rationale. They serve both to illustrate the Epistemic Structures analysis process and to make the case that it does in fact describe learning goals in a way that captures important differences in students' reasoning in differently designed curricula. Describing learning goals in terms of Epistemic Structures provides one way to define what we mean when we talk about "project-based" curricula and to demonstrate their "value added" to educators, administrators, and policy makers.
Design Rules and Analysis of a Capture Mechanism for Rendezvous between a Space Tether and Payload
NASA Technical Reports Server (NTRS)
Sorensen, Kirk F.; Canfield, Stephen L.; Norris, Marshall A.
2006-01-01
Momentum-exchange/electrodynamic reboost (MXER) tether systems have been proposed to serve as an "upper stage in space". A MXER tether station would boost spacecraft from low Earth orbit to a high-energy orbit quickly, like a high-thrust rocket. Then, it would slowly rebuild its orbital momentum through electrodynamic thrust, minimizing the use of propellant. One of the primary challenges in developing a momentum-exchange/electrodynamic reboost tether system, as identified by the 2003 MXER Technology Assessment Group, is the development of a mechanism that will enable the processes of capture, carry, and release of a payload by the rotating tether, as required by the MXER tether approach. This paper presents a concept that achieves the desired goals of the capture system. This solution is presented as a multi-DOF (degree-of-freedom) capture mechanism with nearly passive operation that features matching of the capture space to the expected window of capture error, efficient use of mass, and nearly passive actuation during the capture process. This paper describes the proposed capture mechanism concept and provides an evaluation of the concept through a dynamic model and experimental tests performed on a prototype article of the mechanism in a dynamically similar environment. It also develops a set of rules to guide the design of such a capture mechanism based on analytical and experimental analyses. The primary contributions of this paper are a description of the proposed capture mechanism concept, a collection of rules to guide its design, and empirical and model information that can be used to evaluate the capability of the concept.
A description of the first live Poouli captured
Baker, P.E.
1998-01-01
The Poouli (Melamprosops phaeosoma) is an endangered Hawaiian honeycreeper found only on Maui, Hawaii. It was rare at the time of its discovery in 1973, but by 1997 was on the brink of extinction with fewer than six individuals left. Two specimens were collected for the description of the species, but both proved to be immature by comparison with a pair of adults at a nest. Until 1997 no Poouli had ever been captured alive, and consequently descriptions of adult Poouli were produced from field observations. In 1997, I captured an adult male Poouli which is described here for the first time. Detailed comparisons of the plumage of this adult with that of an immature specimen and previous descriptions of the species are discussed in this paper, as are differences in plumage between adult and immature males and females that may aid the sexing and ageing of birds in the field.
Understanding and Leveraging a Supplier’s CMMI Efforts: A Guidebook for Acquirers (Revised for V1.3)
2011-09-01
and SCAMPI (e.g., ISO/IEC 15288, 12207, 15504; ISO 9001, EIA 632, and IEEE 1220), or if there are processes to be implemented that are not captured... process descriptions and tailoring as well as any formal audit results. ISO 9001:2008 is a quality management standard for development created and... maintained by the International Organisation for Standardisation (ISO). The American National Standard equivalent is ANSI/ISO/ASQ Q9001-2008
New quasibound states of the compound nucleus in α -particle capture by the nucleus
NASA Astrophysics Data System (ADS)
Maydanyuk, Sergei P.; Zhang, Peng-Ming; Zou, Li-Ping
2017-07-01
We generalize the theory of nuclear decay and capture of Gamow, which is based on tunneling through the barrier and internal oscillations inside the nucleus. In our formalism an additional factor is obtained which describes the distribution of the wave function of the α particle inside the nuclear region. We discover new most-stable states (called quasibound states) of the compound nucleus (CN) formed during the capture of an α particle by the nucleus. With a simple example, we explain why these states cannot appear in traditional calculations of the α capture cross sections based on monotonic penetrabilities of a barrier, but they appear in a complete description of the evolution of the CN. Our result is obtained by a complete description of the CN evolution, which has the advantages of (1) a clear picture of the formation of the CN and its disintegration, (2) a detailed quantum description of the CN, (3) tests of the calculated amplitudes based on quantum mechanics (not realized in other approaches), and (4) high accuracy of calculations (not achieved in other approaches). These peculiarities are shown with the capture reaction α + 44Ca. We predict quasibound energy levels and determine fusion probabilities for this reaction. The difference between our approach and the theory of quasistationary states with complex energies applied to α capture is also discussed. We show (1) that this theory does not provide calculations of the cross section of α capture (according to modern models of α capture), in contrast with our formalism, and (2) that the two approaches describe different states of the α capture process (for the same α-nucleus potential).
NASA Astrophysics Data System (ADS)
Gomez, John A.; Henderson, Thomas M.; Scuseria, Gustavo E.
2017-11-01
In electronic structure theory, restricted single-reference coupled cluster (CC) captures weak correlation but fails catastrophically under strong correlation. Spin-projected unrestricted Hartree-Fock (SUHF), on the other hand, misses weak correlation but captures a large portion of strong correlation. The theoretical description of many important processes, e.g. molecular dissociation, requires a method capable of accurately capturing both weak and strong correlation simultaneously, and would likely benefit from a combined CC-SUHF approach. Based on what we have recently learned about SUHF written as particle-hole excitations out of a symmetry-adapted reference determinant, we here propose a heuristic CC doubles model to attenuate the dominant spin collective channel of the quadratic terms in the CC equations. Proof-of-principle results presented here are encouraging and point to several paths forward for improving the method further.
Theoretical investigation of the electron capture and loss processes in the collisions of He2+ + Ne.
Hong, Xuhai; Wang, Feng; Jiao, Yalong; Su, Wenyong; Wang, Jianguo; Gou, Bingcong
2013-08-28
Based on time-dependent density functional theory, a method is developed to study ion-atom collision dynamics, which self-consistently couples the quantum mechanical description of the electron dynamics with the classical treatment of the ion motion. Employing a real-time, real-space method, a coordinate-space translation technique is introduced that allows one to focus on the region of the target or the projectile, depending on the process of interest. Benchmark calculations are performed for He(2+) + Ne collisions, and the time evolution of the electron density distribution is monitored, which provides interesting details of the interaction dynamics between the electrons and the ion cores. The cross sections of single- and many-electron capture and loss have been calculated in the energy range of 1-1000 keV/amu, and the results show good agreement with the available experiments over a wide range of impact energies.
Role of nuclear reactions on stellar evolution of intermediate-mass stars
NASA Astrophysics Data System (ADS)
Möller, H.; Jones, S.; Fischer, T.; Martínez-Pinedo, G.
2018-01-01
The evolution of intermediate-mass stars (8-12 solar masses) represents one of the most challenging subjects in nuclear astrophysics. Their final fate is highly uncertain and strongly model dependent. They can become white dwarfs, they can undergo electron-capture or core-collapse supernovae, or they might even proceed towards explosive oxygen burning and a subsequent thermonuclear explosion. We believe that an accurate description of nuclear reactions is crucial for the determination of the pre-supernova structure of these stars. We argue that, due to the possible development of an oxygen deflagration, a hydrodynamic description has to be used. We implement a nuclear reaction network with ∼200 nuclear species into the implicit hydrodynamic code AGILE. The reaction network considers all relevant nuclear electron captures and beta-decays. For selected relevant nuclear species, we include a set of updated reaction rates, whose role in the evolution of the stellar core we discuss using selected stellar models as examples. We find that the final fate of these intermediate-mass stars depends sensitively on the density threshold for the weak processes that deleptonize the core.
NASA Astrophysics Data System (ADS)
Escher, Jutta
2016-09-01
Cross sections for compound-nuclear reactions involving unstable targets are important for many applications, but can often not be measured directly. Several indirect methods have recently been proposed to determine neutron capture cross sections for unstable isotopes. These methods aim at constraining statistical calculations of capture cross sections with data obtained from the decay of the compound nucleus relevant to the desired reaction. Each method produces this compound nucleus in a different manner (via a light-ion reaction, a photon-induced reaction, or β decay) and requires additional ingredients to yield the sought-after cross section. This contribution focuses on the process of determining capture cross sections from inelastic scattering and transfer experiments. Specifically, theoretical descriptions of the (p,d) transfer reaction have been developed to complement recent measurements in the Zr-Y region. The procedure for obtaining constraints for unknown capture cross sections is illustrated. The main advantages and challenges of this approach are compared to those of the proposed alternatives. This work is performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
Advances of lab-on-a-chip in isolation, detection and post-processing of circulating tumour cells.
Yu, Ling; Ng, Shu Rui; Xu, Yang; Dong, Hua; Wang, Ying Jun; Li, Chang Ming
2013-08-21
Circulating tumour cells (CTCs) are shed by primary tumours and are found in the peripheral blood of patients with metastatic cancers. Recent studies have shown that the number of CTCs corresponds with disease severity and prognosis. Therefore, detection and further functional analysis of CTCs are important for biomedical science, early diagnosis of cancer metastasis and tracking treatment efficacy in cancer patients, especially in point-of-care applications. Over the last few years, there has been an increasing shift towards not only capturing and detecting these rare cells, but also ensuring their viability for post-processing, such as cell culture and genetic analysis. High-throughput lab-on-a-chip (LOC) systems have been developed to process and analyse heterogeneous, real patient samples while yielding profound insights into cancer biology. In this review, we highlight how miniaturisation strategies together with nanotechnologies have been used to advance LOC for capturing, separating, enriching and detecting different CTCs efficiently, while meeting the challenges of cell viability, high-throughput multiplex or single-cell detection and post-processing. We begin this survey with an introduction to CTC biology, followed by a description of the use of various materials, microstructures and nanostructures in the design of LOC to achieve miniaturisation, as well as how various CTC capture or separation strategies can enhance cell capture and enrichment efficiencies, purity and viability. The significant progress of various nanotechnology-based detection techniques to achieve high sensitivities and low detection limits for viable CTCs and/or to enable CTC post-processing is presented, and the fundamental insights are also discussed. Finally, the challenges and perspectives of the technologies are enumerated.
Thermostable Carbonic Anhydrases in Biotechnological Applications
Di Fiore, Anna; Alterio, Vincenzo; Monti, Simona M.; De Simone, Giuseppina; D’Ambrosio, Katia
2015-01-01
Carbonic anhydrases are ubiquitous metallo-enzymes which catalyze the reversible hydration of carbon dioxide into bicarbonate ions and protons. Recent years have seen an increasing interest in the utilization of these enzymes in CO2 capture and storage processes. However, since this use is greatly limited by the harsh conditions required in these processes, the employment of thermostable enzymes, both those isolated from thermophilic organisms and those obtained by protein engineering techniques, represents an interesting possibility. In this review we provide an extensive description of the thermostable carbonic anhydrases reported so far and the main processes in which these enzymes have found an application. PMID:26184158
Costa, Gabriela M C; Gualda, Dulce M R
2010-12-01
The article discusses anthropology, ethnographic method, and narrative as possible ways of coming to know subjects' experiences and the feelings they attribute to them. From an anthropological perspective, the sociocultural universe is taken as a point of reference in understanding the meaning of the processes of health and sickness, using a dense ethnographic description from an interpretivist analytical approach. In this context, narratives afford possible paths to understanding how subjective human experiences are shared and how behavior is organized, with a special focus on meaning, the process by which stories are produced, relations between narrator and other subjects, processes of knowledge, and the manifold ways in which experience can be captured.
Connecting Provenance with Semantic Descriptions in the NASA Earth Exchange (NEX)
NASA Astrophysics Data System (ADS)
Votava, P.; Michaelis, A.; Nemani, R. R.
2012-12-01
NASA Earth Exchange (NEX) is a data, modeling and knowledge collaboratory that houses NASA satellite data, climate data and ancillary data, where a focused community may come together to share modeling and analysis codes, scientific results, knowledge and expertise on a centralized platform. Some of the main goals of NEX are transparency and repeatability, and to that end we have been adding components that enable tracking of the provenance of both scientific processes and the datasets these processes produce. As scientific processes become more complex, they are often developed collaboratively, and it becomes increasingly important for the research team to be able to track the development of the process and the datasets that are produced along the way. Additionally, we want to be able to link the processes and datasets developed on NEX to existing information and knowledge, so that users can query and compare the provenance of any dataset or process with regard to component-specific attributes such as data quality, geographic location, related publications, user comments and annotations, etc. We have developed several ontologies that describe the datasets and workflow components available on NEX using the OWL ontology language, as well as a simple ontology that provides a linking mechanism to the collected provenance information. The provenance is captured in two ways: we utilize the existing provenance infrastructure of VisTrails, which is used as the workflow engine on NEX, and we extend the captured provenance using the PROV data model expressed through the PROV-O ontology. We do this in order to link and query the provenance more easily in the context of the existing NEX information and knowledge. The captured provenance graph is processed and stored using RDFLib with a MySQL backend and can be queried using either RDFLib or SPARQL. As a concrete example, we show how this information is captured during an anomaly detection process in large satellite datasets.
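As an illustration of the PROV-O layer described here, a minimal RDFLib sketch follows; the namespace, resource names, and the anomaly-detection example are hypothetical stand-ins, not NEX's actual identifiers.

```python
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, XSD

PROV = Namespace("http://www.w3.org/ns/prov#")
NEX = Namespace("http://example.org/nex/")   # illustrative namespace

g = Graph()
g.bind("prov", PROV)

run = NEX["anomaly-detection-run-42"]        # hypothetical resources
src = NEX["modis-ndvi-tile"]
out = NEX["anomaly-map"]

g.add((run, RDF.type, PROV.Activity))
g.add((src, RDF.type, PROV.Entity))
g.add((out, RDF.type, PROV.Entity))
g.add((run, PROV.used, src))
g.add((out, PROV.wasGeneratedBy, run))
g.add((out, PROV.generatedAtTime,
       Literal("2012-06-01T12:00:00", datatype=XSD.dateTime)))

# Lineage query via SPARQL: which inputs does the anomaly map depend on?
q = """SELECT ?input WHERE {
         ?out prov:wasGeneratedBy ?run .
         ?run prov:used ?input . }"""
for row in g.query(q, initBindings={"out": out}):
    print(row.input)
```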
A Description for Rock Joint Roughness Based on Terrestrial Laser Scanner and Image Analysis
Ge, Yunfeng; Tang, Huiming; Eldin, M. A. M Ez; Chen, Pengyu; Wang, Liangqing; Wang, Jinge
2015-01-01
Shear behavior of rock mass greatly depends upon the rock joint roughness, which is generally characterized by anisotropy, scale effect and interval effect. A new index able to capture all three features, namely the brightness area percentage (BAP), is presented to express the roughness based on synthetic illumination of a digital terrain model derived from a terrestrial laser scanner (TLS). Since only the tiny planes facing opposite to the shear direction contribute resistance during shear failure, these planes are recognized through image processing by taking advantage of the fact that they appear brighter than other planes under the same light source. A comparison with existing roughness indexes and two case studies are presented to test the performance of the BAP description. The results reveal that the rock joint roughness estimated by the presented description matches existing roughness methods well and displays wider applicability. PMID:26585247
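A toy numpy re-creation of the BAP idea, under assumed choices (Lambertian shading, a mean-brightness threshold, a synthetic surface), conveys the computation: illuminate the joint-surface model from the direction opposing shear and report the fraction of bright facets.

```python
import numpy as np

def brightness_area_percentage(z, dx, shear_azimuth_deg, sun_alt_deg=30.0):
    """Toy BAP: light the surface from the direction opposite to shear and
    report the percentage of bright facets. Shading and threshold are
    illustrative choices, not the published algorithm."""
    dzdy, dzdx = np.gradient(z, dx)
    n = np.dstack((-dzdx, -dzdy, np.ones_like(z)))   # facet normals
    n /= np.linalg.norm(n, axis=2, keepdims=True)
    az = np.radians(shear_azimuth_deg + 180.0)       # light opposes shear
    alt = np.radians(sun_alt_deg)
    light = np.array([np.cos(alt) * np.sin(az),
                      np.cos(alt) * np.cos(az),
                      np.sin(alt)])
    shade = np.clip(n @ light, 0.0, None)            # Lambertian brightness
    return 100.0 * np.mean(shade > shade.mean())

rng = np.random.default_rng(0)
z = rng.normal(scale=0.5, size=(200, 200)).cumsum(axis=1) * 0.01  # synthetic joint
print(brightness_area_percentage(z, dx=0.5, shear_azimuth_deg=90.0))
```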
TARGET: Rapid Capture of Process Knowledge
NASA Technical Reports Server (NTRS)
Ortiz, C. J.; Ly, H. V.; Saito, T.; Loftin, R. B.
1993-01-01
TARGET (Task Analysis/Rule Generation Tool) represents a new breed of tool that blends graphical process flow modeling capabilities with the function of a top-down reporting facility. Since NASA personnel frequently perform tasks that are primarily procedural in nature, TARGET models mission or task procedures and generates hierarchical reports as part of the process capture and analysis effort. Historically, capturing knowledge has proven to be one of the greatest barriers to the development of intelligent systems. Current practice generally requires lengthy interactions between the expert whose knowledge is to be captured and the knowledge engineer whose responsibility is to acquire and represent the expert's knowledge in a useful form. Although much research has been devoted to the development of methodologies and computer software to aid in the capture and representation of some types of knowledge, procedural knowledge has received relatively little attention. In essence, TARGET is one of the first tools of its kind, commercial or institutional, that is designed to support this type of knowledge capture undertaking. This paper will describe the design and development of TARGET for the acquisition and representation of procedural knowledge. The strategies employed by TARGET to support use by knowledge engineers, subject matter experts, programmers and managers will be discussed. This discussion includes the method by which the tool employs its graphical user interface to generate a task hierarchy report. Next, the approach to generate production rules for incorporation in and development of a CLIPS based expert system will be elaborated. TARGET also permits experts to visually describe procedural tasks as a common medium for knowledge refinement by the expert community and knowledge engineer making knowledge consensus possible. The paper briefly touches on the verification and validation issues facing the CLIPS rule generation aspects of TARGET. A description of efforts to support TARGET's interoperability issues on PCs, Macintoshes and UNIX workstations concludes the paper.
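To make the rule-generation step concrete, here is a minimal Python sketch that turns a captured, linear task sequence into CLIPS-style productions; the fact and rule templates are illustrative, not TARGET's actual output format.

```python
# Illustrative CLIPS rule generation from a captured task sequence.
def clips_rules(task_name, steps):
    rules = []
    for i, step in enumerate(steps):
        # Each step fires once its predecessor is complete.
        prior = f"(completed {steps[i - 1]})" if i else f"(start {task_name})"
        rules.append(
            f"(defrule do-{step}\n"
            f"   {prior}\n"
            f"   =>\n"
            f'   (printout t "perform: {step}" crlf)\n'
            f"   (assert (completed {step})))"
        )
    return "\n\n".join(rules)

print(clips_rules("power-up", ["check-breakers", "enable-bus", "verify-telemetry"]))
```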
Conceptual-level workflow modeling of scientific experiments using NMR as a case study
Verdi, Kacy K; Ellis, Heidi JC; Gryk, Michael R
2007-01-01
Background Scientific workflows improve the process of scientific experiments by making computations explicit, underscoring data flow, and emphasizing the participation of humans in the process when intuition and human reasoning are required. Workflows for experiments also highlight transitions among experimental phases, allowing intermediate results to be verified and supporting the proper handling of semantic mismatches and different file formats among the various tools used in the scientific process. Thus, scientific workflows are important for the modeling and subsequent capture of bioinformatics-related data. While much research has been conducted on the implementation of scientific workflows, the initial process of actually designing and generating the workflow at the conceptual level has received little consideration. Results We propose a structured process to capture scientific workflows at the conceptual level that allows workflows to be documented efficiently, results in concise models of the workflow and more-correct workflow implementations, and provides insight into the scientific process itself. The approach uses three modeling techniques to model the structural, data flow, and control flow aspects of the workflow. The domain of biomolecular structure determination using Nuclear Magnetic Resonance spectroscopy is used to demonstrate the process. Specifically, we show the application of the approach to capture the workflow for the process of conducting biomolecular analysis using Nuclear Magnetic Resonance (NMR) spectroscopy. Conclusion Using the approach, we were able to accurately document, in a short amount of time, numerous steps in the process of conducting an experiment using NMR spectroscopy. The resulting models are correct and precise, as outside validation of the models identified only minor omissions in the models. In addition, the models provide an accurate visual description of the control flow for conducting biomolecular analysis using NMR spectroscopy experiment. PMID:17263870
Conceptual-level workflow modeling of scientific experiments using NMR as a case study.
Verdi, Kacy K; Ellis, Heidi JC; Gryk, Michael R
2007-01-30
Scientific workflows improve the process of scientific experiments by making computations explicit, underscoring data flow, and emphasizing the participation of humans in the process when intuition and human reasoning are required. Workflows for experiments also highlight transitions among experimental phases, allowing intermediate results to be verified and supporting the proper handling of semantic mismatches and different file formats among the various tools used in the scientific process. Thus, scientific workflows are important for the modeling and subsequent capture of bioinformatics-related data. While much research has been conducted on the implementation of scientific workflows, the initial process of actually designing and generating the workflow at the conceptual level has received little consideration. We propose a structured process to capture scientific workflows at the conceptual level that allows workflows to be documented efficiently, results in concise models of the workflow and more-correct workflow implementations, and provides insight into the scientific process itself. The approach uses three modeling techniques to model the structural, data flow, and control flow aspects of the workflow. The domain of biomolecular structure determination using Nuclear Magnetic Resonance spectroscopy is used to demonstrate the process. Specifically, we show the application of the approach to capture the workflow for the process of conducting biomolecular analysis using Nuclear Magnetic Resonance (NMR) spectroscopy. Using the approach, we were able to accurately document, in a short amount of time, numerous steps in the process of conducting an experiment using NMR spectroscopy. The resulting models are correct and precise, as outside validation of the models identified only minor omissions in the models. In addition, the models provide an accurate visual description of the control flow for conducting biomolecular analysis using NMR spectroscopy experiment.
Geometric Brownian Motion with Tempered Stable Waiting Times
NASA Astrophysics Data System (ADS)
Gajda, Janusz; Wyłomańska, Agnieszka
2012-08-01
One of the earliest models used for the description of asset prices is the Black-Scholes model. It is based on geometric Brownian motion and was used as a tool for pricing various financial instruments. However, when it comes to data description, geometric Brownian motion is not capable of capturing many properties of present financial markets, for instance periods of constant values. Therefore we propose an alternative approach based on subordinated tempered stable geometric Brownian motion, which is a combination of the popular geometric Brownian motion and an inverse tempered stable subordinator. In this paper we introduce the mentioned process and present its main properties. We also propose an estimation procedure and calibrate the analyzed system to real data.
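A short numpy simulation sketch conveys the construction: exact tempered-stable subordinator increments via Kanter's positive-stable sampler with exponential tempering by rejection, inversion of the subordinator on a grid, and a GBM evaluated in the resulting operational time. All parameter values are illustrative, not calibrated to any market.

```python
import numpy as np

rng = np.random.default_rng(1)

def tempered_stable_increments(alpha, lam, d_tau, n):
    """Increments of a tempered stable subordinator: Kanter's sampler for a
    positive alpha-stable jump, thinned by the tempering factor exp(-lam*x)."""
    out = np.empty(n)
    for i in range(n):
        while True:
            u = rng.uniform(0.0, np.pi)
            e = rng.exponential()
            s = (np.sin(alpha * u) / np.sin(u) ** (1 / alpha)
                 * (np.sin((1 - alpha) * u) / e) ** ((1 - alpha) / alpha))
            jump = d_tau ** (1 / alpha) * s
            if rng.uniform() <= np.exp(-lam * jump):   # tempering by rejection
                out[i] = jump
                break
    return out

alpha, lam, mu, sigma, x0 = 0.7, 1.0, 0.05, 0.2, 100.0   # illustrative
n, d_tau = 5000, 0.01
tau = np.arange(1, n + 1) * d_tau
T = np.cumsum(tempered_stable_increments(alpha, lam, d_tau, n))  # subordinator
B = np.cumsum(rng.normal(0.0, np.sqrt(d_tau), n))  # BM in operational time

t_grid = np.linspace(0.0, 0.9 * T[-1], 500)
idx = np.searchsorted(T, t_grid)     # inverse subordinator: E(t) = tau[idx]
E = tau[idx]
X = x0 * np.exp((mu - 0.5 * sigma**2) * E + sigma * B[idx])
```

The flat segments of X over intervals where T jumps are exactly the constant-value periods the abstract points to.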
Artemyev, A V; Neishtadt, A I; Zelenyi, L M; Vainchtein, D L
2010-12-01
We present an analytical and numerical study of the surfatron acceleration of nonrelativistic charged particles by electromagnetic waves. The acceleration is caused by capture of particles into resonance with one of the waves. We investigate capture for systems with one or two waves and provide conditions under which the obtained results can be applied to systems with more than two waves. In the case of a single wave, the once captured particles never leave the resonance and their velocity grows linearly with time. However, if there are two waves in the system, the upper bound of the energy gain may exist and we find the analytical value of that bound. We discuss several generalizations including the relativistic limit, different wave amplitudes, and a wide range of the waves' wavenumbers. The obtained results are used for qualitative description of some phenomena observed in the Earth's magnetosphere. © 2010 American Institute of Physics.
Introducing Explorer of Taxon Concepts with a case study on spider measurement matrix building.
Cui, Hong; Xu, Dongfang; Chong, Steven S; Ramirez, Martin; Rodenhausen, Thomas; Macklin, James A; Ludäscher, Bertram; Morris, Robert A; Soto, Eduardo M; Koch, Nicolás Mongiardino
2016-11-17
Taxonomic descriptions are traditionally composed in natural language and published in a format that cannot be directly used by computers. The Exploring Taxon Concepts (ETC) project has been developing a set of web-based software tools that convert morphological descriptions published in telegraphic style to character data that can be reused and repurposed. This paper introduces the first semi-automated pipeline, to our knowledge, that converts morphological descriptions into taxon-character matrices to support systematics and evolutionary biology research. We then demonstrate and evaluate the use of the ETC Input Creation - Text Capture - Matrix Generation pipeline to generate body part measurement matrices from a set of 188 spider morphological descriptions and report the findings. From the given set of spider taxonomic publications, two versions of input (original and normalized) were generated and used by the ETC Text Capture and ETC Matrix Generation tools. The tools produced two corresponding spider body part measurement matrices, and the matrix from the normalized input was found to be much more similar to a gold standard matrix hand-curated by the scientist co-authors. Special conventions utilized in the original descriptions (e.g., the omission of measurement units) were attributed to the lower performance of using the original input. The results show that simple normalization of the description text greatly increased the quality of the machine-generated matrix and reduced edit effort. The machine-generated matrix also helped identify issues in the gold standard matrix. ETC Text Capture and ETC Matrix Generation are low-barrier and effective tools for extracting measurement values from spider taxonomic descriptions and are more effective when the descriptions are self-contained. Special conventions that make the description text less self-contained challenge automated extraction of data from biodiversity descriptions and hinder the automated reuse of the published knowledge. The tools will be updated to support new requirements revealed in this case study.
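The flavor of the Text Capture to Matrix Generation step can be shown with a toy Python extraction; the taxa, the regex, and the appended "(mm, assumed)" unit label (echoing the omitted-units convention noted above) are illustrative, and ETC itself uses a much richer NLP pipeline.

```python
import re

# Toy extraction of body-part measurements from telegraphic descriptions.
descriptions = {
    "Aranea exemplaris": "Total length 5.20. Carapace length 2.10. Carapace width 1.80.",
    "Aranea ficticia":   "Total length 4.95. Carapace length 1.95. Carapace width 1.60.",
}

pattern = re.compile(
    r"(total length|carapace length|carapace width)\s+(\d+(?:\.\d+)?)",
    re.IGNORECASE)

matrix = {}
for taxon, text in descriptions.items():
    row = {}
    for character, value in pattern.findall(text):
        # Units are omitted in the source text, so the assumption is recorded.
        row[f"{character.lower()} (mm, assumed)"] = float(value)
    matrix[taxon] = row

for taxon, row in matrix.items():
    print(taxon, row)
```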
Five task clusters that enable efficient and effective digitization of biological collections
Nelson, Gil; Paul, Deborah; Riccardi, Gregory; Mast, Austin R.
2012-01-01
This paper describes and illustrates five major clusters of related tasks (herein referred to as task clusters) that are common to efficient and effective practices in the digitization of biological specimen data and media. Examples of these clusters come from the observation of diverse digitization processes. The staff of iDigBio (the U.S. National Science Foundation's National Resource for Advancing Digitization of Biological Collections) visited active biological and paleontological collections digitization programs for the purpose of documenting and assessing current digitization practices and tools. These observations identified five task clusters that comprise the digitization process leading up to data publication: (1) pre-digitization curation and staging, (2) specimen image capture, (3) specimen image processing, (4) electronic data capture, and (5) georeferencing locality descriptions. While not all institutions are completing each of these task clusters for each specimen, these clusters describe a composite picture of the digitization of biological and paleontological specimens across the programs that were observed. We describe these clusters, three workflow patterns that dominate the implementation of these clusters, and offer a set of workflow recommendations for digitization programs. PMID:22859876
Steele Gray, Carolyn; Khan, Anum Irfan; Kuluski, Kerry; McKillop, Ian; Sharpe, Sarah; Bierman, Arlene S; Lyons, Renee F; Cott, Cheryl
2016-02-18
Many mHealth technologies do not meet the needs of patients with complex chronic disease and disabilities (CCDDs), who are among the highest users of health systems worldwide. Furthermore, many of the development methodologies used in the creation of mHealth and eHealth technologies lack the ability to embrace users with CCDD in the specification process. This paper describes how we adopted and modified development techniques to create the electronic Patient-Reported Outcomes (ePRO) tool, a patient-centered mHealth solution to help improve primary health care for patients experiencing CCDD. This paper describes the design and development approach, specifically the process of incorporating qualitative research methods into user-centered design approaches to create the ePRO tool. Key lessons learned are offered as a guide for other eHealth and mHealth research and technology developers working with complex patient populations and their primary health care providers. Guided by user-centered design principles, interpretive descriptive qualitative research methods were adopted to capture user experiences through interviews and working groups. Consistent with interpretive descriptive methods, an iterative analysis technique was used to generate findings, which were then organized in relation to the tool design and function to help systematically inform modifications to the tool. User feedback captured and analyzed through this method was used to challenge the design and inform the iterative development of the tool. Interviews with primary health care providers (n=7) and content experts (n=6), and four focus groups with patients and carers (n=14), along with a PICK analysis (Possible, Implementable, [to be] Challenged, [to be] Killed), guided development of the first prototype. The initial prototype was presented in three design working groups with patients/carers (n=5), providers (n=6), and experts (n=5). Working group findings were broken down into categories of what works and what does not work to inform modifications to the prototype. This latter phase led to a major shift in the purpose and design of the prototype, validating the importance of using iterative codesign processes. Interpretive descriptive methods allow for an understanding of the user experiences of patients with CCDD, their carers, and primary care providers. Qualitative methods help to capture and interpret user needs, and identify contextual barriers and enablers to tool adoption, informing a redesign to better suit the needs of this diverse user group. This study illustrates the value of adopting interpretive descriptive methods into user-centered mHealth tool design and can also serve to inform the design of other eHealth technologies. Our approach is particularly useful in requirements determination when developing for a complex user group and their health care providers.
NASA Astrophysics Data System (ADS)
Hansen, K. M.; Christensen, J. H.; Brandt, J.; Frohn, L. M.; Geels, C.
2004-07-01
The Danish Eulerian Hemispheric Model (DEHM) is a 3-D dynamical atmospheric transport model originally developed to describe the atmospheric transport of sulphur into the Arctic. A new version of the model, DEHM-POP, developed to study the atmospheric transport and environmental fate of persistent organic pollutants (POPs), is presented. During environmental cycling, POPs can be deposited and re-emitted several times before reaching a final destination. A description of the exchange processes between the land/ocean surfaces and the atmosphere is included in the model to account for this multi-hop transport. The α-isomer of the pesticide hexachlorocyclohexane (α-HCH) is used as a tracer in the model development. The structure of the model and the processes included are described in detail. The results from a model simulation of the atmospheric transport for the years 1991 to 1998 are presented and evaluated against measurements. The annual averaged atmospheric concentration of α-HCH for the 1990s is well described by the model; however, the shorter-term average concentrations for most of the stations are not well captured. This indicates that the present simple surface description needs to be refined to get a better description of the air-surface exchange processes of POPs.
NASA Astrophysics Data System (ADS)
Kandel, D. D.; Western, A. W.; Grayson, R. B.
2004-12-01
Mismatches in scale between the fundamental processes, the model and supporting data are a major limitation in hydrologic modelling. Surface runoff generation via infiltration excess and the process of soil erosion are fundamentally short time-scale phenomena and their average behaviour is mostly determined by the short time-scale peak intensities of rainfall. Ideally, these processes should be simulated using time-steps of the order of minutes to appropriately resolve the effect of rainfall intensity variations. However, sub-daily data support is often inadequate and the processes are usually simulated by calibrating daily (or even coarser) time-step models. Generally process descriptions are not modified but rather effective parameter values are used to account for the effect of temporal lumping, assuming that the effect of the scale mismatch can be counterbalanced by tuning the parameter values at the model time-step of interest. Often this results in parameter values that are difficult to interpret physically. A similar approach is often taken spatially. This is problematic as these processes generally operate or interact non-linearly. This indicates a need for better techniques to simulate sub-daily processes using daily time-step models while still using widely available daily information. A new method applicable to many rainfall-runoff-erosion models is presented. The method is based on temporal scaling using statistical distributions of rainfall intensity to represent sub-daily intensity variations in a daily time-step model. This allows the effect of short time-scale nonlinear processes to be captured while modelling at a daily time-step, which is often attractive due to the wide availability of daily forcing data. The approach relies on characterising the rainfall intensity variation within a day using a cumulative distribution function (cdf). This cdf is then modified by various linear and nonlinear processes typically represented in hydrological and erosion models. The statistical description of sub-daily variability is thus propagated through the model, allowing the effects of variability to be captured in the simulations. This results in cdfs of various fluxes, the integration of which over a day gives respective daily totals. Using 42-plot-years of surface runoff and soil erosion data from field studies in different environments from Australia and Nepal, simulation results from this cdf approach are compared with the sub-hourly (2-minute for Nepal and 6-minute for Australia) and daily models having similar process descriptions. Significant improvements in the simulation of surface runoff and erosion are achieved, compared with a daily model that uses average daily rainfall intensities. The cdf model compares well with a sub-hourly time-step model. This suggests that the approach captures the important effects of sub-daily variability while utilizing commonly available daily information. It is also found that the model parameters are more robustly defined using the cdf approach compared with the effective values obtained at the daily scale. This suggests that the cdf approach may offer improved model transferability spatially (to other areas) and temporally (to other periods).
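The core idea can be illustrated with a small worked sketch. Assuming an exponential within-day intensity distribution and a constant infiltration capacity (both illustrative choices, not the paper's model), integrating the runoff flux over the intensity distribution recovers runoff that a daily-mean model misses entirely:

```python
import numpy as np

# A minimal sketch of the cdf idea (not the authors' full model): within-day
# rainfall intensity is represented by a distribution rather than the daily
# mean, and a nonlinear flux (infiltration-excess runoff) is integrated over
# that distribution. Assumed: exponential intensity distribution and a
# constant infiltration capacity f_c; both are illustrative choices.
mean_intensity = 4.0   # mm/h, daily mean rainfall intensity
f_c = 6.0              # mm/h, infiltration capacity
hours = 24.0

# Discretize the intensity distribution (exponential with the given mean).
i = np.linspace(0, 60, 2001)                    # intensity grid, mm/h
pdf = np.exp(-i / mean_intensity) / mean_intensity

# Runoff generation is nonlinear: only intensity above f_c produces runoff.
runoff_flux = np.maximum(i - f_c, 0.0)

# Daily runoff from the distribution vs. from the daily mean intensity.
runoff_cdf_model = hours * np.trapz(runoff_flux * pdf, i)
runoff_daily_model = hours * max(mean_intensity - f_c, 0.0)
print(runoff_cdf_model, runoff_daily_model)     # ~21.4 vs 0.0 mm/day
```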
Casula, P.; Nichols, J.D.
2003-01-01
When capturing and marking of individuals is possible, the application of newly developed capture-recapture models can remove several sources of bias in the estimation of population parameters such as local abundance and sex ratio. For example, observation of distorted sex ratios in counts or captures can reflect either different abundances of the sexes or different sex-specific capture probabilities, and capture-recapture models can help distinguish between these two possibilities. Robust design models and a model selection procedure based on information-theoretic methods were applied to study the local population structure of the endemic Sardinian chalk hill blue butterfly, Polyommatus coridon gennargenti. Seasonal variations of abundance, plus daily and weather-related variations of active populations of males and females were investigated. Evidence was found of protandry and male pioneering of the breeding space. Temporary emigration probability, which describes the proportion of the population not exposed to capture (e.g. absent from the study area) during the sampling process, was estimated, differed between sexes, and was related to temperature, a factor known to influence animal activity. The correlation between temporary emigration and average daily temperature suggested interpreting temporary emigration as inactivity of animals. Robust design models were used successfully to provide a detailed description of the population structure and activity in this butterfly and are recommended for studies of local abundance and animal activity in the field.
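For readers unfamiliar with capture-recapture estimation, a minimal closed-population sketch (model M0, far simpler than the robust design models used in this study, and with made-up capture counts) shows how abundance is profiled from repeated sampling occasions:

```python
from math import lgamma, log

# A minimal closed-population sketch (model M0), much simpler than the robust
# design models applied in the study: all animals share one capture
# probability p per occasion, and the likelihood is profiled over N.
# The capture data below are illustrative, not from the butterfly study.
T = 5                               # sampling occasions
n_per_occasion = [23, 19, 25, 21, 22]
M = 60                              # distinct individuals ever captured
n = sum(n_per_occasion)             # total captures

def profile_log_lik(N: int) -> float:
    p = n / (N * T)                 # conditional MLE of p for this N
    log_binom = lgamma(N + 1) - lgamma(M + 1) - lgamma(N - M + 1)
    return log_binom + n * log(p) + (N * T - n) * log(1 - p)

N_hat = max(range(M, 400), key=profile_log_lik)
print("estimated abundance N:", N_hat)
```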
Multiple electron processes of He and Ne by proton impact
NASA Astrophysics Data System (ADS)
Terekhin, Pavel Nikolaevich; Montenegro, Pablo; Quinto, Michele; Monti, Juan; Fojon, Omar; Rivarola, Roberto
2016-05-01
A detailed investigation of multiple electron processes (single and multiple ionization, single capture, and transfer-ionization) of He and Ne by proton impact is presented at intermediate and high collision energies. Exclusive absolute cross sections for these processes have been obtained by calculating transition probabilities in the independent electron and independent event models as a function of impact parameter in the framework of the continuum distorted wave-eikonal initial state theory. A binomial analysis is employed to calculate exclusive probabilities. The comparison with available theoretical and experimental results shows that exclusive probabilities are needed for a reliable description of the experimental data. The developed approach can be used to obtain the input database for modeling multiple electron processes of charged particles passing through matter.
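The binomial analysis mentioned above can be sketched in a few lines. Given a single-electron transition probability p at a fixed impact parameter (the value below is an assumption for illustration), the exclusive probability that exactly k of N equivalent electrons undergo the process in the independent electron model is:

```python
from math import comb

# A minimal sketch of the binomial analysis: from a single-electron
# transition probability p (at a fixed impact parameter), the exclusive
# probability that exactly k of N equivalent target electrons undergo the
# process, in the independent electron model.
def exclusive_probability(p: float, N: int, k: int) -> float:
    return comb(N, k) * p**k * (1 - p) ** (N - k)

# Example: a shell of N = 8 equivalent electrons, assumed p = 0.1.
N, p = 8, 0.1
for k in range(N + 1):
    print(k, round(exclusive_probability(p, N, k), 6))
```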
Using Ontologies to Formalize Services Specifications in Multi-Agent Systems
NASA Technical Reports Server (NTRS)
Breitman, Karin Koogan; Filho, Aluizio Haendchen; Haeusler, Edward Hermann
2004-01-01
One key issue in multi-agent systems (MAS) is their ability to interact and exchange information autonomously across applications. To secure agent interoperability, designers must rely on a communication protocol that allows software agents to exchange meaningful information. In this paper we propose using ontologies as such a communication protocol. Ontologies capture the semantics of the operations and services provided by agents, allowing interoperability and information exchange in a MAS. An ontology is a formal, machine-processable representation that captures the semantics of a domain and allows meaningful information to be derived by way of logical inference. In our proposal we use a formal knowledge representation language (OWL) that translates into Description Logics (a subset of first-order logic), thus eliminating ambiguities and providing a solid base for machine-based inference. The main contribution of this approach is to make the requirements explicit and centralize the specification in a single document (the ontology itself), while providing a formal, unambiguous representation that can be processed by automated inference machines.
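As a rough illustration of capturing service semantics in machine-processable form, the sketch below uses rdflib to assert a few RDF(S) triples; the namespace and class names are hypothetical, and no Description Logics reasoning is performed, so this shows only the representation side of the proposal:

```python
from rdflib import Graph, Literal, Namespace, RDF, RDFS

# A minimal sketch of capturing agent service semantics as machine-processable
# triples. The namespace and class names are hypothetical, and rdflib is used
# here instead of a full OWL/DL reasoner, so this shows representation only,
# not logical inference.
MAS = Namespace("http://example.org/mas#")   # hypothetical ontology namespace

g = Graph()
g.bind("mas", MAS)
g.add((MAS.Service, RDF.type, RDFS.Class))
g.add((MAS.FlightBooking, RDFS.subClassOf, MAS.Service))
g.add((MAS.BookingAgent, RDF.type, MAS.FlightBooking))
g.add((MAS.BookingAgent, RDFS.label, Literal("flight booking agent")))

print(g.serialize(format="turtle"))
```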
Tropical Rainfall Measuring Mission (TRMM). Phase B: Data capture facility definition study
NASA Technical Reports Server (NTRS)
1990-01-01
The National Aeronautics and Space Administration (NASA) and the National Space Development Agency of Japan (NASDA) initiated the Tropical Rainfall Measuring Mission (TRMM) to obtain more accurate measurements of tropical rainfall than ever before. The measurements are to improve scientific understanding and knowledge of the mechanisms affecting the intra-annual and interannual variability of the Earth's climate. The TRMM is largely dependent upon the handling and processing of the data by the TRMM Ground System supporting the mission. The objective of the TRMM is to obtain three years of climatological determinations of rainfall in the tropics, culminating in data sets of 30-day average rainfall over 5-degree square areas, and associated estimates of the vertical distribution of latent heat release. The scope of this study is limited to the functions performed by the TRMM Data Capture Facility (TDCF). These functions include capturing the TRMM spacecraft return link data stream; processing the data in the real-time, quick-look, and routine production modes, as appropriate; and distributing real-time, quick-look, and production data products to users. The following topics are addressed: (1) TRMM end-to-end system description; (2) TRMM mission operations concept; (3) baseline requirements; (4) assumptions related to mission requirements; (5) external interface; (6) TDCF architecture and design options; (7) critical issues and tradeoffs; and (8) recommendation for the final TDCF selection process.
ERIC Educational Resources Information Center
Gilmore, Joanna; Feldon, David
2010-01-01
This study extends research on graduate student development by examining descriptive findings and the validity of a self-report survey designed to capture graduate students' assessments of their teaching and research skills. Descriptive findings provide some information about areas of growth among graduate students in the first years of their…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guy Cerimele
2011-09-30
This Preliminary Public Design Report consolidates for public use nonproprietary design information on the Mountaineer Commercial Scale Carbon Capture & Storage project. The report is based on the preliminary design information developed during the Phase I - Project Definition Phase, spanning the time period of February 1, 2010 through September 30, 2011. The report includes descriptions and/or discussions for: (1) DOE's Clean Coal Power Initiative, overall project & Phase I objectives, and the historical evolution of DOE and American Electric Power (AEP) sponsored projects leading to the current project; (2) Alstom's Chilled Ammonia Process (CAP) carbon capture retrofit technology and the carbon storage and monitoring system; (3) AEP's retrofit approach in terms of plant operational and integration philosophy; (4) the process island equipment and balance of plant systems for the CAP technology; (5) the carbon storage system, addressing injection wells, monitoring wells, system monitoring and controls logic philosophy; (6) the overall project estimate that includes the overnight cost estimate, cost escalation for future year expenditures, and major project risks that factored into the development of the risk-based contingency; and (7) AEP's decision to suspend further work on the project at the end of Phase I, notwithstanding its assessment that the Alstom CAP technology is ready for commercial demonstration at the intended scale.
Application of agent-based system for bioprocess description and process improvement.
Gao, Ying; Kipling, Katie; Glassey, Jarka; Willis, Mark; Montague, Gary; Zhou, Yuhong; Titchener-Hooker, Nigel J
2010-01-01
Modeling plays an important role in bioprocess development for design and scale-up. Predictive models can also be used in biopharmaceutical manufacturing to assist decision-making, either to maintain process consistency or to identify optimal operating conditions. To predict whole-bioprocess performance, the strong interactions present in a processing sequence must be adequately modeled. Traditionally, bioprocess modeling considers process units separately, which makes it difficult to capture the interactions between units. In this work, a systematic framework is developed to analyze bioprocesses based on a whole-process understanding and considering the interactions between process operations. An agent-based approach is adopted to provide a flexible infrastructure for the necessary integration of process models. This enables the prediction of overall process behavior, which can then be applied during process development or once manufacturing has commenced, in both cases leading to the capacity for fast evaluation of process improvement options. The multi-agent system comprises a process knowledge base, process models, and a group of functional agents. In this system, agent components co-operate with each other in performing their tasks. These tasks include describing the whole process behavior, evaluating process operating conditions, monitoring the operating processes, predicting critical process performance, and providing guidance for decision-making when coping with process deviations. During process development, the system can be used to evaluate the design space for process operation. During manufacture, the system can be applied to identify abnormal process operation events and then to provide suggestions as to how best to cope with the deviations. In all cases, the function of the system is to ensure an efficient manufacturing process. The implementation of the agent-based approach is illustrated via selected application scenarios, which demonstrate how such a framework may enable the better integration of process operations by providing a plant-wide process description to facilitate process improvement. Copyright 2009 American Institute of Chemical Engineers
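The flavor of such a multi-agent arrangement can be sketched minimally: unit-operation agents transform a shared process stream while a monitoring agent flags deviations. The class names, yields, and threshold below are assumptions for illustration, not the authors' system:

```python
# A minimal sketch (assumed structure, not the authors' system) of agents
# co-operating over a shared processing sequence: each unit-operation agent
# transforms a process stream and a monitoring agent flags deviations.
class UnitAgent:
    def __init__(self, name, yield_factor):
        self.name = name
        self.yield_factor = yield_factor  # fraction of product retained

    def process(self, stream):
        stream["product_g"] *= self.yield_factor
        stream["history"].append(self.name)
        return stream

class MonitorAgent:
    def __init__(self, min_product_g):
        self.min_product_g = min_product_g

    def check(self, stream):
        if stream["product_g"] < self.min_product_g:
            return f"deviation after {stream['history'][-1]}"
        return "ok"

sequence = [UnitAgent("fermentation", 1.0),
            UnitAgent("centrifugation", 0.9),
            UnitAgent("chromatography", 0.6)]
monitor = MonitorAgent(min_product_g=50.0)

stream = {"product_g": 80.0, "history": []}
for agent in sequence:
    stream = agent.process(stream)
    print(agent.name, round(stream["product_g"], 1), monitor.check(stream))
```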
High-speed AFM for scanning the architecture of living cells
NASA Astrophysics Data System (ADS)
Li, Jing; Deng, Zhifeng; Chen, Daixie; Ao, Zhuo; Sun, Quanmei; Feng, Jiantao; Yin, Bohua; Han, Li; Han, Dong
2013-08-01
We address the modelling of tip-cell membrane interactions under high-speed atomic force microscopy. Using a home-made device with a scanning area of 100 × 100 μm², in situ imaging of living cells is successfully performed under loading rates from 1 to 50 Hz, intending to enable detailed descriptions of physiological processes in living samples. Electronic supplementary information (ESI) available: movie of the real-time change of the inner surface within a fresh blood vessel. The movie was captured at a speed of 30 Hz in the range of 80 μm × 80 μm. See DOI: 10.1039/c3nr01464a
The drift diffusion model as the choice rule in reinforcement learning.
Pedersen, Mads Lund; Frank, Michael J; Biele, Guido
2017-08-01
Current reinforcement-learning models often assume simplified decision processes that do not fully reflect the dynamic complexities of choice processes. Conversely, sequential-sampling models of decision making account for both choice accuracy and response time, but assume that decisions are based on static decision values. To combine these two computational models of decision making and learning, we implemented reinforcement-learning models in which the drift diffusion model describes the choice process, thereby capturing both within- and across-trial dynamics. To exemplify the utility of this approach, we quantitatively fit data from a common reinforcement-learning paradigm using hierarchical Bayesian parameter estimation, and compared model variants to determine whether they could capture the effects of stimulant medication in adult patients with attention-deficit hyperactivity disorder (ADHD). The model with the best relative fit provided a good description of the learning process, choices, and response times. A parameter recovery experiment showed that the hierarchical Bayesian modeling approach enabled accurate estimation of the model parameters. The model approach described here, using simultaneous estimation of reinforcement-learning and drift diffusion model parameters, shows promise for revealing new insights into the cognitive and neural mechanisms of learning and decision making, as well as the alteration of such processes in clinical groups.
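A minimal simulation sketch of the combined model (illustrative parameter values, not the fitted ADHD estimates) makes the coupling explicit: a delta rule updates Q-values, and the Q-value difference sets the drift rate of a diffusion process that produces both the choice and the response time:

```python
import numpy as np

# A minimal simulation sketch of the combined model described above: Q-values
# are learned by a delta rule, and the drift rate of a diffusion process is
# set by the Q-value difference, yielding both choices and response times.
# Parameter values are illustrative assumptions, not fitted estimates.
rng = np.random.default_rng(0)
alpha, scaling = 0.1, 2.0        # learning rate, drift scaling
threshold, dt, noise = 1.0, 1e-3, 1.0
p_reward = [0.8, 0.2]            # true reward probabilities of the 2 options
q = np.zeros(2)

def ddm_trial(drift):
    x, t = 0.0, 0.0
    while abs(x) < threshold:    # accumulate evidence to a boundary
        x += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
        t += dt
    return (0 if x > 0 else 1), t   # upper boundary -> option 0

for trial in range(200):
    drift = scaling * (q[0] - q[1])
    choice, rt = ddm_trial(drift)
    reward = rng.random() < p_reward[choice]
    q[choice] += alpha * (reward - q[choice])   # delta-rule update

print("learned Q-values:", q.round(2))
```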
The aluminum smelting process and innovative alternative technologies.
Kvande, Halvor; Drabløs, Per Arne
2014-05-01
The industrial aluminum production process is addressed. The purpose is to give a short but comprehensive description of the electrolysis cell technology, the raw materials used, and the health and safety relevance of the process. This article is based on a study of the extensive chemical and medical literature on primary aluminum production. At present, there are two main technological challenges for the process: to reduce energy consumption and to mitigate greenhouse gas emissions. A future step may be carbon dioxide gas capture and sequestration related to the electric power generation from fossil sources. Workers' health and safety have now become an integrated part of the aluminum business. Work-related injuries and illnesses are preventable, and the ultimate goal of eliminating accidents with lost-time injuries may hopefully be approached in the future.
Space Shuttle Guidance, Navigation, and Rendezvous Knowledge Capture Reports. Revision 1
NASA Technical Reports Server (NTRS)
Goodman, John L.
2011-01-01
This document is a catalog and reader's guide to lessons learned, experience, and technical history reports, as well as compilation volumes prepared by United Space Alliance personnel for the NASA/Johnson Space Center (JSC) Flight Dynamics Division. It is intended to make it easier for future generations of engineers to locate knowledge capture documentation from the Shuttle Program. The first chapter covers observations on documentation quality and research challenges encountered during the Space Shuttle and Orion programs. The second chapter covers the knowledge capture approach used to create many of the reports covered in this document. These chapters are intended to provide future flight programs with insight that could be used to formulate knowledge capture and management strategies. The following chapters contain descriptions of each knowledge capture report. The majority of the reports concern the Space Shuttle. Three are included that were written in support of the Orion Program. Most of the reports were written from 2001 to 2011. Lessons learned reports concern primarily the shuttle Global Positioning System (GPS) upgrade and the knowledge capture process. Experience reports on navigation and rendezvous provide examples of how challenges were overcome and how best practices were identified and applied. Some reports are of a more technical history nature, covering navigation and rendezvous. They provide an overview of mission activities and the evolution of operations concepts and trajectory design. The lessons learned, experience, and history reports would be considered secondary sources by historians and archivists.
NASA Astrophysics Data System (ADS)
Luo, Lin-Bo; An, Sang-Woo; Wang, Chang-Shuai; Li, Ying-Chun; Chong, Jong-Wha
2012-09-01
Digital cameras usually decrease exposure time to capture motion-blur-free images. However, this operation will generate an under-exposed image with a low-budget complementary metal-oxide semiconductor image sensor (CIS). Conventional color correction algorithms can efficiently correct under-exposed images; however, they are generally not performed in real time and need at least one frame memory if they are implemented in hardware. The authors propose a real-time look-up table-based color correction method that corrects under-exposed images in hardware without using frame memory. The method utilizes histogram matching of two preview images, which are exposed for a long and a short time, respectively, to construct an improved look-up table (ILUT) and then corrects the captured under-exposed image in real time. Because the ILUT is calculated in real time before processing the captured image, this method does not require frame memory to buffer image data, and therefore can greatly reduce the cost of the CIS. The method supports not only single image capture but also bracketing to capture three images at a time. The proposed method was implemented in a hardware description language and verified on a field-programmable gate array with a 5 M CIS. Simulations show that the system can perform in real time at a low cost and can correct the color of under-exposed images well.
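The histogram-matching idea behind the ILUT can be sketched in software (the hardware pipeline itself is not reproduced here; the array sizes and synthetic preview images are assumptions):

```python
import numpy as np

# A minimal sketch of the look-up-table idea described above: histogram
# matching between a short-exposure preview and a long-exposure preview
# yields a tone-mapping LUT that is then applied to the captured
# under-exposed frame. Arrays stand in for the preview images (assumed 8-bit).
def build_lut(short_prev: np.ndarray, long_prev: np.ndarray) -> np.ndarray:
    bins = np.arange(257)
    cdf_s = np.cumsum(np.histogram(short_prev, bins=bins)[0]) / short_prev.size
    cdf_l = np.cumsum(np.histogram(long_prev, bins=bins)[0]) / long_prev.size
    # For each input level, find the output level with the matching cdf value.
    return np.interp(cdf_s, cdf_l, np.arange(256)).astype(np.uint8)

rng = np.random.default_rng(1)
long_prev = rng.integers(0, 256, (120, 160), dtype=np.uint8)
short_prev = (long_prev * 0.25).astype(np.uint8)     # under-exposed preview

lut = build_lut(short_prev, long_prev)
captured = (rng.integers(0, 256, (480, 640)) * 0.25).astype(np.uint8)
corrected = lut[captured]                            # per-pixel LUT lookup
print(captured.mean(), corrected.mean())
```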
How the twain can meet: Prospect theory and models of heuristics in risky choice.
Pachur, Thorsten; Suter, Renata S; Hertwig, Ralph
2017-03-01
Two influential approaches to modeling choice between risky options are algebraic models (which focus on predicting the overt decisions) and models of heuristics (which are also concerned with capturing the underlying cognitive process). Because they rest on fundamentally different assumptions and algorithms, the two approaches are usually treated as antithetical, or even incommensurable. Drawing on cumulative prospect theory (CPT; Tversky & Kahneman, 1992) as the currently most influential instance of a descriptive algebraic model, we demonstrate how the two modeling traditions can be linked. CPT's algebraic functions characterize choices in terms of psychophysical (diminishing sensitivity to probabilities and outcomes) as well as psychological (risk aversion and loss aversion) constructs. Models of heuristics characterize choices as rooted in simple information-processing principles such as lexicographic and limited search. In computer simulations, we estimated CPT's parameters for choices produced by various heuristics. The resulting CPT parameter profiles portray each of the choice-generating heuristics in psychologically meaningful ways, capturing, for instance, differences in how the heuristics process probability information. Furthermore, CPT parameters can reflect a key property of many heuristics, lexicographic search, and track the environment-dependent behavior of heuristics. Finally, we show, both in an empirical and a model recovery study, how CPT parameter profiles can be used to detect the operation of heuristics. We also address the limits of CPT's ability to capture choices produced by heuristics. Our results highlight an untapped potential of CPT as a measurement tool to characterize the information processing underlying risky choice. Copyright © 2017 Elsevier Inc. All rights reserved.
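For concreteness, the CPT building blocks referred to above can be written out directly; the sketch below uses the original Tversky and Kahneman (1992) parameter estimates as illustrative defaults and evaluates a simple two-outcome gamble:

```python
# A minimal sketch of the CPT building blocks referred to above (Tversky &
# Kahneman, 1992), for a simple two-outcome gamble; parameter values are the
# original 1992 estimates, used here only as illustrative defaults.
def value(x, alpha=0.88, beta=0.88, lam=2.25):
    return x**alpha if x >= 0 else -lam * (-x) ** beta

def weight(p, gamma=0.61):
    return p**gamma / (p**gamma + (1 - p) ** gamma) ** (1 / gamma)

def cpt_value(outcome, p):
    """CPT value of a gamble paying `outcome` with probability p, else 0."""
    return weight(p) * value(outcome)

risky = cpt_value(100, 0.5)   # 50% chance of 100
sure = value(45)              # 45 for sure
print(risky, sure, "choose:", "risky" if risky > sure else "sure")
```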
NASA Technical Reports Server (NTRS)
Barghouty, A. F.
2014-01-01
Accurate estimates of electron-capture cross sections at energies relevant to the modeling of the transport, acceleration, and interaction of energetic neutral atoms (ENA) in space (approximately a few MeV per nucleon), especially for multi-electron ions, must rely on a detailed, but computationally expensive, quantum-mechanical description of the collision process. Kuang's semi-classical approach is an elegant and efficient way to arrive at these estimates. Motivated by ENA modeling efforts for space applications, we briefly present this approach along with sample applications and report on current progress.
Production data in media systems and press front ends: capture, formats and database methods
NASA Astrophysics Data System (ADS)
Karttunen, Simo
1997-02-01
The nature, purpose, and data presentation features of media jobs are analyzed in relation to content, document, process, and resource management in media production. Formats are the natural way of presenting, collecting, and storing information, contents, document components, and final documents. The state of the art and the trends in media formats and production data are reviewed. The types and the amount of production data are listed, e.g., events, schedules, product descriptions, reports, visual support, quality, process states, and color data. The data exchange must be vendor-neutral. Adequate infrastructure and system architecture are defined for production and media data. The roles of open servers and intranets are evaluated and their potential roles as future solutions are anticipated. The press front end is the part of print media production where large files dominate. The new output alternatives, i.e., film recorders, direct plate output (CTP and CTP-on-press), and digital, plateless printing lines need new workflow tools and very efficient file and format management. The paper analyzes the capture, formatting, and storing of job files and the respective production data, such as the event logs of the processes. Intranets, browsers, Java applets, and open web servers will be used to capture production data, especially where intranets are used anyhow, or where several companies are networked to plan, design, and use documents and printed products. The user aspects of installing intranets are stressed, since there are numerous more traditional and more dedicated networking solutions on the market.
The Ansel Adams zone system: HDR capture and range compression by chemical processing
NASA Astrophysics Data System (ADS)
McCann, John J.
2010-02-01
We tend to think of digital imaging and the tools of Photoshop™ as a new phenomenon in imaging. We are also familiar with multiple-exposure HDR techniques intended to capture a wider range of scene information than conventional film photography. We know about tone-scale adjustments to make better pictures. We tend to think of everyday, consumer, silver-halide photography as a fixed window of scene capture with a limited, standard range of response. This description of photography is certainly true, between 1950 and 2000, for instant films and negatives processed at the drugstore. These systems had a fixed dynamic range and a fixed tone-scale response to light. All pixels in the film have the same response to light, so the same light exposure at different pixels was rendered as the same film density. Ansel Adams, along with Fred Archer, formulated the Zone System starting in 1940. It predated the trillions of consumer photos of the second half of the 20th century, yet it was much more sophisticated than today's digital techniques. This talk will describe the chemical mechanisms of the Zone System in the parlance of digital image processing. It will describe the Zone System's chemical techniques for image synthesis. It also discusses dodging and burning techniques to fit the HDR scene into the LDR print. Although current HDR imaging shares some of the Zone System's achievements, it usually does not achieve all of them.
Elongated dust particles growth in a spherical glow discharge in ethanol
NASA Astrophysics Data System (ADS)
Fedoseev, A. V.; Sukhinin, G. I.; Sakhapov, S. Z.; Zaikovskii, A. V.; Novopashin, S. A.
2018-01-01
The formation of elongated dust particles in a spherical dc glow discharge in ethanol was observed for the first time. The dust particles were formed through the coagulation of ethanol dissociation products in the plasma of the gas discharge. During the process, the particles were captured into clouds in the electric potential wells of strong striations of the spherical discharge. The size and shape of the dust particles are easily detected by the naked eye under laser-sheet illumination. A description of the experimental setup and conditions, an analysis of the size, shape, and composition of the particles, and an explanation of the spatial ordering and orientation of these particles are presented.
Four simple ocean carbon models
NASA Technical Reports Server (NTRS)
Moore, Berrien, III
1992-01-01
This paper briefly reviews the key processes that determine oceanic CO2 uptake and sets this description within the context of four simple ocean carbon models. These models capture, in varying degrees, these key processes and establish a clear foundation for more realistic models that incorporate more directly the underlying physics and biology of the ocean rather than relying on simple parametric schemes. The purpose of this paper is more pedagogical than purely scientific. The problems encountered by current attempts to understand the global carbon cycle not only require our efforts but set a demand for a new generation of scientists, and it is hoped that this paper and the text in which it appears will help in this development.
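In the same pedagogical spirit, a two-box sketch (surface and deep ocean, with illustrative rate constants and a constant atmospheric perturbation; none of these values are from the paper) shows the kind of parametric scheme such simple models rely on:

```python
import numpy as np

# A minimal two-box sketch in the spirit of the simple ocean carbon models
# discussed above: a surface box exchanges CO2 with the atmosphere and mixes
# with a deep box. All rate constants and reservoir sizes are illustrative
# assumptions, not values from the paper.
dt, years = 0.1, 200
k_gas = 1.0 / 8.0      # 1/yr, air-sea equilibration rate of the surface box
k_mix = 1.0 / 50.0     # 1/yr, surface-deep mixing rate
atm_anomaly = 10.0     # GtC, constant atmospheric carbon perturbation

surface, deep = 0.0, 0.0   # carbon anomalies in the two ocean boxes, GtC
for _ in np.arange(0, years, dt):
    air_sea = k_gas * (atm_anomaly - surface)     # uptake by surface box
    exchange = k_mix * (surface - deep)           # export to the deep box
    surface += dt * (air_sea - exchange)
    deep += dt * exchange
print(round(surface, 2), round(deep, 2))  # deep box lags the surface box
```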
DOE Office of Scientific and Technical Information (OSTI.GOV)
Allan, M.E.; Wilson, M.L.; Wightman, J.
1996-12-31
The Elk Hills giant oilfield, located in the southern San Joaquin Valley of California, has produced 1.1 billion barrels of oil from Miocene and shallow Pliocene reservoirs. 65% of the current 64,000 BOPD production is from the pressure-supported, deeper Miocene turbidite sands. In the turbidite sands of the 31 S structure, large porosity and permeability variations in the Main Body B and Western 31 S sands cause problems with the efficiency of the waterflooding. These variations have now been quantified and visualized using geostatistics. The end result is a more detailed reservoir characterization for simulation. Traditional reservoir descriptions based on marker correlations, cross-sections and mapping do not provide enough detail to capture the short-scale stratigraphic heterogeneity needed for adequate reservoir simulation. These deterministic descriptions are inadequate to tie with production data, as the thinly bedded sand/shale sequences blur into a falsely homogeneous picture. By studying the variability of the geologic and petrophysical data vertically within each wellbore and spatially from well to well, a geostatistical reservoir description has been developed. It captures the natural variability of the sands and shales that was lacking from earlier work. These geostatistical studies allow the geologic and petrophysical characteristics to be considered in a probabilistic model. The end product is a reservoir description that captures the variability of the reservoir sequences and can be used as a more realistic starting point for history matching and reservoir simulation.
Barreto, Mauricio; Burbano, María Elena; Young, David G
2002-07-01
A new Lutzomyia species in the subgenus Trichophoromyia, L. pabloi, is described and illustrated. A description of the previously unknown female of L. howardi Young is also presented. These specimens were captured in the Amazon region of Colombia.
Gil, Yolanda; Michel, Felix; Ratnakar, Varun; Read, Jordan S.; Hauder, Matheus; Duffy, Christopher; Hanson, Paul C.; Dugan, Hilary
2015-01-01
The Web was originally developed to support collaboration in science. Although scientists benefit from many forms of collaboration on the Web (e.g., blogs, wikis, forums, code sharing, etc.), most collaborative projects are coordinated over email, phone calls, and in-person meetings. Our goal is to develop a collaborative infrastructure for scientists to work on complex science questions that require multi-disciplinary contributions to gather and analyze data, that cannot occur without significant coordination to synthesize findings, and that grow organically to accommodate new contributors as needed as the work evolves over time. Our approach is to develop an organic data science framework that is based on a task-centered organization of the collaboration, incorporates principles from the social sciences for successful online communities, and exposes an open science process. Our approach is implemented as an extension of a semantic wiki platform, and captures formal representations of task decomposition structures, relations between tasks and users, and other properties of tasks, data, and other relevant science objects. All these entities are captured through the semantic wiki user interface, represented as semantic web objects, and exported as linked data.
Stochastic dynamics of intermittent pore-scale particle motion in three-dimensional porous media
NASA Astrophysics Data System (ADS)
Morales, V. L.; Dentz, M.; Willmann, M.; Holzner, M.
2017-12-01
A proper understanding of velocity dynamics is key for making transport predictions through porous media at any scale. We study the velocity evolution process from particle dynamics at the pore-scale with particular interest in preasymptotic (non-Fickian) behavior. Experimental measurements from 3-dimensional particle tracking velocimetry are used to obtain Lagrangian velocity statistics for three different types of media heterogeneity. Particle velocities are found to be intermittent in nature, log-normally distributed and non-stationary. We show that these velocity characteristics can be captured with a correlated Ornstein-Uhlenbeck process for a random walk in space that is parameterized from velocity distributions. Our simple model is rigorously tested for accurate reproduction of velocity variability in magnitude and frequency. We further show that it captures exceptionally well the preasymptotic mean and mean squared displacement in the ballistic and superdiffusive regimes, and can be extended to determine if and when Fickian behavior will be reached. Our approach reproduces both preasymptotic and asymptotic transport behavior with a single transport model, demonstrating correct description of the fundamental controls of anomalous transport.
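A minimal version of this model is easy to write down: an Ornstein-Uhlenbeck process on log-velocity yields log-normally distributed, temporally correlated speeds that drive a spatial random walk. The parameters below are illustrative assumptions:

```python
import numpy as np

# A minimal sketch of the stochastic model described above: an
# Ornstein-Uhlenbeck process on log-velocity keeps speeds log-normally
# distributed and temporally correlated, and drives a random walk in space.
# All parameter values are illustrative assumptions.
rng = np.random.default_rng(2)
n_part, n_steps, dt = 2000, 400, 0.01
tau = 1.0              # correlation time of the log-velocity process
mu, sigma = 0.0, 1.0   # mean and std of log-velocity

u = mu + sigma * rng.standard_normal(n_part)   # stationary initial condition
x = np.zeros(n_part)
a = np.exp(-dt / tau)                          # exact OU update factor
msd = np.empty(n_steps)
for k in range(n_steps):
    u = mu + a * (u - mu) + sigma * np.sqrt(1 - a**2) * rng.standard_normal(n_part)
    x += np.exp(u) * dt                        # advance with log-normal speed
    msd[k] = np.mean(x**2)

# Early times (t << tau) grow ~ballistically (t^2); later growth is slower.
print(msd[9] / msd[0], msd[-1] / msd[n_steps // 2])
```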
Groza, Tudor; Hunter, Jane; Zankl, Andreas
2012-10-15
Over the course of the last few years there has been a significant amount of research on ontology-based formalization of phenotype descriptions. In order to fully capture the intrinsic value and knowledge expressed within them, we need to take advantage of their inner structure, which implicitly combines qualities and anatomical entities. The first step in this process is the segmentation of the phenotype descriptions into their atomic elements. We present a two-phase hybrid segmentation method that combines a series of individual classifiers using different aggregation schemes (set operations and simple majority voting). The approach is tested on a corpus of skeletal phenotype descriptions drawn from the Human Phenotype Ontology. Experimental results show that the best hybrid method achieves an F-score of 97.05% in the first phase and F-scores of 97.16% / 94.50% in the second phase. The performance of the initial segmentation of anatomical entities and qualities (phase I) is not affected by the presence or absence of external resources, such as domain dictionaries. From a generic perspective, hybrid methods may not always improve segmentation accuracy, as they are heavily dependent on the goal and data characteristics.
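The aggregation schemes named above (set operations and simple majority voting) can be sketched with toy token predictions; the entity labels are invented for illustration:

```python
# A minimal sketch of the aggregation schemes described above: token-level
# entity predictions from several classifiers are combined by set operations
# or by simple majority voting. Predictions here are assumed toy outputs.
pred_a = {"broad", "femoral", "neck"}
pred_b = {"broad", "neck"}
pred_c = {"femoral", "neck", "short"}

union = pred_a | pred_b | pred_c            # high recall
intersection = pred_a & pred_b & pred_c     # high precision

votes = {}
for pred in (pred_a, pred_b, pred_c):
    for token in pred:
        votes[token] = votes.get(token, 0) + 1
majority = {t for t, v in votes.items() if v >= 2}   # simple majority

print(union, intersection, majority, sep="\n")
```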
Towards a mechanistic foundation of evolutionary theory.
Doebeli, Michael; Ispolatov, Yaroslav; Simon, Burt
2017-02-15
Most evolutionary thinking is based on the notion of fitness and related ideas such as fitness landscapes and evolutionary optima. Nevertheless, it is often unclear what fitness actually is, and its meaning often depends on the context. Here we argue that fitness should not be a basal ingredient in verbal or mathematical descriptions of evolution. Instead, we propose that evolutionary birth-death processes, in which individuals give birth and die at ever-changing rates, should be the basis of evolutionary theory, because such processes capture the fundamental events that generate evolutionary dynamics. In evolutionary birth-death processes, fitness is at best a derived quantity, and owing to the potential complexity of such processes, there is no guarantee that there is a simple scalar, such as fitness, that would describe long-term evolutionary outcomes. We discuss how evolutionary birth-death processes can provide useful perspectives on a number of central issues in evolution.
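What an evolutionary birth-death process looks like computationally can be sketched with a Gillespie simulation of two types under density-dependent mortality; the rates and the competition term are illustrative assumptions:

```python
import numpy as np

# A minimal sketch of an evolutionary birth-death process of the kind argued
# for above: two types give birth and die at rates that depend on the current
# population state (here, density-dependent death), simulated with the
# Gillespie algorithm. Rates and the competition term are illustrative.
rng = np.random.default_rng(3)
b = np.array([1.0, 1.1])      # per-capita birth rates of types 0 and 1
d0, comp = 0.1, 0.001         # baseline death rate, competition strength

n = np.array([500, 10])       # initial abundances
t = 0.0
while t < 50.0 and n.sum() > 0:
    birth = b * n
    death = (d0 + comp * n.sum()) * n   # density-dependent death
    rates = np.concatenate([birth, death])
    total = rates.sum()
    t += rng.exponential(1.0 / total)            # time to next event
    event = rng.choice(4, p=rates / total)       # which event occurs
    if event < 2:
        n[event] += 1                            # a birth of that type
    else:
        n[event - 2] -= 1                        # a death of that type
print("abundances at t=50:", n)  # type 1's higher birth rate tends to win
```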
Effects of a Training Package to Improve the Accuracy of Descriptive Analysis Data Recording
ERIC Educational Resources Information Center
Mayer, Kimberly L.; DiGennaro Reed, Florence D.
2013-01-01
Functional behavior assessment is an important precursor to developing interventions to address a problem behavior. Descriptive analysis, a type of functional behavior assessment, is effective in informing intervention design only if the gathered data accurately capture relevant events and behaviors. We investigated a training procedure to improve…
Measures for brain connectivity analysis: nodes centrality and their invariant patterns
NASA Astrophysics Data System (ADS)
da Silva, Laysa Mayra Uchôa; Baltazar, Carlos Arruda; Silva, Camila Aquemi; Ribeiro, Mauricio Watanabe; de Aratanha, Maria Adelia Albano; Deolindo, Camila Sardeto; Rodrigues, Abner Cardoso; Machado, Birajara Soares
2017-07-01
The high dynamical complexity of the brain is related to its small-world topology, which enables both segregated and integrated information processing capabilities. Several measures of connectivity estimation have already been employed to characterize functional brain networks from multivariate electrophysiological data. However, understanding which properties of each measure lead to a better description of the real topology and capture the complex phenomena present in the brain remains challenging. In this work we compared four nonlinear connectivity measures and show that each method characterizes distinct features of brain interactions. The results suggest an invariance of global network parameters across different behavioral states and that a more complete description may be reached by considering local features, independently of the connectivity measure employed. Our findings also point to future perspectives in connectivity studies that combine distinct and complementary dependence measures to assemble higher-dimensional manifolds.
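Node-centrality computations of the kind referred to above are straightforward with standard graph tooling; the sketch below builds a toy small-world graph (not electrophysiological data) and reports several centrality measures for its most central node:

```python
import networkx as nx

# A minimal sketch of node-centrality measures of the kind used to
# characterize functional brain networks; the graph here is a toy
# small-world network, not a network estimated from real data.
g = nx.connected_watts_strogatz_graph(n=30, k=4, p=0.1, seed=42)

degree = nx.degree_centrality(g)
betweenness = nx.betweenness_centrality(g)
eigenvector = nx.eigenvector_centrality(g, max_iter=1000)

hub = max(betweenness, key=betweenness.get)
print("most central node:", hub,
      round(degree[hub], 3), round(betweenness[hub], 3),
      round(eigenvector[hub], 3))
# The small-world character rests on high clustering plus short paths.
print(round(nx.average_clustering(g), 3),
      round(nx.average_shortest_path_length(g), 3))
```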
NursesforTomorrow: a proactive approach to nursing resource analysis.
Bournes, Debra A; Plummer, Carolyn; Miller, Robert; Ferguson-Paré, Mary
2010-03-01
This paper describes the background, development, implementation and utilization of NursesforTomorrow (N4T), a practical and comprehensive nursing human resources analysis method to capture regional, institutional and patient care unit-specific actual and predicted nurse vacancies, nurse staff characteristics and nurse staffing changes. Reports generated from the process include forecasted shortfalls or surpluses of nurses, percentage of novice nurses, occupancy, sick time, overtime, agency use and other metrics. Readers will benefit from a description of the ways in which the data generated from the nursing resource analysis process are utilized at senior leadership, program and unit levels to support proactive hiring and resource allocation decisions and to predict unit-specific recruitment and retention patterns across multiple healthcare organizations and regions.
Representational geometry: integrating cognition, computation, and the brain
Kriegeskorte, Nikolaus; Kievit, Rogier A.
2013-01-01
The cognitive concept of representation plays a key role in theories of brain information processing. However, linking neuronal activity to representational content and cognitive theory remains challenging. Recent studies have characterized the representational geometry of neural population codes by means of representational distance matrices, enabling researchers to compare representations across stages of processing and to test cognitive and computational theories. Representational geometry provides a useful intermediate level of description, capturing both the information represented in a neuronal population code and the format in which it is represented. We review recent insights gained with this approach in perception, memory, cognition, and action. Analyses of representational geometry can compare representations between models and the brain, and promise to explain brain computation as transformation of representational similarity structure. PMID:23876494
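The representational distance matrix at the heart of this approach can be sketched in a few lines; the response patterns below are simulated stand-ins for neural population data:

```python
import numpy as np

# A minimal sketch of a representational distance matrix (RDM) of the kind
# described above: response patterns of a neural population (or model layer)
# to a set of stimuli are compared by correlation distance. The data here
# are simulated patterns, not recordings.
rng = np.random.default_rng(4)
n_stimuli, n_neurons = 6, 100
patterns = rng.standard_normal((n_stimuli, n_neurons))
patterns[1] = patterns[0] + 0.1 * rng.standard_normal(n_neurons)  # similar pair

# Correlation distance: 1 - Pearson r between each pair of patterns.
rdm = 1.0 - np.corrcoef(patterns)
print(np.round(rdm, 2))
# Two representations (e.g., model vs. brain) can then be compared by
# correlating the off-diagonal entries of their RDMs.
```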
Automatic anatomical structures location based on dynamic shape measurement
NASA Astrophysics Data System (ADS)
Witkowski, Marcin; Rapp, Walter; Sitnik, Robert; Kujawinska, Malgorzata; Vander Sloten, Jos; Haex, Bart; Bogaert, Nico; Heitmann, Kjell
2005-09-01
New image processing methods and active photonics apparatus have made possible the development of relatively inexpensive optical systems for complex shape and object measurements. We present a dynamic 360° scanning method for the analysis of human lower body biomechanics, with an emphasis on the analysis of the knee joint. One anatomical structure of high medical interest that can be scanned and analyzed is the patella. Tracking patella position and orientation under dynamic conditions may allow the detection of pathological patella movements and help in diagnosing knee joint disease. The processed data are obtained from a dynamic laser triangulation surface measurement system able to capture slow to normal movements with a scan frequency between 15 and 30 Hz. These frequency rates are enough to capture controlled movements used, e.g., for medical examination purposes. The purpose of the work presented is to develop surface analysis methods that may support the diagnosis of motor abilities of the lower limbs. The paper presents the algorithms used to process acquired lower-limb surface data in order to find the position and orientation of the patella. The algorithms implemented include input data preparation, curvature description methods, knee region discrimination, and calculation of the assumed patella position/orientation. Additionally, a method of 4D (3D + time) medical data visualization is proposed, and some exemplary results are presented.
Kadakia, Ekta; Shah, Lipa; Amiji, Mansoor M
2017-07-01
Nanoemulsions have shown potential in delivering drugs across epithelial and endothelial cell barriers, which express efflux transporters. However, their transport mechanisms are not entirely understood. Our goal was to investigate the cellular permeability of nanoemulsion-encapsulated drugs and apply mathematical modeling to elucidate transport mechanisms and sensitive nanoemulsion attributes. Transport studies were performed in Caco-2 cells, using fish oil nanoemulsions and a model substrate, rhodamine-123. Permeability data were modeled using a semi-mechanistic approach, capturing the following cellular processes: endocytotic uptake of the nanoemulsion, release of rhodamine-123 from the nanoemulsion, and efflux and passive permeability of rhodamine-123 in aqueous solution. Nanoemulsions not only improved the permeability of rhodamine-123, but were also less sensitive to efflux transporters. The model captured bidirectional permeability results and identified sensitive processes, such as the release of the nanoemulsion-encapsulated drug and cellular uptake of the nanoemulsion. Mathematical description of cellular processes improved our understanding of transport mechanisms: nanoemulsions do not inhibit efflux to improve drug permeability; instead, their endocytotic uptake results in higher intracellular drug concentrations, thereby increasing the concentration gradient and transcellular permeability across biological barriers. Modeling results indicated that optimizing nanoemulsion attributes such as the droplet size and the intracellular drug release rate may further improve drug permeability.
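The semi-mechanistic picture described above can be sketched as a small compartmental ODE system; the rate constants, the saturable-efflux form, and the compartment structure are illustrative assumptions, not the fitted Caco-2 model:

```python
from scipy.integrate import solve_ivp

# A minimal compartmental sketch of the semi-mechanistic picture described
# above: nanoemulsion uptake by endocytosis, intracellular drug release,
# passive permeability of free drug, and saturable efflux. All rate
# constants are illustrative assumptions, not fitted Caco-2 values.
k_endo, k_rel, k_pass = 0.5, 0.3, 0.2   # 1/h
v_max, km = 1.0, 0.5                    # efflux capacity and affinity

def rhs(t, y):
    ne_apical, ne_cell, drug_cell, drug_baso = y
    uptake = k_endo * ne_apical                     # endocytosis of droplets
    release = k_rel * ne_cell                       # drug released in cell
    efflux = v_max * drug_cell / (km + drug_cell)   # saturable efflux
    passive = k_pass * drug_cell                    # transcellular passage
    return [-uptake,
            uptake - release,
            release - efflux - passive,
            passive]

sol = solve_ivp(rhs, (0, 8), [1.0, 0.0, 0.0, 0.0], max_step=0.1)
print("fraction in basolateral compartment at 8 h:", round(sol.y[3, -1], 3))
```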
NASA Technical Reports Server (NTRS)
Dehghani, Navid; Tankenson, Michael
2006-01-01
This paper details an architectural description of the Mission Data Processing and Control System (MPCS), an event-driven, multi-mission set of ground data processing components providing uplink, downlink, and data management capabilities, which will support the Mars Science Laboratory (MSL) project as its first target mission. MPCS is developed from a set of small reusable components, implemented in Java, each designed with a specific function and well-defined interfaces. An industry-standard messaging bus is used to transfer information among system components. Components generate standard messages, which are used to capture system information as well as triggers to support the event-driven architecture of the system. Event-driven systems are highly desirable for processing high-rate telemetry (science and engineering) data and for supporting automation of many mission operations processes.
A critical component in the design of the Chemical Effects in Biological Systems (CEBS) Knowledgebase is a strategy to capture toxicogenomics study protocols and the toxicity endpoint data (clinical pathology and histopathology). A Study is generally an experiment carried out du...
The Aluminum Smelting Process and Innovative Alternative Technologies
Drabløs, Per Arne
2014-01-01
Objective: The industrial aluminum production process is addressed. The purpose is to give a short but comprehensive description of the electrolysis cell technology, the raw materials used, and the health and safety relevance of the process. Methods: This article is based on a study of the extensive chemical and medical literature on primary aluminum production. Results: At present, there are two main technological challenges for the process—to reduce energy consumption and to mitigate greenhouse gas emissions. A future step may be carbon dioxide gas capture and sequestration related to the electric power generation from fossil sources. Conclusions: Workers' health and safety have now become an integrated part of the aluminum business. Work-related injuries and illnesses are preventable, and the ultimate goal to eliminate accidents with lost-time injuries may hopefully be approached in the future. PMID:24806723
A Descriptive and Interpretative Information System for the IODP
NASA Astrophysics Data System (ADS)
Blum, P.; Foster, P. A.; Mateo, Z.
2006-12-01
The ODP/IODP has a long and rich history of collecting descriptive and interpretative information (DESCINFO) from rock and sediment cores from the world's oceans. Unlike instrumental data, DESCINFO generated by subject experts is biased by the scientific and cultural background of the observers and their choices of classification schemes. As a result, global searches of DESCINFO and its integration with other data are problematic. To address this issue, the IODP-USIO is in the process of designing and implementing a DESCINFO system for IODP Phase 2 (2007-2013) that meets the user expectations expressed over the past decade. The requirements include support of (1) detailed, material property-based descriptions as well as classification-based descriptions; (2) global searches by physical sample and digital data sources as well as any of the descriptive parameters; (3) user-friendly data capture tools for a variety of workflows; (4) extensive visualization of DESCINFO data along with instrumental data and images; and (5) portability/interoperability such that the system can work with database schemas of other organizations - a specific challenge given the schema and semantic heterogeneity not only among the three IODP operators but within the geosciences in general. The DESCINFO approach is based on the definition of a set of generic observable parameters that are populated with numeric or text values. Text values are derived from controlled, extensible hierarchical value lists that allow descriptions at the appropriate level of detail and ensure successful data searches. Material descriptions can be completed independently of domain-specific classifications, genetic concepts, and interpretative frameworks.
Qualitative dynamics semantics for SBGN process description.
Rougny, Adrien; Froidevaux, Christine; Calzone, Laurence; Paulevé, Loïc
2016-06-16
Qualitative dynamics semantics provide a coarse-grained modeling of network dynamics by abstracting away kinetic parameters. They make it possible to capture general features of system dynamics, such as attractors or reachability properties, for which scalable analyses exist. The Systems Biology Graphical Notation Process Description language (SBGN-PD) has become a standard for representing reaction networks. However, no qualitative dynamics semantics taking into account all the main features available in SBGN-PD had been proposed so far. We propose two qualitative dynamics semantics for SBGN-PD reaction networks, namely the general semantics and the stories semantics, which we formalize using asynchronous automata networks. While the general semantics extends the standard Boolean semantics of reaction networks by taking into account all the main features of SBGN-PD, the stories semantics makes it possible to model several molecules of a network with a single variable. The resulting qualitative models can be checked against dynamical properties and therefore validated with respect to biological knowledge. We apply our framework to reason on the qualitative dynamics of a large network (more than 200 nodes) modeling the regulation of the cell cycle by RB/E2F. The proposed semantics provide a direct formalization of SBGN-PD networks in dynamical qualitative models that can be further analyzed using standard tools for discrete models. The dynamics in the stories semantics have a lower dimension than in the general semantics and prune multiple behaviors (which can be considered spurious) by enforcing mutual exclusiveness between the activity of different nodes of the same story. Overall, the qualitative semantics for SBGN-PD make it possible to capture efficiently important dynamical features of reaction network models and can be exploited to refine them further.
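To make the asynchronous-automata idea concrete, here is a toy Boolean encoding of a single reaction A + B -> C in the spirit of the general semantics (reactants may be consumed when the reaction can fire); the encoding is a simplified illustration, not the paper's formal translation.

    # Update functions of three Boolean automata for the toy reaction A + B -> C.
    update = {
        "C": lambda s: s["C"] or (s["A"] and s["B"]),        # product appears once both reactants are present
        "A": lambda s: s["A"] and not (s["A"] and s["B"]),   # reactant may be consumed when the reaction fires
        "B": lambda s: s["B"] and not (s["A"] and s["B"]),
    }

    def successors(state):
        """Asynchronous semantics: exactly one automaton changes per transition."""
        for var, f in update.items():
            value = f(state)
            if value != state[var]:
                yield {**state, var: value}

    # From {A, B present, C absent} three transitions are possible, capturing the
    # non-determinism that reachability analyses explore.
    for nxt in successors({"A": True, "B": True, "C": False}):
        print(nxt)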
Programmable Potentials: Approximate N-body potentials from coarse-level logic.
Thakur, Gunjan S; Mohr, Ryan; Mezić, Igor
2016-09-27
This paper gives a systematic method for constructing an N-body potential, approximating the true potential, that accurately captures meso-scale behavior of the chemical or biological system using pairwise potentials coming from experimental data or ab initio methods. The meso-scale behavior is translated into logic rules for the dynamics. Each pairwise potential has an associated logic function that is constructed using the logic rules, a class of elementary logic functions, and AND, OR, and NOT gates. The effect of each logic function is to turn its associated potential on and off. The N-body potential is constructed as a linear combination of the pairwise potentials, where the "coefficients" of the potentials are smoothed versions of the associated logic functions. These potentials allow a potentially low-dimensional description of complex processes while still accurately capturing the relevant physics at the meso-scale. We present the proposed formalism to construct coarse-grained potential models for three examples: an inhibitor molecular system, bond breaking in chemical reactions, and DNA transcription from biology. The method can potentially be used in reverse for the design of molecular processes by specifying properties of molecules that can carry them out.
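The construction can be sketched in a few lines: a pairwise potential (here a Lennard-Jones stand-in) multiplied by a smoothed logic function that switches it on or off. The gating rule, the tanh smoothing, and all parameters are hypothetical illustrations of the formalism, not values from the paper.

    import numpy as np

    def lj(r, eps=1.0, sigma=1.0):
        """Pairwise potential (a Lennard-Jones stand-in for data-derived pair potentials)."""
        return 4.0 * eps * ((sigma / r) ** 12 - (sigma / r) ** 6)

    def smooth_gate(x, x0, width=0.1):
        """Smoothed logic function: ~1 when x > x0 and ~0 otherwise."""
        return 0.5 * (1.0 + np.tanh((x - x0) / width))

    def programmable_potential(r12, r13, r_cut=1.5):
        # Linear combination of pair potentials with a smoothed logic "coefficient":
        # the 1-2 bond is switched off whenever particle 3 approaches closer than
        # r_cut (a hypothetical NOT(near) rule, purely for illustration).
        return smooth_gate(r13, r_cut) * lj(r12)

    print(programmable_potential(1.1, 3.0))  # gate ~ 1: the pair interaction is on
    print(programmable_potential(1.1, 0.9))  # gate ~ 0: the pair interaction is off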
Statistical inference for capture-recapture experiments
Pollock, Kenneth H.; Nichols, James D.; Brownie, Cavell; Hines, James E.
1990-01-01
This monograph presents a detailed, practical exposition on the design, analysis, and interpretation of capture-recapture studies. The Lincoln-Petersen model (Chapter 2) and the closed population models (Chapter 3) are presented only briefly because these models have been covered in detail elsewhere. The Jolly-Seber open population model, which is central to the monograph, is covered in detail in Chapter 4. In Chapter 5 we consider the "enumeration" or "calendar of captures" approach, which is widely used by mammalogists and other vertebrate ecologists. We strongly recommend that it be abandoned in favor of analyses based on the Jolly-Seber model. We consider 2 restricted versions of the Jolly-Seber model. We believe the first of these, which allows losses (mortality or emigration) but not additions (births or immigration), is likely to be useful in practice. Another series of restrictive models requires the assumptions of a constant survival rate or a constant survival rate and a constant capture rate for the duration of the study. Detailed examples are given that illustrate the usefulness of these restrictions. There often can be a substantial gain in precision over Jolly-Seber estimates. In Chapter 5 we also consider 2 generalizations of the Jolly-Seber model. The temporary trap response model allows newly marked animals to have different survival and capture rates for 1 period. The other generalization is the cohort Jolly-Seber model. Ideally all animals would be marked as young, and age effects considered by using the Jolly-Seber model on each cohort separately. In Chapter 6 we present a detailed description of an age-dependent Jolly-Seber model, which can be used when 2 or more identifiable age classes are marked. In Chapter 7 we present a detailed description of the "robust" design. Under this design each primary period contains several secondary sampling periods. We propose an estimation procedure based on closed and open population models that allows for heterogeneity and trap response of capture rates (hence the name robust design). We begin by considering just 1 age class and then extend to 2 age classes. When there are 2 age classes it is possible to distinguish immigrants and births. In Chapter 8 we give a detailed discussion of the design of capture-recapture studies. First, capture-recapture is compared to other possible sampling procedures. Next, the design of capture-recapture studies to minimize assumption violations is considered. Finally, we consider the precision of parameter estimates and present figures on proportional standard errors for a variety of initial parameter values to aid the biologist about to plan a study. A new program, JOLLY, has been written to accompany the material on the Jolly-Seber model (Chapter 4) and its extensions (Chapter 5). Another new program, JOLLYAGE, has been written for a special case of the age-dependent model (Chapter 6) where there are only 2 age classes. In Chapter 9 a brief description of the different versions of the 2 programs is given. Chapter 10 gives a brief description of some alternative approaches that were not considered in this monograph. We believe that an excellent overall view of capture-recapture models may be obtained by reading the monograph by White et al. (1982) emphasizing closed models and then reading this monograph where we concentrate on open models. The important recent monograph by Burnham et al. (1987) could then be read if there were interest in the comparison of different populations.
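For readers new to the area, the Lincoln-Petersen model of Chapter 2 reduces to a one-line estimate; the sketch below also includes Chapman's bias-corrected variant. The counts in the example are invented.

    def lincoln_petersen(n1, n2, m2):
        """Two-sample abundance estimate: n1 marked, n2 recaptured, m2 of them marked."""
        return n1 * n2 / m2

    def chapman(n1, n2, m2):
        """Bias-corrected variant of Lincoln-Petersen, defined even when m2 == 0."""
        return (n1 + 1) * (n2 + 1) / (m2 + 1) - 1

    # 200 animals marked; of 150 animals caught later, 30 carry marks.
    print(lincoln_petersen(200, 150, 30))  # 1000.0
    print(chapman(200, 150, 30))           # ~978.1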
A strongly goal-directed close-range vision system for spacecraft docking
NASA Technical Reports Server (NTRS)
Boyer, Kim L.; Goddard, Ralph E.
1991-01-01
In this presentation, we will propose a strongly goal-oriented stereo vision system to establish proper docking approach motions for automated rendezvous and capture (AR&C). From an input sequence of stereo video image pairs, the system produces a current best estimate of: contact position; contact vector; contact velocity; and contact orientation. The processing demands imposed by this particular problem and its environment dictate a special case solution; such a system should necessarily be, in some sense, minimalist. By this we mean the system should construct a scene description just sufficiently rich to solve the problem at hand and should do no more processing than is absolutely necessary. In addition, the imaging resolution should be just sufficient. Extracting additional information and constructing higher level scene representations wastes energy and computational resources and injects an unnecessary degree of complexity, increasing the likelihood of malfunction. We therefore take a departure from most prior stereopsis work, including our own, and propose a system based on associative memory. The purpose of the memory is to immediately associate a set of motor commands with a set of input visual patterns in the two cameras. That is, rather than explicitly computing point correspondences and object positions in world coordinates and trying to reason forward from this information to a plan of action, we are trying to capture the essence of reflex behavior through the action of associative memory. The explicit construction of point correspondences and 3D scene descriptions, followed by online velocity and point of impact calculations, is prohibitively expensive from a computational point of view for the problem at hand. Learned patterns on the four image planes, left and right at two discrete but closely spaced instants in time, will be bused directly to infer the spacecraft reaction. This will be a continuing online process as the docking collar approaches.
Capturing the temporal evolution of choice across prefrontal cortex
Hunt, Laurence T; Behrens, Timothy EJ; Hosokawa, Takayuki; Wallis, Jonathan D; Kennerley, Steven W
2015-01-01
Activity in prefrontal cortex (PFC) has been richly described using economic models of choice. Yet such descriptions fail to capture the dynamics of decision formation. Describing dynamic neural processes has proven challenging due to the problem of indexing the internal state of PFC and its trial-by-trial variation. Using primate neurophysiology and human magnetoencephalography, we here recover a single-trial index of PFC internal states from multiple simultaneously recorded PFC subregions. This index can explain the origins of neural representations of economic variables in PFC. It describes the relationship between neural dynamics and behaviour in both human and monkey PFC, directly bridging between human neuroimaging data and underlying neuronal activity. Moreover, it reveals a functionally dissociable interaction between orbitofrontal cortex, anterior cingulate cortex and dorsolateral PFC in guiding cost-benefit decisions. We cast our observations in terms of a recurrent neural network model of choice, providing formal links to mechanistic dynamical accounts of decision-making. DOI: http://dx.doi.org/10.7554/eLife.11945.001 PMID:26653139
Implementation of a national anti-tuberculosis drug resistance survey in Tanzania.
Chonde, Timothy M; Doulla, Basra; van Leth, Frank; Mfinanga, Sayoki G M; Range, Nyagosya; Lwilla, Fred; Mfaume, Saidi M; van Deun, Armand; Zignol, Matteo; Cobelens, Frank G; Egwaga, Saidi M
2008-12-30
A drug resistance survey is an essential public health management tool for evaluating and improving the performance of National Tuberculosis control programmes. The current manuscript describes the implementation of the first national drug resistance survey in Tanzania. Description of the implementation process of a national anti-tuberculosis drug resistance survey in Tanzania, in relation to the study protocol and Standard Operating Procedures. Factors contributing positively to the implementation of the survey were a continuous commitment of the key stakeholders, the existence of a well organized National Tuberculosis Programme, and a detailed design of cluster-specific arrangements for rapid sputum transportation. Factors contributing negatively to the implementation were a long delay between training and actual survey activities, limited monitoring of activities, and an unclear design of the data capture forms leading to difficulties in form-filling. Careful preparation of the survey, timing of planned activities, a strong emphasis on data capture tools and data management, and timely supervision are essential for a proper implementation of a national drug resistance survey.
Capturing and modelling high-complex alluvial topography with UAS-borne laser scanning
NASA Astrophysics Data System (ADS)
Mandlburger, Gottfried; Wieser, Martin; Pfennigbauer, Martin
2015-04-01
Due to fluvial activity, alluvial forests are zones of highest complexity and relief energy. Alluvial forests are dominated by new and pristine channels as a consequence of current and historic flood events. Apart from topographic features, the vegetation structure is typically very complex, featuring both dense understory and high trees. Furthermore, deadwood and debris carried from upstream during periods of high discharge within the river channel are deposited in these areas. Therefore, precise modelling of the micro relief of alluvial forests using standard tools like Airborne Laser Scanning (ALS) is hardly feasible. Terrestrial Laser Scanning (TLS), in turn, is very time consuming for capturing larger areas, as many scan positions are necessary for obtaining complete coverage due to view occlusions in the forest. In the recent past, the technological development of Unmanned Aerial Systems (UAS) has reached a level at which light-weight survey-grade laser scanners can be operated from these platforms. For capturing alluvial topography this could bridge the gap between ALS and TLS by providing a very detailed description of the topography and the vegetation structure due to the achievable very high point density of >100 points per m2. In our contribution we demonstrate the feasibility of applying UAS-borne laser scanning for capturing and modelling the complex topography of the study area Neubacher Au, an alluvial forest at the pre-alpine River Pielach (Lower Austria). The area was captured with Riegl's VUX-1 compact time-of-flight laser scanner mounted on a RiCopter (X-8 array octocopter). The scanner features an effective scan rate of 500 kHz and was flown at 50-100 m above ground. At this flying height the laser footprint is 25-50 mm, allowing mapping of very small surface details. Furthermore, online waveform processing of the backscattered laser energy enables the retrieval of multiple targets for single laser shots, resulting in a dense point cloud of both the ground surface and the alluvial vegetation. From the acquired point cloud the following products could be derived: (i) a very high resolution Digital Terrain Model (10 cm raster), (ii) a high resolution model of the water surface of the River Pielach (especially useful for validation of topo-bathymetric LiDAR data), and (iii) a detailed description of the complex vegetation structure.
A Principled Approach to the Specification of System Architectures for Space Missions
NASA Technical Reports Server (NTRS)
McKelvin, Mark L. Jr.; Castillo, Robert; Bonanne, Kevin; Bonnici, Michael; Cox, Brian; Gibson, Corrina; Leon, Juan P.; Gomez-Mustafa, Jose; Jimenez, Alejandro; Madni, Azad
2015-01-01
Modern space systems are increasing in complexity and scale at an unprecedented pace. Consequently, innovative methods, processes, and tools are needed to cope with the increasing complexity of architecting these systems. A key systems challenge in practice is the ability to scale processes, methods, and tools used to architect complex space systems. Traditionally, the process for specifying space system architectures has largely relied on capturing the system architecture in informal descriptions that are often embedded within loosely coupled design documents and domain expertise. Such informal descriptions often lead to misunderstandings between design teams, ambiguous specifications, difficulty in maintaining consistency as the architecture evolves throughout the system development life cycle, and costly design iterations. Therefore, traditional methods are becoming increasingly inefficient to cope with ever-increasing system complexity. We apply the principles of component-based design and platform-based design to the development of the system architecture for a practical space system to demonstrate feasibility of our approach using SysML. Our results show that we are able to apply a systematic design method to manage system complexity, thus enabling effective data management, semantic coherence and traceability across different levels of abstraction in the design chain. Just as important, our approach enables interoperability among heterogeneous tools in a concurrent engineering model based design environment.
Representational geometry: integrating cognition, computation, and the brain.
Kriegeskorte, Nikolaus; Kievit, Rogier A
2013-08-01
The cognitive concept of representation plays a key role in theories of brain information processing. However, linking neuronal activity to representational content and cognitive theory remains challenging. Recent studies have characterized the representational geometry of neural population codes by means of representational distance matrices, enabling researchers to compare representations across stages of processing and to test cognitive and computational theories. Representational geometry provides a useful intermediate level of description, capturing both the information represented in a neuronal population code and the format in which it is represented. We review recent insights gained with this approach in perception, memory, cognition, and action. Analyses of representational geometry can compare representations between models and the brain, and promise to explain brain computation as transformation of representational similarity structure. Copyright © 2013 Elsevier Ltd. All rights reserved.
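The core object of the approach, a representational distance (dissimilarity) matrix, is simple to compute; the sketch below uses correlation distance between random response patterns standing in for a neuronal population code.

    import numpy as np
    from scipy.spatial.distance import pdist, squareform

    # Hypothetical data: responses of 100 units to 8 stimulus conditions.
    rng = np.random.default_rng(0)
    patterns = rng.normal(size=(8, 100))

    # Representational dissimilarity matrix: correlation distance (1 - Pearson r)
    # between the population response patterns of every pair of conditions.
    rdm = squareform(pdist(patterns, metric="correlation"))

    # Two representations (e.g., a model layer and a brain region) can then be
    # compared by correlating the upper triangles of their RDMs.
    upper = rdm[np.triu_indices(8, k=1)]
    print(upper.shape)  # 28 pairwise dissimilarities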
Heldenbrant, David J; Koech, Phillip K; Rainbolt, James E; Bearden, Mark D; Zheng, Feng
2014-02-18
A system and process are disclosed for selective removal and recovery of H2S from a gaseous volume, e.g., from natural gas. Anhydrous organic sorbents chemically capture H2S gas to form hydrosulfide salts. Regeneration of the capture solvent involves addition of an anti-solvent that releases the captured H2S gas from the capture sorbent. The capture sorbent and anti-solvent are reactivated for reuse, e.g., by simple distillation.
Liu, Yi-Hung; Chen, Yan-Jen
2011-01-01
Defect detection has been considered an efficient way to increase the yield rate of panels in thin film transistor liquid crystal display (TFT-LCD) manufacturing. In this study we focus on the array process, since it is the first and key process in TFT-LCD manufacturing. Various defects occur in the array process, and some of them can cause great damage to the LCD panels. Thus, how to design a method that can robustly detect defects from the images captured from the surface of LCD panels has become crucial. Previously, support vector data description (SVDD) has been successfully applied to LCD defect detection. However, its generalization performance is limited. In this paper, we propose a novel one-class machine learning method, called quasiconformal kernel SVDD (QK-SVDD), to address this issue. The QK-SVDD can significantly improve the generalization performance of the traditional SVDD by introducing the quasiconformal transformation into a predefined kernel. Experimental results, carried out on real LCD images provided by an LCD manufacturer in Taiwan, indicate that the proposed QK-SVDD not only obtains a high defect detection rate of 96%, but also greatly improves the generalization performance of SVDD. The improvement has been shown to be over 30%. In addition, results also show that the QK-SVDD defect detector is able to accomplish the task of defect detection on an LCD image within 60 ms. PMID:22016625
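As a rough illustration of the one-class setting (though not of the quasiconformal kernel itself), SVDD with an RBF kernel is known to be equivalent to the one-class SVM, which scikit-learn provides; the data here are synthetic stand-ins for defect-free and defective image features.

    import numpy as np
    from sklearn.svm import OneClassSVM

    rng = np.random.default_rng(1)
    normal_features = rng.normal(0.0, 1.0, size=(500, 16))           # defect-free training set
    test_features = np.vstack([rng.normal(0.0, 1.0, size=(5, 16)),
                               rng.normal(4.0, 1.0, size=(5, 16))])  # last 5 rows mimic defects

    # With an RBF kernel the one-class SVM solves the same problem as SVDD.
    detector = OneClassSVM(kernel="rbf", gamma=0.05, nu=0.05).fit(normal_features)
    print(detector.predict(test_features))  # +1 = accepted as normal, -1 = flagged as defect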
Fischhoff, Baruch
2010-09-01
The study of judgment and decision making entails three interrelated forms of research: (1) normative analysis, identifying the best courses of action, given decision makers' values; (2) descriptive studies, examining actual behavior in terms comparable to the normative analyses; and (3) prescriptive interventions, helping individuals to make better choices, bridging the gap between the normative ideal and the descriptive reality. The research is grounded in analytical foundations shared by economics, psychology, philosophy, and management science. Those foundations provide a framework for accommodating affective and social factors that shape and complement the cognitive processes of decision making. The decision sciences have grown through applications requiring collaboration with subject matter experts, familiar with the substance of the choices and the opportunities for interventions. Over the past half century, the field has shifted its emphasis from predicting choices, which can be successful without theoretical insight, to understanding the processes shaping them. Those processes are often revealed through biases that suggest non-normative processes. The practical importance of these biases depends on the sensitivity of specific decisions and the support that individuals have in making them. As a result, the field offers no simple summary of individuals' competence as decision makers, but a suite of theories and methods suited to capturing these sensitivities. Copyright © 2010 John Wiley & Sons, Ltd. For further resources related to this article, please visit the WIREs website.
Blank, Carrine E; Cui, Hong; Moore, Lisa R; Walls, Ramona L
2016-01-01
MicrO is an ontology of microbiological terms, including prokaryotic qualities and processes, material entities (such as cell components), chemical entities (such as microbiological culture media and medium ingredients), and assays. The ontology was built to support the ongoing development of a natural language processing algorithm, MicroPIE (or, Microbial Phenomics Information Extractor). During the MicroPIE design process, we realized there was a need for a prokaryotic ontology which would capture the evolutionary diversity of phenotypes and metabolic processes across the tree of life, capture the diversity of synonyms and information contained in the taxonomic literature, and relate microbiological entities and processes to terms in a large number of other ontologies, most particularly the Gene Ontology (GO), the Phenotypic Quality Ontology (PATO), and the Chemical Entities of Biological Interest (ChEBI). We thus constructed MicrO to be rich in logical axioms and synonyms gathered from the taxonomic literature. MicrO currently has ~14,550 classes (~2,550 of which are new, the remainder being microbiologically-relevant classes imported from other ontologies), connected by ~24,130 logical axioms (5,446 of which are new), and is available at http://purl.obolibrary.org/obo/MicrO.owl and on the project website at https://github.com/carrineblank/MicrO. MicrO has been integrated into the OBO Foundry Library (http://www.obofoundry.org/ontology/micro.html), so that other ontologies can borrow and re-use classes. Term requests and user feedback can be made using MicrO's Issue Tracker in GitHub. We designed MicrO such that it can support the ongoing and future development of algorithms that can leverage the controlled vocabulary and logical inference power provided by the ontology. By connecting microbial classes with large numbers of chemical entities, material entities, biological processes, molecular functions, and qualities using a dense array of logical axioms, we intend MicrO to be a powerful new tool to increase the computing power of bioinformatics tools such as the automated text mining of prokaryotic taxonomic descriptions using natural language processing. We also intend MicrO to support the development of new bioinformatics tools that aim to develop new connections between microbial phenotypes and genotypes (i.e., the gene content in genomes). Future ontology development will include incorporation of pathogenic phenotypes and prokaryotic habitats.
Time-dependent spin-density-functional-theory description of He+-He collisions
NASA Astrophysics Data System (ADS)
Baxter, Matthew; Kirchner, Tom; Engel, Eberhard
2017-09-01
Theoretical total cross-section results for all ionization and capture processes in the He+-He collision system are presented in the approximate impact energy range of 10-1000 keV/amu. Calculations were performed within the framework of time-dependent spin-density functional theory. The Krieger-Li-Iafrate approximation was used to determine an accurate exchange-correlation potential in the exchange-only limit. The results of two models, one where electron translation factors in the orbitals used to calculate the potential are ignored and another where partial electron translation factors are included, are compared with available experimental data as well as a selection of previous theoretical calculations.
NASA Astrophysics Data System (ADS)
Bouda, Martin; Saiers, James E.
2017-12-01
Root system architecture (RSA) can significantly affect plant access to water, total transpiration, as well as its partitioning by soil depth, with implications for surface heat, water, and carbon budgets. Despite recent advances in land surface model (LSM) descriptions of plant hydraulics, descriptions of RSA have not been included because of their three-dimensional complexity, which makes them generally too computationally costly. Here we demonstrate a new, process-based 1D layered model that captures the dynamic shifts in water potential gradients of 3D RSA under different soil moisture conditions: the RSA stencil. Using root systems calibrated to the rooting profiles of four plant functional types (PFT) of the Community Land Model, we show that the RSA stencil predicts plant water potentials to within 2% of the outputs of a full 3D model, under the same assumptions on soil moisture heterogeneity, despite its trivial computational cost, resulting in improved predictions of water uptake and soil moisture compared to a model without RSA in a transient simulation. Our results suggest that LSM predictions of soil moisture dynamics and dependent variables can be improved by the implementation of this model, calibrated for individual PFTs using field observations.
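The steady-state balance behind such a layered model can be sketched as a resistance network: each layer supplies water in proportion to its conductance and its potential difference with the root, and the root potential is set by transpiration demand. All numbers below are hypothetical, and the sketch is not the RSA-stencil algorithm itself.

    import numpy as np

    psi_soil = np.array([-0.2, -0.5, -0.9, -1.2])  # layer water potentials (MPa), hypothetical
    g = np.array([4.0, 2.0, 1.0, 0.5])             # root-soil conductance per layer, hypothetical
    E = 2.0                                        # transpiration demand (flux units)

    # Steady state: sum_i g_i * (psi_soil_i - psi_root) = E, solved for psi_root.
    psi_root = (g @ psi_soil - E) / g.sum()
    uptake = g * (psi_soil - psi_root)             # negative values = efflux into dry layers

    print("root water potential:", psi_root)
    print("uptake by layer:", uptake, "sum =", uptake.sum())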
Experimental evaluation of the impact of packet capturing tools for web services.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Choe, Yung Ryn; Mohapatra, Prasant; Chuah, Chen-Nee
Network measurement is a discipline that provides the techniques to collect data that are fundamental to many branches of computer science. While many capturing tools and comparisons have been made available in the literature and elsewhere, the impact of these packet capturing tools on existing processes has not been thoroughly studied. While not a concern for collection methods in which dedicated servers are used, many usage scenarios of packet capturing now require the packet capturing tool to run concurrently with operational processes. In this work we perform experimental evaluations of the performance impact that packet capturing processes have on web-based services; in particular, we observe the impact on web servers. We find that packet capturing processes indeed impact the performance of web servers, but on a multi-core system the impact varies depending on whether the packet capturing and web hosting processes are co-located or not. In addition, the architecture and behavior of the web server and process scheduling is coupled with the behavior of the packet capturing process, which in turn also affects the web server's performance.
Gustavsson, Susanne; Gremyr, Ida; Kenne Sarenmalm, Elisabeth
2016-03-01
The aim of this article was to explore whether current quality dimensions for health care services are sufficient to capture how parents perceive and contribute to the quality of health care. New quality improvement initiatives that actively involve patients must be examined with a critical view on established quality dimensions to ensure that these measures support patient involvement. The study used a qualitative and descriptive design, based on interviews with parents participating in two experience-based co-design projects in a Swedish hospital, and included qualitative content analysis of data from 12 parent interviews in paediatric care. Health care professionals often overemphasize their own significance for value creation in care processes and underappreciate parents' ability to influence and contribute to better quality. However, quality is not based solely on how professionals accomplish their task, but is co-created by health care professionals and parents. Consequently, assessment of quality outcomes also must include parents' ability and context. This paper questions current models of quality dimensions in health care, and suggests additional sub-dimensions, such as family quality and involvement quality. This paper underscores the importance of involving parents in health care improvements with health care professionals to capture as many dimensions of quality as possible. © 2015 John Wiley & Sons Ltd.
Investigation of Episodic Flow from Unsaturated Porous Media into a Macropore
DOE Office of Scientific and Technical Information (OSTI.GOV)
R. K. Podgorney; J. P. Fairley
The recent literature contains numerous observations of episodic or intermittent flow in unsaturated flow systems under both constant flux and ponded boundary conditions. Flow systems composed of heterogeneous porous media, as well as discrete fracture networks, have been cited as examples of systems that can exhibit episodic flow. Episodic outflow events are significant because relatively large volumes of water can move rapidly through an unsaturated system, carrying water and contaminants to depth greatly ahead of a wetting front predicted by a one-dimensional, gravity-driven diffusive infiltration model. In this study, we model the behavior of water flow through a sand column underlain by an impermeable-walled macropore. Relative permeability and capillary pressure relationships were developed that capture the complex interrelationships between the macropore and the overlying porous media that control flow out of the system. The potential for episodic flow is assessed and compared to results of conventional modeling approaches and experimental data from the literature. Model results using coupled matrix-macropore relative permeability and capillary pressure relationships capture the behavior observed in laboratory experiments remarkably well, while simulations using conventional relative permeability and capillary pressure functions fail to capture some of the observed flow dynamics. Capturing the rapid downward movement of water suggests that the matrix-macropore capillary pressure and relative permeability functions developed have the potential to improve descriptions of flow and transport processes in heterogeneous, variably saturated media.
Analytical Chemistry: A Literary Approach.
ERIC Educational Resources Information Center
Lucy, Charles A.
2000-01-01
Provides an anthology of references to descriptions of analytical chemistry techniques from history, popular fiction, and film which can be used to capture student interest and frame discussions of chemical techniques. (WRM)
Climate Science Performance, Data and Productivity on Titan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mayer, Benjamin W; Worley, Patrick H; Gaddis, Abigail L
2015-01-01
Climate science models are flagship codes for the largest high performance computing (HPC) resources, both in visibility, with the newly launched Department of Energy (DOE) Accelerated Climate Model for Energy (ACME) effort, and in terms of significant fractions of system usage. The performance of the DOE ACME model is captured with application-level timers and examined through a sizeable run archive. Performance and variability of compute, queue time, and ancillary services are examined. As climate science advances in the use of HPC resources, there has been an increase in the human and data systems required to achieve program goals. A description of current workflow processes (hardware, software, human) and planned automation of the workflow, along with historical and projected data-in-motion and data-at-rest usage, are detailed. The combination of these two topics motivates a description of future systems requirements for DOE climate modeling efforts, focusing on the growth of data storage and the network and disk bandwidth required to handle data at an acceptable rate.
Comparative A/B testing a mobile data acquisition app for hydrogeochemistry
NASA Astrophysics Data System (ADS)
Klump, Jens; Golodoniuc, Pavel; Reid, Nathan; Gray, David; Ross, Shawn
2015-04-01
In the context of a larger study on the Capricorn Orogen of Western Australia, the CSIRO Mineral Discovery Program is conducting a regional study of the hydrogeochemistry of water from agricultural and other bores. Over time, the sampling process was standardised and a form for capturing metadata and data from initial measurements was developed. In 2014 an extensive technology review was conducted with the aim of automating the field data acquisition process. A prototype hydrogeochemistry data capture form was implemented as a mobile application for Windows Mobile devices. This version of the software was a standalone application with an interface to export data as CSV files. A second candidate version of the hydrogeochemistry data capture form was implemented as an Android mobile application in the FAIMS framework. FAIMS is a framework for mobile field data capture, originally developed at the University of New South Wales for archaeological field data collection. A benefit of the FAIMS application was the ability to associate photographs taken with the device's embedded camera with the captured data. FAIMS also allows networked collaboration within a field team, using the mobile applications as asynchronous rich clients. The network infrastructure can be installed in the field ("FAIMS in a Box") to supply data synchronisation, backup and transfer. This aspect will be tested in the next field season. Having two data capture applications available allowed us to conduct an A/B test, comparing two different implementations for the same task. Both applications were trialled in the field by different field crews, and user feedback will be used to improve the usability of the app for the next field season. A key learning was that the ergonomics of the app is of paramount importance in gaining user acceptance. This extends from general fit with the standard procedures used in the field during data acquisition to self-descriptive and intuitive user interface features aligned with the workflows and sequences of actions performed by a user, which ultimately supports a Collect-As-You-Go approach. In the Australian outback, issues such as absence of network connectivity, heat and sun glare may challenge the utility of tablet-based applications in the field. Due to limitations of tablet use in the field, we also consider the use of smart pens for data capture. A smart pen application based on Anoto forms and software by Formidable will be tested in the next field season.
A model for process representation and synthesis. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Thomas, R. H.
1971-01-01
The problem of representing groups of loosely connected processes is investigated, and a model for process representation useful for synthesizing complex patterns of process behavior is developed. There are three parts. The first part isolates the concepts which form the basis for the process representation model by focusing on questions such as: What is a process? What is an event? Should one process be able to restrict the capabilities of another? The second part develops a model for process representation which captures the concepts and intuitions developed in the first part. The model presented is able to describe both the internal structure of individual processes and the interface structure between interacting processes. Much of the model's descriptive power derives from its use of the notion of process state as a vehicle for relating the internal and external aspects of process behavior. The third part demonstrates by example that the model for process representation is a useful one for synthesizing process behavior patterns. In it the model is used to define a variety of interesting process behavior patterns. The dissertation closes by suggesting how the model could be used as a semantic base for a very potent language extension facility.
NASA Astrophysics Data System (ADS)
Li, Lesheng; Giokas, Paul G.; Kanai, Yosuke; Moran, Andrew M.
2014-06-01
Kinetic models based on Fermi's Golden Rule are commonly employed to understand photoinduced electron transfer dynamics at molecule-semiconductor interfaces. Implicit in such second-order perturbative descriptions is the assumption that nuclear relaxation of the photoexcited electron donor is fast compared to electron injection into the semiconductor. This approximation breaks down in systems where electron transfer transitions occur on the 100-fs time scale. Here, we present a fourth-order perturbative model that captures the interplay between time-coincident electron transfer and nuclear relaxation processes initiated by light absorption. The model consists of a fairly small number of parameters, which can be derived from standard spectroscopic measurements (e.g., linear absorbance, fluorescence) and/or first-principles electronic structure calculations. Insights provided by the model are illustrated for a two-level donor molecule coupled to both (i) a single acceptor level and (ii) a density of states (DOS) calculated for TiO2 using a first-principles electronic structure theory. These numerical calculations show that second-order kinetic theories fail to capture basic physical effects when the DOS exhibits narrow maxima near the energy of the molecular excited state. Overall, we conclude that the present fourth-order rate formula constitutes a rigorous and intuitive framework for understanding photoinduced electron transfer dynamics that occur on the 100-fs time scale.
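The second-order (Golden Rule) baseline that the paper goes beyond can be written down in a few lines: a rate proportional to |V|^2 times the acceptor DOS sampled around the donor level. The Gaussian DOS, the lineshape, and all parameters below are hypothetical illustrations.

    import numpy as np

    HBAR = 0.6582  # eV fs

    E = np.linspace(-2.0, 2.0, 2001)
    dE = E[1] - E[0]
    dos = np.exp(-0.5 * ((E - 0.3) / 0.4) ** 2)  # hypothetical DOS with a narrow maximum

    def golden_rule_rate(e_donor, coupling=0.02, sigma=0.05):
        """Second-order rate: (2*pi/hbar) * |V|^2 * DOS broadened about the donor level."""
        line = np.exp(-0.5 * ((E - e_donor) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))
        return (2.0 * np.pi / HBAR) * coupling ** 2 * np.sum(line * dos) * dE

    # The rate collapses as the donor level moves off the DOS maximum -- the regime
    # where, per the abstract, a second-order kinetic description becomes unreliable.
    for e_d in (0.3, 0.0, -0.3):
        print(f"E_donor = {e_d:+.1f} eV  ->  k = {golden_rule_rate(e_d):.2e} 1/fs")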
DOE Office of Scientific and Technical Information (OSTI.GOV)
Leib, Thomas; Cole, Dan
In late September 2014 development of the Lake Charles Clean Energy (LCCE) Plant was abandoned, resulting in termination of the Lake Charles Carbon Capture and Sequestration (CCS) Project, which was a subset of the LCCE Plant. As a result, the project was only funded through Phase 2A (Design) and did not enter Phase 2B (Construction) or Phase 2C (Operations). This report was prepared relying on information prepared and provided by engineering companies which were engaged by Leucadia Energy, LLC to prepare or review Front End Engineering and Design (FEED) for the Lake Charles Clean Energy Project, which includes the Carbon Capture and Sequestration (CCS) Project in Lake Charles, Louisiana. The Lake Charles Carbon Capture and Sequestration (CCS) Project was to be a large-scale industrial CCS project intended to demonstrate advanced technologies that capture and sequester carbon dioxide (CO2) emissions from industrial sources into underground formations. The scope of work was divided into two discrete sections: 1) Capture and Compression, prepared by the Recipient, Leucadia Energy, LLC; and 2) Transport and Sequestration, prepared by sub-Recipient Denbury Onshore, LLC. Capture and Compression - The Lake Charles CCS Project Final Technical Report describes the systems and equipment that would be necessary to capture CO2 generated in a large industrial gasification process and sequester the CO2 into underground formations. The purpose of each system is defined along with a description of its equipment and operation. Criteria for selection of major equipment are provided and ancillary utilities necessary for safe and reliable operation in compliance with environmental regulations are described. Construction considerations are described including a general arrangement of the CCS process units within the overall gasification project. A cost estimate is provided, delineated by system area with cost breakdown showing equipment, piping and materials, construction labor, engineering, and other costs. The CCS Project Final Technical Report is based on a Front End Engineering and Design (FEED) study prepared by SK E&C, completed in [June] 2014. Subsequently, Fluor Enterprises completed a FEED validation study in mid-September 2014. The design analyses indicated that the FEED package was sufficient and as expected. However, Fluor considered the construction risk based on a stick-build approach to be unacceptable, but construction risk would be substantially mitigated through utilization of modular construction where site labor and schedule uncertainty is minimized. Fluor's estimate of the overall EPC project cost utilizing the revised construction plan was comparable to SK E&C's value after reflecting Fluor's assessment of project scope and risk characteristics. Development was halted upon conclusion of Phase 2A FEED and the project was not constructed. Transport and Sequestration - The overall objective of the pipeline project was to construct a pipeline to transport captured CO2 from the Lake Charles Clean Energy project to the existing Denbury Green Line and then to the Hastings Field in Southeast Texas to demonstrate effective geologic sequestration of captured CO2 through commercial EOR operations. The overall objective of the MVA portion of the project was to demonstrate effective geologic sequestration of captured CO2 through commercial Enhanced Oil Recovery (EOR) operations in order to evaluate costs, operational processes and technical performance.
The DOE target for the project was to capture and implement a research MVA program to demonstrate the sequestration through EOR of approximately one million tons of CO2 per year as an integral component of commercial operations.
Theories of Simplification and Scaling of Spatially Distributed Processes. Chapter 12
NASA Technical Reports Server (NTRS)
Levin, Simon A.; Pacala, Stephen W.
1997-01-01
The problem of scaling is at the heart of ecological theory, the essence of understanding and of the development of a predictive capability. The description of any system depends on the spatial, temporal, and organizational perspective chosen; hence it is essential to understand not only how patterns and dynamics vary with scale, but also how patterns at one scale are manifestations of processes operating at other scales. Evolution has shaped the characteristics of species in ways that result in scale displacement: Each species experiences the environment at its own unique set of spatial and temporal scales and interfaces the biota through unique assemblages of phenotypes. In this way, coexistence becomes possible, and biodiversity is enhanced. By averaging over space, time, and biological interactions, a genotype filters variation at fine scales and selects the arena in which it will face the vicissitudes of nature. Variation at finer scales is then noise, of minor importance to the survival and dynamics of the species, and consequently of minor importance in any attempt at description. In attempting to model ecological interactions in space, contributors throughout this book have struggled with a trade-off between simplification and "realistic" complexity and detail. Although the challenge of simplification is widely recognized in ecology, less appreciated is the intertwining of scaling questions and scaling laws with the process of simplification. In the context of this chapter simplification will in general mean the use of spatial or ensemble means and low-order moments to capture more detailed interactions by integrating over given areas. In this way, one can derive descriptions of the system at different spatial scales, which provides the essentials for the extraction of scaling laws by examination of how system properties vary with scale.
76 FR 53763 - Immigration Benefits Business Transformation, Increment I
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-29
...The Department of Homeland Security (DHS) is amending its regulations to enable U.S. Citizenship and Immigration Services (USCIS) to migrate from a paper file-based, non-integrated systems environment to an electronic customer-focused, centralized case management environment for benefit processing. This transformation process will allow USCIS to streamline benefit processing, eliminate the capture and processing of redundant data, and reduce the number of and automate its forms. This transformation process will be a phased multi-year initiative to restructure USCIS business processes and related information technology systems. DHS is removing references to form numbers, form titles, expired regulatory provisions, and descriptions of internal procedures, many of which will change during transformation. DHS is also finalizing interim rules that permitted submission of benefit requests with an electronic signature when such requests are submitted in an electronic format rather than on a paper form and that removed references to filing locations for immigration benefits. In addition, in this rule DHS is publishing the final rule for six other interim rules published during the past several years, most of which received no public comments.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Szoka de Valladares, M.R.; Mack, S.
The DOE Hydrogen Program needs to develop criteria as part of a systematic evaluation process for proposal identification, evaluation and selection. The H Scan component of this process provides a framework in which a project proposer can fully describe their candidate technology system and its components. The H Scan complements traditional methods of capturing cost and technical information. It consists of a special set of survey forms designed to elicit information so expert reviewers can assess the proposal relative to DOE specified selection criteria. The Analytic Hierarchy Process (AHP) component of the decision process assembles the management-defined evaluation and selection criteria into a coherent multi-level decision construct by which projects can be evaluated in pair-wise comparisons. The AHP model will reflect management's objectives and it will assist in the ranking of individual projects based on the extent to which each contributes to management's objectives. This paper contains a detailed description of the products and activities associated with the planning and evaluation process: the objectives or criteria; the H Scan; and the Analytic Hierarchy Process (AHP).
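The AHP step reduces to an eigenvector computation on a reciprocal pairwise-comparison matrix. The three criteria and the judgments below are invented for illustration.

    import numpy as np

    # Hypothetical pairwise comparisons of three criteria on Saaty's 1-9 scale;
    # entry (i, j) is the judged importance of criterion i relative to criterion j.
    A = np.array([[1.0, 3.0, 0.5],
                  [1.0 / 3.0, 1.0, 0.25],
                  [2.0, 4.0, 1.0]])

    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)
    w = vecs[:, k].real
    w = w / w.sum()                      # AHP priorities: normalized principal eigenvector
    ci = (vals[k].real - 3) / (3 - 1)    # consistency index; small values mean coherent judgments

    print("criterion weights:", np.round(w, 3))
    print("consistency index:", round(ci, 3))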
Gamma-widths, lifetimes and fluctuations in the nuclear quasi-continuum
NASA Astrophysics Data System (ADS)
Guttormsen, M.; Larsen, A. C.; Midtbø, J. E.; Crespo Campo, L.; Görgen, A.; Ingeberg, V. W.; Renstrøm, T.; Siem, S.; Tveten, G. M.; Zeiser, F.; Kirsch, L. E.
2018-05-01
Statistical γ-decay from highly excited states is determined by the nuclear level density (NLD) and the γ-ray strength function (γSF). These average quantities have been measured for several nuclei using the Oslo method. For the first time, we exploit the NLD and γSF to evaluate the γ-width in the energy region below the neutron binding energy, often called the quasi-continuum region. The lifetimes of states in the quasi-continuum are important benchmarks for a theoretical description of nuclear structure and dynamics at high temperature. The lifetimes may also have an impact on reaction rates for the rapid neutron-capture process, now demonstrated to take place in neutron star mergers.
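Concretely, such evaluations rest on the standard statistical-model relation linking the average radiative width to the NLD and γSF; in a common form (conventions may differ in detail from the paper's):

    \langle \Gamma_\gamma(E_x, J, \pi) \rangle
      = \frac{1}{2\pi\,\rho(E_x, J, \pi)}
        \sum_{XL} \sum_{J_f, \pi_f} \int_0^{E_x}
        E_\gamma^{\,2L+1}\, f_{XL}(E_\gamma)\,
        \rho(E_x - E_\gamma, J_f, \pi_f)\, \mathrm{d}E_\gamma ,

where ρ is the nuclear level density and f_XL the strength function for transitions of type XL; the corresponding lifetime follows from τ = ħ / ⟨Γγ⟩.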
X-ray absorption spectra: Graphene, h-BN, and their alloy
NASA Astrophysics Data System (ADS)
Bhowmick, Somnath; Rusz, Jan; Eriksson, Olle
2013-04-01
Using first-principles density functional theory calculations, in conjunction with the Mahan-Nozières-de Dominicis theory, we calculate the x-ray absorption spectra of the alloys of graphene and monolayer hexagonal boron nitride on a Ni (111) substrate. The chemical neighborhood of the constituent atoms (B, C, and N) inside the alloy differs from that of the parent phases. In a systematic way, we capture the change in the K-edge spectral shape, depending on the chemical neighborhood of B, C, and N. Our work also reiterates the importance of the dynamical core-hole screening for a proper description of the x-ray absorption process in sp2-bonded layered materials.
NASA Astrophysics Data System (ADS)
González, Diego Luis; Pimpinelli, Alberto; Einstein, T. L.
2017-07-01
We study the effect of hindered aggregation on the island formation process in a one- (1D) and two-dimensional (2D) point-island model for epitaxial growth with arbitrary critical nucleus size i. In our model, the attachment of monomers to preexisting islands is hindered by an additional attachment barrier, characterized by a length l_a. For l_a = 0 the islands behave as perfect sinks, while for l_a → ∞ they behave as reflecting boundaries. For intermediate values of l_a, the system exhibits a crossover between two different kinds of processes, diffusion-limited aggregation and attachment-limited aggregation. We calculate the growth exponents of the density of islands and monomers for the low-coverage and aggregation regimes. The capture-zone (CZ) distributions are also calculated for different values of i and l_a. In order to obtain a good spatial description of the nucleation process, we propose a fragmentation model, which is based on an approximate description of nucleation inside the gaps for 1D and the CZs for 2D. In both cases, nucleation is described by using two different physically rooted probabilities, which are related to the microscopic parameters of the model (i and l_a). We test our analytical model with extensive numerical simulations and previously established results. The proposed model describes excellently the statistical behavior of the system for arbitrary values of l_a and i = 1, 2, and 3.
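A minimal Monte Carlo version of the 1D point-island model with i = 1 and hindered attachment can be written directly from this description; the lattice size, rates, and the attachment probability p = 1/(1 + l_a) below are toy choices for illustration, not the paper's simulation setup.

    import random

    L, F, steps = 1000, 5e-6, 400000  # lattice sites, deposition rate per site, hop events
    l_a = 5.0
    p_att = 1.0 / (1.0 + l_a)         # hindered attachment; p_att -> 1 recovers perfect sinks

    monomers, islands = set(), set()
    for _ in range(steps):
        if random.random() < F * L:                     # deposition onto a random empty site
            s = random.randrange(L)
            if s not in islands and s not in monomers:
                monomers.add(s)
        if monomers:                                    # a random monomer attempts one hop
            m = random.choice(tuple(monomers))
            t = (m + random.choice((-1, 1))) % L
            if t in islands:
                if random.random() < p_att:             # capture at a point island
                    monomers.discard(m)
            elif t in monomers:                         # i = 1: two monomers nucleate an island
                monomers.discard(m)
                monomers.discard(t)
                islands.add(t)
            else:
                monomers.discard(m)
                monomers.add(t)

    print("islands:", len(islands), "monomers:", len(monomers))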
Information properties of morphologically complex words modulate brain activity during word reading.
Hakala, Tero; Hultén, Annika; Lehtonen, Minna; Lagus, Krista; Salmelin, Riitta
2018-06-01
Neuroimaging studies of the reading process point to functionally distinct stages in word recognition. Yet, current understanding of the operations linked to those various stages is mainly descriptive in nature. Approaches developed in the field of computational linguistics may offer a more quantitative approach for understanding brain dynamics. Our aim was to evaluate whether a statistical model of morphology, with well-defined computational principles, can capture the neural dynamics of reading, using the concept of surprisal from information theory as the common measure. The Morfessor model, created for unsupervised discovery of morphemes, is based on the minimum description length principle and attempts to find optimal units of representation for complex words. In a word recognition task, we correlated brain responses to word surprisal values derived from Morfessor and from other psycholinguistic variables that have been linked with various levels of linguistic abstraction. The magnetoencephalography data analysis focused on spatially, temporally and functionally distinct components of cortical activation observed in reading tasks. The early occipital and occipito-temporal responses were correlated with parameters relating to visual complexity and orthographic properties, whereas the later bilateral superior temporal activation was correlated with whole-word based and morphological models. The results show that the word processing costs estimated by the statistical Morfessor model are relevant for brain dynamics of reading during late processing stages.
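The surprisal measure used as the common currency above is straightforward to compute once a segmentation and morph probabilities are available. A toy sketch with a hypothetical morph lexicon (Morfessor itself would supply the segmentation and probabilities in the actual study):

```python
import math

# Word surprisal as the sum of morph surprisals under a unigram morph
# model: S(word) = -log2 P(word) = sum over morphs of -log2 P(morph).
# The lexicon and counts below are hypothetical placeholders.
morph_counts = {"rain": 120, "coat": 80, "s": 900, "un": 300, "happy": 150}
total = sum(morph_counts.values())

def surprisal(morphs):
    """Surprisal in bits of a word given its morph segmentation."""
    return sum(-math.log2(morph_counts[m] / total) for m in morphs)

print(f"rain+coat+s : {surprisal(['rain', 'coat', 's']):.2f} bits")
print(f"happy       : {surprisal(['happy']):.2f} bits")
```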
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yue, Peng; Gong, Jianya; Di, Liping
A geospatial catalogue service provides a network-based meta-information repository and interface for advertising and discovering shared geospatial data and services. Descriptive information (i.e., metadata) for geospatial data and services is structured and organized in catalogue services. The approaches currently available for searching and using that information are often inadequate. Semantic Web technologies show promise for better discovery methods by exploiting the underlying semantics. Such development needs special attention from the Cyberinfrastructure perspective, so that the traditional focus on discovery of and access to geospatial data can be expanded to support the increased demand for processing of geospatial information and discovery of knowledge. Semantic descriptions for geospatial data, services, and geoprocessing service chains are structured, organized, and registered through extending elements in the ebXML Registry Information Model (ebRIM) of a geospatial catalogue service, which follows the interface specifications of the Open Geospatial Consortium (OGC) Catalogue Services for the Web (CSW). The process models for geoprocessing service chains, as a type of geospatial knowledge, are captured, registered, and discoverable. Semantics-enhanced discovery for geospatial data, services/service chains, and process models is described. Semantic search middleware that can support virtual data product materialization is developed for the geospatial catalogue service. The creation of such a semantics-enhanced geospatial catalogue service is important in meeting the demands for geospatial information discovery and analysis in Cyberinfrastructure.
Bench-Scale Process for Low-Cost Carbon Dioxide (CO2) Capture Using a Phase-Changing Absorbent
DOE Office of Scientific and Technical Information (OSTI.GOV)
Westendorf, Tiffany; Caraher, Joel; Chen, Wei
2015-03-31
The objective of this project is to design and build a bench-scale process for a novel phase-changing aminosilicone-based CO2-capture solvent. The project will establish scalability and technical and economic feasibility of using a phase-changing CO2-capture absorbent for post-combustion capture of CO2 from coal-fired power plants with 90% capture efficiency and 95% CO2 purity at a cost of $40/tonne of CO2 captured by 2025 and a cost of <$10/tonne of CO2 captured by 2035. In the first budget period of this project, the bench-scale phase-changing CO2 capture process was designed using data and operating experience generated under a previous project (ARPA-E project DE-AR0000084). Sizing and specification of all major unit operations was completed, including detailed process and instrumentation diagrams. The system was designed to operate over a wide range of operating conditions to allow for exploration of the effect of process variables on CO2 capture performance.
The Astrophysical r-Process 50 Years after B²FH
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kratz, K.-L.; Pfeiffer, B.; Farouqi, K.
Since the historical papers by Burbidge et al. and Cameron 50 years ago, it is generally accepted that half of the chemical elements above Fe are formed in explosive stellar scenarios by a rapid neutron-capture process (the classical "r-process"). Already from their essential ideas, it became clear that a correct modelling of this nucleosynthesis process requires both the knowledge of various nuclear properties very far from stability and a detailed description of the astrophysical environments. However, it took about three decades until, in 1986, the first experimental nuclear-physics data on the neutron-magic r-isotopes ⁸⁰Zn and ¹³⁰Cd could be obtained, which act as key "waiting points" in the respective A ≈ 80 and 130 peaks of the Solar-System (SS) r-abundances (N_r,⊙). Since then, using steadily improved nuclear data, we have optimized our r-process calculations to reproduce the present observables of the isotopic N_r,⊙ "residuals", as well as the more recent elemental abundances in ultra-metal-poor, r-process-enriched halo stars. Concerning the latter observations, we support the basic idea about two different types of r-processes. Based on our many years' experience with the site-independent "waiting-point approach", we recently have extended our studies to fully dynamical network calculations for the most likely astrophysical r-process scenario, i.e. the high-entropy wind (HEW) of core-collapse type II supernovae (SN II). Again, an excellent reproduction of all observables for the "main" r-process has been achieved. However, a major difference is the nucleosynthesis origin of the lighter heavy elements in the 29 ≤ Z ≤ 45 mass region. Here, the HEW model predicts, instead of a "weak" neutron-capture r-process component, a primary rapid charged-particle process. This may explain the recent observations of a non-correlation of these elements with the heavier "main" r-process elements.
Jeong, Kyeong-Min; Kim, Hee-Seung; Hong, Sung-In; Lee, Sung-Keun; Jo, Na-Young; Kim, Yong-Soo; Lim, Hong-Gi; Park, Jae-Hyeung
2012-10-08
Speed enhancement of integral imaging based incoherent Fourier hologram capture using a graphics processing unit is reported. The integral imaging based method enables exact hologram capture of real-existing three-dimensional objects under regular incoherent illumination. In our implementation, we apply a parallel computation scheme using the graphics processing unit, accelerating the processing speed. Using the enhanced speed of hologram capture, we also implement a pseudo real-time hologram capture and optical reconstruction system. The overall operation speed is measured to be 1 frame per second.
Implementation of a national anti-tuberculosis drug resistance survey in Tanzania
Chonde, Timothy M; Doulla, Basra; van Leth, Frank; Mfinanga, Sayoki GM; Range, Nyagosya; Lwilla, Fred; Mfaume, Saidi M; van Deun, Armand; Zignol, Matteo; Cobelens, Frank G; Egwaga, Saidi M
2008-01-01
Background A drug resistance survey is an essential public health management tool for evaluating and improving the performance of National Tuberculosis control programmes. The current manuscript describes the implementation of the first national drug resistance survey in Tanzania. Methods Description of the implementation process of a national anti-tuberculosis drug resistance survey in Tanzania, in relation to the study protocol and Standard Operating Procedures. Results Factors contributing positively to the implementation of the survey were a continuous commitment of the key stakeholders, the existence of a well organized National Tuberculosis Programme, and a detailed design of cluster-specific arrangements for rapid sputum transportation. Factors contributing negatively to the implementation were a long delay between training and actual survey activities, limited monitoring of activities, and an unclear design of the data capture forms leading to difficulties in form-filling. Conclusion Careful preparation of the survey, timing of planned activities, a strong emphasis on data capture tools and data management, and timely supervision are essential for a proper implementation of a national drug resistance survey. PMID:19116022
Mathematical modelling of the active hearing process in mosquitoes
Avitabile, D.; Homer, M.; Champneys, A. R.; Jackson, J. C.; Robert, D.
2010-01-01
Insects have evolved diverse and delicate morphological structures in order to capture the inherently low energy of a propagating sound wave. In mosquitoes, the capture of acoustic energy and its transduction into neuronal signals are assisted by the active mechanical participation of the scolopidia. We propose a simple microscopic mechanistic model of the active amplification in the mosquito species Toxorhynchites brevipalpis. The model is based on the description of the antenna as a forced-damped oscillator coupled to a set of active threads (ensembles of scolopidia) that provide an impulsive force when they twitch. This twitching is in turn controlled by channels that are opened and closed if the antennal oscillation reaches a critical amplitude. The model matches recent experiments both qualitatively and quantitatively: spontaneous oscillations, nonlinear amplification, hysteresis, 2:1 resonances, frequency response and gain loss owing to hypoxia. The numerical simulations presented here also generate new hypotheses. In particular, the model seems to indicate that scolopidia located towards the tip of Johnston's organ are responsible for the entrainment of the other scolopidia and that they give the largest contribution to the mechanical amplification. PMID:19447819
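A rough numerical sketch of the core mechanism, a forced-damped oscillator that receives a one-off velocity impulse whenever its displacement exceeds a critical amplitude; every parameter value here is a hypothetical placeholder, not taken from the paper:

```python
import math

# Forced-damped oscillator with threshold-triggered "twitch" impulses,
# a crude stand-in for channel-gated scolopidial forcing. Placeholders:
omega0 = 2 * math.pi * 400.0              # natural angular frequency (rad/s)
gamma = 300.0                             # damping rate (1/s)
F0, omega_d = 5e-3, 2 * math.pi * 380.0   # sound forcing amplitude/frequency
x_crit, dv = 1e-5, 5e-3                   # twitch threshold and velocity kick
dt, steps = 1e-6, 50_000

x, v, armed = 0.0, 0.0, True
for n in range(steps):
    t = n * dt
    a = F0 * math.sin(omega_d * t) - 2 * gamma * v - omega0 ** 2 * x
    v += a * dt
    x += v * dt
    if armed and abs(x) > x_crit:          # channels open: threads twitch once
        v += dv * math.copysign(1.0, v)    # impulse along the motion (amplifies)
        armed = False
    elif abs(x) < 0.5 * x_crit:            # re-arm after the swing relaxes
        armed = True

print(f"final displacement ~ {abs(x):.2e} m")
```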
Li, Kangkang; Yu, Hai; Feron, Paul; Tade, Moses; Wardhaugh, Leigh
2015-08-18
Using a rate-based model, we assessed the technical feasibility and energy performance of an advanced aqueous-ammonia-based postcombustion capture process integrated with a coal-fired power station. The capture process consists of three identical process trains in parallel, each containing a CO2 capture unit, an NH3 recycling unit, a water separation unit, and a CO2 compressor. A sensitivity study of important parameters, such as NH3 concentration, lean CO2 loading, and stripper pressure, was performed to minimize the energy consumption involved in the CO2 capture process. Process modifications of the rich-split process and the interheating process were investigated to further reduce the solvent regeneration energy. The integrated capture system was then evaluated in terms of the mass balance and the energy consumption of each unit. The results show that our advanced ammonia process is technically feasible and energy-competitive, with a low net power-plant efficiency penalty of 7.7%.
Hasenauer, J; Wolf, V; Kazeroonian, A; Theis, F J
2014-09-01
The time-evolution of continuous-time discrete-state biochemical processes is governed by the Chemical Master Equation (CME), which describes the probability of the molecular counts of each chemical species. As the corresponding number of discrete states is, for most processes, large, a direct numerical simulation of the CME is in general infeasible. In this paper we introduce the method of conditional moments (MCM), a novel approximation method for the solution of the CME. The MCM employs a discrete stochastic description for low-copy number species and a moment-based description for medium/high-copy number species. The moments of the medium/high-copy number species are conditioned on the state of the low abundance species, which allows us to capture complex correlation structures arising, e.g., for multi-attractor and oscillatory systems. We prove that the MCM provides a generalization of previous approximations of the CME based on hybrid modeling and moment-based methods. Furthermore, it improves upon these existing methods, as we illustrate using a model for the dynamics of stochastic single-gene expression. This application example shows that due to the more general structure, the MCM allows for the approximation of multi-modal distributions.
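For a concrete flavor of the conditional-moment construction, consider a two-state gene (low-copy species: promoter off/on) driving mRNA production. Evolving the state probabilities together with the partial means s_k = E[m; state k] gives the conditional means s_k/p_k; for this linear system the first-order equations close exactly, whereas nonlinear propensities would require the moment-closure machinery of the paper. Rates below are hypothetical:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Conditional-moment sketch for a two-state gene driving mRNA.
# The promoter keeps a discrete description via p0, p1; the mRNA is
# summarized by partial means s0 = E[m; off], s1 = E[m; on].
k_on, k_off = 0.05, 0.02    # promoter switching rates (hypothetical)
k_r, g = 10.0, 1.0          # transcription (on state only) and mRNA decay

def rhs(t, y):
    p0, p1, s0, s1 = y
    dp0 = -k_on * p0 + k_off * p1
    dp1 = k_on * p0 - k_off * p1
    ds0 = -g * s0 - k_on * s0 + k_off * s1
    ds1 = k_r * p1 - g * s1 + k_on * s0 - k_off * s1
    return [dp0, dp1, ds0, ds1]

sol = solve_ivp(rhs, (0.0, 200.0), [1.0, 0.0, 0.0, 0.0], rtol=1e-8)
p0, p1, s0, s1 = sol.y[:, -1]
print(f"P(on) = {p1:.3f}, E[m|off] = {s0 / p0:.2f}, E[m|on] = {s1 / p1:.2f}")
```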
Exogenous (automatic) attention to emotional stimuli: a review.
Carretié, Luis
2014-12-01
Current knowledge on the architecture of exogenous attention (also called automatic, bottom-up, or stimulus-driven attention, among other terms) has been mainly obtained from studies employing neutral, anodyne stimuli. Since, from an evolutionary perspective, exogenous attention can be understood as an adaptive tool for rapidly detecting salient events, reorienting processing resources to them, and enhancing processing mechanisms, emotional events (which are, by definition, salient for the individual) would seem crucial to a comprehensive understanding of this process. This review, focusing on the visual modality, describes 55 experiments in which both emotional and neutral irrelevant distractors are presented at the same time as ongoing task targets. Qualitative and, when possible, meta-analytic descriptions of results are provided. The most conspicuous result is that, as confirmed by behavioral and/or neural indices, emotional distractors capture exogenous attention to a significantly greater extent than do neutral distractors. The modulatory effects of the nature of distractors capturing attention, of the ongoing task characteristics, and of individual differences, previously proposed as mediating factors, are also described. Additionally, studies reviewed here provide temporal and spatial information (partially absent in traditional cognitive models) on the neural basis of preattention/evaluation, reorienting, and sensory amplification, the main subprocesses involved in exogenous attention. A model integrating these different levels of information is proposed. The present review, which reveals that there are several key issues for which experimental data are surprisingly scarce, confirms the relevance of including emotional distractors in studies on exogenous attention.
RUIZ-RAMOS, MARGARITA; MÍNGUEZ, M. INÉS
2006-01-01
• Background Plant structural (i.e. architectural) models explicitly describe plant morphology by providing detailed descriptions of the display of leaf and stem surfaces within heterogeneous canopies and thus provide the opportunity for modelling the functioning of plant organs in their microenvironments. The outcome is a class of structural–functional crop models that combines advantages of current structural and process approaches to crop modelling. ALAMEDA is such a model. • Methods The formalism of Lindenmayer systems (L-systems) was chosen for the development of a structural model of the faba bean canopy, providing both numerical and dynamic graphical outputs. It was parameterized according to the results obtained through detailed morphological and phenological descriptions that capture the detailed geometry and topology of the crop. The analysis distinguishes between relationships of general application for all sowing dates and stem ranks and others valid only for all stems of a single crop cycle. • Results and Conclusions The results reveal that in faba bean, structural parameterization valid for the entire plant may be drawn from a single stem. ALAMEDA was formed by linking the structural model to the growth model ‘Simulation d'Allongement des Feuilles’ (SAF) with the ability to simulate approx. 3500 crop organs and components of a group of nine plants. Model performance was verified for organ length, plant height and leaf area. The L-system formalism was able to capture the complex architecture of canopy leaf area of this indeterminate crop and, with the growth relationships, generate a 3D dynamic crop simulation. Future development and improvement of the model are discussed. PMID:16390842
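The L-system formalism itself is compact: a set of production rules applied in parallel to every symbol of a string. A toy sketch with invented rules (not the published ALAMEDA parameterization), where A is an apex, I an internode, L a leaf, and brackets delimit branches:

```python
# Tiny L-system sketch: parallel rewriting of a bracketed string, the
# same formalism ALAMEDA uses for the faba bean canopy. The rule below
# is an invented placeholder: each step, the apex lays down an
# internode plus a leaf-bearing branch and persists.
rules = {"A": "I[L]A"}

def rewrite(axiom, steps):
    s = axiom
    for _ in range(steps):
        s = "".join(rules.get(ch, ch) for ch in s)   # parallel substitution
    return s

for n in range(4):
    print(n, rewrite("A", n))
# 0 A
# 1 I[L]A
# 2 I[L]I[L]A
# 3 I[L]I[L]I[L]A
```

A graphical interpreter (turtle graphics) would then map I, L, and the brackets to geometry, which is how the numerical string output becomes a 3D canopy.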
System and process for capture of acid gasses at elevated pressure from gaseous process streams
Heldebrant, David J.; Koech, Phillip K.; Linehan, John C.; Rainbolt, James E.; Bearden, Mark D.; Zheng, Feng
2016-09-06
A system, method, and material that enables the pressure-activated reversible chemical capture of acid gasses such as CO2 from gas volumes such as streams, flows, or any other volume. Once the acid gas is chemically captured, the resulting product, typically a zwitterionic salt, can be subjected to a reduced pressure, whereupon it will release the captured acid gas and the capture material will be regenerated. The invention includes this process as well as the materials and systems for carrying out and enabling this process.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miebach, Barbara; McDuffie, Dwayne; Spiry, Irina
The objective of this project is to design and build a bench-scale process for a novel phase-changing CO 2 capture solvent. The project will establish scalability and technical and economic feasibility of using a phase-changing CO 2 capture absorbent for post-combustion capture of CO 2 from coal-fired power plants with 90% capture efficiency and 95% CO 2 purity at a cost of $40/tonne of CO 2 captured by 2025 and a cost of <$10/tonne of CO 2 captured by 2035. This report presents system and economic analysis for a process that uses a phase-changing aminosilicone solvent to remove CO 2 from pulverized coal (PC) power plant flue gas. The aminosilicone solvent is a pure 1,3-bis(3-aminopropyl)-1,1,3,3-tetramethyldisiloxane (GAP-0). Performance of the phase-changing aminosilicone technology is compared to that of a conventional carbon capture system using aqueous monoethanolamine (MEA). This analysis demonstrates that the aminosilicone process has significant advantages relative to an MEA-based system. The first-year CO 2 removal cost for the phase-changing CO 2 capture process is $52.1/tonne, compared to $66.4/tonne for the aqueous amine process. The phase-changing CO 2 capture process is less costly than MEA because of advantageous solvent properties that include higher working capacity, lower corrosivity, lower vapor pressure, and lower heat capacity. The phase-changing aminosilicone process has approximately 32% lower equipment capital cost compared to that of the aqueous amine process. However, this solvent is susceptible to thermal degradation at CSTR desorber operating temperatures, which could add as much as $88/tonne to the CO 2 capture cost associated with solvent makeup. Future work is focused on mitigating this critical risk by developing an advanced low-temperature desorber that can deliver comparable desorption performance and significantly reduced thermal degradation rate.
NASA Astrophysics Data System (ADS)
Leung, Juliana Y.; Srinivasan, Sanjay
2016-09-01
Modeling transport processes at large scale requires proper scale-up of subsurface heterogeneity and an understanding of its interaction with the underlying transport mechanisms. A technique based on volume averaging is applied to quantitatively assess the scaling characteristics of the effective mass transfer coefficient in heterogeneous reservoir models. The effective mass transfer coefficient represents the combined contribution from diffusion and dispersion to the transport of non-reactive solute particles within a fluid phase. Although treatment of transport problems with the volume averaging technique has been published in the past, application to geological systems exhibiting realistic spatial variability remains a challenge. Previously, the authors developed a new procedure where results from a fine-scale numerical flow simulation reflecting the full physics of the transport process, albeit over a sub-volume of the reservoir, are integrated with the volume averaging technique to provide an effective description of transport properties. The procedure is extended such that spatial averaging is performed at the local-heterogeneity scale. In this paper, the transport of a passive (non-reactive) solute is simulated on multiple reservoir models exhibiting different patterns of heterogeneities, and the scaling behavior of the effective mass transfer coefficient (K_eff) is examined and compared. One such set of models exhibits power-law (fractal) characteristics, and the variability of dispersion and K_eff with scale is in good agreement with analytical expressions described in the literature. This work offers an insight into the impacts of heterogeneity on the scaling of effective transport parameters. A key finding is that spatial heterogeneity models with similar univariate and bivariate statistics may exhibit different scaling characteristics because of the influence of higher-order statistics. More mixing is observed in the channelized models with higher-order continuity. It reinforces the notion that the flow response is influenced by the higher-order statistical description of heterogeneity. An important implication is that when scaling up transport response from lab-scale results to the field scale, it is necessary to account for the scale-up of heterogeneity. Since the characteristics of higher-order multivariate distributions and large-scale heterogeneity are typically not captured in small-scale experiments, a reservoir modeling framework that captures the uncertainty in heterogeneity description should be adopted.
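A minimal illustration of why effective transport coefficients can be scale dependent: passive particles advected through a perfectly stratified velocity field plus local diffusion show an apparent dispersion coefficient var(x)/(2t) that keeps growing with travel time, so no single local value describes all scales. Parameters are illustrative only, not taken from the paper:

```python
import numpy as np

# Passive particles in a perfectly stratified flow: each particle keeps
# the velocity of its stratum (no exchange between strata) and diffuses
# locally. var(x)/(2t) grows with time instead of settling to a constant.
rng = np.random.default_rng(0)
n, D, dt = 5_000, 1e-3, 0.5
v = rng.normal(1.0, 0.3, size=n)   # one velocity sample per stratum
x = np.zeros(n)
t = 0.0
for checkpoint in (10.0, 100.0, 1000.0):
    while t < checkpoint:
        x += v * dt + np.sqrt(2 * D * dt) * rng.normal(size=n)
        t += dt
    print(f"t = {t:6.1f}: apparent D = {x.var() / (2 * t):8.3f}")
```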
Uy, Raymonde Charles Y; Kury, Fabricio P; Fontelo, Paul A
2015-01-01
The standard of safe medication practice requires strict observance of the five rights of medication administration: the right patient, drug, time, dose, and route. Despite adherence to these guidelines, medication errors remain a public health concern that has generated health policies and hospital processes that leverage automation and computerization to reduce these errors. Bar code, RFID, biometrics, and pharmacy automation technologies have been demonstrated in the literature to decrease the incidence of medication errors by minimizing the human factors involved in the process. Despite evidence suggesting the effectiveness of these technologies, adoption rates and trends vary across hospital systems. The objective of this study is to examine the state and adoption trends of automatic identification and data capture (AIDC) methods and pharmacy automation technologies in U.S. hospitals. A retrospective descriptive analysis of survey data from the HIMSS Analytics® Database was done, demonstrating optimistic growth in the adoption of these patient safety solutions.
Structure of a low-population intermediate state in the release of an enzyme product.
De Simone, Alfonso; Aprile, Francesco A; Dhulesia, Anne; Dobson, Christopher M; Vendruscolo, Michele
2015-01-09
Enzymes can increase the rate of biomolecular reactions by several orders of magnitude. Although the steps of substrate capture and product release are essential in the enzymatic process, complete atomic-level descriptions of these steps are difficult to obtain because of the transient nature of the intermediate conformations, which makes them largely inaccessible to standard structure determination methods. We describe here the determination of the structure of a low-population intermediate in the product release process by human lysozyme through a combination of NMR spectroscopy and molecular dynamics simulations. We validate this structure by rationally designing two mutations, the first engineered to destabilise the intermediate and the second to stabilise it, thus slowing down or speeding up, respectively, product release. These results illustrate how product release by an enzyme can be facilitated by the presence of a metastable intermediate with transient weak interactions between the enzyme and product.
Amerson, Roxanne; Livingston, Wade G
2014-04-01
This qualitative descriptive study used reflexive photography to evaluate the learning process of cultural competence during an international service-learning project in Guatemala. Reflexive photography is an innovative qualitative research technique that examines participants' interactions with their environment through their personal reflections on images that they captured during their experience. A purposive sample of 10 baccalaureate nursing students traveled to Guatemala, where they conducted family and community assessments, engaged in home visits, and provided health education. Data collection involved over 100 photographs and a personal interview with each student. The themes developed from the photographs and interviews provided insight into the activities of an international experience that influence the cognitive, practical, and affective learning of cultural competence. Making home visits and teaching others from a different culture increased students' transcultural self-efficacy. Reflexive photography is a more robust method of self-reflection, especially for visual learners.
NASA Astrophysics Data System (ADS)
Bonilla, L. L.; Carretero, M.; Terragni, F.; Birnir, B.
2016-08-01
Angiogenesis is a multiscale process by which blood vessels grow from existing ones and carry oxygen to distant organs. Angiogenesis is essential for normal organ growth and wounded tissue repair, but it may also be induced by tumours to amplify their own growth. Mathematical and computational models contribute to understanding angiogenesis and developing anti-angiogenic drugs, but most work only involves numerical simulations and analysis has lagged. A recent stochastic model of tumour-induced angiogenesis including blood vessel branching, elongation, and anastomosis captures some of its intrinsic multiscale structures, yet allows one to extract a deterministic integro-partial differential description of the vessel tip density. Here we find that the latter advances chemotactically towards the tumour driven by a soliton (similar to the famous Korteweg-de Vries soliton) whose shape and velocity change slowly. Analysing these collective coordinates paves the way for controlling angiogenesis through the soliton, the engine that drives this process.
Stationary stability for evolutionary dynamics in finite populations
Harper, Marc; Fryer, Dashiell
2016-08-25
Here, we demonstrate a vast expansion of the theory of evolutionary stability to finite populations with mutation, connecting the theory of the stationary distribution of the Moran process with the Lyapunov theory of evolutionary stability. We define the notion of stationary stability for the Moran process with mutation and generalizations, as well as a generalized notion of evolutionary stability that includes mutation called an incentive stable state (ISS) candidate. For sufficiently large populations, extrema of the stationary distribution are ISS candidates and we give a family of Lyapunov quantities that are locally minimized at the stationary extrema and at ISS candidates. In various examples, including for the Moran and Wright–Fisher processes, we show that the local maxima of the stationary distribution capture the traditionally-defined evolutionarily stable states. The classical stability theory of the replicator dynamic is recovered in the large population limit. Finally, we include descriptions of possible extensions to populations of variable size and populations evolving on graphs.
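For a two-type Moran process with mutation, the stationary distribution is available in closed form because the chain is a birth-death process satisfying detailed balance. A sketch with illustrative parameters (relative fitness r for type A, symmetric mutation rate mu; not values from the paper):

```python
import numpy as np

# Stationary distribution of a two-type Moran process with mutation.
# Birth-death structure gives detailed balance:
#   pi[k+1] = pi[k] * p_up(k) / p_down(k+1).
N, r, mu = 100, 1.05, 0.01     # population size, fitness of A, mutation rate

def up_down(k):
    fA, fB = k * r, (N - k) * 1.0          # total fitness of each type
    pA = (1 - mu) * fA / (fA + fB) + mu * fB / (fA + fB)  # offspring is A
    up = pA * (N - k) / N                  # an A offspring replaces a B
    down = (1 - pA) * k / N                # a B offspring replaces an A
    return up, down

pi = np.zeros(N + 1)
pi[0] = 1.0
for k in range(N):
    up, _ = up_down(k)
    _, down = up_down(k + 1)
    pi[k + 1] = pi[k] * up / down
pi /= pi.sum()

maxima = [k for k in range(1, N) if pi[k] > pi[k - 1] and pi[k] > pi[k + 1]]
print("interior local maxima of the stationary distribution:", maxima)
```

The printed local maxima are the stationary extrema that the theory above connects to ISS candidates.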
[INVITED] Computational intelligence for smart laser materials processing
NASA Astrophysics Data System (ADS)
Casalino, Giuseppe
2018-03-01
Computational intelligence (CI) involves using a computer algorithm to capture hidden knowledge from data and to use it for training an "intelligent machine" to make complex decisions without human intervention. As simulation is becoming more prevalent from design and planning to manufacturing and operations, laser material processing can also benefit from computer-generated knowledge through soft computing. This work is a review of the state-of-the-art on the methodology and applications of CI in laser materials processing (LMP), which is nowadays receiving increasing interest from world-class manufacturers and Industry 4.0. The focus is on the methods that have been proven effective and robust in solving several problems in welding, cutting, drilling, surface treating and additive manufacturing using the laser beam. After a basic description of the most common computational intelligence techniques employed in manufacturing, four sections, namely laser joining, machining, surface treatment, and additive manufacturing, cover the most recent applications in the already extensive literature regarding CI in LMP. Finally, emerging trends and future challenges are identified and discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Loehle, C.
1994-05-01
The three great myths, which form a sort of triumvirate of misunderstanding, are the Eureka! myth, the hypothesis myth, and the measurement myth. These myths are prevalent among scientists as well as among observers of science. The Eureka! myth asserts that discovery occurs as a flash of insight, and as such is not subject to investigation. This leads to the perception that discovery or deriving a hypothesis is a moment or event rather than a process. Events are singular and not subject to description. The hypothesis myth asserts that proper science is motivated by testing hypotheses, and that if something is not experimentally testable then it is not scientific. This myth leads to absurd posturing by some workers conducting empirical descriptive studies, who dress up their study with a "hypothesis" to obtain funding or get it published. Methods papers are often rejected because they do not address a specific scientific problem. The fact is that many of the great breakthroughs in science involve methods and not hypotheses, or arise from largely descriptive studies. Those captured by this myth also try to block funding for those developing methods. The third myth is the measurement myth, which holds that determining what to measure is straightforward, so one doesn't need a lot of introspection to do science. As one ecologist put it to me: "Don't give me any of that philosophy junk, just let me out in the field. I know what to measure." These myths lead to difficulties for scientists who must face peer review to obtain funding and to get published. These myths also inhibit the study of science as a process. Finally, these myths inhibit creativity and suppress innovation. In this paper I first explore these myths in more detail and then propose a new model of discovery that opens the supposedly miraculous process of discovery to closer scrutiny.
Linking vegetation structure, function and physiology through spectroscopic remote sensing
NASA Astrophysics Data System (ADS)
Serbin, S.; Singh, A.; Couture, J. J.; Shiklomanov, A. N.; Rogers, A.; Desai, A. R.; Kruger, E. L.; Townsend, P. A.
2015-12-01
Terrestrial ecosystem process models require detailed information on ecosystem states and canopy properties to properly simulate the fluxes of carbon (C), water and energy from the land to the atmosphere and assess the vulnerability of ecosystems to perturbations. Current models fail to adequately capture the magnitude, spatial variation, and seasonality of terrestrial C uptake and storage, leading to significant uncertainties in the size and fate of the terrestrial C sink. By and large, these parameter and process uncertainties arise from inadequate spatial and temporal representation of plant traits, vegetation structure, and functioning. With increases in computational power and changes to model architecture and approaches, it is now possible for models to leverage detailed, data-rich and spatially explicit descriptions of ecosystems to inform parameter distributions and trait tradeoffs. In this regard, spectroscopy and imaging spectroscopy data have been shown to be invaluable observational datasets to capture broad-scale spatial and, eventually, temporal dynamics in important vegetation properties. We illustrate the linkage of plant traits and spectral observations to supply key data constraints for model parameterization. These constraints can come either in the form of the raw spectroscopic data (reflectance, absorptance) or physiological traits derived from spectroscopy. In this presentation we highlight our ongoing work to build ecological scaling relationships between critical vegetation characteristics and optical properties across diverse and complex canopies, including temperate broadleaf and conifer forests, Mediterranean vegetation, Arctic systems, and agriculture. We focus on work at the leaf, stand, and landscape scales, illustrating the importance of capturing the underlying variability in a range of parameters (including vertical variation within canopies) to enable more efficient scaling of traits related to functional diversity of ecosystems.
Direct capture of CO 2 from ambient air
Sanz-Perez, Eloy S.; Murdock, Christopher R.; Didas, Stephanie A.; ...
2016-08-25
The increase in the global atmospheric CO 2 concentration resulting from over a century of combustion of fossil fuels has been associated with significant global climate change. With the global population increase driving continued increases in fossil fuel use, humanity’s primary reliance on fossil energy for the next several decades is assured. Traditional modes of carbon capture such as precombustion and postcombustion CO 2 capture from large point sources can help slow the rate of increase of the atmospheric CO 2 concentration, but only the direct removal of CO 2 from the air, or “direct air capture” (DAC), can actually reduce the global atmospheric CO 2 concentration. The past decade has seen a steep rise in the use of chemical sorbents that are cycled through sorption and desorption cycles for CO 2 removal from ultradilute gases such as air. This Review provides a historical overview of the field of DAC, along with an exhaustive description of the use of chemical sorbents targeted at this application. Solvents and solid sorbents that interact strongly with CO 2 are described, including basic solvents, supported amine and ammonium materials, and metal-organic frameworks (MOFs), as the primary classes of chemical sorbents. Hypothetical processes for the deployment of such sorbents are discussed, as well as the limited array of technoeconomic analyses published on DAC. Overall, it is concluded that there are many new materials that could play a role in emerging DAC technologies. Furthermore, these materials need to be further investigated and developed with a practical sorbent-air contacting process in mind if society is to make rapid progress in deploying DAC as a means of mitigating climate change.
Collective Poisson process with periodic rates: applications in physics from micro-to nanodevices.
da Silva, Roberto; Lamb, Luis C; Wirth, Gilson Inacio
2011-01-28
Continuous reductions in the dimensions of semiconductor devices have led to an increasing number of noise sources, including random telegraph signals (RTS) due to the capture and emission of electrons by traps at random positions between oxide and semiconductor. The models traditionally used for microscopic devices become of limited validity in nano- and mesoscale systems since, in such systems, distributed quantities such as electron and trap densities, and concepts like electron mobility, become inadequate to model electrical behaviour. In addition, current experimental works have shown that RTS in semiconductor devices based on carbon nanotubes lead to giant current fluctuations. Therefore, the physics of this phenomenon and techniques to decrease the amplitudes of RTS need to be better understood. This problem can be described as a collective Poisson process under different, but time-independent, rates τ_c and τ_e that control the capture and emission of electrons by traps distributed over the oxide. Thus, models that consider calculations performed under time-dependent periodic capture and emission rates should be of interest in order to model more efficient devices. We show a complete theoretical description of a model that is capable of showing a noise reduction of current fluctuations in the time domain, and a reduction of the power spectral density in the frequency domain, in semiconductor devices as predicted by previous experimental work. We do so through numerical integrations and a novel Monte Carlo Markov chain (MCMC) algorithm based on microscopic discrete values. The proposed model also handles the ballistic regime, relevant in nano- and mesoscale devices. Finally, we show that the ballistic regime leads to nonlinearity in the electrical behaviour.
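A sketch of the discrete picture underlying such models: each trap is an independent two-state Markov chain with time-independent capture and emission rates 1/τ_c and 1/τ_e, and the observed RTS is the amplitude-weighted sum over traps. Trap count, rates, and per-trap amplitudes below are hypothetical:

```python
import numpy as np

# Sum of independent two-state traps: a trap captures an electron with
# probability dt/tau_c per step when empty and emits with probability
# dt/tau_e when occupied. The summed signal and its PSD illustrate the
# collective-Poisson RTS picture (all numbers are placeholders).
rng = np.random.default_rng(2)
n_traps, dt, n_steps = 50, 1e-6, 2**16
tau_c, tau_e = 1e-4, 3e-4
amp = rng.lognormal(0.0, 0.5, n_traps)          # per-trap current step

p_occ = tau_e / (tau_c + tau_e)                 # equilibrium occupancy
occ = rng.random(n_traps) < p_occ
p_cap, p_emi = dt / tau_c, dt / tau_e           # per-step probabilities

signal = np.empty(n_steps)
for i in range(n_steps):
    u = rng.random(n_traps)
    occ = np.where(occ, u >= p_emi, u < p_cap)  # emit if occupied, else capture
    signal[i] = amp @ occ

f = np.fft.rfftfreq(n_steps, dt)[1:]
psd = (np.abs(np.fft.rfft(signal - signal.mean())) ** 2)[1:] * 2 * dt / n_steps
i1k = np.searchsorted(f, 1e3)
print(f"PSD near {f[i1k]:.0f} Hz: {psd[i1k]:.3e} (Lorentzian roll-off expected)")
```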
A new method for digital video documentation in surgical procedures and minimally invasive surgery.
Wurnig, P N; Hollaus, P H; Wurnig, C H; Wolf, R K; Ohtsuka, T; Pridun, N S
2003-02-01
Documentation of surgical procedures is limited to the accuracy of description, which depends on the vocabulary and the descriptive prowess of the surgeon. Even analog video recording could not solve the problem of documentation satisfactorily due to the abundance of recorded material. By capturing the video digitally, most problems are solved in the circumstances described in this article. We developed a cheap and useful digital video capturing system that consists of conventional computer components. Video images and clips can be captured intraoperatively and are immediately available. The system is a commercial personal computer specially configured for digital video capturing and is connected by wire to the video tower. Filming was done with a conventional endoscopic video camera. A total of 65 open and endoscopic procedures were documented in an orthopedic and a thoracic surgery unit. The median number of clips per surgical procedure was 6 (range, 1-17), and the median storage volume was 49 MB (range, 3-360 MB) in compressed form. The median duration of a video clip was 4 min 25 s (range, 45 s to 21 min). Median time for editing a video clip was 12 min for an advanced user (including cutting, title for the movie, and compression). The quality of the clips renders them suitable for presentations. This digital video documentation system allows easy capturing of intraoperative video sequences in high quality. All possibilities of documentation can be performed. With the use of an endoscopic video camera, no compromises with respect to sterility and surgical elbowroom are necessary. The cost is much lower than commercially available systems, and setting changes can be performed easily without trained specialists.
Effects of the number of people on efficient capture and sample collection: a lion case study.
Ferreira, Sam M; Maruping, Nkabeng T; Schoultz, Darius; Smit, Travis R
2013-05-24
Certain carnivore research projects and approaches depend on successful capture of individuals of interest. The number of people present at a capture site may determine success of a capture. In this study 36 lion capture cases in the Kruger National Park were used to evaluate whether the number of people present at a capture site influenced lion response rates and whether the number of people at a sampling site influenced the time it took to process the collected samples. The analyses suggest that when nine or fewer people were present, lions appeared faster at a call-up locality compared with when there were more than nine people. The number of people, however, did not influence the time it took to process the lions. It is proposed that efficient lion capturing should spatially separate capture and processing sites and minimise the number of people at a capture site.
Capturing Functional Independence Measure (FIM®) Ratings.
Torres, Audrey
The aim of the study was to identify interventions to capture admission functional independence measure (FIM®) ratings on the day of admission to an inpatient rehabilitation facility. A quantitative evidence-based practice quality improvement study utilizing descriptive statistics. Admission FIM® ratings from patients discharged in June 2012 (retrospective review) were compared to admission FIM® ratings from patients discharged in June 2014 (prospective review). The logic model was utilized to determine the project inputs, outputs, and outcomes. Interventions to capture admission FIM® ratings on the day of admission are essential to accurately predict the patient's burden of care, length of stay, and reimbursement. Waiting until Day 2 or Day 3 after admission to capture the admission FIM® assessment resulted in inflated admission FIM® ratings and suboptimal quality outcomes. Interventions to capture admission FIM® ratings on the day of admission were successful at improving the quality of care, length of stay efficiency, and accurately recording admission FIM® ratings to determine the patient's burden of care.
NASA Astrophysics Data System (ADS)
Lüdde, Hans Jürgen; Horbatsch, Marko; Kirchner, Tom
2018-05-01
We apply a recently introduced model for an independent-atom-like calculation of ion-impact electron transfer and ionization cross sections to proton collisions with water, neon, and carbon clusters. The model is based on a geometrical interpretation of the cluster cross section as an effective area composed of overlapping circular disks that are representative of the atomic contributions. The latter are calculated using a time-dependent density-functional-theory-based single-particle description with accurate exchange-only ground-state potentials. We find that the net capture and ionization cross sections in p-X_n collisions are proportional to n^α with 2/3 ≤ α ≤ 1. For capture from water clusters at 100 keV impact energy α is close to one, which is substantially different from the value α = 2/3 predicted by a previous theoretical work based on the simplest-level electron nuclear dynamics method. For ionization at 100 keV and for capture at lower energies we find smaller α values than for capture at 100 keV. This can be understood by considering the magnitude of the atomic cross sections and the resulting overlaps of the circular disks that make up the cluster cross section in our model. Results for neon and carbon clusters confirm these trends. Simple parametrizations are found which fit the cross sections remarkably well and suggest that they depend on the relevant bond lengths.
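The geometrical model lends itself to a direct Monte Carlo check: the net cluster cross section is the area of the union of overlapping disks, which approaches n·πr² for well-separated atoms (α = 1) and grows more slowly once the disks overlap strongly. Atom positions and the disk radius below are placeholders, not values derived from atomic cross sections as in the paper:

```python
import numpy as np

# Monte Carlo estimate of the union area of circular disks: sample
# points in a bounding box and count hits inside at least one disk.
rng = np.random.default_rng(3)

def union_area(centers, radius, n_shots=200_000):
    lo = centers.min(axis=0) - radius
    hi = centers.max(axis=0) + radius
    pts = rng.uniform(lo, hi, size=(n_shots, 2))
    d2 = ((pts[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    hit = (d2 <= radius**2).any(axis=1)        # inside any disk
    return hit.mean() * np.prod(hi - lo)

r = 1.0
for n in (2, 4, 8, 16):
    centers = rng.normal(scale=1.2 * n**0.5, size=(n, 2))  # loose cluster
    a = union_area(centers, r)
    print(f"n = {n:2d}: union area = {a:6.2f}, n*pi*r^2 = {n * np.pi:6.2f}")
```

The ratio of the two printed columns drops as overlaps build up, which is the mechanism behind exponents α below one.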
Choices of capture chromatography technology in antibody manufacturing processes.
DiLeo, Michael; Ley, Arthur; Nixon, Andrew E; Chen, Jie
2017-11-15
The capture process employed in monoclonal antibody downstream purification is not only the process most critically impacted by increased antibody titers resulting from optimized mammalian cell culture expression systems, but also the most important purification step in determining overall process throughput, product quality, and economics. Advances in separation technology for capturing antibodies from complex feedstocks have been one focus of downstream purification process innovation for the past 10 years. In this study, we evaluated new-generation chromatography resins used in the antibody capture process, including Protein A, cation exchange, and mixed-mode chromatography, to address the benefits and unique challenges posed by each chromatography approach. Our results demonstrate the benefit of the improved binding capacity of new-generation Protein A resins, address the concern of aggregation caused by high-concentration surges when using new-generation cation exchange resins with over 100 mg/mL binding capacity, and highlight the potential of multimodal cation exchange resins for capture process design. The new landscape of capture chromatography technologies provides options to achieve an overall downstream purification outcome with high product quality and process efficiency.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Knio, Omar
2017-05-05
The current project develops a novel approach that uses a probabilistic description to capture the current state of knowledge about the computational solution. To effectively spread the computational effort over multiple nodes, the global computational domain is split into many subdomains. Computational uncertainty in the solution translates into uncertain boundary conditions for the equation system to be solved on those subdomains, and many independent, concurrent subdomain simulations are used to account for this boundary condition uncertainty. By relying on the fact that solutions on neighboring subdomains must agree with each other, a more accurate estimate for the global solution can be achieved. Statistical approaches in this update process make it possible to account for the effect of system faults in the probabilistic description of the computational solution, and the associated uncertainty is reduced through successive iterations. By combining all of these elements, the probabilistic reformulation allows splitting the computational work over very many independent tasks for good scalability, while being robust to system faults.
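A toy rendition of the idea on u'' = -1 with u(0) = u(1) = 0, split into two overlapping subdomains: interface values are kept as (mean, std) estimates, each iteration draws boundary samples, solves the subdomains independently (the embarrassingly parallel part), and re-estimates each interface value from the neighboring subdomain's solutions, shrinking the uncertainty. Everything here is an invented illustration of the concept, not the project's actual algorithm:

```python
import numpy as np

# Probabilistic two-subdomain sketch for u'' = -1, u(0) = u(1) = 0
# (exact solution u = x(1 - x)/2, so u(0.4) = u(0.6) = 0.12).
rng = np.random.default_rng(4)

def sub_solution(a, b, ua, ub, xq):
    # Exact solve of u'' = -1 on [a, b] with Dirichlet data:
    # u = -x^2/2 + c1*x + c0, fitted to the boundary values.
    A = np.array([[a, 1.0], [b, 1.0]])
    c1, c0 = np.linalg.solve(A, [ua + a**2 / 2, ub + b**2 / 2])
    return -xq**2 / 2 + c1 * xq + c0

K = 64                     # independent samples (could run on many nodes)
m06, s06 = 0.0, 0.2        # belief about u(0.6), right edge of subdomain 1
m04, s04 = 0.0, 0.2        # belief about u(0.4), left edge of subdomain 2
for it in range(8):
    smp04 = [sub_solution(0.0, 0.6, 0.0, m06 + s06 * rng.normal(), 0.4)
             for _ in range(K)]
    smp06 = [sub_solution(0.4, 1.0, m04 + s04 * rng.normal(), 0.0, 0.6)
             for _ in range(K)]
    m04, s04 = np.mean(smp04), np.std(smp04)   # overlap agreement update
    m06, s06 = np.mean(smp06), np.std(smp06)

print(f"u(0.4) ~ {m04:.4f} +/- {s04:.4f} (exact {0.4 * 0.6 / 2:.4f})")
```

Because each boundary sample spawns an independent subdomain solve, a lost or faulty solve only thins the sample set rather than corrupting the estimate, which is the fault-robustness argument in miniature.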
Nonlocal transport in the presence of transport barriers
NASA Astrophysics Data System (ADS)
Del-Castillo-Negrete, D.
2013-10-01
There is experimental, numerical, and theoretical evidence that transport in plasmas can, under certain circumstances, depart from the standard local, diffusive description. Examples include fast pulse propagation phenomena in perturbative experiments, non-diffusive scaling in L-mode plasmas, and non-Gaussian statistics of fluctuations. From the theoretical perspective, non-diffusive transport descriptions follow from the relaxation of the restrictive assumptions (locality, scale separation, and Gaussian/Markovian statistics) at the foundation of diffusive models. We discuss an alternative class of models able to capture some of the observed non-diffusive transport phenomenology. The models are based on a class of nonlocal, integro-differential operators that provide a unifying framework to describe non-Fickian scale-free transport and non-Markovian (memory) effects. We study the interplay between nonlocality and internal transport barriers (ITBs) in perturbative transport, including cold edge pulses and power modulation. Of particular interest is the nonlocal "tunnelling" of perturbations through ITBs. Also, flux-gradient diagrams are discussed as diagnostics to detect nonlocal transport processes in numerical simulations and experiments. Work supported by the US Department of Energy.
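One way to see what a nonlocal flux does, sketched with an illustrative exponential kernel (an assumption for this example, not the operator class used in the actual studies): the flux at x gathers gradient information from a neighborhood of size λ, and λ → 0 recovers the local Fickian limit:

```python
import numpy as np

# Nonlocal flux q(x) = -D * integral K(x - x') dn/dx'(x') dx'
# with a normalized exponential kernel of range lam, compared to the
# local Fickian flux q = -D dn/dx. Profile and parameters are illustrative.
N, Lbox = 512, 10.0
x = np.linspace(-Lbox / 2, Lbox / 2, N)
dx = x[1] - x[0]
n = np.exp(-x**2)                            # density profile
grad = np.gradient(n, dx)

lam, D = 0.5, 1.0
K = np.exp(-np.abs(x) / lam) / (2 * lam)     # kernel, integrates to ~1
q_nonlocal = -D * np.convolve(K, grad, mode="same") * dx
q_local = -D * grad
print(f"max |q_local|    = {np.abs(q_local).max():.3f}")
print(f"max |q_nonlocal| = {np.abs(q_nonlocal).max():.3f} (smeared, weaker)")
```

Plotting q against -dn/dx point by point then generally yields a spread of points or loops rather than the single straight line of purely local transport, which is the flux-gradient diagnostic mentioned above.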
Site-based data curation based on hot spring geobiology
Palmer, Carole L.; Thomer, Andrea K.; Baker, Karen S.; Wickett, Karen M.; Hendrix, Christie L.; Rodman, Ann; Sigler, Stacey; Fouke, Bruce W.
2017-01-01
Site-Based Data Curation (SBDC) is an approach to managing research data that prioritizes sharing and reuse of data collected at scientifically significant sites. The SBDC framework is based on geobiology research at natural hot spring sites in Yellowstone National Park as an exemplar case of high-value field data in contemporary, cross-disciplinary earth systems science. Through stakeholder analysis and investigation of data artifacts, we determined that meaningful and valid reuse of digital hot spring data requires systematic documentation of sampling processes and particular contextual information about the site of data collection. We propose a Minimum Information Framework for recording the necessary metadata on sampling locations, with anchor measurements and description of the hot spring vent distinct from the outflow system, and multi-scale field photography to capture vital information about hot spring structures. The SBDC framework can serve as a global model for the collection and description of hot spring systems field data that can be readily adapted for application to the curation of data from other kinds of scientifically significant sites. PMID:28253269
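As a hypothetical rendering only, the Minimum Information Framework categories paraphrased above can be pictured as a structured record; the field names below are invented for illustration and are not the published schema:

```python
from dataclasses import dataclass, field

# Hypothetical record type paraphrasing the framework's categories:
# an anchored sampling location, vent vs. outflow designation, basic
# physico-chemical context, and multi-scale photography.
@dataclass
class HotSpringSample:
    spring_name: str
    anchor_lat: float                  # anchor measurement (decimal degrees)
    anchor_lon: float
    zone: str                          # "vent" or "outflow"
    distance_from_vent_m: float
    temperature_c: float
    ph: float
    photos: list[str] = field(default_factory=list)  # multi-scale imagery

s = HotSpringSample("Spring X", 44.5, -110.8, "outflow", 3.2, 71.5, 8.1,
                    ["site_overview.jpg", "vent_closeup.jpg"])
print(s.zone, s.distance_from_vent_m)
```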
Simultaneous capture of metal, sulfur and chlorine by sorbents during fluidized bed incineration.
Ho, T C; Chuang, T C; Chelluri, S; Lee, Y; Hopper, J R
2001-01-01
Metal capture experiments were carried out in an atmospheric fluidized bed incinerator to investigate the effect of sulfur and chlorine on metal capture efficiency and the potential for simultaneous capture of metal, sulfur and chlorine by sorbents. In addition to experimental investigation, the effect of sulfur and chlorine on the metal capture process was also theoretically investigated through performing equilibrium calculations based on the minimization of system free energy. The observed results have indicated that, in general, the existence of sulfur and chlorine enhances the efficiency of metal capture especially at low to medium combustion temperatures. The capture mechanisms appear to include particulate scrubbing and chemisorption depending on the type of sorbents. Among the three sorbents tested, calcined limestone is capable of capturing all the three air pollutants simultaneously. The results also indicate that a mixture of the three sorbents, in general, captures more metals than a single sorbent during the process. In addition, the existence of sulfur and chlorine apparently enhances the metal capture process.
DOE Office of Scientific and Technical Information (OSTI.GOV)
BEVINS, R.R.
This document has been updated during the definitive design portion of the first phase of the W-314 Project to capture additional software requirements and is planned to be updated during the second phase of the W-314 Project to cover the second phase of the Project's scope. The objective is to provide requirement traceability by recording the analysis/basis for the functional descriptions of the master pump shutdown system. This document identifies the sources of the requirements and/or how these were derived. Each requirement is validated either by quoting the source or by an analysis process involving the required functionality, performance characteristics, operations input or engineering judgment.
Deformation in Metallic Glass: Connecting Atoms to Continua
NASA Astrophysics Data System (ADS)
Hinkle, Adam R.; Falk, Michael L.; Rycroft, Chris H.; Shields, Michael D.
Metallic glasses, like other amorphous solids, experience strain localization as the primary mode of failure. However, the development of continuum constitutive laws that provide a quantitative description of disorder and mechanical deformation remains an open challenge. Recent progress has shown the necessity of accurately capturing fluctuations in material structure, in particular the statistical changes in potential energy of the atomic constituents during the non-equilibrium process of applied shear. Here we directly cross-compare molecular dynamics shear simulations of a ZrCu glass with continuum shear transformation zone (STZ) theory representations. We present preliminary results for a methodology to coarse-grain detailed molecular dynamics data with the goal of initializing a continuum representation in the STZ theory. NSF Grants Awards 1107838, 1408685, and 0801471.
Noriega, Rodrigo; Salleo, Alberto; Spakowitz, Andrew J.
2013-01-01
Existing models for the electronic properties of conjugated polymers do not capture the spatial arrangement of the disordered macromolecular chains over which charge transport occurs. Here, we present an analytical and computational description in which the morphology of individual polymer chains is dictated by well-known statistical models and the electronic coupling between units is determined using Marcus theory. The multiscale transport of charges in these materials (high mobility at short length scales, low mobility at long length scales) is naturally described with our framework. Additionally, the dependence of mobility with electric field and temperature is explained in terms of conformational variability and spatial correlation. Our model offers a predictive approach to connecting processing conditions with transport behavior. PMID:24062459
Neutron Capture Rates and the r-Process Abundance Pattern in Shocked Neutrino-Driven Winds
NASA Astrophysics Data System (ADS)
Barringer, Daniel; Surman, Rebecca
2009-10-01
The r-process is an important process in nucleosynthesis in which nuclei undergo rapid neutron captures. Models of the r-process require nuclear data such as neutron capture rates for thousands of individual nuclei, many of which lie far from stability. Among the potential sites for the r-process, and the one that we investigate, is the shocked neutrino-driven wind in core-collapse supernovae. Here we examine the importance of the neutron capture rates of specific, individual nuclei in the second r-process abundance peak occurring at A ≈ 130 for a range of parameterized neutrino-driven wind trajectories. Of specific interest are the nuclei whose capture rates affect the abundances of nuclei outside of the A ≈ 130 peak. We found that increasing the neutron capture rate for a number of nuclei including ¹³⁵In, ¹³²Sn, ¹³³Sb, ¹³⁷Sb, and ¹³⁶Te can produce changes in the resulting abundance pattern of up to 13%.
Anantharaman, Rahul; Peters, Thijs; Xing, Wen; Fontaine, Marie-Laure; Bredesen, Rune
2016-10-20
Dual phase membranes are highly CO2-selective membranes with an operating temperature above 400 °C. The focus of this work is to quantify the potential of dual phase membranes in pre- and post-combustion CO2 capture processes. The process evaluations show that dual phase membranes integrated with an NGCC power plant for CO2 capture are not competitive with the MEA process for post-combustion capture. However, dual phase membrane concepts outperform the reference Selexol technology for pre-combustion CO2 capture in an IGCC process. The two processes evaluated in this work, post-combustion NGCC and pre-combustion IGCC, represent extremes in the CO2 partial pressure fed to the separation unit. Based on the evaluations, it is expected that dual phase membranes could be competitive for post-combustion capture from a pulverized coal-fired power plant (PCC) and pre-combustion capture from an Integrated Reforming Combined Cycle (IRCC).
Second Supplement to A Catalog of the Mosquitoes of the World (Diptera: Culicidae)
1984-01-01
104. Brunhes, J. 1977a. Les moustiques de l'archipel des Comores I. - Inventaire, répartition et description de quatre espèces ou sous-espèces...nouvelles. Cah. O.R.S.T.O.M. Ser. Entomol. Med. Parasitol. 15:131-152. Brunhes, J. 1977b. Les moustiques de l'archipel des Comores II. - Description de... Dieng. 1978. Aedes (Stegomyia) neoafricanus, une nouvelle espèce de moustique capturée au Sénégal Oriental (Diptera: Culicidae). Cah. O.R.S.T.O.M
Querol, Jorge; Tarongí, José Miguel; Forte, Giuseppe; Gómez, José Javier; Camps, Adriano
2017-05-10
MERITXELL is a ground-based multisensor instrument that includes a multiband dual-polarization radiometer, a GNSS reflectometer, and several optical sensors. Its main goals are twofold: to test data fusion techniques, and to develop Radio-Frequency Interference (RFI) detection, localization, and mitigation techniques. The former is necessary to retrieve complementary data useful for developing geophysical models with improved accuracy, whereas the latter aims at solving one of the most important problems of microwave radiometry. This paper describes the hardware design, the instrument control architecture, the calibration of the radiometer, and several captures of RFI signals taken with MERITXELL in an urban environment. The multiband radiometer has a dual linear polarization total-power radiometer topology, and it covers the L-, S-, C-, X-, K-, Ka-, and W-bands. Its back-end stage is based on a spectrum analyzer structure that allows real-time signal processing, while the rest of the sensors are controlled by a host computer where the off-line processing takes place. The calibration of the radiometer is performed using the hot-cold load procedure, together with the tipping-curves technique in the case of the five upper frequency bands. Finally, captures of RFI signals are shown for most of the radiometric bands under analysis, which evidence the problem of RFI in microwave radiometry and the limitations it imposes on external calibration.
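The hot-cold load procedure mentioned above is, at its core, a two-point linear calibration; a minimal sketch follows, with load temperatures and detector voltages chosen purely for illustration.

    def brightness_temperature(v, v_hot, v_cold, t_hot=313.0, t_cold=77.0):
        """Two-point (hot-cold) calibration of a total-power radiometer.

        v: measured detector voltage; v_hot, v_cold: voltages on the hot
        and cold loads; t_hot, t_cold: assumed load temperatures in kelvin.
        """
        gain = (t_hot - t_cold) / (v_hot - v_cold)
        return t_cold + gain * (v - v_cold)

    print(brightness_temperature(0.62, v_hot=1.0, v_cold=0.2))  # ~201 K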
Technical and economical evaluation of carbon dioxide capture and conversion to methanol process
NASA Astrophysics Data System (ADS)
Putra, Aditya Anugerah; Juwari, Handogo, Renanto
2017-05-01
The phenomenon of global warming, indicated by the rise in the Earth's surface temperature, is caused by high levels of greenhouse gases in the atmosphere. Carbon dioxide, which increases year by year because of the high demand for energy, is the largest contributor among the greenhouse gases. One of the most widely applied solutions for mitigating carbon dioxide levels is post-combustion carbon capture technology. Although the technology can absorb up to 90% of the carbon dioxide produced, there are concerns that captured carbon dioxide stored underground may be released over time. Utilizing the captured carbon dioxide could be a promising alternative: it can be converted into a more valuable material, such as methanol. This research evaluates the conversion process of captured carbon dioxide to methanol, both technically and economically. The study found that methanol can technically be made from captured carbon dioxide; the product stream is 25.6905 kg/s with 99.69% methanol purity. Economic evaluation of the whole conversion process shows that it is economically feasible: the capture and conversion process requires a total annual cost of 176,101,157.69 per year, which can be offset by revenue from methanol product sales.
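As a back-of-envelope check of the stated feasibility logic (revenue offsetting total annual cost), the sketch below multiplies the quoted product flow by an assumed price and operating schedule. The abstract gives the cost figure without a currency unit, so the price and hours here are loudly hypothetical.

    flow_kg_s = 25.6905          # methanol product flow from the abstract
    purity = 0.9969
    price_per_kg = 0.35          # assumed price, same currency unit as cost
    hours = 8000                 # assumed annual operating hours

    annual_revenue = flow_kg_s * purity * 3600 * hours * price_per_kg
    total_annual_cost = 176_101_157.69
    print(f"revenue {annual_revenue:,.0f} vs cost {total_annual_cost:,.0f}")
    print("feasible" if annual_revenue > total_annual_cost else "infeasible")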
Analysis of coherent dynamical processes through computer vision
NASA Astrophysics Data System (ADS)
Hack, M. J. Philipp
2016-11-01
Visualizations of turbulent boundary layers show an abundance of characteristic arc-shaped structures whose apparent similarity suggests a common origin in a coherent dynamical process. While the structures have been likened to the hairpin vortices observed in the late stages of transitional flow, a consistent description of the underlying mechanism has remained elusive. Detailed studies are complicated by the chaotic nature of turbulence, which modulates each manifestation of the process and renders the isolation of individual structures a challenging task. The present study applies methods from the field of computer vision to capture the time evolution of turbulent flow features and explore the associated physical mechanisms. The algorithm uses morphological operations to condense the structure of the turbulent flow field into a graph described by nodes and links. The low-dimensional geometric information is stored in a database and allows the identification and analysis of equivalent dynamical processes across multiple scales. The framework is not limited to turbulent boundary layers and can also be applied to different types of flows as well as problems from other fields of science.
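A minimal sketch of the graph-condensation step as described (morphological thinning, then nodes and links from pixel connectivity) might look like the following, with a toy binary field standing in for an extracted flow structure.

    import numpy as np
    import networkx as nx
    from skimage.morphology import skeletonize

    img = np.zeros((64, 64), dtype=bool)   # toy "structure" mask
    img[30:34, 8:56] = True
    img[8:56, 30:34] = True

    skel = skeletonize(img)

    # One node per skeleton pixel, links between 8-connected neighbours.
    G = nx.Graph()
    ys, xs = np.nonzero(skel)
    pixels = set(zip(ys.tolist(), xs.tolist()))
    for y, x in pixels:
        G.add_node((y, x))
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                if (dy or dx) and (y + dy, x + dx) in pixels:
                    G.add_edge((y, x), (y + dy, x + dx))

    print(G.number_of_nodes(), "nodes,", G.number_of_edges(), "links")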
A highly scalable information system as extendable framework solution for medical R&D projects.
Holzmüller-Laue, Silke; Göde, Bernd; Stoll, Regina; Thurow, Kerstin
2009-01-01
For research projects in preventive medicine, a flexible information management system is needed that offers free planning and documentation of project-specific examinations. The system should allow simple, preferably automated data acquisition from several distributed sources (e.g., mobile sensors, stationary diagnostic systems, questionnaires, manual inputs) as well as effective data management, data use, and analysis. An information system fulfilling these requirements has been developed at the Center for Life Science Automation (celisca). This system combines data from multiple investigations and multiple devices and displays them on a single screen. The integration of mobile sensor systems for comfortable, location-independent capture of time-based physiological parameters, and the possibility of observing these measurements directly in the system, enable new scenarios. The web-based information system presented in this paper is configurable through user interfaces. It covers medical process descriptions, operative process data visualizations, user-friendly processing of process data, modern online interfaces (databases, web services, XML), as well as comfortable support for extended data analysis with third-party applications.
A cluster expansion model for predicting activation barrier of atomic processes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rehman, Tafizur; Jaipal, M.; Chatterjee, Abhijit, E-mail: achatter@iitk.ac.in
2013-06-15
We introduce a procedure based on cluster expansion models for predicting the activation barrier of atomic processes encountered while studying the dynamics of a material system using the kinetic Monte Carlo (KMC) method. Starting with an interatomic potential description, a mathematical derivation is presented to show that the local environment dependence of the activation barrier can be captured using cluster interaction models. Next, we develop a systematic procedure for training the cluster interaction model on-the-fly, which involves: (i) obtaining activation barriers for a handful of local environments using nudged elastic band (NEB) calculations, (ii) identifying the local environment by analyzing the NEB results, and (iii) estimating the cluster interaction model parameters from the activation barrier data. Once a cluster expansion model has been trained, it is used to predict activation barriers without requiring any additional NEB calculations. Numerical studies are performed to validate the cluster expansion model by studying hop processes in Ag/Ag(100). We show that the use of the cluster expansion model with KMC enables efficient generation of an accurate process rate catalog.
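In its simplest linear form, the on-the-fly training step reduces to regressing NEB barriers on counts of occupied neighbour sites. The sketch below uses synthetic data and an assumed three-shell descriptor rather than the paper's actual cluster basis.

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.integers(0, 5, size=(40, 3)).astype(float)  # shell occupancies
    true_eci = np.array([0.08, 0.03, -0.01])            # "true" interactions
    barriers = 0.55 + X @ true_eci + rng.normal(0, 0.005, 40)  # mock NEB data

    # Fit the bare barrier plus effective cluster interactions by least squares.
    A = np.hstack([np.ones((40, 1)), X])
    coef, *_ = np.linalg.lstsq(A, barriers, rcond=None)

    # Predict the barrier for a new local environment, no NEB run needed.
    env = np.array([1.0, 2.0, 0.0, 3.0])   # leading 1 = constant term
    print("predicted barrier (eV):", float(env @ coef))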
Deep hierarchical attention network for video description
NASA Astrophysics Data System (ADS)
Li, Shuohao; Tang, Min; Zhang, Jun
2018-03-01
Pairing video with natural language description remains a challenge in computer vision and machine translation. Inspired by image description, which uses an encoder-decoder model to reduce a visual scene to a single sentence, we propose a deep hierarchical attention network for video description. The proposed model uses a convolutional neural network (CNN) and a bidirectional LSTM network as encoders, while a hierarchical attention network is used as the decoder. Compared to the encoder-decoder models used in video description, the bidirectional LSTM network can capture the temporal structure among video frames. Moreover, the hierarchical attention network has an advantage over a single-layer attention network in global context modeling. To make a fair comparison with other methods, we evaluate the proposed architecture with different types of CNN structures and decoders. Experimental results on standard datasets show that our model outperforms state-of-the-art techniques.
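The core of any such decoder is an attention step that scores encoded frames against the current decoder state; a bare numpy sketch of that single step (random arrays standing in for trained weights and real encodings) is shown below.

    import numpy as np

    def softmax(x):
        e = np.exp(x - x.max())
        return e / e.sum()

    rng = np.random.default_rng(1)
    frames = rng.normal(size=(16, 32))   # 16 encoded frames, 32-d each
    state = rng.normal(size=32)          # decoder hidden state
    W = 0.1 * rng.normal(size=(32, 32))  # would be learned in a real model

    scores = frames @ (W @ state)        # one relevance score per frame
    weights = softmax(scores)            # attention distribution
    context = weights @ frames           # attended summary of the video
    print(weights.round(3), context.shape)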
Gully measurement strategies in a pixel using python
NASA Astrophysics Data System (ADS)
Wells, Robert; Momm, Henrique; Bennett, Sean; Dabney, Seth
2015-04-01
Gullies are often the single largest sediment sources within the landscape; however, measurement and process description of these channels present challenges that have limited complete understanding. A strategy currently being employed in the field and laboratory to measure gully topography utilizes inexpensive, off-the-shelf cameras and software. Photogrammetry may be entering an enlightened period, as users have numerous choices (cameras, lenses, and software) and many are utilizing the technology to define their surroundings; however, the key for those seeking answers will be what happens once topography is represented as a three-dimensional digital surface model. Perhaps the model can be compared with another model to visualize change, either in topography or in vegetation cover, or both (a sketch of such a comparison follows below). With these models of our landscape, prediction technology should be rejuvenated and/or reinvented. Over the past several decades, researchers have endeavored to capture the erosion process and transfer these observations through the oral and written word. Several have hypothesized a fundamental system for gully expression in the landscape; however, this understanding has not transferred well into our prediction technology. Unlike many materials, soils often do not behave in a predictable fashion. Which soil physical properties lend themselves to erosion process description? In most cases, several disciplines are required to visualize the erosion process and its impact on our landscape. With a small camera, the landscape becomes more accessible, and this accessibility will lead to a deeper understanding and the development of uncompromised erosion theory. Why? Conservation of our soil resources is inherently linked to a complete understanding of soil wasting.
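As a concrete example of the model-to-model comparison suggested above, the following sketch differences two synthetic digital surface models and totals the eroded volume; the grids, resolution, and incision depth are invented for illustration.

    import numpy as np

    rng = np.random.default_rng(2)
    dem_before = rng.normal(100.0, 0.02, size=(50, 50))  # 1 m cells, metres
    dem_after = dem_before.copy()
    dem_after[20:30, 10:40] -= 0.15      # simulated channel incision

    cell_area = 1.0                       # m^2 per cell
    dod = dem_after - dem_before          # DEM of difference
    eroded = -dod[dod < 0].sum() * cell_area
    print(f"eroded volume: {eroded:.2f} m^3")   # 45.00 for these inputs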
González, Diego Luis; Pimpinelli, Alberto; Einstein, T L
2017-07-01
We study the effect of hindered aggregation on the island formation process in a one- (1D) and two-dimensional (2D) point-island model for epitaxial growth with arbitrary critical nucleus size i. In our model, the attachment of monomers to preexisting islands is hindered by an additional attachment barrier, characterized by length l_{a}. For l_{a}=0 the islands behave as perfect sinks, while for l_{a}→∞ they behave as reflecting boundaries. For intermediate values of l_{a}, the system exhibits a crossover between two different kinds of processes, diffusion-limited aggregation and attachment-limited aggregation. We calculate the growth exponents of the density of islands and monomers for the low coverage and aggregation regimes. The capture-zone (CZ) distributions are also calculated for different values of i and l_{a}. In order to obtain a good spatial description of the nucleation process, we propose a fragmentation model, which is based on an approximate description of nucleation inside the gaps for 1D and the CZs for 2D. In both cases, nucleation is described using two different physically rooted probabilities, which are related to the microscopic parameters of the model (i and l_{a}). We test our analytical model with extensive numerical simulations and previously established results. The proposed model provides an excellent description of the statistical behavior of the system for arbitrary values of l_{a} and i=1, 2, and 3.
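In the perfect-sink limit (l_a = 0) of the 1D model, each island's capture zone is half the gap to each neighbour; the sketch below builds an empirical CZ distribution for randomly placed islands, ignoring the nucleation correlations that the fragmentation model accounts for.

    import numpy as np

    rng = np.random.default_rng(3)
    L, n = 1000.0, 200
    x = np.sort(rng.uniform(0, L, n))           # island positions on a ring
    gaps = np.diff(np.append(x, x[0] + L))      # gap to the right neighbour
    cz = 0.5 * (gaps + np.roll(gaps, 1))        # capture-zone lengths
    s = cz / cz.mean()                          # normalised CZ sizes

    hist, edges = np.histogram(s, bins=30, density=True)
    print("CZ distribution peaks near s =", edges[hist.argmax()])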
NASA Astrophysics Data System (ADS)
Li, Helen; Lee, Robben; Lee, Tyzy; Xue, Teddy; Liu, Hermes; Wu, Hall; Wan, Qijian; Du, Chunshan; Hu, Xinyi; Liu, Zhengfang
2018-03-01
As technology advances, escalating layout design complexity and chip size make defect inspection more challenging than ever before. YE (Yield Enhancement) engineers are seeking an efficient strategy that ensures accuracy without sacrificing run time. A smart approach is to set different resolutions for different pattern structures; for example, logic pattern areas receive a higher scan resolution while dummy areas receive a lower one, and SRAM areas may use yet another resolution. This can significantly reduce scan processing time while maintaining accuracy. Due to the limitations of the inspection equipment, the layout must be processed so that the Care Area markers meet the equipment's requirements; for instance, the marker shapes must be rectangles, and the number of rectangles should be as small as possible. The challenge is how to select the different Care Areas by pattern structure, merge the areas efficiently, and then partition them into rectangles. This paper presents a solution based on Calibre DRC and Pattern Matching. Calibre equation-based DRC is a powerful layout processing engine, and Calibre Pattern Matching's automated visual capture capability enables designers to define these geometries as layout patterns and store them in libraries that can be re-used in multiple design layouts. Pattern Matching simplifies the description of very complex relationships between pattern shapes efficiently and accurately, and its true power is on display when it is integrated with a normal DRC deck. In this defect-inspection application, we first run Calibre DRC to obtain rule-based Care Areas, then use Calibre Pattern Matching's automated pattern capture capability to capture Care Area shapes that need a higher scan resolution, with a tunable pattern halo. In the pattern matching step, when the patterns are matched, a bounding box marker is output to identify the high-resolution area. The equation-based DRC and Pattern Matching work together effectively for different scan phases.
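The rectangle-partitioning requirement can be illustrated independently of the tools themselves: the sketch below greedily covers a boolean care-area mask with axis-aligned rectangles. It is a generic stand-in for the idea, not the paper's production flow or any Calibre API.

    import numpy as np

    def mask_to_rects(mask):
        """Greedily cover a care-area mask with rectangles (y0, x0, y1, x1)."""
        m = mask.copy()
        rects = []
        while m.any():
            y, x = np.argwhere(m)[0]        # top-left uncovered cell
            x1 = x
            while x1 + 1 < m.shape[1] and m[y, x1 + 1]:
                x1 += 1                     # grow the run rightwards
            y1 = y
            while y1 + 1 < m.shape[0] and m[y1 + 1, x:x1 + 1].all():
                y1 += 1                     # grow downwards while rows are full
            rects.append((y, x, y1, x1))
            m[y:y1 + 1, x:x1 + 1] = False
        return rects

    area = np.zeros((8, 8), dtype=bool)
    area[1:4, 1:7] = True                   # an L-shaped care area
    area[3:7, 4:7] = True
    print(mask_to_rects(area))              # two rectangles cover it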
NASA Astrophysics Data System (ADS)
Gleghorn, Jason P.; Smith, James P.; Kirby, Brian J.
2013-09-01
Microfluidic obstacle arrays have been used in numerous applications, and their ability to sort particles or capture rare cells from complex samples has broad and impactful applications in biology and medicine. We have investigated the transport and collision dynamics of particles in periodic obstacle arrays to guide the design of convective, rather than diffusive, transport-based immunocapture microdevices. Ballistic and full computational fluid dynamics simulations are used to understand the collision modes that evolve in cylindrical obstacle arrays with various geometries. We identify previously unrecognized collision mode structures and differential size-based collision frequencies that emerge from these arrays. Previous descriptions of transverse displacements that assume unidirectional flow in these obstacle arrays cannot capture mode transitions properly as these descriptions fail to capture the dependence of the mode transitions on column spacing and the attendant change in the flow field. Using these analytical and computational simulations, we elucidate design parameters that induce high collision rates for all particles larger than a threshold size or selectively increase collision frequencies for a narrow range of particle sizes within a polydisperse population. Furthermore, we investigate how the particle Péclet number affects collision dynamics and mode transitions and demonstrate that experimental observations from various obstacle array geometries are well described by our computational model.
A minimization principle for the description of modes associated with finite-time instabilities
Babaee, H.
2016-01-01
We introduce a minimization formulation for the determination of a finite-dimensional, time-dependent, orthonormal basis that captures directions of the phase space associated with transient instabilities. While these instabilities have finite lifetime, they can play a crucial role either by altering the system dynamics through the activation of other instabilities or by creating sudden nonlinear energy transfers that lead to extreme responses. However, their essentially transient character makes their description a particularly challenging task. We develop a minimization framework that focuses on the optimal approximation of the system dynamics in the neighbourhood of the system state. This minimization formulation results in differential equations that evolve a time-dependent basis so that it optimally approximates the most unstable directions. We demonstrate the capability of the method for two families of problems: (i) linear systems, including the advection–diffusion operator in a strongly non-normal regime as well as the Orr–Sommerfeld/Squire operator, and (ii) nonlinear problems, including a low-dimensional system with transient instabilities and the vertical jet in cross-flow. We demonstrate that the time-dependent subspace captures the strongly transient non-normal energy growth (in the short-time regime), while for longer times the modes capture the expected asymptotic behaviour. PMID:27118900
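For a linear system dy/dt = A y, the evolution equations referred to take the form dU/dt = A U - U (U^T A U); the sketch below integrates this for a random non-normal operator, with the matrix, subspace size, and step count as assumptions.

    import numpy as np

    rng = np.random.default_rng(4)
    n, r = 20, 3
    A = rng.normal(size=(n, n)) - 2.5 * np.eye(n)   # non-normal, decaying
    U, _ = np.linalg.qr(rng.normal(size=(n, r)))    # initial orthonormal basis

    dt = 1e-3
    for _ in range(5000):
        AU = A @ U
        U = U + dt * (AU - U @ (U.T @ AU))          # evolve the basis
        U, _ = np.linalg.qr(U)                      # guard orthonormality

    # Instantaneous growth rates captured by the subspace:
    S = (A + A.T) / 2
    print(np.linalg.eigvals(U.T @ S @ U).real.round(3))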
Stirling, Rob G; Evans, S M; McLaughlin, P; Senthuren, M; Millar, J; Gooi, J; Irving, L; Mitchell, P; Haydon, A; Ruben, J; Conron, M; Leong, T; Watkins, N; McNeil, J J
2014-10-01
Lung cancer remains a major disease burden in Victoria (Australia) and requires a complex and multidisciplinary approach to ensure optimal care and outcomes. To date, no uniform mechanism is available to capture standardized population-based outcomes and thereby provide benchmarking. The establishment of such a data platform is, therefore, a primary requisite to enable description of process and outcome in lung cancer care and to drive improvement in the quality of care provided to individuals with lung cancer. A disease quality registry pilot has been established to capture prospective data on all adult patients with clinical or tissue diagnoses of small cell and non-small cell lung cancer. Steering and management committees provide clinical governance and supervise quality indicator selection. Quality indicators were selected following extensive literature review and evaluation of established clinical practice guidelines. A minimum dataset has been established, and training and data capture by data collectors are facilitated using a web-based portal. Case ascertainment is established by regular institutional reporting of ICD-10 discharge coding. Recruitment is optimized by provision of opt-out consent. The collection of a standardized minimum data set optimizes the capacity for harmonized population-based data capture. Data collection has commenced in a variety of settings reflecting metropolitan and rural, public and private health care institutions. The data set provides scope for the construction of a risk-adjusted model for outcomes. A data access policy and an escalation mechanism for outcome outliers have been established. The Victorian Lung Cancer Registry provides a unique capacity to provide and confirm quality assessment in lung cancer and to drive improvement in quality of care across multidisciplinary stakeholders.
Monitored Geologic Repository Project Description Document
DOE Office of Scientific and Technical Information (OSTI.GOV)
P. M. Curry
2001-01-30
The primary objective of the Monitored Geologic Repository Project Description Document (PDD) is to allocate the functions, requirements, and assumptions to the systems at Level 5 of the Civilian Radioactive Waste Management System (CRWMS) architecture identified in Section 4. It provides traceability of the requirements to those contained in Section 3 of the ''Monitored Geologic Repository Requirements Document'' (MGR RD) (YMP 2000a) and other higher-level requirements documents. In addition, the PDD allocates design-related assumptions to work products of non-design organizations. The document provides Monitored Geologic Repository (MGR) technical requirements in support of design and performance assessment in preparing for the Site Recommendation (SR) and License Application (LA) milestones. The technical requirements documented in the PDD are to be captured in the System Description Documents (SDDs), which address each of the systems at Level 5 of the CRWMS architecture. The design engineers obtain the technical requirements from the SDDs and by reference from the SDDs to the PDD. The design organizations and other organizations will obtain design-related assumptions directly from the PDD. These organizations may establish additional assumptions for their individual activities, but such assumptions are not to conflict with the assumptions in the PDD. The PDD will serve as the primary link between the technical requirements captured in the SDDs and the design requirements captured in US Department of Energy (DOE) documents. The approved PDD is placed under Level 3 baseline control by the CRWMS Management and Operating Contractor (M and O), and the following portions of the PDD constitute the Technical Design Baseline for the MGR: the design characteristics listed in Table 1-1, the MGR Architecture (Section 4.1), the Technical Requirements (Section 5), and the Controlled Project Assumptions (Section 6).
Role of clay minerals in the formation of atmospheric aggregates of Saharan dust
NASA Astrophysics Data System (ADS)
Cuadros, Javier; Diaz-Hernandez, José L.; Sanchez-Navas, Antonio; Garcia-Casco, Antonio
2015-11-01
Saharan dust can travel long distances in different directions across the Atlantic and Europe, sometimes in episodes of high dust concentration. In recent years it has been discovered that Saharan dust aerosols can aggregate into large, approximately spherical particles of up to 100 μm generated within raindrops that then evaporate, so that aggregate deposition takes place most often in dry conditions. These aerosol aggregates are an interesting phenomenon resulting from the interaction of mineral aerosols and atmospheric conditions. They have been termed "iberulites" due to their discovery and description from aerosol deposits in the Iberian Peninsula. Here, these aggregates are further investigated, in particular the role of the clay minerals in the aggregation process of aerosol particles. Iberulites, and common aerosol particles for reference, were studied from the following periods or single dust events and locations: June 1998 in Tenerife, Canary Islands; June 2001 to August 2002, Granada, Spain; 13-20 August 2012, Granada; and 1-6 June 2014, Granada. Their mineralogy, chemistry and texture were analysed using X-ray diffraction, electron microprobe analysis, SEM and TEM. The mineral composition and structure of the iberulites consist of quartz, carbonate and feldspar grains surrounded by a matrix of clay minerals (illite, smectite and kaolinite) that also surrounds the entire aggregate. Minor phases, also distributed homogeneously within the iberulites, are sulfates and Fe oxides. Clays are apparently more abundant in the iberulites than in the total aerosol deposit, suggesting that iberulite formation concentrates clays. Details of the structure and composition of iberulites differ from descriptions of previous samples, which indicates dependence on dust sources and atmospheric conditions, possibly including anthropic activity. Iberulites are formed by coalescence of aerosol mineral particles captured by precursor water droplets. The concentration of clays in the iberulites is suggested to be the result of higher efficiency for clay capture than for the capture of larger mineral grains. The high hygroscopicity of clay minerals probably causes retention of water in the evaporation stage, and some secondary minerals (mainly gypsum) are associated with clays.
NASA Technical Reports Server (NTRS)
Scola, Salvatore; Stavely, Rebecca; Jackson, Trevor; Boyer, Charlie; Osmundsen, Jim; Turczynski, Craig; Stimson, Chad
2016-01-01
Performance-related effects of system level temperature changes can be a key consideration in the design of many types of optical instruments. This is especially true for space-based imagers, which may require complex thermal control systems to maintain alignment of the optical components. Structural-Thermal-Optical-Performance (STOP) analysis is a multi-disciplinary process that can be used to assess the performance of these optical systems when subjected to the expected design environment. This type of analysis can be very time consuming, which makes it difficult to use as a trade study tool early in the project life cycle. In many cases, only one or two iterations can be performed over the course of a project. This limits the design space to best practices since it may be too difficult, or take too long, to test new concepts analytically. In order to overcome this challenge, automation, and a standard procedure for performing these studies is essential. A methodology was developed within the framework of the Comet software tool that captures the basic inputs, outputs, and processes used in most STOP analyses. This resulted in a generic, reusable analysis template that can be used for design trades for a variety of optical systems. The template captures much of the upfront setup such as meshing, boundary conditions, data transfer, naming conventions, and post-processing, and therefore saves time for each subsequent project. A description of the methodology and the analysis template is presented, and results are described for a simple telescope optical system.
Taubner, Svenja; Wiswede, Daniel; Kessler, Henrik
2013-01-01
Objective: The heterogeneity between patients with depression cannot be captured adequately with existing descriptive systems of diagnosis and neurobiological models of depression. Furthermore, considering the highly individual nature of depression, the application of general stimuli in past research efforts may not capture the essence of the disorder. This study aims to identify subtypes of depression by using empirically derived personality syndromes, and to explore neural correlates of the derived personality syndromes. Materials and Methods: In the present exploratory study, an individually tailored and psychodynamically based functional magnetic resonance imaging paradigm using dysfunctional relationship patterns was presented to 20 chronically depressed patients. Results from the Shedler–Westen Assessment Procedure (SWAP-200) were analyzed by Q-factor analysis to identify clinically relevant subgroups of depression and related brain activation. Results: The principal component analysis of SWAP-200 items from all 20 patients led to a two-factor solution: "Depressive Personality" and "Emotional-Hostile-Externalizing Personality." Both factors were used in a whole-brain correlational analysis, but only the second factor yielded significant positive correlations, in four regions: a large cluster in the right orbitofrontal cortex (OFC), the left ventral striatum, a small cluster in the left temporal pole, and another small cluster in the right middle frontal gyrus. Discussion: The degree to which patients with depression score high on the factor "Emotional-Hostile-Externalizing Personality" correlated with relatively higher activity in three key areas involved in emotion processing, evaluation of reward/punishment, negative cognitions, depressive pathology, and social knowledge (OFC, ventral striatum, temporal pole). The results may contribute to an alternative description of the neural correlates of depression, showing differential brain activation dependent on the extent of specific personality syndromes in depression. PMID:24363644
Diversity of abundance patterns of neutron-capture elements in very metal-poor stars
NASA Astrophysics Data System (ADS)
Aoki, Misa; Aoki, Wako; Ishimaru, Yuhri; Wanajo, Shinya
2014-05-01
Observations of Very Metal-Poor stars indicate that there are at least two sites for the r-process: a "weak r-process" and a "main r-process". A question is whether these two are well separated or whether there exists a variation in the r-process. We present the results of an abundance analysis of neutron-capture elements in the two Very Metal-Poor stars HD107752 and HD110184 in the Milky Way halo, observed with the Subaru Telescope HDS. The abundance patterns show overabundance of light n-capture elements (e.g., Sr, Y), suggesting element yields of the weak r-process, while heavy neutron-capture elements (e.g., Ba, Eu) are deficient; however, the overabundance of the light elements is not as significant as that previously found in stars representing the weak r-process (e.g., HD122563; Honda et al. 2006). Our study shows diversity in the abundance patterns from light to heavy neutron-capture elements in VMP stars, suggesting a variation in the r-process, which may depend on the electron fraction of the environment.
Measuring the progress of capacity building in the Alberta Policy Coalition for Cancer Prevention.
Raine, Kim D; Sosa Hernandez, Cristabel; Nykiforuk, Candace I J; Reed, Shandy; Montemurro, Genevieve; Lytvyak, Ellina; MacLellan-Wright, Mary-Frances
2014-07-01
The Alberta Policy Coalition for Cancer Prevention (APCCP) represents practitioners, policy makers, researchers, and community organizations working together to coordinate efforts and advocate for policy change to reduce chronic diseases. The aim of this research was to capture changes in the APCCP's capacity to advance its goals over the course of its operation. We adapted the Public Health Agency of Canada's validated Community Capacity-Building Tool to capture policy work. All members of the APCCP were invited to complete the tool in 2010 and 2011. Responses were analyzed using descriptive statistics and t tests. Qualitative comments were analyzed using thematic content analysis. A group process for reaching consensus provided context to the survey responses and contributed to a participatory analysis. Significant improvement was observed in eight out of nine capacity domains. Lessons learned highlight the importance of balancing volume and diversity of intersectoral representation to ensure effective participation, as well as aligning professional and economic resources. Defining involvement and roles within a coalition can be a challenging activity contingent on the interests of each sector represented. The participatory analysis enabled the group to reflect on progress made and future directions for policy advocacy. © 2013 Society for Public Health Education.
A fast numerical scheme for causal relativistic hydrodynamics with dissipation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Takamoto, Makoto, E-mail: takamoto@tap.scphys.kyoto-u.ac.jp; Inutsuka, Shu-ichiro
2011-08-01
Highlights: (1) We have developed a new multi-dimensional numerical scheme for causal relativistic hydrodynamics with dissipation. (2) Our new scheme can calculate the evolution of dissipative relativistic hydrodynamics faster and more effectively than existing schemes. (3) Since we use the Riemann solver for solving the advection steps, our method can capture shocks very accurately. - Abstract: In this paper, we develop a stable and fast numerical scheme for relativistic dissipative hydrodynamics based on Israel-Stewart theory. Israel-Stewart theory is a stable and causal description of dissipation in relativistic hydrodynamics, although it includes a relaxation process with the timescale for collisions of constituent particles, which introduces stiff equations and makes practical numerical calculation difficult. In our new scheme, we use Strang's splitting method and piecewise exact solutions for solving the extremely short timescale problem. In addition, since we split the calculation into an inviscid step and a dissipative step, a Riemann solver can be used to obtain the numerical flux for the inviscid step. The use of a Riemann solver enables us to capture shocks very accurately. Simple numerical examples are shown. The present scheme can be applied to various high-energy phenomena in astrophysics and nuclear physics.
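The splitting strategy can be illustrated on a scalar toy equation u_t + a u_x = -(u - u_eq)/tau: a half step of the exact relaxation solution, a full upwind advection step, then another half relaxation step. The model equation and all parameters below are illustrative, not the Israel-Stewart system itself.

    import numpy as np

    N, a, tau, u_eq = 200, 1.0, 1e-4, 0.2
    dx = 1.0 / N
    dt = 0.5 * dx / a                 # CFL-limited, much larger than tau
    x = np.arange(N) * dx
    u = np.exp(-200 * (x - 0.3) ** 2)

    def relax(u, h):
        # piecewise exact solution of u' = -(u - u_eq)/tau over time h
        return u_eq + (u - u_eq) * np.exp(-h / tau)

    for _ in range(100):              # Strang: half stiff, advect, half stiff
        u = relax(u, 0.5 * dt)
        u = u - a * dt / dx * (u - np.roll(u, 1))   # first-order upwind
        u = relax(u, 0.5 * dt)

    print(u.min(), u.max())           # relaxed to u_eq, no stiff instability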
Using Musical Intervals to Demonstrate Superposition of Waves and Fourier Analysis
ERIC Educational Resources Information Center
LoPresto, Michael C.
2013-01-01
What follows is a description of a demonstration of superposition of waves and Fourier analysis using a set of four tuning forks mounted on resonance boxes and oscilloscope software to create, capture and analyze the waveforms and Fourier spectra of musical intervals.
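In software, the same demonstration reduces to superposing two sinusoids at a musical-interval ratio and inspecting the Fourier spectrum; a short sketch using a perfect fifth (440 Hz and 660 Hz, a 3:2 ratio) follows.

    import numpy as np

    fs = 8000                                  # sample rate, Hz
    t = np.arange(0, 1.0, 1 / fs)              # one second of signal
    wave = np.sin(2 * np.pi * 440 * t) + np.sin(2 * np.pi * 660 * t)

    spec = np.abs(np.fft.rfft(wave))
    freqs = np.fft.rfftfreq(len(wave), 1 / fs)
    peaks = freqs[np.argsort(spec)[-2:]]
    print(sorted(peaks))                       # -> [440.0, 660.0]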
Rational and Mechanistic Perspectives on Reinforcement Learning
ERIC Educational Resources Information Center
Chater, Nick
2009-01-01
This special issue describes important recent developments in applying reinforcement learning models to capture neural and cognitive function. But reinforcement learning, as a theoretical framework, can apply at two very different levels of description: "mechanistic" and "rational." Reinforcement learning is often viewed in mechanistic terms--as…
Oxidizing and Scavenging Characteristics of April Rains - OSCAR data report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Benkovitz, C.M.; Evans, V.A.; Tichler, J.L.
The organization of this report is as follows: Chapter 1 presents a description of the OSCAR experiment, including its objectives, design, and field deployment. Chapter 2 presents the OSCAR Central Data Coordination function and summarizes the tasks needed to compile each data set. Chapters 3 through 6 address each of the four OSCAR events. A synoptic description of each event is presented in these chapters, followed by a summary of the data captured during the event. Chapter 3 and Appendices C-G then present detailed tabular and graphical displays of the data captured during this event by the intermediate-density precipitation chemistry network, the BNL aircraft, and the surface air chemistry measurements conducted by BNL and by state/province agency networks. Data from the high-density precipitation chemistry network are being presented in a separate series of reports by Pacific Northwest Laboratory. Detailed displays of the data for events 2 to 4 have not been included in this report; however, selected portions could be developed for interested parties.
Nicolaou, K C; Baran, Phil S
2002-08-02
Imagine an artist carving a sculpture from a marble slab and finding gold nuggets in the process. This thought is not a far-fetched description of the work of a synthetic chemist pursuing the total synthesis of a natural product. At the end of the day, he or she will be judged by the artistry of the final work and the weight of the gold discovered in the process. However, as colorful as this description of total synthesis may be, it does not entirely capture the essence of the endeavor, for there is much more to be told, especially with regard to the contrast of frustrating failures and exhilarating moments of discovery. To fully appreciate the often Herculean nature of the task and the rewards that accompany it, one must sense the details of the enterprise behind the scenes. A more vivid description of total synthesis as a struggle against a tough opponent is perhaps appropriate to dramatize these elements of the experience. In this article we describe one such endeavor of total synthesis which, in addition to reaching the target molecule, resulted in a wealth of new synthetic strategies and technologies for chemical synthesis. The total synthesis of the CP molecules is compared to Theseus' most celebrated athlos (Greek for exploit, accomplishment): the conquest of the dreaded Minotaur, which he accomplished through brilliance, skill, and bravery having traversed the famous labyrinth with the help of Ariadne. This story from Greek mythology comes alive in modern synthetic expeditions toward natural products as exemplified by the total synthesis of the CP molecules which serve as a paradigm for modern total synthesis endeavors, where the objectives are discovery and invention in the broader sense of organic synthesis.
Narayanan, Jaishree; Dobrin, Sofia; Choi, Janet; Rubin, Susan; Pham, Anna; Patel, Vimal; Frigerio, Roberta; Maurer, Darryck; Gupta, Payal; Link, Lourdes; Walters, Shaun; Wang, Chi; Ji, Yuan; Maraganore, Demetrius M
2017-01-01
Using the electronic medical record (EMR) to capture structured clinical data at the point of care would be a practical way to support quality improvement and practice-based research in epilepsy. We describe our stepwise process for building structured clinical documentation support tools in the EMR that define best practices in epilepsy, and we describe how we incorporated these toolkits into our clinical workflow. These tools write notes and capture hundreds of fields of data, including several score tests: Generalized Anxiety Disorder-7 items, Neurological Disorders Depression Inventory for Epilepsy, Epworth Sleepiness Scale, Quality of Life in Epilepsy-10 items, Montreal Cognitive Assessment/Short Test of Mental Status, and Medical Research Council Prognostic Index. The tools summarize brain imaging, blood laboratory, and electroencephalography results, and document neuromodulation treatments. The tools provide Best Practices Advisories and other clinical decision support when appropriate, and prompt enrollment in a DNA biobanking study. We have thus far enrolled 231 patients for initial visits, are starting our first annual follow-up visits, and provide a brief description of our cohort. We are sharing these EMR tools and captured data with other epilepsy clinics as part of a Neurology Practice Based Research Network, and are using the tools to conduct pragmatic trials using subgroup-based adaptive designs. © 2016 The Authors. Epilepsia published by Wiley Periodicals, Inc. on behalf of International League Against Epilepsy.
NASA Astrophysics Data System (ADS)
Jankovic, Marko; Paul, Jan; Kirchner, Frank
2016-04-01
Recent studies of the space debris population in low Earth orbit (LEO) have concluded that certain regions have already reached a critical density of objects. This will eventually lead to a cascading process called the Kessler syndrome. The time may have come to seriously consider active debris removal (ADR) missions as the only viable way of preserving the space environment for future generations. Among all objects in the current environment, the SL-8 (Kosmos 3M second stage) rocket bodies (R/Bs) are some of the most suitable targets for future robotic ADR missions. However, to date, autonomous relative navigation to and capture of a non-cooperative target has never been performed. Therefore, there is a need for more advanced, autonomous, and modular systems that can cope with uncontrolled, tumbling objects. The guidance, navigation, and control (GNC) system is one of the most critical of these. The main objective of this paper is to present a preliminary concept of a modular GNC architecture that should enable a safe and fuel-efficient capture of a known but uncooperative target, such as a Kosmos 3M R/B. In particular, the concept was developed with the most critical part of an ADR mission in mind, i.e., close-range proximity operations, and with state-of-the-art algorithms in the field of autonomous rendezvous and docking. Finally, we briefly describe the hardware-in-the-loop (HIL) testing facility foreseen for the practical evaluation of the developed architecture.
Image fusion via nonlocal sparse K-SVD dictionary learning.
Li, Ying; Li, Fangyi; Bai, Bendu; Shen, Qiang
2016-03-01
Image fusion aims to merge two or more images of the same scene, captured via various sensors, to construct a more informative image by integrating their details. Generally, such integration is achieved through the manipulation of the representations of the images concerned. Sparse representation plays an important role in the effective description of images, offering great potential in a variety of image processing tasks, including image fusion. Supported by sparse representation, in this paper, an approach for image fusion by the use of a novel dictionary learning scheme is proposed. The nonlocal self-similarity property of the images is exploited, not only at the stage of learning the underlying description dictionary but during the process of image fusion. In particular, the property of nonlocal self-similarity is combined with the traditional sparse dictionary. This results in an improved learned dictionary, hereafter referred to as the nonlocal sparse K-SVD dictionary (where K-SVD stands for the K times singular value decomposition that is commonly used in the literature), and abbreviated to NL_SK_SVD. The NL_SK_SVD dictionary is applied to image fusion using simultaneous orthogonal matching pursuit. The proposed approach is evaluated with different types of images and compared with a number of alternative image fusion techniques. The superior fused images resulting from the present approach demonstrate the efficacy of the NL_SK_SVD dictionary in sparse image representation.
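A rough sketch of the coding-and-fusion pipeline, substituting scikit-learn's plain dictionary learner and OMP for the paper's nonlocal K-SVD variant, is shown below; the patch size, atom count, and max-absolute-coefficient fusion rule are assumptions for illustration.

    import numpy as np
    from sklearn.decomposition import MiniBatchDictionaryLearning
    from sklearn.feature_extraction.image import (
        extract_patches_2d, reconstruct_from_patches_2d)

    rng = np.random.default_rng(5)
    img_a, img_b = rng.random((32, 32)), rng.random((32, 32))  # demo inputs

    pa = extract_patches_2d(img_a, (6, 6)).reshape(-1, 36)
    pb = extract_patches_2d(img_b, (6, 6)).reshape(-1, 36)
    ma, mb = pa.mean(1, keepdims=True), pb.mean(1, keepdims=True)

    dico = MiniBatchDictionaryLearning(n_components=64,
                                       transform_algorithm='omp',
                                       transform_n_nonzero_coefs=4,
                                       random_state=0)
    dico.fit(np.vstack([pa - ma, pb - mb]))     # one dictionary, both images
    ca, cb = dico.transform(pa - ma), dico.transform(pb - mb)

    fused = np.where(np.abs(ca) >= np.abs(cb), ca, cb)   # keep stronger atoms
    patches = (fused @ dico.components_ + np.maximum(ma, mb)).reshape(-1, 6, 6)
    print(reconstruct_from_patches_2d(patches, (32, 32)).shape)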
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nelson, Thomas; Kataria, Atish; Soukri, Mustapha
It is increasingly clear that CO2 capture and sequestration (CCS) must play a critical role in curbing worldwide CO2 emissions to the atmosphere. Development of these technologies to cost-effectively remove CO2 from coal-fired power plants is very important to mitigating the impact these power plants have within the world's power generation portfolio. Currently, conventional CO2 capture technologies, such as aqueous-monoethanolamine-based solvent systems, are prohibitively expensive and, if implemented, could result in a 75 to 100% increase in the cost of electricity for consumers worldwide. Solid sorbent CO2 capture processes, such as RTI's Advanced Solid Sorbent CO2 Capture Process, are promising alternatives to conventional liquid solvents. Supported amine sorbents of the nature RTI has developed are particularly attractive due to their high CO2 loadings, low heat capacities, reduced corrosivity/volatility, and the potential to reduce the regeneration energy needed to carry out CO2 capture. Previous work in this area has failed to adequately address various technology challenges such as sorbent stability and regenerability, sorbent scale-up, improved physical strength and attrition resistance, proper heat management and temperature control, proper solids handling and circulation control, as well as the proper coupling of process engineering advancements tailored for a promising sorbent technology. The remaining challenges for these sorbent processes have provided the framework for the project team's research and development and the target for advancing the technology beyond lab- and bench-scale testing. Under a cooperative agreement with the US Department of Energy, and as part of NETL's CO2 Capture Program, RTI has led an effort to address and mitigate the challenges associated with solid sorbent CO2 capture. The overall objective of this project was to mitigate the technical and economic risks associated with the scale-up of solid sorbent-based CO2 capture processes, enabling subsequent larger pilot demonstrations and ultimately commercial deployment. An integrated development approach has been a key focus of this project, in which process development, sorbent development, and economic analyses have each informed the other development processes. Development efforts have focused on improving the performance stability of sorbent candidates, refining process engineering and design, and evaluating the viability of the technology through detailed economic analyses. Sorbent advancements have led to a next-generation, commercially viable CO2 capture sorbent exhibiting performance stability in various gas environments and a physically strong fluidizable form. The team has reduced sorbent production costs and optimized the production process and scale-up of PEI-impregnated, fluidizable sorbents. Refinement of the process engineering and design, as well as the construction and operation of a bench-scale research unit, has demonstrated promising CO2 capture performance under simulated coal-fired flue gas conditions. Parametric testing has shown how CO2 capture performance is impacted by changing process variables, such as Adsorber temperature, Regenerator temperature, superficial flue gas velocity, solids circulation rate, CO2 partial pressure in the Regenerator, and many others.
Long-term testing has generated data for the project team to set the process conditions needed to operate a solids-based system for optimal performance, with continuous 90% CO2 capture and no operational interruptions. Data collected from all phases of testing have been used to develop a detailed techno-economic assessment of RTI's technology. These detailed analyses show that RTI's technology has significant economic advantages over current amine scrubbing and the potential to achieve the DOE Carbon Capture Program's goal of >90% CO2 capture at a cost of <$40/T-CO2 captured by 2025. Through this integrated technology development approach, the project team has advanced RTI's CO2 capture technology to TRL-4 (nearly TRL-5, with the missing variable being testing on actual coal-fired flue gas), according to the DOE/FE definitions for Technology Readiness Levels. At a broader level, this project has advanced the whole of the solid sorbent CO2 capture field, with advancements in process engineering and design, technical risk mitigation, sorbent scale-up optimization, and an understanding of the commercial viability and applicability of solid sorbent CO2 capture technologies for the existing U.S. fleet of coal-fired power plants.
Avian models for toxicity testing
Hill, E.F.; Hoffman, D.J.
1984-01-01
The use of birds as test models in experimental and environmental toxicology as related to health effects is reviewed, and an overview of descriptive tests routinely used in wildlife toxicology is provided. Toxicologic research on birds may be applicable to human health both directly by their use as models for mechanistic and descriptive studies and indirectly as monitors of environmental quality. Topics include the use of birds as models for study of teratogenesis and embryotoxicity, neurotoxicity, behavior, trends of environmental pollution, and for use in predictive wildlife toxicology. Uses of domestic and wild-captured birds are discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
BEVINS, R.R.
This study is a requirements document that presents analysis for the functional description of the master pump shutdown system. This document identifies the sources of the requirements and/or how these were derived. Each requirement is validated either by quoting the source or by an analysis process involving the required functionality, performance characteristics, operations input, or engineering judgment. The requirements in this study apply to the first phase of the W314 Project. This document has been updated during the definitive design portion of the first phase of the W314 Project to capture additional software requirements, and is planned to be updated during the second phase of the W314 Project to cover the second phase of the project's scope.
Using observational methods in nursing research.
Salmon, Jenny
2015-07-08
Observation is a research data-collection method generally used to capture the activities of participants as well as when and where things happen in a given setting. It checks descriptions of the phenomena against what the researcher perceives to be fact in a rich experiential context. The method's main strength is that it provides direct access to the social phenomena under consideration. It can be used quantitatively or qualitatively, depending on the research question. Challenges in using observation relate to adopting the role of participant or non-participant researcher as observer. This article discusses some of the complexities involved when nurse researchers seek to collect observational data on social processes in naturalistic settings using unstructured or structured observational methods in qualitative research methodology. A glossary of research terms is provided.
Ab initio theory and modeling of water.
Chen, Mohan; Ko, Hsin-Yu; Remsing, Richard C; Calegari Andrade, Marcos F; Santra, Biswajit; Sun, Zhaoru; Selloni, Annabella; Car, Roberto; Klein, Michael L; Perdew, John P; Wu, Xifan
2017-10-10
Water is of the utmost importance for life and technology. However, a genuinely predictive ab initio model of water has eluded scientists. We demonstrate that a fully ab initio approach, relying on the strongly constrained and appropriately normed (SCAN) density functional, provides such a description of water. SCAN accurately describes the balance among covalent bonds, hydrogen bonds, and van der Waals interactions that dictates the structure and dynamics of liquid water. Notably, SCAN captures the density difference between water and ice Ih at ambient conditions, as well as many important structural, electronic, and dynamic properties of liquid water. These successful predictions of the versatile SCAN functional open the gates to study complex processes in aqueous phase chemistry and the interactions of water with other materials in an efficient, accurate, and predictive, ab initio manner.
Bulen, Andrew; Carter, Jonathan J.; Varanka, Dalia E.
2011-01-01
To expand data functionality and capabilities for users of The National Map of the U.S. Geological Survey, data sets for six watersheds and three urban areas were converted from the Best Practices vector data model formats to Semantic Web data formats. This report describes and documents the conversion process. The report begins with an introduction to basic Semantic Web standards and the background of The National Map. Data were converted from a proprietary format to Geography Markup Language to capture the geometric footprint of topographic data features. Configuration files were designed to eliminate redundancy and make the conversion more efficient. A SPARQL endpoint was established for data validation and queries. The report concludes by describing the results of the conversion.
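Once such a SPARQL endpoint is running, validation queries can be issued programmatically; the following sketch uses SPARQLWrapper against a hypothetical endpoint URL and vocabulary, since the report's actual URIs are not reproduced here.

    from SPARQLWrapper import SPARQLWrapper, JSON

    sparql = SPARQLWrapper("http://example.org/usgs/sparql")  # assumed URL
    sparql.setQuery("""
        PREFIX geo: <http://example.org/topo#>
        SELECT ?feature ?geom WHERE {
            ?feature a geo:StreamSegment ;     # assumed class name
                     geo:hasGeometry ?geom .   # assumed predicate
        } LIMIT 10
    """)
    sparql.setReturnFormat(JSON)
    for row in sparql.query().convert()["results"]["bindings"]:
        print(row["feature"]["value"])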
Accurate simulations of helium pick-up experiments using a rejection-free Monte Carlo method
NASA Astrophysics Data System (ADS)
Dutra, Matthew; Hinde, Robert
2018-04-01
In this paper, we present Monte Carlo simulations of helium droplet pick-up experiments with the intention of developing a robust and accurate theoretical approach for interpreting experimental helium droplet calorimetry data. Our approach is capable of capturing the evaporative behavior of helium droplets following dopant acquisition, allowing for a more realistic description of the pick-up process. Furthermore, we circumvent the traditional assumption of bulk helium behavior by utilizing density functional calculations of the size-dependent helium droplet chemical potential. The results of this new Monte Carlo technique are compared to commonly used Poisson pick-up statistics for simulations that reflect a broad range of experimental parameters. We conclude by offering an assessment of both of these theoretical approaches in the context of our observed results.
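The contrast with Poisson pick-up statistics can be seen in a toy model where each capture evaporates helium and shrinks the droplet's geometric cross-section; the droplet size, per-capture cost, and N^(2/3) scaling below are illustrative assumptions, not the paper's density-functional inputs.

    import numpy as np
    from math import exp, factorial

    def poisson_pickup(k, lam):
        """Probability of exactly k captures under plain Poisson statistics."""
        return lam**k * exp(-lam) / factorial(k)

    rng = np.random.default_rng(6)

    def simulate(n0=5000, lam=3.0, cost=800, trials=50_000):
        counts = np.zeros(12)
        for _ in range(trials):
            n, k = n0, 0
            for _ in range(rng.poisson(lam)):       # candidate collisions
                if n > 0 and rng.random() < (n / n0) ** (2 / 3):
                    k, n = k + 1, n - cost          # capture, then evaporate
            counts[k] += 1
        return counts / trials

    sim = simulate()
    print("Poisson  :", [round(poisson_pickup(k, 3.0), 3) for k in range(5)])
    print("shrinking:", sim[:5].round(3))

The shrinking-droplet histogram skews toward fewer captures than the Poisson prediction, which is the qualitative effect such simulations aim to quantify.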
Two-step rapid sulfur capture. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1994-04-01
The primary goal of this program was to test the technical and economic feasibility of a novel dry sorbent injection process called the Two-Step Rapid Sulfur Capture process for several advanced coal utilization systems. The Two-Step Rapid Sulfur Capture process consists of limestone activation in a high temperature auxiliary burner for short times, followed by sorbent quenching in a lower temperature sulfur-containing coal combustion gas. The Two-Step Rapid Sulfur Capture process is based on the Non-Equilibrium Sulfur Capture process developed by the Energy Technology Office of Textron Defense Systems (ETO/TDS). Based on the Non-Equilibrium Sulfur Capture studies, the range of conditions for optimum sorbent activation was thought to be: activation temperature > 2,200 K for activation times in the range of 10-30 ms. Therefore, the aim of the Two-Step process is to create a very active sorbent (under conditions similar to the bomb reactor) and complete the sulfur reaction under thermodynamically favorable conditions. A flow facility was designed and assembled to simulate the temperature, time, stoichiometry, and sulfur gas concentration prevalent in advanced coal utilization systems such as gasifiers, fluidized bed combustors, mixed-metal oxide desulfurization systems, diesel engines, and gas turbines.
NASA Astrophysics Data System (ADS)
Zhu, Ming; Liu, Tingting; Wang, Shu; Zhang, Kesheng
2017-08-01
Existing two-frequency reconstructive methods can only capture primary (single) molecular relaxation processes in excitable gases. In this paper, we present a reconstructive method based on the novel decomposition of frequency-dependent acoustic relaxation spectra to capture the entire molecular multimode relaxation process. This decomposition of acoustic relaxation spectra is developed from the frequency-dependent effective specific heat, indicating that a multi-relaxation process is the sum of the interior single-relaxation processes. Based on this decomposition, we can reconstruct the entire multi-relaxation process by capturing the relaxation times and relaxation strengths of N interior single-relaxation processes, using the measurements of acoustic absorption and sound speed at 2N frequencies. Experimental data for the gas mixtures CO2-N2 and CO2-O2 validate our decomposition and reconstruction approach.
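The decomposition can be made concrete with a small numerical sketch: each single-relaxation process contributes a Debye-type absorption peak, and N strengths plus N relaxation times give 2N unknowns, matching measurements at 2N frequencies. The sketch below uses a generic nonlinear least-squares fit rather than the authors' algebraic reconstruction, and all numbers are illustrative:

```python
import numpy as np
from scipy.optimize import least_squares

def single_relax(f, strength, tau):
    """Absorption-per-wavelength of one relaxation process (Debye form)."""
    x = 2.0 * np.pi * f * tau
    return strength * x / (1.0 + x**2)

def multi_relax(f, params):
    """Sum of N single-relaxation spectra; params = [s1, tau1, s2, tau2, ...]."""
    s = np.zeros_like(f)
    for i in range(0, len(params), 2):
        s += single_relax(f, params[i], params[i + 1])
    return s

# Synthetic "measurements" at 2N frequencies for N = 2 processes.
true = [0.3, 1.0e-6, 0.1, 2.0e-5]        # hypothetical strengths / times
f_meas = np.array([5e3, 5e4, 2e5, 1e6])  # 2N = 4 measurement frequencies, Hz
y_meas = multi_relax(f_meas, true)

fit = least_squares(lambda p: multi_relax(f_meas, p) - y_meas,
                    x0=[0.2, 5e-7, 0.2, 5e-5], bounds=(0, np.inf))
print("recovered (strength, tau) pairs:", fit.x)
```

With exactly 2N frequencies the system is formally determined; in practice, noise makes an overdetermined fit at additional frequencies more robust.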
DOE Office of Scientific and Technical Information (OSTI.GOV)
Becker, Stephen Allan
2016-01-28
During the astrophysical r-process, multiple neutron captures occur so rapidly on target nuclei that their daughter nuclei generally do not have time to undergo radioactive decay before another neutron is captured. The r-process can be approximately simulated on Earth in certain types of thermonuclear explosions through an analogous process of rapid neutron captures known as the "prompt capture" process. Between 1952 and 1969, 23 nuclear tests were fielded by the US which were involved (at least partially) with the "prompt capture" process. Of these tests, 15 were at least partially successful. Some of these tests were conducted under the Plowshare Peaceful Nuclear Explosion Program as scientific research experiments. It is now known that the USSR conducted similar nuclear tests during 1966 to 1979. The elements einsteinium and fermium were first discovered by this process. The most successful tests achieved 19 successive neutron captures on the initial target nuclei. A review of the US program, target nuclei used, heavy element yields, scientific achievements of the program, and how some of the results have been used by the astrophysical community is given. Finally, some unanswered questions concerning very neutron-rich nuclei that could potentially have been answered with additional nuclear experiments are presented.
NASA Astrophysics Data System (ADS)
Wendlandt, R. F.; Foremski, J. J.
2013-12-01
Laboratory experiments show that it is possible to integrate (1) the chemistry of serpentine dissolution, (2) capture of CO2 gas from the combustion of natural gas and coal-fired power plants using aqueous amine-based solvents, (3) long-term CO2 sequestration via solid phase carbonate precipitation, and (4) capture solvent regeneration with acid recycling in a single, continuous process. In our process, magnesium is released from serpentine at 300°C via heat treatment with ammonium sulfate salts or at temperatures as low as 50°C via reaction with sulfuric acid. We have also demonstrated that various solid carbonate phases can be precipitated directly from aqueous amine-based (NH3, MEA, DMEA) CO2 capture solvent solutions at room temperature. Direct precipitation from the capture solvent enables regenerating CO2 capture solvent without the need for heat and without the need to compress the CO2 off gas. We propose that known low-temperature electrochemical methods can be integrated with this process to regenerate the aqueous amine capture solvent and recycle acid for dissolution of magnesium-bearing mineral feedstocks and magnesium release. Although the direct precipitation of magnesite at ambient conditions remains elusive, experimental results demonstrate that at temperatures ranging from 20°C to 60°C, either nesquehonite Mg(HCO3)(OH)·2H2O or a double salt with the formula [NH4]2Mg(CO3)2·4H2O or an amorphous magnesium carbonate precipitate directly from the capture solvent. These phases are less desirable for CO2 sequestration than magnesite because they potentially remove constituents (water, ammonia) from the reaction system, reducing the overall efficiency of the sequestration process. Accordingly, the integrated process can be accomplished with minimal energy consumption and loss of CO2 capture and acid solvents, and a net generation of 1 to 4 moles of H2O/6 moles of CO2 sequestered (depending on the solid carbonate precipitate and amount of produced H2 and O2 gas reacted to produce heat and water). Features of the integrated process include the following: 1) the four separate processes have compatible chemistry, enabling design of an integrated, continuous process scheme for CO2 capture and sequestration; 2) all 4 stages of the process can be conducted at ambient or slightly elevated temperatures; 3) precipitating carbonate directly from the capture solvent eliminates the need for costly CO2 gas compression; and 4) recycling the acid used for serpentine dissolution and the solvent used for CO2 capture reduces feed stock costs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davidson, George S.; Brown, William Michael
2007-09-01
Techniques for high throughput determinations of interactomes, together with high resolution protein colocalization maps within organelles and through membranes, will soon create a vast resource. With these data, biological descriptions, akin to the high dimensional phase spaces familiar to physicists, will become possible. These descriptions will capture sufficient information to make possible realistic, system-level models of cells. The descriptions and the computational models they enable will require powerful computing techniques. This report is offered as a call to the computational biology community to begin thinking at this scale and as a challenge to develop the required algorithms and codes to make use of the new data.
Neutron capture on short-lived nuclei via the surrogate (d,pγ) reaction
NASA Astrophysics Data System (ADS)
Cizewski, Jolie A.; Ratkiewicz, Andrew
2018-05-01
Rapid neutron capture (r-process) nucleosynthesis is responsible for the creation of about half of the elements heavier than iron. Neutron capture on short-lived nuclei in cold processes or during freeze-out from hot processes can have a significant impact on the final observed r-process abundances. We are validating the (d,pγ) reaction as a surrogate for neutron capture with measurements on 95Mo targets and a focus on discrete transitions. The experimental results have been analyzed within the Hauser-Feshbach approach, with non-elastic breakup of the deuteron providing a neutron to be captured. Preliminary results support the (d,pγ) reaction as a valid surrogate for neutron capture. We are poised to measure the (d,pγ) reaction in inverse kinematics with unstable beams following the development of the experimental techniques.
The health care and life sciences community profile for dataset descriptions
Alexiev, Vladimir; Ansell, Peter; Bader, Gary; Baran, Joachim; Bolleman, Jerven T.; Callahan, Alison; Cruz-Toledo, José; Gaudet, Pascale; Gombocz, Erich A.; Gonzalez-Beltran, Alejandra N.; Groth, Paul; Haendel, Melissa; Ito, Maori; Jupp, Simon; Juty, Nick; Katayama, Toshiaki; Kobayashi, Norio; Krishnaswami, Kalpana; Laibe, Camille; Le Novère, Nicolas; Lin, Simon; Malone, James; Miller, Michael; Mungall, Christopher J.; Rietveld, Laurens; Wimalaratne, Sarala M.; Yamaguchi, Atsuko
2016-01-01
Access to consistent, high-quality metadata is critical to finding, understanding, and reusing scientific data. However, while there are many relevant vocabularies for the annotation of a dataset, none sufficiently captures all the necessary metadata. This prevents uniform indexing and querying of dataset repositories. Towards providing a practical guide for producing a high quality description of biomedical datasets, the W3C Semantic Web for Health Care and the Life Sciences Interest Group (HCLSIG) identified Resource Description Framework (RDF) vocabularies that could be used to specify common metadata elements and their value sets. The resulting guideline covers elements of description, identification, attribution, versioning, provenance, and content summarization. This guideline reuses existing vocabularies, and is intended to meet key functional requirements including indexing, discovery, exchange, query, and retrieval of datasets, thereby enabling the publication of FAIR data. The resulting metadata profile is generic and could be used by other domains with an interest in providing machine readable descriptions of versioned datasets. PMID:27602295
Creating Body Shapes From Verbal Descriptions by Linking Similarity Spaces.
Hill, Matthew Q; Streuber, Stephan; Hahn, Carina A; Black, Michael J; O'Toole, Alice J
2016-11-01
Brief verbal descriptions of people's bodies (e.g., "curvy," "long-legged") can elicit vivid mental images. The ease with which these mental images are created belies the complexity of three-dimensional body shapes. We explored the relationship between body shapes and body descriptions and showed that a small number of words can be used to generate categorically accurate representations of three-dimensional bodies. The dimensions of body-shape variation that emerged in a language-based similarity space were related to major dimensions of variation computed directly from three-dimensional laser scans of 2,094 bodies. This relationship allowed us to generate three-dimensional models of people in the shape space using only their coordinates on analogous dimensions in the language-based description space. Human descriptions of photographed bodies and their corresponding models matched closely. The natural mapping between the spaces illustrates the role of language as a concise code for body shape that captures perceptually salient global and local body features.
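The central operation, predicting a point in the body-shape space from coordinates in the language-based description space, amounts to learning a map between two vector spaces. A minimal sketch with a linear least-squares map on random placeholder data follows; the dimensions, names, and the linearity assumption are ours, not the authors':

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training data: each row pairs a body's coordinates in the
# language-based description space (D dims) with its coordinates in the
# shape space derived from 3-D laser scans (S dims).
n_bodies, D, S = 200, 8, 10
X_lang = rng.normal(size=(n_bodies, D))
W_true = rng.normal(size=(D, S))
Y_shape = X_lang @ W_true + 0.05 * rng.normal(size=(n_bodies, S))

# Least-squares linear map from description space to shape space.
W, *_ = np.linalg.lstsq(X_lang, Y_shape, rcond=None)

# Generate a shape-space point (hence a 3-D body model) from a new description.
new_description = rng.normal(size=(1, D))
predicted_shape = new_description @ W
print(predicted_shape.round(2))
```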
Cycle development and design for CO2 capture from flue gas by vacuum swing adsorption.
Zhang, Jun; Webley, Paul A
2008-01-15
CO2 capture and storage is an important component in the development of clean power generation processes. One CO2 capture technology is gas-phase adsorption, specifically pressure (or vacuum) swing adsorption. The complexity of these processes makes evaluation and assessment of new adsorbents difficult and time-consuming. In this study, we have developed a simple model specifically targeted at CO2 capture by pressure swing adsorption and validated our model by comparison with data from a fully instrumented pilot-scale pressure swing adsorption process. The model captures nonisothermal effects as well as nonlinear adsorption and nitrogen coadsorption. Using the model and our apparatus, we have designed and studied a large number of cycles for CO2 capture. We demonstrate that by careful management of adsorption fronts and assembly of cycles based on understanding of the roles of individual steps, we are able to quickly assess the effect of adsorbents and process parameters on capture performance and identify optimal operating regimes and cycles. We recommend this approach in contrast to exhaustive parametric studies, which tend to depend on specifics of the chosen cycle and adsorbent. We show that appropriate combinations of process steps can yield excellent process performance and demonstrate how pressure drop, heat loss, etc., affect process performance through their effect on adsorption fronts and profiles. Finally, cyclic temperature profiles along the adsorption column can be readily used to infer concentration profiles; this has proved to be a very useful tool in cyclic function definition. Our research reveals excellent promise for the application of pressure/vacuum swing adsorption technology in the arena of CO2 capture from flue gases.
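The abstract does not give the model equations, but simple PSA/VSA models of this kind commonly combine a competitive (extended) Langmuir isotherm for CO2/N2 coadsorption with a van't Hoff temperature dependence for the non-isothermal behavior. A sketch under those assumptions, with all parameter values hypothetical:

```python
import numpy as np

R = 8.314  # gas constant, J/(mol K)

def vant_hoff(b0, dH, T):
    """Temperature-dependent Langmuir affinity, b(T) = b0 * exp(-dH/(R*T))."""
    return b0 * np.exp(-dH / (R * T))

def extended_langmuir(p_co2, p_n2, T):
    """Competitive (extended) Langmuir loadings for CO2/N2, mol/kg.
    All parameter values below are hypothetical placeholders."""
    qs_co2, qs_n2 = 5.0, 3.0               # saturation capacities, mol/kg
    b_co2 = vant_hoff(2e-7, -35e3, T)      # affinity, 1/kPa
    b_n2 = vant_hoff(5e-8, -18e3, T)
    denom = 1.0 + b_co2 * p_co2 + b_n2 * p_n2
    return qs_co2 * b_co2 * p_co2 / denom, qs_n2 * b_n2 * p_n2 / denom

# Typical flue-gas-like partial pressures (kPa) and a warm column temperature.
q_co2, q_n2 = extended_langmuir(p_co2=12.0, p_n2=88.0, T=313.0)
print(f"CO2 loading {q_co2:.2f} mol/kg, N2 loading {q_n2:.2f} mol/kg")
```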
Waste Form and In-Drift Colloids-Associated Radionuclide Concentrations: Abstraction and Summary
DOE Office of Scientific and Technical Information (OSTI.GOV)
R. Aguilar
This Model Report describes the analysis and abstractions of the colloids process model for the waste form and engineered barrier system components of the total system performance assessment calculations to be performed with the Total System Performance Assessment-License Application model. Included in this report is a description of (1) the types and concentrations of colloids that could be generated in the waste package from degradation of waste forms and the corrosion of the waste package materials, (2) types and concentrations of colloids produced from the steel components of the repository and their potential role in radionuclide transport, and (3) types and concentrations of colloids present in natural waters in the vicinity of Yucca Mountain. Additionally, attachment/detachment characteristics and mechanisms of colloids anticipated in the repository are addressed and discussed. The abstraction of the process model is intended to capture the most important characteristics of radionuclide-colloid behavior for use in predicting the potential impact of colloid-facilitated radionuclide transport on repository performance.
Uy, Raymonde Charles Y.; Kury, Fabricio P.; Fontelo, Paul A.
2015-01-01
The standard of safe medication practice requires strict observance of the five rights of medication administration: the right patient, drug, time, dose, and route. Despite adherence to these guidelines, medication errors remain a public health concern that has generated health policies and hospital processes that leverage automation and computerization to reduce these errors. Bar code, RFID, biometrics, and pharmacy automation technologies have been demonstrated in the literature to decrease the incidence of medication errors by minimizing the human factors involved in the process. Despite evidence suggesting the effectiveness of these technologies, adoption rates and trends vary across hospital systems. The objective of this study is to examine the state and adoption trends of automatic identification and data capture (AIDC) methods and pharmacy automation technologies in U.S. hospitals. A retrospective descriptive analysis of survey data from the HIMSS Analytics® Database was done, demonstrating optimistic growth in the adoption of these patient safety solutions. PMID:26958264
40 CFR 63.11509 - What are my notification, reporting, and recordkeeping requirements?
Code of Federal Regulations, 2014 CFR
2014-07-01
... part. (3) The records required to show continuous compliance with each management practice and... management practices and equipment standards. (iii) Description of the capture and emission control systems... that is subject to the requirements in § 63.11507(b), “What are my standards and management practices...
40 CFR 63.11509 - What are my notification, reporting, and recordkeeping requirements?
Code of Federal Regulations, 2012 CFR
2012-07-01
... part. (3) The records required to show continuous compliance with each management practice and... management practices and equipment standards. (iii) Description of the capture and emission control systems... that is subject to the requirements in § 63.11507(b), “What are my standards and management practices...
40 CFR 63.11509 - What are my notification, reporting, and recordkeeping requirements?
Code of Federal Regulations, 2013 CFR
2013-07-01
... part. (3) The records required to show continuous compliance with each management practice and... management practices and equipment standards. (iii) Description of the capture and emission control systems... that is subject to the requirements in § 63.11507(b), “What are my standards and management practices...
Profiling a Mind Map User: A Descriptive Appraisal
ERIC Educational Resources Information Center
Tucker, Joanne M.; Armstrong, Gary R.; Massad, Victor J.
2010-01-01
Whether done manually or with software, the non-linear information organization framework known as mind mapping offers an alternative to linear thinking modes, such as outlining, for capturing thoughts, ideas, and information. Mind mapping combines brainstorming, organizing, and problem solving. This paper examines mind mapping techniques,…
ERIC Educational Resources Information Center
Harris, Violet J.
2011-01-01
Author Virginia Hamilton had the gift of creating lyrical phrases that captured the complexities of life. Among her most notable phrases is the idea of the "hopescape," the metaphoric description of the pains and joys, triumphs and defeats, longing, and dreams that make us human. The publication of an edited volume that compiles a sampling of…
The Futility of Attempting to Codify Academic Achievement Standards
ERIC Educational Resources Information Center
Sadler, D. Royce
2014-01-01
Internationally, attempts at developing explicit descriptions of academic achievement standards have been steadily intensifying. The aim has been to capture the essence of the standards in words, symbols or diagrams (collectively referred to as codifications) so that standards can be: set and maintained at appropriate levels; made broadly…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jones, Dale A.
This model description is supplemental to the Lawrence Livermore National Laboratory (LLNL) report LLNL-TR-642494, Technoeconomic Evaluation of MEA versus Mixed Amines for CO2 Removal at Near-Commercial Scale at Duke Energy Gibson 3 Plant. We describe the assumptions and methodology used in the Laboratory's simulation of its understanding of Huaneng's novel amine solvent for CO2 capture with 35% mixed amine. The results of that simulation have been described in LLNL-TR-642494. The simulation was performed using ASPEN 7.0. The composition of Huaneng's novel amine solvent was estimated based on information gleaned from Huaneng patents. The chemistry of the process was described using nine equations, representing reactions within the absorber and stripper columns using the ELECTNRTL property method. As a rate-based ASPEN simulation model was not available to Lawrence Livermore at the time of writing, the height of a theoretical plate was estimated using open literature for similar processes. The composition of the flue gas was estimated based on information supplied by Duke Energy for Unit 3 of the Gibson plant. The simulation was scaled to one million short tons of CO2 absorbed per year. To aid stability of the model, convergence of the main solvent recycle loop was implemented manually, as described in the Blocks section below. Automatic convergence of this loop led to instability during the model iterations. Manual convergence of the loop enabled accurate representation and maintenance of model stability.
Systems Analysis of Physical Absorption of CO2 in Ionic Liquids for Pre-Combustion Carbon Capture.
Zhai, Haibo; Rubin, Edward S
2018-04-17
This study develops an integrated technical and economic modeling framework to investigate the feasibility of ionic liquids (ILs) for precombustion carbon capture. The IL 1-hexyl-3-methylimidazolium bis(trifluoromethylsulfonyl)imide is modeled as a potential physical solvent for CO2 capture at integrated gasification combined cycle (IGCC) power plants. The analysis reveals that the energy penalty of the IL-based capture system comes mainly from compression of the process and product streams and from solvent pumping, while the major capital cost components are the compressors and absorbers. On the basis of the plant-level analysis, the cost of CO2 avoided by the IL-based capture and storage system is estimated to be $63 per tonne of CO2. Technical and economic comparisons between IL- and Selexol-based capture systems at the plant level show that an IL-based system could be a feasible option for CO2 capture. Improving the CO2 solubility of ILs can simplify the capture process configuration and lower the process energy and cost penalties to further enhance the viability of this technology.
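The headline figure rests on the standard cost-of-CO2-avoided metric: the increase in the levelized cost of electricity divided by the reduction in emission intensity. A sketch, with input numbers that are hypothetical values chosen only to land near the reported $63/tonne rather than the paper's actual inputs:

```python
def cost_of_co2_avoided(lcoe_ref, lcoe_cap, e_ref, e_cap):
    """Standard plant-level metric: extra cost of electricity divided by
    the reduction in emission intensity, in $/tCO2.
    lcoe_* in $/MWh, e_* in tCO2/MWh."""
    return (lcoe_cap - lcoe_ref) / (e_ref - e_cap)

# Hypothetical IGCC numbers for illustration only (not the paper's inputs).
print(cost_of_co2_avoided(lcoe_ref=76.0, lcoe_cap=118.0,
                          e_ref=0.75, e_cap=0.08))  # ~ 63 $/tCO2
```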
NASA Astrophysics Data System (ADS)
Patel, Ravi; Kong, Bo; Capecelatro, Jesse; Fox, Rodney; Desjardins, Olivier
2017-11-01
Particle-laden turbulent flows are important features of many environmental and industrial processes. Euler-Euler (EE) simulations of these flows are more computationally efficient than Euler-Lagrange (EL) simulations. However, traditional EE methods, such as the two-fluid model, cannot faithfully capture dilute regions of flow with finite Stokes number particles. For this purpose, the multi-valued nature of the particle velocity field must be treated with a polykinetic description. Various quadrature-based moment methods (QBMM) can be used to approximate the full kinetic description by solving for a set of moments of the particle velocity distribution function (VDF) and providing closures for the higher-order moments. Early QBMM fail to maintain the strict hyperbolicity of the kinetic equations, producing unphysical delta shocks (i.e., mass accumulation at a point). In previous work, a 2-D conditional hyperbolic quadrature method of moments (CHyQMOM) was proposed as a fourth-order QBMM closure that maintains strict hyperbolicity. Here, we present the 3-D extension of CHyQMOM. We compare results from CHyQMOM to other QBMM and EL in the context of particle trajectory crossing, cluster-induced turbulence, and particle-laden channel flow. NSF CBET-1437903.
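The flavor of the closure is easiest to see in one dimension: the two-node HyQMOM inversion maps the first four velocity moments to quadrature weights and abscissas that reproduce those moments exactly while keeping the moment system hyperbolic; CHyQMOM extends this building block to 2-D and 3-D velocity spaces via conditional moments. A sketch of the standard 1-D step:

```python
import numpy as np

def hyqmom2(m):
    """Two-node HyQMOM inversion in 1-D: from moments (m0, m1, m2, m3),
    recover quadrature weights and abscissas that reproduce those moments
    exactly. This is the standard construction the paper builds on."""
    m0, m1, m2, m3 = m
    mu = m1 / m0                                   # mean velocity
    var = m2 / m0 - mu**2                          # central variance
    sig = np.sqrt(var)
    q = (m3 / m0 - 3 * mu * m2 / m0 + 2 * mu**3) / sig**3  # skewness
    r = np.sqrt(1.0 + 0.25 * q**2)
    a = np.array([0.5 * q - r, 0.5 * q + r])       # standardized abscissas
    x = mu + sig * a
    w = m0 * np.array([a[1], -a[0]]) / (a[1] - a[0])
    return w, x

w, x = hyqmom2([1.0, 0.3, 0.5, 0.2])
print("weights:", w, "abscissas:", x)
for k in range(4):                     # check the moments are reproduced
    print(k, np.sum(w * x**k))
```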
NASA Technical Reports Server (NTRS)
Rasmussen, Robert; Bennett, Matthew
2006-01-01
The State Analysis Database Tool software establishes a productive environment for collaboration among software and system engineers engaged in the development of complex interacting systems. The tool embodies State Analysis, a model-based system engineering methodology founded on a state-based control architecture. A state represents a momentary condition of an evolving system, and a model may describe how a state evolves and is affected by other states. The State Analysis methodology is a process for capturing system and software requirements in the form of explicit models and states, and defining goal-based operational plans consistent with the models. Requirements, models, and operational concerns have traditionally been documented in a variety of system engineering artifacts that address different aspects of a mission's lifecycle. In State Analysis, requirements, models, and operations information are State Analysis artifacts that are consistent and stored in a State Analysis Database. The tool includes a back-end database, a multi-platform front-end client, and Web-based administrative functions. The tool is structured to prompt an engineer to follow the State Analysis methodology, to encourage state discovery and model description, and to make software requirements and operations plans consistent with model descriptions.
Bölte, Sven; de Schipper, Elles; Holtmann, Martin; Karande, Sunil; de Vries, Petrus J; Selb, Melissa; Tannock, Rosemary
2014-12-01
In the study of health and quality of life in attention deficit/hyperactivity disorder (ADHD), it is of paramount importance to include assessment of functioning. The International Classification of Functioning, Disability and Health (ICF) provides a comprehensive, universally accepted framework for the description of functioning in relation to health conditions. In this paper, the authors outline the process to develop ICF Core Sets for ADHD. ICF Core Sets are subgroups of ICF categories selected to capture the aspects of functioning that are most likely to be affected in specific disorders. The ICF categories that will be included in the ICF Core Sets for ADHD will be determined at an ICF Core Set Consensus Conference, wherein evidence from four preliminary studies (a systematic review, an expert survey, a patient and caregiver qualitative study, and a clinical cross-sectional study) will be integrated. Comprehensive and Brief ICF Core Sets for ADHD will be developed with the goal of providing useful standards for research and clinical practice, and to generate a common language for the description of functioning in ADHD in different areas of life and across the lifespan.
Attentional Capture by Emotional Stimuli Is Modulated by Semantic Processing
ERIC Educational Resources Information Center
Huang, Yang-Ming; Baddeley, Alan; Young, Andrew W.
2008-01-01
The attentional blink paradigm was used to examine whether emotional stimuli always capture attention. The processing requirement for emotional stimuli in a rapid sequential visual presentation stream was manipulated to investigate the circumstances under which emotional distractors capture attention, as reflected in an enhanced attentional blink…
Ang, Yuchen; Wong, Ling Jing; Meier, Rudolf
2013-01-01
Many species descriptions, especially older ones, consist mostly of text and have few illustrations. Only the most conspicuous morphological features needed for species diagnosis and delimitation at the time of description are illustrated. Such descriptions can quickly become inadequate when new species or characters are discovered. We propose that descriptions should become more data-rich by presenting a large amount of images and illustrations to cover as much morphology as possible; these descriptions are more likely to remain adequate over time because their large amounts of visual data could capture character systems that may become important in the future. Such an approach can now be quickly and easily achieved given that high-quality digital photography is readily available. Here, we re-describe the sepsid fly Perochaeta orientalis (de Meijere 1913) (Diptera, Sepsidae) which has suffered from inadequate descriptions in the past, and use photomicrography, scanning electron microscopy and videography to document its external morphology and mating behaviour. All images and videos are embedded within the electronic publication. We discuss briefly benefits and problems with our approach. PMID:24363567
2011-01-01
Genome targeting methods enable cost-effective capture of specific subsets of the genome for sequencing. We present here an automated, highly scalable method for carrying out the Solution Hybrid Selection capture approach that provides a dramatic increase in scale and throughput of sequence-ready libraries produced. Significant process improvements and a series of in-process quality control checkpoints are also added. These process improvements can also be used in a manual version of the protocol. PMID:21205303
Lamm, Ayelet T; Stadler, Michael R; Zhang, Huibin; Gent, Jonathan I; Fire, Andrew Z
2011-02-01
We have used a combination of three high-throughput RNA capture and sequencing methods to refine and augment the transcriptome map of a well-studied genetic model, Caenorhabditis elegans. The three methods include a standard (non-directional) library preparation protocol relying on cDNA priming and foldback that has been used in several previous studies for transcriptome characterization in this species, and two directional protocols, one involving direct capture of single-stranded RNA fragments and one involving circular-template PCR (CircLigase). We find that each RNA-seq approach shows specific limitations and biases, with the application of multiple methods providing a more complete map than was obtained from any single method. Of particular note in the analysis were substantial advantages of CircLigase-based and ssRNA-based capture for defining sequences and structures of the precise 5' ends (which were lost using the double-strand cDNA capture method). Of the three methods, ssRNA capture was most effective in defining sequences to the poly(A) junction. Using data sets from a spectrum of C. elegans strains and stages and the UCSC Genome Browser, we provide a series of tools, which facilitate rapid visualization and assignment of gene structures.
Interference effect between neutron direct and resonance capture reactions for neutron-rich nuclei
NASA Astrophysics Data System (ADS)
Minato, Futoshi; Fukui, Tokuro
2017-11-01
The interference between the compound (resonance) and direct processes in the neutron capture cross section is investigated. The compound process is calculated from resonance parameters and the direct process with the potential model. The interference effect is tested for the neutron-rich nuclei 82Ge and 134Sn, which are relevant to the r-process, and for the light nucleus 13C, which is a neutron poison in the s-process and produces the long-lived radioactive nucleus 14C (T1/2 = 5700 y). The interference effects in these nuclei are significant around resonances, and in the low-energy region when s-wave direct neutron capture is possible. Maxwellian averaged cross sections at kT = 30 and 300 keV are also calculated; the interference effect changes the Maxwellian averaged capture cross section considerably, depending on the resonance position.
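The Maxwellian averaging mentioned at the end is a standard integral over the stellar energy distribution, MACS = (2/√π)(kT)⁻² ∫ σ(E) E exp(−E/kT) dE. The sketch below applies it to a toy cross section that adds a 1/v direct component and one resonance incoherently; a full treatment of the interference discussed above would add the direct and resonant amplitudes coherently before squaring. All parameters are illustrative:

```python
import numpy as np
from scipy.integrate import quad

def macs(sigma, kT):
    """Maxwellian-averaged cross section at temperature kT (same energy
    units as sigma's argument):
    MACS = 2/sqrt(pi) * (kT)^-2 * int_0^inf sigma(E) E exp(-E/kT) dE."""
    integrand = lambda E: sigma(E) * E * np.exp(-E / kT)
    val, _ = quad(integrand, 0.0, 50.0 * kT, limit=200)
    return 2.0 / np.sqrt(np.pi) * val / kT**2

def sigma_toy(E, E_r=0.15, Gamma=0.02):
    """1/v direct component plus one Breit-Wigner-shaped resonance,
    added incoherently. Parameters are illustrative, not evaluated data."""
    direct = 0.1 / np.sqrt(E)
    resonance = 0.5 * (Gamma / 2)**2 / ((E - E_r)**2 + (Gamma / 2)**2)
    return direct + resonance

for kT in (0.030, 0.300):  # 30 and 300 keV, expressed in MeV
    print(f"kT = {kT*1e3:.0f} keV -> MACS ~ {macs(sigma_toy, kT):.3f} (arb. units)")
```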
Field Testing of Cryogenic Carbon Capture
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sayre, Aaron; Frankman, Dave; Baxter, Andrew
Sustainable Energy Solutions has been developing Cryogenic Carbon Capture™ (CCC) since 2008. In that time two processes have been developed, the External Cooling Loop and Compressed Flue Gas Cryogenic Carbon Capture processes (CCC ECL™ and CCC CFG™, respectively). The CCC ECL™ process has been scaled up to a 1 TPD CO2 system. In this process the flue gas is cooled by an external refrigerant loop. SES has tested CCC ECL™ on real flue gas slip streams from subbituminous coal, bituminous coal, biomass, natural gas, shredded tires, and municipal waste fuels at field sites that include utility power stations, heating plants, cement kilns, and pilot-scale research reactors. The CO2 concentrations from these tests ranged from 5 to 22% on a dry basis. CO2 capture ranged from 95-99+% during these tests. Several other condensable species were also captured, including NO2, SO2 and PMxx at 95+%. NO was also captured at a modest rate. The CCC CFG™ process has been scaled up to a 0.25 ton per day system. This system has been tested on real flue gas streams including subbituminous coal, bituminous coal and natural gas at field sites that include utility power stations, heating plants, and pilot-scale research reactors. CO2 concentrations for these tests ranged from 5 to 15% on a dry basis. CO2 capture ranged from 95-99+% during these tests. Several other condensable species were also captured, including NO2, SO2 and PMxx at 95+%. NO was also captured at 90+%. Hg capture was also verified, and the resulting effluent from CCC CFG™ was below a 1 ppt concentration. This paper will focus on discussion of the capabilities of CCC, the results of field testing, and the future steps surrounding the development of this technology.
Exploratory investigations of hypervelocity intact capture spectroscopy
NASA Technical Reports Server (NTRS)
Tsou, P.; Griffiths, D. J.
1993-01-01
The ability to capture hypervelocity projectiles intact opens a new technique for hypervelocity research. A determination of the reactions taking place between the projectile and the capture medium during the process of intact capture is extremely important to an understanding of the intact capture phenomenon, to improving the capture technique, and to developing a theory describing the phenomenon. The intact capture of hypervelocity projectiles by underdense media generates spectra characteristic of the material species of the projectile and capture medium involved. Initial exploratory results in real-time characterization of hypervelocity intact capture by spectroscopy include ultraviolet and visible spectra obtained by use of reflecting gratings, transmitting gratings, and prisms, and recorded by photographic and electronic means. Spectrometry proved to be a valuable real-time diagnostic tool for hypervelocity intact capture events, offering understanding of the interactions of the projectile and the capture medium during the initial period and providing information not obtainable by other characterizations. Preliminary results and analyses of spectra produced by the intact capture of hypervelocity aluminum spheres in polyethylene (PE), polystyrene (PS), and polyurethane (PU) foams are presented. Included are tentative emission species identifications, as well as gray body temperatures produced in the intact capture process.
Model based adaptive control of a continuous capture process for monoclonal antibodies production.
Steinebach, Fabian; Angarita, Monica; Karst, Daniel J; Müller-Späth, Thomas; Morbidelli, Massimo
2016-04-29
A two-column capture process for continuous processing of cell-culture supernatant is presented. Similar to other multicolumn processes, this process uses sequential countercurrent loading of the target compound in order to maximize resin utilization and productivity for a given product yield. The process was designed using a novel mechanistic model for affinity capture, which takes both specific adsorption and transport through the resin beads into account. Simulations as well as experimental results for the capture of an IgG antibody are discussed. The model was able to predict the process performance in terms of yield, productivity and capacity utilization. Compared to continuous capture with two columns operated batchwise in parallel, a 2.5-fold higher capacity utilization was obtained for the same productivity and yield. This results in an equal improvement in product concentration and reduction of buffer consumption. The developed model was used not only for the process design and optimization but also for its online control. In particular, the unit operating conditions are changed in order to maintain high product yield while optimizing the process performance in terms of capacity utilization and buffer consumption, also in the presence of changing upstream conditions and resin aging.
Bridging gaps in handoffs: a continuity of care based approach.
Abraham, Joanna; Kannampallil, Thomas G; Patel, Vimla L
2012-04-01
Handoff among healthcare providers has been recognized as a major source of medical errors. Most prior research has focused on the communication aspects of handoff, with limited emphasis on the overall handoff process, especially from a clinician workflow perspective. Such a workflow perspective, based on the continuity of care model, provides a framework for identifying and supporting an interconnected trajectory of care events affecting handoff communication. To this end, we propose a new methodology, referred to as the clinician-centered approach, that allows us to investigate and represent the entire clinician workflow prior to, during, and after handoff communication. This representation of clinician activities supports a comprehensive analysis of the interdependencies in the handoff process across the care continuum, as opposed to a single, discrete information-sharing activity. The clinician-centered approach is supported by multifaceted methods for data collection such as observations, shadowing of clinicians, audio recording of handoff communication, semi-structured interviews, and artifact identification and collection. The analysis followed a two-stage mixed inductive-deductive method. The iterative development of the clinician-centered approach was realized using a multifaceted study conducted in the Medical Intensive Care Unit (MICU) of an academic hospital. Using the clinician-centered approach, we (a) identify the nature, inherent characteristics, and interdependencies of the three phases of the handoff process and (b) develop a descriptive framework of handoff communication in critical care that captures the non-linear, recursive and interactive nature of collaboration and decision-making. The results reported in this paper serve as a "proof of concept" of our approach, emphasizing the importance of capturing a coordinated and uninterrupted succession of clinician information management and transfer activities in relation to patient care events.
Exploring nurses' reactions to a novel technology to support acute health care delivery.
Kent, Bridie; Redley, Bernice; Wickramasinghe, Nilmini; Nguyen, Lemai; Taylor, Nyree J; Moghimi, Hoda; Botti, Mari
2015-08-01
To explore nurses' reactions to novel technology for acute health care. Past failures of technology developers to deliver products that meet nurses' needs have led to resistance and reluctance in the technology adoption process. Thus, involving nurses in a collaborative process from early conceptualisation serves to inform design reflective of current clinical practice, facilitating the cementing of 'vision' and expectations of the technology. An exploratory descriptive design was used to capture nurses' immediate impressions. Four focus groups were held (52 nurses from medical and surgical wards at two hospitals in Australia; one private and one public). Nursing reactions towards the new technology illustrated a variance in barrier and enabler comments across multiple domains of the Theoretical Domains Framework. Most challenging for nurses were the perceived threat to their clinical skill and the potential capability of the novel technology to capture their clinical workflow. Enabling reactions included visions that this could help integrate care between departments, help management and support of nursing processes, and coordinate their patients' care between clinicians. Nurses' reactions differed across hospital sites, influenced by their experiences of using technology. For example, nurses at Site 1 reported wide variability in their distribution of barrier and enabling comments, and nurses at Site 2, where technology was prevalent, reported mostly positive responses. This early involvement offered nursing input and facilitated understanding of the potential capabilities of novel technology to support nursing work, particularly the characteristics seen as potentially beneficial (enabling technology) and those conflicting (barrier technology) with the delivery of both safe and effective patient care. Collaborative involvement of nurses from the early conceptualisation of technology development brings benefits that increase the likelihood of successful use of a tool intended to support the delivery of safe and efficient patient care.
Dececchi, T. Alex; Mabee, Paula M.; Blackburn, David C.
2016-01-01
Databases of organismal traits that aggregate information from one or multiple sources can be leveraged for large-scale analyses in biology. Yet the differences among these data streams and how well they capture trait diversity have never been explored. We present the first analysis of the differences between phenotypes captured in free text of descriptive publications (‘monographs’) and those used in phylogenetic analyses (‘matrices’). We focus our analysis on osteological phenotypes of the limbs of four extinct vertebrate taxa critical to our understanding of the fin-to-limb transition. We find that there is low overlap between the anatomical entities used in these two sources of phenotype data, indicating that phenotypes represented in matrices are not simply a subset of those found in monographic descriptions. Perhaps as expected, compared to characters found in matrices, phenotypes in monographs tend to emphasize descriptive and positional morphology, be somewhat more complex, and relate to fewer additional taxa. While based on a small set of focal taxa, these qualitative and quantitative data suggest that either source of phenotypes alone will result in incomplete knowledge of variation for a given taxon. As a broader community develops to use and expand databases characterizing organismal trait diversity, it is important to recognize the limitations of the data sources and develop strategies to more fully characterize variation both within species and across the tree of life. PMID:27191170
Geoscience Australia Publishes Sample Descriptions using W3C standards
NASA Astrophysics Data System (ADS)
Car, N. J.; Cox, S. J. D.; Bastrakova, I.; Wyborn, L. A.
2017-12-01
The recent revision of the W3C Semantic Sensor Network Ontology (SSN) has focused on three key concerns: (1) extending the scope of the ontology to include sampling and actuation as well as observation and sensing; (2) modularizing the ontology into a simple core with few classes and properties and little formal axiomatization, supplemented by additional modules that formalize the semantics and extend the scope; and (3) aligning it with several existing applications and upper ontologies. These enhancements mean that SSN can now be used as the basis for publishing descriptions of geologic samples as Linked Data. Geoscience Australia maintains a database of about three million samples, collected over 50 years through ocean-core, terrestrial rock, and hydrochemistry borehole projects; almost all of them are held in the special-purpose GA samples repository. Access to descriptions of these samples as Linked Data has recently been enabled. The sample descriptions can be viewed in various machine-readable formalizations, including IGSN (XML & RDF), Dublin Core (XML & RDF) and SSN (RDF), as well as web landing pages for people. Of particular importance is the support for encoding relationships between samples, and between samples and the surveys, boreholes, and traverses to which they are related, as well as between samples processed for analytical purposes and their parents and siblings, and back to the original field samples. The SSN extension for Sample Relationships provides an extensible, semantically rich mechanism to capture any relationship necessary to explain the provenance of observation results obtained from samples. Sample citation is facilitated through the use of URI-based persistent identifiers which resolve to samples' landing pages. The sample system also allows PROV pingbacks to be received for samples when users record provenance for their actions.
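To make the encoding concrete, here is a minimal sketch of a sample-to-sample relationship in SOSA/SSN terms using rdflib. The identifiers and base URI are placeholders rather than Geoscience Australia's actual scheme; sosa:isSampleOf is the core sampling relation, with richer derivation relations coming from the Sample Relationships extension module mentioned above:

```python
from rdflib import Graph, Namespace, RDF, URIRef

SOSA = Namespace("http://www.w3.org/ns/sosa/")
GA = Namespace("https://example.org/ga/sample/")  # placeholder base URI

g = Graph()
g.bind("sosa", SOSA)

parent = URIRef(GA["AU1234"])    # hypothetical field sample identifier
split = URIRef(GA["AU1234-01"])  # hypothetical analytical split of it

g.add((parent, RDF.type, SOSA.Sample))
g.add((split, RDF.type, SOSA.Sample))
# The analytical split was taken from the parent field sample; since a
# sosa:Sample is itself a feature of interest, sample-of-sample chains
# back to the original field sample are expressible.
g.add((split, SOSA.isSampleOf, parent))

print(g.serialize(format="turtle"))
```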
Spectral method for a kinetic swarming model
Gamba, Irene M.; Haack, Jeffrey R.; Motsch, Sebastien
2015-04-28
Here we present the first numerical method for a kinetic description of the Vicsek swarming model. The kinetic model poses a unique challenge, as there is a distribution dependent collision invariant to satisfy when computing the interaction term. We use a spectral representation linked with a discrete constrained optimization to compute these interactions. To test the numerical scheme we investigate the kinetic model at different scales and compare the solution with the microscopic and macroscopic descriptions of the Vicsek model. Lastly, we observe that the kinetic model captures key features such as vortex formation and traveling waves.
Multiple-predators-based capture process on complex networks
NASA Astrophysics Data System (ADS)
Ramiz Sharafat, Rajput; Pu, Cunlai; Li, Jie; Chen, Rongbin; Xu, Zhongqi
2017-03-01
The predator/prey (capture) problem is a prototype of many network-related applications. We study the capture process on complex networks by considering multiple predators from multiple sources. In our model, some lions start from multiple sources simultaneously to capture the lamb by biased random walks, which are controlled with a free parameter α. We derive the distribution of the lamb's lifetime and the expected lifetime ⟨T⟩. Through simulation, we find that the expected lifetime drops substantially with an increasing number of lions. We also study how the underlying topological structure affects the capture process, and find that locating on small-degree nodes is better than locating on large-degree nodes for prolonging the lifetime of the lamb. Moreover, dense or homogeneous network structures work against the survival of the lamb.
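A simulation of this capture process is short to write down. In the sketch below the lamb is held stationary and each lion steps to a neighbor with probability proportional to degree^α, which reduces to an unbiased random walk at α = 0; these simplifications and all parameter values are ours, not the paper's exact protocol:

```python
import random
import networkx as nx

def biased_step(G, node, alpha):
    """Move to a neighbor chosen with probability ~ degree**alpha."""
    nbrs = list(G.neighbors(node))
    weights = [G.degree(v) ** alpha for v in nbrs]
    return random.choices(nbrs, weights=weights, k=1)[0]

def capture_time(G, n_lions, alpha, max_steps=10**6):
    """Steps until any lion reaches the lamb (lamb kept stationary here)."""
    nodes = list(G.nodes)
    lamb = random.choice(nodes)
    lions = [random.choice(nodes) for _ in range(n_lions)]
    for t in range(1, max_steps):
        lions = [biased_step(G, x, alpha) for x in lions]
        if lamb in lions:
            return t
    return max_steps

G = nx.barabasi_albert_graph(1000, 3)
for k in (1, 2, 4, 8):
    mean_T = sum(capture_time(G, k, alpha=0.0) for _ in range(50)) / 50
    print(f"{k} lions: mean lamb lifetime ~ {mean_T:.0f} steps")
```

Consistent with the abstract, the mean lifetime printed by this sketch drops quickly as the number of lions grows.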
Firing rate dynamics in recurrent spiking neural networks with intrinsic and network heterogeneity.
Ly, Cheng
2015-12-01
Heterogeneity of neural attributes has recently gained a lot of attention and is increasingly recognized as a crucial feature of neural processing. Despite its importance, this physiological feature has traditionally been neglected in theoretical studies of cortical neural networks. Thus, much is still unknown about the consequences of cellular and circuit heterogeneity in spiking neural networks. In particular, the combination of network or synaptic heterogeneity with intrinsic heterogeneity has yet to be considered systematically, despite the fact that both are known to exist and likely have significant roles in neural network dynamics. In a canonical recurrent spiking neural network model, we study how these two forms of heterogeneity lead to different distributions of excitatory firing rates. To analytically characterize how these types of heterogeneity affect the network, we employ a dimension reduction method that relies on a combination of Monte Carlo simulations and probability density function equations. We find that the relationship between intrinsic and network heterogeneity has a strong effect on the overall level of heterogeneity of the firing rates. Specifically, this relationship can lead to amplification or attenuation of firing rate heterogeneity, and these effects depend on whether the recurrent network is firing asynchronously or rhythmically. These observations are captured with the aforementioned reduction method, and simpler analytic descriptions based on this dimension reduction are developed. The final analytic descriptions provide compact and descriptive formulas for how the relationship between intrinsic and network heterogeneity determines the firing rate heterogeneity dynamics in various settings.
Learning about Ecological Systems by Constructing Qualitative Models with DynaLearn
ERIC Educational Resources Information Center
Leiba, Moshe; Zuzovsky, Ruth; Mioduser, David; Benayahu, Yehuda; Nachmias, Rafi
2012-01-01
A qualitative model of a system is an abstraction that captures ordinal knowledge and predicts the set of qualitatively possible behaviours of the system, given a qualitative description of its structure and initial state. This paper examines an innovative approach to science education using an interactive learning environment that supports…
Captured by Details: Sense-Making, Language and Communication in Autism
ERIC Educational Resources Information Center
Noens, Ilse L. J.; van Berckelaer-Onnes, Ina A.
2005-01-01
The communication of people with autism spectrum disorder (ASD) is characterized by a qualitative impairment in verbal and non-verbal communication. In past decades a growing body of descriptive studies has appeared on language and communication problems in ASD. Reviews suggest that the development of formal and semantic aspects is relatively…
A Continuum Description of Nonlinear Elasticity, Slip and Twinning, With Application to Sapphire
2009-03-01
Twinning is modelled via the isochoric term FI, and residual volume changes associated with defects are captured by the Jacobian determinant J.
ERIC Educational Resources Information Center
Koltz, Rebecca L.; Feit, Stephen S.
2012-01-01
The experiences of live supervision for three, master's level, pre-practicum counseling students were explored using a phenomenological methodology. Using semi-structured interviews, this study resulted in a thick description of the experience of live supervision capturing participants' thoughts, emotions, and behaviors. Data revealed that live…
Continuous versus Arrested Spreading of Biofilms at Solid-Gas Interfaces: The Role of Surface Forces
NASA Astrophysics Data System (ADS)
Trinschek, Sarah; John, Karin; Lecuyer, Sigolène; Thiele, Uwe
2017-08-01
We introduce and analyze a model for osmotically spreading bacterial colonies at solid-air interfaces that includes wetting phenomena, i.e., surface forces. The model is based on a hydrodynamic description for liquid suspensions which is supplemented by bioactive processes. We show that surface forces determine whether a biofilm can expand laterally over a substrate and provide experimental evidence for the existence of a transition between continuous and arrested spreading for Bacillus subtilis biofilms. In the case of arrested spreading, the lateral expansion of the biofilm is confined, although the colony remains biologically active. However, a small reduction in the surface tension of the biofilm is sufficient to induce spreading. The incorporation of surface forces into our hydrodynamic model allows us to capture this transition in biofilm spreading behavior.
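The hydrodynamic description referred to is of thin-film (lubrication) type. A generic form of such a model, with the surface forces entering through a disjoining pressure and the bioactivity through a source term, is (our notation, not necessarily the paper's exact equations)

\[
\partial_t h \;=\; \nabla\cdot\!\left[\frac{h^{3}}{3\eta}\,\nabla\!\left(-\gamma\,\nabla^{2}h \;-\; \Pi(h)\right)\right] \;+\; G(h),
\]

where h is the film height, η the viscosity, γ the surface tension, Π(h) a disjoining pressure encoding wettability, and G(h) an osmotically driven growth term. The competition between the wetting terms and the growth term is what selects continuous versus arrested spreading.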
NASA Astrophysics Data System (ADS)
Young, Eric D.
The analysis of
s-wave scattering length of a Gaussian potential
NASA Astrophysics Data System (ADS)
Jeszenszki, Peter; Cherny, Alexander Yu.; Brand, Joachim
2018-04-01
We provide accurate expressions for the s-wave scattering length for a Gaussian potential well in one, two, and three spatial dimensions. The Gaussian potential is widely used as a pseudopotential in the theoretical description of ultracold-atomic gases, where the s-wave scattering length is a physically relevant parameter. We first describe a numerical procedure to compute the value of the s-wave scattering length from the parameters of the Gaussian, but find that its accuracy is limited in the vicinity of singularities that result from the formation of new bound states. We then derive simple analytical expressions that capture the correct asymptotic behavior of the s-wave scattering length near the bound states. Expressions that are increasingly accurate in wide parameter regimes are found by a hierarchy of approximations that capture an increasing number of bound states. The small number of numerical coefficients that enter these expressions is determined from accurate numerical calculations. The approximate formulas combine the advantages of the numerical and approximate expressions, yielding an accurate and simple description from the weakly to the strongly interacting limit.
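In three dimensions the numerical procedure reduces to integrating the zero-energy s-wave radial equation and reading the scattering length off the linear asymptote u(r) ∝ (r - a). A sketch in units ħ = m = 1, writing the Gaussian as V(r) = -V0 exp(-r²/σ²) (one common parametrization; the paper's convention may differ):

```python
import numpy as np
from scipy.integrate import solve_ivp

def scattering_length(V0, sigma, r_max=30.0):
    """3-D s-wave scattering length of V(r) = -V0*exp(-(r/sigma)^2) in
    units hbar = m = 1. Integrates the zero-energy radial equation
    u'' = 2 V(r) u with u(0) = 0, then uses the asymptote u ~ C (r - a),
    so a = r - u/u' at large r."""
    def rhs(r, y):
        u, du = y
        return [du, 2.0 * (-V0 * np.exp(-(r / sigma) ** 2)) * u]
    sol = solve_ivp(rhs, [1e-8, r_max], [0.0, 1.0],
                    rtol=1e-10, atol=1e-12, max_step=0.1)
    u, du = sol.y[0, -1], sol.y[1, -1]
    return sol.t[-1] - u / du

for V0 in (0.1, 1.0, 2.0):  # deepening well
    print(f"V0 = {V0}: a = {scattering_length(V0, sigma=1.0):+.4f}")
```

Sweeping V0 shows the sign change and divergence of a as the well becomes deep enough to support a new bound state, which is exactly the regime where the paper's analytical expressions are needed.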
Kim, Huiyong; Hwang, Sung June; Lee, Kwang Soon
2015-02-03
Among various CO2 capture processes, the aqueous amine-based absorption process is considered the most promising for near-term deployment. However, the performance evaluation of newly developed solvents still requires complex and time-consuming procedures, such as pilot plant tests or the development of a rigorous simulator. The absence of an accurate and simple method for calculating energy performance at an early stage of process development has made the development of economically feasible CO2 capture processes slower and more expensive. In this paper, a novel but simple method to reliably calculate the regeneration energy in a standard amine-based carbon capture process is proposed. Careful examination of stripper behavior and exploitation of energy balance equations around the stripper allow the regeneration energy to be calculated using only vapor-liquid equilibrium and caloric data. The reliability of the proposed method was confirmed by comparison with rigorous simulations for two well-known solvents, monoethanolamine (MEA) and piperazine (PZ). The proposed method can predict the regeneration energy at various operating conditions with greater simplicity, greater speed, and higher accuracy than methods proposed in previous studies. This enables faster and more precise screening of solvents and faster optimization of process variables, and can ultimately accelerate the development of economically deployable CO2 capture processes.
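Although the paper's method works directly from vapor-liquid equilibrium and caloric data, the quantity it targets is conventionally split into three terms: desorption enthalpy, vaporization of stripping steam, and sensible heating of the solvent. A sketch of that textbook split, with illustrative MEA-like inputs that are not taken from the paper:

```python
def regeneration_energy(dH_abs, p_ratio, cp, dT, solvent_per_co2):
    """Three-term split of reboiler duty per kg CO2, returned in MJ/kg:
    desorption enthalpy + stripping-steam vaporization + sensible heat.
    All arguments are solvent- and condition-dependent; the values used
    below are illustrative, not the paper's VLE-derived numbers."""
    M_CO2 = 44.0e-3            # kg/mol
    dHvap_H2O = 40.7e3         # J/mol, water near stripper conditions
    q_des = dH_abs / M_CO2                 # CO2 desorption, J/kg CO2
    q_vap = dHvap_H2O * p_ratio / M_CO2    # steam lost per mol CO2 stripped
    q_sens = cp * dT * solvent_per_co2     # heating lean solvent to reboiler T
    return (q_des + q_vap + q_sens) / 1e6

# Rough MEA-like inputs (hypothetical): 85 kJ/mol absorption enthalpy,
# 0.5 mol H2O vapor per mol CO2 at the stripper top, 3.8 kJ/(kg K) solvent
# heat capacity, 10 K temperature approach, 20 kg solvent per kg CO2.
print(f"{regeneration_energy(85e3, 0.5, 3.8e3, 10.0, 20.0):.2f} MJ/kg CO2")
```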
Asymmetric capture of Dirac dark matter by the Sun
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blennow, Mattias; Clementz, Stefan
2015-08-18
Current problems with the solar model may be alleviated if a significant amount of dark matter from the galactic halo is captured in the Sun. We discuss the capture process in the case where the dark matter is a Dirac fermion and the background halo consists of equal amounts of dark matter and anti-dark matter. By considering the case where dark matter and anti-dark matter have different cross sections on solar nuclei, as well as the case where the capture process is treated as a Poisson process, we find that a significant asymmetry between the captured dark particles and anti-particles is possible even for an annihilation cross section in the range expected for thermal relic dark matter. Since the captured numbers of particles are competitive with those in asymmetric dark matter models over a large range of parameter space, one may expect solar physics to be altered by the capture of Dirac dark matter. It is thus possible that solutions to the solar composition problem may be sought in these types of models.
Dynamics of Postcombustion CO2 Capture Plants: Modeling, Validation, and Case Study
2017-01-01
The capture of CO2 from power plant flue gases provides an opportunity to mitigate emissions that are harmful to the global climate. While the process of CO2 capture using an aqueous amine solution is well known from experience in other technical sectors (e.g., acid gas removal in the gas processing industry), its operation in combination with a power plant still needs investigation, because the interaction with power plants that are increasingly operated dynamically poses control challenges. This article presents the dynamic modeling of CO2 capture plants followed by a detailed validation using transient measurements recorded from the pilot plant operated at the Maasvlakte power station in the Netherlands. The model predictions are in good agreement with the experimental data on transient changes of the main process variables such as flow rate, CO2 concentrations, temperatures, and solvent loading. The validated model was used to study the effects of fast power plant transients on capture plant operation. A relevant result of this work is that an integrated CO2 capture plant might enable more dynamic operation of retrofitted fossil fuel power plants, because the large amount of steam needed by the capture process can be diverted rapidly to and from the power plant. PMID:28413256
NASA Astrophysics Data System (ADS)
Shilyaev, M. I.; Khromova, E. M.; Grigoriev, A. V.; Tumashova, A. V.
2011-09-01
A physical-mathematical model of the heat and mass exchange process and of the condensation capture of sub-micron dust particles on droplets of dispersed liquid in a spray scrubber is proposed and analysed. Satisfactory agreement between computed results and experimental data on soot capture from cracking gases is obtained.
Mobile capture of remote points of interest using line of sight modelling
NASA Astrophysics Data System (ADS)
Meek, Sam; Priestnall, Gary; Sharples, Mike; Goulding, James
2013-03-01
Recording points of interest using GPS whilst working in the field is an established technique in geographical fieldwork, where the user's current position is used as the spatial reference to be captured; this is known as geo-tagging. We outline the development and evaluation of a smartphone application called Zapp that enables geo-tagging of any distant point on the visible landscape. The ability of users to log or retrieve information relating to what they can see, rather than where they are standing, allows them to record observations of points in the broader landscape scene, or to access descriptions of landscape features from any viewpoint. The application uses the compass orientation and tilt of the phone to provide data for a line of sight algorithm that intersects with a Digital Surface Model stored on the mobile device. We describe the development process and design decisions for Zapp, present the results of a controlled study of the accuracy of the application, and report on the use of Zapp in a student field exercise. The studies indicate the feasibility of the approach, but also show how the appropriate use of such techniques will be constrained by current levels of precision in mobile sensor technology. The broader implications for interactive querying of the distant landscape and for remote data logging are discussed.
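A minimal sketch of the line-of-sight idea described above, assuming a regular-grid DSM and simple ray marching (the grid layout, step size, and all names below are illustrative, not Zapp's implementation):

```python
# Hedged sketch: march a ray from the observer along the phone's compass
# bearing and tilt until it drops below the Digital Surface Model.
import numpy as np

def line_of_sight_target(dsm, cell, x0, y0, bearing, tilt,
                         eye_height=1.6, step=1.0, max_range=5000.0):
    """dsm: 2D array of surface heights (m); cell: grid spacing (m);
    bearing/tilt in radians (north = +y; negative tilt looks down).
    Returns (x, y) of the first ground intersection, or None."""
    z = dsm[int(y0 / cell), int(x0 / cell)] + eye_height
    dx, dy = np.sin(bearing), np.cos(bearing)
    dz = np.tan(tilt)                       # vertical drop per metre travelled
    x, y, r = x0, y0, 0.0
    while r < max_range:
        x, y, z, r = x + dx * step, y + dy * step, z + dz * step, r + step
        i, j = int(y / cell), int(x / cell)
        if not (0 <= i < dsm.shape[0] and 0 <= j < dsm.shape[1]):
            return None                     # ray left the DSM coverage
        if z <= dsm[i, j]:                  # ray passed below the surface
            return (x, y)
    return None
```

The paper's accuracy findings make sense in this framing: small errors in `bearing` and `tilt` from the phone sensors sweep the intersection point across the terrain, so precision degrades with distance to the target.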
Adverse Outcome Pathway (AOP) Network Development for ...
Adverse outcome pathways (AOPs) are descriptive biological sequences that start from a molecular initiating event (MIE) and end with an adverse health outcome. AOPs provide biological context for high throughput chemical testing and further prioritize environmental health risk research. According to the Organization for Economic Co-operation and Development guidelines, AOPs are pathways with one MIE anchored to an adverse outcome (AO) by key events (KEs) and key event relationships (KERs). However, this approach does not always capture the cumulative impacts of multiple MIEs on the AO. For example, hepatic lipid flux due to chemical-induced toxicity initiates from multiple ligand-activated receptors and signaling pathways that cascade across biology to converge upon a common fatty liver (FL, also known as steatosis) outcome. To capture this complexity, a top-down strategy was used to develop a FL AOP network (AOPnet). Literature was queried based on the terms steatosis, fatty liver, cirrhosis, and hepatocellular carcinoma. Search results were analyzed for physiological and pathophysiological organ level, cellular and molecular processes, as well as pathway intermediates, to identify potential KEs and MIEs that are key for hepatic lipid metabolism, maintenance, and dysregulation. The analysis identified four apical KE nodes (hepatic fatty acid uptake, de novo fatty acid and lipid synthesis, fatty acid oxidation, and lipid efflux) juxtaposed to the FL AO. The apic
OPSO - The OpenGL based Field Acquisition and Telescope Guiding System
NASA Astrophysics Data System (ADS)
Škoda, P.; Fuchs, J.; Honsa, J.
2006-07-01
We present OPSO, a modular pointing and auto-guiding system for the coudé spectrograph of the Ondřejov observatory 2m telescope. The current field and slit viewing CCD cameras with image intensifiers give only standard TV video output. To allow the acquisition and guiding of very faint targets, we have designed an image enhancing system working in real time on TV frames grabbed by a BT878-based video capture card. Its basic capabilities include sliding averaging of hundreds of frames with bad pixel masking and removal of outliers, display of the median of a set of frames, quick zooming, contrast and brightness adjustment, plotting of horizontal and vertical cross cuts of the seeing disk within a given intensity range, and many more. From the programmer's point of view, the system consists of three tasks running in parallel on a Linux PC. One C task controls the video capturing over the Video for Linux (v4l2) interface and feeds the frames into a large block of shared memory, where the core image processing is done by another C program calling the OpenGL library. The GUI, however, is dynamically built in Python from an XML description of widgets prepared in Glade. All tasks exchange information by IPC calls using shared memory segments.
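The frame-enhancement step described above can be illustrated in a few lines of numpy (an illustrative re-expression, not the OPSO C/OpenGL code; array shapes and the 3-sigma outlier rule are assumptions):

```python
# Hedged sketch of sliding frame averaging with bad-pixel masking and
# outlier rejection, as described for OPSO.
import numpy as np

def enhanced_frame(frames, bad_pixel_mask, n_sigma=3.0):
    """frames: (N, H, W) stack of grabbed TV frames;
    bad_pixel_mask: (H, W) boolean, True where a pixel is known bad."""
    mask = np.broadcast_to(bad_pixel_mask, frames.shape)
    stack = np.ma.masked_array(frames, mask=mask)
    mu = stack.mean(axis=0)
    sigma = stack.std(axis=0)
    # reject transient outliers (noise spikes) before the sliding average
    outliers = np.ma.filled(np.abs(stack - mu) > n_sigma * sigma, False)
    stack = np.ma.masked_array(frames, mask=mask | outliers)
    return stack.mean(axis=0).filled(0.0)   # bad pixels reported as 0
```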
Image processing system design for microcantilever-based optical readout infrared arrays
NASA Astrophysics Data System (ADS)
Tong, Qiang; Dong, Liquan; Zhao, Yuejin; Gong, Cheng; Liu, Xiaohua; Yu, Xiaomei; Yang, Lei; Liu, Weiyu
2012-12-01
Compared with traditional infrared imaging technology, the new type of optical-readout uncooled infrared imaging technology based on MEMS has many advantages, such as low cost, small size, and simple fabrication. In addition, theory predicts high thermal detection sensitivity for this technology, so it has very broad application prospects in the field of high-performance infrared detection. This paper focuses on an image capturing and processing system for this new type of optical-readout uncooled infrared imaging technology based on MEMS. The system consists of software and hardware. We build the core image processing hardware platform on TI's high-performance TMS320DM642 DSP and design the image capturing board around the MT9P031, Micron's high-frame-rate, low-power-consumption CMOS sensor. Finally, we use Intel's LXT971A network transceiver to design the network output board. The software system is built on the real-time operating system DSP/BIOS. We design the video capture driver based on TI's class/mini-driver model and the network output program based on the NDK kit, for image capturing, processing, and transmission. Experiments show that the system achieves high capture resolution and fast processing speed, with network transmission speeds of up to 100 Mbps.
Pilot testing of a membrane system for postcombustion CO2 capture
DOE Office of Scientific and Technical Information (OSTI.GOV)
Merkel, Tim; Kniep, Jay; Wei, Xiaotong
2015-09-30
This final report summarizes work conducted for the U.S. Department of Energy, National Energy Technology Laboratory (DOE) to scale up an efficient post-combustion CO2 capture membrane process to the small pilot test stage (award number DE-FE0005795). The primary goal of this research program was to design, fabricate, and operate a membrane CO2 capture system to treat coal-derived flue gas containing 20 tonnes CO2/day (20 TPD). Membrane Technology and Research (MTR) conducted this project in collaboration with Babcock and Wilcox (B&W), the Electric Power Research Institute (EPRI), WorleyParsons (WP), the Illinois Sustainable Technology Center (ISTC), Enerkem (EK), and the National Carbon Capture Center (NCCC). In addition to the small pilot design, build, and slipstream testing at NCCC, other project efforts included laboratory membrane and module development at MTR, validation field testing on a 1 TPD membrane system at NCCC, boiler modeling and testing at B&W, a techno-economic analysis (TEA) by EPRI/WP, a case study of the membrane technology applied to a ~20 MWe power plant by ISTC, and an industrial CO2 capture test at an Enerkem waste-to-biofuel facility. The 20 TPD small pilot membrane system built in this project successfully completed over 1,000 hours of operation treating flue gas at NCCC. The Polaris™ membranes used on this system demonstrated stable performance, and when combined with over 10,000 hours of operation at NCCC on a 1 TPD system, the risk associated with uncertainty in the durability of postcombustion capture membranes has been greatly reduced. Moreover, next-generation Polaris membranes with higher performance and lower cost were validation tested on the 1 TPD system. The 20 TPD system also demonstrated successful operation of a new low-pressure-drop sweep module that will reduce parasitic energy losses at full scale by as much as 10 MWe. In modeling and pilot boiler testing, B&W confirmed the viability of CO2 recycle to the boiler as envisioned in the MTR process design. The impact of this CO2 recycle on boiler efficiency was quantified and incorporated into a TEA of the membrane capture process applied to a full-scale power plant. As with previous studies, the TEA showed the membrane process to be lower cost than the conventional solvent capture process even at 90% CO2 capture. A sensitivity study indicates that the membrane capture cost decreases significantly if the 90% capture requirement is relaxed. Depending on the process design, a minimum capture cost is achieved at 30-60% capture, values that would meet proposed CO2 emission regulations for coal-fired power plants. In summary, this project has successfully advanced the MTR membrane capture process through small pilot testing (technology readiness level 6). The technology is ready for future scale-up to the 10 MWe size.
Bogolon-mediated electron capture by impurities in hybrid Bose-Fermi systems
NASA Astrophysics Data System (ADS)
Boev, M. V.; Kovalev, V. M.; Savenko, I. G.
2018-04-01
We investigate the processes of electron capture by a Coulomb impurity center residing in a hybrid system consisting of spatially separated two-dimensional layers of electron and Bose-condensed dipolar exciton gases coupled via Coulomb forces. We calculate the probability of electron capture accompanied by the emission of a single Bogoliubov excitation (bogolon), similar to regular phonon-mediated scattering in solids. Furthermore, we study electron capture mediated by the emission of a pair of bogolons in a single capture event and show that these processes not only must be treated in the same order of perturbation theory, but also give a larger contribution than single-bogolon-mediated capture, in contrast with regular phonon scattering.
Origin of the main r-process elements
NASA Astrophysics Data System (ADS)
Otsuki, K.; Truran, J.; Wiescher, M.; Gorres, J.; Mathews, G.; Frekers, D.; Mengoni, A.; Bartlett, A.; Tostevin, J.
2006-07-01
The r-process is thought to be a primary process which assembles heavy nuclei from a photo-dissociated nucleon gas. Hence, the reaction flow through light elements can be important as a constraint on the conditions for the r-process. We have studied the impact of di-neutron capture and the neutron capture of light (Z<10) elements on r-process nucleosynthesis in three different environments: neutrino-driven winds in Type II supernovae; the prompt explosion of low-mass supernovae; and neutron star mergers. Although the effect of di-neutron capture is not significant for the neutrino-driven wind model or low-mass supernovae, it becomes significant in the neutron-star merger model. The neutron capture of light elements, which has been studied extensively for neutrino-driven wind models, also impacts the other two models. We show that it may be possible to identify the astrophysical site of the main r-process if the nuclear physics uncertainties in current r-process calculations can be reduced.
Optical correlators for automated rendezvous and capture
NASA Technical Reports Server (NTRS)
Juday, Richard D.
1991-01-01
The paper begins with a description of optical correlation. In this process, the propagation physics of coherent light is used to process images and extract information. The processed image is operated on as an area, rather than as a collection of points. An essentially instantaneous convolution is performed on that image to provide the sensory data. In this process, an image is sensed and encoded onto a coherent wavefront, and the propagation is arranged to create a bright spot where the image matches a model of the desired object. The brightness of the spot indicates the degree of resemblance of the viewed image to the model, and the location of the bright spot provides pointing information. The process can be utilized for AR&C to identify objects among known reference types, estimate an object's location and orientation, and interact with the control system. System characteristics (speed, robustness, accuracy, small form factor) are adequate to meet most requirements. The correlator exploits the fact that bosons and fermions pass through each other. Since the image source is input as an electronic data set, conventional imagers can be used. In systems where the image is input directly, the correlating element must be at the sensing location.
2017-01-01
Several reactions, known from other amine systems for CO2 capture, have been proposed for Lewatit VP OC 1065. The aim of this molecular modeling study is to elucidate the CO2 capture process: the physisorption process prior to CO2 capture, and the reactions themselves. Molecular modeling shows that the resin has a structure with benzylamine groups on alternating positions in close vicinity of each other. Based on this structure, the preferred adsorption mode of CO2 and H2O was established. Next, using standard Density Functional Theory, two catalytic reactions responsible for the actual CO2 capture were identified: direct amine-catalyzed and amine-H2O-catalyzed formation of carbamic acid. The latter is a new type of catalysis. Other reactions are unlikely. Quantitative verification of the molecular modeling results against known experimental CO2 adsorption isotherms, applying a dual-site Langmuir adsorption isotherm model, further supports the results of this molecular modeling study. PMID:29142339
Lloyd, Jeffrey T.; Clayton, John D.; Austin, Ryan A.; ...
2015-07-10
Background: The shock response of metallic single crystals can be captured using a micro-mechanical description of the thermoelastic-viscoplastic material response; however, using such a description within the context of traditional numerical methods may introduce physical artifacts. Advantages and disadvantages of complex material descriptions, in particular the viscoplastic response, must be framed within the approximations introduced by numerical methods. Methods: Three methods of modeling the shock response of metallic single crystals are summarized: finite difference simulations, steady wave simulations, and algebraic solutions of the Rankine-Hugoniot jump conditions. For the former two numerical techniques, a dislocation density based framework describes the rate- and temperature-dependent shear strength on each slip system. For the latter analytical technique, a simple (two-parameter) rate- and temperature-independent linear hardening description is necessarily invoked to enable simultaneous solution of the governing equations. For all models, the same nonlinear thermoelastic energy potential incorporating elastic constants up to order 3 is applied. Results: Solutions are compared for plate impact of highly symmetric orientations (all three methods) and low symmetry orientations (numerical methods only) of aluminum single crystals shocked to 5 GPa (weak shock regime) and 25 GPa (overdriven regime). Conclusions: For weak shocks, results of the two numerical methods are very similar, regardless of crystallographic orientation. For strong shocks, artificial viscosity affects the finite difference solution, and effects of transverse waves for the lower symmetry orientations not captured by the steady wave method become important. The analytical solution, which can only be applied to highly symmetric orientations, provides reasonable accuracy with regard to prediction of most variables in the final shocked state but, by construction, does not provide insight into the shock structure afforded by the numerical methods.
The Metadata Coverage Index (MCI): A standardized metric for quantifying database metadata richness.
Liolios, Konstantinos; Schriml, Lynn; Hirschman, Lynette; Pagani, Ioanna; Nosrat, Bahador; Sterk, Peter; White, Owen; Rocca-Serra, Philippe; Sansone, Susanna-Assunta; Taylor, Chris; Kyrpides, Nikos C; Field, Dawn
2012-07-30
Variability in the extent of the descriptions of data ('metadata') held in public repositories forces users to assess the quality of records individually, which rapidly becomes impractical. Scoring records on the richness of their description provides a simple, objective proxy measure for quality that enables filtering to support downstream analysis. Pivotally, such descriptions should spur on improvements. Here, we introduce such a measure - the 'Metadata Coverage Index' (MCI): the percentage of available fields actually filled in a record or description. MCI scores can be calculated across a database, for individual records, or for their component parts (e.g., fields of interest). There are many potential uses for this simple metric: for example, to filter, rank, or search for records; to assess the metadata availability of an ad hoc collection; to determine the frequency with which fields in a particular record type are filled, especially with respect to standards compliance; to assess the utility of specific tools and resources, and of data capture practice more generally; to prioritize records for further curation; to serve as performance metrics of funded projects; or to quantify the value added by curation. Here we demonstrate the utility of MCI scores using metadata from the Genomes Online Database (GOLD), including records compliant with the 'Minimum Information about a Genome Sequence' (MIGS) standard developed by the Genomic Standards Consortium. We discuss challenges and address further applications of MCI scores: to show improvements in annotation quality over time, to inform the work of standards bodies and repository providers on the usability and popularity of their products, and to assess and credit the work of curators. Such an index provides a step towards putting metadata capture practices, and in the future standards compliance, into a quantitative and objective framework.
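Because the MCI is defined simply as the percentage of available fields actually filled, a minimal sketch is nearly a one-liner (the dict-based record representation and field names below are assumptions, not the GOLD schema):

```python
# Minimal sketch of the Metadata Coverage Index as defined above.
def metadata_coverage_index(record, available_fields):
    """record: dict mapping field name -> value; None/empty means unfilled."""
    filled = sum(1 for f in available_fields
                 if record.get(f) not in (None, '', [], {}))
    return 100.0 * filled / len(available_fields)

# hypothetical record with 2 of 4 available fields filled -> MCI = 50.0
gold_record = {'habitat': 'soil', 'ph': None, 'depth': '10 cm'}
print(metadata_coverage_index(gold_record, ['habitat', 'ph', 'depth', 'temp']))
```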
A process for capturing CO2 from the atmosphere
Keith, David W.; Holmes, Geoffrey; St. Angelo, David; ...
2018-06-07
Here, we describe a process for capturing CO2 from the atmosphere in an industrial plant. The design captures ~1 Mt-CO2/year in a continuous process using an aqueous KOH sorbent coupled to a calcium caustic recovery loop. We describe the design rationale, summarize performance of the major unit operations, and provide a capital cost breakdown developed with an independent consulting engineering firm. We report results from a pilot plant which provides data on performance of the major unit operations. We summarize the energy and material balance computed using an Aspen process simulation. When CO2 is delivered at 15 MPa, the design requires either 8.81 GJ of natural gas, or 5.25 GJ of gas and 366 kWh of electricity, per ton of CO2 captured. Depending on financial assumptions, energy costs, and the specific choice of inputs and outputs, the levelized cost per ton of CO2 captured from the atmosphere ranges from 94 to 232 $/t-CO2.
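As a quick unit-conversion check on the reported figures (only the kWh-to-GJ conversion factor is added; the process numbers are from the abstract):

```python
# Back-of-envelope comparison of the two reported energy options.
GJ_PER_KWH = 0.0036

gas_only = 8.81                              # GJ natural gas per t-CO2
gas_plus_power = 5.25 + 366 * GJ_PER_KWH     # GJ gas + purchased electricity
print(gas_plus_power)  # ~6.57 GJ/t-CO2 of delivered energy in the hybrid case
```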
Rabattu, Pierre-Yves; Massé, Benoit; Ulliana, Federico; Rousset, Marie-Christine; Rohmer, Damien; Léon, Jean-Claude; Palombi, Olivier
2015-01-01
Embryology is a complex morphologic discipline involving a set of entangled mechanisms that are sometimes difficult to understand and to visualize. Recent computer-based techniques, ranging from geometrical to physically based modeling, are used to assist the visualization and simulation of virtual humans in numerous domains such as surgical simulation and learning. On the other hand, the ontology-based approach to knowledge representation is more and more successfully adopted in the life-science domains to formalize biological entities and phenomena, thanks to a declarative approach for expressing and reasoning over symbolic information. 3D models and ontologies are two complementary ways to describe biological entities that remain largely separated. Indeed, while many ontologies providing a unified formalization of anatomy and embryology exist, they remain only descriptive and make access to the anatomical content of complex 3D embryology models and simulations difficult. In this work, we present a novel ontology describing the development of human embryology with deforming 3D models. Beyond describing how organs and structures are composed, our ontology integrates a procedural description of their 3D representations, temporal deformations, and relations with respect to their development. We also created inference rules to express complex connections between entities. The result is a unified description of both the knowledge of organ deformation and the corresponding 3D representations, enabling dynamic visualization of embryo deformation during the Carnegie stages. Through a simplified ontology, containing representative entities linked to spatial position and temporal process information, we illustrate the added value of such a declarative approach for interactive simulation and visualization of 3D embryos. Combining ontologies and 3D models enables a declarative description of different embryological models that captures the complexity of human developmental anatomy. Visualizing embryos with 3D geometric models and their animated deformations may pave the way towards hypothesis-driven applications. These can also be used to assist the learning of this complex subject. http://www.mycorporisfabrica.org/.
Shallow Processing and Attention Capture in Written and Spoken Discourse
ERIC Educational Resources Information Center
Sanford, Alison J. S.; Sanford, Anthony J.; Molle, Jo; Emmott, Catherine
2006-01-01
Processing of discourse seems to be far from uniform with much evidence indicating that it can be quite shallow. The question is then what modulates depth of processing? A range of discourse devices exist that we believe may lead to more detailed processing of language input (Attention Capturers), thus serving as modulators of processing enabling…
Water surface capturing by image processing
USDA-ARS?s Scientific Manuscript database
An alternative means of measuring the water surface interface during laboratory experiments is processing a series of sequentially captured images. Image processing can provide a continuous, non-intrusive record of the water surface profile whose accuracy is not dependent on water depth. More trad...
Methanol from CO2 by organo-cocatalysis: CO2 capture and hydrogenation in one process step.
Reller, Christian; Pöge, Matthias; Lißner, Andreas; Mertens, Florian O R L
2014-12-16
Carbon dioxide chemically bound to alcohol-amines was hydrogenated to methanol with recovery of these industrially used CO2 capture reagents. The energetics of the process can be viewed as a partial cancellation of the exothermic heat of reaction of the hydrogenation by the endothermic release of CO2 from the capture reagent. The process provides a means to significantly improve the energy efficiency of CO2-to-methanol conversion.
Distractor-Induced Blindness: A Special Case of Contingent Attentional Capture?
Winther, Gesche N.; Niedeggen, Michael
2017-01-01
The detection of a salient visual target embedded in a rapid serial visual presentation (RSVP) can be severely affected if target-like distractors are presented previously. This phenomenon, known as distractor-induced blindness (DIB), shares the prerequisites of contingent attentional capture (Folk, Remington, & Johnston, 1992). In both, target processing is transiently impaired by the presentation of distractors defined by similar features. In the present study, we investigated whether the speeded response to a target in the DIB paradigm can be described in terms of a contingent attentional capture process. In the first experiments, multiple distractors were embedded in the RSVP stream. Distractors either shared the target’s visual features (Experiment 1A) or differed from them (Experiment 1B). Congruent with hypotheses drawn from contingent attentional capture theory, response times (RTs) were exclusively impaired in conditions with target-like distractors. However, RTs were not impaired if only one single target-like distractor was presented (Experiment 2). If attentional capture directly contributed to DIB, the single distractor should be sufficient to impair target processing. In conclusion, DIB is not due to contingent attentional capture, but may rely on a central suppression process triggered by multiple distractors. PMID:28439320
Kinematic discrimination of ataxia in horses is facilitated by blindfolding.
Olsen, E; Fouché, N; Jordan, H; Pfau, T; Piercy, R J
2018-03-01
Agreement among experienced clinicians is poor when assessing the presence and severity of ataxia, especially when signs are mild. Consequently, objective gait measurements might be beneficial for the assessment of horses with neurological disease. The objectives were to assess diagnostic criteria using motion capture to measure variability in spatial gait characteristics and swing duration in ataxic and non-ataxic horses, and to assess whether variability increases with blindfolding. Cross-sectional study. A total of 21 horses underwent measurements in a gait laboratory and live neurological grading by multiple raters. In the gait laboratory, the horses walked across a runway surrounded by a 12-camera motion capture system sampling at 240 Hz, both normally and blindfolded, in at least three trials each. Displacements of reflective markers on the head, fetlock, hoof, fourth lumbar vertebra, tuber coxae, and sacrum, derived from three to four consecutive strides, were processed, and descriptive statistics, receiver operating characteristic (ROC) analysis to determine diagnostic sensitivity, specificity, and area under the curve (AUC), and the correlation between median ataxia grade and gait parameters were determined. For horses with a median ataxia grade ≥2, the coefficient of variation of the location of maximum vertical displacement of the pelvic and thoracic distal limbs gave good diagnostic yield. The hooves of the thoracic limbs yielded an AUC of 0.81 with 64% sensitivity and 90% specificity. Blindfolding exacerbated the variation for ataxic horses compared with non-ataxic horses, the hoof marker yielding an AUC of 0.89 with 82% sensitivity and 90% specificity. The low number of consecutive strides per horse obtained with motion capture could decrease diagnostic utility. Motion capture can objectively aid the assessment of horses with ataxia. Furthermore, blindfolding increases variation in distal pelvic limb kinematics, making it a useful clinical tool. © 2017 EVJ Ltd.
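A hedged sketch of the two statistics this study relies on, stride-to-stride coefficient of variation and ROC AUC (the numeric values below are hypothetical, not the study's data):

```python
import numpy as np
from sklearn.metrics import roc_auc_score

def coeff_of_variation(values):
    """CV (%) across consecutive strides of a gait parameter."""
    return 100.0 * np.std(values, ddof=1) / np.mean(values)

# per-stride locations of maximum vertical displacement (hypothetical, m)
strides = np.array([1.02, 0.95, 1.10, 0.99])
print(coeff_of_variation(strides))             # ~6%

# one CV per horse vs. ataxia label (median grade >= 2); hypothetical values
cv_per_horse = np.array([4.2, 5.1, 9.8, 12.3, 3.9, 11.0])
is_ataxic    = np.array([0,   0,   1,   1,    0,   1])
print(roc_auc_score(is_ataxic, cv_per_horse))  # AUC of the CV criterion
```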
Effect of fossil fuels on the parameters of CO2 capture.
Nagy, Tibor; Mizsey, Peter
2013-08-06
Carbon dioxide capture is an increasingly important issue in the design and operation of boilers and power stations because of growing environmental considerations. Such absorber-desorber processes should be able to cope with flue gases from different fossil primary energy sources in order to guarantee flexible, stable, and secure energy supply operation. The changing flue gases have a significant influence on the optimal operation of the capture process, that is, the point where the required heating of the desorber is minimal. Therefore, special consideration is devoted to the proper design and control of boilers and power stations equipped with a CO2 capture process.
Center-TRACON Automation System (CTAS) En Route Trajectory Predictor Requirements and Capabilities
NASA Technical Reports Server (NTRS)
Vivona, Robert; Cate, Karen Tung
2013-01-01
This requirements framework document is designed to support the capture of requirements and capabilities for state-of-the-art trajectory predictors (TPs). The framework has been developed to assist TP experts in capturing a clear, consistent, and cross-comparable set of requirements and capabilities. The goal is to capture capabilities (types of trajectories that can be built), functional requirements (including inputs and outputs), non-functional requirements (including prediction accuracy and computational performance), approaches for constraint relaxation, and input uncertainties. The sections of this framework are based on the Common Trajectory Predictor structure developed by the FAA/Eurocontrol Cooperative R&D Action Plan 16 Committee on Common Trajectory Prediction; it is assumed that the reader is familiar with the Common TP Structure. This initial draft is intended as a first-cut capture of the En Route TS capabilities and requirements. As such, it contains many annotations indicating possible logic errors in the CTAS code or in the description provided. The intent is to work out the details of the annotations with NASA and to update this document at a later time.
Tidal capture of stars by a massive black hole
NASA Technical Reports Server (NTRS)
Novikov, I. D.; Pethick, C. J.; Polnarev, A. G.
1992-01-01
The processes leading to tidal capture of stars by a massive black hole and the consequences of these processes in a dense stellar cluster are discussed in detail. When the amplitude of a tide and the subsequent oscillations are sufficiently large, the energy deposited in a star after periastron passage and formation of a bound orbit cannot be estimated directly using the linear theory of oscillations of a spherical star, but rather numerical estimates must be used. The evolution of a star after tidal capture is discussed. The maximum ratio R of the cross-section for tidal capture to that for tidal disruption is about 3 for real systems. For the case of a stellar system with an empty capture loss cone, even in the case when the impact parameter for tidal capture only slightly exceeds the impact parameter for direct tidal disruption, tidal capture would be much more important than tidal disruption.
The relationship between action-effect monitoring and attention capture.
Kumar, Neeraj; Manjaly, Jaison A; Sunny, Meera Mary
2015-02-01
Many recent findings suggest that stimuli that are perceived to be the consequence of one's own actions are processed with priority. According to the preactivation account of intentional binding, predicted consequences are preactivated and hence receive a temporal advantage in processing. The implications of the preactivation account are important for theories of attention capture, as a temporal advantage often translates to attention capture. Hence, action might modulate attention capture by feature singletons. Experiment 1 showed that a motion onset and a color change captured attention only when preceded by an action. Experiment 2 showed that capture occurs only with predictable, but not with unpredictable, consequences of action. Experiment 3 showed that even when half the display changed color at display transition, all of those items were prioritized. The results suggest that action modulates attentional control.
Landmeyer, J.E.
1994-01-01
Ground-water capture zone boundaries for individual pumped wells in a confined aquifer were delineated by using ground-water models. Both analytical and numerical (semi-analytical) models that more accurately represent the ground-water-flow system were used. All models delineated 2-dimensional boundaries (capture zones) that represent the areal extent of ground-water contribution to a pumped well. The resultant capture zones were evaluated on the basis of the ability of each model to realistically represent the part of the ground-water-flow system that contributed water to the pumped wells. Analytical models used were based on a fixed-radius approach and included: an arbitrary radius model, a calculated fixed radius model based on the volumetric-flow equation with a time-of-travel criterion, and a calculated fixed radius model derived from modification of the Theis model with a drawdown criterion. Numerical models used included the 2-dimensional, finite-difference models RESSQC and MWCAP. The arbitrary radius and Theis analytical models delineated capture zone boundaries that compared least favorably with capture zones delineated using the volumetric-flow analytical model and both numerical models. The numerical models produced more hydrologically reasonable capture zones (oriented parallel to the regional flow direction) than the volumetric-flow equation. The RESSQC numerical model computed more hydrologically realistic capture zones than the MWCAP numerical model by accounting for changes in the shape of capture zones caused by multiple-well interference. The capture zone boundaries generated by using both analytical and numerical models indicated that the currently used 100-foot radius of protection around a wellhead in South Carolina underestimates the extent of ground-water capture for pumped wells in this particular wellfield in the Upper Floridan aquifer. The arbitrary fixed radius of 100 feet was shown to underestimate the upgradient contribution of ground-water flow to a pumped well.
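Of the analytical models compared above, the calculated fixed radius based on the volumetric-flow equation is simple enough to sketch. Assuming the standard cylinder form r = sqrt(Q t / (π n b)) with a time-of-travel criterion (the parameter values below are illustrative, not the study's):

```python
# Hedged sketch of the calculated-fixed-radius (volumetric-flow) method:
# the radius of the aquifer cylinder that supplies the well within a
# chosen time of travel.
import math

def calculated_fixed_radius(Q, t, porosity, b):
    """Q: pumping rate (m^3/d); t: time-of-travel criterion (d);
    porosity: effective porosity (-); b: screened aquifer thickness (m)."""
    return math.sqrt(Q * t / (math.pi * porosity * b))

print(calculated_fixed_radius(Q=500.0, t=5 * 365.0, porosity=0.25, b=30.0))
# -> ~197 m for a 5-year time-of-travel criterion
```

The comparison in the abstract follows directly: a fixed 100-foot (~30 m) radius ignores Q, t, and the regional gradient entirely, which is why it underestimates the upgradient contribution.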
Time Resolved Studies of Carrier Dynamics in III -v Heterojunction Semiconductors.
NASA Astrophysics Data System (ADS)
Westland, Duncan James
Available from UMI in association with The British Library. Requires signed TDF. Picosecond time-resolution photoluminescence spectroscopy has been used to study transient processes in Ga0.47In0.53As/InP multiple quantum wells (MQWs), and in bulk Ga0.47In0.53As and GaSb. To facilitate the experimental studies, apparatus was constructed to allow the detection of transient luminescence with 3 ps time resolution, employing a frequency upconversion technique. Relaxation of energetic carriers in bulk Ga0.47In0.53As by optic phonons has been investigated and, at carrier densities of ~3 × 10^18 cm^-3, is found to be a considerably slower process than simple theory predicts. The discrepancy is resolved by the inclusion of a non-equilibrium population of longitudinal optic phonons in the theoretical description. Slow energy loss is also observed in a 154 Å MQW under similar conditions, but carriers are found to relax more quickly in a 14 Å MQW with a comparable repeat period. The theory of non-equilibrium mode occupation is modified to describe the case of a MQW and is found to agree with experiment. Carrier relaxation in GaSb is studied and the importance of occupation of the L6 conduction band valley in this material is demonstrated. The ambipolar diffusion of a photoexcited carrier plasma through an InP capping layer was investigated using an optical time-of-flight technique. This experiment also enables the efficiency of carrier capture by a Ga0.47In0.53As quantum well to be determined. A capture time of 4 ps was found.
Children's Understanding of Large-Scale Mapping Tasks: An Analysis of Talk, Drawings, and Gesture
ERIC Educational Resources Information Center
Kotsopoulos, Donna; Cordy, Michelle; Langemeyer, Melanie
2015-01-01
This research examined how children represent motion in large-scale mapping tasks that we referred to as "motion maps". The underlying mathematical content was transformational geometry. In total, 19 children, 8 to 10 years old, created motion maps and captured their motion maps with accompanying verbal descriptions digitally. Analysis of…
40 CFR 63.11509 - What are my notification, reporting, and recordkeeping requirements?
Code of Federal Regulations, 2011 CFR
2011-07-01
... management practices and equipment standards. (iii) Description of the capture and emission control systems... subject to the requirements in § 63.11507(a)(1), “What are my standards and management practices?”, you... that is subject to the requirements in § 63.11507(b), “What are my standards and management practices...
IDEF5 Ontology Description Capture Method: Concepts and Formal Foundations
1992-11-01
cutter comes to exist. The puzzle here goes back to Greek times in the guise of the Ship of Theseus: if we bit by bit replace the planks of a ship...
MSFC Propulsion Systems Department Knowledge Management Project
NASA Technical Reports Server (NTRS)
Caraccioli, Paul A.
2007-01-01
This slide presentation reviews the Knowledge Management (KM) project of the Propulsion Systems Department at Marshall Space Flight Center. KM is needed to support knowledge capture and preservation and to support an information-sharing culture. The presentation includes the strategic plan for the KM initiative, the system requirements, the technology description, the user interface and custom features, and a search demonstration.
Art in Social Studies: Exploring the World and Ourselves with Rembrandt
ERIC Educational Resources Information Center
Ahmad, Iftikhar
2008-01-01
Rembrandt's art lends itself as a fertile resource for teaching and learning social studies. His art not only captures the social studies themes relevant to the Dutch Golden Age, but it also offers a description of human relations transcending temporal and spatial frontiers. Rembrandt is an imaginative storyteller with a keen insight for minute…
40 CFR 63.4920 - What reports must I submit?
Code of Federal Regulations, 2010 CFR
2010-07-01
... control device and were diverted to the atmosphere), the semiannual compliance report must contain the... capture systems and add-on control devices, using Equation 1 of § 63.4961, and Equation 3 of § 63.4961 for... malfunction started and stopped. (vii) A brief description of the CPMS. (viii) The date of the latest CPMS...
The California All-sky Meteor Surveillance (CAMS) System
NASA Astrophysics Data System (ADS)
Gural, P. S.
2011-01-01
A unique next generation multi-camera, multi-site video meteor system is being developed and deployed in California to provide high accuracy orbits of simultaneously captured meteors. Included herein is a description of the goals, concept of operations, hardware, and software development progress. An appendix contains a meteor camera performance trade study made for video systems circa 2010.
CO2 capture from IGCC gas streams using the AC-ABC process
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nagar, Anoop; McLaughlin, Elisabeth; Hornbostel, Marc
The objective of this project was to develop a novel, low-cost CO2 capture process for pre-combustion gas streams. The bench-scale work was conducted at SRI International. A 0.15-MWe integrated pilot plant was constructed and operated for over 700 hours at the National Carbon Capture Center, Wilsonville, AL. The AC-ABC (ammonium carbonate-ammonium bicarbonate) process for capture of CO2 and H2S from the pre-combustion gas stream offers many advantages over Selexol-based technology. The process relies on the simple chemistry of the NH3-CO2-H2O-H2S system and on the ability of the aqueous ammoniated solution to absorb CO2 at near-ambient temperatures and to release it as a high-purity, high-pressure gas at a moderately elevated regeneration temperature. It is estimated that the increase in cost of electricity (COE) with the AC-ABC process will be ~30%, and the cost of CO2 captured is projected to be less than $27/metric ton of CO2 while meeting the 90% CO2 capture goal. The Bechtel Pressure Swing Claus (BPSC) is a complementary technology offered by Bechtel Hydrocarbon Technology Solutions, Inc. BPSC is a high-pressure, sub-dew-point Claus process that allows for nearly complete removal of H2S from a gas stream. It operates at gasifier pressures and moderate temperatures and does not affect CO2 content. When coupled with AC-ABC, the combined technologies allow a nearly pure CO2 stream to be captured at high pressure, something which Selexol and other solvent-based technologies cannot achieve.
Stellar Abundance Observations and Heavy Element Formation
NASA Astrophysics Data System (ADS)
Cowan, J. J.
2005-05-01
Abundance observations indicate the presence of rapid-neutron capture (i.e., r-process) elements in old Galactic halo and globular cluster stars. These observations provide insight into the nature of the earliest generations of stars in the Galaxy -- the progenitors of the halo stars -- responsible for neutron-capture synthesis of the heavy elements. Abundance comparisons among the r-process-rich halo stars show that the heaviest neutron-capture elements (i.e., Ba and above) are consistent with a scaled solar system r-process abundance distribution, while the lighter neutron-capture elements do not conform to the solar pattern. These comparisons suggest the possibility of two r-process sites in stars. The large star-to-star scatter observed in the abundances of neutron-capture element/iron ratios at low metallicities -- which disappears with increasing metallicity or [Fe/H] -- suggests the formation of these heavy elements (presumably from certain types of supernovae) was rare in the early Galaxy. The stellar abundances also indicate a change from the r-process to the slow neutron capture (i.e., s-) process at higher metallicities in the Galaxy and provide insight into Galactic chemical evolution. Finally, the detection of thorium and uranium in halo and globular cluster stars offers an independent age-dating technique that can put lower limits on the age of the Galaxy, and hence the Universe. This work has been supported in part by NSF grant AST 03-07279 (J.J.C.) and by STScI grants GO-8111, GO-8342 and GO-9359.
Preferred mental models in reasoning about spatial relations.
Jahn, Georg; Knauff, Markus; Johnson-Laird, P N
2007-12-01
The theory of mental models postulates that individuals infer that a spatial description is consistent only if they can construct a model in which all the assertions in the description are true. Individuals prefer a parsimonious representation, and so, when a description is consistent with more than one possible layout of entities on the left-right dimension, individuals in our culture prefer to construct models working from left to right. They also prefer to locate entities referred to in the same assertion as adjacent to one another in a model. And, if possible, they tend to chunk entities into a single unit in order to capture several possibilities in a single model. We report four experiments corroborating these predictions. The results shed light on the integration of relational assertions, and they show that participants exploit implicit constraints in building models of spatial relations.
Applying policy network theory to policy-making in China: the case of urban health insurance reform.
Zheng, Haitao; de Jong, Martin; Koppenjan, Joop
2010-01-01
In this article, we explore whether policy network theory can be applied in the People's Republic of China (PRC). We carried out a literature review of how this approach has been dealt with in the Chinese policy sciences thus far. We then present the key concepts and research approach of policy network theory in the Western literature and try these on a Chinese case to test the fit. We follow this with a description and analysis of the policy-making process regarding health insurance reform in China from 1998 until the present. Based on this case study, we argue that this body of theory is useful for describing and explaining policy-making processes in the Chinese context. However, limitations of the generic model appear in capturing the fundamentally different political and administrative systems and crucially different cultural values, and in the applicability of some research methods common in Western countries. Finally, we address which political and cultural aspects turn out to be different in the PRC and how they affect methodological and practical problems that PRC researchers will encounter when studying decision-making processes.
Integration of the Gene Ontology into an object-oriented architecture.
Shegogue, Daniel; Zheng, W Jim
2005-05-10
To standardize gene product descriptions, a formal vocabulary defined as the Gene Ontology (GO) has been developed. GO terms have been categorized into biological processes, molecular functions, and cellular components. However, there is no single representation that integrates all the terms into one cohesive model. Furthermore, GO definitions have little information explaining the underlying architecture that forms these terms, such as the dynamic and static events occurring in a process. In contrast, object-oriented models have been developed to show dynamic and static events. A portion of the TGF-beta signaling pathway, which is involved in numerous cellular events including cancer, differentiation and development, was used to demonstrate the feasibility of integrating the Gene Ontology into an object-oriented model. Using object-oriented models we have captured the static and dynamic events that occur during a representative GO process, "transforming growth factor-beta (TGF-beta) receptor complex assembly" (GO:0007181). We demonstrate that the utility of GO terms can be enhanced by object-oriented technology, and that the GO terms can be integrated into an object-oriented model by serving as a basis for the generation of object functions and attributes.
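A hedged sketch of what such an integration might look like in code, with the GO process mapped to a class, participants to attributes, and dynamic events to methods (the class and method names are illustrative, not the authors' model):

```python
# Illustrative object-oriented rendering of a GO process term.
class TGFBetaReceptorComplexAssembly:
    """GO:0007181 'transforming growth factor-beta receptor complex assembly'
    as an object: GO term -> class, participants -> attributes (static
    events), assembly steps -> methods (dynamic events)."""
    go_id = "GO:0007181"

    def __init__(self, ligand, type1_receptor, type2_receptor):
        self.ligand = ligand              # static: participating entities
        self.type1 = type1_receptor
        self.type2 = type2_receptor
        self.complex = []

    def bind_ligand_to_type2(self):       # dynamic event 1
        self.complex = [self.ligand, self.type2]

    def recruit_type1(self):              # dynamic event 2
        self.complex.append(self.type1)
        return self.complex               # the assembled receptor complex
```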
NASA Astrophysics Data System (ADS)
Mazzuca, James W.; Haut, Nathaniel K.
2018-06-01
It has been recently shown that in the presence of an applied voltage, hydrogen and deuterium nuclei can be separated from one another using graphene membranes as a nuclear sieve, resulting in a 10-fold enhancement in the concentration of the lighter isotope. While previous studies, both experimental and theoretical, have attributed this effect mostly to differences in vibrational zero point energy (ZPE) of the various isotopes near the membrane surface, we propose that multi-dimensional quantum mechanical tunneling of nuclei through the graphene membrane influences this proton permeation process in a fundamental way. We perform ring polymer molecular dynamics calculations in which we include both ZPE and tunneling effects of various hydrogen isotopes as they permeate the graphene membrane and compute rate constants across a range of temperatures near 300 K. While capturing the experimentally observed separation factor, our calculations indicate that the transverse motion of the various isotopes across the surface of the graphene membrane is an essential part of this sieving mechanism. An understanding of the multi-dimensional quantum mechanical nature of this process could serve to guide the design of other such isotopic enrichment processes for a variety of atomic and molecular species of interest.
NASA Astrophysics Data System (ADS)
Nomaguch, Yutaka; Fujita, Kikuo
This paper proposes a design support framework, named DRIFT (Design Rationale Integration Framework of Three layers), which dynamically captures and manages hypothesis and verification in the design process. The core of DRIFT is a three-layered design process model of action, model operation, and argumentation. This model integrates various design support tools and captures the design operations performed on them. The action level captures the sequence of design operations. The model operation level captures the transition of design states, recording a design snapshot across design tools. The argumentation level captures the process of setting problems and alternatives. The linkage of the three levels makes it possible to automatically and efficiently capture and manage iterative hypothesis-and-verification processes through design operations across design tools. In DRIFT, such linkage is extracted through templates of design operations, which are derived from the patterns embedded in design tools such as Design-For-X (DFX) approaches, and design tools are integrated through an ontology-based representation of design concepts. An argumentation model, gIBIS (graphical Issue-Based Information System), is used for representing dependencies among problems and alternatives. A mechanism of TMS (Truth Maintenance System) is used for managing multiple hypothetical design stages. This paper also demonstrates a prototype implementation of DRIFT and its application to a simple design problem. Further, it concludes with a discussion of some future issues.
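A minimal sketch of the three-layer capture structure as a data model (illustrative only; DRIFT itself rests on ontology-based integration, gIBIS, and a TMS rather than these dataclasses):

```python
# Illustrative data model for the action / model-operation / argumentation
# layers described above.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Action:                     # level 1: raw design operations on tools
    tool: str
    operation: str

@dataclass
class ModelOperation:             # level 2: transition between design states
    before_state: str
    after_state: str
    actions: List[Action] = field(default_factory=list)

@dataclass
class Argument:                   # level 3: gIBIS-style issue/alternative
    issue: str
    alternative: str
    supported_by: List[ModelOperation] = field(default_factory=list)
```

The linkage the paper emphasizes corresponds here to the containment chain Argument -> ModelOperation -> Action, which is what lets a hypothesis be traced down to the concrete operations that verified it.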
Yap, Florence G H; Yen, Hong-Hsu
2014-02-20
Wireless Visual Sensor Networks (WVSNs), in which camera-equipped sensor nodes can capture, process, and transmit image/video information, have become an important new research area. Compared to traditional wireless sensor networks (WSNs), which can only transmit scalar information (e.g., temperature), the visual data in WVSNs enable much wider applications, such as visual security surveillance and visual wildlife monitoring. However, compared to the scalar data in WSNs, visual data are much bigger and more complicated, so intelligent schemes are required to capture, process, and transmit visual data within the limited resources (hardware capability and bandwidth) of WVSNs. WVSNs introduce new multi-disciplinary research opportunities in topics that include visual sensor hardware, image and multimedia capture and processing, and wireless communication and networking. In this paper, we survey existing research efforts on visual sensor hardware, visual sensor coverage/deployment, and visual data capture/processing/transmission issues in WVSNs. We conclude that WVSN research is still at an early stage and there are still many open issues that have not been fully addressed. More novel multi-disciplinary, cross-layered, distributed, and collaborative solutions should be devised to tackle these challenging issues in WVSNs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2010-01-01
Broad Funding Opportunity Announcement Project: Two faculty members at Lehigh University created a new technique called supercapacitive swing adsorption (SSA) that uses electrical charges to encourage materials to capture and release CO2. Current CO2 capture methods include expensive processes that involve changes in temperature or pressure. Lehigh University's approach uses electric fields to improve the ability of inexpensive carbon sorbents to trap CO2. Because this process uses electric fields and not electric current, the overall energy consumption is projected to be much lower than conventional methods. Lehigh University is now optimizing the materials to maximize CO2 capture and minimize the energy needed for the process.
NASA Astrophysics Data System (ADS)
Casanovas, A.; Domingo-Pardo, C.; Guerrero, C.; Lerendegui-Marco, J.; Calviño, F.; Tarifeño-Saldivia, A.; Dressler, R.; Heinitz, S.; Kivel, N.; Quesada, J. M.; Schumann, D.; Aberle, O.; Alcayne, V.; Andrzejewski, J.; Audouin, L.; Bécares, V.; Bacak, M.; Barbagallo, M.; Bečvář, F.; Bellia, G.; Berthoumieux, E.; Billowes, J.; Bosnar, D.; Brown, A.; Busso, M.; Caamaño, M.; Caballero-Ontanaya, L.; Calviani, M.; Cano-Ott, D.; Cerutti, F.; Chen, Y. H.; Chiaveri, E.; Colonna, N.; Cortés, G.; Cortés-Giraldo, M. A.; Cosentino, L.; Cristallo, S.; Damone, L. A.; Diakaki, M.; Dietz, M.; Dupont, E.; Durán, I.; Eleme, Z.; Fernández-Domínguez, B.; Ferrari, A.; Ferreira, P.; Finocchiaro, P.; Furman, V.; Göbel, K.; Gawlik, A.; Gilardoni, S.; Glodariu, T.; Gonçalves, I. F.; González-Romero, E.; Gunsing, F.; Heyse, J.; Jenkins, D. G.; Käppeler, F.; Kadi, Y.; Katabuchi, T.; Kimura, A.; Kokkoris, M.; Kopatch, Y.; Krtička, M.; Kurtulgil, D.; Ladarescu, I.; Lederer-Woods, C.; Meo, S. Lo; Lonsdale, S. J.; Macina, D.; Martínez, T.; Masi, A.; Massimi, C.; Mastinu, P.; Mastromarco, M.; Matteucci, F.; Maugeri, E. A.; Mazzone, A.; Mendoza, E.; Mengoni, A.; Michalopoulou, V.; Milazzo, P. M.; Mingrone, F.; Musumarra, A.; Negret, A.; Nolte, R.; Ogállar, F.; Oprea, A.; Patronis, N.; Pavlik, A.; Perkowski, J.; Persanti, L.; Porras, I.; Praena, J.; Radeck, D.; Ramos, D.; Rauscher, T.; Reifarth, R.; Rochman, D.; Sabaté-Gilarte, M.; Saxena, A.; Schillebeeckx, P.; Simone, S.; Smith, A. G.; Sosnin, N. V.; Stamatopoulos, A.; Tagliente, G.; Tain, J. L.; Talip, T.; Tassan-Got, L.; Tsinganis, A.; Ulrich, J.; Valenta, S.; Vannini, G.; Variale, V.; Vaz, P.; Ventura, A.; Vlachoudis, V.; Vlastou, R.; Wallner, A.; Woods, P. J.; Wright, T.; Žugec, P.; Köster, U.
2018-05-01
The neutron capture cross sections of some unstable nuclei are especially relevant for s-process nucleosynthesis studies. This quantity is crucial for determining the local abundance pattern, which can yield valuable information about the s-process stellar environment. In this work we describe the neutron capture (n,γ) measurements on two of these nuclei of interest, 204Tl and 171Tm, from target production to the final measurement, performed successfully at the n_TOF facility at CERN in 2014 and 2015. Preliminary results from the ongoing experimental data analysis are also shown. These results include the first ever experimental observation of capture resonances for these two nuclei.
Żyła, Dagmara; Yamamoto, Shûhei; Wolf-Schwenninger, Karin; Solodovnikov, Alexey
2017-01-01
Stenus is the largest genus of rove beetles and the second largest among animals. Its evolutionary success has been associated with the adhesive labial prey-capture apparatus, a unique apomorphy of the genus. Definitive Stenus with the prey-capture apparatus are known from Cenozoic fossils, while the age and early evolution of Steninae have hardly been hypothesized. Our study of several Cretaceous Burmese amber inclusions revealed a stem lineage of Steninae that possibly possessed the Stenus-like prey-capture apparatus. Phylogenetic analysis of extinct and extant taxa of Steninae and putatively allied subfamilies of Staphylinidae, using parsimony and Bayesian approaches, resolved the Burmese amber lineage as a member of Steninae. This justified the description of a new extinct stenine genus, Festenus, with two new species, F. robustus and F. gracilis. The Late Cretaceous age of Festenus suggests an early origin of the prey-capture apparatus in Steninae that perhaps drove the evolution towards the crown Stenus. Our analysis confirmed the well-established sister relationship between Steninae and Euaesthetinae and resolved Scydmaeninae as their next closest relative, the latter having had no stable position in recent phylogenetic studies of rove beetles. Close affiliation of Megalopsidiinae, a subfamily often considered a sister group to the Euaesthetinae + Steninae clade, is rejected. PMID:28397786
Use of sentiment analysis for capturing patient experience from free-text comments posted online.
Greaves, Felix; Ramirez-Cano, Daniel; Millett, Christopher; Darzi, Ara; Donaldson, Liam
2013-11-01
There are large amounts of unstructured, free-text information about the quality of health care available on the Internet in blogs, social networks, and on physician rating websites that are not captured in a systematic way. New analytical techniques, such as sentiment analysis, may allow us to understand and use this information more effectively to improve the quality of health care. We attempted to use machine learning to understand patients' unstructured comments about their care. We used sentiment analysis techniques to categorize online free-text comments by patients as either positive or negative descriptions of their health care. We tried to automatically predict whether a patient would recommend a hospital, whether the hospital was clean, and whether they were treated with dignity from their free-text description, compared to the patient's own quantitative rating of their care. We applied machine learning techniques to all 6412 online comments about hospitals on the English National Health Service website in 2010 using Weka data-mining software. We also compared the results obtained from sentiment analysis with the paper-based national inpatient survey results at the hospital level using Spearman rank correlation for all 161 acute adult hospital trusts in England. There was 81%, 84%, and 89% agreement between quantitative ratings of care and those derived from free-text comments using sentiment analysis for cleanliness, being treated with dignity, and overall recommendation of the hospital, respectively (kappa scores: .40-.74, P<.001 for all). We observed mild to moderate associations between our machine learning predictions and responses to the large patient survey for the three categories examined (Spearman rho 0.37-0.51, P<.001 for all). The prediction accuracy that we have achieved using this machine learning process suggests that we are able to predict, from free text, a reasonably accurate assessment of patients' opinion about different performance aspects of a hospital and that these machine learning predictions are associated with the results of more conventional surveys.
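The study itself used Weka; a comparable supervised pipeline in Python would look roughly like the sketch below. The comments and labels are invented stand-ins for the NHS corpus, and scikit-learn is an assumed substitute for the paper's tooling.

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    # Hypothetical labeled comments: 1 = would recommend the hospital, 0 = would not.
    comments = [
        "The ward was spotless and the staff were kind",
        "Waited six hours and nobody explained anything",
        "I was treated with dignity throughout my stay",
        "Dirty bathroom and rude reception desk",
    ]
    labels = [1, 0, 1, 0]

    # TF-IDF features over unigrams and bigrams feeding a linear classifier.
    model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
    model.fit(comments, labels)

    print(model.predict(["very clean ward and caring nurses"]))  # expect [1]

Predictions like these can then be correlated against survey responses at the hospital level, as the study does with Spearman rank correlation.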
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-29
... Charles Carbon Capture and Sequestration Project, Lake Charles, LA AGENCY: Department of Energy. ACTION... competitive process under the Industrial Carbon Capture and Sequestration (ICCS) Program. The Lake Charles Carbon Capture and Sequestration Project (Lake Charles CCS Project) would demonstrate: (1) advanced...
Representing virus-host interactions and other multi-organism processes in the Gene Ontology.
Foulger, R E; Osumi-Sutherland, D; McIntosh, B K; Hulo, C; Masson, P; Poux, S; Le Mercier, P; Lomax, J
2015-07-28
The Gene Ontology project is a collaborative effort to provide descriptions of gene products in a consistent and computable language, and in a species-independent manner. The Gene Ontology is designed to be applicable to all organisms but up to now has been largely under-utilized for prokaryotes and viruses, in part because of a lack of appropriate ontology terms. To address this issue, we have developed a set of Gene Ontology classes that are applicable to microbes and their hosts, improving both coverage and quality in this area of the Gene Ontology. Describing microbial and viral gene products brings with it the additional challenge of capturing both the host and the microbe. Recognising this, we have worked closely with annotation groups to test and optimize the GO classes, and we describe here a set of annotation guidelines that allow the controlled description of two interacting organisms. Building on the microbial resources already in existence, such as ViralZone, UniProtKB keywords, and MeGO, this project provides an integrated ontology to describe interactions between microbial species and their hosts, with mappings to the external resources above. Housing this information within the freely accessible Gene Ontology project allows the classes and annotation structure to be utilized by a large community of biologists and users.
Free-Form Region Description with Second-Order Pooling.
Carreira, João; Caseiro, Rui; Batista, Jorge; Sminchisescu, Cristian
2015-06-01
Semantic segmentation and object detection are nowadays dominated by methods operating on regions obtained from a bottom-up grouping process (segmentation), yet they use feature extractors developed for recognition on fixed-form (e.g., rectangular) patches, with full images as a special case. This is most likely suboptimal. In this paper we focus on feature extraction and description over free-form regions and study the relationship with their fixed-form counterparts. Our main contributions are novel pooling techniques that capture the second-order statistics of local descriptors inside such free-form regions. We introduce second-order generalizations of average and max-pooling that, together with appropriate non-linearities derived from the mathematical structure of their embedding space, lead to state-of-the-art recognition performance in semantic segmentation experiments without any type of local feature coding. In contrast, we show that codebook-based local feature coding is more important when feature extraction is constrained to operate over regions that include both foreground and large portions of the background, as is typical in image classification settings; for high-accuracy localization setups, second-order pooling over free-form regions produces results superior to those of the winning systems in contemporary semantic segmentation challenges, with models that are much faster in both training and testing.
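A minimal sketch of second-order average pooling as it is commonly summarized; the regularization constant and the eigendecomposition-based matrix logarithm are implementation choices assumed here, not details taken from the paper:

    import numpy as np

    def second_order_avg_pool(descriptors):
        """Pool local descriptors (n x d) into the log of their average outer product."""
        X = np.asarray(descriptors, dtype=float)
        G = (X.T @ X) / X.shape[0]              # average of outer products x x^T
        G += 1e-6 * np.eye(G.shape[0])          # keep G symmetric positive definite
        w, V = np.linalg.eigh(G)
        log_G = (V * np.log(w)) @ V.T           # matrix logarithm via eigendecomposition
        return log_G.ravel()                    # flat region descriptor

    rng = np.random.default_rng(0)
    region = rng.normal(size=(500, 16))         # e.g., 500 local descriptors in a mask
    print(second_order_avg_pool(region).shape)  # (256,)

Because the pooled matrix lives on the manifold of symmetric positive definite matrices, the logarithm maps it to a flat space where linear classifiers behave well.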
A tesselated probabilistic representation for spatial robot perception and navigation
NASA Technical Reports Server (NTRS)
Elfes, Alberto
1989-01-01
The ability to recover robust spatial descriptions from sensory information and to efficiently utilize these descriptions in appropriate planning and problem-solving activities are crucial requirements for the development of more powerful robotic systems. Traditional approaches to sensor interpretation, with their emphasis on geometric models, are of limited use for autonomous mobile robots operating in and exploring unknown and unstructured environments. Here, researchers present a new approach to robot perception that addresses such scenarios using a probabilistic tesselated representation of spatial information called the Occupancy Grid. The Occupancy Grid is a multi-dimensional random field that maintains stochastic estimates of the occupancy state of each cell in the grid. The cell estimates are obtained by interpreting incoming range readings using probabilistic models that capture the uncertainty in the spatial information provided by the sensor. A Bayesian estimation procedure allows the incremental updating of the map using readings taken from several sensors over multiple points of view. An overview of the Occupancy Grid framework is given, and its application to a number of problems in mobile robot mapping and navigation is illustrated. It is argued that a number of robotic problem-solving activities can be performed directly on the Occupancy Grid representation. Some parallels are drawn between operations on Occupancy Grids and related image processing operations.
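The Bayesian update the abstract describes is commonly implemented in log-odds form; the sketch below uses an invented inverse sensor model value and is not taken from the report itself:

    import numpy as np

    def update_cell(log_odds, p_occ_given_reading):
        """Incremental Bayesian update of one grid cell, kept in log-odds form."""
        return log_odds + np.log(p_occ_given_reading / (1.0 - p_occ_given_reading))

    grid = np.zeros((50, 50))     # log-odds 0 corresponds to P(occupied) = 0.5
    # A range reading whose inverse sensor model says cell (10, 12) is occupied
    # with probability 0.8:
    grid[10, 12] = update_cell(grid[10, 12], 0.8)
    p = 1.0 - 1.0 / (1.0 + np.exp(grid[10, 12]))   # back to a probability
    print(f"P(occupied) = {p:.2f}")                # 0.80; further readings refine it

Keeping the grid in log-odds makes each new reading, from any sensor or viewpoint, a simple addition, which is what makes the incremental multi-sensor fusion cheap.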
Chang, Lin-Chau; Mahmood, Riaz; Qureshi, Samina; Breder, Christopher D
2017-01-01
Standardised MedDRA Queries (SMQs) have been developed since the early 2000s and are used by academia, industry, public health, and government sectors for detecting safety signals in adverse event safety databases. The purpose of the present study is to characterize how SMQs are used and their impact on safety analyses for New Drug Application (NDA) and Biologics License Application (BLA) submissions to the United States Food and Drug Administration (USFDA). We used the PharmaPendium database to capture SMQ use in the Summary Basis of Approvals (SBoAs) of drugs and biologics approved by the USFDA. Characteristics of the drugs and the SMQ use were employed to evaluate the role of SMQ safety analyses in regulatory decisions and the veracity of the signals they revealed. A comprehensive search of the SBoAs yielded 184 regulatory submissions approved from 2006 to 2015. Search strategies more frequently utilized restrictive searches with "narrow terms" to enhance specificity over strategies using "broad terms" to increase sensitivity, while some involved modification of search terms. A majority (59%) of the 1290 searches used descriptive statistics; however, inferential statistics were utilized in 35% of them. Commentary from reviewers and supervisory staff suggested that a small yet notable percentage (18%) of the 1290 searches supported regulatory decisions. The searches with regulatory impact were found in 73 submissions (40% of the submissions investigated). Most searches (75% of the 227 searches with regulatory implications) described how the searches were confirmed, indicating prudence in the decision-making process. SMQs have an increasing role in the presentation and review of safety analyses for NDAs/BLAs and their regulatory reviews. This study suggests that SMQs are best used as a screening tool, with descriptive statistics, description of SMQ modifications, and systematic verification of cases, which is crucial for drawing regulatory conclusions.
Chang, Lin-Chau; Mahmood, Riaz; Qureshi, Samina
2017-01-01
Purpose Standardised MedDRA Queries (SMQs) have been developed since the early 2000s and are used by academia, industry, public health, and government sectors for detecting safety signals in adverse event safety databases. The purpose of the present study is to characterize how SMQs are used and their impact on safety analyses for New Drug Application (NDA) and Biologics License Application (BLA) submissions to the United States Food and Drug Administration (USFDA). Methods We used the PharmaPendium database to capture SMQ use in the Summary Basis of Approvals (SBoAs) of drugs and biologics approved by the USFDA. Characteristics of the drugs and the SMQ use were employed to evaluate the role of SMQ safety analyses in regulatory decisions and the veracity of the signals they revealed. Results A comprehensive search of the SBoAs yielded 184 regulatory submissions approved from 2006 to 2015. Search strategies more frequently utilized restrictive searches with “narrow terms” to enhance specificity over strategies using “broad terms” to increase sensitivity, while some involved modification of search terms. A majority (59%) of the 1290 searches used descriptive statistics; however, inferential statistics were utilized in 35% of them. Commentary from reviewers and supervisory staff suggested that a small yet notable percentage (18%) of the 1290 searches supported regulatory decisions. The searches with regulatory impact were found in 73 submissions (40% of the submissions investigated). Most searches (75% of the 227 searches with regulatory implications) described how the searches were confirmed, indicating prudence in the decision-making process. Conclusions SMQs have an increasing role in the presentation and review of safety analyses for NDAs/BLAs and their regulatory reviews. This study suggests that SMQs are best used as a screening tool, with descriptive statistics, description of SMQ modifications, and systematic verification of cases, which is crucial for drawing regulatory conclusions. PMID:28570569
40 CFR 65.113 - Standards: Sampling connection systems.
Code of Federal Regulations, 2011 CFR
2011-07-01
... be collected or captured. (c) Equipment design and operation. Each closed-purge, closed-loop, or... system; or (2) Collect and recycle the purged process fluid to a process; or (3) Be designed and operated to capture and transport all the purged process fluid to a control device that meets the requirements...
40 CFR 65.113 - Standards: Sampling connection systems.
Code of Federal Regulations, 2014 CFR
2014-07-01
... be collected or captured. (c) Equipment design and operation. Each closed-purge, closed-loop, or... system; or (2) Collect and recycle the purged process fluid to a process; or (3) Be designed and operated to capture and transport all the purged process fluid to a control device that meets the requirements...
40 CFR 65.113 - Standards: Sampling connection systems.
Code of Federal Regulations, 2010 CFR
2010-07-01
... be collected or captured. (c) Equipment design and operation. Each closed-purge, closed-loop, or... system; or (2) Collect and recycle the purged process fluid to a process; or (3) Be designed and operated to capture and transport all the purged process fluid to a control device that meets the requirements...
Kendall, W.L.; Nichols, J.D.; North, P.M.; Nichols, J.D.
1995-01-01
The use of the Cormack-Jolly-Seber model under a standard sampling scheme of one sample per time period, when the Jolly-Seber assumption that all emigration is permanent does not hold, leads to the confounding of temporary emigration probabilities with capture probabilities. This biases the estimates of capture probability when temporary emigration is a completely random process, and biases both capture and survival probabilities when there is a temporary trap response in temporary emigration or it is Markovian. The use of secondary capture samples over a shorter interval within each period, during which the population is assumed to be closed (Pollock's robust design), provides a second source of information on capture probabilities. This solves the confounding problem, and thus temporary emigration probabilities can be estimated. This can be accomplished in an ad hoc fashion for completely random temporary emigration, and to some extent in the temporary trap response case, but modelling the complete sampling process provides more flexibility and permits direct estimation of variances. For the case of Markovian temporary emigration, a full likelihood is required.
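The confounding can be made concrete with a toy simulation (all parameter values invented): under completely random temporary emigration with probability gamma, a single sample per period estimates the product (1 - gamma) * p rather than the capture probability p itself.

    import numpy as np

    rng = np.random.default_rng(1)
    n_animals, p, gamma = 100_000, 0.6, 0.25    # capture and temporary-emigration prob.

    present = rng.random(n_animals) > gamma     # inside the study area this period?
    captured = present & (rng.random(n_animals) < p)

    # Apparent capture probability is confounded with temporary emigration:
    print(captured.mean())          # ~ (1 - gamma) * p = 0.45, not p = 0.60

Secondary samples within a closed period observe p directly, which is what lets the robust design separate gamma from p.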
Williams, A Mark; Ericsson, K Anders
2005-06-01
The number of researchers studying perceptual-cognitive expertise in sport is increasing. The intention in this paper is to review the currently accepted framework for studying expert performance and to consider implications for undertaking research work in the area of perceptual-cognitive expertise in sport. The expert performance approach presents a descriptive and inductive approach for the systematic study of expert performance. The nature of expert performance is initially captured in the laboratory using representative tasks that identify reliably superior performance. Process-tracing measures are employed to determine the mechanisms that mediate expert performance on the task. Finally, the specific types of activities that lead to the acquisition and development of these mediating mechanisms are identified. General principles and mechanisms may be discovered and then validated by more traditional experimental designs. The relevance of this approach to the study of perceptual-cognitive expertise in sport is discussed and suggestions for future work highlighted.
Best practices for the 3D documentation of the Grotta dei Cervi of Porto Badisco, Italy
NASA Astrophysics Data System (ADS)
Beraldin, J. A.; Picard, M.; Bandiera, A.; Valzano, V.; Negro, F.
2011-03-01
The Grotta dei Cervi is a Neolithic cave where human presence has left many unique pictographs on the walls of many of its chambers. It was closed for conservation reasons soon after its discovery in 1970. It is for these reasons that a 3D documentation effort was started. Two sets of high-resolution and detailed three-dimensional (3D) acquisitions were captured in 2005 and 2009, respectively, along with two-dimensional (2D) images. From this information a textured 3D model was produced for most of the 300-m long central corridor. Carbon dating of the guano used for the pictographs and environmental monitoring (temperature, relative humidity, and radon) completed the project. This paper presents the project, the results obtained so far, the best practices that have emerged from this work, and a description of the processing pipeline, which deals with more than 27 billion 3D coordinates.
Space Situational Awareness in the Joint Space Operations Center
NASA Astrophysics Data System (ADS)
Wasson, M.
2011-09-01
Flight safety of orbiting resident space objects is critical to our national interest and defense. United States Strategic Command has assigned the responsibility for Space Situational Awareness (SSA) to its Joint Functional Component Command - Space (JFCC SPACE) at Vandenberg Air Force Base. This paper describes current SSA imperatives, new developments in SSA tools, and developments in Defensive Operations. Current SSA processes are being examined to capture, and possibly improve, tasking of SSN sensors and "new" space-based sensors, "common" conjunction assessment methodology, and SSA sharing, given the growth seen over the last two years. The stand-up of a Defensive Ops Branch highlights the need for advanced analysis and collaboration across the space, weather, intelligence, and cyber specialties. The discussion of new developments in SSA tools covers planned computing hardware/software upgrades as well as the use of User-Defined Operating Pictures and visualization applications.
Ellis, Beverley; Roberts, Jean; Cooper, Helen
2007-01-01
This case study report of the establishment of a national repository of multi-media materials describes the creation process, the challenges faced in putting it into operation and the opportunities for the future. The initial resource has been incorporated under standard library and knowledge management practices. A collaborative action research method was used with active experts in the domain to determine the requirements and priorities for further development. The National Health Informatics Collection (NatHIC) is now accessible and the further issues are being addressed by inclusion in future University and NHS strategic plans. Ultimately the Collection will link with other facilities that contribute to the description and maintenance of effective informatics in support of health globally. The issues raised about the National Health Informatics Collection as established in the UK have resonance with the challenges of capturing the overall historic development of an emerging discipline in any country.
Mapping transiently formed and sparsely populated conformations on a complex energy landscape.
Wang, Yong; Papaleo, Elena; Lindorff-Larsen, Kresten
2016-08-23
Determining the structures, kinetics, thermodynamics and mechanisms that underlie conformational exchange processes in proteins remains extremely difficult. Only in favourable cases is it possible to provide atomic-level descriptions of sparsely populated and transiently formed alternative conformations. Here we benchmark the ability of enhanced-sampling molecular dynamics simulations to determine the free energy landscape of the L99A cavity mutant of T4 lysozyme. We find that the simulations capture key properties previously measured by NMR relaxation dispersion methods including the structure of a minor conformation, the kinetics and thermodynamics of conformational exchange, and the effect of mutations. We discover a new tunnel that involves the transient exposure towards the solvent of an internal cavity, and show it to be relevant for ligand escape. Together, our results provide a comprehensive view of the structural landscape of a protein, and point forward to studies of conformational exchange in systems that are less characterized experimentally.
A computer vision for animal ecology.
Weinstein, Ben G
2018-05-01
A central goal of animal ecology is to observe species in the natural world. The cost and challenge of data collection often limit the breadth and scope of ecological study. Ecologists often use image capture to bolster data collection in time and space. However, the ability to process these images remains a bottleneck. Computer vision can greatly increase the efficiency, repeatability and accuracy of image review. Computer vision uses image features, such as colour, shape and texture to infer image content. I provide a brief primer on ecological computer vision to outline its goals, tools and applications to animal ecology. I reviewed 187 existing applications of computer vision and divided articles into ecological description, counting and identity tasks. I discuss recommendations for enhancing the collaboration between ecologists and computer scientists and highlight areas for future growth of automated image analysis.
Pumpe, Daniel; Greiner, Maksim; Müller, Ewald; Enßlin, Torsten A
2016-07-01
Stochastic differential equations describe many physical, biological, and sociological systems well, despite the simplifications often made in their derivation. Here the use of simple stochastic differential equations to characterize and classify complex dynamical systems is proposed within a Bayesian framework. To this end, we develop a dynamic system classifier (DSC). The DSC first abstracts training data of a system in terms of time-dependent coefficients of the descriptive stochastic differential equation. Thereby the DSC identifies unique correlation structures within the training data. For definiteness we restrict the presentation of the DSC to oscillation processes with a time-dependent frequency ω(t) and damping factor γ(t). Although real systems might be more complex, this simple oscillator captures many characteristic features. The ω and γ time lines represent the abstract system characterization and permit the construction of efficient signal classifiers. Numerical experiments show that such classifiers perform well even in the low signal-to-noise regime.
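A sketch of the kind of process the DSC abstracts: a stochastically driven oscillator with slowly drifting ω(t) and γ(t), integrated with the Euler-Maruyama scheme. The drift profiles and noise amplitude below are invented for illustration.

    import numpy as np

    def simulate_oscillator(t_max=10.0, dt=1e-3, sigma=0.05, seed=0):
        """Euler-Maruyama for x'' + 2*gamma(t)*x' + omega(t)**2 * x = white noise."""
        rng = np.random.default_rng(seed)
        n = int(t_max / dt)
        x, v = 1.0, 0.0
        trace = np.empty(n)
        for i in range(n):
            t = i * dt
            omega = 2.0 * np.pi * (1.0 + 0.05 * t)   # slowly rising frequency
            gamma = 0.10 + 0.02 * t                  # slowly growing damping
            v += (-2.0 * gamma * v - omega**2 * x) * dt \
                 + sigma * np.sqrt(dt) * rng.standard_normal()
            x += v * dt
            trace[i] = x
        return trace

    print(simulate_oscillator()[:3])   # first few samples of one realization

Inferring the ω and γ time lines from such traces is the abstraction step on which the classifier then operates.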
Towards a visual modeling approach to designing microelectromechanical system transducers
NASA Astrophysics Data System (ADS)
Dewey, Allen; Srinivasan, Vijay; Icoz, Evrim
1999-12-01
In this paper, we address initial design capture and system conceptualization of microelectromechanical system transducers based on visual modeling and design. Visual modeling frames the task of generating hardware description language (analog and digital) component models in a manner similar to the task of generating software programming language applications. A structured topological design strategy is employed, whereby microelectromechanical foundry cell libraries are utilized to facilitate the design process of exploring candidate cells (topologies), varying key aspects of the transduction for each topology, and determining which topology best satisfies the design requirements. Coupled-energy microelectromechanical system characterizations at a circuit level of abstraction are presented that are based on branch constitutive relations and an overall system of simultaneous differential and algebraic equations. The resulting design methodology is called visual integrated-microelectromechanical VHDL-AMS interactive design (VHDL-AMS being the VHDL hardware description language extended for analog and mixed-signal systems).
Pseudo-Newtonian Equations for Evolution of Particles and Fluids in Stationary Space-times
DOE Office of Scientific and Technical Information (OSTI.GOV)
Witzany, Vojtěch; Lämmerzahl, Claus, E-mail: vojtech.witzany@zarm.uni-bremen.de, E-mail: claus.laemmerzahl@zarm.uni-bremen.de
Pseudo-Newtonian potentials are a tool often used in theoretical astrophysics to capture some key features of a black hole space-time in a Newtonian framework. As a result, one can use Newtonian numerical codes, and Newtonian formalism in general, in an effective description of important astrophysical processes such as accretion onto black holes. In this paper, we develop a general pseudo-Newtonian formalism, which pertains to the motion of particles, light, and fluids in stationary space-times. In return, we are able to assess the applicability of the pseudo-Newtonian scheme. The simplest and most elegant formulas are obtained in space-times without gravitomagnetic effects, such as the Schwarzschild rather than the Kerr space-time; the quantitative errors are smallest for motion with low binding energy. Included is a ready-to-use set of fluid equations in Schwarzschild space-time in Cartesian and radial coordinates.
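For context, the classic example of such a potential (standard textbook material, not specific to this paper) is the Paczyński-Wiita form Φ(r) = -GM/(r - r_s), which reproduces the Schwarzschild innermost stable circular orbit at r = 3 r_s; a quick numerical check:

    import numpy as np

    G, C, M_SUN = 6.674e-11, 2.998e8, 1.989e30

    def isco_paczynski_wiita(mass_kg):
        """ISCO radius for Phi = -G*M/(r - r_s). Circular orbits give
        L^2(r) = G*M*r**3 / (r - r_s)**2, and dL^2/dr = 0 yields r = 3*r_s,
        matching the Schwarzschild value 6*G*M/c^2."""
        r_s = 2.0 * G * mass_kg / C**2
        return 3.0 * r_s

    m = 10.0 * M_SUN
    print(f"ISCO for a 10 solar-mass black hole: {isco_paczynski_wiita(m)/1e3:.1f} km")

This is the sense in which a purely Newtonian potential can encode a strong-field feature of the space-time, which is the premise the paper generalizes.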
DOE Office of Scientific and Technical Information (OSTI.GOV)
Allgood, G.O.; Dress, W.B.; Kercel, S.W.
1999-05-10
A major problem with cavitation in pumps and other hydraulic devices is that there is no effective method for detecting or predicting its inception. The traditional approach is to declare the pump in cavitation when the total head pressure drops by some arbitrary value (typically 3%) in response to a reduction in pump inlet pressure. However, the pump is already cavitating at this point. A method is needed in which cavitation events are captured as they occur and characterized by their process dynamics. The object of this research was to identify specific features of cavitation that could be used as a model-based descriptor in a context-dependent condition-based maintenance (CD-CBM) anticipatory prognostic and health assessment model. This descriptor was based on the physics of the phenomena, capturing the salient features of the process dynamics. An important element of this concept is the development and formulation of the extended process feature vector (EPFV), or model vector. This model-based descriptor encodes the specific information that describes the phenomena and its dynamics and is formulated as a data structure consisting of several elements. The first is a descriptive model abstracting the phenomena. The second is the parameter list associated with the functional model. The third is a figure of merit, a single number between [0,1] representing a confidence factor that the functional model and parameter list actually describe the observed data. Using this as a basis and applying it to the cavitation problem, any given location in a flow loop will have this data structure, differing in value but not content. The extended process feature vector is formulated as follows: EPFV => [<descriptive model>, {parameter list}, confidence factor]. (1) For this study, the model that characterized cavitation was a chirped, exponentially decaying sinusoid. Using the parameters defined by this model, the parameter list included frequency, decay, and chirp rate. Based on this, the process feature vector has the form: EPFV => [<chirped decaying sinusoid>, {frequency = a, decay = b, chirp rate = c}, cf = 0.80]. (2) In this experiment a reversible catastrophe was examined. The reason for this is that the same catastrophe could be repeated to ensure the statistical significance of the data.
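As a sketch of the named descriptor model, the signal below is a chirped, exponentially decaying sinusoid packaged with its parameter list and confidence factor; the numeric parameter values are illustrative, since the abstract fixes only the 0.80 confidence factor:

    import numpy as np

    def cavitation_model(t, amp=1.0, freq_hz=1200.0, decay=35.0, chirp=400.0):
        """Chirped, exponentially decaying sinusoid used as the descriptor model."""
        return amp * np.exp(-decay * t) * np.sin(2.0 * np.pi * (freq_hz + chirp * t) * t)

    t = np.linspace(0.0, 0.05, 2500)
    signal = cavitation_model(t)

    # Extended process feature vector: [model, {parameter list}, confidence factor]
    epfv = {
        "model": "chirped exponentially decaying sinusoid",
        "parameters": {"freq_hz": 1200.0, "decay": 35.0, "chirp": 400.0},
        "confidence": 0.80,
    }
    print(epfv["confidence"], signal[:3])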
DANCEing with the Stars: Measuring Neutron Capture on Unstable Isotopes with DANCE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Couture, A.; Bond, E.; Bredeweg, T. A.
2009-03-10
Isotopes heavier than iron are known to be produced in stars through neutron capture processes. Two major processes, the slow (s) and rapid (r) processes, are each responsible for roughly 50% of the abundances of the heavy isotopes. The neutron capture cross sections of the isotopes on the s-process path reveal information about the expected abundances of the elements as well as stellar conditions and dynamics. Until recently, measurements on unstable isotopes, which are most important for determining stellar temperatures and reaction flow, have not been experimentally feasible. The Detector for Advanced Neutron Capture Experiments (DANCE), located at the Los Alamos Neutron Science Center (LANSCE), was designed to perform time-of-flight neutron capture measurements on unstable isotopes for nuclear astrophysics, stockpile stewardship, and reactor development. DANCE is a 4π BaF2 scintillator array that can perform measurements on sub-milligram samples of isotopes with half-lives as short as a few hundred days. These cross sections are critical for advancing our understanding of the production of the heavy isotopes.
Analysis of the Waggle Dance Motion of Honeybees for the Design of a Biomimetic Honeybee Robot
Landgraf, Tim; Rojas, Raúl; Nguyen, Hai; Kriegel, Fabian; Stettin, Katja
2011-01-01
The honeybee dance “language” is one of the most popular examples of information transfer in the animal world. Today, more than 60 years after its discovery, it still remains unknown how follower bees decode the information contained in the dance. In order to build a robotic honeybee that allows a deeper investigation of the communication process, we have recorded hundreds of videos of waggle dances. In this paper we analyze the statistics of visually captured high-precision dance trajectories of European honeybees (Apis mellifera carnica). The trajectories were produced using a novel automatic tracking system and represent the most detailed honeybee dance motion information available. Although honeybee dances seem very variable, some properties turned out to be invariant. We use these properties as a minimal set of parameters that enables us to model the honeybee dance motion. We provide a detailed statistical description of various dance properties that have not been characterized before and discuss the role of particular dance components in the communication process. PMID:21857906
NASA Astrophysics Data System (ADS)
Szücs, T.; Kiss, G. G.; Gyürky, Gy.; Halász, Z.; Fülöp, Zs.; Rauscher, T.
2018-01-01
The stellar reaction rates of radiative α-capture reactions on heavy isotopes are of crucial importance for γ-process network calculations. These rates are usually derived from statistical model calculations, which need to be validated, but the experimental database is very scarce. This paper presents the results of α-induced reaction cross section measurements on iridium isotopes, carried out for the first time close to the astrophysically relevant energy region. Thick-target yields of the 191Ir(α,γ)195Au, 191Ir(α,n)194Au, 193Ir(α,n)196mAu, and 193Ir(α,n)196Au reactions have been measured with the activation technique between Eα = 13.4 and 17 MeV. For the first time, the thick-target yield was determined with X-ray counting, which led to unprecedented sensitivity. From the measured thick-target yields, reaction cross sections are derived and compared with statistical model calculations. The recently suggested energy-dependent modification of the α + nucleus optical potential gives a good description of the experimental data.
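For orientation, the standard relation (not spelled out in the abstract) between a thick-target yield and the underlying cross section integrates the cross section over the beam's energy loss in the target; differentiating measured yields with respect to beam energy then recovers σ(E):

\[
Y(E_\alpha) \;=\; n_\mathrm{t} \int_{0}^{E_\alpha} \frac{\sigma(E)}{\mathrm{d}E/\mathrm{d}x}\,\mathrm{d}E ,
\qquad
\sigma(E_\alpha) \;=\; \frac{1}{n_\mathrm{t}}\,\frac{\mathrm{d}E}{\mathrm{d}x}\bigg|_{E_\alpha}\,\frac{\mathrm{d}Y}{\mathrm{d}E_\alpha},
\]

where \(n_\mathrm{t}\) is the number density of target nuclei and \(\mathrm{d}E/\mathrm{d}x\) is the stopping power of the target material.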
A Process for Capturing the Art of Systems Engineering
NASA Technical Reports Server (NTRS)
Owens, Clark V., III; Sekeres, Carrie; Roumie, Yasmeen
2016-01-01
There is both an art and a science to systems engineering. The science of systems engineering is effectively captured in processes and procedures, but the art is much more elusive. We propose a six-step process that can be applied to any systems engineering organization to create an environment in which the "art" of that organization can be captured, allowed to evolve collaboratively, and shared with all members of the organization. This paper details this process as it was applied to the NASA Launch Services Program (LSP) Integration Engineering Branch during a pilot program of Confluence, a Commercial Off The Shelf (COTS) wiki tool.
A preliminary investigation of cryogenic CO2 capture utilizing a reverse Brayton Cycle
NASA Astrophysics Data System (ADS)
Yuan, L. C.; Pfotenhauer, J. M.; Qiu, L. M.
2014-01-01
Utilizing CO2 capture and storage (CCS) technologies is a significant way to reduce carbon emissions from coal-fired power plants. Cryogenic CO2 capture (CCC) is an innovative and promising CO2 capture technology with apparent energy and environmental advantages over alternatives. A process for capturing CO2 from the flue gas of a coal-fired electrical power plant by cryogenically desublimating CO2 has been discussed and demonstrated theoretically. However, pressurizing the inlet flue gas to reduce the energy penalty of the cryogenic process leads to a more complex system. In this paper, a modified CCC system utilizing a reverse Brayton cycle is proposed, and the energy penalties of the two systems are compared theoretically.
Efficient electrochemical refrigeration power plant using natural gas with ∼100% CO2 capture
NASA Astrophysics Data System (ADS)
Al-musleh, Easa I.; Mallapragada, Dharik S.; Agrawal, Rakesh
2015-01-01
We propose an efficient Natural Gas (NG) based Solid Oxide Fuel Cell (SOFC) power plant equipped with ∼100% CO2 capture. The power plant uses a unique refrigeration-based process to capture and liquefy CO2 from the SOFC exhaust. The capture of CO2 is carried out via condensation and purification using two rectifying columns operating at different pressures. The uncondensed gas mixture, comprising relatively high-purity unconverted fuel, is recycled to the SOFC and found to boost the power generation of the SOFC by 22% compared to a stand-alone SOFC. If Liquefied Natural Gas (LNG) is available at the plant gate, the refrigeration available from its evaporation is used for CO2 Capture and Liquefaction (CO2CL). If NG is utilized, a Mixed Refrigerant (MR) vapor compression cycle supplies the refrigeration for CO2CL. Alternatively, the necessary refrigeration can be supplied by evaporating the captured liquid CO2 at a lower pressure; the CO2 is then compressed to supercritical pressures for pipeline transportation. From rigorous simulations, the power generation efficiency of the proposed processes is found to be 70-76% based on the lower heating value (LHV). The benefit of the proposed processes is evident when the 73% efficiency of a conventional SOFC-gas turbine power plant without CO2 capture is compared with the equivalent 71.2% efficiency of the proposed process with CO2CL.
Implementation of a Learning Design Run-Time Environment for the .LRN Learning Management System
ERIC Educational Resources Information Center
del Cid, Jose Pablo Escobedo; de la Fuente Valentin, Luis; Gutierrez, Sergio; Pardo, Abelardo; Kloos, Carlos Delgado
2007-01-01
The IMS Learning Design specification aims at capturing the complete learning flow of courses, without being restricted to a particular pedagogical model. Such flow description for a course, called a Unit of Learning, must be able to be reproduced in different systems using a so called run-time environment. In the last few years there has been…
USDA-ARS?s Scientific Manuscript database
Case Description: 1,081 newborn calves from a commercial dairy were tested for bovine viral diarrhea virus antigen by pooled RT-PCR as part of a screening program. Ear tissue from twenty six calves initially tested positive and 14 confirmed positive with antigen capture ELISA two weeks later (1.3...
Recent European Developments in Helicopters
NASA Technical Reports Server (NTRS)
1921-01-01
Descriptions are given of two captured helicopters, one driven by electric power, the other by a gasoline engine. An account is given of flight tests of the gasoline-powered vehicle. After 15 successful flight tests, the gasoline-powered vehicle crashed due to insufficient thrust. Also discussed are the applications of helicopters for military observation, for meteorological work, and for carrying radio antennas.
ERIC Educational Resources Information Center
Wilson, Erin M.; Green, Jordan R.; Weismer, Gary
2012-01-01
Purpose: The purpose of this investigation was to describe age- and consistency-related changes in the temporal characteristics of chewing in typically developing children between the ages of 4 and 35 months and adults using high-resolution optically based motion capture technology. Method: Data were collected from 60 participants (48 children, 12…
Towards an Interoperability Ontology for Software Development Tools
2003-03-01
The description of feature models was tied to the introduction of the Feature-Oriented Domain Analysis (FODA) [KANG90] approach in the late eighties... Feature-oriented domain analysis (FODA) is a domain analysis method developed at the Software... ...these obstacles was to construct a "pilot" ontology that is extensible. We applied the Feature-Oriented Domain Analysis approach to capture the...
40 CFR 63.3920 - What reports must I submit?
Code of Federal Regulations, 2010 CFR
2010-07-01
... limitation (including any periods when emissions bypassed the add-on control device and were diverted to the... month by emission capture systems and add-on control devices using Equations 1 and 1A through 1D of § 63... description of the CPMS. (v) The date of the latest CPMS certification or audit. (vi) The date and time that...
Vogel, Adam P; Block, Susan; Kefalianos, Elaina; Onslow, Mark; Eadie, Patricia; Barth, Ben; Conway, Laura; Mundt, James C; Reilly, Sheena
2015-04-01
To investigate the feasibility of adopting automated interactive voice response (IVR) technology for remotely capturing standardized speech samples from stuttering children. Participants were ten 6-year-old stuttering children. Their parents called a toll-free number from their homes and were prompted to elicit speech from their children using a standard protocol involving conversation, picture description, and games. The automated IVR system was implemented using off-the-shelf telephony software and delivered by a standard desktop computer. The software infrastructure utilizes voice over internet protocol. Speech samples were automatically recorded during the calls. Video recordings were simultaneously acquired in the home at the time of the call to evaluate the fidelity of the telephone-collected samples. Key outcome measures included syllables spoken, percentage of syllables stuttered, and an overall rating of stuttering severity on a 10-point scale. Data revealed a high level of relative reliability, in terms of intra-class correlation between the video- and telephone-acquired samples, on all outcome measures during the conversation task. Findings were less consistent for speech samples during picture description and games. Results suggest that IVR technology can be used successfully to automate remote capture of child speech samples.
40 CFR 408.140 - Applicability; description of the tuna processing subcategory.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 29 2011-07-01 2009-07-01 true Applicability; description of the tuna... Tuna Processing Subcategory § 408.140 Applicability; description of the tuna processing subcategory. The provisions of this subpart are applicable to discharges resulting from the processing of tuna. [40...
40 CFR 408.140 - Applicability; description of the tuna processing subcategory.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 30 2012-07-01 2012-07-01 false Applicability; description of the tuna... Tuna Processing Subcategory § 408.140 Applicability; description of the tuna processing subcategory. The provisions of this subpart are applicable to discharges resulting from the processing of tuna. [40...
40 CFR 408.140 - Applicability; description of the tuna processing subcategory.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 30 2013-07-01 2012-07-01 true Applicability; description of the tuna... Tuna Processing Subcategory § 408.140 Applicability; description of the tuna processing subcategory. The provisions of this subpart are applicable to discharges resulting from the processing of tuna. [40...
40 CFR 408.140 - Applicability; description of the tuna processing subcategory.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 29 2014-07-01 2012-07-01 true Applicability; description of the tuna... Tuna Processing Subcategory § 408.140 Applicability; description of the tuna processing subcategory. The provisions of this subpart are applicable to discharges resulting from the processing of tuna. [40...
40 CFR 408.140 - Applicability; description of the tuna processing subcategory.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 28 2010-07-01 2010-07-01 true Applicability; description of the tuna... Tuna Processing Subcategory § 408.140 Applicability; description of the tuna processing subcategory. The provisions of this subpart are applicable to discharges resulting from the processing of tuna. [40...
Standard services for the capture, processing, and distribution of packetized telemetry data
NASA Technical Reports Server (NTRS)
Stallings, William H.
1989-01-01
Standard functional services for the capture, processing, and distribution of packetized data are discussed with particular reference to the future implementation of packet processing systems, such as those for the Space Station Freedom. The major functions are listed under the following major categories: input processing, packet processing, and output processing. A functional block diagram of a packet data processing facility is presented, showing the distribution of the various processing functions as well as the primary data flow through the facility.
NASA Astrophysics Data System (ADS)
Simeone, C.; Maneta, M. P.; Holden, Z. A.; Dobrowski, S.; Sala, A.
2017-12-01
Recent studies indicate that increases in drought stress due to climate change will increase forest mortality across the western U.S. Although the ecohydrologic models used to study regional hydrologic stress response in forests have made rapid advances in recent years, they often incorporate simplified descriptions of the local hydrology, do not implement an explicit description of plant hydraulics, and do not permit study of the tradeoffs between the frequency, intensity, and accumulation of hydrologic stress in vegetation. We use the spatially distributed, mechanistic ecohydrologic model Ech2o, which effectively captures spatial variations in hydrology, energy exchanges, and regional climate, to simulate high-resolution tree hydraulics, estimating soil and leaf water potential, tree effective water conductance, and percent loss of conductivity in the xylem (PLC) at 250-m resolution and a sub-daily timestep across a topographically complex landscape. Tree hydraulics are simulated assuming a diffusive process in the soil-tree-atmosphere continuum. We use PLC to develop a vegetation dynamic stress index that scales plant-level processes to the landscape scale and takes into account the temporal accumulation of instantaneous hydraulic stress, growing season length, the frequency and duration of drought periods, and plant drought tolerance. The resulting index is interpreted as the probability of drought-induced tree mortality in a given location during the simulated period. We apply this index to regions of northern Idaho and western Montana. Results show that drought stress is highly spatially variable, sensitive to local-scale hydrologic and atmospheric conditions, and responsive to the recovery rate from individual hydraulic stress episodes.
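The abstract does not give the index's functional form; purely as an illustration of how accumulated exceedance of a tolerance threshold might be folded into a probability-like number, one could write:

    import numpy as np

    def dynamic_stress_index(plc, dt_days=1.0, tolerance=0.5, scale=100.0):
        """Invented illustrative index: accumulate PLC exceedances above a
        species' drought tolerance so that longer, more frequent, and more
        intense stress episodes push the index toward 1."""
        exceedance = np.clip(np.asarray(plc) - tolerance, 0.0, None)
        accumulated = exceedance.sum() * dt_days      # duration times intensity
        return 1.0 - np.exp(-accumulated / scale)     # map onto (0, 1)

    # One growing season of daily xylem percent-loss-of-conductivity (fractions):
    season = np.concatenate([np.full(60, 0.2), np.full(30, 0.7), np.full(60, 0.3)])
    print(f"{dynamic_stress_index(season):.3f}")      # mid-season drought raises it

All names, thresholds, and the exponential mapping here are hypothetical; the paper's actual index additionally weights growing-season length and recovery between episodes.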
Quantitative metrics for assessment of chemical image quality and spatial resolution
Kertesz, Vilmos; Cahill, John F.; Van Berkel, Gary J.
2016-02-28
Rationale: Currently, objective/quantitative descriptions of the quality and spatial resolution of mass spectrometry derived chemical images are not standardized. Development of these standardized metrics is required to objectively describe the chemical imaging capabilities of existing and/or new mass spectrometry imaging technologies. Such metrics would allow unbiased judgment of intra-laboratory advancement and/or inter-laboratory comparison for these technologies if used together with standardized surfaces. Methods: We developed two image metrics, viz., chemical image contrast (ChemIC), based on signal-to-noise related statistical measures on chemical image pixels, and corrected resolving power factor (cRPF), constructed from statistical analysis of mass-to-charge chronograms across features of interest in an image. These metrics, quantifying chemical image quality and spatial resolution, respectively, were used to evaluate chemical images of a model photoresist patterned surface collected using a laser ablation/liquid vortex capture mass spectrometry imaging system under different instrument operational parameters. Results: The calculated ChemIC and cRPF metrics determined in an unbiased fashion the relative ranking of chemical image quality obtained with the laser ablation/liquid vortex capture mass spectrometry imaging system. These rankings were used to show that both chemical image contrast and spatial resolution deteriorated with increasing surface scan speed, increased lane spacing, and decreasing size of surface features. Conclusions: ChemIC and cRPF, respectively, were developed and successfully applied for the objective description of chemical image quality and spatial resolution of chemical images collected from model surfaces using a laser ablation/liquid vortex capture mass spectrometry imaging system.
ESIP Information Quality Cluster (IQC)
NASA Technical Reports Server (NTRS)
Ramapriyan, H. K.; Peng, Ge; Moroni, David F.
2016-01-01
The Information Quality Cluster (IQC) within the Federation of Earth Science Information Partners (ESIP) was initially formed in 2011 and has evolved significantly over time. The current objectives of the IQC are to: 1. Actively evaluate community data quality best practices and standards; 2. Improve capture, description, discovery, and usability of information about data quality in Earth science data products; 3. Ensure producers of data products are aware of standards and best practices for conveying data quality, and that data providers, distributors, and intermediaries establish, improve, and evolve mechanisms to assist users in discovering and understanding data quality information; and 4. Consistently provide guidance to data managers and stewards on how best to implement data quality standards and best practices to ensure and improve the maturity of their data products. The activities of the IQC include: 1. Identification of additional needs for consistently capturing, describing, and conveying quality information through use case studies with broad and diverse applications; 2. Establishing and providing community-wide guidance on the roles and responsibilities of key players and stakeholders, including users and management; 3. Prototyping the conveyance of quality information to users in a more consistent, transparent, and digestible manner; 4. Establishing a baseline of standards and best practices for data quality; 5. Evaluating recommendations from NASA's DQWG in a broader context and proposing possible implementations; and 6. Engaging data providers, data managers, and data user communities as resources to improve these standards and best practices. Following the principles of openness of the ESIP Federation, the IQC invites all individuals interested in improving the capture, description, discovery, and usability of information about data quality in Earth science data products to participate in its activities.
Mukherjee, Sanjay; Kumar, Prashant; Hosseini, Ali; Yang, Aidong; Fennell, Paul
2014-02-20
Seven different types of gasification-based coal conversion processes for producing mainly electricity and in some cases hydrogen (H2), with and without carbon dioxide (CO2) capture, were compared on a consistent basis through simulation studies. The flowsheet for each process was developed in a chemical process simulation tool "Aspen Plus". The pressure swing adsorption (PSA), physical absorption (Selexol), and chemical looping combustion (CLC) technologies were separately analyzed for processes with CO2 capture. The performances of the above three capture technologies were compared with respect to energetic and exergetic efficiencies, and the level of CO2 emission. The effect of air separation unit (ASU) and gas turbine (GT) integration on the power output of all the CO2 capture cases is assessed. Sensitivity analysis was carried out for the CLC process (electricity-only case) to examine the effect of temperature and water-cooling of the air reactor on the overall efficiency of the process. The results show that, when only electricity production is considered, the case using CLC technology has an electrical efficiency 1.3% and 2.3% higher than the PSA and Selexol based cases, respectively. The CLC based process achieves an overall CO2 capture efficiency of 99.9% in contrast to 89.9% for PSA and 93.5% for Selexol based processes. The overall efficiency of the CLC case for combined electricity and H2 production is marginally higher (by 0.3%) than Selexol and lower (by 0.6%) than PSA cases. The integration between the ASU and GT units benefits all three technologies in terms of electrical efficiency. Furthermore, our results suggest that it is favorable to operate the air reactor of the CLC process at higher temperatures with excess air supply in order to achieve higher power efficiency.
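As a quick illustration of the capture-efficiency bookkeeping behind the 99.9%, 89.9%, and 93.5% figures quoted above, the sketch below computes captured CO2 as a fraction of generated CO2; the mass flows are placeholders, not values from the study:

```python
# Illustrative only: overall capture efficiency as captured CO2 divided by
# generated CO2. The flows below are made-up placeholders.
def capture_efficiency(co2_captured_kg_h: float, co2_generated_kg_h: float) -> float:
    return co2_captured_kg_h / co2_generated_kg_h

print(f"{capture_efficiency(999.0, 1000.0):.1%}")  # 99.9%, the CLC-level figure
```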
NASA Technical Reports Server (NTRS)
Ostroff, Aaron J.; Hoffler, Keith D.; Proffitt, Melissa S.; Brown, Philip W.; Phillips, Michael R.; Rivers, Robert A.; Messina, Michael D.; Carzoo, Susan W.; Bacon, Barton J.; Foster, John F.
1994-01-01
This paper describes the design, analysis, and nonlinear simulation results (batch and piloted) for a longitudinal controller which is scheduled to be flight-tested on the High-Alpha Research Vehicle (HARV). The HARV is an F-18 airplane modified for and equipped with multi-axis thrust vectoring. The paper includes a description of the facilities, a detailed review of the feedback controller design, linear analysis results of the feedback controller, a description of the feed-forward controller design, nonlinear batch simulation results, and piloted simulation results. Batch simulation results include maximum pitch stick agility responses, angle-of-attack (alpha) captures, and alpha regulation for full lateral stick rolls at several alphas. Piloted simulation results include task descriptions for several types of maneuvers, task guidelines, the corresponding Cooper-Harper ratings from three test pilots, and some pilot comments. The ratings show that desirable criteria are achieved for almost all of the piloted simulation tasks.
An extended Lagrangian method for subsonic flows
NASA Technical Reports Server (NTRS)
Liou, Meng-Sing; Loh, Ching Y.
1992-01-01
It is well known that fluid motion can be specified by either the Eulerian or the Lagrangian description. Most Computational Fluid Dynamics (CFD) developments over the last three decades have been based on the Eulerian description, and considerable progress has been made. In particular, the upwind methods, inspired and guided by the work of Godunov, have met with many successes in dealing with complex flows, especially where discontinuities exist. However, this shock-capturing property has proven to be accurate only when the discontinuity is aligned with one of the grid lines, since most upwind methods are strictly formulated in a 1-D framework and only formally extended to multi-dimensions. Consequently, the attractive property of crisp resolution of these discontinuities is lost, and research on genuinely multi-dimensional approaches has only recently been undertaken by several leading researchers. Nevertheless, these approaches are still based on the Eulerian description.
Gritsenko, Valeriya; Dailey, Eric; Kyle, Nicholas; Taylor, Matt; Whittacre, Sean; Swisher, Anne K
2015-01-01
To determine if a low-cost, automated motion analysis system using Microsoft Kinect could accurately measure shoulder motion and detect motion impairments in women following breast cancer surgery. Descriptive study of motion measured via 2 methods. Academic cancer center oncology clinic. 20 women (mean age = 60 yrs) were assessed for active and passive shoulder motions during a routine post-operative clinic visit (mean = 18 days after surgery) following mastectomy (n = 4) or lumpectomy (n = 16) for breast cancer. Participants performed 3 repetitions of active and passive shoulder motions on the side of the breast surgery. Arm motion was recorded using motion capture by Kinect for Windows sensor and on video. Goniometric values were determined from video recordings, while motion capture data were transformed to joint angles using 2 methods (body angle and projection angle). Correlation of motion capture with goniometry and detection of motion limitation. Active shoulder motion measured with low-cost motion capture agreed well with goniometry (r = 0.70-0.80), while passive shoulder motion measurements did not correlate well. Using motion capture, it was possible to reliably identify participants whose range of shoulder motion was reduced by 40% or more. Low-cost, automated motion analysis may be acceptable to screen for moderate to severe motion impairments in active shoulder motion. Automatic detection of motion limitation may allow quick screening to be performed in an oncologist's office and trigger timely referrals for rehabilitation.
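For readers unfamiliar with turning skeleton joint positions into shoulder angles, a minimal vector-angle computation might look like the following; the function and the choice of trunk reference are illustrative assumptions, not the authors' exact "body angle" or "projection angle" definitions:

```python
import numpy as np

def body_angle(shoulder, elbow, trunk_ref):
    """Angle (degrees) at the shoulder between the upper arm and a trunk
    reference direction, from 3-D joint positions such as a Kinect
    skeleton. A generic vector-angle sketch, not the paper's method."""
    arm = np.asarray(elbow, float) - np.asarray(shoulder, float)
    trunk = np.asarray(trunk_ref, float) - np.asarray(shoulder, float)
    cos_a = arm @ trunk / (np.linalg.norm(arm) * np.linalg.norm(trunk))
    return np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))

# e.g. arm held straight out to the side relative to a downward trunk axis:
print(body_angle([0, 0, 0], [0.3, 0, 0], [0, -0.5, 0]))  # 90.0
```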
Ramsay, Pam; Salisbury, Lisa G; Merriweather, Judith L; Huby, Guro; Rattray, Janice E; Hull, Alastair M; Brett, Stephen J; Mackenzie, Simon J; Murray, Gordon D; Forbes, John F; Walsh, Timothy Simon
2014-01-29
Increasing numbers of patients are surviving critical illness, but survival may be associated with a constellation of physical and psychological sequelae that can cause ongoing disability and reduced health-related quality of life. Limited evidence currently exists to guide the optimum structure, timing, and content of rehabilitation programmes. There is a need to both develop and evaluate interventions to support and expedite recovery during the post-ICU discharge period. This paper describes the construct development for a complex rehabilitation intervention intended to promote physical recovery following critical illness. The intervention is currently being evaluated in a randomised trial (ISRCTN09412438; funder Chief Scientist Office, Scotland). The intervention was developed using the Medical Research Council (MRC) framework for developing complex healthcare interventions. We ensured representation from a wide variety of stakeholders including content experts from multiple specialties, methodologists, and patient representation. The intervention construct was initially based on literature review, local observational and audit work, qualitative studies with ICU survivors, and brainstorming activities. Iterative refinement was aided by the publication of a National Institute for Health and Care Excellence guideline (No. 83), publicly available patient stories (Healthtalkonline), a stakeholder event in collaboration with the James Lind Alliance, and local piloting. Modelling and further work involved a feasibility trial and development of a novel generic rehabilitation assistant (GRA) role. Several rounds of external peer review during successive funding applications also contributed to development. The final construct for the complex intervention involved a dedicated GRA trained to pre-defined competencies across multiple rehabilitation domains (physiotherapy, dietetics, occupational therapy, and speech/language therapy), with specific training in post-critical illness issues. The intervention ran from ICU discharge to 3 months post-discharge, including inpatient and post-hospital discharge elements. Clear strategies to provide information to patients/families were included. A detailed taxonomy was developed to define and describe the processes undertaken, and to capture them during the trial. The detailed process measure description, together with a range of patient, health service, and economic outcomes, were successfully mapped on to the modified CONSORT recommendations for reporting non-pharmacologic trial interventions. The MRC complex intervention framework was an effective guide to developing a novel post-ICU rehabilitation intervention. Combining a clearly defined new healthcare role with a detailed taxonomy of process and activity enabled the intervention to be clearly described for the purpose of trial delivery and reporting. These data will be useful when interpreting the results of the randomised trial, will increase internal and external trial validity, and help others implement the intervention if the intervention proves clinically and cost effective.
2014-01-01
Background Increasing numbers of patients are surviving critical illness, but survival may be associated with a constellation of physical and psychological sequelae that can cause ongoing disability and reduced health-related quality of life. Limited evidence currently exists to guide the optimum structure, timing, and content of rehabilitation programmes. There is a need to both develop and evaluate interventions to support and expedite recovery during the post-ICU discharge period. This paper describes the construct development for a complex rehabilitation intervention intended to promote physical recovery following critical illness. The intervention is currently being evaluated in a randomised trial (ISRCTN09412438; funder Chief Scientist Office, Scotland). Methods The intervention was developed using the Medical Research Council (MRC) framework for developing complex healthcare interventions. We ensured representation from a wide variety of stakeholders including content experts from multiple specialties, methodologists, and patient representation. The intervention construct was initially based on literature review, local observational and audit work, qualitative studies with ICU survivors, and brainstorming activities. Iterative refinement was aided by the publication of a National Institute for Health and Care Excellence guideline (No. 83), publicly available patient stories (Healthtalkonline), a stakeholder event in collaboration with the James Lind Alliance, and local piloting. Modelling and further work involved a feasibility trial and development of a novel generic rehabilitation assistant (GRA) role. Several rounds of external peer review during successive funding applications also contributed to development. Results The final construct for the complex intervention involved a dedicated GRA trained to pre-defined competencies across multiple rehabilitation domains (physiotherapy, dietetics, occupational therapy, and speech/language therapy), with specific training in post-critical illness issues. The intervention ran from ICU discharge to 3 months post-discharge, including inpatient and post-hospital discharge elements. Clear strategies to provide information to patients/families were included. A detailed taxonomy was developed to define and describe the processes undertaken, and to capture them during the trial. The detailed process measure description, together with a range of patient, health service, and economic outcomes, were successfully mapped on to the modified CONSORT recommendations for reporting non-pharmacologic trial interventions. Conclusions The MRC complex intervention framework was an effective guide to developing a novel post-ICU rehabilitation intervention. Combining a clearly defined new healthcare role with a detailed taxonomy of process and activity enabled the intervention to be clearly described for the purpose of trial delivery and reporting. These data will be useful when interpreting the results of the randomised trial, will increase internal and external trial validity, and help others implement the intervention if the intervention proves clinically and cost effective. PMID:24476530
Probability Theory Plus Noise: Descriptive Estimation and Inferential Judgment.
Costello, Fintan; Watts, Paul
2018-01-01
We describe a computational model of two central aspects of people's probabilistic reasoning: descriptive probability estimation and inferential probability judgment. This model assumes that people's reasoning follows standard frequentist probability theory, but is subject to random noise. This random noise has a regressive effect in descriptive probability estimation, moving probability estimates away from normative probabilities and toward the center of the probability scale. This random noise has an anti-regressive effect in inferential judgment, however. These regressive and anti-regressive effects explain various reliable and systematic biases seen in people's descriptive probability estimation and inferential probability judgment. This model predicts that these contrary effects will tend to cancel out in tasks that involve both descriptive estimation and inferential judgment, leading to unbiased responses in those tasks. We test this model by applying it to one such task, described by Gallistel et al. Participants' median responses in this task were unbiased, agreeing with normative probability theory over the full range of responses. Our model captures the pattern of unbiased responses in this task, while simultaneously explaining the systematic biases away from normatively correct probabilities seen in other tasks.
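One standard way to write down the regressive effect described above (hedged, since the paper's exact formulation may differ): if each mental sample used to estimate an event of true probability \(p\) is misread with probability \(d\), the expected estimate is pulled toward the midpoint of the scale:

```latex
% Expected descriptive estimate under sample-flipping noise at rate d:
% regression toward 1/2, strongest for extreme p.
\mathbb{E}[\hat{p}] \;=\; (1 - 2d)\,p + d
```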
Mobile Laser Scanning for Indoor Modelling
NASA Astrophysics Data System (ADS)
Thomson, C.; Apostolopoulos, G.; Backes, D.; Boehm, J.
2013-10-01
The process of capturing and modelling buildings has gained increased focus in recent years with the rise of Building Information Modelling (BIM). At the heart of BIM is a process change for the construction and facilities management industries, whereby a BIM aids more collaborative working through better information exchange; as part of this process, Geomatic/Land Surveyors are not immune from the changes. Terrestrial laser scanning has been prescribed as the preferred method for rapidly capturing buildings for BIM geometry. This is a process change from a traditional measured building survey with a total station, and it is aided by the increasing acceptance of point cloud data being integrated with parametric building models in BIM tools such as Autodesk Revit or Bentley Architecture. Pilot projects carried out previously by the authors to investigate geometry capture and modelling for BIM confirmed the view of others that data capture with static laser scan setups is slow and very involved, requiring at least two people for efficiency. Indoor Mobile Mapping Systems (IMMS) present a possible solution to these issues, especially in time saved. Therefore this paper investigates their application as a capture device for BIM geometry creation over traditional static methods through a fit-for-purpose test.
40 CFR 408.150 - Applicability; description of the fish meal processing subcategory.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 30 2013-07-01 2012-07-01 true Applicability; description of the fish... CATEGORY Fish Meal Processing Subcategory § 408.150 Applicability; description of the fish meal processing... menhaden on the Gulf and Atlantic Coasts and the processing of anchovy on the West Coast into fish meal...
40 CFR 408.150 - Applicability; description of the fish meal processing subcategory.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 29 2011-07-01 2009-07-01 true Applicability; description of the fish... CATEGORY Fish Meal Processing Subcategory § 408.150 Applicability; description of the fish meal processing... menhaden on the Gulf and Atlantic Coasts and the processing of anchovy on the West Coast into fish meal...
40 CFR 408.150 - Applicability; description of the fish meal processing subcategory.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 29 2014-07-01 2012-07-01 true Applicability; description of the fish... CATEGORY Fish Meal Processing Subcategory § 408.150 Applicability; description of the fish meal processing... menhaden on the Gulf and Atlantic Coasts and the processing of anchovy on the West Coast into fish meal...
40 CFR 408.150 - Applicability; description of the fish meal processing subcategory.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 30 2012-07-01 2012-07-01 false Applicability; description of the fish... CATEGORY Fish Meal Processing Subcategory § 408.150 Applicability; description of the fish meal processing... menhaden on the Gulf and Atlantic Coasts and the processing of anchovy on the West Coast into fish meal...
40 CFR 408.150 - Applicability; description of the fish meal processing subcategory.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 28 2010-07-01 2010-07-01 true Applicability; description of the fish... CATEGORY Fish Meal Processing Subcategory § 408.150 Applicability; description of the fish meal processing... menhaden on the Gulf and Atlantic Coasts and the processing of anchovy on the West Coast into fish meal...
Engineered yeast for enhanced CO2 mineralization
Barbero, Roberto; Carnelli, Lino; Simon, Anna; Kao, Albert; Monforte, Alessandra d’Arminio; Riccò, Moreno; Bianchi, Daniele; Belcher, Angela
2014-01-01
In this work, a biologically catalyzed CO2 mineralization process for the capture of CO2 from point sources was designed, constructed at a laboratory scale, and, using standard chemical process scale-up protocols, modeled and evaluated at an industrial scale. A yeast display system in Saccharomyces cerevisiae was used to screen several carbonic anhydrase isoforms and mineralization peptides for their impact on CO2 hydration, CaCO3 mineralization, and particle settling rate. Enhanced rates for each of these steps in the CaCO3 mineralization process were confirmed using quantitative techniques in lab-scale measurements. The effect of these enhanced rates on the CO2 capture cost in an industrial-scale CO2 mineralization process using coal fly ash as the CaO source was evaluated. The model predicts that a process using bCA2 yeast and fly ash is ~10% more cost effective per ton of CO2 captured than a process with no biological molecules, a savings not realized by wild-type yeast and high-temperature-stable recombinant CA2 alone or in combination. The levelized cost of electricity for a power plant using this process was calculated, and scenarios in which this process compares favorably to CO2 capture by the MEA absorption process are presented. PMID:25289021
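The underlying chemistry is the classical CO2 hydration and precipitation chain, with carbonic anhydrase (CA) accelerating the hydration step and fly-ash-derived calcium supplying the cation; this is a textbook summary rather than a scheme taken from the paper:

```latex
\mathrm{CO_2 + H_2O \;\xrightarrow{\;CA\;}\; H_2CO_3
  \;\rightleftharpoons\; H^+ + HCO_3^-
  \;\rightleftharpoons\; 2\,H^+ + CO_3^{2-}},
\qquad
\mathrm{Ca^{2+} + CO_3^{2-} \;\longrightarrow\; CaCO_3\!\downarrow}
```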
Initiating an ergonomic analysis. A process for jobs with highly variable tasks.
Conrad, K M; Lavender, S A; Reichelt, P A; Meyer, F T
2000-09-01
Occupational health nurses play a vital role in addressing ergonomic problems in the workplace. Describing and documenting exposure to ergonomic risk factors is a relatively straightforward process in jobs in which the work is repetitive. In other types of work, the analysis becomes much more challenging because tasks may be repeated infrequently, or at irregular time intervals, or under different environmental and temporal conditions, thereby making it difficult to observe a "representative" sample of the work performed. This article describes a process used to identify highly variable job tasks for ergonomic analyses. The identification of tasks for ergonomic analysis was a two-step process involving interviews and a survey of firefighters and paramedics from a consortium of 14 suburban fire departments. The interviews were used to generate a list of frequently performed, physically strenuous job tasks and to capture clear descriptions of those tasks and associated roles. The goals of the survey were to confirm the interview findings across the entire target population and to quantify the frequency and degree of strenuousness of each task. In turn, the quantitative results from the survey were used to prioritize job tasks for simulation. Although this process was used to study firefighters and paramedics, the approach is likely to be suitable for many other types of occupations in which the tasks are highly variable in content and irregular in frequency.
Small Particles Intact Capture Experiment (SPICE)
NASA Technical Reports Server (NTRS)
Nishioka, Ken-Ji; Carle, G. C.; Bunch, T. E.; Mendez, David J.; Ryder, J. T.
1994-01-01
The Small Particles Intact Capture Experiment (SPICE) will develop technologies and engineering techniques necessary to capture nearly intact, uncontaminated cosmic and interplanetary dust particles (IDPs). Successful capture of such particles will benefit the exobiology and planetary science communities by providing particulate samples that may have survived unaltered since the formation of the solar system. Characterization of these particles may contribute fundamental data to our knowledge of how such particles could have formed into our planet Earth and, perhaps, contributed to the beginnings of life. The term 'uncontaminated' means that captured cosmic and IDP particles are free of organic contamination from the capture process, and the term 'nearly intact capture' means that their chemical and elemental components are not materially altered during capture. The key to capturing cosmic and IDP particles that are organic-contamination free and nearly intact is the capture medium. Initial screening of capture media included organic foams, multiple thin foil layers, and aerogel (a silica gel), but, with the exception of aerogel, the requirements of no contamination or nearly intact capture were not met. To ensure no contamination of particles in the capture process, high-purity aerogel was chosen. High-purity aerogel has high clarity (visual clearness), a useful quality for detection and recovery of embedded captured particles from the aerogel. P. Tsou at the Jet Propulsion Laboratory (JPL) originally described the use of aerogel for this purpose and reported laboratory test results. He has flown aerogel as a 'GAS-can Lid' payload on STS-47 and is evaluating the results. The Timeband Capture Cell Experiment (TICCE), a Eureca 1 experiment, is also flying aerogel and is scheduled for recovery in late April.
Contact Kinetics in Fractal Macromolecules.
Dolgushev, Maxim; Guérin, Thomas; Blumen, Alexander; Bénichou, Olivier; Voituriez, Raphaël
2015-11-13
We consider the kinetics of first contact between two monomers of the same macromolecule. Relying on a fractal description of the macromolecule, we develop an analytical method to compute the mean first contact time for various molecular sizes. In our theoretical description, the non-Markovian feature of monomer motion, arising from the interactions with the other monomers, is captured by accounting for the nonequilibrium conformations of the macromolecule at the very instant of first contact. This analysis reveals a simple scaling relation for the mean first contact time between two monomers, which involves only their equilibrium distance and the spectral dimension of the macromolecule, independently of its microscopic details. Our theoretical predictions are in excellent agreement with numerical stochastic simulations.
ResearchEHR: use of semantic web technologies and archetypes for the description of EHRs.
Robles, Montserrat; Fernández-Breis, Jesualdo Tomás; Maldonado, Jose A; Moner, David; Martínez-Costa, Catalina; Bosca, Diego; Menárguez-Tortosa, Marcos
2010-01-01
In this paper, we present the ResearchEHR project. It focuses on the usability of Electronic Health Record (EHR) sources and EHR standards for building advanced clinical systems. The aim is to support healthcare professionals, institutions, and authorities by providing a set of generic methods and tools for the capture, standardization, integration, description, and dissemination of health-related information. ResearchEHR combines several tools to manage EHRs at two different levels: the internal level, which deals with the normalization and semantic upgrading of existing EHRs by using archetypes, and the external level, which uses Semantic Web technologies to specify clinical archetypes for advanced EHR architectures and systems.
On the theoretical description of weakly charged surfaces.
Wang, Rui; Wang, Zhen-Gang
2015-03-14
It is widely accepted that the Poisson-Boltzmann (PB) theory provides a valid description for charged surfaces in the so-called weak coupling limit. Here, we show that the image charge repulsion creates a depletion boundary layer that cannot be captured by a regular perturbation approach. The correct weak-coupling theory must include the self-energy of the ion due to the image charge interaction. The image force qualitatively alters the double layer structure and properties, and gives rise to many non-PB effects, such as nonmonotonic dependence of the surface energy on concentration and charge inversion. In the presence of dielectric discontinuity, there is no limiting condition for which the PB theory is valid.
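For reference, the mean-field theory in question is the Poisson-Boltzmann equation; for a symmetric z:z electrolyte in reduced units \(\psi = e\phi/k_BT\) it reads (standard form; the correction discussed above adds an image-charge self-energy on top of this):

```latex
\nabla^2 \psi \;=\; \kappa^2 \sinh\psi,
\qquad
\kappa^2 \;=\; 8\pi \ell_B z^2 c_0 ,
```

with \(\ell_B\) the Bjerrum length and \(c_0\) the bulk salt concentration.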
Radical “Visual Capture” Observed in a Patient with Severe Visual Agnosia
Takaiwa, Akiko; Yoshimura, Hirokazu; Abe, Hirofumi; Terai, Satoshi
2003-01-01
We report the case of a 79-year-old female with visual agnosia due to brain infarction in the left posterior cerebral artery. She could recognize objects used in daily life rather well by touch (the number of objects correctly identified was 16 out of 20 presented objects), but she could not recognize them as well by vision (6 out of 20). In this case, it was expected that she would recognize them well when permitted to use touch and vision simultaneously. Our patient, however, performed poorly, producing 5 correct answers out of 20 in the Vision-and-Touch condition. It would be natural to think that visual capture functions when vision and touch provide contradictory information on concrete positions and shapes. However, in the present case, it functioned in spite of the visual deficit in recognizing objects. This should be called radical visual capture. By presenting detailed descriptions of her symptoms and neuropsychological and neuroradiological data, we clarify the characteristics of this type of capture. PMID:12719638
n-capture elements in the Sculptor dwarf spheroidal galaxy
NASA Astrophysics Data System (ADS)
Skúladóttir, Ása
2018-06-01
Sculptor is a well studied dwarf galaxy in the Local Group, which is dominated by an old stellar population (>10 Gyr) and is therefore an ideal system to study early chemical evolution. With high-resolution VLT/FLAMES spectra, R~20,000, we are able to get accurate abundances of several n-capture elements in ~100 stars, from both the lighter n-capture elements (Y) as well as the heavier ones, both tracers of the s-process (e.g. Ba) and the r-process (e.g. Eu). I will discuss the similarities and differences in the n-capture elements in Sculptor and the Milky Way, as well as other dwarf galaxies.
2013 R&D 100 Award: Movie-mode electron microscope captures nanoscale
Lagrange, Thomas; Reed, Bryan
2018-01-26
A new instrument developed by LLNL scientists and engineers, the Movie Mode Dynamic Transmission Electron Microscope (MM-DTEM), captures billionth-of-a-meter-scale images with frame rates more than 100,000 times faster than those of conventional techniques. The work was done in collaboration with a Pleasanton-based company, Integrated Dynamic Electron Solutions (IDES) Inc. Using this revolutionary imaging technique, a range of fundamental and technologically important material and biological processes can be captured in action, in complete billionth-of-a-meter detail, for the first time. The primary application of MM-DTEM is the direct observation of fast processes, including microstructural changes, phase transformations and chemical reactions, that shape real-world performance of nanostructured materials and potentially biological entities. The instrument could prove especially valuable in the direct observation of macromolecular interactions, such as protein-protein binding and host-pathogen interactions. While an earlier version of the technology, Single Shot-DTEM, could capture a single snapshot of a rapid process, MM-DTEM captures a multiframe movie that reveals complex sequences of events in detail. It is the only existing technology that can capture multiple electron microscopy images in the span of a single microsecond.
AMUC: Associated Motion capture User Categories.
Norman, Sally Jane; Lawson, Sian E M; Olivier, Patrick; Watson, Paul; Chan, Anita M-A; Dade-Robertson, Martyn; Dunphy, Paul; Green, Dave; Hiden, Hugo; Hook, Jonathan; Jackson, Daniel G
2009-07-13
The AMUC (Associated Motion capture User Categories) project consisted of building a prototype sketch retrieval client for exploring motion capture archives. High-dimensional datasets reflect the dynamic process of motion capture and comprise high-rate sampled data of a performer's joint angles; in response to multiple query criteria, these data can potentially yield different kinds of information. The AMUC prototype harnesses graphic input via an electronic tablet as a query mechanism, time and position signals obtained from the sketch being mapped to the properties of data streams stored in the motion capture repository. As well as proposing a pragmatic solution for exploring motion capture datasets, the project demonstrates the conceptual value of iterative prototyping in innovative interdisciplinary design. The AMUC team was composed of live performance practitioners and theorists conversant with a variety of movement techniques, bioengineers who recorded and processed motion data for integration into the retrieval tool, and computer scientists who designed and implemented the retrieval system and server architecture, scoped for Grid-based applications. Creative input on information system design and navigation, and digital image processing, underpinned implementation of the prototype, which has undergone preliminary trials with diverse users, allowing identification of rich potential development areas.
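A sketch-to-channel matcher of the kind the prototype implies can be mocked up with a resample-then-correlate scheme; this is a stand-in for whatever matching AMUC actually used, with all names illustrative:

```python
import numpy as np

def rank_channels(sketch, channels, n=128):
    """Rank stored motion-capture channels by similarity to a sketched
    query curve. Plain resampling plus Pearson correlation; a stand-in
    for the AMUC prototype's actual retrieval logic."""
    def resample(x):
        x = np.asarray(x, dtype=float)
        return np.interp(np.linspace(0, 1, n), np.linspace(0, 1, len(x)), x)
    q = resample(sketch)
    scores = {name: float(np.corrcoef(q, resample(sig))[0, 1])
              for name, sig in channels.items()}
    return sorted(scores, key=scores.get, reverse=True)

# e.g. rank_channels(tablet_y_positions, {"knee_angle": [...], "hip_angle": [...]})
```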
2013 R&D 100 Award: Movie-mode electron microscope captures nanoscale
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lagrange, Thomas; Reed, Bryan
2014-04-03
A new instrument developed by LLNL scientists and engineers, the Movie Mode Dynamic Transmission Electron Microscope (MM-DTEM), captures billionth-of-a-meter-scale images with frame rates more than 100,000 times faster than those of conventional techniques. The work was done in collaboration with a Pleasanton-based company, Integrated Dynamic Electron Solutions (IDES) Inc. Using this revolutionary imaging technique, a range of fundamental and technologically important material and biological processes can be captured in action, in complete billionth-of-a-meter detail, for the first time. The primary application of MM-DTEM is the direct observation of fast processes, including microstructural changes, phase transformations and chemical reactions, that shape real-world performance of nanostructured materials and potentially biological entities. The instrument could prove especially valuable in the direct observation of macromolecular interactions, such as protein-protein binding and host-pathogen interactions. While an earlier version of the technology, Single Shot-DTEM, could capture a single snapshot of a rapid process, MM-DTEM captures a multiframe movie that reveals complex sequences of events in detail. It is the only existing technology that can capture multiple electron microscopy images in the span of a single microsecond.
FUNDAMENTAL PROCESSES INVOLVED IN SO2 CAPTURE BY CALCIUM-BASED ADSORBENTS
The paper discusses the fundamental processes in sulfur dioxide (SO2) capture by calcium-based adsorbents for upper furnace, duct, and electrostatic precipitator (ESP) reaction sites. It examines the reactions in light of controlling mechanisms, effect of sorbent physical propert...
The minimum information about a genome sequence (MIGS) specification
Field, Dawn; Garrity, George; Gray, Tanya; Morrison, Norman; Selengut, Jeremy; Sterk, Peter; Tatusova, Tatiana; Thomson, Nicholas; Allen, Michael J; Angiuoli, Samuel V; Ashburner, Michael; Axelrod, Nelson; Baldauf, Sandra; Ballard, Stuart; Boore, Jeffrey; Cochrane, Guy; Cole, James; Dawyndt, Peter; De Vos, Paul; dePamphilis, Claude; Edwards, Robert; Faruque, Nadeem; Feldman, Robert; Gilbert, Jack; Gilna, Paul; Glöckner, Frank Oliver; Goldstein, Philip; Guralnick, Robert; Haft, Dan; Hancock, David; Hermjakob, Henning; Hertz-Fowler, Christiane; Hugenholtz, Phil; Joint, Ian; Kagan, Leonid; Kane, Matthew; Kennedy, Jessie; Kowalchuk, George; Kottmann, Renzo; Kolker, Eugene; Kravitz, Saul; Kyrpides, Nikos; Leebens-Mack, Jim; Lewis, Suzanna E; Li, Kelvin; Lister, Allyson L; Lord, Phillip; Maltsev, Natalia; Markowitz, Victor; Martiny, Jennifer; Methe, Barbara; Mizrachi, Ilene; Moxon, Richard; Nelson, Karen; Parkhill, Julian; Proctor, Lita; White, Owen; Sansone, Susanna-Assunta; Spiers, Andrew; Stevens, Robert; Swift, Paul; Taylor, Chris; Tateno, Yoshio; Tett, Adrian; Turner, Sarah; Ussery, David; Vaughan, Bob; Ward, Naomi; Whetzel, Trish; Gil, Ingio San; Wilson, Gareth; Wipat, Anil
2008-01-01
With the quantity of genomic data increasing at an exponential rate, it is imperative that these data be captured electronically, in a standard format. Standardization activities must proceed within the auspices of open-access and international working bodies. To tackle the issues surrounding the development of better descriptions of genomic investigations, we have formed the Genomic Standards Consortium (GSC). Here, we introduce the minimum information about a genome sequence (MIGS) specification with the intent of promoting participation in its development and discussing the resources that will be required to develop improved mechanisms of metadata capture and exchange. As part of its wider goals, the GSC also supports improving the ‘transparency’ of the information contained in existing genomic databases. PMID:18464787
Retarding friction versus white noise in the description of heavy ion fusion
NASA Astrophysics Data System (ADS)
Chushnyakova, Maria; Gontchar, Igor
2014-03-01
We performed modeling of the collision of two spherical nuclei resulting in capture. For this aim, stochastic differential equations are used with white or colored noise and with instant or retarding friction, respectively. The dissipative forces are proportional to the squared derivative of the strong nucleus-nucleus interaction potential (SnnP). The SnnP is calculated in the framework of the double-folding approach with the density-dependent M3Y NN forces. Calculations performed for the 28Si+144Sm reaction show that accounting for the fluctuations typically reduces the capture cross sections by not more than 10%. In contrast, the memory effects are found to enhance the cross section by about 20%.
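The white-noise (memoryless) limit of such a calculation reduces to an Euler-Maruyama loop. The sketch below is a generic 1-D Langevin integrator obeying the fluctuation-dissipation relation (units with k_B = 1), not the authors' double-folding model:

```python
import numpy as np

rng = np.random.default_rng(0)

def evolve(r, p, mass, pot_grad, gamma, temperature, dt, steps):
    """Euler-Maruyama integration of a 1-D Langevin equation with white
    noise: dp = (-dU/dr - gamma*p/m) dt + sqrt(2*gamma*T*dt) * N(0,1).
    `pot_grad` returns dU/dr; all parameters are illustrative."""
    for _ in range(steps):
        force = -pot_grad(r) - gamma * p / mass
        kick = np.sqrt(2.0 * gamma * temperature * dt) * rng.standard_normal()
        p += force * dt + kick
        r += (p / mass) * dt
    return r, p
```

Replacing the white-noise kick with an exponentially correlated (colored) noise, together with a matching memory kernel in the friction, gives the retarded variant compared in the paper.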
An innovative permanent total enclosure for blast cleaning and painting ships in drydock
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garland, C.; Lukey, M.
1997-12-31
This paper describes a new innovative Permanent Total Enclosure, or CAPE system, which encloses and captures emissions from blast cleaning and painting ship hulls in drydock. A description of the modular enclosure towers with unique seals is given in several figures. The support barge with its environmental control equipment, which includes a dust collector, VOC thermal oxidizer, dehumidifier, boiler, heating coils, air flow fans, and system controls, is also described. Data measurements from the first two applications rate this system at 100 percent capture efficiency, 99 percent VOC destruction efficiency, and 99.9 percent dust collection efficiency. Ships can be blast cleaned and painted using noncompliant paints and meet all state and federal standards for air emissions.
Piezoelectric and Magnetoelectric Thick Films for Fabricating Power Sources in Wireless Sensor Nodes
Priya, Shashank; Ryu, Jungho; Park, Chee-Sung; Oliver, Josiah; Choi, Jong-Jin; Park, Dong-Soo
2009-01-01
In this manuscript, we review the progress made in the synthesis of thick film-based piezoelectric and magnetoelectric structures for harvesting energy from mechanical vibrations and magnetic field. Piezoelectric compositions in the system Pb(Zr,Ti)O3–Pb(Zn1/3Nb2/3)O3 (PZNT) have shown promise for providing enhanced efficiency due to higher energy density and thus form the base of transducers designed for capturing the mechanical energy. Laminate structures of PZNT with magnetostrictive ferrite materials provide large magnitudes of magnetoelectric coupling and are being targeted to capture the stray magnetic field energy. We analyze the models used to predict the performance of the energy harvesters and present a full system description. PMID:22454590
i-TED: A novel concept for high-sensitivity (n,γ) cross-section measurements
NASA Astrophysics Data System (ADS)
Domingo-Pardo, C.
2016-07-01
A new method for measuring (n, γ) cross-sections aiming at enhanced signal-to-background ratio is presented. This new approach is based on the combination of the pulse-height weighting technique with a total energy detection system that features γ-ray imaging capability (i-TED). The latter allows one to exploit Compton imaging techniques to discriminate between true capture γ-rays arising from the sample under study and background γ-rays coming from contaminant neutron (prompt or delayed) captures in the surrounding environment. A general proof-of-concept detection system for this application is presented in this paper together with a description of the imaging method and a conceptual demonstration based on Monte Carlo simulations.
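The Compton discrimination rests on the usual scattering kinematics: a γ ray of energy E that deposits energy E₁ in the first detection plane is constrained to a cone whose opening angle θ satisfies the standard Compton relation:

```latex
\cos\theta \;=\; 1 \;-\; m_e c^2 \left( \frac{1}{E - E_1} - \frac{1}{E} \right)
```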
Designing berthing mechanisms for international compatibility
NASA Technical Reports Server (NTRS)
Winch, John; Gonzalez-Vallejo, Juan J.
1991-01-01
The paper examines the technological issues regarding common berthing interfaces for the Space Station Freedom and pressurized modules from U.S., European, and Japanese space programs. The development of the common berthing mechanism (CBM) is based on common requirements concerning specifications, launch environments, and the unique requirements of ESA's Man-Tended Free Flyer. The berthing mechanism is composed of an active and a passive half, a remote manipulator system, 4 capture-latch assemblies, 16 structural bolts, and a pressure gage to verify equalization. Extensive graphic and verbal descriptions of each element are presented emphasizing the capture-latch motion and powered-bolt operation. The support systems to complete the interface are listed, and the manufacturing requirements for consistent fabrication are discussed to ensure effective international development.
Lavis, John N; Moynihan, Ray; Oxman, Andrew D; Paulsen, Elizabeth J
2008-01-01
Background Previous efforts to produce case descriptions have typically not focused on the organizations that produce research evidence and support its use. External evaluations of such organizations have typically not been analyzed as a group to identify the lessons that have emerged across multiple evaluations. Case descriptions offer the potential for capturing the views and experiences of many individuals who are familiar with an organization, including staff, advocates, and critics. Methods We purposively sampled a subgroup of organizations from among those that participated in the second (interview) phase of the study and (once) from among other organizations with which we were familiar. We developed and pilot-tested a case description data collection protocol, and conducted site visits that included both interviews and documentary analyses. Themes were identified from among responses to semi-structured questions using a constant comparative method of analysis. We produced both a brief (one to two pages) written description and a video documentary for each case. Results We conducted 51 interviews as part of the eight site visits. Two organizational strengths were repeatedly cited by individuals participating in the site visits: use of an evidence-based approach (which was identified as being very time-consuming) and existence of a strong relationship between researchers and policymakers (which can be challenged by conflicts of interest). Two organizational weaknesses – a lack of resources and the presence of conflicts of interest – were repeatedly cited by individuals participating in the site visits. Participants offered two main suggestions for the World Health Organization (and other international organizations and networks): 1) mobilize one or more of government support, financial resources, and the participation of both policymakers and researchers; and 2) create knowledge-related global public goods. Conclusion The findings from our case descriptions, the first of their kind, intersect in interesting ways with the messages arising from two systematic reviews of the factors that increase the prospects for research use in policymaking. Strong relationships between researchers and policymakers bode well, given that such interactions appear to increase the prospects for research use. The time-consuming nature of an evidence-based approach, on the other hand, suggests the need for more efficient production processes that are 'quick and clean enough.' Our case descriptions and accompanying video documentaries provide a rich description of organizations supporting the use of research evidence, which can be drawn upon by those establishing or leading similar organizations, particularly in low- and middle-income countries. PMID:19091110
DOE Office of Scientific and Technical Information (OSTI.GOV)
Westendorf, Tiffany; Buddle, Stanlee; Caraher, Joel
The objective of this project is to design and build a bench-scale process for a novel phase-changing aminosilicone-based CO2-capture solvent. The project will establish scalability and technical and economic feasibility of using a phase-changing CO2-capture absorbent for post-combustion capture of CO2 from coal-fired power plants. The U.S. Department of Energy’s goal for Transformational Carbon Capture Technologies is the development of technologies available for demonstration by 2025 that can capture 90% of emitted CO2 with at least 95% CO2 purity for less than $40/tonne of CO2 captured. In the first budget period of the project, the bench-scale phase-changing CO2 capture process was designed using data and operating experience generated under a previous project (ARPA-E project DE-AR0000084). Sizing and specification of all major unit operations was completed, including detailed process and instrumentation diagrams. The system was designed to operate over a wide range of operating conditions to allow for exploration of the effect of process variables on CO2 capture performance. In the second budget period of the project, individual bench-scale unit operations were tested to determine the performance of each unit. Solids production was demonstrated in dry simulated flue gas across a wide range of absorber operating conditions, with single-stage CO2 conversion rates up to 75 mol%. Desorber operation was demonstrated in batch mode, resulting in desorption performance consistent with the equilibrium isotherms for the GAP-0/CO2 reaction. Important risks associated with the impact of gas humidity on solids consistency and of desorber temperature on thermal degradation were explored, and adjustments to the bench-scale process were made to address those effects. Corrosion experiments were conducted to support selection of suitable materials of construction for the major unit operations in the process. The bench-scale unit operations were assembled into a continuous system to support steady-state system testing. In the third budget period of the project, continuous system testing was conducted, including closed-loop operation of the absorber and desorber systems. Slurries of GAP-0/GAP-0 carbamate/water mixtures produced in the absorber were pumped successfully to the desorber unit, and regenerated solvent was returned to the absorber. A techno-economic analysis, EH&S risk assessment, and solvent manufacturability study were completed.
40 CFR 158.330 - Description of production process.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 23 2010-07-01 2010-07-01 false Description of production process. 158.330 Section 158.330 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS DATA REQUIREMENTS FOR PESTICIDES Product Chemistry § 158.330 Description of production process. If...
40 CFR 158.330 - Description of production process.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 24 2014-07-01 2014-07-01 false Description of production process. 158.330 Section 158.330 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS DATA REQUIREMENTS FOR PESTICIDES Product Chemistry § 158.330 Description of production process. If...
40 CFR 158.330 - Description of production process.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 25 2012-07-01 2012-07-01 false Description of production process. 158.330 Section 158.330 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS DATA REQUIREMENTS FOR PESTICIDES Product Chemistry § 158.330 Description of production process. If...
40 CFR 158.330 - Description of production process.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 24 2011-07-01 2011-07-01 false Description of production process. 158.330 Section 158.330 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS DATA REQUIREMENTS FOR PESTICIDES Product Chemistry § 158.330 Description of production process. If...
40 CFR 158.330 - Description of production process.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 25 2013-07-01 2013-07-01 false Description of production process. 158.330 Section 158.330 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS DATA REQUIREMENTS FOR PESTICIDES Product Chemistry § 158.330 Description of production process. If...
7 CFR 52.3751 - Product description.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 7 Agriculture 2 2010-01-01 2010-01-01 false Product description. 52.3751 Section 52.3751... MARKETING ACT OF 1946 PROCESSED FRUITS AND VEGETABLES, PROCESSED PRODUCTS THEREOF, AND CERTAIN OTHER PROCESSED FOOD PRODUCTS 1 United States Standards for Grades of Canned Ripe Olives 1 Product Description...
7 CFR 52.3181 - Product description.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 7 Agriculture 2 2010-01-01 2010-01-01 false Product description. 52.3181 Section 52.3181... MARKETING ACT OF 1946 PROCESSED FRUITS AND VEGETABLES, PROCESSED PRODUCTS THEREOF, AND CERTAIN OTHER PROCESSED FOOD PRODUCTS 1 United States Standards for Grades of Dried Prunes Product Description, Varietal...
7 CFR 52.1001 - Product description.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 7 Agriculture 2 2010-01-01 2010-01-01 false Product description. 52.1001 Section 52.1001... MARKETING ACT OF 1946 PROCESSED FRUITS AND VEGETABLES, PROCESSED PRODUCTS THEREOF, AND CERTAIN OTHER PROCESSED FOOD PRODUCTS 1 United States Standards for Grades of Dates Product Description, Styles, and...
A Measurable Model of the Creative Process in the Context of a Learning Process
ERIC Educational Resources Information Center
Ma, Min; Van Oystaeyen, Fred
2016-01-01
The authors' aim was to arrive at a measurable model of the creative process by putting creativity in the context of a learning process. They aimed to provide a rather detailed description of how creative thinking fits into a general description of the learning process without trying to go into an analysis of a biological description of the…
High-efficiency power production from natural gas with carbon capture
NASA Astrophysics Data System (ADS)
Adams, Thomas A.; Barton, Paul I.
A unique electricity generation process uses natural gas and solid oxide fuel cells at high electrical efficiency (74% HHV) and zero atmospheric emissions. The process contains a steam reformer heat-integrated with the fuel cells to provide the heat necessary for reforming. The fuel cells are powered with H2 and avoid carbon deposition issues. 100% CO2 capture is achieved downstream of the fuel cells with very little energy penalty using a multi-stage flash cascade process, where high-purity water is produced as a side product. Alternative reforming techniques such as CO2 reforming, autothermal reforming, and partial oxidation are considered. The capital and energy costs of the proposed process are considered to determine the levelized cost of electricity, which is low when compared to other similar carbon capture-enabled processes.
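The levelized cost of electricity referred to above is conventionally the ratio of discounted lifetime costs to discounted lifetime generation; the paper's exact accounting is not reproduced here, so take this as the standard textbook definition:

```latex
\mathrm{LCOE}
  \;=\;
  \frac{\sum_{t=0}^{T}\left(C^{\mathrm{cap}}_{t} + C^{\mathrm{O\&M}}_{t} + C^{\mathrm{fuel}}_{t}\right)(1+r)^{-t}}
       {\sum_{t=0}^{T} E_{t}\,(1+r)^{-t}}
```

with \(E_t\) the electricity generated in year \(t\) and \(r\) the discount rate.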
New teaching aid “Physical Methods of Medical Introscopy”
NASA Astrophysics Data System (ADS)
Ulin, S. E.
2017-01-01
A new teaching aid is described, covering methods for the reconstruction of hidden images by means of nuclear magnetic resonance, X-ray and gamma-ray, and ultrasonic tomography. Methods for the diagnostics and therapy of various oncological diseases using medical proton and ion beams, as well as neutron capture therapy, are also considered. The new teaching aid is intended for senior students and postgraduates.
3D Data Acquisition Platform for Human Activity Understanding
2016-03-02
3D data. The support for the acquisition of such research instrumentation has significantly facilitated our current and future research and education. In this project, we incorporated motion capture devices, 3D vision sensors, and EMG sensors to cross-validate … multimodality data acquisition, and address fundamental research problems of representation and invariant description of 3D data, human motion modeling and
Metabolic Host Responses to Malarial Infection during the Intraerythrocytic Developmental Cycle
2016-08-08
by reproducing the experimentally determined 1) stage-specific production of biomass components and their precursors in the parasite and 2) metabolite … uptake, allow for the prediction of cellular growth (biomass accumulation) and other phenotypic functions related to metabolism [9]. For example … our group to capture stage-specific growth phenotypes and biomass metabolite production [15]. Among these metabolic descriptions, only the network
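Growth predictions of the kind referred to in these fragments are typically obtained by flux-balance analysis, a linear program over the network's stoichiometric matrix. The toy network below (SciPy; made-up stoichiometry, not the malaria model) shows the shape of the calculation:

```python
import numpy as np
from scipy.optimize import linprog

# Toy flux-balance analysis: maximize the biomass flux v3 subject to the
# steady-state constraint S v = 0 and flux bounds. Two metabolites, three
# reactions (uptake -> conversion -> biomass); all numbers are made up.
S = np.array([[1, -1,  0],   # metabolite A: produced by v1, consumed by v2
              [0,  1, -1]])  # metabolite B: produced by v2, consumed by v3
bounds = [(0, 10), (0, None), (0, None)]  # uptake v1 capped at 10 units
res = linprog(c=[0, 0, -1], A_eq=S, b_eq=np.zeros(2), bounds=bounds)
print("max biomass flux:", -res.fun)  # -> 10.0, limited by uptake
```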
Relativistic Photoionization Computations with the Time Dependent Dirac Equation
2016-10-12
fields often occurs in the relativistic regime. A complete description of this phenomenon requires both relativistic and quantum mechanical treatment … photoionization, or other relativistic quantum electronics problems. While the Klein-Gordon equation captures much of the relevant physics, especially for moderately heavy ions (Z ≲ 137), it does neglect the spin polarization of the electron. This memo parallels [1], but replaces the Klein-Gordon
Schober, Daniel; Jacob, Daniel; Wilson, Michael; Cruz, Joseph A; Marcu, Ana; Grant, Jason R; Moing, Annick; Deborde, Catherine; de Figueiredo, Luis F; Haug, Kenneth; Rocca-Serra, Philippe; Easton, John; Ebbels, Timothy M D; Hao, Jie; Ludwig, Christian; Günther, Ulrich L; Rosato, Antonio; Klein, Matthias S; Lewis, Ian A; Luchinat, Claudio; Jones, Andrew R; Grauslys, Arturas; Larralde, Martin; Yokochi, Masashi; Kobayashi, Naohiro; Porzel, Andrea; Griffin, Julian L; Viant, Mark R; Wishart, David S; Steinbeck, Christoph; Salek, Reza M; Neumann, Steffen
2018-01-02
NMR is a widely used analytical technique with a growing number of repositories available. As a result, demands for a vendor-agnostic, open data format for long-term archiving of NMR data have emerged with the aim to ease and encourage sharing, comparison, and reuse of NMR data. Here we present nmrML, an open XML-based exchange and storage format for NMR spectral data. The nmrML format is intended to be fully compatible with existing NMR data for chemical, biochemical, and metabolomics experiments. nmrML can capture raw NMR data, spectral data acquisition parameters, and, where available, spectral metadata, such as chemical structures associated with spectral assignments. The nmrML format is compatible with pure-compound NMR data for reference spectral libraries as well as NMR data from complex biomixtures, i.e., metabolomics experiments. To facilitate format conversions, we provide nmrML converters for Bruker, JEOL and Agilent/Varian vendor formats. In addition, easy-to-use Web-based spectral viewing, processing, and spectral assignment tools that read and write nmrML have been developed. Software libraries and Web services for data validation are available for tool developers and end-users. The nmrML format has already been adopted for capturing and disseminating NMR data for small molecules by several open source data processing tools and metabolomics reference spectral libraries, e.g., serving as storage format for the MetaboLights data repository. The nmrML open access data standard has been endorsed by the Metabolomics Standards Initiative (MSI), and we here encourage user participation and feedback to increase usability and make it a successful standard.
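Because nmrML is plain XML, generic XML tooling suffices for a first look at a file. The sketch below uses lxml; the element names it looks for are assumptions to be checked against the nmrML schema, not a documented API:

```python
from lxml import etree

# Minimal inspection of an nmrML document; the file name is a placeholder.
tree = etree.parse("example.nmrML")
for elem in tree.iter():
    if not isinstance(elem.tag, str):
        continue  # skip comments and processing instructions
    name = etree.QName(elem).localname
    # Tag names below are guesses at interesting elements, not schema facts.
    if name in ("acquisition", "spectrumList", "chemicalCompound"):
        print(name, dict(elem.attrib))
```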
Carbon Mineralization by Aqueous Precipitation for Beneficial Use of CO 2 from Flue Gas
DOE Office of Scientific and Technical Information (OSTI.GOV)
Devenney, Martin; Gilliam, Ryan; Seeker, Randy
The objective of this project was to demonstrate an innovative process to mineralize CO2 from flue gas directly to reactive carbonates and maximize the value and versatility of its beneficial use products. The program scope includes the design, construction, and testing of a CO2 Conversion to Material Products (CCMP) Pilot Demonstration Plant utilizing CO2 from the flue gas of a power production facility in Moss Landing, CA, as well as flue gas from coal combustion. This final report details all development, analysis, design, and testing of the project. Also included in the final report are an updated Techno-Economic Analysis and CO2 Lifecycle Analysis. The subsystems included in the pilot demonstration plant are the mineralization subsystem, the Alkalinity Based on Low Energy (ABLE) subsystem, the waste calcium oxide processing subsystem, and the fiber cement board production subsystem. The fully integrated plant was proven to be capable of capturing CO2 from various sources (gas and coal) and mineralizing it into a reactive calcium carbonate binder and subsequently producing commercial-size (4 ft × 8 ft) fiber cement boards. The final report provides a description of the “as built” design of these subsystems and the results of the commissioning activities that have taken place to confirm operability. The report also discusses the results of the fully integrated operation of the facility. Fiber cement boards have been produced in this facility exclusively using reactive calcium carbonate from CO2 captured from flue gas. These boards meet all applicable US and Chinese acceptance standards. Use demonstrations for these boards are now underway.
7 CFR 52.801 - Product description.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 7 Agriculture 2 2010-01-01 2010-01-01 false Product description. 52.801 Section 52.801 Agriculture... PROCESSED FRUITS AND VEGETABLES, PROCESSED PRODUCTS THEREOF, AND CERTAIN OTHER PROCESSED FOOD PRODUCTS 1 United States Standards for Grades of Frozen Red Tart Pitted Cherries Product Description and Grades § 52...
Hydrodynamic description of spin Calogero-Sutherland model
NASA Astrophysics Data System (ADS)
Abanov, Alexander; Kulkarni, Manas; Franchini, Fabio
2009-03-01
We study a non-linear collective field theory for an integrable spin-Calogero-Sutherland model. The hydrodynamic description of this SU(2) model in terms of charge density, charge velocity, and spin currents is used to study non-perturbative solutions (solitons) and examine their correspondence with known quantum numbers of elementary excitations [1]. A conventional linear bosonization or harmonic approximation is not sufficient to describe, for example, the physics of spin-charge (non)separation. Therefore, we need this new collective bosonic field description that captures the effects of the band curvature. In the strong coupling limit [2] this model reduces to the integrable SU(2) Haldane-Shastry model. We study a non-linear coupling of left and right spin currents which form a Kac-Moody algebra. Our quantum hydrodynamic description for the spin case is an extension of the one found in the spinless version in [3]. [1] Y. Kato, T. Yamamoto, and M. Arikawa, J. Phys. Soc. Jpn. 66, 1954-1961 (1997). [2] A. Polychronakos, Phys. Rev. Lett. 70, 2329-2331 (1993). [3] A. G. Abanov and P. B. Wiegmann, Phys. Rev. Lett. 95, 076402 (2005).
Minimally inconsistent reasoning in Semantic Web.
Zhang, Xiaowang
2017-01-01
Reasoning with inconsistencies is an important issue for the Semantic Web, as imperfect information is unavoidable in real applications. For this, different paraconsistent approaches, due to their capacity to draw nontrivial conclusions while tolerating inconsistencies, have been proposed to reason with inconsistent description logic knowledge bases. However, existing paraconsistent approaches are often criticized for being too skeptical. To this end, this paper presents a non-monotonic paraconsistent version of description logic reasoning, called minimally inconsistent reasoning, where the inconsistencies tolerated in the reasoning are minimized so that more reasonable conclusions can be inferred. Some desirable properties are studied, which shows that the new semantics inherits advantages of both non-monotonic reasoning and paraconsistent reasoning. A complete and sound tableau-based algorithm, called multi-valued tableaux, is developed to capture the minimally inconsistent reasoning. In fact, the tableaux algorithm is designed as a framework for multi-valued DL, to allow for different underlying paraconsistent semantics, with the mere difference in the clash conditions. Finally, the complexity of minimally inconsistent description logic reasoning is shown to be on the same level as (classical) description logic reasoning. PMID:28750030
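The core idea of minimizing tolerated inconsistencies can be illustrated with a propositional toy model in Python (our sketch, not the paper's multi-valued tableau algorithm; all names are illustrative). Each atom carries independent "true" and "false" support, giving Belnap-style four-valued assignments; among the four-valued models of a classically inconsistent knowledge base, we prefer those with the fewest atoms valued both true and false:

    from itertools import product

    # four values as (true-support, false-support): T=(1,0), F=(0,1), B=(1,1), N=(0,0)
    VALUES = [(1, 0), (0, 1), (1, 1), (0, 0)]

    def neg(v):
        return (v[1], v[0])

    def disj(a, b):
        return (a[0] | b[0], a[1] & b[1])

    def designated(v):
        return v[0] == 1  # T and B count as "at least true"

    # toy knowledge base over atoms p, q: { p, ~p or q, ~q } -- classically inconsistent
    kb = [
        lambda m: m["p"],
        lambda m: disj(neg(m["p"]), m["q"]),
        lambda m: neg(m["q"]),
    ]

    atoms = ["p", "q"]
    models = []
    for vals in product(VALUES, repeat=len(atoms)):
        m = dict(zip(atoms, vals))
        if all(designated(f(m)) for f in kb):
            models.append(m)

    # prefer models with the fewest atoms valued B=(1,1): "minimally inconsistent"
    count_b = lambda m: sum(1 for v in m.values() if v == (1, 1))
    best = min(count_b(m) for m in models)
    print([m for m in models if count_b(m) == best])

For this knowledge base no four-valued model is inconsistency-free, but the preferred models localize the contradiction to a single atom, which is the flavor of conclusion the minimally inconsistent semantics is after.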
Owen, Yvonne; Amory, Jonathan R
2011-01-01
Traditional techniques used to capture New World monkeys, such as net capture, can induce high levels of acute stress detrimental to welfare. Alternatively, training nonhuman animals via operant conditioning to voluntarily participate in husbandry and/or veterinary practices is accepted as a humane process that can reduce stress and improve welfare. This study details the use of operant conditioning using positive reinforcement training (PRT) and target training to train a family of 5 captive red-bellied tamarins (Saguinus labiatus) in a wildlife park to voluntarily enter a transportation box and remain calm for 1 min after 54 training sessions. Observations of 2 unrelated net-capture processes provided measures of locomotion and vocalizations as indicators of stress behavior that were compared with those of the trained tamarins. Net-captured monkeys exhibited rapid erratic locomotion and emitted long, high-frequency vocalizations during capture whereas the trained tamarins exhibited minimal locomotion and emitted only 4 brief vocalizations (root mean square 35 dB) during capture. This indicates that the use of PRT considerably reduced potential for stress and improved welfare during the capture and containment of the tamarins. Copyright © Taylor & Francis Group, LLC
r-process enhanced metal-poor stars
NASA Astrophysics Data System (ADS)
Cowan, John; Sneden, Christopher; Lawler, James E.; Den Hartog, Elizabeth A.
Abundance observations indicate the presence of rapid neutron-capture (i.e., r-process) elements in old Galactic halo and globular cluster stars. These observations provide insight into the nature of the earliest generations of stars in the Galaxy - the progenitors of the halo stars - responsible for neutron-capture synthesis of the heavy elements. The large star-to-star scatter observed in the abundances of neutron-capture element/iron ratios at low metallicities - which diminishes with increasing metallicity or [Fe/H] - suggests that the formation of these heavy elements (presumably in certain types of supernovae) was rare in the early Galaxy. The stellar abundances also indicate a change from the r-process to the slow neutron-capture (i.e., s-) process at higher metallicities in the Galaxy and provide insight into Galactic chemical evolution. Finally, the detection of thorium and uranium in halo and globular cluster stars offers an independent age-dating technique that can put lower limits on the age of the Galaxy, and hence the Universe.
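The thorium chronometry mentioned here rests on the 14.05 Gyr half-life of 232Th. Referencing thorium to a stable r-process element such as Eu, the standard form of the age estimate (our restatement of a widely used relation, not a result of this abstract) is

    \Delta t = 46.7\,\big[\log(\mathrm{Th}/\mathrm{Eu})_{0} - \log(\mathrm{Th}/\mathrm{Eu})_{\mathrm{obs}}\big]\ \mathrm{Gyr},

where the coefficient is t_{1/2}/\log_{10} 2 = 14.05/0.30103 Gyr and the initial ratio (Th/Eu)_0 must be supplied by r-process calculations; an analogous relation holds for uranium.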
Electron emission from transfer ionization reaction in 30 keV amu−1 He2+ on Ar collision
NASA Astrophysics Data System (ADS)
Amaya-Tapia, A.; Antillón, A.; Estrada, C. D.
2018-06-01
A model is presented that describes the transfer ionization process in H{e}2++Ar collision at a projectile energy of 30 keV amu‑1. It is based on a semiclassical independent-particle close-coupling method that yields a reasonable agreement between calculated and experimental values of the total single-ionization and single-capture cross sections. It is found that the transfer ionization reaction is predominantly carried out through simultaneous capture and ionization, rather than by sequential processes. The transfer-ionization differential cross section in energy that is obtained satisfactorily reproduces the global behavior of the experimental data. Additionally, the probabilities of capture and ionization as function of the impact parameter for H{e}2++A{r}+ and H{e}++A{r}+ collisions are calculated, as far as we know, for the first time. The results suggest that the model captures essential elements that describe the two-electron transfer ionization process and could be applied to systems and processes of two electrons.
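The impact-parameter bookkeeping behind such independent-particle calculations is compact enough to sketch in Python (toy probability shapes, not the paper's close-coupling results):

    import numpy as np

    b = np.linspace(0.0, 10.0, 2000)        # impact parameter (a.u.)
    p_cap = np.exp(-b**2 / 4.0)             # toy single-electron capture probability
    p_ion = 0.3 * np.exp(-b / 2.0)          # toy single-electron ionization probability

    # independent-particle transfer ionization: one electron captured, the other
    # ionized; the factor 2 counts the two equivalent-electron arrangements.
    # (A minimal version: full treatments also include survival factors.)
    p_ti = 2.0 * p_cap * p_ion
    sigma_ti = 2.0 * np.pi * np.trapz(b * p_ti, b)   # cross section (a.u.^2)
    print(sigma_ti)

The same 2π∫b P(b) db quadrature yields the single-capture and single-ionization cross sections from their respective probabilities.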
Bohl, Vivian; van den Bos, Wouter
2012-01-01
Traditional theory of mind (ToM) accounts of social cognition have formed the basis of most studies in the social cognitive neurosciences. However, in recent years, the need to go beyond traditional ToM accounts for understanding real life social interactions has become all the more pressing. At the same time it remains unclear whether alternative accounts, such as interactionism, can yield a sufficient description and explanation of social interactions. We argue that instead of considering ToM and interactionism as mutually exclusive opponents, they should be integrated into a more comprehensive account of social cognition. We draw on dual process models of social cognition that contrast two different types of social cognitive processing. The first type (labeled Type 1) refers to processes that are fast, efficient, stimulus-driven, and relatively inflexible. The second type (labeled Type 2) refers to processes that are relatively slow, cognitively laborious, flexible, and may involve conscious control. We argue that while interactionism captures aspects of social cognition mostly related to Type 1 processes, ToM is more focused on those based on Type 2 processes. We suggest that real life social interactions are rarely based on either Type 1 or Type 2 processes alone. On the contrary, we propose that in most cases both types of processes are simultaneously involved and that social behavior may be sustained by the interplay between these two types of processes. Finally, we discuss how the new integrative framework can guide experimental research on social interaction. PMID:23087631
Kaltenbrunner, Monica; Bengtsson, Lars; Mathiassen, Svend Erik; Engström, Maria
2017-03-24
During the past decade, the concept of Lean has spread rapidly within the healthcare sector, but there is a lack of instruments that can measure staff's perceptions of Lean adoption. Thus, the aim of the present study was to develop a questionnaire measuring Lean in healthcare, based on Liker's description of Lean, by adapting an existing instrument developed for the service sector. A mixed-method design was used. Initially, items from the service sector instrument were categorized according to Liker's 14 principles describing Lean within four domains: philosophy, processes, people and partners and problem-solving. Items were lacking for three of Liker's principles and were therefore developed de novo. Think-aloud interviews were conducted with 12 healthcare staff from different professions to contextualize and examine the face validity of the questionnaire prototype. Thereafter, the adjusted questionnaire's psychometric properties were assessed on the basis of a cross-sectional survey among 386 staff working in primary care. The think-aloud interviews led to adjustments in the questionnaire to better suit a healthcare context, and the number of items was reduced. Confirmatory factor analysis of the adjusted questionnaire showed a generally acceptable correspondence with Liker's description of Lean. Internal consistency, measured using Cronbach's alpha, for the factors in Liker's description of Lean was 0.60 for the factor people and partners, and over 0.70 for the three other factors. Test-retest reliability measured by the intra-class correlation coefficient ranged from 0.77 to 0.88 for the four factors. We designed a questionnaire capturing staff's perceptions of Lean adoption in healthcare on the basis of Liker's description. This Lean in Healthcare Questionnaire (LiHcQ) showed generally acceptable psychometric properties, which supports its usability for measuring Lean adoption in healthcare. We suggest that further research focus on verifying the usability of LiHcQ in other healthcare settings, and on adjusting the instrument if needed.
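Cronbach's alpha, the internal-consistency measure reported above, is straightforward to compute from a respondents-by-items matrix. A minimal Python sketch (standard formula; the data below are illustrative, not the study's):

    import numpy as np

    def cronbach_alpha(X):
        """X: array of shape (n_respondents, k_items) of item scores."""
        X = np.asarray(X, dtype=float)
        k = X.shape[1]
        item_vars = X.var(axis=0, ddof=1).sum()   # sum of item variances
        total_var = X.sum(axis=1).var(ddof=1)     # variance of total scores
        return k / (k - 1.0) * (1.0 - item_vars / total_var)

    # illustrative 5-respondent, 4-item Likert data
    X = np.array([[4, 5, 4, 4],
                  [3, 3, 2, 3],
                  [5, 5, 5, 4],
                  [2, 3, 2, 2],
                  [4, 4, 3, 4]])
    print(round(cronbach_alpha(X), 2))

Values above roughly 0.70, as for three of the four factors here, are conventionally read as acceptable internal consistency.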
40 CFR 410.30 - Applicability; description of the low water use processing subcategory.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 28 2010-07-01 2010-07-01 true Applicability; description of the low... PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS TEXTILE MILLS POINT SOURCE CATEGORY Low Water Use Processing Subcategory § 410.30 Applicability; description of the low water use processing...
Membrane contactors for CO2 capture processes - critical review
NASA Astrophysics Data System (ADS)
Nogalska, Adrianna; Trojanowska, Anna; Garcia-Valls, Ricard
2017-07-01
The use of membrane contactors in industrial processes is widespread, and lately they have been applied to CO2 capture, mainly for gas purification or emission reduction. The membrane contactor provides a high contact surface area, so the size of the absorber unit decreases significantly, which is an important factor for commercialization. Research has been carried out on the use of novel materials for membrane production and on improvements to absorbent solutions. The present review surveys progress in membrane contactor systems for CO2 capture processes, covering solutions to ceramic membrane wetting, a comparative study of different polymers used for fabrication, and methods of enzyme immobilization for biocomposite membranes. Information on a variety of absorbent solutions is also provided.
NASA Astrophysics Data System (ADS)
Dubovichenko, Sergey; Dzhazairov-Kakhramanov, Albert
We have studied the neutron-capture reaction 8Li(n,γ)9Li and its role in primordial nucleosynthesis. The n + 8Li → 9Li + γ reaction is of significant astrophysical interest because it enters one of the variants of the chain of primordial nucleosynthesis processes of the Universe and thermonuclear reactions in type II supernovae. Furthermore, we consider the 9Be(p,γ)10B reaction in the astrophysical energy range in the modified potential cluster model (MPCM) with splitting of orbital states according to Young tableaux and, in some cases, with forbidden states (FS). The reaction 9Be(p,γ)10B plays an important role in primordial and stellar nucleosynthesis of light elements in the p shell. Hydrogen burning in second-generation stars occurs via the proton-proton (pp) chain and the CNO cycle, with the 9Be(p,γ)10B reaction serving as an intermediate link between these cycles. Furthermore, the possibility of describing the available experimental data for the total cross-sections of radiative neutron capture on 10Be at thermal and astrophysical energies has been shown. This reaction is part of one of the variants of the chain of primordial nucleosynthesis of the Universe, through which elements with mass A > 11-12 may be formed. Results on the thermonuclear proton-capture reaction on 10B at ultralow, i.e., astrophysical, energies are presented as well. The possibility of describing the experimental data for the astrophysical S-factor of proton radiative capture on 16O to the ground state (GS) of 17F was considered in the frame of the MPCM with FS and classification of states according to Young tableaux. It was shown that, on the basis of the E1 transitions from the states of p16O scattering to the GS of 17F in the p16O channel, it is generally possible to explain the measured cross-sections at astrophysical energies.
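For the charged-particle channels above, the astrophysical S-factor is the cross section with the dominant Coulomb-barrier energy dependence factored out (standard definition, restated here for reference):

    S(E) = E\,\sigma(E)\,\exp(2\pi\eta), \qquad \eta = \frac{Z_1 Z_2 e^2}{\hbar v},

where η is the Sommerfeld parameter and v the relative velocity; because S(E) varies slowly, it is the natural quantity to extrapolate to astrophysical energies. Neutron capture has no Coulomb barrier, so those results are quoted directly as cross sections.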
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mishra, Amit; Li, Fanxing; Santiso, Erik
Energy and global climate change are two grand challenges for modern society. An urgent need exists for the development of clean and efficient energy conversion processes. The chemical looping strategy, which utilizes regenerable oxygen carriers (OCs) to indirectly convert carbonaceous fuels via redox reactions, is considered by the U.S. Department of Energy (USDOE) to be one of the more promising approaches for CO2 capture. To date, most long-term chemical looping operations have been conducted using gaseous fuels, even though direct conversion of coal is more desirable from both economics and CO2 capture viewpoints. The main challenges for direct coal conversion reside in the stringent requirements on oxygen carrier performance. In addition, coal char and volatile compounds are more challenging to convert than gaseous fuels. A promising approach for direct conversion of coal is the so-called chemical looping with oxygen uncoupling (CLOU) technique. In the CLOU process, a metal oxide that decomposes at the looping temperature and releases oxygen to the gas phase is used as the OC. The overarching objective of this project was to discover the fundamental principles for rational design and optimization of oxygen carriers (OCs) in coal chemical looping combustion (CLC) processes. It directly addresses Topic Area B of the funding opportunity announcement (FOA) in terms of “predictive description of the phase behavior and mechanical properties” of “mixed metal oxide” based OCs and rational development of new OC materials with superior functionality. This was achieved through studies exploring i) iron-containing mixed-oxide composites as oxygen carriers for CLOU, ii) Ca1-xAxMnO3-δ (A = Sr and Ba) as oxygen carriers for CLOU, iii) CaMn1-xBxO3-δ (B = Al, V, Fe, Co, and Ni) as oxygen carriers for CLOU, and iv) vacancy creation energy in Mn-containing perovskites as an indicator for chemical looping with oxygen uncoupling.
Role of attentional tags in working memory-driven attentional capture.
Kuo, Chun-Yu; Chao, Hsuan-Fu
2014-08-01
Recent studies have demonstrated that the contents of working memory capture attention when performing a visual search task. However, it remains an intriguing and unresolved question whether all kinds of items stored in working memory capture attention. The present study investigated this issue by manipulating the attentional tags (target or distractor) associated with information maintained in working memory. The results showed that working memory-driven attentional capture is a flexible process, and that attentional tags associated with items stored in working memory do modulate attentional capture. When items were tagged as a target, they automatically captured attention; however, when items were tagged as a distractor, attentional capture was reduced.
NEW NEUTRON-CAPTURE MEASUREMENTS IN 23 OPEN CLUSTERS. I. THE r-PROCESS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Overbeek, Jamie C.; Friel, Eileen D.; Jacobson, Heather R., E-mail: joverbee@indiana.edu
2016-06-20
Neutron-capture elements, those with Z > 35, are the least well understood in terms of nucleosynthesis and formation environments. The rapid neutron-capture, or r-process, elements are formed in the environments and/or remnants of massive stars, while the slow neutron-capture, or s-process, elements are primarily formed in low-mass AGB stars. These elements can provide much information about Galactic star formation and enrichment, but observational data are limited. We have assembled a sample of 68 stars in 23 open clusters that we use to probe abundance trends for six neutron-capture elements (Eu, Gd, Dy, Mo, Pr, and Nd) with cluster age and location in the disk of the Galaxy. In order to keep our analysis as homogeneous as possible, we use an automated synthesis fitting program, which also enables us to measure multiple (3-10) lines for each element. We find that the pure r-process elements (Eu, Gd, and Dy) have positive trends with increasing cluster age, while the mixed r- and s-process elements (Mo, Pr, and Nd) have insignificant trends consistent with zero. Pr, Nd, Eu, Gd, and Dy have similar, slight (although mostly statistically significant) gradients of ∼0.04 dex kpc−1. The mixed elements also appear to have nonlinear relationships with R_GC.
Nuclear structure and weak rates of heavy waiting point nuclei under rp-process conditions
NASA Astrophysics Data System (ADS)
Nabi, Jameel-Un; Böyükata, Mahmut
2017-01-01
The structure and the weak-interaction-mediated rates of the heavy waiting point (WP) nuclei 80Zr, 84Mo, 88Ru, 92Pd and 96Cd along the N = Z line were studied within the interacting boson model-1 (IBM-1) and the proton-neutron quasi-particle random phase approximation (pn-QRPA). The energy levels of the N = Z WP nuclei were calculated by fitting the essential parameters of the IBM-1 Hamiltonian, and their geometric shapes were predicted by plotting potential energy surfaces (PESs). Half-lives, continuum electron capture rates, positron decay rates, electron capture cross sections of WP nuclei, energy rates of β-delayed protons and their emission probabilities were later calculated using the pn-QRPA. The calculated Gamow-Teller strength distributions were compared with previous calculations. We present positron decay and continuum electron capture rates on these WP nuclei under rp-process conditions using the same model. For rp-process conditions, the calculated total weak rates are twice the Skyrme HF+BCS+QRPA rates for 80Zr. For the remaining nuclei the two calculations compare well. The electron capture rates are significant and compete well with the corresponding positron decay rates under rp-process conditions. The findings of the present study support the conclusion that electron capture rates form an integral part of the weak rates under rp-process conditions and play an important role in nuclear model calculations.
Image charge effects on electron capture by dust grains in dusty plasmas.
Jung, Y D; Tawara, H
2001-07-01
Electron-capture processes by negatively charged dust grains from hydrogenic ions in dusty plasmas are investigated in accordance with the classical Bohr-Lindhard model. The attractive interaction between the electron in a hydrogenic ion and its own image charge inside the dust grain is included to obtain the total interaction energy between the electron and the dust grain. The electron-capture radius is determined by the total interaction energy and the kinetic energy of the released electron in the frame of the projectile dust grain. The classical straight-line trajectory approximation is applied to the motion of the ion in order to visualize the electron-capture cross section as a function of the impact parameter, kinetic energy of the projectile ion, and dust charge. It is found that the image charge inside the dust grain plays a significant role in the electron-capture process near the surface of the dust grain. The electron-capture cross section is found to be quite sensitive to the collision energy and dust charge.
Mars Atmospheric Capture and Gas Separation
NASA Technical Reports Server (NTRS)
Muscatello, Anthony; Santiago-Maldonado, Edgardo; Gibson, Tracy; Devor, Robert; Captain, James
2011-01-01
The Mars atmospheric capture and gas separation project is selecting, developing, and demonstrating techniques to capture and purify Martian atmospheric gases for their utilization in the production of hydrocarbons, oxygen, and water in ISRU systems. Trace gases must be separated from Martian atmospheric gases to provide pure CO2 to processing elements. In addition, other Martian gases, such as nitrogen and argon, occur in concentrations high enough to be useful as buffer gas and should be captured as well. To achieve these goals, highly efficient gas separation processes will be required. These gas separation techniques are also required across various areas within the ISRU project to support various consumable production processes. The development of innovative gas separation techniques will evaluate the current state of the art for the gas separation required, with the objective of demonstrating and developing light-weight, low-power methods for gas separation. Gas separation requirements include, but are not limited to, the selective separation of: (1) methane and water from unreacted carbon oxides (CO2, CO) and hydrogen typical of a Sabatier-type process, (2) carbon oxides and water from unreacted hydrogen from a Reverse Water-Gas Shift process, (3) carbon oxides from oxygen from a trash/waste processing reaction, and (4) helium from hydrogen or oxygen from a propellant scavenging process. Potential technologies for the separations include freezers, selective membranes, selective solvents, polymeric sorbents, zeolites, and new technologies. This paper and presentation summarize the results of an extensive literature review and laboratory evaluations of candidate technologies for the capture and separation of CO2 and other relevant gases.
Ötes, Ozan; Flato, Hendrik; Winderl, Johannes; Hubbuch, Jürgen; Capito, Florian
2017-10-10
The protein A capture step is the main cost-driver in downstream processing, with high attrition costs, especially when protein A resin is not used to the end of its lifetime. Here we describe a feasibility study transferring a batch downstream process to a hybrid process, aimed at replacing batch protein A capture chromatography with a continuous capture step while leaving the polishing steps unchanged, to minimize the required process adaptations relative to the batch process. 35 g of antibody were purified using the hybrid approach, resulting in comparable product quality and step yield compared to the batch process. Productivity for the protein A step could be increased by up to 420%, reducing buffer amounts by 30-40% and showing robustness for at least 48 h of continuous run time. Additionally, to enable its potential application in a clinical trial manufacturing environment, cost of goods was compared for the protein A step between the hybrid process and the batch process, showing a 300% cost reduction, depending on processed volumes and batch cycles. Copyright © 2017 Elsevier B.V. All rights reserved.
Membrane Process to Capture CO2 from Coal-Fired Power Plant Flue Gas
DOE Office of Scientific and Technical Information (OSTI.GOV)
Merkel, Tim; Wei, Xiaotong; Firat, Bilgen
2012-03-31
This final report describes work conducted for the U.S. Department of Energy National Energy Technology Laboratory (DOE NETL) on development of an efficient membrane process to capture carbon dioxide (CO2) from power plant flue gas (award number DE-NT0005312). The primary goal of this research program was to demonstrate, in a field test, the ability of a membrane process to capture up to 90% of the CO2 in coal-fired flue gas, and to evaluate the potential of a full-scale version of the process to perform this separation with less than a 35% increase in the levelized cost of electricity (LCOE). Membrane Technology and Research (MTR) conducted this project in collaboration with Arizona Public Services (APS), who hosted a membrane field test at their Cholla coal-fired power plant, and the Electric Power Research Institute (EPRI) and WorleyParsons (WP), who performed a comparative cost analysis of the proposed membrane CO2 capture process. The work conducted for this project included membrane and module development, slipstream testing of commercial-sized modules with natural gas and coal-fired flue gas, process design optimization, and a detailed systems and cost analysis of a membrane retrofit to a commercial power plant. The Polaris membrane developed over a number of years by MTR represents a step-change improvement in CO2 permeance compared to previous commercial CO2-selective membranes. During this project, membrane optimization work resulted in a further doubling of the CO2 permeance of the Polaris membrane while maintaining the CO2/N2 selectivity. This is an important accomplishment because increased CO2 permeance directly impacts the membrane skid cost and footprint: a doubling of CO2 permeance halves the skid cost and footprint. In addition to providing high CO2 permeance, flue gas CO2 capture membranes must be stable in the presence of contaminants including SO2. Laboratory tests showed no degradation in Polaris membrane performance during two months of continuous operation in a simulated flue gas environment containing up to 1,000 ppm SO2. A successful slipstream field test at the APS Cholla power plant was conducted with commercial-size Polaris modules during this project. This field test is the first demonstration of stable performance by commercial-sized membrane modules treating actual coal-fired power plant flue gas. Process design studies show that selective recycle of CO2 using a countercurrent membrane module with air as a sweep stream can double the concentration of CO2 in coal flue gas with little energy input. This pre-concentration of CO2 by the sweep membrane reduces the minimum energy of CO2 separation in the capture unit by up to 40% for coal flue gas. Variations of this design may be even more promising for CO2 capture from NGCC flue gas, in which the CO2 concentration can be increased from 4% to 20% by selective sweep recycle. EPRI and WP conducted a systems and cost analysis of a base case MTR membrane CO2 capture system retrofitted to the AEP Conesville Unit 5 boiler. Some of the key findings from this study and a sensitivity analysis performed by MTR include:
- The MTR membrane process can capture 90% of the CO2 in coal flue gas and produce high-purity CO2 (>99%) ready for sequestration. CO2 recycle to the boiler appears feasible with minimal impact on boiler performance; however, further study by a boiler OEM is recommended.
- For a membrane process built today using a combination of slight feed compression, permeate vacuum, and current compression equipment costs, the membrane capture process can be competitive with the base case MEA process at 90% CO2 capture from a coal-fired power plant. The incremental LCOE for the base case membrane process is about equal to that of a base case MEA process, within the uncertainty in the analysis.
- With advanced membranes (5,000 gpu for CO2 and 50 for CO2/N2), operating with no feed compression and low-cost CO2 compression equipment, an incremental LCOE of $33/MWh at 90% capture can be achieved (40% lower than the advanced MEA case).
- Even with lower cost compression, it appears unlikely that a membrane process using high feed compression (>5 bar) can be competitive with amine absorption, due to the capital cost and energy consumption of this equipment. Similarly, low vacuum pressure (<0.2 bar) cannot be used, due to the poor efficiency and high cost of this equipment.
- High membrane permeance is important to reduce the capital cost and footprint of the membrane unit. CO2/N2 selectivity is less important because it is too costly to generate a pressure ratio where high selectivity can be useful.
- A potential cost “sweet spot” exists for use of membrane-based technology if 50-70% CO2 capture is acceptable. There is a minimum in the cost of CO2 avoided per ton that membranes can deliver at 60% CO2 capture, which is 20% lower than the cost at 90% capture. Membranes operating with no feed compression are best suited for lower capture rates.
- Currently, it appears that the biggest hurdle to the use of membranes for post-combustion CO2 capture is compression equipment cost. An alternative approach is to use sweep membranes in parallel with another CO2 capture technology that does not require feed compression or vacuum equipment. Hybrid designs that utilize sweep membranes for selective CO2 recycle show potential to significantly reduce the minimum energy of CO2 separation.
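A back-of-envelope calculation shows why permeance dominates skid cost: at a fixed CO2 flow and partial-pressure driving force, the required membrane area scales inversely with permeance, so doubling permeance halves the area. The Python sketch below uses illustrative numbers of our own choosing, not figures from the report (1 gpu = 3.35e-10 mol/(m² s Pa) is the standard unit conversion):

    GPU = 3.35e-10  # 1 gpu in mol/(m^2 s Pa)

    def membrane_area(q_co2_mol_s, permeance_gpu, dp_co2_pa):
        """Area (m^2) needed to pass q_co2_mol_s of CO2 at the given
        CO2 partial-pressure difference across the membrane."""
        return q_co2_mol_s / (permeance_gpu * GPU * dp_co2_pa)

    q = 3150.0   # mol/s: assumed 90% capture from a unit emitting ~500 t CO2/h
    dp = 1.0e4   # Pa: assumed ~0.1 bar CO2 partial-pressure driving force

    print(membrane_area(q, 1000, dp))  # ~9.4e5 m^2
    print(membrane_area(q, 2000, dp))  # ~4.7e5 m^2 -- half the area at double permeance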
A Reversed Photosynthesis-like Process for Light-Triggered CO2 Capture, Release, and Conversion.
Wang, Dingguan; Liao, Shenglong; Zhang, Shiming; Wang, Yapei
2017-06-22
Materials for CO2 capture have been extensively exploited for climate governance and gas separation. However, their regeneration faces the problems of high energy cost and secondary CO2 contamination. Herein, a reversed photosynthesis-like process is proposed, in which CO2 is absorbed in darkness and released under light illumination. The process is likely supplementary to natural photosynthesis of plants, in which, on the contrary, CO2 is released during the night. Remarkably, the material used here is able to capture 9.6 wt.% CO2 relative to its active component. Repeatable CO2 capture at room temperature and release under light irradiation ensure its convenient and cost-effective regeneration. Furthermore, CO2 released from the system is successfully converted into a stable compound in tandem with specific catalysts. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Technical Reports Server (NTRS)
Chien, Steve; Kandt, R. Kirk; Roden, Joseph; Burleigh, Scott; King, Todd; Joy, Steve
1992-01-01
Scientific data preparation is the process of extracting usable scientific data from raw instrument data. This task involves noise detection (and subsequent noise classification and flagging or removal), extracting data from compressed forms, and construction of derivative or aggregate data (e.g. spectral densities or running averages). A software system called PIPE provides intelligent assistance to users developing scientific data preparation plans using a programming language called Master Plumber. PIPE provides this assistance capability by using a process description to create a dependency model of the scientific data preparation plan. This dependency model can then be used to verify syntactic and semantic constraints on processing steps to perform limited plan validation. PIPE also provides capabilities for using this model to assist in debugging faulty data preparation plans. In this case, the process model is used to focus the developer's attention upon those processing steps and data elements that were used in computing the faulty output values. Finally, the dependency model of a plan can be used to perform plan optimization and runtime estimation. These capabilities allow scientists to spend less time developing data preparation procedures and more time on scientific analysis tasks. Because the scientific data processing modules (called fittings) evolve to match scientists' needs, issues regarding maintainability are of prime importance in PIPE. This paper describes the PIPE system and describes how issues in maintainability affected the knowledge representation used in PIPE to capture knowledge about the behavior of fittings.
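The debugging use of such a dependency model can be sketched in a few lines: given a map from each data product to the steps and inputs it was computed from, the ancestors of a faulty output are the only places worth inspecting. The Python below is an illustrative sketch in this spirit, not PIPE's Master Plumber representation; all names are invented:

    # illustrative dependency model: data product -> immediate inputs/steps
    deps = {
        "running_average": ["despiked_series"],
        "spectral_density": ["despiked_series", "window_config"],
        "despiked_series": ["raw_instrument_data", "noise_flags"],
        "noise_flags": ["raw_instrument_data"],
    }

    def suspects(faulty_output, deps):
        """Collect every upstream step/data element that could explain
        a faulty output value (depth-first traversal of the model)."""
        seen, stack = set(), [faulty_output]
        while stack:
            node = stack.pop()
            for parent in deps.get(node, []):
                if parent not in seen:
                    seen.add(parent)
                    stack.append(parent)
        return seen

    print(suspects("spectral_density", deps))
    # {'despiked_series', 'window_config', 'raw_instrument_data', 'noise_flags'}

The same graph, traversed in the opposite direction, supports the validation and runtime-estimation uses described in the abstract.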
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Zhen; Wong, Michael; Gupta, Mayank
The Rice University research team developed a hybrid carbon dioxide (CO2) absorption process combining absorber and stripper columns, using a high-surface-area ceramic foam gas-liquid contactor for enhanced mass transfer and utilizing waste heat for regeneration. This integrated absorber/desorber arrangement will reduce space requirements, an important factor for retrofitting existing coal-fired power plants with CO2 capture technology. As described in this report, we performed an initial analysis to estimate the technical and economic feasibility of the process. A one-dimensional (1D) CO2 absorption column was fabricated to measure the hydrodynamic and mass transfer characteristics of the ceramic foam. A bench-scale prototype was constructed to implement the complete CO2 separation process and tested to study various aspects of fluid flow in the process. A model was developed to simulate the two-dimensional (2D) fluid flow and optimize the CO2 capture process. Test results were used to develop a final technoeconomic analysis and identify the most appropriate absorbent as well as optimum operating conditions to minimize capital and operating costs. Finally, a technoeconomic study was performed to assess the feasibility of integrating the process into a 600 megawatt electric (MWe) coal-fired power plant. With process optimization, a COE of $82/MWh can be achieved using our integrated absorber/desorber CO2 capture technology, which is very close to DOE's target of no more than a 35% increase in COE with CCS. An environmental, health, and safety (EH&S) assessment of the capture process indicated no significant concern in terms of EH&S effects or legislative compliance.
Landman, Adam; Emani, Srinivas; Carlile, Narath; Rosenthal, David I; Semakov, Simon; Pallin, Daniel J; Poon, Eric G
2015-01-02
Photographs are important tools to record, track, and communicate clinical findings. Mobile devices with high-resolution cameras are now ubiquitous, giving clinicians the opportunity to capture and share images from the bedside. However, secure and efficient ways to manage and share digital images are lacking. The aim of this study is to describe the implementation of a secure application for capturing and storing clinical images in the electronic health record (EHR), and to describe initial user experiences. We developed CliniCam, a secure Apple iOS (iPhone, iPad) application that allows for user authentication, patient selection, image capture, image annotation, and storage of images as a Portable Document Format (PDF) file in the EHR. We leveraged our organization's enterprise service-oriented architecture to transmit the image file from CliniCam to our enterprise clinical data repository. There is no permanent storage of protected health information on the mobile device. CliniCam also required connection to our organization's secure WiFi network. Resident physicians from emergency medicine, internal medicine, and dermatology used CliniCam in clinical practice for one month. They were then asked to complete a survey on their experience. We analyzed the survey results using descriptive statistics. Twenty-eight physicians participated and 19/28 (68%) completed the survey. Of the respondents who used CliniCam, 89% found it useful or very useful for clinical practice and easy to use, and wanted to continue using the app. Respondents provided constructive feedback on location of the photos in the EHR, preferring to have photos embedded in (or linked to) clinical notes instead of storing them as separate PDFs within the EHR. Some users experienced difficulty with WiFi connectivity which was addressed by enhancing CliniCam to check for connectivity on launch. CliniCam was implemented successfully and found to be easy to use and useful for clinical practice. CliniCam is now available to all clinical users in our hospital, providing a secure and efficient way to capture clinical images and to insert them into the EHR. Future clinical image apps should more closely link clinical images and clinical documentation and consider enabling secure transmission over public WiFi or cellular networks.
NASA Astrophysics Data System (ADS)
Shope, C. L.; Maharjan, G. R.; Tenhunen, J.; Seo, B.; Kim, K.; Riley, J.; Arnhold, S.; Koellner, T.; Ok, Y. S.; Peiffer, S.; Kim, B.; Park, J.-H.; Huwe, B.
2014-02-01
Watershed-scale modeling can be a valuable tool to aid in quantification of water quality and yield; however, several challenges remain. In many watersheds, it is difficult to adequately quantify hydrologic partitioning. Data scarcity is prevalent, the accuracy of spatially distributed meteorology is difficult to quantify, forest encroachment and land use issues are common, and surface water and groundwater abstractions substantially modify watershed-based processes. Our objective is to assess the capability of the Soil and Water Assessment Tool (SWAT) model to capture event-based and long-term monsoonal rainfall-runoff processes in complex mountainous terrain. To accomplish this, we developed a unique quality-control, gap-filling algorithm for interpolation of high-frequency meteorological data. We used a novel multi-location, multi-optimization calibration technique to improve estimates of catchment-wide hydrologic partitioning. The interdisciplinary model was calibrated against a unique combination of statistical, hydrologic, and plant growth metrics. Our results indicate scale-dependent sensitivity of hydrologic partitioning and a substantial influence of engineered features. The addition of hydrologic and plant growth objective functions identified the importance of culverts in catchment-wide flow distribution. While this study shows the challenges of applying the SWAT model to complex terrain and extreme environments, it also shows that incorporating anthropogenic features into modeling scenarios can enhance our understanding of the hydroecological impact.
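A minimal version of a quality-controlled gap-filling step for high-frequency meteorological series might look like the following Python sketch (our assumptions, not the authors' algorithm): short gaps are linearly interpolated, while long gaps are left missing for more careful treatment.

    import numpy as np

    def fill_short_gaps(y, max_gap=3):
        """Linearly interpolate interior runs of NaN no longer than max_gap samples."""
        y = np.asarray(y, dtype=float).copy()
        n, i = len(y), 0
        while i < n:
            if np.isnan(y[i]):
                j = i
                while j < n and np.isnan(y[j]):
                    j += 1
                # fill only interior, sufficiently short gaps
                if i > 0 and j < n and (j - i) <= max_gap:
                    y[i:j] = np.interp(np.arange(i, j), [i - 1, j], [y[i - 1], y[j]])
                i = j
            else:
                i += 1
        return y

    print(fill_short_gaps([1.0, np.nan, np.nan, 4.0, np.nan]))
    # [1. 2. 3. 4. nan] -- the trailing boundary gap is deliberately left unfilled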
NASA Astrophysics Data System (ADS)
Hoose, C.; Lohmann, U.; Stier, P.; Verheggen, B.; Weingartner, E.
2008-04-01
The global aerosol-climate model ECHAM5-HAM has been extended by an explicit treatment of cloud-borne particles. Two additional modes for in-droplet and in-crystal particles are introduced, which are coupled to the number of cloud droplet and ice crystal concentrations simulated by the ECHAM5 double-moment cloud microphysics scheme. Transfer, production, and removal of cloud-borne aerosol number and mass by cloud droplet activation, collision scavenging, aqueous-phase sulfate production, freezing, melting, evaporation, sublimation, and precipitation formation are taken into account. The model performance is demonstrated and validated with observations of the evolution of total and interstitial aerosol concentrations and size distributions during three different mixed-phase cloud events at the alpine high-altitude research station Jungfraujoch (Switzerland). Although the single-column simulations cannot be compared one-to-one with the observations, the governing processes in the evolution of the cloud and aerosol parameters are captured qualitatively well. High scavenged fractions are found during the presence of liquid water, while the release of particles during the Bergeron-Findeisen process results in low scavenged fractions after cloud glaciation. The observed coexistence of liquid and ice, which might be related to cloud heterogeneity at subgrid scales, can only be simulated in the model when assuming nonequilibrium conditions.
Positron follow-up in liquid water: I. A new Monte Carlo track-structure code.
Champion, C; Le Loirec, C
2006-04-07
When biological matter is irradiated by charged particles, a wide variety of interactions occur, which lead to a deep modification of the cellular environment. To understand the fine structure of the microscopic distribution of energy deposits, Monte Carlo event-by-event simulations are particularly suitable. However, the development of these track-structure codes needs accurate interaction cross sections for all the electronic processes: ionization, excitation, positronium formation and even elastic scattering. Under these conditions, we have recently developed a Monte Carlo code for positrons in water, the latter being commonly used to simulate the biological medium. All the processes are studied in detail via theoretical differential and total cross-section calculations performed by using partial wave methods. Comparisons with existing theoretical and experimental data in terms of stopping powers, mean energy transfers and ranges show very good agreement. Moreover, thanks to the theoretical description of positronium formation, we have access, for the first time, to the complete kinematics of the electron capture process. The present Monte Carlo code is thus able to describe the detailed positronium history, which will provide useful information for medical imaging (like positron emission tomography), where improvements are needed to define the tumoural volumes with the best accuracy.
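The core loop of an event-by-event track-structure code is generic enough to sketch in Python (toy cross-section values, not the paper's water data): sample a free path from the total interaction cross section, then choose which process occurs in proportion to its partial cross section.

    import numpy as np

    rng = np.random.default_rng(42)

    def mc_step(sigma_partial, n_density):
        """One event-by-event transport step.
        sigma_partial: partial cross sections per process (cm^2/molecule)
        n_density: molecular number density (molecules/cm^3)
        Returns (free path in cm, index of the sampled process)."""
        sigma_partial = np.asarray(sigma_partial, dtype=float)
        mu_total = n_density * sigma_partial.sum()   # inverse mean free path (1/cm)
        path = -np.log(rng.random()) / mu_total      # exponentially distributed free path
        k = rng.choice(len(sigma_partial), p=sigma_partial / sigma_partial.sum())
        return path, k

    # assumed toy values: ionization, excitation, positronium formation, elastic
    sigmas = [1.0e-16, 0.5e-16, 0.3e-16, 2.0e-16]
    print(mc_step(sigmas, 3.34e22))  # liquid water: ~3.34e22 molecules/cm^3

In a full code, each sampled process then updates the positron energy and direction from the corresponding differential cross sections, and the loop repeats until the particle thermalizes or forms positronium.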
40 CFR 63.1013 - Sampling connection systems standards.
Code of Federal Regulations, 2011 CFR
2011-07-01
... (CONTINUED) National Emission Standards for Equipment Leaks-Control Level 1 § 63.1013 Sampling connection... container are not required to be collected or captured. (c) Equipment design and operation. Each closed... process fluid to a process; or (3) Be designed and operated to capture and transport all the purged...
Manual for the Construction of a Mercury Capture System for Use in Gold Shops
Download a manual for the construction of a mercury capture system for use in gold shops, with detailed information for constructing a device to capture mercury aerosol particles emitted from gold shops that process gold doré, a gold-mercury amalgam.
Singh, Rajinder P.; Dahe, Ganpat J.; Dudeck, Kevin W.; ...
2014-12-31
Sustainable reliance on hydrocarbon feedstocks for energy generation requires CO₂ separation technology development for energy-efficient carbon capture from industrial mixed gas streams. High-temperature H₂-selective glassy polymer membranes are an attractive option for energy-efficient H₂/CO₂ separations in advanced power production schemes with integrated carbon capture. They enable high overall process efficiencies by providing energy-efficient CO₂ separations at process-relevant operating conditions and, correspondingly, minimized parasitic energy losses. Polybenzimidazole (PBI)-based materials have demonstrated commercially attractive H₂/CO₂ separation characteristics and exceptional tolerance to the operating conditions and chemical environments of hydrocarbon-fuel-derived synthesis gas (syngas). To realize a commercially attractive carbon capture technology based on these PBI materials, development of high-performance, robust PBI hollow fiber membranes (HFMs) is required. In this work, we discuss outcomes of our recent efforts to demonstrate and optimize the fabrication and performance of PBI HFMs for use in pre-combustion carbon capture schemes. These efforts have resulted in PBI HFMs with commercially attractive fabrication protocols, defect-minimized structures, and commercially attractive permselectivity characteristics at IGCC syngas process-relevant conditions. The H₂/CO₂ separation performance of these PBI HFMs at realistic process conditions exceeds that of any other polymeric system reported to date.
Battaglia, Maurizio; Gottsmann, J.; Carbone, D.; Fernandez, J.
2008-01-01
Time-dependent gravimetric measurements can detect subsurface processes long before magma flow leads to earthquakes or other eruption precursors. The ability of gravity measurements to detect subsurface mass flow is greatly enhanced if gravity measurements are analyzed and modeled with ground-deformation data. Obtaining the maximum information from microgravity studies requires careful evaluation of the layout of network benchmarks, the gravity environmental signal, and the coupling between gravity changes and crustal deformation. When changes in the system under study are fast (hours to weeks), as in hydrothermal systems and restless volcanoes, continuous gravity observations at selected sites can help to capture many details of the dynamics of the intrusive sources. Despite the instrumental effects, mainly caused by atmospheric temperature, results from monitoring at Mt. Etna volcano show that continuous measurements are a powerful tool for monitoring and studying volcanoes. Several analytical and numerical mathematical models can be used to fit gravity and deformation data. Analytical models offer a closed-form description of the volcanic source. In principle, this allows one to readily infer the relative importance of the source parameters. In active volcanic sites such as Long Valley caldera (California, U.S.A.) and Campi Flegrei (Italy), careful use of analytical models and high-quality data sets has produced good results. However, the simplifications that make analytical models tractable might result in misleading volcanological interpretations, particularly when the real crust surrounding the source is far from the homogeneous/isotropic assumption. Using numerical models allows consideration of more realistic descriptions of the sources and of the crust where they are located (e.g., vertical and lateral mechanical discontinuities, complex source geometries, and topography). Applications at Teide volcano (Tenerife) and Campi Flegrei demonstrate the importance of this more realistic description in gravity calculations. © 2008 Society of Exploration Geophysicists. All rights reserved.
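The simplest analytical model of the kind discussed here is a point (Mogi-type) source at depth d in an elastic half-space. Textbook expressions, restated here up to sign and convention choices, for the vertical uplift and the gravity change at radial distance r are

    u_z(r) = \frac{(1-\nu)\,\Delta V}{\pi}\,\frac{d}{(r^2 + d^2)^{3/2}}, \qquad
    \Delta g(r) \approx \frac{G\,\Delta M\,d}{(r^2 + d^2)^{3/2}} - \gamma\, u_z(r),

where ΔV is the source volume change, ΔM the intruded mass, ν Poisson's ratio, G the gravitational constant, and γ the free-air gradient. The second term in Δg is precisely the deformation coupling the text emphasizes: uplift alone changes the measured gravity even without mass addition.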
Ringstad, Oystein
2010-08-01
This paper presents and evaluates a methodological approach aiming at analysing some of the complex interaction between patients and different health care practitioners working together in teams. Qualitative health care research describes the values, perceptions and conceptions of patients and practitioners. In modern clinical work patients and professional practitioners often work together on complex cases involving different kinds of knowledge and values, each of them representing different perspectives. We need studies designed to capture this complexity. The methodological approach presented here is exemplified with a study in rehabilitation medicine. In this part of the health care system the clinical work is organized in multi-professional clinical teams including patients, handling complex rehabilitation processes. In the presented approach data are collected in individual in-depth interviews to have thorough descriptions of each individual perspective. The interaction in the teams is analysed by comparing different descriptions of the same situations from the involved individuals. We may then discuss how these perceptions relate to each other and how the individuals in the team interact. Two examples from an empirical study are presented and discussed, illustrating how communication, differences in evaluations and the interpretation of incidents, arguments, emotions and interpersonal relations may be discussed. It is argued that this approach may give information which can supplement the methods commonly applied in qualitative health care research today.
Toward a metric for patterned injury analysis
NASA Astrophysics Data System (ADS)
Oliver, William R.; Fritsch, Daniel S.
1997-02-01
An intriguing question in the matching of objects with patterned injuries in two and three dimensions is that of an appropriate metric for closeness -- is it possible to objectively measure how well an object 'fits' a patterned injury? Many investigators have suggested an energy-based metric, and have used such metrics to analyze craniofacial growth and anatomic variation. A strict dependence on homology is the primary disadvantage of this energy functional for generalized biological structures; many shapes do not have obvious landmarks. Some tentative solutions to the problem of landmark dependency for patterned injury analysis are presented. One intriguing approach comes from recent work in axiomatic vision, which has resulted in the development of a multiresolution medial axis for the extraction of shape primitives that can be used as the basis for registration. A scale-based description of this process can be captured in structures called cores, which can describe object shape and position in a highly compact manner. Cores may provide a scale- and shape-based method of determining the correspondences necessary for establishing the number and position of landmarks for some patterned injuries. Each of the approaches described generalizes to higher dimensions, and can thus be used to analyze both two- and three-dimensional data. Together, they may represent a reasonable way of measuring shape distance for the purpose of matching objects and wounds, and can be combined with texture measures for a complete description.
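The energy-based metric alluded to here is typically the thin-plate spline bending energy of the deformation f that carries one landmark configuration onto the other (the standard morphometric functional; we restate it for reference):

    E(f) = \iint_{\mathbb{R}^2} \Big( f_{xx}^2 + 2 f_{xy}^2 + f_{yy}^2 \Big)\, dx\, dy,

which vanishes for affine maps and grows with local warping. Its dependence on landmark correspondences is exactly the homology requirement identified above as the functional's main limitation, and the correspondences that cores supply are what would make it applicable to landmark-poor shapes.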
SSHAC Workshop 1 - November 15-18, 2010 | NGA East
(2:00-4:30) Methods for finite fault simulations
2:00-3:00 Methods considered: description of selected methods (2. Yuehua Zeng; Sim WG)
3:00-3:30 Discussion
3:30-3:45 Break
3:45-4:00 Discussion: Capture of representative methods
4:00-4:30 Summary of today's key issues (3. Norman Abrahamson)
4:30-5:00
ERIC Educational Resources Information Center
Bird, Elizabeth Kay-Raining; Joshi, Nila; Cleave, Patricia L.
2016-01-01
Purpose: The Expository Scoring Scheme (ESS) is designed to analyze the macrostructure of descriptions of a favorite game or sport. This pilot study examined inter- and intrarater reliability of the ESS and use of the scale to capture developmental change in elementary school children. Method: Twenty-four children in 2 language groups (monolingual…
ERIC Educational Resources Information Center
White, April L.
2013-01-01
Many organizations find selecting a leader to be highly challenging. Investigators have found and admit that the study of leadership is a very complex phenomenon that cannot be easily captured and explained in a manner that could lead to a final description about leadership or offer clear steps on how to choose the right leader. Among the many…
Advanced Extravehicular Mobility Unit Informatics Software Design
NASA Technical Reports Server (NTRS)
Wright, Theodore
2014-01-01
This is a description of the software design for the 2013 edition of the Advanced Extravehicular Mobility Unit (AEMU) Informatics computer assembly. The Informatics system is an optional part of the space suit assembly. It adds a graphical interface for displaying suit status, timelines, procedures, and caution and warning information. In the future it will display maps with GPS position data, and video and still images captured by the astronaut.
Noisy Oscillations in the Actin Cytoskeleton of Chemotactic Amoeba.
Negrete, Jose; Pumir, Alain; Hsu, Hsin-Fang; Westendorf, Christian; Tarantola, Marco; Beta, Carsten; Bodenschatz, Eberhard
2016-09-30
Biological systems with their complex biochemical networks are known to be intrinsically noisy. Here we investigate the dynamics of actin polymerization of amoeboid cells, which are close to the onset of oscillations. We show that the large phenotypic variability in the polymerization dynamics can be accurately captured by a generic nonlinear oscillator model in the presence of noise. We determine the relative role of the noise with a single dimensionless, experimentally accessible parameter, thus providing a quantitative description of the variability in a population of cells. Our approach, which rests on a generic description of a system close to a Hopf bifurcation and includes the effect of noise, can characterize the dynamics of a large class of noisy systems close to an oscillatory instability.
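The generic model invoked here is the normal form of a system near a Hopf bifurcation driven by noise, often written as a stochastic Stuart-Landau equation for a complex amplitude z (a schematic form; the authors' exact parameterization may differ):

    dz = \left[(\mu + i\omega_0)\, z - (1 + i b)\,|z|^2 z \right] dt + \sigma\, dW_t,

where μ measures the distance from the oscillation onset, ω₀ is the natural frequency, b the nonlinear frequency shift, and σ the noise strength. In such models the relative importance of the noise can be folded into a single dimensionless combination of σ and the deterministic scales, which is the kind of parameter the abstract uses to quantify variability across a population of cells.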
Bauer, Ulrike; Willmes, Christoph; Federle, Walter
2009-06-01
Nepenthes pitchers are sophisticated traps that employ a variety of mechanisms to attract, capture and retain prey. The underlying morphological structures and physiological processes are subject to change over the lifetime of a pitcher. Here an investigation was carried out on how pitcher properties and capture efficiency change over the first 2 weeks after pitcher opening. Prey capture, trapping efficiency, extrafloral nectar secretion, pitcher odour, as well as pH and viscoelasticity of the digestive fluid in N. rafflesiana pitchers were monitored in the natural habitat from pitcher opening up to an age of 2 weeks. Pitchers not only increased their attractiveness over this period by becoming more fragrant and secreting more nectar, but also gained mechanical trapping efficiency via an enhanced wettability of the upper pitcher rim (peristome). Consistently, natural prey capture was initially low and increased 3-6 d after opening. It was, however, highly variable within and among pitchers. At the same time, the pH and viscoelasticity of the digestive fluid decreased, suggesting that the latter is not essential for effective prey capture. Prey capture and attraction by Nepenthes are dynamic processes strongly influenced by the changing properties of the pitcher. The results confirm insect aquaplaning on the peristome as the main capture mechanism in N. rafflesiana.
Energy and material balance of CO2 capture from ambient air.
Zeman, Frank
2007-11-01
Current Carbon Capture and Storage (CCS) technologies focus on large, stationary sources that produce approximately 50% of global CO2 emissions. We propose an industrial technology that captures CO2 directly from ambient air to target the remaining emissions. First, a wet scrubbing technique absorbs CO2 into a sodium hydroxide solution. The resultant carbonate is transferred from sodium ions to calcium ions via causticization. The captured CO2 is released from the calcium carbonate through thermal calcination in a modified kiln. The energy consumption is calculated as 350 kJ/mol of CO2 captured. It is dominated by the thermal energy demand of the kiln and the mechanical power required for air movement. The low concentration of CO2 in air requires a throughput of 3 million cubic meters of air per ton of CO2 removed, which could result in significant water losses. Electricity consumption in the process results in CO2 emissions, and the use of coal power would significantly reduce the net amount captured. The thermodynamic efficiency of this process is low but comparable to other "end of pipe" capture technologies. As another carbon mitigation technology, air capture could allow for the continued use of liquid hydrocarbon fuels in the transportation sector.
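The quoted air throughput can be checked with a short order-of-magnitude calculation, assuming ambient CO2 at roughly 400 ppm, a molar volume of about 24.5 L/mol, and a single-pass capture fraction of about 50% (the capture fraction is an assumption made here for illustration, not a figure from the abstract):

```python
# Order-of-magnitude check of the quoted air throughput (~3e6 m^3 of air
# per tonne of CO2 removed). The ~50% single-pass capture fraction is an
# assumption made here for illustration; the abstract does not state it.
M_CO2 = 44.01                        # g/mol
mol_per_tonne = 1e6 / M_CO2          # ~2.27e4 mol CO2 per tonne
x_CO2 = 400e-6                       # ~400 ppm CO2 in ambient air
v_molar = 0.0245                     # m^3/mol of air at ~25 C, 1 atm
capture_fraction = 0.5               # assumed single-pass efficiency

mol_air = mol_per_tonne / (x_CO2 * capture_fraction)
volume_air = mol_air * v_molar       # m^3 of air per tonne captured
print(f"{volume_air:.2e} m^3 of air per tonne of CO2")  # ~2.8e6 m^3
```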
Economic and energetic analysis of capturing CO2 from ambient air
House, Kurt Zenz; Baclig, Antonio C.; Ranjan, Manya; van Nierop, Ernst A.; Wilcox, Jennifer; Herzog, Howard J.
2011-01-01
Capturing carbon dioxide from the atmosphere (“air capture”) in an industrial process has been proposed as an option for stabilizing global CO2 concentrations. Published analyses suggest these air capture systems may cost a few hundred dollars per tonne of CO2, which would make them cost-competitive with mainstream CO2 mitigation options like renewable energy, nuclear power, and carbon dioxide capture and storage from large CO2-emitting point sources. We investigate the thermodynamic efficiencies of commercial separation systems as well as trace gas removal systems to better understand and constrain the energy requirements and costs of these air capture systems. Our empirical analyses of operating commercial processes suggest that the energetic and financial costs of capturing CO2 from the air are likely to have been underestimated. Specifically, our analysis of existing gas separation systems suggests that, unless air capture significantly outperforms these systems, it is likely to require more than 400 kJ of work per mole of CO2, requiring it to be powered by CO2-neutral power sources in order to be CO2 negative. We estimate that total system costs of an air capture system will be on the order of $1,000 per tonne of CO2, based on experience with as-built large-scale trace gas removal systems. PMID:22143760
Fusion cross sections for reactions involving medium and heavy nucleus-nucleus systems
NASA Astrophysics Data System (ADS)
Atta, Debasis; Basu, D. N.
2014-12-01
Existing data on near-barrier fusion excitation functions of medium and heavy nucleus-nucleus systems have been analyzed by using a simple diffused-barrier formula derived assuming a Gaussian shape of the barrier-height distribution. The fusion cross section is obtained by folding the Gaussian barrier distribution with the classical expression for the fusion cross section for a fixed barrier. The energy dependence of the fusion cross section, thus obtained, provides a good description of the existing data on near-barrier fusion and capture excitation functions for medium and heavy nucleus-nucleus systems. The theoretical values for the parameters of the barrier distribution are estimated; these can be used for fusion or capture cross-section predictions, which are especially important for planning experiments aimed at synthesizing new superheavy elements.
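A minimal numerical sketch of the folding operation described above: the classical sharp-barrier cross section σ(E; B) = πR²(1 − B/E) for E > B is averaged over a Gaussian distribution of barrier heights with mean B0 and width w. The parameter values are illustrative, not fitted ones.

```python
# Sketch of the diffused-barrier idea: fold the classical sharp-barrier
# fusion cross section with a Gaussian distribution of barrier heights.
# B0, w and R are illustrative parameters, not fitted values.
import numpy as np

def classical_sigma(E, B, R):
    """Classical fusion cross section for a fixed barrier height B."""
    return np.where(E > B, np.pi * R**2 * (1.0 - B / E), 0.0)

def folded_sigma(E, B0=60.0, w=2.5, R=10.0, n=2001):
    """Average classical_sigma over a Gaussian barrier distribution."""
    B = np.linspace(B0 - 6 * w, B0 + 6 * w, n)
    gauss = np.exp(-0.5 * ((B - B0) / w) ** 2) / (w * np.sqrt(2 * np.pi))
    return np.trapz(classical_sigma(E, B, R) * gauss, B)

for E in (55.0, 60.0, 65.0):   # MeV, around the mean barrier
    print(E, folded_sigma(E))  # fm^2 if R is given in fm
```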
Ultrafast chirped optical waveform recorder using a time microscope
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bennett, Corey Vincent
2015-04-21
A new technique for capturing both the amplitude and phase of an optical waveform is presented. This technique can capture signals with many THz of bandwidth in a single shot (e.g., temporal resolution of about 44 fs), or be operated repetitively at a high rate. That is, each temporal window (or frame) is captured single shot, in real time, but the process may be run repeatedly or single-shot. By also including a variety of possible demultiplexing techniques, this process is scalable to recording continuous signals.
FRAP Analysis: Accounting for Bleaching during Image Capture
Wu, Jun; Shekhar, Nandini; Lele, Pushkar P.; Lele, Tanmay P.
2012-01-01
The analysis of Fluorescence Recovery After Photobleaching (FRAP) experiments involves mathematical modeling of the fluorescence recovery process. An important feature of FRAP experiments that tends to be ignored in the modeling is that there can be a significant loss of fluorescence due to bleaching during image capture. In this paper, we explicitly include the effects of bleaching during image capture in the model for the recovery process, instead of correcting for the effects of bleaching using reference measurements. Using experimental examples, we demonstrate the usefulness of such an approach in FRAP analysis. PMID:22912750
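One common way to express this idea is to multiply the underlying recovery curve by a per-image bleaching factor and fit both jointly. The sketch below assumes a single-exponential recovery and a constant bleach fraction per frame; these are simplifying assumptions for illustration, not the paper's exact model.

```python
# Minimal sketch: joint fit of FRAP recovery plus bleaching during image
# capture. The single-exponential recovery and per-frame bleach factor
# exp(-k_bleach * n) are simplifying assumptions for illustration.
import numpy as np
from scipy.optimize import curve_fit

def frap_with_bleach(frame, f0, f_inf, k_rec, k_bleach, dt=1.0):
    t = frame * dt
    recovery = f0 + (f_inf - f0) * (1.0 - np.exp(-k_rec * t))
    return recovery * np.exp(-k_bleach * frame)  # bleaching per image

# Synthetic data standing in for measured intensities.
frames = np.arange(100)
truth = frap_with_bleach(frames, 0.2, 0.9, 0.15, 0.004)
noisy = truth + np.random.default_rng(1).normal(0, 0.01, frames.size)

popt, _ = curve_fit(frap_with_bleach, frames, noisy,
                    p0=[0.1, 1.0, 0.1, 0.001])
print(dict(zip(["f0", "f_inf", "k_rec", "k_bleach"], popt)))
```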
Modern alchemy: Fred Hoyle and element building by neutron capture
NASA Astrophysics Data System (ADS)
Burbidge, E. Margaret
Fred Hoyle's fundamental work on building the chemical elements by nuclear processes in stars at various stages in their lives began with the building of elements around iron in the very dense hot interiors of stars. Later, in the paper by Burbidge, Burbidge, Fowler and Hoyle, we four showed that Hoyle's "equilibrium process" is one of eight processes required to make all of the isotopes of all the elements detected in the Sun and stars. Neutron capture reactions, which Fred had not considered in his epochal 1946 paper, but for which experimental data were just becoming available in 1957, are very important, in addition to the energy-generating reactions involving hydrogen, helium, carbon, nitrogen and oxygen, for building all of the elements. They are now providing clues to the late stages of stellar evolution and the earliest history of our Galaxy. I describe here our earliest observational work on neutron capture processes in evolved stars, some new work on stars showing the results of the neutron capture reactions, and data relating to processes ending in the production of lead, and I discuss where this fits into the history of stars in our own Galaxy.
2017-01-01
Developing efficient methods for capture and controlled release of carbon dioxide is crucial to any carbon capture and utilization technology. Herein we present an approach using an organic semiconductor electrode to electrochemically capture dissolved CO2 in aqueous electrolytes. The process relies on electrochemical reduction of a thin film of a naphthalene bisimide derivative, 2,7-bis(4-(2-(2-ethylhexyl)thiazol-4-yl)phenyl)benzo[lmn][3,8]phenanthroline-1,3,6,8(2H,7H)-tetraone (NBIT). This molecule is specifically tailored to afford one-electron reversible and one-electron quasi-reversible reduction in aqueous conditions while not dissolving or degrading. The reduced NBIT reacts with CO2 to form a stable semicarbonate salt, which can be subsequently oxidized electrochemically to release CO2. The semicarbonate structure is confirmed by in situ IR spectroelectrochemistry. This process of capturing and releasing carbon dioxide can be realized in an oxygen-free environment under ambient pressure and temperature, with uptake efficiency for CO2 capture of ∼2.3 mmol g⁻¹. This is on par with the best solution-phase amine chemical capture technologies available today. PMID:28378994
Jayatilleke, Nishamali; Kolliakou, Anna; Ball, Michael; Gorrell, Genevieve; Roberts, Angus; Stewart, Robert
2017-01-01
Objectives We sought to use natural language processing to develop a suite of language models to capture key symptoms of severe mental illness (SMI) from clinical text, to facilitate the secondary use of mental healthcare data in research. Design Development and validation of information extraction applications for ascertaining symptoms of SMI in routine mental health records using the Clinical Record Interactive Search (CRIS) data resource; description of their distribution in a corpus of discharge summaries. Setting Electronic records from a large mental healthcare provider serving a geographic catchment of 1.2 million residents in four boroughs of south London, UK. Participants The distribution of derived symptoms was described in 23 128 discharge summaries from 7962 patients who had received an SMI diagnosis, and 13 496 discharge summaries from 7575 patients who had received a non-SMI diagnosis. Outcome measures Fifty SMI symptoms were identified by a team of psychiatrists for extraction based on salience and linguistic consistency in records, broadly categorised under positive, negative, disorganisation, manic and catatonic subgroups. Text models for each symptom were generated using the TextHunter tool and the CRIS database. Results We extracted data for 46 symptoms with a median F1 score of 0.88. Four symptom models performed poorly and were excluded. From the corpus of discharge summaries, it was possible to extract symptomatology in 87% of patients with SMI and 60% of patients with non-SMI diagnosis. Conclusions This work demonstrates the possibility of automatically extracting a broad range of SMI symptoms from English text discharge summaries for patients with an SMI diagnosis. Descriptive data also indicated that most symptoms cut across diagnoses, rather than being restricted to particular groups. PMID:28096249
NASA Astrophysics Data System (ADS)
Topping, David; Alibay, Irfan; Bane, Michael
2017-04-01
To predict the evolving concentration, chemical composition and ability of aerosol particles to act as cloud droplets, we rely on numerical modeling. Mechanistic models attempt to account for the movement of compounds between the gaseous and condensed phases at a molecular level. This 'bottom up' approach is designed to increase our fundamental understanding. However, such models rely on predicting the properties of molecules and subsequent mixtures. For partitioning between the gaseous and condensed phases this includes: saturation vapour pressures; Henry's law coefficients; activity coefficients; diffusion coefficients and reaction rates. Current gas phase chemical mechanisms predict the existence of potentially millions of individual species. Within a dynamic ensemble model, this can often be used as justification for neglecting computationally expensive process descriptions. Indeed, as to whether we can quantify the true sensitivity to uncertainties in molecular properties, it has been impossible, even at the single aerosol particle level, to embed fully coupled representations of process-level knowledge for all possible compounds, and models typically rely on heavily parameterised descriptions. Building on emerging numerical frameworks designed for the changing landscape of high-performance computing (HPC), in this study we focus specifically on the ability to capture activity coefficients in liquid solutions using the UNIFAC method. Activity coefficients are often neglected with the largely untested hypothesis that they are simply too computationally expensive to include in dynamic frameworks. We present results demonstrating increased computational efficiency for a range of typical scenarios, including a profiling of the energy use resulting from reliance on such computations. As the landscape of HPC changes, the latter aspect is important to consider in future applications.
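For reference, UNIFAC splits the activity coefficient into a combinatorial (size and shape) part and a residual (group-interaction) part; the residual part dominates the computational cost discussed above. A minimal sketch of the combinatorial term alone is shown below, with illustrative r and q values.

```python
# Sketch of the UNIFAC combinatorial term (the residual, group-interaction
# part is omitted for brevity). The r and q values below are illustrative.
import numpy as np

def unifac_combinatorial(x, r, q, z=10.0):
    """ln(gamma_C) for each component, given mole fractions x,
    van der Waals volumes r and surface areas q."""
    x, r, q = map(np.asarray, (x, r, q))
    phi = r * x / np.dot(r, x)      # volume fractions
    theta = q * x / np.dot(q, x)    # surface-area fractions
    l = (z / 2.0) * (r - q) - (r - 1.0)
    return (np.log(phi / x) + (z / 2.0) * q * np.log(theta / phi)
            + l - (phi / x) * np.dot(x, l))

# Two-component illustration (placeholder values, not fitted data).
print(unifac_combinatorial(x=[0.3, 0.7], r=[2.57, 0.92], q=[2.34, 1.40]))
```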
SEIS-PROV: Practical Provenance for Seismological Data
NASA Astrophysics Data System (ADS)
Krischer, L.; Smith, J. A.; Tromp, J.
2015-12-01
It is widely recognized that reproducibility is crucial to advance science, but at the same time it is very hard to actually achieve. This results in it being recognized but also mostly ignored by a large fraction of the community. A key ingredient towards full reproducibility is to capture and describe the history of data, an issue known as provenance. We present SEIS-PROV, a practical format and data model to store provenance information for seismological data. In a seismological context, provenance can be seen as information about the processes that generated and modified a particular piece of data. For synthetic waveforms the provenance information describes which solver and settings therein were used to generate it. When looking at processed seismograms, the provenance conveys information about the different time series analysis steps that led to it. Additional uses include the description of derived data types, such as cross-correlations and adjoint sources, enabling their proper storage and exchange. SEIS-PROV is based on W3C PROV (http://www.w3.org/TR/prov-overview/), a standard for generic provenance information. It then applies an additional set of constraints to make it suitable for seismology. We present a definition of the SEIS-PROV format, a way to check if any given file is a valid SEIS-PROV document, and two sample implementations: One in SPECFEM3D GLOBE (https://geodynamics.org/cig/software/specfem3d_globe/) to store the provenance information of synthetic seismograms and another one as part of the ObsPy (http://obspy.org) framework enabling automatic tracking of provenance information during a series of analysis and transformation stages. This, along with tools to visualize and interpret provenance graphs, offers a description of data history that can be readily tracked, stored, and exchanged.
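As an illustration of the underlying W3C PROV data model, the sketch below builds a tiny provenance document for one processing step using the Python prov package. The namespace URI and step names are placeholders, not the actual SEIS-PROV definitions.

```python
# Minimal W3C PROV sketch using the Python `prov` package (pip install prov).
# The namespace URI and identifiers are placeholders, not the actual
# SEIS-PROV schema.
from prov.model import ProvDocument

doc = ProvDocument()
doc.add_namespace("ex", "http://example.org/seismology#")

raw = doc.entity("ex:waveform_raw")
detrended = doc.entity("ex:waveform_detrended")
step = doc.activity("ex:detrend")

doc.used(step, raw)                    # the step consumed the raw trace
doc.wasGeneratedBy(detrended, step)    # ...and produced the processed one
doc.wasDerivedFrom(detrended, raw)

print(doc.get_provn())                 # serialize as PROV-N text
```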
Capturing business requirements for the Swedish national information structure.
Kajbjer, Karin; Johansson, Catharina
2009-01-01
As a subproject of the National Information Structure project of the National Board of Health and Welfare, four different stakeholder groups were used to capture business requirements. These were: Subjects of care, Health professionals, Managers/Research and Industry. The process is described, covering the formulation of goal models as well as concept, process and information models.
40 CFR 63.11167 - What definitions apply to this subpart?
Code of Federal Regulations, 2010 CFR
2010-07-01
... produce the anodes used in the electrolytic process for the production of zinc. Bag leak detection system... melt cadmium or produce cadmium oxide from the cadmium recovered in the zinc production process. Capture system means the collection of equipment used to capture gases and fumes released from one or more...
40 CFR 408.330 - Applicability; description of the abalone processing subcategory.
Code of Federal Regulations, 2010 CFR
2010-07-01
... abalone processing subcategory. 408.330 Section 408.330 Protection of Environment ENVIRONMENTAL PROTECTION... CATEGORY Abalone Processing Subcategory § 408.330 Applicability; description of the abalone processing... abalone in the contiguous states. ...
Evidence accumulation in decision making: unifying the "take the best" and the "rational" models.
Lee, Michael D; Cummins, Tarrant D R
2004-04-01
An evidence accumulation model of forced-choice decision making is proposed to unify the fast and frugal take the best (TTB) model and the alternative rational (RAT) model with which it is usually contrasted. The basic idea is to treat the TTB model as a sequential-sampling process that terminates as soon as any evidence in favor of a decision is found and the rational approach as a sequential-sampling process that terminates only when all available information has been assessed. The unified TTB and RAT models were tested in an experiment in which participants learned to make correct judgments for a set of real-world stimuli on the basis of feedback, and were then asked to make additional judgments without feedback for cases in which the TTB and the rational models made different predictions. The results show that, in both experiments, there was strong intraparticipant consistency in the use of either the TTB or the rational model but large interparticipant differences in which model was used. The unified model is shown to be able to capture the differences in decision making across participants in an interpretable way and is preferred by the minimum description length model selection criterion.
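The two stopping rules can be stated compactly in code. In the sketch below, cues are binary and assumed sorted from most to least valid; the cue values and weights are illustrative, chosen so that the two models make different predictions.

```python
# Sketch of the two stopping rules on binary cues ordered by validity:
# TTB stops at the first discriminating cue; RAT weighs all cues.
# Cue values and weights are illustrative.
def ttb_choice(cues_a, cues_b):
    """Cues assumed sorted from most to least valid."""
    for a, b in zip(cues_a, cues_b):
        if a != b:                 # first discriminating cue decides
            return "A" if a > b else "B"
    return "tie"

def rat_choice(cues_a, cues_b, weights):
    """Sum weighted evidence over all cues before deciding."""
    score = sum(w * (a - b) for w, a, b in zip(weights, cues_a, cues_b))
    return "A" if score > 0 else "B" if score < 0 else "tie"

a, b = [1, 0, 0, 1], [0, 1, 1, 1]
print(ttb_choice(a, b))                 # "A": the first cue decides
print(rat_choice(a, b, [4, 3, 2, 1]))   # "B": 4 - 3 - 2 + 0 = -1 < 0
```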
Comparison of electromagnetic and nuclear dissociation of 17Ne
NASA Astrophysics Data System (ADS)
Wamers, F.; Marganiec, J.; Aksouh, F.; Aksyutina, Yu.; Alvarez-Pol, H.; Aumann, T.; Beceiro-Novo, S.; Bertulani, C. A.; Boretzky, K.; Borge, M. J. G.; Chartier, M.; Chatillon, A.; Chulkov, L. V.; Cortina-Gil, D.; Emling, H.; Ershova, O.; Fraile, L. M.; Fynbo, H. O. U.; Galaviz, D.; Geissel, H.; Heil, M.; Hoffmann, D. H. H.; Hoffman, J.; Johansson, H. T.; Jonson, B.; Karagiannis, C.; Kiselev, O. A.; Kratz, J. V.; Kulessa, R.; Kurz, N.; Langer, C.; Lantz, M.; Le Bleis, T.; Lehr, C.; Lemmon, R.; Litvinov, Yu. A.; Mahata, K.; Müntz, C.; Nilsson, T.; Nociforo, C.; Ott, W.; Panin, V.; Paschalis, S.; Perea, A.; Plag, R.; Reifarth, R.; Richter, A.; Riisager, K.; Rodriguez-Tajes, C.; Rossi, D.; Savran, D.; Schrieder, G.; Simon, H.; Stroth, J.; Sümmerer, K.; Tengblad, O.; Typel, S.; Weick, H.; Wiescher, M.; Wimmer, C.
2018-03-01
The Borromean drip-line nucleus 17Ne has been suggested to possess a two-proton halo structure in its ground state. In the astrophysical rp-process, where the two-proton capture reaction 15O(2p,γ)17Ne plays an important role, the calculated reaction rate differs by several orders of magnitude between different theoretical approaches. To add to the understanding of the 17Ne structure we have studied nuclear and electromagnetic dissociation. A 500 MeV/u 17Ne beam was directed toward lead, carbon, and polyethylene targets. Oxygen isotopes in the final state were measured in coincidence with one or two protons. Different reaction branches in the dissociation of 17Ne were disentangled. The relative populations of s and d states in 16F were determined for light and heavy targets. The differential cross section for electromagnetic dissociation (EMD) shows a continuous internal energy spectrum in the three-body system 15O+2p. The 17Ne EMD data were compared to current theoretical models. None of them, however, yields satisfactory agreement with the experimental data presented here. These new data may facilitate future development of adequate models for description of the fragmentation process.
Figueiro, Ana Claudia; de Araújo Oliveira, Sydia Rosana; Hartz, Zulmira; Couturier, Yves; Bernier, Jocelyne; do Socorro Machado Freire, Maria; Samico, Isabella; Medina, Maria Guadalupe; de Sa, Ronice Franco; Potvin, Louise
2017-03-01
Public health interventions are increasingly represented as complex systems. Research tools for capturing the dynamics of intervention processes, however, are practically non-existent. This paper describes the development and proof-of-concept process of an analytical tool, the critical event card (CEC), which supports the representation and analysis of complex interventions' evolution, based on critical events. Drawing on actor-network theory (ANT), we developed and field-tested the tool using three innovative health interventions in northeastern Brazil. The interventions aimed to promote health equity through intersectoral approaches, were engaged in participatory evaluation and were linked to professional training programs. The development of the CEC involved practitioners and researchers from the projects. Proof of concept was based on document analysis, face-to-face interviews and focus groups. The CEC's analytical categories allow critical events to be identified and described as milestones in the evolution of complex interventions. The categories are (1) event description; (2) actants (human and non-human) involved; (3) interactions between actants; (4) mediations performed; (5) actions performed; (6) inscriptions produced; and (7) consequences for interventions. The CEC provides a tool to analyze and represent intersectoral interventions' complex and dynamic evolution.
Participatory evaluation (I)--sharing lessons from fieldwork in Asia.
Crishna, B
2007-05-01
There is a need to study methodologies for evaluating social development projects. Traditional methods of evaluation are often not able to capture or measure the 'spirit of change' in people, which is the very essence of human development. Using participatory methodologies is a positive way to ensure that evaluations encourage an understanding of the value of critical analysis among service providers and other stakeholders. Participatory evaluation provides a systematic process of learning through experiences. Practical experiences of conducting a number of evaluation studies in social development projects have led the author to develop four basic principles of participatory evaluation strategies. This has been further conceptualized through an extensive literature search. The article develops and shares these principles through descriptions of field experiences in Asia. The article illustrates that the role of any evaluation remains a learning process, one which promotes a climate of reflection and self-assessment. It shows how using participatory methods can create this environment of learning. However, one needs to keep in mind that participatory evaluation takes time, and that the role and calibre of the facilitator are crucial. Participatory evaluation methods have been recommended for social development projects to ensure that stakeholders remain in control of their own lives and decisions.
Day, Sarah Jane; Riley, Shaun Patrick
2018-02-01
The evolution of three-dimensional printing into prosthetics has opened conversations about the availability and cost of prostheses. This report discusses how a prosthetic team incorporated additive manufacture techniques into the treatment of a patient with a partial hand amputation to create and test a unique assistive device which he could use to hold his French horn. Case description and methods: Using a process of shape capture, photogrammetry, computer-aided design and finite element analysis, a suitable assistive device was designed and tested. The design was fabricated using three-dimensional printing. Patient satisfaction was measured using a Pugh's Matrix™, and a cost comparison was made between the process used and traditional manufacturing. Findings and outcomes: Patient satisfaction was high. The three-dimensional printed devices were 56% cheaper to fabricate than a similar laminated device. Computer-aided design and three-dimensional printing proved to be an effective method for designing, testing and fabricating a unique assistive device. Clinical relevance CAD and 3D printing techniques can enable devices to be designed, tested and fabricated more cheaply than with traditional techniques. This may lead to improvements in quality and accessibility.
NASA Astrophysics Data System (ADS)
Kastens, K. A.; Shipley, T. F.; Boone, A.
2012-12-01
When geoscience experts look at data visualizations, they can "see" structures, processes, and traces of Earth history. When students look at those same visualizations, they may see only blotches of color, dots or squiggles. What are those experts doing, and how can students learn to do the same? We report on a study in which experts (>10 years of geoscience research experience) and novices (undergrad psychology students) examine shaded-relief/color-coded images of topography/bathymetry, while answering questions aloud and being eye-tracked. Images were a global map, two high-res images of continental terrain and two of oceanic terrain, with hi-res localities chosen to display distinctive traces of important Earth processes. The differences in what they look at as recorded by eye-tracking are relatively subtle. On the global image, novices tend to focus on continents, whereas experts distribute their attention more evenly across continents and oceans. Experts universally access the available scale information (distance scale, lat/long axes), whereas most students do not. Novices do attend substantially and spontaneously to the salient geomorphological features in the high-res images: seamounts, mid-ocean ridge/transform intersection, erosional river channels, and compressional ridges and valley system. The more marked differences come in what respondents see, as captured in video recordings of their words and gestures in response to the experimenter's questions. When their attention is directed to a small and distinctive part of a high-res image and they are asked to "…describe what you see…", experts typically produce richly detailed descriptions that may include the regional depth/altitude, local relief, shape and spatial distribution of major features, symmetry or lack thereof, cross-cutting relationships, presence of lineations and their orientations, and similar geomorphological details. Following or interwoven with these rich descriptions, some experts also offer interpretations of causal Earth processes. We identified four types of novice answers: (a) "flat" answers, in which the student describes the patches of color on the screen with no mention of shape or relief; (b) "thing" answers, in which the student mentions an inappropriate object, such as "the Great Wall of China"; (c) geomorphology answers, in which the student talks about depth/altitude, relief, or shapes of landforms; and (d) process answers, in which the student talks about Earth processes, such as earthquakes, erosion, or plate tectonics. Novice "geomorphology" (c) answers resemble expert responses, but lack the rich descriptive detail. The "process" (d) category includes many interpretations that lack any grounding in the evidentiary base available in the viewed data. These findings suggest that instruction around Earth data should include an emphasis on thoroughly and accurately describing the features that are present in the data, a skill that our experts display and our novices mostly lack. It is unclear, though, how best to sequence the teaching of descriptive and interpretive skills, since the experts' attention to empirical features in the data is steered by their knowledge of which features have causal significance.
What's Next in Complex Networks? Capturing the Concept of Attacking Play in Invasive Team Sports.
Ramos, João; Lopes, Rui J; Araújo, Duarte
2018-01-01
The evolution of performance analysis within sports sciences is tied to technology development and practitioner demands. However, how individual and collective patterns self-organize and interact in invasive team sports remains elusive. Social network analysis has recently been proposed to resolve some aspects of this problem; it has proven successful in capturing collective features resulting from the interactions between team members, and has also proven to be a powerful communication tool. Despite these advances, some fundamental team sports concepts such as an attacking play have not been properly captured by the more common applications of social network analysis to team sports performance. In this article, we propose a novel approach to team sports performance centered on sport concepts, namely that of an attacking play. Network theory and tools, including temporal and bipartite or multilayered networks, were used to capture this concept. We put forward eight questions directly related to team performance to discuss how common pitfalls in the use of network tools for capturing sports concepts can be avoided. Some answers are advanced in an attempt to be more precise in the description of team dynamics and to uncover other metrics directly applied to sport concepts, such as the structure and dynamics of attacking plays. Finally, we propose that, at this stage of knowledge, it may be advantageous to build up from fundamental sport concepts toward complex network theory and tools, and not the other way around.
NASA Astrophysics Data System (ADS)
Hardebol, N. J.; Maier, C.; Nick, H.; Geiger, S.; Bertotti, G.; Boro, H.
2015-12-01
A fracture network arrangement is quantified across an isolated carbonate platform from outcrop and aerial imagery to address its impact on fluid flow. The network is described in terms of fracture density, orientation, and length distribution parameters. Of particular interest is the role of fracture cross connections and abutments on the effective permeability. Hence, the flow simulations explicitly account for network topology by adopting a Discrete-Fracture-and-Matrix description. The interior of the Latemar carbonate platform (Dolomites, Italy) is taken as an outcrop analogue for subsurface reservoirs of isolated carbonate build-ups that exhibit a fracture-dominated permeability. A novel aspect of our approach is the dual strategy of describing the fracture network through both deterministic and stochastic inputs for flow simulations. The fracture geometries are captured explicitly and form a multiscale data set by integration of interpretations from outcrops, airborne imagery, and lidar. The deterministic network descriptions form the basis for descriptive rules that are diagnostic of the complex natural fracture arrangement. The fracture networks exhibit a variable degree of multitier hierarchies, with smaller-sized fractures abutting against larger fractures under both right and oblique angles. The influence of network topology on connectivity is quantified using Discrete-Fracture single-phase fluid flow simulations. The simulation results show that the effective permeability of the fracture and matrix ensemble can be 50 to 400 times higher than the matrix permeability of 1.0 × 10⁻¹⁴ m². The permeability enhancement is strongly controlled by the connectivity of the fracture network. Therefore, the degree of intersecting and abutting fractures should be captured from outcrops with accuracy to be of value as an analogue.
Balancing the stochastic description of uncertainties as a function of hydrologic model complexity
NASA Astrophysics Data System (ADS)
Del Giudice, D.; Reichert, P.; Albert, C.; Kalcic, M.; Logsdon Muenich, R.; Scavia, D.; Bosch, N. S.; Michalak, A. M.
2016-12-01
Uncertainty analysis is becoming an important component of forecasting water and pollutant fluxes in urban and rural environments. Properly accounting for errors in the modeling process can help to robustly assess the uncertainties associated with the inputs (e.g. precipitation) and outputs (e.g. runoff) of hydrological models. In recent years we have investigated several Bayesian methods to infer the parameters of a mechanistic hydrological model along with those of the stochastic error component. The latter describes the uncertainties of model outputs and possibly inputs. We have adapted our framework to a variety of applications, ranging from predicting floods in small stormwater systems to nutrient loads in large agricultural watersheds. Given practical constraints, we discuss how in general the number of quantities to infer probabilistically varies inversely with the complexity of the mechanistic model. Most often, when evaluating a hydrological model of intermediate complexity, we can infer the parameters of the model as well as of the output error model. Describing the output errors as a first order autoregressive process can realistically capture the "downstream" effect of inaccurate inputs and structure. With simpler runoff models we can additionally quantify input uncertainty by using a stochastic rainfall process. For complex hydrologic transport models, instead, we show that keeping model parameters fixed and just estimating time-dependent output uncertainties could be a viable option. The common goal across all these applications is to create time-dependent prediction intervals which are both reliable (cover the nominal amount of validation data) and precise (are as narrow as possible). In conclusion, we recommend focusing both on the choice of the hydrological model and of the probabilistic error description. The latter can include output uncertainty only, if the model is computationally-expensive, or, with simpler models, it can separately account for different sources of errors like in the inputs and the structure of the model.
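A minimal sketch of the AR(1) output-error description mentioned above: residual errors are modeled as a first-order autoregressive process around a deterministic model output, and sampled realizations yield time-dependent prediction intervals. The φ and σ values are illustrative, not inferred ones.

```python
# Sketch: describe output errors as a first-order autoregressive (AR(1))
# process around a deterministic model output, and sample predictive
# realizations. phi and sigma are illustrative, not inferred values.
import numpy as np

def ar1_realizations(model_output, phi=0.8, sigma=0.3,
                     n_draws=1000, seed=0):
    rng = np.random.default_rng(seed)
    n = len(model_output)
    draws = np.empty((n_draws, n))
    for i in range(n_draws):
        e = np.zeros(n)
        # stationary initial condition
        e[0] = rng.normal(0, sigma / np.sqrt(1 - phi**2))
        for t in range(1, n):
            e[t] = phi * e[t - 1] + rng.normal(0, sigma)
        draws[i] = model_output + e
    return draws

sim = np.sin(np.linspace(0, 6, 200))            # stand-in hydrograph
draws = ar1_realizations(sim)
lo, hi = np.percentile(draws, [5, 95], axis=0)  # 90% prediction band
print(float(np.mean(hi - lo)))
```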
NASA Astrophysics Data System (ADS)
Rabli, Djamal; McCarroll, Ronald
2018-02-01
This review surveys the different theoretical approaches used to describe inelastic and rearrangement processes in collisions involving atoms and ions. For a range of energies from a few meV up to about 1 keV, the adiabatic representation is expected to be valid, and under these conditions inelastic and rearrangement processes take place via a network of avoided crossings of the potential energy curves of the collision system. In general, such avoided crossings are finite in number. The non-adiabatic coupling, due to the breakdown of the Born-Oppenheimer separation of the electronic and nuclear variables, depends on the ratio of the electron mass to the nuclear mass terms in the total Hamiltonian. By retaining terms in the total Hamiltonian correct to first order in the electron-to-nuclear mass ratio, a system of reaction coordinates is found which allows for a correct description of both inelastic and rearrangement channels. The connection between the use of reaction coordinates in the quantum description and the electron translation factors of the impact parameter approach is established. A major result is that only when reaction coordinates are used is it possible to introduce the notion of a minimal basis set. Such a set must include all avoided crossings, including both radial coupling and long-range Coriolis coupling. But only when reaction coordinates are used can such a basis set be considered complete. In particular, when the centre of nuclear mass is used as the centre of coordinates, rather than the correct reaction coordinates, it is shown that erroneous results are obtained. A few results to illustrate this important point are presented: one concerning a simple two-state Landau-Zener type avoided crossing, the other concerning a network of multiple crossings in a typical electron capture process involving a highly charged ion and a neutral atom.
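For the two-state avoided crossing mentioned above, the standard Landau-Zener expression gives the probability of a diabatic passage, P = exp(−2πH12²/(ħv|ΔF|)). A small sketch with illustrative values in atomic units:

```python
# Standard two-state Landau-Zener transition probability for a single
# avoided crossing, P = exp(-2*pi*H12^2 / (hbar*v*|dF|)). Values are
# illustrative, in atomic units (hbar = 1).
import math

def landau_zener(h12, v, dF, hbar=1.0):
    """h12: half the adiabatic splitting at the crossing; v: radial
    velocity; dF: difference of slopes of the crossing diabatic curves."""
    return math.exp(-2.0 * math.pi * h12**2 / (hbar * v * abs(dF)))

p = landau_zener(h12=0.01, v=0.05, dF=0.1)
print(p)                     # probability of a single diabatic passage
print(2 * p * (1 - p))       # net transfer over an in-and-out passage
```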
40 CFR 60.2025 - What if my chemical recovery unit is not listed in § 60.2020(n)?
Code of Federal Regulations, 2011 CFR
2011-07-01
... materials that are recovered. (3) A description (including a process flow diagram) of the process in which... process. (4) A description (including a process flow diagram) of the chemical constituent recovery process...
40 CFR 60.2025 - What if my chemical recovery unit is not listed in § 60.2020(n)?
Code of Federal Regulations, 2012 CFR
2012-07-01
... materials that are recovered. (3) A description (including a process flow diagram) of the process in which... process. (4) A description (including a process flow diagram) of the chemical constituent recovery process...
40 CFR 60.2025 - What if my chemical recovery unit is not listed in § 60.2020(n)?
Code of Federal Regulations, 2010 CFR
2010-07-01
... materials that are recovered. (3) A description (including a process flow diagram) of the process in which... process. (4) A description (including a process flow diagram) of the chemical constituent recovery process...
40 CFR 60.2558 - What if a chemical recovery unit is not listed in § 60.2555(n)?
Code of Federal Regulations, 2012 CFR
2012-07-01
... materials that are recovered. (3) A description (including a process flow diagram) of the process in which... process. (4) A description (including a process flow diagram) of the chemical constituent recovery process...
Policy capturing as a method of quantifying the determinants of landscape preference
Dennis B. Propst
1979-01-01
Policy Capturing, a potential methodology for evaluating landscape preference, was described and tested. This methodology results in a mathematical model that theoretically represents the human decision-making process. Under experimental conditions, judges were asked to express their preferences for scenes of the Blue Ridge Parkway. An equation which "captures,...
40 CFR 158.335 - Description of formulation process.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 23 2010-07-01 2010-07-01 false Description of formulation process. 158.335 Section 158.335 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS DATA REQUIREMENTS FOR PESTICIDES Product Chemistry § 158.335 Description of formulation...
40 CFR 158.335 - Description of formulation process.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 25 2013-07-01 2013-07-01 false Description of formulation process. 158.335 Section 158.335 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS DATA REQUIREMENTS FOR PESTICIDES Product Chemistry § 158.335 Description of formulation...
40 CFR 158.335 - Description of formulation process.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 24 2014-07-01 2014-07-01 false Description of formulation process. 158.335 Section 158.335 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS DATA REQUIREMENTS FOR PESTICIDES Product Chemistry § 158.335 Description of formulation...
40 CFR 158.335 - Description of formulation process.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 25 2012-07-01 2012-07-01 false Description of formulation process. 158.335 Section 158.335 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS DATA REQUIREMENTS FOR PESTICIDES Product Chemistry § 158.335 Description of formulation...
40 CFR 158.335 - Description of formulation process.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 24 2011-07-01 2011-07-01 false Description of formulation process. 158.335 Section 158.335 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS DATA REQUIREMENTS FOR PESTICIDES Product Chemistry § 158.335 Description of formulation...
Evaluating the Pros and Cons of Different Peer Review Policies via Simulation.
Zhu, Jia; Fung, Gabriel; Wong, Wai Hung; Li, Zhixu; Xu, Chuanhua
2016-08-01
In the academic world, peer review is one of the major processes for evaluating a scholar's contribution. In this study, we are interested in quantifying the merits of different policies in a peer review process, such as single-blind review, double-blind review, and obtaining authors' feedback. Currently, insufficient work has been undertaken to evaluate the benefits of different peer review policies. One of the major reasons for this situation is the inability to conduct any empirical study because data are presently unavailable. In this case, a computer simulation is one of the best ways to conduct a study. We perform a series of simulations to study the effects of different policies on a peer review process. In this study, we focus on the peer review process of a typical computer science conference. Our results point to the crucial role of program chairs in determining the quality and diversity of the articles to be accepted for publication. We demonstrate the importance of discussion among reviewers, suggest circumstances in which the double-blind review policy should be adopted, and question the credibility of the authors' feedback mechanism. Finally, we stress that randomness plays an important role in the peer review process, and this role cannot be eliminated. Although our model may not capture every component of a peer review process, it covers some of the most essential elements. Thus, the simulation results clearly cannot be taken as literal descriptions of an actual peer review process. However, we can at least still use them to identify alternative directions for future study.
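The abstract does not give the model equations, but the flavor of such a simulation can be conveyed with a toy Monte Carlo. The assumptions below (normally distributed paper quality and reviewer noise, and a prestige bias applied only under single-blind review) are ours, introduced purely for illustration.

```python
# Toy Monte Carlo in the spirit of the study (the actual model is not
# specified in the abstract). Assumptions made here: true quality ~ N(0,1),
# reviewer noise ~ N(0,0.5), and under single-blind review a prestige
# bias is added for "famous" authors.
import numpy as np

rng = np.random.default_rng(42)
n_papers, n_reviewers, accept_rate = 5000, 3, 0.25
quality = rng.normal(0, 1, n_papers)
famous = rng.random(n_papers) < 0.2          # 20% well-known authors

def accepted(bias):
    scores = quality + rng.normal(0, 0.5, (n_reviewers, n_papers)).mean(0)
    scores = scores + bias * famous          # bias = 0 under double-blind
    cutoff = np.quantile(scores, 1 - accept_rate)
    return scores > cutoff

for label, bias in [("double-blind", 0.0), ("single-blind", 0.5)]:
    acc = accepted(bias)
    print(label, "mean accepted quality:", round(quality[acc].mean(), 3))
```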
Simulating Thermal Cycling and Isothermal Deformation Response of Polycrystalline NiTi
NASA Technical Reports Server (NTRS)
Manchiraju, Sivom; Gaydosh, Darrell J.; Noebe, Ronald D.; Anderson, Peter M.
2011-01-01
A microstructure-based FEM model that couples crystal plasticity, crystallographic descriptions of the B2-B19' martensitic phase transformation, and anisotropic elasticity is used to simulate thermal cycling and isothermal deformation in polycrystalline NiTi (49.9 at% Ni). The model inputs include anisotropic elastic properties, polycrystalline texture, DSC data, and a subset of isothermal deformation and load-biased thermal cycling data. A key experimental trend is captured, namely, the transformation strain during thermal cycling is predicted to reach a peak with increasing bias stress, due to the onset of plasticity at larger bias stress. Plasticity induces internal stress that affects both thermal cycling and isothermal deformation responses. Affected thermal cycling features include hysteretic width, two-way shape memory effect, and evolution of texture with increasing bias stress. Affected isothermal deformation features include increased hardening during loading and retained martensite after unloading. These trends are not captured by microstructural models that lack plasticity, nor are they all captured in a robust manner by phenomenological approaches. Despite this advance in microstructural modeling, quantitative differences exist, such as underprediction of the open-loop strain during thermal cycling.
NASA Astrophysics Data System (ADS)
Jebeli, Mahvash; Bilesan, Alireza; Arshi, Ahmadreza
2017-06-01
The currently available commercial motion capture systems are constrained by space requirements and thus pose difficulties when used to develop kinematic descriptions of human movements within existing manufacturing and production cells. The Kinect sensor does not share similar limitations, but it is not as accurate. The proposition made in this article is to adopt the Kinect sensor to facilitate the implementation of Health Engineering concepts in industrial environments. This article is an evaluation of the Kinect sensor's accuracy in providing three-dimensional kinematic data. The sensor is thus utilized to assist in modeling and simulation of worker performance within an industrial cell. For this purpose, Kinect 3D data were compared to those of a Vicon motion capture system in a gait analysis laboratory. Results indicated that the Kinect sensor exhibited a coefficient of determination of 0.9996 on the depth axis, 0.9849 along the horizontal axis and 0.2767 on the vertical axis. The results demonstrate the suitability of the Kinect sensor for use in industrial environments.
Extended behavioural device modelling and circuit simulation with Qucs-S
NASA Astrophysics Data System (ADS)
Brinson, M. E.; Kuznetsov, V.
2018-03-01
Current trends in circuit simulation suggest a growing interest in open source software that allows access to more than one simulation engine while simultaneously supporting schematic drawing tools, behavioural Verilog-A and XSPICE component modelling, and output data post-processing. This article introduces a number of new features recently implemented in the 'Quite universal circuit simulator - SPICE variant' (Qucs-S), including structure and fundamental schematic capture algorithms, at the same time highlighting their use in behavioural semiconductor device modelling. Particular importance is placed on the interaction between Qucs-S schematics, equation-defined devices, SPICE B behavioural sources and hardware description language (HDL) scripts. The multi-simulator version of Qucs is a freely available tool that offers extended modelling and simulation features compared to those provided by legacy circuit simulators. The performance of a number of Qucs-S modelling extensions is demonstrated with a GaN HEMT compact device model and data obtained from tests using the Qucs-S/Ngspice/Xyce/SPICE OPUS multi-engine circuit simulator.
EPA Tribal Areas (4 of 4): Alaska Native Allotments
This dataset is a spatial representation of the Public Land Survey System (PLSS) in Alaska, generated from land survey records. The data represents a seamless spatial portrayal of native allotment land parcels, their legal descriptions, corner positioning and markings, and survey measurements. This data is intended for mapping purposes only and is not a substitute or replacement for the legal land survey records or other legal documents. Measurement and attribute data are collected from survey records using data entry screens into a relational database. The database design is based upon the FGDC Cadastral Content Data Standard. Corner positions are derived by geodetic calculations using measurement records. Closure and edgematching are applied to produce a seamless dataset. The resultant features do not preserve the original geometry of survey measurements, but the record measurements are reported as attributes. Additional boundary data are derived by spatial capture, protraction and GIS processing. The spatial features are stored and managed within the relational database, with active links to the represented measurement and attribute data.
Environmental DNA sampling protocol - filtering water to capture DNA from aquatic organisms
Laramie, Matthew B.; Pilliod, David S.; Goldberg, Caren S.; Strickler, Katherine M.
2015-09-29
Environmental DNA (eDNA) analysis is an effective method of determining the presence of aquatic organisms such as fish, amphibians, and other taxa. This publication is meant to guide researchers and managers in the collection, concentration, and preservation of eDNA samples from lentic and lotic systems. A sampling workflow diagram and three sampling protocols are included as well as a list of suggested supplies. Protocols include filter and pump assembly using: (1) a hand-driven vacuum pump, ideal for sample collection in remote sampling locations where no electricity is available and when equipment weight is a primary concern; (2) a peristaltic pump powered by a rechargeable battery-operated driver/drill, suitable for remote sampling locations when weight consideration is less of a concern; (3) a 120-volt alternating current (AC) powered peristaltic pump suitable for any location where 120-volt AC power is accessible, or for roadside sampling locations. Images and detailed descriptions are provided for each step in the sampling and preservation process.
The evolution of ecosystem ascendency in a complex systems based model.
Brinck, Katharina; Jensen, Henrik Jeldtoft
2017-09-07
General patterns in ecosystem development can shed light on driving forces behind ecosystem formation and recovery and have long been of interest. In recent years, the need for integrative and process-oriented approaches to capture ecosystem growth, development and organisation, as well as the scope of information theory as a descriptive tool, has been addressed from various sides. However, data collection of ecological network flows is difficult and tedious, and comprehensive models are lacking. We use a hierarchical version of the Tangled Nature Model of evolutionary ecology to study the relationship between structure, flow and organisation in model ecosystems, their development over evolutionary time scales and their relation to ecosystem stability. Our findings support the validity of ecosystem ascendency as a meaningful measure of ecosystem organisation, which increases over evolutionary time scales and significantly drops during periods of disturbance. The results suggest a general trend towards both higher integrity and increased stability driven by functional and structural ecosystem coadaptation. Copyright © 2017 Elsevier Ltd. All rights reserved.
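Ascendency in Ulanowicz's sense is the product of total system throughput and the average mutual information of the flow matrix. A minimal sketch of the computation, with an illustrative three-compartment flow matrix:

```python
# Ulanowicz-style ascendency of a flow network: A = TST * AMI, where
# TST is the total system throughput and AMI the average mutual
# information of the flow matrix. The 3-compartment flows below are
# illustrative, not ecological data.
import numpy as np

def ascendency(T):
    T = np.asarray(T, dtype=float)
    tst = T.sum()                          # total system throughput
    t_in, t_out = T.sum(axis=0), T.sum(axis=1)
    ami = 0.0
    for i in range(T.shape[0]):
        for j in range(T.shape[1]):
            if T[i, j] > 0:
                ami += (T[i, j] / tst) * np.log2(
                    T[i, j] * tst / (t_out[i] * t_in[j]))
    return tst * ami

flows = [[0, 10, 2],
         [1, 0, 8],
         [5, 0, 0]]
print(ascendency(flows))   # bits * flow units
```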
Fast-ion Dα spectrum diagnostic in the EAST
NASA Astrophysics Data System (ADS)
Hou, Y. M.; Wu, C. R.; Huang, J.; Heidbrink, W. W.; von Hellermann, M. G.; Xu, Z.; Jin, Z.; Chang, J. F.; Zhu, Y. B.; Gao, W.; Chen, Y. J.; Lyu, B.; Hu, R. J.; Zhang, P. F.; Zhang, L.; Gao, W.; Wu, Z. W.; Yu, Y.; Ye, M. Y.
2016-11-01
In toroidal magnetic fusion devices, the fast-ion D-alpha (FIDA) diagnostic is a powerful method to study fast-ion features. Fast-ion characteristics can be inferred from the Doppler-shifted spectrum of Dα light arising from the charge exchange recombination process between fast ions and the probe beam. Since the conceptual design was presented at the last HTPD conference, significant progress has been made in applying FIDA systems on the Experimental Advanced Superconducting Tokamak (EAST). Both co-current and counter-current neutral beam injectors are available, and each can deliver 2-4 MW of beam power at 50-80 keV beam energy. Presently, two sets of high-throughput spectrometer systems have been installed on EAST, allowing passing and trapped fast-ion characteristics to be captured simultaneously, using a Kaiser HoloSpec transmission grating spectrometer and a Bunkoukeiki FLP-200 volume phase holographic spectrometer coupled with Princeton Instruments ProEM 1024B eXcelon and Andor DU-888 iXon3 1024 CCD cameras, respectively. This paper presents details of the hardware and the experimental spectra.
The impact of nurse prescribing on the clinical setting.
Creedon, Rena; Byrne, Stephen; Kennedy, Julia; McCarthy, Suzanne
To investigate the impact nurse prescribing has on the organisation, patient and health professional, and to identify factors associated with the growth of nurse prescribing. Systematic search and narrative review. Data were obtained through CINAHL, PubMed, ScienceDirect, the Online Computer Library Centre (OCLC), databases/websites, and hand searching. English peer-reviewed quantitative, qualitative and mixed-method articles published from September 2009 through to August 2014 exploring nurse prescribing from the perspective of the organisation, health professional and patient were included. Following a systematic selection process, the studies identified were also assessed for quality by applying Cardwell's framework. From the initial 443 citations, 37 studies were included in the review. Most studies were descriptive in nature. Commonalities addressed were stakeholders' views, prescribing in practice, jurisdiction, education and benefits/barriers. Prescriptive authority for nurses continues to be a positive addition to clinical practice. However, concerns have emerged regarding appropriate support, relationships and jurisdictional issues. A more comprehensive understanding of nurse and midwife prescribing workloads is required to capture the true impact and cost-effectiveness of the initiative.
Mapping transiently formed and sparsely populated conformations on a complex energy landscape
Wang, Yong; Papaleo, Elena; Lindorff-Larsen, Kresten
2016-01-01
Determining the structures, kinetics, thermodynamics and mechanisms that underlie conformational exchange processes in proteins remains extremely difficult. Only in favourable cases is it possible to provide atomic-level descriptions of sparsely populated and transiently formed alternative conformations. Here we benchmark the ability of enhanced-sampling molecular dynamics simulations to determine the free energy landscape of the L99A cavity mutant of T4 lysozyme. We find that the simulations capture key properties previously measured by NMR relaxation dispersion methods including the structure of a minor conformation, the kinetics and thermodynamics of conformational exchange, and the effect of mutations. We discover a new tunnel that involves the transient exposure towards the solvent of an internal cavity, and show it to be relevant for ligand escape. Together, our results provide a comprehensive view of the structural landscape of a protein, and point forward to studies of conformational exchange in systems that are less characterized experimentally. DOI: http://dx.doi.org/10.7554/eLife.17505.001 PMID:27552057
Influence of the parent cation on the thermalization of subexcitation electrons in solid water
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goulet, T.; Jay-Gerin, J.; Patau, J.
1990-09-06
The authors report the results of their Monte Carlo simulations of the thermalization, recombination, and dissociative attachment of subexcitation electrons in solid water. A particular emphasis is placed on the description of the electron's motion in the Coulomb field of its parent cation (H2O+) and on the effect of this positive charge on the fate of the electron. In comparing the results obtained with and without the parent cation, they find, on the one hand, that the dissociative attachment probability and the electron thermalization distances and times remain practically unaffected by the presence of H2O+. On the other hand, they find that a certain proportion of subexcitation electrons can be captured, before they thermalize, by a process of dissociative recombination which yields various species such as O, H, OH, and H2. The variation of this proportion and of the average thermalization distances and times with the energy of the subexcitation electrons is investigated.
Global dynamics for switching systems and their extensions by linear differential equations
NASA Astrophysics Data System (ADS)
Huttinga, Zane; Cummins, Bree; Gedeon, Tomáš; Mischaikow, Konstantin
2018-03-01
Switching systems use piecewise constant nonlinearities to model gene regulatory networks. This choice provides advantages in the analysis of behavior and allows the global description of dynamics in terms of Morse graphs associated to nodes of a parameter graph. The parameter graph captures spatial characteristics of a decomposition of parameter space into domains with identical Morse graphs. However, there are many cellular processes that do not exhibit threshold-like behavior and thus are not well described by a switching system. We consider a class of extensions of switching systems formed by a mixture of switching interactions and chains of variables governed by linear differential equations. We show that the parameter graphs associated to the switching system and any of its extensions are identical. For each parameter graph node, there is an order-preserving map from the Morse graph of the switching system to the Morse graph of any of its extensions. We provide counterexamples that show why possible stronger relationships between the Morse graphs are not valid.
A Brain-wide Circuit Model of Heat-Evoked Swimming Behavior in Larval Zebrafish.
Haesemeyer, Martin; Robson, Drew N; Li, Jennifer M; Schier, Alexander F; Engert, Florian
2018-05-16
Thermosensation provides crucial information, but how temperature representation is transformed from sensation to behavior is poorly understood. Here, we report a preparation that allows control of heat delivery to zebrafish larvae while monitoring motor output and imaging whole-brain calcium signals, thereby uncovering algorithmic and computational rules that couple dynamics of heat modulation, neural activity and swimming behavior. This approach identifies a critical step in the transformation of temperature representation between the sensory trigeminal ganglia and the hindbrain: A simple sustained trigeminal stimulus representation is transformed into a representation of absolute temperature as well as temperature changes in the hindbrain that explains the observed motor output. An activity constrained dynamic circuit model captures the most prominent aspects of these sensori-motor transformations and predicts both behavior and neural activity in response to novel heat stimuli. These findings provide the first algorithmic description of heat processing from sensory input to behavioral output. Copyright © 2018 Elsevier Inc. All rights reserved.
Beyond mean-field description of Gamow-Teller resonances and β-decay
NASA Astrophysics Data System (ADS)
Niu, Yifei; Colò, Gianluca; Vigezzi, Enrico; Bai, Chunlin; Niu, Zhongming; Sagawa, Hiroyuki
2018-02-01
β-decay half-lives set the time scale of the rapid neutron capture process, and are therefore essential for understanding the origin of heavy elements in the universe. The random-phase approximation (RPA) based on Skyrme energy density functionals is widely used to calculate the properties of Gamow-Teller (GT) transitions, which play a dominant role in β-decay half-lives. However, the RPA model has its limitations in reproducing the resonance width and often overestimates β-decay half-lives. To overcome these problems, effects beyond mean-field can be included on top of the RPA model. In particular, this can be obtained by taking into account the particle-vibration coupling (PVC). Within the RPA+PVC model, we successfully reproduce the experimental GT resonance width and β-decay half-lives in magic nuclei. We then extend the formalism to superfluid nuclei and apply it to the GT resonance in 120Sn, obtaining a good reproduction of the experimental strength distribution. The effect of isoscalar pairing is also discussed.
Global dynamics for switching systems and their extensions by linear differential equations.
Huttinga, Zane; Cummins, Bree; Gedeon, Tomáš; Mischaikow, Konstantin
2018-03-15
Switching systems use piecewise constant nonlinearities to model gene regulatory networks. This choice provides advantages in the analysis of behavior and allows the global description of dynamics in terms of Morse graphs associated to nodes of a parameter graph. The parameter graph captures spatial characteristics of a decomposition of parameter space into domains with identical Morse graphs. However, there are many cellular processes that do not exhibit threshold-like behavior and thus are not well described by a switching system. We consider a class of extensions of switching systems formed by a mixture of switching interactions and chains of variables governed by linear differential equations. We show that the parameter graphs associated to the switching system and any of its extensions are identical. For each parameter graph node, there is an order-preserving map from the Morse graph of the switching system to the Morse graph of any of its extensions. We provide counterexamples that show why possible stronger relationships between the Morse graphs are not valid.
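A minimal sketch of the kind of system described — two genes with piecewise-constant production rates switched at thresholds — follows, with purely hypothetical parameter values. Between threshold crossings the dynamics are affine, which is what makes the global description via Morse graphs tractable.

    import numpy as np

    # Toy two-gene switching (Glass-type) system: gene x is repressed by y,
    # gene y is activated by x. Production is piecewise constant, switched
    # by step functions at thresholds theta; all values are hypothetical.
    def step(s):                       # Heaviside switch
        return 1.0 if s > 0 else 0.0

    def rhs(x, y, L=(0.2, 0.2), U=(1.5, 1.5), theta=(0.5, 0.5), gamma=(1.0, 1.0)):
        px = L[0] + (U[0] - L[0]) * step(theta[1] - y)   # repression by y
        py = L[1] + (U[1] - L[1]) * step(x - theta[0])   # activation by x
        return px - gamma[0] * x, py - gamma[1] * y

    # Forward-Euler trajectory; within each rectangular domain of phase
    # space the vector field is affine with a single attracting focus.
    x, y, dt = 0.1, 0.1, 0.01
    for _ in range(5000):
        dx, dy = rhs(x, y)
        x, y = x + dt * dx, y + dt * dy
    print(f"state after 50 time units: x={x:.3f}, y={y:.3f}")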
Three dimensional shape measurement of wear particle by iterative volume intersection
NASA Astrophysics Data System (ADS)
Wu, Hongkun; Li, Ruowei; Liu, Shilong; Rahman, Md Arifur; Liu, Sanchi; Kwok, Ngaiming; Peng, Zhongxiao
2018-04-01
The morphology of wear particles is a fundamental indicator from which wear-oriented machine health can be assessed. Previous research has shown that thorough measurement of particle shape allows a more reliable explanation of the underlying wear mechanism. However, most current particle measurement techniques focus on extracting the two-dimensional (2-D) morphology, while other critical particle features, including volume and thickness, are not available. Therefore, a three-dimensional (3-D) shape measurement method was developed to enable a more comprehensive description of particle features. The developed method is implemented in three steps: (1) particle profiles in multiple views are captured via a camera mounted above a micro fluid channel; (2) a preliminary reconstruction is accomplished by the shape-from-silhouette approach with the collected particle contours; (3) an iterative re-projection process follows to obtain the final 3-D measurement by minimizing the difference between the original and the re-projected contours. Results from real data are presented, demonstrating the feasibility of the proposed method.
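The sketch below illustrates step (2), shape-from-silhouette by volume intersection, assuming binary silhouettes and orthographic views of a particle rotating about a known axis. The grid size, projection model and test data are assumptions for illustration, not details of the authors' implementation.

    import numpy as np

    def carve(silhouettes, angles, n=64):
        """Shape-from-silhouette by volume intersection (toy sketch).

        silhouettes : list of (n, n) boolean masks, one per view
        angles      : rotation of the particle about the z-axis per view (rad)
        Assumes orthographic projection along the camera's x-axis; a voxel
        survives only if it projects inside every silhouette.
        """
        axis = np.linspace(-1.0, 1.0, n)
        X, Y, Z = np.meshgrid(axis, axis, axis, indexing="ij")
        keep = np.ones((n, n, n), dtype=bool)
        for mask, a in zip(silhouettes, angles):
            # Rotate voxel centers into the camera frame for this view.
            Yr = -np.sin(a) * X + np.cos(a) * Y
            iy = np.clip(((Yr + 1) / 2 * (n - 1)).round().astype(int), 0, n - 1)
            iz = np.clip(((Z + 1) / 2 * (n - 1)).round().astype(int), 0, n - 1)
            keep &= mask[iy, iz]
        return keep

    # Example: carving a sphere from 12 identical circular silhouettes.
    n = 64
    yy, zz = np.meshgrid(np.linspace(-1, 1, n), np.linspace(-1, 1, n), indexing="ij")
    disk = (yy**2 + zz**2) <= 0.5**2
    volume = carve([disk] * 12, np.linspace(0, np.pi, 12, endpoint=False), n=n)
    print(f"carved volume fraction: {volume.mean():.3f}")  # sphere gives ~0.065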
The fluid trampoline: droplets bouncing on a soap film
NASA Astrophysics Data System (ADS)
Bush, John; Gilet, Tristan
2008-11-01
We present the results of a combined experimental and theoretical investigation of droplets falling onto a horizontal soap film. Both static and vertically vibrated soap films are considered. A quasi-static description of the soap film shape yields a force-displacement relation that provides excellent agreement with experiment, and allows us to model the film as a nonlinear spring. This approach yields an accurate criterion for the transition between droplet bouncing and crossing on the static film; moreover, it allows us to rationalize the observed constancy of the contact time and scaling for the coefficient of restitution in the bouncing states. On the vibrating film, a variety of bouncing behaviours were observed, including simple and complex periodic states, multiperiodicity and chaos. A simple theoretical model is developed that captures the essential physics of the bouncing process, reproducing all observed bouncing states. Quantitative agreement between model and experiment is deduced for simple periodic modes, and qualitative agreement for more complex periodic and chaotic bouncing states.
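A minimal sketch of the nonlinear-spring picture follows: a droplet falls under gravity onto a vibrating film that pushes back with a spring force once deflected. The force law and all constants are hypothetical placeholders, not the force-displacement relation fitted in the paper.

    import numpy as np

    # Toy model of a droplet on a vibrating soap film treated as a nonlinear
    # spring (all constants are hypothetical, chosen only for illustration).
    g, m = 9.81, 1e-6            # gravity (m/s^2), droplet mass (kg)
    k1, k3 = 0.05, 5e3           # linear / cubic spring coefficients (N/m, N/m^3)
    c = 2e-5                     # contact damping (kg/s)
    A, f = 2e-4, 40.0            # film vibration amplitude (m) and frequency (Hz)

    def film(t):                 # vertical position of the vibrating film
        return A * np.sin(2 * np.pi * f * t)

    z, v, dt, bounces = 5e-3, 0.0, 1e-5, 0
    in_contact = False
    for step in range(int(1.0 / dt)):           # simulate 1 s
        t = step * dt
        delta = film(t) - z                     # film deflection depth
        F = -m * g
        if delta > 0:                           # droplet deflects the film
            F += k1 * delta + k3 * delta**3 - c * v
            in_contact = True
        elif in_contact:                        # lift-off: count one bounce
            bounces += 1
            in_contact = False
        v += F / m * dt
        z += v * dt
    print(f"bounces in 1 s: {bounces}")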
In situ characterization of the brain-microdevice interface using Device Capture Histology
Woolley, Andrew J.; Desai, Himanshi A.; Steckbeck, Mitchell A.; Patel, Neil K.; Otto, Kevin J.
2011-01-01
Accurate assessment of the bio-integration of brain-implantable microdevices remains a formidable challenge. Prevailing histological methods require device extraction prior to tissue processing, often disrupting and removing the tissue of interest that had surrounded the device. The Device-Capture Histology method, presented here, overcomes many limitations of the conventional Device-Explant Histology method by collecting the device and surrounding tissue intact for subsequent labeling. With the implant remaining in situ, accurate and precise images of the morphologically preserved tissue at the brain/microdevice interface can then be collected and quantified. First, this article presents the Device-Capture Histology method for obtaining and processing the intact, undisturbed microdevice-tissue interface and for imaging it using fluorescent labeling and confocal microscopy. Second, this article gives examples of how to quantify features found in the captured peridevice tissue. We also share histological data capturing 1) the impact of microdevice implantation on tissue, 2) the effects of an experimental anti-inflammatory coating, 3) a dense grouping of cell nuclei encapsulating a long-term implant, and 4) atypical oligodendrocyte organization neighboring a long-term implant. Data sets collected using the Device-Capture Histology method are presented to demonstrate the significant advantages of processing the intact microdevice-tissue interface and to underscore the utility of the method in understanding the effects of brain-implantable microdevices on nearby tissue. PMID:21802446
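One generic way to quantify peridevice tissue, sketched below, is to bin label intensity by distance from the device surface using a distance transform. The bin size, pixel scale and synthetic data are assumptions for illustration, not the authors' analysis pipeline.

    import numpy as np
    from scipy.ndimage import distance_transform_edt

    def intensity_vs_distance(image, device_mask, bin_um=10.0, px_um=1.0, n_bins=8):
        """Mean label intensity as a function of distance from a device surface.

        image       : 2-D fluorescence intensity array
        device_mask : boolean array, True on device pixels
        A generic quantification sketch (bin size and scale are placeholders).
        """
        dist_um = distance_transform_edt(~device_mask) * px_um
        edges = np.arange(0, n_bins + 1) * bin_um
        profile = [image[(dist_um >= lo) & (dist_um < hi)].mean()
                   for lo, hi in zip(edges[:-1], edges[1:])]
        return edges[:-1], np.array(profile)

    # Example with synthetic data: uniform intensity around a shank-like "device".
    rng = np.random.default_rng(0)
    img = rng.poisson(50, size=(200, 200)).astype(float)
    mask = np.zeros((200, 200), dtype=bool)
    mask[90:110, :] = True                      # a shank-like device profile
    d, p = intensity_vs_distance(img, mask)
    print(np.round(p, 1))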
40 CFR 406.60 - Applicability; description of the parboiled rice processing subcategory.
Code of Federal Regulations, 2011 CFR
2011-07-01
... parboiled rice processing subcategory. 406.60 Section 406.60 Protection of Environment ENVIRONMENTAL... Rice Processing Subcategory § 406.60 Applicability; description of the parboiled rice processing... rice is cleaned, cooked and dried before being milled. ...
Capturing the radical ion-pair intermediate in DNA guanine oxidation
Jie, Jialong; Liu, Kunhui; Wu, Lidan; Zhao, Hongmei; Song, Di; Su, Hongmei
2017-01-01
Although the radical ion pair has been frequently invoked as a key intermediate in DNA oxidative damage reactions and photoinduced electron transfer processes, the unambiguous detection and characterization of this species remain formidable and unresolved due to its extremely unstable nature and low concentration. Our strategy is that, at cryogenic temperatures, the transient species can be sufficiently stabilized to become spectroscopically detectable. By coupling these two techniques (cryogenic stabilization and time-resolved laser flash photolysis spectroscopy), we are able to capture the ion-pair transient G+•⋯Cl− in the chlorine radical-initiated oxidation of DNA guanine (G) and provide direct evidence for the intricate addition/charge-separation mechanism underlying guanine oxidation. The unique spectral signature of the radical ion pair G+•⋯Cl− is identified, revealing a markedly intense absorption feature peaking at 570 nm that is distinct from that of G+• alone. Moreover, the ion-pair spectrum is found to be highly sensitive to the protonation equilibria within the guanine-cytosine base pair (G:C), splitting into two resolved bands at 480 and 610 nm as the acidic proton transfers along the central hydrogen bond from G+• to C. We thus use this exquisite sensitivity to track intrabase-pair proton transfer dynamics in double-stranded DNA oligonucleotides, which is of critical importance for the description of proton-coupled charge transfer mechanisms in DNA. PMID:28630924
Practical modeling approaches for geological storage of carbon dioxide.
Celia, Michael A; Nordbotten, Jan M
2009-01-01
The relentless increase of anthropogenic carbon dioxide emissions and the associated concerns about climate change have motivated new ideas about carbon-constrained energy production. One technological approach to controlling carbon dioxide emissions is carbon capture and storage, or CCS. The underlying idea of CCS is to capture the carbon before it is emitted to the atmosphere and to store it somewhere other than the atmosphere. Currently, the most attractive option for large-scale storage is in deep geological formations, including deep saline aquifers. Many physical and chemical processes can affect the fate of the injected CO2, and the overall mathematical description of the complete system becomes very complex. Our approach to the problem has been to reduce complexity as much as possible, so that we can focus on the few truly important questions about the injected CO2, most of which involve leakage out of the injection formation. Toward this end, we have established a set of simplifying assumptions that allow us to derive simplified models, which can be solved numerically or, for the most simplified cases, analytically. These simplified models allow calculation of solutions to large-scale injection and leakage problems in ways that traditional multicomponent multiphase simulators cannot. Such simplified models provide important tools for system analysis, screening calculations, and overall risk-assessment calculations. We believe this is a practical and important approach to modeling geological storage of carbon dioxide. It also serves as an example of how complex systems can be simplified while retaining the essential physics of the problem.
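For scale, a back-of-the-envelope mass-balance estimate in the same simplifying spirit (a toy calculation, not the vertically integrated solutions discussed above) gives the footprint of an idealized cylindrical plume:

    import numpy as np

    # Idealized CO2 plume footprint under a caprock, assuming the plume fills
    # a cylinder spanning the formation thickness b (sharp-interface toy; the
    # real vertically integrated solutions predict a wedge-shaped interface).
    # All parameter values below are illustrative assumptions.
    Q_mass = 1.0e6 * 1e3 / 3.156e7   # 1 Mt/yr injection rate, in kg/s
    rho_co2 = 700.0                  # supercritical CO2 density (kg/m^3)
    phi, b, s_co2 = 0.15, 50.0, 0.6  # porosity, thickness (m), CO2 saturation

    def plume_radius(t_years):
        """Radius (m) of an idealized cylindrical plume after t_years."""
        V = Q_mass / rho_co2 * t_years * 3.156e7     # injected volume (m^3)
        return np.sqrt(V / (np.pi * phi * b * s_co2))

    for t in (1, 10, 30):
        print(f"after {t:2d} yr: r = {plume_radius(t)/1000:.2f} km")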
Laube, Inga; Matthews, Natasha; Dean, Angela J.; O’Connell, Redmond G.; Mattingley, Jason B.; Bellgrove, Mark A.
2017-01-01
Limited resources for the in-depth processing of external stimuli make it necessary to select only relevant information from our surroundings and to ignore irrelevant stimuli. Attentional mechanisms facilitate this selection via top-down modulation of stimulus representations in the brain. Previous research has indicated that acetylcholine (ACh) modulates this influence of attention on stimulus processing. However, the role of muscarinic receptors, as well as the specific mechanism of cholinergic modulation, remains unclear. Here we investigated the influence of ACh on feature-based, top-down control of stimulus processing via muscarinic receptors, using a contingent capture paradigm that specifically tests attentional shifts toward uninformative cue stimuli displaying one of the target-defining features. In a double-blind, placebo-controlled study, we measured the impact of the muscarinic receptor antagonist scopolamine on behavioral and electrophysiological measures of contingent attentional capture. The results demonstrated all the signs of functional contingent capture, i.e., attentional shifts toward cued locations reflected in increased amplitudes of the N1 and N2pc components, under placebo conditions. However, scopolamine did not affect behavioral or electrophysiological measures of contingent capture. Instead, scopolamine reduced the amplitude of the distractor-evoked Pd component, which has recently been associated with active suppression of irrelevant distractor information. The findings suggest a general cholinergic modulation of top-down control during distractor processing. PMID:29270112
40 CFR 409.10 - Applicability; description of the beet sugar processing subcategory.
Code of Federal Regulations, 2010 CFR
2010-07-01
... sugar processing subcategory. 409.10 Section 409.10 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS SUGAR PROCESSING POINT SOURCE CATEGORY Beet Sugar Processing Subcategory § 409.10 Applicability; description of the beet sugar processing subcategory. The...
40 CFR 409.10 - Applicability; description of the beet sugar processing subcategory.
Code of Federal Regulations, 2011 CFR
2011-07-01
... sugar processing subcategory. 409.10 Section 409.10 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS SUGAR PROCESSING POINT SOURCE CATEGORY Beet Sugar Processing Subcategory § 409.10 Applicability; description of the beet sugar processing subcategory. The...
40 CFR 409.10 - Applicability; description of the beet sugar processing subcategory.
Code of Federal Regulations, 2013 CFR
2013-07-01
... sugar processing subcategory. 409.10 Section 409.10 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS SUGAR PROCESSING POINT SOURCE CATEGORY Beet Sugar Processing Subcategory § 409.10 Applicability; description of the beet sugar processing subcategory. The...
40 CFR 409.10 - Applicability; description of the beet sugar processing subcategory.
Code of Federal Regulations, 2012 CFR
2012-07-01
... sugar processing subcategory. 409.10 Section 409.10 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS SUGAR PROCESSING POINT SOURCE CATEGORY Beet Sugar Processing Subcategory § 409.10 Applicability; description of the beet sugar processing subcategory. The...
40 CFR 409.10 - Applicability; description of the beet sugar processing subcategory.
Code of Federal Regulations, 2014 CFR
2014-07-01
... sugar processing subcategory. 409.10 Section 409.10 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS SUGAR PROCESSING POINT SOURCE CATEGORY Beet Sugar Processing Subcategory § 409.10 Applicability; description of the beet sugar processing subcategory. The...
40 CFR 408.170 - Applicability; description of the Alaskan mechanized salmon processing subcategory.
Code of Federal Regulations, 2010 CFR
2010-07-01
... Alaskan mechanized salmon processing subcategory. 408.170 Section 408.170 Protection of Environment... PROCESSING POINT SOURCE CATEGORY Alaskan Mechanized Salmon Processing Subcategory § 408.170 Applicability; description of the Alaskan mechanized salmon processing subcategory. The provisions of this subpart are...
40 CFR 408.170 - Applicability; description of the Alaskan mechanized salmon processing subcategory.
Code of Federal Regulations, 2011 CFR
2011-07-01
... Alaskan mechanized salmon processing subcategory. 408.170 Section 408.170 Protection of Environment... PROCESSING POINT SOURCE CATEGORY Alaskan Mechanized Salmon Processing Subcategory § 408.170 Applicability; description of the Alaskan mechanized salmon processing subcategory. The provisions of this subpart are...
An integrative solution for managing, tracing and citing sensor-related information
NASA Astrophysics Data System (ADS)
Koppe, Roland; Gerchow, Peter; Macario, Ana; Schewe, Ingo; Rehmcke, Steven; Düde, Tobias
2017-04-01
In a data-driven scientific world, the need to capture information on the sensors used in the data acquisition process has become increasingly important. Following the recommendations of the Open Geospatial Consortium (OGC), we started by adopting the SensorML standard for describing platforms, devices and sensors. However, it soon became obvious to us that understanding, implementing and filling such standards costs significant effort and cannot be expected from every scientist individually. We therefore developed a web-based sensor management solution (https://sensor.awi.de) for describing platforms, devices and sensors as a hierarchy of systems, which supports tracing changes to a system while hiding complexity. Each platform contains devices, and each device can have sensors associated with specific identifiers, contacts, events, related online resources (e.g. manufacturer factsheets, calibration documentation, data processing documentation), sensor output parameters and geo-locations. In order to better understand and address real-world requirements, we have interacted closely with field-going scientists in the context of the key national infrastructure project "FRontiers in Arctic marine Monitoring ocean observatory" (FRAM) during the software development. We learned that not only the lineage of observations is crucial for scientists, but also, for example, alert services using value ranges, flexible output formats and information on data providers (e.g. FTP sources). Most importantly, persistent and citable versions of sensor descriptions are required for traceability and reproducibility, allowing seamless integration with existing information systems, e.g. PANGAEA. Within the context of the EU-funded Ocean Data Interoperability Platform project (ODIP II) and in cooperation with 52north, we are providing near real-time data via Sensor Observation Services (SOS) along with sensor descriptions based on our sensor management solution. ODIP II also aims to develop a harmonized SensorML profile for the marine community, which we will adopt in our solution as soon as it is available. In this presentation we will show our sensor management solution, which is embedded in our data flow framework to offer out-of-the-box interoperability with existing information systems and standards. In addition, we will present real-world examples and challenges related to the description and traceability of sensor metadata.
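The hierarchy-of-systems idea can be pictured as a small nested data model, sketched below. The field names are hypothetical illustrations and do not reflect the actual sensor.awi.de schema or SensorML elements.

    from dataclasses import dataclass, field
    from typing import List

    # Sketch of the "hierarchy of systems" idea described above: platforms
    # contain devices, devices carry sensors, and each item records events
    # (e.g. calibrations) and related online resources.
    @dataclass
    class Event:
        date: str
        kind: str            # e.g. "calibration", "deployment"
        description: str = ""

    @dataclass
    class Sensor:
        identifier: str      # persistent, citable ID for traceability
        parameter: str       # measured output, e.g. "sea water temperature"
        events: List[Event] = field(default_factory=list)
        resources: List[str] = field(default_factory=list)   # URLs to docs

    @dataclass
    class Device:
        identifier: str
        sensors: List[Sensor] = field(default_factory=list)

    @dataclass
    class Platform:
        identifier: str
        devices: List[Device] = field(default_factory=list)

    ctd = Device("CTD-001", sensors=[
        Sensor("CTD-001:temp", "sea water temperature",
               events=[Event("2017-01-10", "calibration")])])
    mooring = Platform("FRAM-mooring-F4", devices=[ctd])
    print(mooring.devices[0].sensors[0].parameter)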
THE INTERMEDIATE NEUTRON-CAPTURE PROCESS AND CARBON-ENHANCED METAL-POOR STARS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hampel, Melanie; Stancliffe, Richard J.; Lugaro, Maria
Carbon-enhanced metal-poor (CEMP) stars in the Galactic Halo display enrichments in heavy elements associated with either the s (slow) or the r (rapid) neutron-capture process (e.g., barium and europium, respectively), and in some cases they display evidence of both. The abundance patterns of these CEMP-s/r stars, which show both Ba and Eu enrichment, are particularly puzzling, since the s and the r processes require neutron densities that are more than ten orders of magnitude apart and, hence, are thought to occur in very different stellar sites with very different physical conditions. We investigate whether the abundance patterns of CEMP-s/r stars can arise from the nucleosynthesis of the intermediate neutron-capture process (the i process), which is characterized by neutron densities between those of the s and the r processes. Using nuclear network calculations, we study neutron capture nucleosynthesis at different constant neutron densities n ranging from 10^7 to 10^15 cm^-3. With respect to the classical s process resulting from neutron densities on the lowest side of this range, neutron densities on the highest side result in abundance patterns which show an increased production of heavy s-process and r-process elements, but similar abundances of the light s-process elements. Such high values of n may occur in the thermal pulses of asymptotic giant branch stars due to proton ingestion episodes. Comparison to the surface abundances of 20 CEMP-s/r stars shows that our modeled i-process abundances successfully reproduce observed abundance patterns, which could not be previously explained by s-process nucleosynthesis. Because the i-process models fit the abundances of CEMP-s/r stars so well, we propose that this class should be renamed as CEMP-i.
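The neutron-density dependence can be illustrated with a toy one-chain capture network at constant n, sketched below. The uniform capture cross section, the single β-decay sink and all rate values are hypothetical, far simpler than the full nuclear network used in the study.

    import numpy as np

    def capture_chain(n_n, t=1.0e6, steps=20000, n_iso=30,
                      sigma_v=1.0e-17, lam_beta=1.0e-6):
        """Toy constant-density capture chain (all rates hypothetical).

        dY_k/dt = lam_c*(Y_{k-1} - Y_k) - lam_beta*Y_k, lam_c = n_n*<sigma v>;
        the heaviest isotope acts as a sink. n_n in cm^-3, <sigma v> in
        cm^3/s, t in s.
        """
        Y = np.zeros(n_iso)
        Y[0] = 1.0
        lam_c = n_n * sigma_v
        dt = t / steps
        for _ in range(steps):
            capt = lam_c * Y
            capt[-1] = 0.0                       # heaviest isotope: no capture out
            inflow = np.concatenate(([0.0], capt[:-1]))
            Y += dt * (inflow - capt - lam_beta * Y)
        return Y

    # Higher neutron density pushes material toward heavier chain positions.
    for n_n in (1e7, 1e12, 1e15):
        Y = capture_chain(n_n)
        print(f"n_n = {n_n:.0e} cm^-3 -> most abundant chain position: {np.argmax(Y)}")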
Design Evaluation for Personnel, Training and Human Factors (DEPTH) Final Report.
1998-01-17
human activity was primarily intended to facilitate man-machine design analyses of complex systems. By importing computer aided design (CAD) data, the human figure models and analysis algorithms can help to ensure components can be seen, reached, lifted and removed by most maintainers. These simulations are also useful for logistics data capture, training, and task analysis. DEPTH was also found to be useful in obtaining task descriptions for technical
Optical holography applications for the zero-g Atmospheric Cloud Physics Laboratory
NASA Technical Reports Server (NTRS)
Kurtz, R. L.
1974-01-01
A complete description of holography is provided, both for the time-dependent case of moving-scene holography and for the time-independent case of stationary holography. Further, a specific holographic arrangement is proposed for application to the detection of particle size distribution in an atmospheric simulation cloud chamber. In this chamber, particle growth rate is investigated; the proposed holographic system must therefore capture continuous particle motion in real time. Such a system is described.
Steinebach, Fabian; Müller-Späth, Thomas; Morbidelli, Massimo
2016-09-01
The economic advantages of continuous processing of biopharmaceuticals, which include smaller equipment and faster, more efficient processes, have increased interest in this technology over the past decade. Continuous processes can also improve quality assurance and enable greater controllability, consistent with the quality initiatives of the FDA. Here, we discuss different continuous multi-column chromatography processes. Differences between the capture and polishing steps result in two different types of continuous processes that employ counter-current column movement. Continuous capture processes are associated with increased productivity per cycle and decreased buffer consumption, whereas the typical purity-yield trade-off of classical batch chromatography can be surmounted by continuous processes for polishing applications. In the context of continuous manufacturing, different but complementary chromatographic columns or devices are typically combined to improve overall process performance and avoid unnecessary product storage. These various processes, their performance compared with batch processing, and the resulting product quality are discussed based on a review of the literature. Drawing on various application examples, primarily monoclonal antibody production processes, conclusions are drawn about the future of these continuous-manufacturing technologies.
Designing and Demonstrating a Master Student Project to Explore Carbon Dioxide Capture Technology
ERIC Educational Resources Information Center
Asherman, Florine; Cabot, Gilles; Crua, Cyril; Estel, Lionel; Gagnepain, Charlotte; Lecerf, Thibault; Ledoux, Alain; Leveneur, Sebastien; Lucereau, Marie; Maucorps, Sarah; Ragot, Melanie; Syrykh, Julie; Vige, Manon
2016-01-01
The rise in carbon dioxide (CO2) concentration in the Earth's atmosphere, and the associated strengthening of the greenhouse effect, requires the development of low-carbon technologies. New carbon capture processes are being developed to remove CO2 that would otherwise be emitted from industrial processes and fossil fuel…
Carbon dioxide capture from a cement manufacturing process
Blount, Gerald C. (North Augusta, SC); Falta, Ronald W. (Seneca, SC); Siddall, Alvin A. (Aiken, SC)
2011-07-12
A process of manufacturing cement clinker is provided in which a clean supply of CO2 gas may be captured. The process also involves using an open-loop conversion of CaO/MgO from a calciner to capture CO2 from combustion flue gases, thereby forming CaCO3/CaMg(CO3)2. The CaCO3/CaMg(CO3)2 is then returned to the calciner, where CO2 gas is evolved. The evolved CO2 gas, along with the other CO2 gases evolved in the calciner, is removed from the calciner. The reactants (CaO/MgO) are fed to a high-temperature calciner for control of the clinker production composition.
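The stoichiometry behind the calcination loop is plain textbook arithmetic (not figures from the patent): calcining CaCO3 releases CO2 and regenerates CaO in fixed mass ratios.

    # Stoichiometry of calcination, CaCO3 -> CaO + CO2 (textbook arithmetic,
    # not figures from the patent): mass of CO2 evolved per tonne of CaCO3.
    M_CaCO3, M_CaO, M_CO2 = 100.09, 56.08, 44.01    # molar masses, g/mol

    per_tonne_caco3 = M_CO2 / M_CaCO3 * 1000.0      # kg CO2 per t CaCO3
    cao_yield = M_CaO / M_CaCO3 * 1000.0            # kg CaO per t CaCO3
    print(f"CO2 evolved : {per_tonne_caco3:.0f} kg per tonne CaCO3 calcined")
    print(f"CaO produced: {cao_yield:.0f} kg per tonne CaCO3 calcined")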
Correlation of Descriptive Analysis and Instrumental Puncture Testing of Watermelon Cultivars.
Shiu, J W; Slaughter, D C; Boyden, L E; Barrett, D M
2016-06-01
The textural properties of 5 seedless watermelon cultivars were assessed by descriptive analysis and a standard puncture test using a hollow probe with increased shearing properties. Descriptive analysis methodology was an effective means of quantifying watermelon sensory texture profiles and characterizing specific cultivars' characteristics. Of the 10 cultivars screened, 71% of the variation in the sensory attributes was explained by the first two principal components. Pairwise correlation of the hollow puncture probe and sensory parameters determined that the initial slope, maximum force, and work after maximum force measurements all correlated well with the sensory attributes crisp and firm. These findings confirm that maximum force correlates well not only with firmness in watermelon but with crispness as well. The initial slope parameter also captures the sensory crispness of watermelon but is not as practical to measure in the field as maximum force. The work after maximum force parameter is thought to reflect cellular arrangement and membrane integrity, which in turn impact sensory firmness and crispness. Watermelon cultivar types were correctly predicted by puncture test measurements in heart tissue 87% of the time, whereas descriptive analysis was correct 54% of the time.
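The three instrumental parameters named above can be read off a force-displacement curve as sketched below, using synthetic data; the fitting window and units are placeholders rather than the study's protocol.

    import numpy as np

    def puncture_parameters(disp_mm, force_N, slope_window=5):
        """Extract initial slope, maximum force, and work after maximum force
        from a puncture force-displacement curve (window size is a placeholder).
        """
        slope = np.polyfit(disp_mm[:slope_window], force_N[:slope_window], 1)[0]
        i_max = int(np.argmax(force_N))
        f_max = force_N[i_max]
        # Trapezoidal integration of force over displacement past the peak.
        seg_f, seg_d = force_N[i_max:], disp_mm[i_max:]
        work_after = float(np.sum(0.5 * (seg_f[:-1] + seg_f[1:]) * np.diff(seg_d)))
        return slope, f_max, work_after

    # Synthetic curve: linear rise to a peak, then decay after tissue failure.
    d = np.linspace(0, 10, 101)
    f = np.where(d <= 4, 1.5 * d, 6.0 * np.exp(-(d - 4) / 3))
    s, fm, w = puncture_parameters(d, f)
    print(f"initial slope {s:.2f} N/mm, max force {fm:.2f} N, "
          f"work after max {w:.2f} N*mm")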