Sample records for description capture method

  1. Research accomplished at the Knowledge Based Systems Lab: IDEF3, version 1.0

    NASA Technical Reports Server (NTRS)

    Mayer, Richard J.; Menzel, Christopher P.; Mayer, Paula S. D.

    1991-01-01

    An overview is presented of the foundations and content of the evolving IDEF3 process flow and object state description capture method, which is currently in beta test. Ongoing efforts to formulate formal semantics models for descriptions captured in this form, and to apply the method in practice, can be expected to cause the method language to evolve. A language is described for the representation of process- and object-state-centered system descriptions. IDEF3 is a scenario-driven process flow modeling methodology created specifically for these kinds of descriptive activities.

  2. Averaging, passage through resonances, and capture into resonance in two-frequency systems

    NASA Astrophysics Data System (ADS)

    Neishtadt, A. I.

    2014-10-01

    Applying small perturbations to an integrable system leads to its slow evolution. For an approximate description of this evolution the classical averaging method prescribes averaging the rate of evolution over all the phases of the unperturbed motion. This simple recipe does not always produce correct results, because of resonances arising in the process of evolution. The phenomenon of capture into resonance consists in the system starting to evolve in such a way as to preserve the resonance property once it has arisen. This paper is concerned with application of the averaging method to a description of evolution in two-frequency systems. It is assumed that the trajectories of the averaged system intersect transversally the level surfaces of the frequency ratio and that certain other conditions of general position are satisfied. The rate of evolution is characterized by a small parameter $\varepsilon$. The main content of the paper is a proof of the following result: outside a set of initial data with measure of order $\sqrt{\varepsilon}$ the averaging method describes the evolution to within $O(\sqrt{\varepsilon}\,|\ln\varepsilon|)$ for periods of time of order $1/\varepsilon$. This estimate is sharp. The exceptional set of measure $\sqrt{\varepsilon}$ contains the initial data for phase points captured into resonance. A description of the motion of such phase points is given, along with a survey of related results on averaging. Examples of capture into resonance are presented for some problems in the dynamics of charged particles. Several open problems are stated. Bibliography: 65 titles.
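
    The averaging prescription and the accuracy estimate quoted in this abstract can be stated compactly. The following is a standard two-frequency formulation in illustrative notation, not taken verbatim from the paper:

```latex
% Perturbed system: slow variables x, two fast phases \varphi = (\varphi_1, \varphi_2)
\dot{x} = \varepsilon f(x, \varphi), \qquad
\dot{\varphi} = \omega(x) + \varepsilon g(x, \varphi), \qquad
x \in \mathbb{R}^n, \; \varphi \in \mathbb{T}^2 .

% Averaging over the phases of the unperturbed motion gives the averaged system:
\dot{X} = \varepsilon F(X), \qquad
F(X) = \frac{1}{(2\pi)^2} \int_{\mathbb{T}^2} f(X, \varphi) \, d\varphi .

% The main result: outside an exceptional set of initial data of measure
% O(\sqrt{\varepsilon}) (containing the points captured into resonance),
|x(t) - X(t)| = O\big(\sqrt{\varepsilon} \, |\ln \varepsilon|\big)
\quad \text{for times } t \text{ of order } 1/\varepsilon .
```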

  3. The adaptation of GDL motion recognition system to sport and rehabilitation techniques analysis.

    PubMed

    Hachaj, Tomasz; Ogiela, Marek R

    2016-06-01

    The main novelty of this paper is the adaptation of the Gesture Description Language (GDL) methodology to the analysis and classification of sport and rehabilitation data. We show that the Lua language can be successfully used to adapt the GDL classifier to these tasks. The newly applied scripting language allows easy extension and integration of the classifier with other software technologies and applications. The execution speed obtained allows the methodology to be used in real-time motion capture data processing, where the capture frequency ranges from 100 Hz up to 500 Hz depending on the number of features or classes to be calculated and recognized. The proposed methodology can therefore be used with high-end motion capture systems. We anticipate that this novel, efficient, and effective method will greatly help both sports trainers and physiotherapists in their practice. The proposed approach can be directly applied to kinematic analysis of motion capture data (evaluation of motion without regard to the forces that cause it). The ability to apply pattern recognition methods to GDL descriptions can be utilized in virtual reality environments for sport training or rehabilitation treatment.

  4. THE RADIATIVE NEUTRON CAPTURE ON 2H, 6Li, 7Li, 12C AND 13C AT ASTROPHYSICAL ENERGIES

    NASA Astrophysics Data System (ADS)

    Dubovichenko, Sergey; Dzhazairov-Kakhramanov, Albert; Burkova, Natalia

    2013-05-01

    The continued interest in the study of radiative neutron capture on atomic nuclei is due, on the one hand, to the important role this process plays in the analysis of many fundamental properties of nuclei and nuclear reactions and, on the other hand, to the wide use of capture cross-section data in various applications of nuclear physics and nuclear astrophysics, as well as to the importance of the analysis of primordial nucleosynthesis in the Universe. This paper is devoted to results for radiative neutron capture on certain light atomic nuclei at thermal and astrophysical energies. These processes are considered within the framework of the potential cluster model (PCM), a general description of which was given earlier. The use of intercluster potentials obtained from phase shift analysis is demonstrated in calculations of the radiative capture characteristics. The capture reactions considered are not part of stellar thermonuclear cycles, but are involved in the basic reaction chain of primordial nucleosynthesis during the formation of the Universe.

  5. Multimodal RNA-seq using single-strand, double-strand, and CircLigase-based capture yields a refined and extended description of the C. elegans transcriptome.

    PubMed

    Lamm, Ayelet T; Stadler, Michael R; Zhang, Huibin; Gent, Jonathan I; Fire, Andrew Z

    2011-02-01

    We have used a combination of three high-throughput RNA capture and sequencing methods to refine and augment the transcriptome map of a well-studied genetic model, Caenorhabditis elegans. The three methods include a standard (non-directional) library preparation protocol relying on cDNA priming and foldback that has been used in several previous studies for transcriptome characterization in this species, and two directional protocols, one involving direct capture of single-stranded RNA fragments and one involving circular-template PCR (CircLigase). We find that each RNA-seq approach shows specific limitations and biases, with the application of multiple methods providing a more complete map than was obtained from any single method. Of particular note in the analysis were substantial advantages of CircLigase-based and ssRNA-based capture for defining sequences and structures of the precise 5' ends (which were lost using the double-strand cDNA capture method). Of the three methods, ssRNA capture was most effective in defining sequences to the poly(A) junction. Using data sets from a spectrum of C. elegans strains and stages and the UCSC Genome Browser, we provide a series of tools, which facilitate rapid visualization and assignment of gene structures.

  6. Shock compression modeling of metallic single crystals: comparison of finite difference, steady wave, and analytical solutions

    DOE PAGES

    Lloyd, Jeffrey T.; Clayton, John D.; Austin, Ryan A.; ...

    2015-07-10

    Background: The shock response of metallic single crystals can be captured using a micro-mechanical description of the thermoelastic-viscoplastic material response; however, using such a description within the context of traditional numerical methods may introduce physical artifacts. Advantages and disadvantages of complex material descriptions, in particular the viscoplastic response, must be framed within the approximations introduced by numerical methods. Methods: Three methods of modeling the shock response of metallic single crystals are summarized: finite difference simulations, steady wave simulations, and algebraic solutions of the Rankine-Hugoniot jump conditions. For the former two numerical techniques, a dislocation-density-based framework describes the rate- and temperature-dependent shear strength on each slip system. For the latter analytical technique, a simple (two-parameter) rate- and temperature-independent linear hardening description is necessarily invoked to enable simultaneous solution of the governing equations. For all models, the same nonlinear thermoelastic energy potential incorporating elastic constants up to order 3 is applied. Results: Solutions are compared for plate impact of highly symmetric orientations (all three methods) and low-symmetry orientations (numerical methods only) of aluminum single crystals shocked to 5 GPa (weak shock regime) and 25 GPa (overdriven regime). Conclusions: For weak shocks, results of the two numerical methods are very similar, regardless of crystallographic orientation. For strong shocks, artificial viscosity affects the finite difference solution, and effects of transverse waves for the lower-symmetry orientations, not captured by the steady wave method, become important. The analytical solution, which can only be applied to highly symmetric orientations, provides reasonable accuracy in predicting most variables in the final shocked state but, by construction, does not provide insight into the shock structure afforded by the numerical methods.
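
    For reference, the Rankine-Hugoniot jump conditions that the analytical method solves algebraically take the standard form below (unshocked state $\rho_0, P_0, E_0$; shock velocity $U_s$; particle velocity $u_p$, with material at rest ahead of the shock). The paper closes these equations with the crystal's thermoelastic potential and the two-parameter hardening law rather than an empirical equation of state:

```latex
% Conservation across a steady planar shock:
\rho_0 \, U_s = \rho \, (U_s - u_p)                % mass
P - P_0 = \rho_0 \, U_s \, u_p                     % momentum
E - E_0 = \tfrac{1}{2} \, (P + P_0)
          \left( \frac{1}{\rho_0} - \frac{1}{\rho} \right)   % energy
```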

  7. SSHAC Workshop 1 - November 15-18, 2010 | NGA East

    Science.gov Websites

    (2:00-4:30) Methods for finite fault simulations. 2:00-3:00 Methods considered: description of selected methods (Yuehua Zeng; Sim WG); 3:00-3:30 Discussion; 3:30-3:45 Break; 3:45-4:00 Discussion: capture of representative methods; 4:00-4:30 Summary of today's key issues (Norman Abrahamson); 4:30-5:00

  8. New teaching aid “Physical Methods of Medical Introscopy”

    NASA Astrophysics Data System (ADS)

    Ulin, S. E.

    2017-01-01

    A new teaching aid is described that presents methods for the reconstruction of hidden images by means of nuclear magnetic resonance, X- and gamma-ray, and ultrasonic tomography. Methods for the diagnostics and therapy of various oncological diseases using medical proton and ion beams, as well as neutron capture therapy, are also considered. The new teaching aid is intended for senior students and postgraduates.

  9. Spectral method for a kinetic swarming model

    DOE PAGES

    Gamba, Irene M.; Haack, Jeffrey R.; Motsch, Sebastien

    2015-04-28

    Here we present the first numerical method for a kinetic description of the Vicsek swarming model. The kinetic model poses a unique challenge, as there is a distribution dependent collision invariant to satisfy when computing the interaction term. We use a spectral representation linked with a discrete constrained optimization to compute these interactions. To test the numerical scheme we investigate the kinetic model at different scales and compare the solution with the microscopic and macroscopic descriptions of the Vicsek model. Lastly, we observe that the kinetic model captures key features such as vortex formation and traveling waves.

  10. Compound-nuclear Reactions with Unstable Isotopes: Constraining Capture Cross Sections with Indirect Data and Theory

    NASA Astrophysics Data System (ADS)

    Escher, Jutta

    2016-09-01

    Cross sections for compound-nuclear reactions involving unstable targets are important for many applications, but often cannot be measured directly. Several indirect methods have recently been proposed to determine neutron capture cross sections for unstable isotopes. These methods aim at constraining statistical calculations of capture cross sections with data obtained from the decay of the compound nucleus relevant to the desired reaction. Each method produces this compound nucleus in a different manner (via a light-ion reaction, a photon-induced reaction, or β decay) and requires additional ingredients to yield the sought-after cross section. This contribution focuses on the process of determining capture cross sections from inelastic scattering and transfer experiments. Specifically, theoretical descriptions of the (p,d) transfer reaction have been developed to complement recent measurements in the Zr-Y region. The procedure for obtaining constraints on unknown capture cross sections is illustrated. The main advantages and challenges of this approach are compared to those of the proposed alternatives. This work is performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.

  11. Information Integration for Concurrent Engineering (IICE) IDEF3 Process Description Capture Method Report

    DTIC Science & Technology

    1995-09-01

    vital processes of a business. process, IDEF, method, methodology, modeling, knowledge acquisition, requirements definition, information systems... knowledge resources. Like manpower, materials, and machines, information and knowledge assets are recognized as vital resources that can be leveraged to...integrated enterprise. These technologies are designed to leverage information and knowledge resources as the key enablers for high quality systems

  12. IDEF3 formalization report

    NASA Technical Reports Server (NTRS)

    Menzel, Christopher; Mayer, Richard J.; Edwards, Douglas D.

    1991-01-01

    The Process Description Capture Method (IDEF3) is one of several Integrated Computer-Aided Manufacturing (ICAM) DEFinition methods developed by the Air Force to support systems engineering activities, and in particular, to support information systems development. These methods have evolved as a distillation of 'good practice' experience by information system developers and are designed to raise the performance level of the novice practitioner to one comparable with that of an expert. IDEF3 is meant to serve as a knowledge acquisition and requirements definition tool that structures the user's understanding of how a given process, event, or system works around process descriptions. A special purpose graphical language accompanying the method serves to highlight temporal precedence and causality relationships relative to the process or event being described.

  13. Real-time look-up table-based color correction for still image stabilization of digital cameras without using frame memory

    NASA Astrophysics Data System (ADS)

    Luo, Lin-Bo; An, Sang-Woo; Wang, Chang-Shuai; Li, Ying-Chun; Chong, Jong-Wha

    2012-09-01

    Digital cameras usually decrease exposure time to capture motion-blur-free images. However, this operation generates an under-exposed image with a low-budget complementary metal-oxide semiconductor image sensor (CIS). Conventional color correction algorithms can efficiently correct under-exposed images; however, they are generally not performed in real time and need at least one frame memory if implemented in hardware. The authors propose a real-time look-up table-based color correction method that corrects under-exposed images in hardware without using frame memory. The method utilizes histogram matching of two preview images, exposed for a long and a short time, respectively, to construct an improved look-up table (ILUT) and then corrects the captured under-exposed image in real time. Because the ILUT is calculated in real time before processing the captured image, the method does not require frame memory to buffer image data, and therefore greatly reduces the cost of the CIS. The method supports not only single image capture but also bracketing, capturing three images at a time. The proposed method was implemented in a hardware description language and verified on a field-programmable gate array with a 5 M CIS. Simulations show that the system performs in real time at low cost and corrects the color of under-exposed images well.
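
    The histogram-matching construction behind the look-up table can be sketched in a few lines. The following is a minimal NumPy illustration under the assumption of 8-bit channels, not the authors' hardware implementation, and the function names are hypothetical:

```python
import numpy as np

def build_lut(short_ch, long_ch):
    """Build a 256-entry LUT mapping the short-exposure channel's
    histogram onto the long-exposure channel's histogram."""
    hist_s = np.bincount(short_ch.ravel(), minlength=256).astype(float)
    hist_l = np.bincount(long_ch.ravel(), minlength=256).astype(float)
    cdf_s = np.cumsum(hist_s) / hist_s.sum()
    cdf_l = np.cumsum(hist_l) / hist_l.sum()
    # For each input level, pick the output level whose CDF first reaches it
    return np.searchsorted(cdf_l, cdf_s).clip(0, 255).astype(np.uint8)

def correct(image, luts):
    """Apply one LUT per color channel to the captured under-exposed image."""
    return np.stack([luts[c][image[..., c]] for c in range(3)], axis=-1)
```

    In the paper's scheme the two preview exposures play the roles of `short_ch` and `long_ch`, and the LUT is built before the full-resolution capture arrives, which is what removes the need for a frame buffer.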

  14. An extended Lagrangian method for subsonic flows

    NASA Technical Reports Server (NTRS)

    Liou, Meng-Sing; Loh, Ching Y.

    1992-01-01

    It is well known that fluid motion can be specified by either the Eulerian or the Lagrangian description. Most Computational Fluid Dynamics (CFD) developments over the last three decades have been based on the Eulerian description, and considerable progress has been made. In particular, the upwind methods, inspired and guided by the work of Godunov, have met with many successes in dealing with complex flows, especially where discontinuities exist. However, this shock-capturing property has proven to be accurate only when the discontinuity is aligned with one of the grid lines, since most upwind methods are strictly formulated in a 1-D framework and only formally extended to multiple dimensions. Consequently, the attractive property of crisp resolution of these discontinuities is lost, and research on genuinely multi-dimensional approaches has only recently been undertaken by several leading researchers. Nevertheless, these are still based on the Eulerian description.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruiz, Steven Adriel

    The following discussion contains a high-level description of the methods used to implement software for data processing. It describes the directory structures and file handling required to use Excel's Visual Basic for Applications programming language, and how to identify shot, test, and capture types to process data appropriately. It also describes how to interface with the software.

  16. Profiling a Mind Map User: A Descriptive Appraisal

    ERIC Educational Resources Information Center

    Tucker, Joanne M.; Armstrong, Gary R.; Massad, Victor J.

    2010-01-01

    Whether manually or through the use of software, a non-linear information organization framework known as mind mapping offers an alternative method for capturing thoughts, ideas and information to linear thinking modes such as outlining. Mind mapping is brainstorming, organizing, and problem solving. This paper examines mind mapping techniques,…

  17. Attenuated coupled cluster: a heuristic polynomial similarity transformation incorporating spin symmetry projection into traditional coupled cluster theory

    NASA Astrophysics Data System (ADS)

    Gomez, John A.; Henderson, Thomas M.; Scuseria, Gustavo E.

    2017-11-01

    In electronic structure theory, restricted single-reference coupled cluster (CC) captures weak correlation but fails catastrophically under strong correlation. Spin-projected unrestricted Hartree-Fock (SUHF), on the other hand, misses weak correlation but captures a large portion of strong correlation. The theoretical description of many important processes, e.g. molecular dissociation, requires a method capable of accurately capturing both weak and strong correlation simultaneously, and would likely benefit from a combined CC-SUHF approach. Based on what we have recently learned about SUHF written as particle-hole excitations out of a symmetry-adapted reference determinant, we here propose a heuristic CC doubles model to attenuate the dominant spin collective channel of the quadratic terms in the CC equations. Proof of principle results presented here are encouraging and point to several paths forward for improving the method further.

  18. Supervised segmentation of phenotype descriptions for the human skeletal phenome using hybrid methods.

    PubMed

    Groza, Tudor; Hunter, Jane; Zankl, Andreas

    2012-10-15

    Over the course of the last few years a significant amount of research has been performed on ontology-based formalization of phenotype descriptions. In order to fully capture the intrinsic value and knowledge expressed within them, we need to take advantage of their inner structure, which implicitly combines qualities and anatomical entities. The first step in this process is the segmentation of the phenotype descriptions into their atomic elements. We present a two-phase hybrid segmentation method that combines a series of individual classifiers using different aggregation schemes (set operations and simple majority voting). The approach is tested on a corpus of skeletal phenotype descriptions drawn from the Human Phenotype Ontology. Experimental results show that the best hybrid method achieves an F-score of 97.05% in the first phase and F-scores of 97.16% / 94.50% in the second phase. The performance of the initial segmentation of anatomical entities and qualities (phase I) is not affected by the presence or absence of external resources, such as domain dictionaries. From a generic perspective, hybrid methods may not always improve segmentation accuracy, as they are heavily dependent on the goal and data characteristics.
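
    The simple-majority-voting aggregation scheme mentioned above can be illustrated as follows. The label names (ANAT for an anatomical entity, QUAL for a quality, O for other) are hypothetical placeholders, and this sketch ignores the set-operation schemes the paper also evaluates:

```python
from collections import Counter

def majority_vote(predictions):
    """Combine per-token label sequences from several classifiers
    by taking the most common label at each token position."""
    n_tokens = len(predictions[0])
    combined = []
    for i in range(n_tokens):
        votes = Counter(seq[i] for seq in predictions)
        combined.append(votes.most_common(1)[0][0])
    return combined
```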

  19. A description of the first live Poouli captured

    USGS Publications Warehouse

    Baker, P.E.

    1998-01-01

    The Poouli (Melamprosops phaeosoma) is an endangered Hawaiian honeycreeper found only on Maui, Hawaii. It was rare at the time of its discovery in 1973, but by 1997 was on the brink of extinction with fewer than six individuals left. Two specimens were collected for the description of the species, but both proved to be immature by comparison with a pair of adults at a nest. Until 1997 no Poouli had ever been captured alive, and consequently descriptions of adult Poouli were produced from field observations. In 1997, I captured an adult male Poouli which is described here for the first time. Detailed comparisons of the plumage of this adult with that of an immature specimen and previous descriptions of the species are discussed in this paper, as are differences in plumage between adult and immature males and females that may aid the sexing and ageing of birds in the field.

  20. i-TED: A novel concept for high-sensitivity (n,γ) cross-section measurements

    NASA Astrophysics Data System (ADS)

    Domingo-Pardo, C.

    2016-07-01

    A new method for measuring (n , γ) cross-sections aiming at enhanced signal-to-background ratio is presented. This new approach is based on the combination of the pulse-height weighting technique with a total energy detection system that features γ-ray imaging capability (i-TED). The latter allows one to exploit Compton imaging techniques to discriminate between true capture γ-rays arising from the sample under study and background γ-rays coming from contaminant neutron (prompt or delayed) captures in the surrounding environment. A general proof-of-concept detection system for this application is presented in this paper together with a description of the imaging method and a conceptual demonstration based on Monte Carlo simulations.

  1. Improving Patient Experience and Primary Care Quality for Patients With Complex Chronic Disease Using the Electronic Patient-Reported Outcomes Tool: Adopting Qualitative Methods Into a User-Centered Design Approach.

    PubMed

    Steele Gray, Carolyn; Khan, Anum Irfan; Kuluski, Kerry; McKillop, Ian; Sharpe, Sarah; Bierman, Arlene S; Lyons, Renee F; Cott, Cheryl

    2016-02-18

    Many mHealth technologies do not meet the needs of patients with complex chronic disease and disabilities (CCDDs) who are among the highest users of health systems worldwide. Furthermore, many of the development methodologies used in the creation of mHealth and eHealth technologies lack the ability to embrace users with CCDD in the specification process. This paper describes how we adopted and modified development techniques to create the electronic Patient-Reported Outcomes (ePRO) tool, a patient-centered mHealth solution to help improve primary health care for patients experiencing CCDD. This paper describes the design and development approach, specifically the process of incorporating qualitative research methods into user-centered design approaches to create the ePRO tool. Key lessons learned are offered as a guide for other eHealth and mHealth research and technology developers working with complex patient populations and their primary health care providers. Guided by user-centered design principles, interpretive descriptive qualitative research methods were adopted to capture user experiences through interviews and working groups. Consistent with interpretive descriptive methods, an iterative analysis technique was used to generate findings, which were then organized in relation to the tool design and function to help systematically inform modifications to the tool. User feedback captured and analyzed through this method was used to challenge the design and inform the iterative development of the tool. Interviews with primary health care providers (n=7) and content experts (n=6), and four focus groups with patients and carers (n=14) along with a PICK analysis-Possible, Implementable, (to be) Challenged, (to be) Killed-guided development of the first prototype. The initial prototype was presented in three design working groups with patients/carers (n=5), providers (n=6), and experts (n=5). 
Working group findings were broken down into categories of what works and what does not work to inform modifications to the prototype. This latter phase led to a major shift in the purpose and design of the prototype, validating the importance of using iterative codesign processes. Interpretive descriptive methods allow for an understanding of user experiences of patients with CCDD, their carers, and primary care providers. Qualitative methods help to capture and interpret user needs, and identify contextual barriers and enablers to tool adoption, informing a redesign to better suit the needs of this diverse user group. This study illustrates the value of adopting interpretive descriptive methods into user-centered mHealth tool design and can also serve to inform the design of other eHealth technologies. Our approach is particularly useful in requirements determination when developing for a complex user group and their health care providers.

  2. Deep hierarchical attention network for video description

    NASA Astrophysics Data System (ADS)

    Li, Shuohao; Tang, Min; Zhang, Jun

    2018-03-01

    Pairing video with natural language description remains a challenge in computer vision and machine translation. Inspired by image description, which uses an encoder-decoder model to reduce a visual scene to a single sentence, we propose a deep hierarchical attention network for video description. The proposed model uses a convolutional neural network (CNN) and a bidirectional LSTM network as encoders, while a hierarchical attention network is used as the decoder. Compared to encoder-decoder models used in video description, the bidirectional LSTM network can capture the temporal structure among video frames. Moreover, the hierarchical attention network has an advantage over a single-layer attention network in global context modeling. To make a fair comparison with other methods, we evaluate the proposed architecture with different types of CNN structures and decoders. Experimental results on standard datasets show that our model outperforms state-of-the-art techniques.

  3. IDEF5 Ontology Description Capture Method: Concepts and Formal Foundations

    DTIC Science & Technology

    1992-11-01

    cutter comes to exist. The puzzle here goes back to Greek times in the guise of the Ship of Theseus: if we bit by bit replace the planks of a ship...[Barwise 83] Barwise, J. and Perry, J., Situations and Attitudes, The MIT Press, Cambridge, 1983. [Burch 91] Burch, R., A Peircean Reduction Thesis: The

  4. Information Integration for Concurrent Engineering (IICE) IDEF3 Process Description Capture Method Report

    DTIC Science & Technology

    1992-05-01

    methodology, knowledge acquisition, requirements definition, information systems, information engineering, systems engineering...and knowledge resources. Like manpower, materials, and machines, information and knowledge assets are recognized as vital resources that can be...evolve towards an information-integrated enterprise. These technologies are designed to leverage information and knowledge resources as the key

  5. A complex systems analysis of stick-slip dynamics of a laboratory fault

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Walker, David M.; Tordesillas, Antoinette, E-mail: atordesi@unimelb.edu.au; Small, Michael

    2014-03-15

    We study the stick-slip behavior of a granular bed of photoelastic disks sheared by a rough slider pulled along the surface. Time series of a proxy for granular friction are examined using complex systems methods to characterize the observed stick-slip dynamics of this laboratory fault. Nonlinear surrogate time series methods show that the stick-slip behavior appears more complex than a periodic dynamics description. Phase space embedding methods show that the dynamics can be locally captured within a four- to six-dimensional subspace. These slider time series also provide an experimental test for recent complex network methods. Phase space networks, constructed by connecting nearby phase space points, proved useful in capturing the key features of the dynamics. In particular, network communities could be associated with slip events, and the ranking of small network subgraphs exhibited a heretofore unreported ordering.
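
    The phase-space network construction described above (delay-embed the scalar time series, then connect nearby phase-space points) can be sketched as follows. The embedding dimension, delay, and distance threshold are illustrative choices, not the values used in the study:

```python
import numpy as np

def phase_space_network(x, dim=4, tau=1, threshold=0.1):
    """Build the adjacency matrix of a phase-space network from a
    scalar time series via time-delay embedding."""
    # Time-delay embedding: rows are points in `dim`-dimensional phase space
    n = len(x) - (dim - 1) * tau
    emb = np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])
    # Connect pairs of phase-space points closer than `threshold` (Euclidean)
    dist = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    return (dist < threshold) & ~np.eye(n, dtype=bool)
```

    Network communities in the resulting graph would then be candidates for the slip events the abstract associates with community structure.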

  6. A Kinematic Description of the Temporal Characteristics of Jaw Motion for Early Chewing: Preliminary Findings

    ERIC Educational Resources Information Center

    Wilson, Erin M.; Green, Jordan R.; Weismer, Gary

    2012-01-01

    Purpose: The purpose of this investigation was to describe age- and consistency-related changes in the temporal characteristics of chewing in typically developing children between the ages of 4 and 35 months and adults using high-resolution optically based motion capture technology. Method: Data were collected from 60 participants (48 children, 12…

  7. Towards an Interoperability Ontology for Software Development Tools

    DTIC Science & Technology

    2003-03-01

    The description of feature models was tied to the introduction of the Feature-Oriented Domain Analysis (FODA) [KANG90] approach in the late eighties...Feature-oriented domain analysis (FODA) is a domain analysis method developed at the Software...ese obstacles was to construct a "pilot" ontology that is extensible. We applied the Feature-Oriented Domain Analysis approach to capture the

  8. Improving Patient Experience and Primary Care Quality for Patients With Complex Chronic Disease Using the Electronic Patient-Reported Outcomes Tool: Adopting Qualitative Methods Into a User-Centered Design Approach

    PubMed Central

    Khan, Anum Irfan; Kuluski, Kerry; McKillop, Ian; Sharpe, Sarah; Bierman, Arlene S; Lyons, Renee F; Cott, Cheryl

    2016-01-01

    Background Many mHealth technologies do not meet the needs of patients with complex chronic disease and disabilities (CCDDs) who are among the highest users of health systems worldwide. Furthermore, many of the development methodologies used in the creation of mHealth and eHealth technologies lack the ability to embrace users with CCDD in the specification process. This paper describes how we adopted and modified development techniques to create the electronic Patient-Reported Outcomes (ePRO) tool, a patient-centered mHealth solution to help improve primary health care for patients experiencing CCDD. Objective This paper describes the design and development approach, specifically the process of incorporating qualitative research methods into user-centered design approaches to create the ePRO tool. Key lessons learned are offered as a guide for other eHealth and mHealth research and technology developers working with complex patient populations and their primary health care providers. Methods Guided by user-centered design principles, interpretive descriptive qualitative research methods were adopted to capture user experiences through interviews and working groups. Consistent with interpretive descriptive methods, an iterative analysis technique was used to generate findings, which were then organized in relation to the tool design and function to help systematically inform modifications to the tool. User feedback captured and analyzed through this method was used to challenge the design and inform the iterative development of the tool. Results Interviews with primary health care providers (n=7) and content experts (n=6), and four focus groups with patients and carers (n=14) along with a PICK analysis—Possible, Implementable, (to be) Challenged, (to be) Killed—guided development of the first prototype. The initial prototype was presented in three design working groups with patients/carers (n=5), providers (n=6), and experts (n=5). 
Working group findings were broken down into categories of what works and what does not work to inform modifications to the prototype. This latter phase led to a major shift in the purpose and design of the prototype, validating the importance of using iterative codesign processes. Conclusions Interpretive descriptive methods allow for an understanding of user experiences of patients with CCDD, their carers, and primary care providers. Qualitative methods help to capture and interpret user needs, and identify contextual barriers and enablers to tool adoption, informing a redesign to better suit the needs of this diverse user group. This study illustrates the value of adopting interpretive descriptive methods into user-centered mHealth tool design and can also serve to inform the design of other eHealth technologies. Our approach is particularly useful in requirements determination when developing for a complex user group and their health care providers. PMID:26892952

  9. New quasibound states of the compound nucleus in α-particle capture by the nucleus

    NASA Astrophysics Data System (ADS)

    Maydanyuk, Sergei P.; Zhang, Peng-Ming; Zou, Li-Ping

    2017-07-01

    We generalize Gamow's theory of nuclear decay and capture, which is based on tunneling through the barrier and internal oscillations inside the nucleus. In our formalism an additional factor is obtained, which describes the distribution of the wave function of the α particle inside the nuclear region. We discover new most-stable states (called quasibound states) of the compound nucleus (CN) formed during the capture of the α particle by the nucleus. With a simple example, we explain why these states cannot appear in traditional calculations of the α-capture cross sections based on monotonic penetrabilities of a barrier, but they do appear in a complete description of the evolution of the CN. Our result is obtained by a complete description of the CN evolution, which has the advantages of (1) a clear picture of the formation of the CN and its disintegration, (2) a detailed quantum description of the CN, (3) tests of the calculated amplitudes based on quantum mechanics (not realized in other approaches), and (4) high accuracy of calculations (not achieved in other approaches). These peculiarities are shown with the capture reaction α + 44Ca. We predict quasibound energy levels and determine fusion probabilities for this reaction. The difference between our approach and the theory of quasistationary states with complex energies applied to α capture is also discussed. We show (1) that the latter does not provide calculations for the cross section of α capture (according to modern models of α capture), in contrast with our formalism, and (2) that these two approaches describe different states of the α capture (for the same α-nucleus potential).

  10. The Human Variome Project (HVP) 2009 Forum "Towards Establishing Standards".

    PubMed

    Howard, Heather J; Horaitis, Ourania; Cotton, Richard G H; Vihinen, Mauno; Dalgleish, Raymond; Robinson, Peter; Brookes, Anthony J; Axton, Myles; Hoffmann, Robert; Tuffery-Giraud, Sylvie

    2010-03-01

    The May 2009 Human Variome Project (HVP) Forum "Towards Establishing Standards" was a round table discussion attended by delegates from groups representing international efforts aimed at standardizing several aspects of the HVP: mutation nomenclature, description and annotation, clinical ontology, means to better characterize unclassified variants (UVs), and methods to capture mutations from diagnostic laboratories for broader distribution to the medical genetics research community. Methods for researchers to receive credit for their effort at mutation detection were also discussed. (c) 2010 Wiley-Liss, Inc.

  11. A screened independent atom model for the description of ion collisions from atomic and molecular clusters

    NASA Astrophysics Data System (ADS)

    Lüdde, Hans Jürgen; Horbatsch, Marko; Kirchner, Tom

    2018-05-01

    We apply a recently introduced model for an independent-atom-like calculation of ion-impact electron transfer and ionization cross sections to proton collisions from water, neon, and carbon clusters. The model is based on a geometrical interpretation of the cluster cross section as an effective area composed of overlapping circular disks that are representative of the atomic contributions. The latter are calculated using a time-dependent density-functional-theory-based single-particle description with accurate exchange-only ground-state potentials. We find that the net capture and ionization cross sections in p-Xn collisions are proportional to n^α with 2/3 ≤ α ≤ 1. For capture from water clusters at 100 keV impact energy, α is close to one, which is substantially different from the value α = 2/3 predicted by a previous theoretical work based on the simplest-level electron nuclear dynamics method. For ionization at 100 keV and for capture at lower energies we find smaller α values than for capture at 100 keV. This can be understood by considering the magnitude of the atomic cross sections and the resulting overlaps of the circular disks that make up the cluster cross section in our model. Results for neon and carbon clusters confirm these trends. Simple parametrizations are found which fit the cross sections remarkably well and suggest that they depend on the relevant bond lengths.
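The reported power-law dependence σ ∝ n^α can be checked with an ordinary least-squares fit in log-log space. The sketch below uses made-up cluster sizes and cross sections (not data from the paper) constructed to follow the α = 2/3 geometric limit, and recovers that exponent:

```python
import math

def fit_power_law(n_values, sigmas):
    """Least-squares fit of sigma = c * n**alpha in log-log space.

    Returns (alpha, c)."""
    xs = [math.log(n) for n in n_values]
    ys = [math.log(s) for s in sigmas]
    xbar = sum(xs) / len(xs)
    ybar = sum(ys) / len(ys)
    alpha = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
            sum((x - xbar) ** 2 for x in xs)
    c = math.exp(ybar - alpha * xbar)
    return alpha, c

# Hypothetical cluster-size/cross-section data obeying sigma ~ n**(2/3)
ns = [2, 4, 8, 16, 32]
sigmas = [n ** (2 / 3) for n in ns]
alpha, _ = fit_power_law(ns, sigmas)
print(round(alpha, 3))  # → 0.667
```

With real cross-section data the fitted α would discriminate between the overlapping-disk regime (α < 1) and simple additivity of atomic cross sections (α = 1).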

  12. A minimization principle for the description of modes associated with finite-time instabilities

    PubMed Central

    Babaee, H.

    2016-01-01

    We introduce a minimization formulation for the determination of a finite-dimensional, time-dependent, orthonormal basis that captures directions of the phase space associated with transient instabilities. While these instabilities have finite lifetime, they can play a crucial role either by altering the system dynamics through the activation of other instabilities or by creating sudden nonlinear energy transfers that lead to extreme responses. However, their essentially transient character makes their description a particularly challenging task. We develop a minimization framework that focuses on the optimal approximation of the system dynamics in the neighbourhood of the system state. This minimization formulation results in differential equations that evolve a time-dependent basis so that it optimally approximates the most unstable directions. We demonstrate the capability of the method for two families of problems: (i) linear systems, including the advection–diffusion operator in a strongly non-normal regime as well as the Orr–Sommerfeld/Squire operator, and (ii) nonlinear problems, including a low-dimensional system with transient instabilities and the vertical jet in cross-flow. We demonstrate that the time-dependent subspace captures the strongly transient non-normal energy growth (in the short-time regime), while for longer times the modes capture the expected asymptotic behaviour. PMID:27118900
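A minimal sketch of how such a time-dependent orthonormal basis can be evolved for a linear system z' = Az. The evolution equation dU/dt = AU − U(UᵀAU) and the toy non-normal matrix below are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def evolve_otd_basis(A, U0, dt, steps):
    """Evolve a time-dependent orthonormal basis U toward the most
    unstable directions of z' = A z.

    The right-hand side A U - U (U^T A U) is A U with its component
    inside span(U) rotated out; QR re-orthonormalization guards
    against numerical drift of the forward-Euler steps."""
    U = np.linalg.qr(U0)[0]
    for _ in range(steps):
        AU = A @ U
        U = U + dt * (AU - U @ (U.T @ AU))
        U, _ = np.linalg.qr(U)  # restore orthonormality
    return U

# Toy non-normal system with one growing direction
A = np.array([[1.0, 5.0],
              [0.0, -2.0]])
U0 = np.array([[1.0], [1.0]])
U = evolve_otd_basis(A, U0, dt=1e-3, steps=5000)

# For long times the single basis vector should align with the
# dominant eigenvector of A, matching the expected asymptotic behaviour
w, V = np.linalg.eig(A)
v = V[:, np.argmax(w.real)]
print(abs(float(U[:, 0] @ v)))  # close to 1
```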

  13. The use of cognitive task analysis to improve instructional descriptions of procedures.

    PubMed

    Clark, Richard E; Pugh, Carla M; Yates, Kenneth A; Inaba, Kenji; Green, Donald J; Sullivan, Maura E

    2012-03-01

    Surgical training relies heavily on the ability of expert surgeons to provide complete and accurate descriptions of a complex procedure. However, research from a variety of domains suggests that experts often omit critical information about the judgments, analysis, and decisions they make when solving a difficult problem or performing a complex task. In this study, we compared three methods for capturing surgeons' descriptions of how to perform the procedure for inserting a femoral artery shunt (unaided free-recall, unaided free-recall with simulation, and cognitive task analysis methods) to determine which method produced more accurate and complete results. Cognitive task analysis was approximately 70% more complete and accurate than free-recall alone or free-recall during a simulation of the procedure. Ten expert trauma surgeons at a major urban trauma center were interviewed separately and asked to describe how to perform an emergency shunt procedure. Four surgeons provided an unaided free-recall description of the shunt procedure, five surgeons provided an unaided free-recall description of the procedure using visual aids and surgical instruments (simulation), and one (chosen randomly) was interviewed using cognitive task analysis (CTA) methods. An 11th vascular surgeon approved the final CTA protocol. The CTA interview with only one expert surgeon resulted in significantly greater accuracy and completeness of the descriptions compared with the unaided free-recall interviews with multiple expert surgeons. Surgeons in the unaided group omitted nearly 70% of necessary decision steps. In the free-recall group, heavy use of simulation improved surgeons' completeness when describing the steps of the procedure. CTA significantly increases the completeness and accuracy of surgeons' instructional descriptions of surgical procedures. In addition, simulation during unaided free-recall interviews may improve the completeness of interview data. 
Copyright © 2012 Elsevier Inc. All rights reserved.

  14. Feasibility of Using Low-Cost Motion Capture for Automated Screening of Shoulder Motion Limitation after Breast Cancer Surgery.

    PubMed

    Gritsenko, Valeriya; Dailey, Eric; Kyle, Nicholas; Taylor, Matt; Whittacre, Sean; Swisher, Anne K

    2015-01-01

    To determine if a low-cost, automated motion analysis system using Microsoft Kinect could accurately measure shoulder motion and detect motion impairments in women following breast cancer surgery, we conducted a descriptive study of motion measured via two methods in an academic cancer center oncology clinic. Twenty women (mean age = 60 yrs) were assessed for active and passive shoulder motions during a routine post-operative clinic visit (mean = 18 days after surgery) following mastectomy (n = 4) or lumpectomy (n = 16) for breast cancer. Participants performed 3 repetitions of active and passive shoulder motions on the side of the breast surgery. Arm motion was recorded using motion capture by the Kinect for Windows sensor and on video. Goniometric values were determined from video recordings, while motion capture data were transformed to joint angles using 2 methods (body angle and projection angle). The main outcome measures were correlation of motion capture with goniometry and detection of motion limitation. Active shoulder motion measured with low-cost motion capture agreed well with goniometry (r = 0.70-0.80), while passive shoulder motion measurements did not correlate well. Using motion capture, it was possible to reliably identify participants whose range of shoulder motion was reduced by 40% or more. Low-cost, automated motion analysis may be acceptable to screen for moderate to severe motion impairments in active shoulder motion. Automatic detection of motion limitation may allow quick screening to be performed in an oncologist's office and trigger timely referrals for rehabilitation.
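In its simplest form, the body-angle computation described above reduces to the angle between two vectors built from 3-D joint positions such as those a Kinect skeleton provides. The joint coordinates and the choice of shoulder-elbow versus shoulder-hip vectors below are hypothetical:

```python
import math

def elevation_angle(shoulder, elbow, hip):
    """Angle (degrees) between the upper arm (shoulder->elbow) and the
    trunk line (shoulder->hip), from 3-D joint positions.

    A simplified 'body angle' computation; real pipelines must also
    handle sensor noise and coordinate-frame alignment."""
    arm = [e - s for e, s in zip(elbow, shoulder)]
    trunk = [h - s for h, s in zip(hip, shoulder)]
    dot = sum(a * t for a, t in zip(arm, trunk))
    na = math.sqrt(sum(a * a for a in arm))
    nt = math.sqrt(sum(t * t for t in trunk))
    return math.degrees(math.acos(dot / (na * nt)))

# Arm held straight out to the side: 90 degrees of abduction
print(round(elevation_angle((0, 1.4, 0), (0.3, 1.4, 0), (0, 0.9, 0))))  # → 90
```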

  15. A new method for digital video documentation in surgical procedures and minimally invasive surgery.

    PubMed

    Wurnig, P N; Hollaus, P H; Wurnig, C H; Wolf, R K; Ohtsuka, T; Pridun, N S

    2003-02-01

    Documentation of surgical procedures is limited by the accuracy of description, which depends on the vocabulary and the descriptive prowess of the surgeon. Even analog video recording could not solve the problem of documentation satisfactorily, due to the abundance of recorded material. Capturing the video digitally solves most of these problems under the circumstances described in this article. We developed an inexpensive and practical digital video capturing system that consists of conventional computer components. Video images and clips can be captured intraoperatively and are immediately available. The system is a commercial personal computer specially configured for digital video capturing and is connected by wire to the video tower. Filming was done with a conventional endoscopic video camera. A total of 65 open and endoscopic procedures were documented in an orthopedic and a thoracic surgery unit. The median number of clips per surgical procedure was 6 (range, 1-17), and the median storage volume was 49 MB (range, 3-360 MB) in compressed form. The median duration of a video clip was 4 min 25 s (range, 45 s to 21 min). Median time for editing a video clip was 12 min for an advanced user (including cutting, titling the movie, and compression). The quality of the clips renders them suitable for presentations. This digital video documentation system allows easy capturing of intraoperative video sequences in high quality. All standard documentation tasks can be performed. With the use of an endoscopic video camera, no compromises with respect to sterility and surgical elbowroom are necessary. The cost is much lower than that of commercially available systems, and setting changes can be performed easily without trained specialists.

  16. Methods and apparatus for capture and storage of semantic information with sub-files in a parallel computing system

    DOEpatents

    Faibish, Sorin; Bent, John M; Tzelnic, Percy; Grider, Gary; Torres, Aaron

    2015-02-03

    Techniques are provided for storing files in a parallel computing system using sub-files with semantically meaningful boundaries. A method is provided for storing at least one file generated by a distributed application in a parallel computing system. The file comprises one or more of a complete file and a plurality of sub-files. The method comprises the steps of obtaining a user specification of semantic information related to the file; providing the semantic information as a data structure description to a data formatting library write function; and storing the semantic information related to the file with one or more of the sub-files in one or more storage nodes of the parallel computing system. The semantic information provides a description of data in the file. The sub-files can be replicated based on semantically meaningful boundaries.
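The idea of keeping a semantic description alongside each sub-file can be sketched with a plain JSON sidecar per sub-file. The layout, field names, and helper below are invented for illustration and are not the patented format or API:

```python
import json
import os
import tempfile

def store_subfiles(basename, subfiles):
    """Write each sub-file next to a JSON sidecar holding its semantic
    description, so a later reader can interpret the raw bytes without
    the original distributed application.

    `subfiles` is a list of (bytes, semantics-dict) pairs split at
    semantically meaningful boundaries."""
    paths = []
    for i, (data, semantics) in enumerate(subfiles):
        path = f"{basename}.{i}"
        with open(path, "wb") as f:
            f.write(data)
        with open(path + ".meta", "w") as f:
            json.dump(semantics, f)  # description of the data in this sub-file
        paths.append(path)
    return paths

tmp = tempfile.mkdtemp()
paths = store_subfiles(os.path.join(tmp, "out"), [
    (b"\x00" * 8, {"type": "float64", "role": "timestep-0 pressure"}),
    (b"\x01" * 8, {"type": "float64", "role": "timestep-1 pressure"}),
])
print(len(paths))  # → 2
```

In a parallel file system the sidecar would live on the storage nodes next to each replica, so replication along semantic boundaries carries the description with the data.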

  17. Transport coefficient computation based on input/output reduced order models

    NASA Astrophysics Data System (ADS)

    Hurst, Joshua L.

    The guiding purpose of this thesis is to address the optimal material design problem when the material description is a molecular dynamics model. The end goal is to obtain a simplified and fast model that captures the property of interest so that it can be used in controller design and optimization. The approach is to examine model reduction analysis and methods to capture a specific property of interest, in this case viscosity, or more generally complex modulus or complex viscosity. This property and other transport coefficients are defined by an input/output relationship, and this motivates model reduction techniques that are tailored to preserve input/output behavior. In particular, Singular Value Decomposition (SVD) based methods are investigated. First, simulation methods are identified that are amenable to systems-theory analysis. For viscosity, these models are of the Gosling and Lees-Edwards type. They are high-order nonlinear Ordinary Differential Equations (ODEs) that employ Periodic Boundary Conditions (PBC). Properties can be calculated from the state trajectories of these ODEs. In this research, local linear approximations are rigorously derived, and special attention is given to potentials that are evaluated with PBC. For the Gosling description, LTI models are developed from state trajectories but are found to have limited success in capturing the system property, even though it is shown that full-order LTI models can be well approximated by reduced-order LTI models. For the Lees-Edwards SLLOD-type model, the nonlinear ODEs are approximated by a Linear Time Varying (LTV) model about a nominal trajectory, and both balanced truncation and Proper Orthogonal Decomposition (POD) are used to assess the plausibility of reduced-order models for this system description. An immediate application of the derived LTV models is Quasilinearization, or Waveform Relaxation. 
Quasilinearization is Newton's method applied to the ODE operator equation. It is a recursive method that solves nonlinear ODEs by solving an LTV system at each iteration to obtain a new, closer solution. LTV models are derived for both Gosling and Lees-Edwards type models. Particular attention is given to SLLOD Lees-Edwards models because they are in the form most amenable to Taylor series expansion and are the models most commonly used to examine viscosity. With linear models developed, a method is presented to calculate viscosity based on LTI Gosling models, but it is shown to have some limitations. To address these issues, LTV SLLOD models are analyzed with both balanced truncation and POD, and both show that significant order reduction is possible. Examining the singular values of both techniques shows that balanced truncation has the potential to offer greater reduction, which should be expected, as it is based on the input/output mapping instead of just the state information as in POD. Obtaining reduced-order systems that capture the property of interest is challenging. For balanced truncation, reduced-order models for 1-D LJ and FENE systems are obtained and are shown to capture the output of interest fairly well. However, numerical challenges currently limit this analysis to small-order systems. Suggestions are presented to extend this method to larger systems. In addition, reduced second-order systems are obtained from POD. Here the challenge is extending the solution beyond the original period used for the projection, in particular identifying the manifold along which the solution travels. The remaining challenges are presented and discussed.
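The POD step used above can be sketched in a few lines: the reduced basis is the set of leading left singular vectors of a snapshot matrix, and the retained singular values measure how much of the trajectory the basis captures. The toy trajectory below is invented and stands in for MD state snapshots:

```python
import numpy as np

def pod_basis(snapshots, r):
    """Proper Orthogonal Decomposition of a (state_dim x n_snapshots)
    matrix: return the r leading left singular vectors and the fraction
    of snapshot energy they capture."""
    U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
    energy = (s[:r] ** 2).sum() / (s ** 2).sum()
    return U[:, :r], energy

# Hypothetical state trajectory dominated by two coherent modes
t = np.linspace(0, 2 * np.pi, 200)
x = np.linspace(0, 1, 50)
snaps = (np.outer(np.sin(np.pi * x), np.cos(t))
         + 0.1 * np.outer(np.sin(2 * np.pi * x), np.sin(t)))
basis, energy = pod_basis(snaps, 2)
print(basis.shape, round(energy, 6))  # (50, 2) 1.0
```

Balanced truncation differs in that it weights directions by their input/output (controllability/observability) importance rather than by snapshot energy alone, which is why it can achieve greater reduction for transport-coefficient outputs.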

  18. Music viewed by its entropy content: A novel window for comparative analysis

    PubMed Central

    Febres, Gerardo; Jaffe, Klaus

    2017-01-01

    Polyphonic music files were analyzed using the set of symbols that produced the Minimal Entropy Description, which we call the Fundamental Scale. This allowed us to create a novel space to represent music pieces by developing: (a) a method to adjust a textual description from its original scale of observation to an arbitrarily selected scale, (b) a method to model the structure of any textual description based on the shape of the symbol frequency profiles, and (c) the concept of higher order entropy as the entropy associated with the deviations of a frequency-ranked symbol profile from a perfect Zipfian profile. We call this diversity index the ‘2nd Order Entropy’. Applying these methods to a variety of musical pieces showed how the space of ‘symbolic specific diversity-entropy’ and that of ‘2nd order entropy’ captures characteristics that are unique to each music type, style, composer and genre. Some clustering of these properties around each musical category is shown. These methods allow us to visualize a historic trajectory of academic music across this space, from medieval to contemporary academic music. We show that the description of musical structures using entropy, symbol frequency profiles and specific symbolic diversity allows us to characterize traditional and popular expressions of music. These classification techniques promise to be useful in other disciplines for pattern recognition and machine learning. PMID:29040288
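A simplified illustration of the entropy measures described above, using fixed-length n-grams as symbols instead of the paper's adaptively chosen Fundamental Scale, and a crude Zipf-deviation index loosely analogous to the higher-order entropy:

```python
import math
from collections import Counter

def symbol_entropy(text, order=2):
    """Shannon entropy (bits per symbol) of fixed-length symbols.

    The paper builds its symbol set adaptively (the 'Fundamental
    Scale'); fixed-length n-grams are a simplified stand-in."""
    grams = [text[i:i + order] for i in range(len(text) - order + 1)]
    counts = Counter(grams)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def zipf_deviation(text, order=2):
    """Mean absolute deviation of the frequency-ranked symbol profile
    from an ideal Zipfian profile (f_k proportional to 1/k), a rough
    analogue of the paper's '2nd Order Entropy'."""
    counts = sorted(Counter(
        text[i:i + order] for i in range(len(text) - order + 1)
    ).values(), reverse=True)
    total = sum(counts)
    z = sum(1 / k for k in range(1, len(counts) + 1))  # normalization
    return sum(abs(c / total - 1 / (k * z))
               for k, c in enumerate(counts, 1)) / len(counts)

# A repetitive 'piece' carries less entropy than a varied one
print(symbol_entropy("abababab") > symbol_entropy("aaaaaaaa"))  # → True
```

Placing each piece at its (entropy, Zipf-deviation) coordinates gives a two-dimensional space in which styles and genres can cluster, mirroring the comparative analysis the paper performs.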

  19. Music viewed by its entropy content: A novel window for comparative analysis.

    PubMed

    Febres, Gerardo; Jaffe, Klaus

    2017-01-01

    Polyphonic music files were analyzed using the set of symbols that produced the Minimal Entropy Description, which we call the Fundamental Scale. This allowed us to create a novel space to represent music pieces by developing: (a) a method to adjust a textual description from its original scale of observation to an arbitrarily selected scale, (b) a method to model the structure of any textual description based on the shape of the symbol frequency profiles, and (c) the concept of higher order entropy as the entropy associated with the deviations of a frequency-ranked symbol profile from a perfect Zipfian profile. We call this diversity index the '2nd Order Entropy'. Applying these methods to a variety of musical pieces showed how the space of 'symbolic specific diversity-entropy' and that of '2nd order entropy' captures characteristics that are unique to each music type, style, composer and genre. Some clustering of these properties around each musical category is shown. These methods allow us to visualize a historic trajectory of academic music across this space, from medieval to contemporary academic music. We show that the description of musical structures using entropy, symbol frequency profiles and specific symbolic diversity allows us to characterize traditional and popular expressions of music. These classification techniques promise to be useful in other disciplines for pattern recognition and machine learning.

  20. Assessing the Reliability and Use of the Expository Scoring Scheme as a Measure of Developmental Change in Monolingual English and Bilingual French/English Children

    ERIC Educational Resources Information Center

    Bird, Elizabeth Kay-Raining; Joshi, Nila; Cleave, Patricia L.

    2016-01-01

    Purpose: The Expository Scoring Scheme (ESS) is designed to analyze the macrostructure of descriptions of a favorite game or sport. This pilot study examined inter- and intrarater reliability of the ESS and use of the scale to capture developmental change in elementary school children. Method: Twenty-four children in 2 language groups (monolingual…

  1. Theoretical investigation of the electron capture and loss processes in the collisions of He2+ + Ne.

    PubMed

    Hong, Xuhai; Wang, Feng; Jiao, Yalong; Su, Wenyong; Wang, Jianguo; Gou, Bingcong

    2013-08-28

    Based on time-dependent density functional theory, a method is developed to study ion-atom collision dynamics, which self-consistently couples the quantum mechanical description of the electron dynamics with the classical treatment of the ion motion. Employing a real-time and real-space method, a coordinate-space translation technique is introduced that allows one to focus on the region of the target or the projectile, depending on the process of interest. Benchmark calculations are performed for collisions of He(2+) + Ne, and the time evolution of the electron density distribution is monitored, which provides interesting details of the interaction dynamics between the electrons and the ion cores. The cross sections of single- and many-electron capture and loss have been calculated in the energy range of 1-1000 keV/amu, and the results show good agreement with the available experiments over a wide range of impact energies.

  2. Estimation of M1 scissors mode strength for deformed nuclei in the medium- to heavy-mass region by statistical Hauser-Feshbach model calculations

    NASA Astrophysics Data System (ADS)

    Mumpower, M. R.; Kawano, T.; Ullmann, J. L.; Krtička, M.; Sprouse, T. M.

    2017-08-01

    Radiative neutron capture is an important nuclear reaction whose accurate description is needed for many applications ranging from nuclear technology to nuclear astrophysics. The description of such a process relies on the Hauser-Feshbach theory, which requires the nuclear optical potential, level density, and γ-strength function as model inputs. It has recently been suggested that the M1 scissors mode may explain discrepancies between theoretical calculations and evaluated data. We explore statistical model calculations with the strength of the M1 scissors mode estimated to be dependent on the nuclear deformation of the compound system. We show that this form of the M1 scissors mode improves the theoretical description of evaluated data and the match to experiment in both the fission product and actinide regions. Since the scissors mode occurs in the range of a few keV to a few MeV, it may also impact the neutron capture cross sections of neutron-rich nuclei that participate in the rapid neutron capture process of nucleosynthesis. We comment on the possible impact to nucleosynthesis by evaluating neutron capture rates for neutron-rich nuclei with the M1 scissors mode active.

  3. Contact Kinetics in Fractal Macromolecules.

    PubMed

    Dolgushev, Maxim; Guérin, Thomas; Blumen, Alexander; Bénichou, Olivier; Voituriez, Raphaël

    2015-11-13

    We consider the kinetics of first contact between two monomers of the same macromolecule. Relying on a fractal description of the macromolecule, we develop an analytical method to compute the mean first contact time for various molecular sizes. In our theoretical description, the non-Markovian feature of monomer motion, arising from the interactions with the other monomers, is captured by accounting for the nonequilibrium conformations of the macromolecule at the very instant of first contact. This analysis reveals a simple scaling relation for the mean first contact time between two monomers, which involves only their equilibrium distance and the spectral dimension of the macromolecule, independently of its microscopic details. Our theoretical predictions are in excellent agreement with numerical stochastic simulations.

  4. ResearchEHR: use of semantic web technologies and archetypes for the description of EHRs.

    PubMed

    Robles, Montserrat; Fernández-Breis, Jesualdo Tomás; Maldonado, Jose A; Moner, David; Martínez-Costa, Catalina; Bosca, Diego; Menárguez-Tortosa, Marcos

    2010-01-01

    In this paper, we present the ResearchEHR project. It focuses on the usability of Electronic Health Record (EHR) sources and EHR standards for building advanced clinical systems. The aim is to support healthcare professionals, institutions, and authorities by providing a set of generic methods and tools for the capture, standardization, integration, description, and dissemination of health-related information. ResearchEHR combines several tools to manage EHRs at two different levels: an internal level, which deals with the normalization and semantic upgrading of existing EHRs by using archetypes, and an external level, which uses Semantic Web technologies to specify clinical archetypes for advanced EHR architectures and systems.

  5. Resonating group method as applied to the spectroscopy of α-transfer reactions

    NASA Astrophysics Data System (ADS)

    Subbotin, V. B.; Semjonov, V. M.; Gridnev, K. A.; Hefter, E. F.

    1983-10-01

    In the conventional approach to α-transfer reactions the finite- and/or zero-range distorted-wave Born approximation is used in liaison with a macroscopic description of the captured α particle in the residual nucleus. Here the specific example of 16O(6Li,d)20Ne reactions at different projectile energies is taken to present a microscopic resonating group method analysis of the α particle in the final nucleus (for the reaction part the simple zero-range distorted-wave Born approximation is employed). In the discussion of suitable nucleon-nucleon interactions, force number one of the effective interactions presented by Volkov is shown to be most appropriate for the system considered. Application of the continuous analog of Newton's method to the evaluation of the resonating group method equations yields increased accuracy with respect to traditional methods. The resonating group method description induces only minor changes in the structures of the angular distributions, but it does serve its purpose in yielding reliable and consistent spectroscopic information. NUCLEAR STRUCTURE 16O(6Li,d)20Ne; E = 20 to 32 MeV; calculated B(E2), reduced widths, dσ/dΩ; extracted α-spectroscopic factors. ZRDWBA with microscopic RGM description of the residual α particle in 20Ne; application of the continuous analog of Newton's method; tested and applied Volkov force No. 1; direct mechanism.

  6. Adiabatic description of capture into resonance and surfatron acceleration of charged particles by electromagnetic waves.

    PubMed

    Artemyev, A V; Neishtadt, A I; Zelenyi, L M; Vainchtein, D L

    2010-12-01

    We present an analytical and numerical study of the surfatron acceleration of nonrelativistic charged particles by electromagnetic waves. The acceleration is caused by capture of particles into resonance with one of the waves. We investigate capture for systems with one or two waves and provide conditions under which the obtained results can be applied to systems with more than two waves. In the case of a single wave, the once captured particles never leave the resonance and their velocity grows linearly with time. However, if there are two waves in the system, the upper bound of the energy gain may exist and we find the analytical value of that bound. We discuss several generalizations including the relativistic limit, different wave amplitudes, and a wide range of the waves' wavenumbers. The obtained results are used for qualitative description of some phenomena observed in the Earth's magnetosphere. © 2010 American Institute of Physics.

  7. Introducing Explorer of Taxon Concepts with a case study on spider measurement matrix building.

    PubMed

    Cui, Hong; Xu, Dongfang; Chong, Steven S; Ramirez, Martin; Rodenhausen, Thomas; Macklin, James A; Ludäscher, Bertram; Morris, Robert A; Soto, Eduardo M; Koch, Nicolás Mongiardino

    2016-11-17

    Taxonomic descriptions are traditionally composed in natural language and published in a format that cannot be directly used by computers. The Exploring Taxon Concepts (ETC) project has been developing a set of web-based software tools that convert morphological descriptions published in telegraphic style to character data that can be reused and repurposed. This paper introduces the first semi-automated pipeline, to our knowledge, that converts morphological descriptions into taxon-character matrices to support systematics and evolutionary biology research. We then demonstrate and evaluate the use of the ETC Input Creation - Text Capture - Matrix Generation pipeline to generate body part measurement matrices from a set of 188 spider morphological descriptions and report the findings. From the given set of spider taxonomic publications, two versions of input (original and normalized) were generated and used by the ETC Text Capture and ETC Matrix Generation tools. The tools produced two corresponding spider body part measurement matrices, and the matrix from the normalized input was found to be much more similar to a gold standard matrix hand-curated by the scientist co-authors. Special conventions used in the original descriptions (e.g., the omission of measurement units) account for the lower performance on the original input. The results show that simple normalization of the description text greatly increased the quality of the machine-generated matrix and reduced edit effort. The machine-generated matrix also helped identify issues in the gold standard matrix. ETC Text Capture and ETC Matrix Generation are low-barrier and effective tools for extracting measurement values from spider taxonomic descriptions and are more effective when the descriptions are self-contained. 
Special conventions that make the description text less self-contained challenge automated extraction of data from biodiversity descriptions and hinder the automated reuse of the published knowledge. The tools will be updated to support new requirements revealed in this case study.
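The kind of measurement extraction such a pipeline performs can be caricatured with a regular expression over telegraphic text. The sentence pattern and field names below are simplifications invented for this sketch, not the ETC implementation, which uses full syntactic parsing:

```python
import re

def extract_measurements(description):
    """Pull 'part length value unit' facts out of a telegraphic
    taxonomic description into a dict suitable for one matrix row.

    Descriptions that omit units (a convention noted in the study)
    would fail to match here, illustrating why normalization helps."""
    pattern = re.compile(
        r"(?P<part>[A-Za-z ]+?)\s+length\s+"
        r"(?P<value>\d+(?:\.\d+)?)\s*(?P<unit>mm|cm)",
        re.IGNORECASE)
    return {m.group("part").strip().lower(): (float(m.group("value")), m.group("unit"))
            for m in pattern.finditer(description)}

row = extract_measurements(
    "Carapace length 2.5 mm; femur I length 1.8 mm; total length 5.1 mm.")
print(row["carapace"])  # → (2.5, 'mm')
```

Running the extractor over each species description and aligning the keys as columns yields a taxon-by-character measurement matrix.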

  8. A Description for Rock Joint Roughness Based on Terrestrial Laser Scanner and Image Analysis

    PubMed Central

    Ge, Yunfeng; Tang, Huiming; Eldin, M. A. M Ez; Chen, Pengyu; Wang, Liangqing; Wang, Jinge

    2015-01-01

    Shear behavior of a rock mass greatly depends upon the rock joint roughness, which is generally characterized by anisotropy, scale effect, and interval effect. A new index able to capture all three features, namely the brightness area percentage (BAP), is presented to express roughness based on synthetic illumination of a digital terrain model derived from a terrestrial laser scanner (TLS). Since only the tiny planes facing opposite to the shear direction contribute resistance during shear failure, these planes are recognized through image processing by taking advantage of the fact that they appear brighter than other planes under the same light source. Comparison with existing roughness indexes and two case studies are presented to test the performance of the BAP description. The results reveal that the rock joint roughness estimated by the presented description matches existing roughness methods well and displays wider applicability. PMID:26585247
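Under one geometric reading of the index, BAP is the area fraction of joint-surface facets whose normals oppose the shear direction (the facets that would appear bright under a light source opposing the shear). The facet normals, areas, and threshold below are invented for illustration; the published method works on rendered images rather than mesh normals directly:

```python
import numpy as np

def brightness_area_percentage(normals, areas, shear_dir):
    """Percentage of surface area whose facets face against the shear
    direction, a simplified geometric analogue of the BAP index."""
    d = np.asarray(shear_dir, dtype=float)
    d /= np.linalg.norm(d)
    facing = (np.asarray(normals) @ d) < 0  # normal opposes shear
    return areas[facing].sum() / areas.sum() * 100.0

# Two facets tilted toward +x, one toward -x, equal areas; shear along +x
normals = np.array([[0.5, 0.0, 1.0],
                    [0.5, 0.0, 1.0],
                    [-0.5, 0.0, 1.0]])
normals /= np.linalg.norm(normals, axis=1, keepdims=True)
areas = np.array([1.0, 1.0, 1.0])
print(round(brightness_area_percentage(normals, areas, [1.0, 0.0, 0.0]), 1))  # → 33.3
```

Evaluating the index for many shear directions captures anisotropy, and evaluating it on resampled grids captures the scale and interval effects the abstract mentions.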

  9. Magnetic exchange couplings from noncollinear perturbation theory: dinuclear CuII complexes.

    PubMed

    Phillips, Jordan J; Peralta, Juan E

    2014-08-07

    To benchmark the performance of a new method based on noncollinear coupled-perturbed density functional theory [J. Chem. Phys. 138, 174115 (2013)], we calculate the magnetic exchange couplings in a series of triply bridged ferromagnetic dinuclear Cu(II) complexes that have been recently synthesized [Phys. Chem. Chem. Phys. 15, 1966 (2013)]. We find that for any basis set the couplings from our noncollinear coupled-perturbed methodology are practically identical to those of spin-projected energy differences when a hybrid density functional approximation is employed. This demonstrates that our methodology properly recovers a Heisenberg description for these systems and is robust in its predictive power of magnetic couplings. Furthermore, this indicates that the failure of density functional theory to capture the subtle variation of the exchange couplings in these complexes is not simply an artifact of broken-symmetry methods, but rather a fundamental weakness of current approximate density functionals for the description of magnetic couplings.
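
    Spin-projected energy-difference couplings of the kind mentioned above are commonly obtained with the Yamaguchi formula from broken-symmetry and high-spin calculations. The sketch below uses hypothetical energies and ⟨S²⟩ values, not data from the paper.

```python
HARTREE_TO_CM1 = 219474.63  # hartree -> cm^-1 conversion factor

def yamaguchi_exchange(e_bs, e_hs, s2_hs, s2_bs):
    """Spin-projected exchange coupling from broken-symmetry (BS) and
    high-spin (HS) energies: J = (E_BS - E_HS) / (<S^2>_HS - <S^2>_BS).
    J > 0 means the ferromagnetic (high-spin) state lies lower."""
    return (e_bs - e_hs) / (s2_hs - s2_bs)

# Hypothetical DFT energies (hartree) and <S^2> values for a
# dinuclear Cu(II) complex with slight spin contamination.
j_cm = yamaguchi_exchange(-3000.000000, -3000.000050,
                          s2_hs=2.005, s2_bs=1.005) * HARTREE_TO_CM1
```

    A positive J of roughly 11 cm⁻¹ here would indicate a weakly ferromagnetic complex, in line with the systems the record describes.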

  10. A fully coupled method for massively parallel simulation of hydraulically driven fractures in 3-dimensions: FULLY COUPLED PARALLEL SIMULATION OF HYDRAULIC FRACTURES IN 3-D

    DOE PAGES

    Settgast, Randolph R.; Fu, Pengcheng; Walsh, Stuart D. C.; ...

    2016-09-18

    This study describes a fully coupled finite element/finite volume approach for simulating field-scale hydraulically driven fractures in three dimensions, using massively parallel computing platforms. The proposed method is capable of capturing realistic representations of local heterogeneities, layering and natural fracture networks in a reservoir. A detailed description of the numerical implementation is provided, along with numerical studies comparing the model with both analytical solutions and experimental results. The results demonstrate the effectiveness of the proposed method for modeling large-scale problems involving hydraulically driven fractures in three dimensions.

  12. Explicit evaluation of discontinuities in 2-D unsteady flows solved by the method of characteristics

    NASA Astrophysics Data System (ADS)

    Osnaghi, C.

    When shock waves appear in the numerical solution of flows, a choice must be made between shock-capturing techniques, which are possible when the equations are written in conservative form, and shock-fitting techniques. If the latter is preferred, e.g. in order to obtain a sharper definition and a more physical description of the shock evolution in time, the method of characteristics is advantageous in the vicinity of the shock, and it seems natural to use this method everywhere. This choice requires improving the efficiency of the numerical scheme in order to produce competitive codes while preserving accuracy and flexibility, the intrinsic features of the method: this is the goal of the present work.

  13. Estimation of M 1 scissors mode strength for deformed nuclei in the medium- to heavy-mass region by statistical Hauser-Feshbach model calculations

    DOE PAGES

    Mumpower, Matthew Ryan; Kawano, Toshihiko; Ullmann, John Leonard; ...

    2017-08-17

    Radiative neutron capture is an important nuclear reaction whose accurate description is needed for many applications ranging from nuclear technology to nuclear astrophysics. The description of such a process relies on the Hauser-Feshbach theory which requires the nuclear optical potential, level density, and γ-strength function as model inputs. It has recently been suggested that the M1 scissors mode may explain discrepancies between theoretical calculations and evaluated data. We explore statistical model calculations with the strength of the M1 scissors mode estimated to be dependent on the nuclear deformation of the compound system. We show that the form of the M1 scissors mode improves the theoretical description of evaluated data and the match to experiment in both the fission product and actinide regions. Since the scissors mode occurs in the range of a few keV to a few MeV, it may also impact the neutron capture cross sections of neutron-rich nuclei that participate in the rapid neutron capture process of nucleosynthesis. As a result, we comment on the possible impact to nucleosynthesis by evaluating neutron capture rates for neutron-rich nuclei with the M1 scissors mode active.

  14. Mott physics beyond the Brinkman-Rice scenario

    NASA Astrophysics Data System (ADS)

    Wysokiński, Marcin M.; Fabrizio, Michele

    2017-04-01

    The main flaw of the well-known Brinkman-Rice description, obtained through the Gutzwiller approximation, of the paramagnetic Mott transition in the Hubbard model is in neglecting high-energy virtual processes that generate, for instance, the antiferromagnetic exchange J ~ t²/U. Here, we propose a way to capture those processes by combining the Brinkman-Rice approach with a variational Schrieffer-Wolff transformation, and apply this method to study the single-band metal-to-insulator transition in a Bethe lattice with infinite coordination number, where the Gutzwiller approximation becomes exact. We indeed find for the Mott transition a description very close to the real one provided by the dynamical mean-field theory, an encouraging result in view of possible applications to more involved models.

  15. Hexahistidine (6xHis) fusion-based assays for protein-protein interactions.

    PubMed

    Puckett, Mary C

    2015-01-01

    Fusion-protein tags provide a useful method to study protein-protein interactions. One widely used fusion tag is hexahistidine (6xHis). This tag has unique advantages over others due to its small size and the relatively low abundance of naturally occurring consecutive histidine repeats. 6xHis tags can interact with immobilized metal cations, allowing the capture of proteins and protein complexes of interest. This chapter describes the benefits and uses of 6xHis-fusion proteins and provides a detailed method for performing a 6xHis pulldown assay.

  16. Measuring Graduate Students' Teaching and Research Skills through Self-Report: Descriptive Findings and Validity Evidence

    ERIC Educational Resources Information Center

    Gilmore, Joanna; Feldon, David

    2010-01-01

    This study extends research on graduate student development by examining descriptive findings and validity of a self-report survey designed to capture graduate students' assessments of their teaching and research skills. Descriptive findings provide some information about areas of growth among graduate students in the first years of their…

  17. Temporal variability of local abundance, sex ratio and activity in the Sardinian chalk hill blue butterfly

    USGS Publications Warehouse

    Casula, P.; Nichols, J.D.

    2003-01-01

    When capturing and marking of individuals is possible, the application of newly developed capture-recapture models can remove several sources of bias in the estimation of population parameters such as local abundance and sex ratio. For example, observation of distorted sex ratios in counts or captures can reflect either different abundances of the sexes or different sex-specific capture probabilities, and capture-recapture models can help distinguish between these two possibilities. Robust design models and a model selection procedure based on information-theoretic methods were applied to study the local population structure of the endemic Sardinian chalk hill blue butterfly, Polyommatus coridon gennargenti. Seasonal variations of abundance, plus daily and weather-related variations of active populations of males and females were investigated. Evidence was found of protandry and male pioneering of the breeding space. Temporary emigration probability, which describes the proportion of the population not exposed to capture (e.g. absent from the study area) during the sampling process, was estimated, differed between sexes, and was related to temperature, a factor known to influence animal activity. The correlation between temporary emigration and average daily temperature suggested interpreting temporary emigration as inactivity of animals. Robust design models were used successfully to provide a detailed description of the population structure and activity in this butterfly and are recommended for studies of local abundance and animal activity in the field.
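
    The capture-recapture logic described above — distorted sex ratios in raw captures versus sex-specific capture probabilities — can be illustrated with the simple two-sample Chapman estimator, a far simpler model than the robust design actually used in the study. All counts below are hypothetical.

```python
def chapman_estimate(n1, n2, m2):
    """Chapman's bias-corrected Lincoln-Petersen estimator for a
    two-sample capture-recapture study: n1 animals marked in sample 1,
    n2 caught in sample 2, m2 of them already marked."""
    return (n1 + 1) * (n2 + 1) / (m2 + 1) - 1

# Hypothetical counts: males twice as catchable as females, so raw
# captures look male-biased even though true abundance is equal (~200).
males = chapman_estimate(n1=80, n2=80, m2=32)   # capture prob. ~0.4
females = chapman_estimate(n1=40, n2=40, m2=8)  # capture prob. ~0.2
```

    Both estimates come out near 200 despite a 2:1 raw sex ratio in the captures, which is exactly the distinction between counts and abundance that such models make.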

  18. Using DNase Hi-C techniques to map global and local three-dimensional genome architecture at high resolution.

    PubMed

    Ma, Wenxiu; Ay, Ferhat; Lee, Choli; Gulsoy, Gunhan; Deng, Xinxian; Cook, Savannah; Hesson, Jennifer; Cavanaugh, Christopher; Ware, Carol B; Krumm, Anton; Shendure, Jay; Blau, C Anthony; Disteche, Christine M; Noble, William S; Duan, ZhiJun

    2018-06-01

    The folding and three-dimensional (3D) organization of chromatin in the nucleus critically impacts genome function. The past decade has witnessed rapid advances in genomic tools for delineating 3D genome architecture. Among them, chromosome conformation capture (3C)-based methods such as Hi-C are the most widely used techniques for mapping chromatin interactions. However, traditional Hi-C protocols rely on restriction enzymes (REs) to fragment chromatin and are therefore limited in resolution. We recently developed DNase Hi-C for mapping 3D genome organization, which uses DNase I for chromatin fragmentation. DNase Hi-C overcomes RE-related limitations associated with traditional Hi-C methods, leading to improved methodological resolution. Furthermore, combining this method with DNA capture technology provides a high-throughput approach (targeted DNase Hi-C) that allows for mapping fine-scale chromatin architecture at exceptionally high resolution. Hence, targeted DNase Hi-C will be valuable for delineating the physical landscapes of cis-regulatory networks that control gene expression and for characterizing phenotype-associated chromatin 3D signatures. Here, we provide a detailed description of method design and step-by-step working protocols for these two methods. Copyright © 2018 Elsevier Inc. All rights reserved.

  19. African Primary Care Research: Quantitative analysis and presentation of results

    PubMed Central

    Ogunbanjo, Gboyega A.

    2014-01-01

    Abstract This article is part of a series on Primary Care Research Methods. The article describes types of continuous and categorical data, how to capture data in a spreadsheet, how to use descriptive and inferential statistics and, finally, gives advice on how to present the results in text, figures and tables. The article intends to help Master's level students with writing the data analysis section of their research proposal and presenting their results in their final research report. PMID:26245435
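
    As a minimal illustration of the descriptive-versus-inferential distinction the article teaches (hypothetical data, standard textbook formulas only):

```python
import statistics as st

# Hypothetical continuous variable captured in a spreadsheet column:
# systolic blood pressure (mmHg) for two study groups.
group_a = [118, 122, 130, 125, 119, 127]
group_b = [135, 140, 128, 138, 142, 131]

# Descriptive statistics summarise each group on its own terms.
desc = {name: (st.mean(vals), st.stdev(vals))
        for name, vals in (("A", group_a), ("B", group_b))}

def welch_t(x, y):
    """Inferential step: Welch's t statistic asks whether the observed
    mean difference could plausibly be sampling noise."""
    vx, vy = st.variance(x), st.variance(y)
    return (st.mean(x) - st.mean(y)) / ((vx / len(x) + vy / len(y)) ** 0.5)

t_stat = welch_t(group_a, group_b)
```

    The dictionary of means and standard deviations belongs in a descriptive table; the t statistic (compared against a t distribution for a p-value) belongs in the inferential part of a results section.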

  20. Activities report in nuclear physics and particle acceleration

    NASA Astrophysics Data System (ADS)

    Jansen, J. F. W.; Demeijer, R. J.

    1984-04-01

    Research on nuclear resonances, charge transfer, breakup of light and heavy ions, reaction mechanisms of heavy-ion collisions, high-spin states, and fundamental symmetries in weak interactions is outlined. Group theoretical methods applied to supersymmetries, a phenomenological description of rotation-vibration coupling, a microscopic theory of collective variables, the binding energy of hydrogen adsorbed on stepped platinum, and single-electron capture are discussed. Isotopes were produced for nuclear medicine, for off-line nuclear spectroscopy work, and for the study of hyperfine interactions.

  1. Capturing Cognitive Processing Time for Active Authentication

    DTIC Science & Technology

    2014-02-01

    cognitive fingerprint for continuous authentication. Its effectiveness has been verified through a campus-wide experiment at Iowa State University… 3.1 Cognitive Fingerprint Description… brief to capture a "cognitive fingerprint." In the current keystroke-authentication commercial market, some products combine the timing information of

  2. Population Estimation Methods for Free-Ranging Dogs: A Systematic Review.

    PubMed

    Belo, Vinícius Silva; Werneck, Guilherme Loureiro; da Silva, Eduardo Sérgio; Barbosa, David Soeiro; Struchiner, Claudio José

    2015-01-01

    The understanding of the structure of free-roaming dog populations is of extreme importance for the planning and monitoring of population control strategies and animal welfare. The methods used to estimate the abundance of this group of dogs are more complex than those used with domiciled owned dogs. In this systematic review, we analyze the techniques and the results obtained in studies that seek to estimate the size of free-ranging dog populations. Twenty-six studies were reviewed regarding the quality of execution and their capacity to generate valid estimates. Seven of the eight publications that took a simple count of the animal population did not consider the different probabilities of animal detection; only one study used methods based on distances; twelve relied on capture-recapture models for closed populations without considering heterogeneities in capture probabilities; six studies applied their own methods with different potentials and limitations. Potential sources of bias in the studies were related to the inadequate description or implementation of animal capturing or viewing procedures and to inadequacies in the identification and registration of dogs. Thus, estimates with low validity predominated. Abundance and density estimates showed high variability, and all studies identified a greater number of male dogs. We point to enhancements necessary for the implementation of future studies and to potential updates and revisions to the recommendations of the World Health Organization with respect to the estimation of free-ranging dog populations.

  3. Coupling geostatistics to detailed reservoir description allows better visualization and more accurate characterization/simulation of turbidite reservoirs: Elk Hills oil field, California

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Allan, M.E.; Wilson, M.L.; Wightman, J.

    1996-12-31

    The Elk Hills giant oilfield, located in the southern San Joaquin Valley of California, has produced 1.1 billion barrels of oil from Miocene and shallow Pliocene reservoirs. 65% of the current 64,000 BOPD production is from the pressure-supported, deeper Miocene turbidite sands. In the turbidite sands of the 31 S structure, large porosity & permeability variations in the Main Body B and Western 31 S sands cause problems with the efficiency of the waterflooding. These variations have now been quantified and visualized using geostatistics. The end result is a more detailed reservoir characterization for simulation. Traditional reservoir descriptions based on marker correlations, cross-sections and mapping do not provide enough detail to capture the short-scale stratigraphic heterogeneity needed for adequate reservoir simulation. These deterministic descriptions are inadequate to tie with production data as the thinly bedded sand/shale sequences blur into a falsely homogeneous picture. By studying the variability of the geologic & petrophysical data vertically within each wellbore and spatially from well to well, a geostatistical reservoir description has been developed. It captures the natural variability of the sands and shales that was lacking from earlier work. These geostatistical studies allow the geologic and petrophysical characteristics to be considered in a probabilistic model. The end-product is a reservoir description that captures the variability of the reservoir sequences and can be used as a more realistic starting point for history matching and reservoir simulation.

  5. Description of Lutzomyia (Trichophoromyia) pabloi n. sp. and the female of L. howardi (Diptera: Psychodidae) from Colombia.

    PubMed

    Barreto, Mauricio; Burbano, María Elena; Young, David G

    2002-07-01

    A new Lutzomyia species in the subgenus Trichophoromyia, L. pabloi, is described and illustrated. A description of the previously unknown female of L. howardi Young is also presented. These specimens were captured in the Amazon region of Colombia.

  6. Integrated geostatistics for modeling fluid contacts and shales in Prudhoe Bay

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perez, G.; Chopra, A.K.; Severson, C.D.

    1997-12-01

    Geostatistics techniques are being used increasingly to model reservoir heterogeneity at a wide range of scales. A variety of techniques is now available with differing underlying assumptions, complexity, and applications. This paper introduces a novel method of geostatistics to model dynamic gas-oil contacts and shales in the Prudhoe Bay reservoir. The method integrates reservoir description and surveillance data within the same geostatistical framework. Surveillance logs and shale data are transformed to indicator variables. These variables are used to evaluate vertical and horizontal spatial correlation and cross-correlation of gas and shale at different times and to develop variogram models. Conditional simulation techniques are used to generate multiple three-dimensional (3D) descriptions of gas and shales that provide a measure of uncertainty. These techniques capture the complex 3D distribution of gas-oil contacts through time. The authors compare results of the geostatistical method with conventional techniques as well as with infill wells drilled after the study. Predicted gas-oil contacts and shale distributions are in close agreement with gas-oil contacts observed at infill wells.
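
    The indicator transform and variogram evaluation described above can be sketched for a single wellbore. The log values and cutoff below are hypothetical, and the actual study works in 3-D with cross-correlations through time.

```python
import numpy as np

def indicator_transform(values, cutoff):
    """Indicator-code a log: 1 where the value is at or below the
    cutoff (e.g. flagging a sand/shale state), 0 elsewhere."""
    return (np.asarray(values, float) <= cutoff).astype(float)

def experimental_variogram(ind, lags):
    """1-D experimental variogram of an indicator series along a
    wellbore: gamma(h) = half the mean squared difference at lag h."""
    return [0.5 * np.mean((ind[h:] - ind[:-h]) ** 2) for h in lags]

# Hypothetical gamma-ray readings alternating sand/shale every sample:
log_vals = [72, 30, 68, 25, 70, 28, 66, 31]
ind = indicator_transform(log_vals, cutoff=50.0)
gammas = experimental_variogram(ind, lags=[1, 2])
```

    A variogram model fitted to such experimental points is what conditional simulation then honors when generating the multiple 3D realizations.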

  7. Implementation of a hybrid particle code with a PIC description in r–z and a gridless description in ϕ into OSIRIS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davidson, A., E-mail: davidsoa@physics.ucla.edu; Tableman, A., E-mail: Tableman@physics.ucla.edu; An, W., E-mail: anweiming@ucla.edu

    2015-01-15

    For many plasma physics problems, three-dimensional and kinetic effects are very important. However, such simulations are very computationally intensive. Fortunately, there is a class of problems for which there is nearly azimuthal symmetry and the dominant three-dimensional physics is captured by the inclusion of only a few azimuthal harmonics. Recently, it was proposed [1] to model one such problem, laser wakefield acceleration, by expanding the fields and currents in azimuthal harmonics and truncating the expansion. The complex amplitudes of the fundamental and first harmonic for the fields were solved on an r–z grid, and a procedure for calculating the complex current amplitudes for each particle based on its motion in Cartesian geometry was presented, using Marder's correction to maintain the validity of Gauss's law. In this paper, we describe an implementation of this algorithm in OSIRIS using a rigorous charge-conserving current deposition method to maintain the validity of Gauss's law. We show that this algorithm is a hybrid method which uses a particle-in-cell description in r–z and a gridless description in ϕ. We include the ability to keep an arbitrary number of harmonics and higher-order particle shapes. Examples for laser wakefield acceleration, plasma wakefield acceleration, and beam loading are also presented, and directions for future work are discussed.

  8. Electron capture and excitation processes in H+-H collisions in dense quantum plasmas

    NASA Astrophysics Data System (ADS)

    Jakimovski, D.; Markovska, N.; Janev, R. K.

    2016-10-01

    Electron capture and excitation processes in proton-hydrogen atom collisions taking place in dense quantum plasmas are studied by employing the two-centre atomic orbital close-coupling (TC-AOCC) method. The Debye-Hückel cosine (DHC) potential is used to describe the plasma screening effects on the Coulomb interaction between charged particles. The properties of a hydrogen atom with DHC potential are investigated as a function of the screening strength of the potential. It is found that the decrease in binding energy of nl levels with increasing screening strength is considerably faster than in the case of the Debye-Hückel (DH) screening potential, appropriate for description of charged particle interactions in weakly coupled classical plasmas. This results in a reduction in the number of bound states in the DHC potential with respect to that in the DH potential for the same plasma screening strength, and is reflected in the dynamics of excitation and electron capture processes for the two screened potentials. The TC-AOCC cross sections for total and state-selective electron capture and excitation cross sections with the DHC potential are calculated for a number of representative screening strengths in the 1-300 keV energy range and compared with those for the DH and pure Coulomb potential. The total capture cross sections for a selected number of screening strengths are compared with the available results from classical trajectory Monte Carlo calculations.
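
    The two screening models can be compared directly. A common form of the DHC potential multiplies the Yukawa-type DH potential by a cosine factor; the sketch below (atomic units) treats this specific functional form as an assumption, since the paper's expressions are not reproduced in the abstract.

```python
import math

def v_dh(r, lam):
    """Debye-Hueckel (Yukawa) screened Coulomb potential, atomic units,
    for a proton-electron pair with screening length lam."""
    return -math.exp(-r / lam) / r

def v_dhc(r, lam):
    """Exponential-cosine screened potential often used for dense
    quantum plasmas: the DH form times cos(r/lam). Assumed functional
    form, for illustration only."""
    return -math.exp(-r / lam) * math.cos(r / lam) / r

# For r < pi*lam/2 the cosine factor makes the DHC well shallower than
# the DH well at the same screening length, consistent with the faster
# loss of binding energy reported for DHC screening.
shallower = v_dhc(2.0, 5.0) > v_dh(2.0, 5.0)
```

    In the weak-screening limit (large lam) both forms reduce to the bare Coulomb potential, which is the reference case the cross sections are compared against.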

  9. Quantitative metrics for assessment of chemical image quality and spatial resolution

    DOE PAGES

    Kertesz, Vilmos; Cahill, John F.; Van Berkel, Gary J.

    2016-02-28

    Rationale: Currently objective/quantitative descriptions of the quality and spatial resolution of mass spectrometry derived chemical images are not standardized. Development of these standardized metrics is required to objectively describe chemical imaging capabilities of existing and/or new mass spectrometry imaging technologies. Such metrics would allow unbiased judgment of intra-laboratory advancement and/or inter-laboratory comparison for these technologies if used together with standardized surfaces. Methods: We developed two image metrics, viz., chemical image contrast (ChemIC) based on signal-to-noise related statistical measures on chemical image pixels and corrected resolving power factor (cRPF) constructed from statistical analysis of mass-to-charge chronograms across features of interest in an image. These metrics, quantifying chemical image quality and spatial resolution, respectively, were used to evaluate chemical images of a model photoresist patterned surface collected using a laser ablation/liquid vortex capture mass spectrometry imaging system under different instrument operational parameters. Results: The calculated ChemIC and cRPF metrics determined in an unbiased fashion the relative ranking of chemical image quality obtained with the laser ablation/liquid vortex capture mass spectrometry imaging system. These rankings were used to show that both chemical image contrast and spatial resolution deteriorated with increasing surface scan speed, increased lane spacing and decreasing size of surface features. Conclusions: ChemIC and cRPF, respectively, were developed and successfully applied for the objective description of chemical image quality and spatial resolution of chemical images collected from model surfaces using a laser ablation/liquid vortex capture mass spectrometry imaging system.
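
    The published ChemIC formula is not reproduced in the abstract; the following is a generic signal-to-noise contrast metric in the same spirit, applied to synthetic data, purely as an assumed illustration of how such a metric ranks images.

```python
import numpy as np

def snr_contrast(image, feature_mask):
    """Signal-to-noise style contrast: mean on-feature signal above
    background, in units of the background standard deviation. (The
    published ChemIC definition may differ; this is an illustration.)"""
    background = image[~feature_mask]
    return (image[feature_mask].mean() - background.mean()) / background.std()

rng = np.random.default_rng(0)
img = rng.normal(10.0, 1.0, (64, 64))  # noisy background
mask = np.zeros((64, 64), dtype=bool)
mask[20:40, 20:40] = True
img[mask] += 8.0                        # bright square "feature"
contrast = snr_contrast(img, mask)
```

    Degrading acquisition (faster scans, coarser lanes) would raise the background noise or blur the feature, lowering such a contrast score, which is the ranking behavior the paper reports.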

  11. Automatic anatomical structures location based on dynamic shape measurement

    NASA Astrophysics Data System (ADS)

    Witkowski, Marcin; Rapp, Walter; Sitnik, Robert; Kujawinska, Malgorzata; Vander Sloten, Jos; Haex, Bart; Bogaert, Nico; Heitmann, Kjell

    2005-09-01

    New image processing methods and active photonics apparatus have made possible the development of relatively inexpensive optical systems for complex shape and object measurements. We present a dynamic 360° scanning method for the analysis of human lower body biomechanics, with an emphasis on the knee joint. One anatomical structure of high medical interest that can be scanned and analyzed is the patella. Tracking patella position and orientation under dynamic conditions may help detect pathological patella movements and aid knee joint disease diagnosis. The processed data are obtained from a dynamic laser triangulation surface measurement system able to capture slow to normal movements with a scan frequency between 15 and 30 Hz. These frequency rates are sufficient to capture controlled movements used, e.g., for medical examination purposes. The purpose of the work presented is to develop surface analysis methods that may support the diagnosis of motoric abilities of the lower limbs. The paper presents the algorithms used to process the acquired lower limb surface data in order to find the position and orientation of the patella. The algorithms implemented include input data preparation, curvature description methods, knee region discrimination, and calculation of the assumed patella position/orientation. Additionally, a method of 4D (3D + time) medical data visualization is proposed, and some exemplary results are presented.

  12. 3-D conditional hyperbolic method of moments for high-fidelity Euler-Euler simulations of particle-laden flows

    NASA Astrophysics Data System (ADS)

    Patel, Ravi; Kong, Bo; Capecelatro, Jesse; Fox, Rodney; Desjardins, Olivier

    2017-11-01

    Particle-laden turbulent flows are important features of many environmental and industrial processes. Euler-Euler (EE) simulations of these flows are more computationally efficient than Euler-Lagrange (EL) simulations. However, traditional EE methods, such as the two-fluid model, cannot faithfully capture dilute regions of flow with finite Stokes number particles. For this purpose, the multi-valued nature of the particle velocity field must be treated with a polykinetic description. Various quadrature-based moment methods (QBMM) can be used to approximate the full kinetic description by solving for a set of moments of the particle velocity distribution function (VDF) and providing closures for the higher-order moments. Early QBMM fail to maintain the strict hyperbolicity of the kinetic equations, producing unphysical delta shocks (i.e., mass accumulation at a point). In previous work, a 2-D conditional hyperbolic quadrature method of moments (CHyQMOM) was proposed as a fourth-order QBMM closure that maintains strict hyperbolicity. Here, we present the 3-D extension of CHyQMOM. We compare results from CHyQMOM to other QBMM and EL in the context of particle trajectory crossing, cluster-induced turbulence, and particle-laden channel flow. NSF CBET-1437903.

  13. The Study Team for Early Life Asthma Research (STELAR) consortium ‘Asthma e-lab’: team science bringing data, methods and investigators together

    PubMed Central

    Custovic, Adnan; Ainsworth, John; Arshad, Hasan; Bishop, Christopher; Buchan, Iain; Cullinan, Paul; Devereux, Graham; Henderson, John; Holloway, John; Roberts, Graham; Turner, Steve; Woodcock, Ashley; Simpson, Angela

    2015-01-01

    We created Asthma e-Lab, a secure web-based research environment to support consistent recording, description and sharing of data, computational/statistical methods and emerging findings across the five UK birth cohorts. The e-Lab serves as a data repository for our unified dataset and provides the computational resources and a scientific social network to support collaborative research. All activities are transparent, and emerging findings are shared via the e-Lab, linked to explanations of analytical methods, thus enabling knowledge transfer. The e-Lab facilitates an iterative interdisciplinary dialogue between clinicians, statisticians, computer scientists, mathematicians, geneticists and basic scientists, capturing the collective thought behind the interpretation of findings. PMID:25805205

  14. Classification of human carcinoma cells using multispectral imagery

    NASA Astrophysics Data System (ADS)

    Çinar, Umut; Y. Çetin, Yasemin; Çetin-Atalay, Rengul; Çetin, Enis

    2016-03-01

    In this paper, we present a technique for automatically classifying human carcinoma cell images using textural features. An image dataset containing microscopy biopsy images from different patients for 14 distinct cancer cell line types is studied. The images are captured using an RGB camera attached to an inverted microscope. Texture-based Gabor features are extracted from the multispectral input images, and an SVM classifier is used to generate a descriptive model for cell line classification. The experimental results show satisfactory performance, and the proposed method is robust across various microscopy magnification options.

  15. Adaptive simplification of complex multiscale systems.

    PubMed

    Chiavazzo, Eliodoro; Karlin, Ilya

    2011-03-01

    A fully adaptive methodology is developed for reducing the complexity of large dissipative systems. This represents a significant step toward extracting essential physical knowledge from complex systems, by addressing the challenging problem of the minimal number of variables needed to exactly capture the system dynamics. An accurate reduced description is achieved by constructing a hierarchy of slow invariant manifolds, with an embarrassingly simple implementation in any dimension. The method is validated on the autoignition of a hydrogen-air mixture, where a reduction to a cascade of slow invariant manifolds is observed.

  16. Using observational methods in nursing research.

    PubMed

    Salmon, Jenny

    2015-07-08

    Observation is a research data-collection method generally used to capture the activities of participants as well as when and where things are happening in a given setting. It checks descriptions of the phenomena against what the researcher perceives to be fact in a rich experiential context. The method's main strength is that it provides direct access to the social phenomena under consideration. It can be used quantitatively or qualitatively, depending on the research question. Challenges in using observation relate to adopting the role of participant or non-participant researcher as observer. This article discusses some of the complexities involved when nurse researchers seek to collect observational data on social processes in naturalistic settings using unstructured or structured observational methods in qualitative research methodology. A glossary of research terms is provided.

  17. Method of conditional moments (MCM) for the Chemical Master Equation: a unified framework for the method of moments and hybrid stochastic-deterministic models.

    PubMed

    Hasenauer, J; Wolf, V; Kazeroonian, A; Theis, F J

    2014-09-01

    The time-evolution of continuous-time discrete-state biochemical processes is governed by the Chemical Master Equation (CME), which describes the probability of the molecular counts of each chemical species. As the corresponding number of discrete states is, for most processes, large, a direct numerical simulation of the CME is in general infeasible. In this paper we introduce the method of conditional moments (MCM), a novel approximation method for the solution of the CME. The MCM employs a discrete stochastic description for low-copy number species and a moment-based description for medium/high-copy number species. The moments of the medium/high-copy number species are conditioned on the state of the low abundance species, which allows us to capture complex correlation structures arising, e.g., for multi-attractor and oscillatory systems. We prove that the MCM provides a generalization of previous approximations of the CME based on hybrid modeling and moment-based methods. Furthermore, it improves upon these existing methods, as we illustrate using a model for the dynamics of stochastic single-gene expression. This application example shows that due to the more general structure, the MCM allows for the approximation of multi-modal distributions.
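
    The full MCM conditions the moments on the discrete low-copy states; the moment-based half alone can be illustrated with the mean and variance equations of a linear birth-death process, dμ/dt = k − γμ and dσ²/dt = k + γμ − 2γσ², whose steady state is the Poisson result μ = σ² = k/γ (rate values below are assumed for illustration, not from the paper):

```python
# Moment equations for a linear birth-death process, integrated by
# forward Euler; k is the production rate, g the degradation rate.
k, g = 10.0, 1.0       # assumed rate values
mu, var = 0.0, 0.0     # initial mean and variance
dt = 1e-3
for _ in range(20000):  # integrate to t = 20, many relaxation times
    dmu  = k - g * mu
    dvar = k + g * mu - 2.0 * g * var
    mu  += dt * dmu
    var += dt * dvar
```

    For nonlinear propensities the moment equations no longer close, which is where moment closures and the conditional construction of the MCM come in.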

  18. Modeling Amorphous Microporous Polymers for CO2 Capture and Separations.

    PubMed

    Kupgan, Grit; Abbott, Lauren J; Hart, Kyle E; Colina, Coray M

    2018-06-13

    This review concentrates on the advances of atomistic molecular simulations to design and evaluate amorphous microporous polymeric materials for CO2 capture and separations. A description of atomistic molecular simulations is provided, including simulation techniques, structural generation approaches, relaxation and equilibration methodologies, and considerations needed for validation of simulated samples. The review provides general guidelines and a comprehensive update of the recent literature (since 2007) to promote the acceleration of the discovery and screening of amorphous microporous polymers for CO2 capture and separation processes.

  19. Mean-Field Description of Ionic Size Effects with Non-Uniform Ionic Sizes: A Numerical Approach

    PubMed Central

    Zhou, Shenggao; Wang, Zhongming; Li, Bo

    2013-01-01

    Ionic size effects are significant in many biological systems. Mean-field descriptions of such effects can be efficient but also challenging. When ionic sizes are different, explicit formulas for the dependence of the ionic concentrations on the electrostatic potential are not available in such descriptions, i.e., there are no explicit Boltzmann-type distributions. This work begins with a variational formulation of the continuum electrostatics of an ionic solution with such non-uniform ionic sizes as well as multiple ionic valences. An augmented Lagrange multiplier method is then developed and implemented to numerically solve the underlying constrained optimization problem. The method is shown to be accurate and efficient, and is applied to ionic systems with non-uniform ionic sizes, such as the sodium chloride solution. Extensive numerical tests demonstrate that the mean-field model and numerical method capture qualitatively some significant ionic size effects, particularly those for multivalent ionic solutions, such as the stratification of multivalent counterions near a charged surface. The ionic valence-to-volume ratio is found to be the key physical parameter in the stratification of concentrations. None of these effects is well described by the classical Poisson–Boltzmann theory, or by the generalized Poisson–Boltzmann theory that treats uniform ionic sizes. Finally, various issues such as close packing, the limitations of the continuum model, and the generalization of this work to molecular solvation are discussed. PMID:21929014
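
    The paper applies the augmented Lagrange multiplier method to a specific constrained electrostatics functional; the generic iteration it builds on can be sketched on a toy equality-constrained problem (the objective, constraint, step sizes, and inner gradient-descent solver below are all illustrative assumptions):

```python
import numpy as np

def augmented_lagrangian(f_grad, g, g_grad, x0,
                         rho=10.0, outer=30, inner=500, lr=1e-2):
    """Minimize f subject to g(x) = 0 via the augmented Lagrangian
    L(x) = f(x) + lam*g(x) + (rho/2)*g(x)^2, alternating inexact
    gradient-descent inner solves with multiplier updates."""
    x, lam = np.asarray(x0, dtype=float), 0.0
    for _ in range(outer):
        for _ in range(inner):
            grad = f_grad(x) + (lam + rho * g(x)) * g_grad(x)
            x -= lr * grad
        lam += rho * g(x)       # multiplier update
    return x, lam

# Example: minimize (x-2)^2 + (y-1)^2 on the line x + y = 1.
# The constrained minimizer is (1, 0) with multiplier lam = 2.
x, lam = augmented_lagrangian(
    f_grad=lambda x: 2.0 * (x - np.array([2.0, 1.0])),
    g=lambda x: x[0] + x[1] - 1.0,
    g_grad=lambda x: np.array([1.0, 1.0]),
    x0=[0.0, 0.0])
```

    The quadratic penalty keeps the inner problem well conditioned while the multiplier update drives the constraint violation to zero without letting rho grow unboundedly.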

  20. Measuring exposure to protobacco marketing and media: a field study using ecological momentary assessment.

    PubMed

    Martino, Steven C; Scharf, Deborah M; Setodji, Claude M; Shadel, William G

    2012-04-01

    The aims of this study were to validate ecological momentary assessment (EMA) as a method for measuring exposure to tobacco-related marketing and media and to use this method to provide detailed descriptive data on college students' exposure to protobacco marketing and media. College students (n = 134; ages 18-24 years) recorded their exposures to protobacco marketing and media on handheld devices for 21 consecutive days. Participants also recalled exposures to various types of protobacco marketing and media at the end of the study period. Retrospectively recalled and EMA-based estimates of protobacco marketing exposure captured different information. The correlation between retrospectively recalled and EMA-logged exposures to tobacco marketing and media was moderate (r = .37, p < .001), and EMA-logged exposures were marginally associated with the intention to smoke at the end of the study, whereas retrospective recall of exposure was not. EMA data showed that college students were exposed to protobacco marketing through multiple channels in a relatively short period: Exposures (M = 8.24, SD = 7.85) occurred primarily in the afternoon (42%), on weekends (35%), and at point-of-purchase locations (68%) or in movies/TV (20%), and exposures to Marlboro, Newport, and Camel represented 56% of all exposures combined and 70% of branded exposures. Findings support the validity of EMA as a method for capturing detailed information about youth exposure to protobacco marketing and media that are not captured through other existing methods. Such data have the potential to highlight areas for policy change and prevention in order to reduce the impact of tobacco marketing on youth.

  1. Effects of a Training Package to Improve the Accuracy of Descriptive Analysis Data Recording

    ERIC Educational Resources Information Center

    Mayer, Kimberly L.; DiGennaro Reed, Florence D.

    2013-01-01

    Functional behavior assessment is an important precursor to developing interventions to address a problem behavior. Descriptive analysis, a type of functional behavior assessment, is effective in informing intervention design only if the gathered data accurately capture relevant events and behaviors. We investigated a training procedure to improve…

  2. Programmable Potentials: Approximate N-body potentials from coarse-level logic.

    PubMed

    Thakur, Gunjan S; Mohr, Ryan; Mezić, Igor

    2016-09-27

    This paper gives a systematic method for constructing an N-body potential, approximating the true potential, that accurately captures meso-scale behavior of the chemical or biological system using pairwise potentials coming from experimental data or ab initio methods. The meso-scale behavior is translated into logic rules for the dynamics. Each pairwise potential has an associated logic function that is constructed using the logic rules, a class of elementary logic functions, and AND, OR, and NOT gates. The effect of each logic function is to turn its associated potential on and off. The N-body potential is constructed as linear combination of the pairwise potentials, where the "coefficients" of the potentials are smoothed versions of the associated logic functions. These potentials allow a potentially low-dimensional description of complex processes while still accurately capturing the relevant physics at the meso-scale. We present the proposed formalism to construct coarse-grained potential models for three examples: an inhibitor molecular system, bond breaking in chemical reactions, and DNA transcription from biology. The method can potentially be used in reverse for design of molecular processes by specifying properties of molecules that can carry them out.
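
    As a hedged illustration of the construction (not the authors' code), a pairwise potential can be switched on and off by a smoothed logic function built from elementary gates; the binding variables, gate rule, and Lennard-Jones stand-in for a tabulated pair potential below are all hypothetical:

```python
import math

def sigmoid(u, k=10.0):
    """Smoothed step: ~0 for u << 0, ~1 for u >> 0."""
    return 1.0 / (1.0 + math.exp(-k * u))

def smooth_and(a, b):
    return a * b           # smoothed AND gate

def smooth_not(a):
    return 1.0 - a         # smoothed NOT gate

def pair_lj(r, eps=1.0, sig=1.0):
    """Lennard-Jones pair potential, standing in for experimental
    or ab initio pairwise data."""
    x = (sig / r) ** 6
    return 4.0 * eps * (x * x - x)

def gated_potential(r, bound_a, bound_b):
    """Pairwise term active only when A is bound AND B is not:
    a toy inhibitor-style logic rule in the spirit of the paper."""
    gate = smooth_and(sigmoid(bound_a), smooth_not(sigmoid(bound_b)))
    return gate * pair_lj(r)
```

    At the LJ minimum r = 2**(1/6) the gated term contributes the full well depth when the rule is satisfied and essentially vanishes otherwise; summing such gated terms over pairs gives the N-body potential described above.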

  3. Programmable Potentials: Approximate N-body potentials from coarse-level logic

    NASA Astrophysics Data System (ADS)

    Thakur, Gunjan S.; Mohr, Ryan; Mezić, Igor

    2016-09-01

    This paper gives a systematic method for constructing an N-body potential, approximating the true potential, that accurately captures meso-scale behavior of the chemical or biological system using pairwise potentials coming from experimental data or ab initio methods. The meso-scale behavior is translated into logic rules for the dynamics. Each pairwise potential has an associated logic function that is constructed using the logic rules, a class of elementary logic functions, and AND, OR, and NOT gates. The effect of each logic function is to turn its associated potential on and off. The N-body potential is constructed as linear combination of the pairwise potentials, where the “coefficients” of the potentials are smoothed versions of the associated logic functions. These potentials allow a potentially low-dimensional description of complex processes while still accurately capturing the relevant physics at the meso-scale. We present the proposed formalism to construct coarse-grained potential models for three examples: an inhibitor molecular system, bond breaking in chemical reactions, and DNA transcription from biology. The method can potentially be used in reverse for design of molecular processes by specifying properties of molecules that can carry them out.

  4. Programmable Potentials: Approximate N-body potentials from coarse-level logic

    PubMed Central

    Thakur, Gunjan S.; Mohr, Ryan; Mezić, Igor

    2016-01-01

    This paper gives a systematic method for constructing an N-body potential, approximating the true potential, that accurately captures meso-scale behavior of the chemical or biological system using pairwise potentials coming from experimental data or ab initio methods. The meso-scale behavior is translated into logic rules for the dynamics. Each pairwise potential has an associated logic function that is constructed using the logic rules, a class of elementary logic functions, and AND, OR, and NOT gates. The effect of each logic function is to turn its associated potential on and off. The N-body potential is constructed as linear combination of the pairwise potentials, where the “coefficients” of the potentials are smoothed versions of the associated logic functions. These potentials allow a potentially low-dimensional description of complex processes while still accurately capturing the relevant physics at the meso-scale. We present the proposed formalism to construct coarse-grained potential models for three examples: an inhibitor molecular system, bond breaking in chemical reactions, and DNA transcription from biology. The method can potentially be used in reverse for design of molecular processes by specifying properties of molecules that can carry them out. PMID:27671683

  5. Longitudinal construct validity: establishment of clinical meaning in patient evaluative instruments.

    PubMed

    Liang, M H

    2000-09-01

    Although widely used and reported in research for the evaluation of groups, measures of health status and health-related quality of life have had little application in clinical practice for the assessment of individual patients. One of the principal barriers is the demonstration that these measures add clinically significant information to measures of function or symptoms alone. Here, we review the methods for evaluation of construct validity in longitudinal studies and make recommendations for nomenclature, reporting of study results, and future research agenda. Analytical review. The terms "sensitivity" and "responsiveness" have been used interchangeably, and there are few studies that evaluate the extent to which health status or health-related quality-of life measures capture clinically important changes ("responsiveness"). Current methods of evaluating responsiveness are not standardized or evaluated. Approaches for the assessment of a clinically significant or meaningful change are described; rather than normative information, however, standardized transition questions are proposed. They would be reported routinely and as separate axes of description to capture individual perceptions. Research in methods to assess the subject's evaluation of the importance and magnitude of a measured change are critical if health status and health-related quality-of-life measures are to have an impact on patient care.

  6. "Glue" approximation for the pairing interaction in the Hubbard model with next nearest neighbor hopping

    NASA Astrophysics Data System (ADS)

    Khatami, Ehsan; Macridin, Alexandru; Jarrell, Mark

    2008-03-01

    Recently, several authors have employed the "glue" approximation for the Cuprates, in which the full pairing vertex is approximated by the spin susceptibility. We study this approximation using Quantum Monte Carlo Dynamical Cluster Approximation methods on a 2D Hubbard model. By considering a reasonable finite value for the next-nearest-neighbor hopping, we find that this "glue" approximation, in its current form, does not capture the correct pairing symmetry: d-wave is not the leading pairing symmetry in the approximation, while it is the dominant symmetry in the "exact" QMC results. We argue that the sensitivity of this approximation to band structure changes leads to this inconsistency and that this form of interaction may not be the appropriate description of the pairing mechanism in the Cuprates. We suggest improvements to this approximation that help capture the essential features of the QMC data.

  7. Measuring Exposure to Protobacco Marketing and Media: A Field Study Using Ecological Momentary Assessment

    PubMed Central

    Martino, Steven C.; Scharf, Deborah M.; Setodji, Claude M.; Shadel, William G.

    2012-01-01

    Introduction: The aims of this study were to validate ecological momentary assessment (EMA) as a method for measuring exposure to tobacco-related marketing and media and to use this method to provide detailed descriptive data on college students’ exposure to protobacco marketing and media. Methods: College students (n = 134; ages 18–24 years) recorded their exposures to protobacco marketing and media on handheld devices for 21 consecutive days. Participants also recalled exposures to various types of protobacco marketing and media at the end of the study period. Results: Retrospectively recalled and EMA-based estimates of protobacco marketing exposure captured different information. The correlation between retrospectively recalled and EMA-logged exposures to tobacco marketing and media was moderate (r = .37, p < .001), and EMA-logged exposures were marginally associated with the intention to smoke at the end of the study, whereas retrospective recall of exposure was not. EMA data showed that college students were exposed to protobacco marketing through multiple channels in a relatively short period: Exposures (M = 8.24, SD = 7.85) occurred primarily in the afternoon (42%), on weekends (35%), and at point-of-purchase locations (68%) or in movies/TV (20%), and exposures to Marlboro, Newport, and Camel represented 56% of all exposures combined and 70% of branded exposures. Conclusions: Findings support the validity of EMA as a method for capturing detailed information about youth exposure to protobacco marketing and media that are not captured through other existing methods. Such data have the potential to highlight areas for policy change and prevention in order to reduce the impact of tobacco marketing on youth. PMID:22039076

  8. CHEMICAL EFFECTS IN BIOLOGICAL SYSTEMS – DATA DICTIONARY (CEBS-DD): A COMPENDIUM OF TERMS FOR THE CAPTURE AND INTEGRATION OF BIOLOGICAL STUDY DESIGN DESCRIPTION, CONVENTIONAL PHENOTYPES AND ‘OMICS’ DATA

    EPA Science Inventory

    A critical component in the design of the Chemical Effects in Biological Systems (CEBS) Knowledgebase is a strategy to capture toxicogenomics study protocols and the toxicity endpoint data (clinical pathology and histopathology). A Study is generally an experiment carried out du...

  9. IDEF5 Ontology Description Capture Method: Concept Paper

    NASA Technical Reports Server (NTRS)

    Menzel, Christopher P.; Mayer, Richard J.

    1990-01-01

    The results of research toward an ontology capture method referred to as IDEF5 are presented. Viewed simply as the study of what exists in a domain, ontology is an activity at work across the full range of human inquiry, prompted by the persistent effort to understand the world in which humankind has found itself - and which it has helped to shape. In the context of information management, ontology is the task of extracting the structure of a given engineering, manufacturing, business, or logistical domain and storing it in a usable representational medium. A key to effective integration is a system ontology that can be accessed and modified across domains and that captures common features of the overall system relevant to the goals of the disparate domains. If the focus is on information integration, then the strongest motivation for ontology comes from the need to support data sharing and function interoperability. In the correct architecture, an enterprise ontology base would allow the construction of an integrated environment in which legacy systems appear to be open-architecture integrated resources. If the focus is on system/software development, then support for the rapid acquisition of reliable systems is perhaps the strongest motivation for ontology. Finally, ontological analysis was demonstrated to be an effective first step in the construction of robust knowledge-based systems.

  10. Measures for brain connectivity analysis: nodes centrality and their invariant patterns

    NASA Astrophysics Data System (ADS)

    da Silva, Laysa Mayra Uchôa; Baltazar, Carlos Arruda; Silva, Camila Aquemi; Ribeiro, Mauricio Watanabe; de Aratanha, Maria Adelia Albano; Deolindo, Camila Sardeto; Rodrigues, Abner Cardoso; Machado, Birajara Soares

    2017-07-01

    The high dynamical complexity of the brain is related to its small-world topology, which enables both segregated and integrated information processing. Several connectivity estimation measures have already been employed to characterize functional brain networks from multivariate electrophysiological data. However, understanding which properties of each measure lead to a better description of the real topology and capture the complex phenomena present in the brain remains challenging. In this work we compared four nonlinear connectivity measures and show that each method characterizes distinct features of brain interactions. The results suggest an invariance of global network parameters across different behavioral states, and that a more complete description may be reached by considering local features, independently of the connectivity measure employed. Our findings also point to future perspectives in connectivity studies that combine distinct and complementary dependence measures in assembling higher-dimensional manifolds.
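
    The nonlinear dependence measures compared in the paper are not specified in the abstract, but the node-centrality side is standard graph analysis; a self-contained sketch of two common centralities on a toy adjacency list (illustrative only, not the authors' pipeline):

```python
from collections import deque

def bfs_dists(adj, s):
    """Hop distances from node s by breadth-first search."""
    d = {s: 0}
    q = deque([s])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in d:
                d[v] = d[u] + 1
                q.append(v)
    return d

def degree_centrality(adj):
    """Degree of each node, normalized by the maximum possible n-1."""
    n = len(adj)
    return {u: len(nb) / (n - 1) for u, nb in adj.items()}

def closeness_centrality(adj):
    """(n-1) divided by the sum of hop distances to all other nodes."""
    n = len(adj)
    return {u: (n - 1) / sum(bfs_dists(adj, u).values()) for u in adj}

# Star graph: hub 0 linked to 1..4; the hub dominates both measures
star = {0: [1, 2, 3, 4], 1: [0], 2: [0], 3: [0], 4: [0]}
```

    In functional-connectivity studies the adjacency would come from thresholding a dependence measure between electrode signals; the centrality computation itself is unchanged.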

  11. Framework Programmable Platform for the Advanced Software Development Workstation (FPP/ASDW). Demonstration framework document. Volume 2: Framework process description

    NASA Technical Reports Server (NTRS)

    Mayer, Richard J.; Blinn, Thomas M.; Dewitte, Paula S.; Crump, John W.; Ackley, Keith A.

    1992-01-01

    In the second volume of the Demonstration Framework Document, the graphical representation of the demonstration framework is given. This second document was created to facilitate the reading and comprehension of the demonstration framework. It is designed to be viewed in parallel with Section 4.2 of the first volume to help give a picture of the relationships between the UOB's (Unit of Behavior) of the model. The model is quite large and the design team felt that this form of presentation would make it easier for the reader to get a feel for the processes described in this document. The IDEF3 (Process Description Capture Method) diagrams of the processes of an Information System Development are presented. Volume 1 describes the processes and the agents involved with each process, while this volume graphically shows the precedence relationships among the processes.

  12. Statistical inference for capture-recapture experiments

    USGS Publications Warehouse

    Pollock, Kenneth H.; Nichols, James D.; Brownie, Cavell; Hines, James E.

    1990-01-01

    This monograph presents a detailed, practical exposition on the design, analysis, and interpretation of capture-recapture studies. The Lincoln-Petersen model (Chapter 2) and the closed population models (Chapter 3) are presented only briefly because these models have been covered in detail elsewhere. The Jolly- Seber open population model, which is central to the monograph, is covered in detail in Chapter 4. In Chapter 5 we consider the "enumeration" or "calendar of captures" approach, which is widely used by mammalogists and other vertebrate ecologists. We strongly recommend that it be abandoned in favor of analyses based on the Jolly-Seber model. We consider 2 restricted versions of the Jolly-Seber model. We believe the first of these, which allows losses (mortality or emigration) but not additions (births or immigration), is likely to be useful in practice. Another series of restrictive models requires the assumptions of a constant survival rate or a constant survival rate and a constant capture rate for the duration of the study. Detailed examples are given that illustrate the usefulness of these restrictions. There often can be a substantial gain in precision over Jolly-Seber estimates. In Chapter 5 we also consider 2 generalizations of the Jolly-Seber model. The temporary trap response model allows newly marked animals to have different survival and capture rates for 1 period. The other generalization is the cohort Jolly-Seber model. Ideally all animals would be marked as young, and age effects considered by using the Jolly-Seber model on each cohort separately. In Chapter 6 we present a detailed description of an age-dependent Jolly-Seber model, which can be used when 2 or more identifiable age classes are marked. In Chapter 7 we present a detailed description of the "robust" design. Under this design each primary period contains several secondary sampling periods. 
We propose an estimation procedure based on closed and open population models that allows for heterogeneity and trap response of capture rates (hence the name robust design). We begin by considering just 1 age class and then extend to 2 age classes. When there are 2 age classes it is possible to distinguish immigrants and births. In Chapter 8 we give a detailed discussion of the design of capture-recapture studies. First, capture-recapture is compared to other possible sampling procedures. Next, the design of capture-recapture studies to minimize assumption violations is considered. Finally, we consider the precision of parameter estimates and present figures on proportional standard errors for a variety of initial parameter values to aid the biologist about to plan a study. A new program, JOLLY, has been written to accompany the material on the Jolly-Seber model (Chapter 4) and its extensions (Chapter 5). Another new program, JOLLYAGE, has been written for a special case of the age-dependent model (Chapter 6) where there are only 2 age classes. In Chapter 9 a brief description of the different versions of the 2 programs is given. Chapter 10 gives a brief description of some alternative approaches that were not considered in this monograph. We believe that an excellent overall view of capture-recapture models may be obtained by reading the monograph by White et al. (1982) emphasizing closed models and then reading this monograph where we concentrate on open models. The important recent monograph by Burnham et al. (1987) could then be read if there were interest in the comparison of different populations.
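
    The simplest closed-population case covered in Chapter 2, the two-sample Lincoln-Petersen model, reduces to a closed-form abundance estimate; a minimal sketch with illustrative counts (not data from the monograph):

```python
def lincoln_petersen(n1, n2, m2):
    """Classic Lincoln-Petersen abundance estimate from a two-sample
    capture-recapture study: n1 animals marked in sample 1, n2 caught
    in sample 2, of which m2 carry marks."""
    return n1 * n2 / m2

def chapman(n1, n2, m2):
    """Chapman's bias-corrected variant, finite even when m2 = 0."""
    return (n1 + 1) * (n2 + 1) / (m2 + 1) - 1
```

    Both rest on the closure assumptions discussed in the monograph (no births, deaths, or migration between samples, and equal catchability); the Jolly-Seber machinery exists precisely to relax them.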

  13. A Principled Approach to the Specification of System Architectures for Space Missions

    NASA Technical Reports Server (NTRS)

    McKelvin, Mark L. Jr.; Castillo, Robert; Bonanne, Kevin; Bonnici, Michael; Cox, Brian; Gibson, Corrina; Leon, Juan P.; Gomez-Mustafa, Jose; Jimenez, Alejandro; Madni, Azad

    2015-01-01

    Modern space systems are increasing in complexity and scale at an unprecedented pace. Consequently, innovative methods, processes, and tools are needed to cope with the increasing complexity of architecting these systems. A key challenge in practice is the ability to scale the processes, methods, and tools used to architect complex space systems. Traditionally, the process for specifying space system architectures has largely relied on capturing the system architecture in informal descriptions that are often embedded within loosely coupled design documents and domain expertise. Such informal descriptions often lead to misunderstandings between design teams, ambiguous specifications, difficulty in maintaining consistency as the architecture evolves throughout the system development life cycle, and costly design iterations. Traditional methods are therefore becoming increasingly inefficient in coping with ever-increasing system complexity. We apply the principles of component-based design and platform-based design to the development of the system architecture for a practical space system to demonstrate the feasibility of our approach using SysML. Our results show that we are able to apply a systematic design method to manage system complexity, enabling effective data management, semantic coherence, and traceability across different levels of abstraction in the design chain. Just as important, our approach enables interoperability among heterogeneous tools in a concurrent-engineering, model-based design environment.

  14. Firing rate dynamics in recurrent spiking neural networks with intrinsic and network heterogeneity.

    PubMed

    Ly, Cheng

    2015-12-01

    Heterogeneity of neural attributes has recently gained a lot of attention and is increasingly recognized as a crucial feature of neural processing. Despite its importance, this physiological feature has traditionally been neglected in theoretical studies of cortical neural networks. Thus, much remains unknown about the consequences of cellular and circuit heterogeneity in spiking neural networks. In particular, combining network or synaptic heterogeneity with intrinsic heterogeneity has yet to be considered systematically, despite the fact that both are known to exist and likely play significant roles in neural network dynamics. In a canonical recurrent spiking neural network model, we study how these two forms of heterogeneity lead to different distributions of excitatory firing rates. To analytically characterize how these types of heterogeneity affect the network, we employ a dimension reduction method that relies on a combination of Monte Carlo simulations and probability density function equations. We find that the relationship between intrinsic and network heterogeneity has a strong effect on the overall level of firing-rate heterogeneity. Specifically, this relationship can lead to amplification or attenuation of firing rate heterogeneity, and these effects depend on whether the recurrent network is firing asynchronously or rhythmically. These observations are captured by the reduction method, from which simpler analytic descriptions are also developed. The final analytic descriptions provide compact formulas for how the relationship between intrinsic and network heterogeneity determines the firing rate heterogeneity dynamics in various settings.

  15. A painless and constraint-free method to estimate viscoelastic passive dynamics of limbs' joints to support diagnosis of neuromuscular diseases.

    PubMed

    Venture, Gentiane; Nakamura, Yoshihiko; Yamane, Katsu; Hirashima, Masaya

    2007-01-01

    Though seldom identified, the dynamics of human joints is important in the fields of medical robotics and medical research. We present a general solution for estimating, in vivo and simultaneously, the passive dynamics of the joints of the human limbs. It is based on a multi-body description of the human body and its kinematics and dynamics computations. The linear passive joint dynamics of the shoulders and elbows (stiffness, viscosity, and friction) are estimated simultaneously using the linear least squares method. Movements were acquired with an optical motion capture studio on one examinee during the clinical diagnosis of neuromuscular diseases. Experimental results are given and discussed.
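
    The identification step described, linear least squares over an assumed linear passive model τ = Kθ + Bθ̇ + C·sign(θ̇), can be sketched on synthetic data (the parameter values and trajectory below are illustrative assumptions, not the paper's measurements):

```python
import numpy as np

# Synthetic trajectory: joint angle, velocity, and the torque produced
# by an assumed linear passive model tau = K*q + B*dq + C*sign(dq)
t  = np.linspace(0.0, 2.0, 200)
q  = 0.4 * np.sin(2.0 * np.pi * t)            # joint angle [rad]
dq = 0.8 * np.pi * np.cos(2.0 * np.pi * t)    # joint velocity [rad/s]
K_true, B_true, C_true = 2.0, 0.5, 0.1        # stiffness, viscosity, friction
tau = K_true * q + B_true * dq + C_true * np.sign(dq)

# Linear least squares on the regressor matrix [q, dq, sign(dq)]
Phi = np.column_stack([q, dq, np.sign(dq)])
(K, B, C), *_ = np.linalg.lstsq(Phi, tau, rcond=None)
```

    On noise-free data the parameters are recovered exactly; with motion-capture data, the torques come from inverse dynamics and the regression absorbs measurement noise.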

  16. Approaches to capturing the financial cost of family care-giving within a palliative care context: a systematic review.

    PubMed

    Gardiner, Clare; Brereton, Louise; Frey, Rosemary; Wilkinson-Meyers, Laura; Gott, Merryn

    2016-09-01

    The economic burden faced by family caregivers of people at the end of life is well recognised. Financial burden has a significant impact on the provision of family care-giving in the community setting, but has seen limited research attention. A systematic review with realist review synthesis and thematic analysis was undertaken to identify literature relating to the financial costs and impact of family care-giving at the end of life. This paper reports findings relating to previously developed approaches which capture the financial costs and implications of caring for family members receiving palliative/end-of-life care. Seven electronic databases were searched from inception to April 2012, for original research studies relating to the financial impact of care-giving at the end of life. Studies were independently screened to identify those which met the study inclusion criteria, and the methodological quality of included studies was appraised using realist review criteria of relevance and rigour. A descriptive thematic approach was used to synthesise data. Twelve articles met the inclusion criteria for the review. Various approaches to capturing data on the financial costs of care-giving at the end of life were noted; however, no single tool was identified with the sole purpose of exploring these costs. The majority of approaches used structured questionnaires and were administered by personal interview, with most studies using longitudinal designs. Calculation of costs was most often based on recall by patients and family caregivers, in some studies combined with objective measures of resource use. While the studies in this review provide useful data on approaches to capturing costs of care-giving, more work is needed to develop methods which accurately and sensitively capture the financial costs of caring at the end of life. 
Methodological considerations include study design and method of administration, contextual and cultural relevance, and accuracy of cost estimates. © 2015 John Wiley & Sons Ltd.

  17. Model-Based Systems Engineering for Capturing Mission Architecture System Processes with an Application Case Study - Orion Flight Test 1

    NASA Technical Reports Server (NTRS)

    Bonanne, Kevin H.

    2011-01-01

    Model-based Systems Engineering (MBSE) is an emerging methodology that can be leveraged to enhance many system development processes. MBSE allows for the centralization of an architecture description that would otherwise be stored in various locations and formats, thus simplifying communication among the project stakeholders, inducing commonality in representation, and expediting report generation. This paper outlines the MBSE approach taken to capture the processes of two different, but related, architectures by employing the Systems Modeling Language (SysML) as a standard for architecture description and the modeling tool MagicDraw. The overarching goal of this study was to demonstrate the effectiveness of MBSE as a means of capturing and designing a mission systems architecture. The first portion of the project focused on capturing the necessary system engineering activities that occur when designing, developing, and deploying a mission systems architecture for a space mission. The second part applies activities from the first to an application problem - the system engineering of the Orion Flight Test 1 (OFT-1) End-to-End Information System (EEIS). By modeling the activities required to create a space mission architecture and then implementing those activities in an application problem, the utility of MBSE as an approach to systems engineering can be demonstrated.

  18. Decomposing phenotype descriptions for the human skeletal phenome.

    PubMed

    Groza, Tudor; Hunter, Jane; Zankl, Andreas

    2013-01-01

    Over the course of the last few years there has been a significant amount of research performed on ontology-based formalization of phenotype descriptions. The intrinsic value and knowledge captured within such descriptions can only be expressed by taking advantage of their inner structure that implicitly combines qualities and anatomical entities. We present a meta-model (the Phenotype Fragment Ontology) and a processing pipeline that enable together the automatic decomposition and conceptualization of phenotype descriptions for the human skeletal phenome. We use this approach to showcase the usefulness of the generic concept of phenotype decomposition by performing an experimental study on all skeletal phenotype concepts defined in the Human Phenotype Ontology.

  19. A fast numerical scheme for causal relativistic hydrodynamics with dissipation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Takamoto, Makoto, E-mail: takamoto@tap.scphys.kyoto-u.ac.jp; Inutsuka, Shu-ichiro

    2011-08-01

    Highlights: We have developed a new multi-dimensional numerical scheme for causal relativistic hydrodynamics with dissipation. Our new scheme can calculate the evolution of dissipative relativistic hydrodynamics faster and more effectively than existing schemes. Since we use a Riemann solver for the advection steps, our method can capture shocks very accurately. - Abstract: In this paper, we develop a stable and fast numerical scheme for relativistic dissipative hydrodynamics based on Israel-Stewart theory. Israel-Stewart theory is a stable and causal description of dissipation in relativistic hydrodynamics, although it includes a relaxation process on the timescale of collisions between constituent particles, which introduces stiff equations and makes practical numerical calculation difficult. In our new scheme, we use Strang's splitting method and piecewise exact solutions to handle this extremely short timescale. In addition, since we split the calculation into an inviscid step and a dissipative step, a Riemann solver can be used to obtain the numerical flux for the inviscid step. The use of a Riemann solver enables us to capture shocks very accurately. Simple numerical examples are shown. The present scheme can be applied to various high energy phenomena in astrophysics and nuclear physics.
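
    The core trick, advancing the stiff relaxation term with its piecewise exact solution inside a splitting scheme, can be illustrated on the scalar toy equation dq/dt = -(q - q_eq)/tau. This is a sketch of the general idea only, not the authors' multi-dimensional Israel-Stewart scheme:

```python
import math

def relax_exact(q, q_eq, tau, dt):
    """Piecewise exact update for dq/dt = -(q - q_eq)/tau.

    Unconditionally stable, even when dt >> tau (the stiff regime)."""
    return q_eq + (q - q_eq) * math.exp(-dt / tau)

def relax_explicit_euler(q, q_eq, tau, dt):
    """Naive explicit Euler update: unstable whenever dt > 2 * tau."""
    return q - dt * (q - q_eq) / tau

# Stiff setting: the timestep is 100x larger than the relaxation time
tau, dt, q_eq = 1e-3, 0.1, 0.0
q_exact = relax_exact(1.0, q_eq, tau, dt)           # decays smoothly toward q_eq
q_euler = relax_explicit_euler(1.0, q_eq, tau, dt)  # overshoots to about -99
```

    In the full scheme this exact relaxation update would be sandwiched between Riemann-solver advection steps (Strang splitting), so the advection step size is never constrained by the collision timescale.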

  20. Typewriting: Toward Duplicating Success

    ERIC Educational Resources Information Center

    Orsborn, Karen J.

    1977-01-01

    A description of two projects (secretarial handbook and memo pad and personalized stationery) for use in teaching the duplication process that will capture the interests of students in an advanced typewriting class. (HD)

  1. Analytical Chemistry: A Literary Approach.

    ERIC Educational Resources Information Center

    Lucy, Charles A.

    2000-01-01

    Provides an anthology of references to descriptions of analytical chemistry techniques from history, popular fiction, and film which can be used to capture student interest and frame discussions of chemical techniques. (WRM)

  2. Fermionic topological quantum states as tensor networks

    NASA Astrophysics Data System (ADS)

    Wille, C.; Buerschaper, O.; Eisert, J.

    2017-06-01

    Tensor network states, and in particular projected entangled pair states, play an important role in the description of strongly correlated quantum lattice systems. They not only serve as variational states in numerical simulation methods, but also provide a framework for classifying phases of quantum matter and capture notions of topological order in a stringent and rigorous language. The rapid development in this field for spin models and bosonic systems has not yet been mirrored by an analogous development for fermionic models. In this work, we introduce a tensor network formalism capable of capturing notions of topological order for quantum systems with fermionic components. At the heart of the formalism are axioms of fermionic matrix-product operator injectivity, stable under concatenation. Building upon that, we formulate a Grassmann number tensor network ansatz for the ground state of fermionic twisted quantum double models. A specific focus is put on the paradigmatic example of the fermionic toric code. This work shows that the program of describing topologically ordered systems using tensor networks carries over to fermionic models.

  3. Non-linear corrections to the time-covariance function derived from a multi-state chemical master equation.

    PubMed

    Scott, M

    2012-08-01

    The time-covariance function captures the dynamics of biochemical fluctuations and contains important information about the underlying kinetic rate parameters. Intrinsic fluctuations in biochemical reaction networks are typically modelled using a master equation formalism. In general, the equation cannot be solved exactly and approximation methods are required. For small fluctuations close to equilibrium, a linearisation of the dynamics provides a very good description of the relaxation of the time-covariance function. As the number of molecules in the system decreases, deviations from the linear theory appear. Carrying out a systematic perturbation expansion of the master equation to capture these effects results in formidable algebra; however, symbolic mathematics packages considerably expedite the computation. The authors demonstrate that non-linear effects can reveal features of the underlying dynamics, such as reaction stoichiometry, not available in linearised theory. Furthermore, in models that exhibit noise-induced oscillations, non-linear corrections result in a shift in the base frequency along with the appearance of a secondary harmonic.
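
    The linearised relaxation of the time-covariance can be made concrete with the simplest case: a production-degradation process (birth rate k, degradation rate g), whose small fluctuations follow an Ornstein-Uhlenbeck process with stationary variance k/g and autocovariance C(t) = (k/g)·exp(-g·t). Below is a sketch comparing that prediction with a simulated fluctuation trace; the rate values are illustrative, not from the paper:

```python
import numpy as np

def autocov_linear(k, g, t):
    """Linear-noise prediction: C(t) = (k/g) * exp(-g * t)."""
    return (k / g) * np.exp(-g * np.asarray(t))

def simulate_ou(k, g, dt, n_steps, rng):
    """Exact discretisation of the OU fluctuation around the mean k/g."""
    x = np.empty(n_steps)
    x[0] = 0.0
    a = np.exp(-g * dt)
    s = np.sqrt((k / g) * (1.0 - a * a))   # keeps stationary variance k/g
    for i in range(1, n_steps):
        x[i] = a * x[i - 1] + s * rng.standard_normal()
    return x

rng = np.random.default_rng(1)
k, g, dt = 50.0, 1.0, 0.01
x = simulate_ou(k, g, dt, 200_000, rng)
lag = 100                                   # lag of 1.0 time unit
c_sim = np.mean(x[:-lag] * x[lag:])         # empirical covariance at that lag
c_theory = autocov_linear(k, g, 1.0)        # = 50 * exp(-1), about 18.4
```

    Deviations between the simulated and linear-theory covariances grow as molecule numbers shrink, which is exactly the regime where the non-linear corrections discussed above become informative.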

  4. Modeling Of Metabolic Heat Regenerated Temperature Swing Adsorption (MTSA) Subassembly For Prototype Design

    NASA Technical Reports Server (NTRS)

    Bower, Chad E.; Padilla, Sebastian A.; Iacomini, Christie S.; Paul, Heather L.

    2010-01-01

    This paper describes modeling methods for the three core components of a Metabolic heat regenerated Temperature Swing Adsorption (MTSA) subassembly: a sorbent bed, a sublimation (cooling) heat exchanger (SHX), and a condensing icing (warming) heat exchanger (CIHX). The primary function of the MTSA, removing carbon dioxide from a space suit Portable Life Support System (PLSS) ventilation loop, is performed via the sorbent bed. The CIHX is used to heat the sorbent bed for desorption and to remove moisture from the ventilation loop, while the SHX is alternately employed to cool the sorbent bed via sublimation of a spray of water at low pressure to prepare the reconditioned bed for the next cycle. This paper describes subsystem heat and mass transfer modeling methodologies relevant to the description of the MTSA subassembly in Thermal Desktop and SINDA/FLUINT. Several areas of particular modeling interest are discussed. In the sorbent bed, capture of the translating carbon dioxide (CO2) front and the associated local energy and mass balance in both adsorbing and desorbing modes is covered. The CIHX poses particular challenges for modeling in SINDA/FLUINT, as accounting for solid states in fluid submodels is not a native capability. Methods for capturing phase change and the latent heat of ice, as well as the transport properties across a layer of low-density accreted frost, are developed. This extended modeling capacity is applicable to temperatures greater than 258 K. To extend applicability to the minimum device temperature of 235 K, a method is given for a mapped transformation of temperatures from below this limit to some value above it, along with descriptions of the associated material property transformations and the resulting impacts on total heat and mass transfer. Similar considerations are given for the SHX, along with functional relationships for areal sublimation rates as limited by flow mechanics in the outlet duct.

  5. The Cooking and Pneumonia Study (CAPS) in Malawi: Implementation of Remote Source Data Verification

    PubMed Central

    Weston, William; Smedley, James; Bennett, Andrew; Mortimer, Kevin

    2016-01-01

    Background Source data verification (SDV) is a data monitoring procedure which compares the original records with the Case Report Form (CRF). Traditionally, on-site SDV relies on monitors making multiple visits to study sites, requiring extensive resources. The Cooking And Pneumonia Study (CAPS) is a 24-month village-level cluster randomized controlled trial assessing the effectiveness of an advanced cook-stove intervention in preventing pneumonia in children under five in rural Malawi (www.capstudy.org). CAPS used smartphones to capture digital images of the original records on an electronic CRF (eCRF). In the present study, descriptive statistics are used to report the experience of electronic data capture with remote SDV in a challenging research setting in rural Malawi. Methods At three-monthly intervals, fieldworkers employed by CAPS captured pneumonia data from the original records onto the eCRF. Fieldworkers also captured digital images of the original records. Once Internet connectivity was available, the data captured on the eCRF and the digital images of the original records were uploaded to a web-based SDV application. This enabled SDV to be conducted remotely from the UK. We conducted SDV of the pneumonia data (occurrence, severity, and clinical indicators) recorded in the eCRF against the data in the digital images of the original records. Results 664 episodes of pneumonia were recorded after 6 months of follow-up. Of these 664 episodes, 611 (92%) had a finding of pneumonia in the original records. All digital images of the original records were clear and legible. Conclusion Electronic data capture using eCRFs on mobile technology is feasible in rural Malawi. Capturing digital images of the original records in the field allows remote SDV to be conducted efficiently and securely without requiring additional field visits. We recommend these approaches in similar settings, especially those with health endpoints. PMID:27355447

  6. Characteristics of work-related fatal and hospitalised injuries not captured in workers’ compensation data

    PubMed Central

    Koehoorn, M; Tamburic, L; Xu, F; Alamgir, H; Demers, P A; McLeod, C B

    2015-01-01

    Objectives (1) To identify work-related fatal and non-fatal hospitalised injuries using multiple data sources, (2) to compare case-ascertainment from external data sources with accepted workers’ compensation claims and (3) to investigate the characteristics of work-related fatal and hospitalised injuries not captured by workers’ compensation. Methods Work-related fatal injuries were ascertained from vital statistics, coroners and hospital discharge databases using payment and diagnosis codes and injury and work descriptions; and work-related (non-fatal) injuries were ascertained from the hospital discharge database using admission, diagnosis and payment codes. Injuries for British Columbia residents aged 15–64 years from 1991 to 2009 ascertained from the above external data sources were compared to accepted workers’ compensation claims using per cent captured, validity analyses and logistic regression. Results The majority of work-related fatal injuries identified in the coroners data (83%) and the majority of work-related hospitalised injuries (95%) were captured as an accepted workers’ compensation claim. A work-related coroner report was a positive predictor (88%), and the responsibility of payment field in the hospital discharge record a sensitive indicator (94%), for a workers’ compensation claim. Injuries not captured by workers’ compensation were associated with female gender, type of work (natural resources and other unspecified work) and injury diagnosis (eg, airway-related, dislocations and undetermined/unknown injury). Conclusions Some work-related injuries captured by external data sources were not found in workers’ compensation data in British Columbia. This may be the result of capturing injuries or workers that are ineligible for workers’ compensation, or the result of injuries that go unreported to the compensation system. 
Hospital discharge records and coroner reports may provide opportunities to identify workers (or family members) with an unreported work-related injury and to provide them with information for submitting a workers’ compensation claim. PMID:25713157

  7. Connecting single cell to collective cell behavior in a unified theoretical framework

    NASA Astrophysics Data System (ADS)

    George, Mishel; Bullo, Francesco; Campàs, Otger

    Collective cell behavior is an essential part of tissue and organ morphogenesis during embryonic development, as well as of various disease processes, such as cancer. In contrast to many in vitro studies of collective cell migration, most cases of in vivo collective cell migration involve rather small groups of cells, with large sheets of migrating cells being less common. The vast majority of theoretical descriptions of collective cell behavior focus on large numbers of cells, but fail to accurately capture the dynamics of small groups of cells. Here we introduce a low-dimensional theoretical description that successfully captures single cell migration, cell collisions, collective dynamics in small groups of cells, and force propagation during sheet expansion, all within a common theoretical framework. Our description is derived from first principles and also includes key phenomenological aspects of cell migration that control the dynamics of traction forces. Among other results, we explain the counter-intuitive observations that pairs of cells repel each other upon collision while they behave in a coordinated manner within larger clusters.

  8. Design and Test Plan for an Integrated Iodine Scrubber and Polishing Bed System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jubin, Robert Thomas

    The capture and subsequent immobilization of four regulated volatile radionuclides (3H, 14C, 85Kr, and 129I) and relevant semivolatile species from the off-gas streams of a used nuclear fuel (UNF) reprocessing facility has been a topic of significant research interest on the part of the US Department of Energy and other international organizations. Significant research and development has been conducted over the past decade. In 2016 an initial engineering evaluation and design of the off-gas abatement systems required for a hypothetical 1000 t/yr UNF reprocessing facility treating 5 yr-cooled, 60 GWd/tIHM UNF was completed. One of the key findings of that report was that the consumption rate of silver-based iodine sorbents in the dissolver off-gas primary iodine capture bed is very high and may warrant the evaluation of alternative methods to capture the bulk of the iodine, which could significantly reduce the associated frequent remote handling of the iodine filter beds. This report describes the design of an experimental system that can be used to examine the use of aqueous scrubbing to remove the bulk of the iodine from the dissolver off-gas stream ahead of a silver-based solid sorbent that would provide the final iodine capture, or polishing, step. This report also provides a description of the initial series of tests proposed for this system.

  9. Rational and mechanistic perspectives on reinforcement learning.

    PubMed

    Chater, Nick

    2009-12-01

    This special issue describes important recent developments in applying reinforcement learning models to capture neural and cognitive function. But reinforcement learning, as a theoretical framework, can apply at two very different levels of description: mechanistic and rational. Reinforcement learning is often viewed in mechanistic terms--as describing the operation of aspects of an agent's cognitive and neural machinery. Yet it can also be viewed as a rational level of description, specifically, as describing a class of methods for learning from experience, using minimal background knowledge. This paper considers how rational and mechanistic perspectives differ, and what types of evidence distinguish between them. Reinforcement learning research in the cognitive and brain sciences is often implicitly committed to the mechanistic interpretation. Here the opposite view is put forward: that accounts of reinforcement learning should apply at the rational level, unless there is strong evidence for a mechanistic interpretation. Implications of this viewpoint for reinforcement-based theories in the cognitive and brain sciences are discussed.

  10. Exploring a strongly non-Markovian behavior

    NASA Astrophysics Data System (ADS)

    Alba, Vasyl; Berman, Gordon; Bialek, William; Shaevitz, Joshua

    Is there some simplicity or universality underlying the complexities of natural animal behavior? Using the walking fruit fly as a model system, we have shown that unconstrained behaviors can be categorized into roughly one hundred discrete states, which all individuals from a single species visit repeatedly. In each state, the fly executes stereotyped movements, and the transitions between states are organized hierarchically. The sequences of states, however, are strongly non-Markovian: correlations persist for orders of magnitude longer than expected from the state-to-state transition probabilities, and there are hints of power law decay. But with 100 states, further analysis is difficult. Here we develop a generalization of the information bottleneck method to compress these states into a more compact description that preserves as much of the temporal correlations as possible. We find that, even on compressing down to just two states, this coarse-grained description of behavior captures the long-ranged correlations. Power law decays are clearer in this reduced representation, which opens the way for more quantitative analysis.

  11. An efficient method for quantum transport simulations in the time domain

    NASA Astrophysics Data System (ADS)

    Wang, Y.; Yam, C.-Y.; Frauenheim, Th.; Chen, G. H.; Niehaus, T. A.

    2011-11-01

    An approximate method based on adiabatic time dependent density functional theory (TDDFT) is presented, that allows for the description of the electron dynamics in nanoscale junctions under arbitrary time dependent external potentials. The density matrix of the device region is propagated according to the Liouville-von Neumann equation. The semi-infinite leads give rise to dissipative terms in the equation of motion which are calculated from first principles in the wide band limit. In contrast to earlier ab initio implementations of this formalism, the Hamiltonian is here approximated in the spirit of the density functional based tight-binding (DFTB) method. Results are presented for two prototypical molecular devices and compared to full TDDFT calculations. The temporal profile of the current traces is qualitatively well captured by the DFTB scheme. Steady state currents show considerable variations, both in comparison of approximate and full TDDFT, but also among TDDFT calculations with different basis sets.
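
    The propagation step described, integrating the Liouville-von Neumann equation with dissipative lead terms, can be sketched for a toy two-level device. A simple relaxation-type dissipator stands in for the first-principles wide-band-limit lead terms, and the Hamiltonian and rate values are arbitrary; this illustrates the time stepping only, not the DFTB implementation:

```python
import numpy as np

def lvn_rhs(rho, H, gamma, rho_eq):
    """Liouville-von Neumann RHS with a relaxation-type dissipator
    (a toy stand-in for the wide-band-limit lead self-energies)."""
    return -1j * (H @ rho - rho @ H) - gamma * (rho - rho_eq)

def rk4_step(rho, dt, H, gamma, rho_eq):
    """Classical 4th-order Runge-Kutta step for the density matrix."""
    k1 = lvn_rhs(rho, H, gamma, rho_eq)
    k2 = lvn_rhs(rho + 0.5 * dt * k1, H, gamma, rho_eq)
    k3 = lvn_rhs(rho + 0.5 * dt * k2, H, gamma, rho_eq)
    k4 = lvn_rhs(rho + dt * k3, H, gamma, rho_eq)
    return rho + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

# Two-level "device" starting fully in state 0, driven toward rho_eq
H = np.array([[0.0, 0.1], [0.1, 0.5]], dtype=complex)
rho = np.array([[1.0, 0.0], [0.0, 0.0]], dtype=complex)
rho_eq = 0.5 * np.eye(2, dtype=complex)
gamma, dt = 0.2, 0.01
for _ in range(5000):
    rho = rk4_step(rho, dt, H, gamma, rho_eq)
# The trace stays 1 and the populations relax to their equilibrium values
```

    A time dependent external potential would enter simply by making H a function of time inside the stepper; the dissipative terms are what let the device exchange electrons with the semi-infinite leads.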

  12. An FPGA-based heterogeneous image fusion system design method

    NASA Astrophysics Data System (ADS)

    Song, Le; Lin, Yu-chi; Chen, Yan-hua; Zhao, Mei-rong

    2011-08-01

    Taking advantage of the FPGA's low cost and compact structure, an FPGA-based heterogeneous image fusion platform is established in this study. Altera's Cyclone IV series FPGA is adopted as the core processor of the platform, and a visible light CCD camera and an infrared thermal imager are used as the image-capturing devices in order to obtain dual-channel heterogeneous video images. Tailor-made image fusion algorithms such as gray-scale weighted averaging, maximum selection and minimum selection methods are analyzed and compared. VHDL language and the synchronous design method are utilized to perform a reliable RTL-level description. Altera's Quartus II 9.0 software is applied to simulate and implement the algorithm modules. The contrast experiments of the various fusion algorithms show that preferable image quality of the heterogeneous image fusion can be obtained with the proposed system.
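
    The three fusion rules compared in the study are pixel-wise operations on co-registered frames, so they are easy to state in software form (the platform itself implements them in VHDL at RTL level). A minimal sketch in Python on toy data:

```python
import numpy as np

def fuse_weighted(a, b, w=0.5):
    """Gray-scale weighted averaging of two registered images."""
    return w * a + (1.0 - w) * b

def fuse_max(a, b):
    """Maximum selection: keep the brighter pixel from either source."""
    return np.maximum(a, b)

def fuse_min(a, b):
    """Minimum selection: keep the darker pixel from either source."""
    return np.minimum(a, b)

# Toy 2x2 "visible" and "infrared" frames, already co-registered
vis = np.array([[0.2, 0.8], [0.4, 0.6]])
ir = np.array([[0.6, 0.2], [0.9, 0.1]])
avg = fuse_weighted(vis, ir)   # pixel-wise mean of the two frames
mx = fuse_max(vis, ir)         # brighter pixel wins at every location
```

    Because each output pixel depends only on the corresponding input pixels, all three rules map naturally onto a streaming hardware pipeline, which is why they suit a compact FPGA design.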

  13. Interpersonal Coordination: Methods, Achievements, and Challenges

    PubMed Central

    Cornejo, Carlos; Cuadros, Zamara; Morales, Ricardo; Paredes, Javiera

    2017-01-01

    Research regarding interpersonal coordination can be traced back to the early 1960s when video recording began to be utilized in communication studies. Since then, technological advances have extended the range of techniques that can be used to accurately study interactional phenomena. Although such a diversity of methods contributes to the improvement of knowledge concerning interpersonal coordination, it has become increasingly difficult to maintain a comprehensive view of the field. In the present article, we review the main capture methods by describing their major findings, levels of description and limitations. We group them into three categories: video analysis, motion tracking, and psychophysiological and neurophysiological techniques. Revised evidence suggests that interpersonal coordination encompasses a family of morphological and temporal synchronies at different levels and that it is closely related to the construction and maintenance of a common social and affective space. We conclude by arguing that future research should address methodological challenges to advance the understanding of coordination phenomena. PMID:29021769

  14. Quantifying camouflage: how to predict detectability from appearance.

    PubMed

    Troscianko, Jolyon; Skelhorn, John; Stevens, Martin

    2017-01-06

    Quantifying the conspicuousness of objects against particular backgrounds is key to understanding the evolution and adaptive value of animal coloration, and in designing effective camouflage. Quantifying detectability can reveal how colour patterns affect survival, how animals' appearances influence habitat preferences, and how receiver visual systems work. Advances in calibrated digital imaging are enabling the capture of objective visual information, but it remains unclear which methods are best for measuring detectability. Numerous descriptions and models of appearance have been used to infer the detectability of animals, but these models are rarely empirically validated or directly compared to one another. We compared the performance of human 'predators' to a bank of contemporary methods for quantifying the appearance of camouflaged prey. Background matching was assessed using several established methods, including sophisticated feature-based pattern analysis, granularity approaches and a range of luminance and contrast difference measures. Disruptive coloration is a further camouflage strategy where high contrast patterns disrupt the prey's tell-tale outline, making it more difficult to detect. Disruptive camouflage has been studied intensely over the past decade, yet defining and measuring it have proven far more problematic. We assessed how well existing disruptive coloration measures predicted capture times. Additionally, we developed a new method for measuring edge disruption based on an understanding of sensory processing and the way in which false edges are thought to interfere with animal outlines. Our novel measure of disruptive coloration was the best predictor of capture times overall, highlighting the importance of false edges in concealment over and above pattern or luminance matching. 
The efficacy of our new method for measuring disruptive camouflage together with its biological plausibility and computational efficiency represents a substantial advance in our understanding of the measurement, mechanism and definition of disruptive camouflage. Our study also provides the first test of the efficacy of many established methods for quantifying how conspicuous animals are against particular backgrounds. The validation of these methods opens up new lines of investigation surrounding the form and function of different types of camouflage, and may apply more broadly to the evolution of any visual signal.

  15. A decline in the prevalence of injecting drug users in Estonia, 2005–2009

    PubMed Central

    Uusküla, A; Rajaleid, K; Talu, A; Abel-Ollo, K; Des Jarlais, DC

    2013-01-01

    Aims and setting Descriptions of behavioural epidemics have received little attention compared with infectious disease epidemics in Eastern Europe. Here we report a study aimed at estimating trends in the prevalence of injection drug use between 2005 and 2009 in Estonia. Design and methods The number of injection drug users (IDUs) aged 15–44 in each year between 2005 and 2009 was estimated using capture-recapture methodology based on 4 data sources (2 treatment databases: drug abuse and non-fatal overdose treatment; criminal justice (drug related offences); and mortality (injection drug use related deaths) data). Poisson log-linear regression models were applied to the matched data, with interactions between data sources fitted to replicate the dependencies between the data sources. Linear regression was used to estimate average change over time. Findings There were 24305, 12292, 238 and 545 records, and 8100, 1655, 155 and 545 individual IDUs, identified in the four capture sources (police, drug treatment, overdose and death registry, respectively) over the period 2005–2009. The estimated prevalence of IDUs among the population aged 15–44 declined from 2.7% (1.8–7.9%) in 2005 to 2.0% (1.4–5.0%) in 2008, and 0.9% (0.7–1.7%) in 2009. Regression analysis indicated an average reduction of over 1700 injectors per year. Conclusion While the capture-recapture method has known limitations, the results are consistent with other data from Estonia. Identifying the drivers of change in the prevalence of injection drug use warrants further research. PMID:23290632
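
    The capture-recapture logic can be illustrated with the simplest two-source case, the Chapman estimator N_hat = (n1+1)(n2+1)/(m+1) - 1, where n1 and n2 are the counts from two sources and m is their overlap. The study itself fitted Poisson log-linear models to four dependent sources, which this sketch does not attempt to reproduce, and the overlap count used below is hypothetical:

```python
def chapman_estimate(n1, n2, m):
    """Two-source capture-recapture (Chapman) population estimate.

    n1, n2: individuals seen by sources 1 and 2; m: seen by both."""
    if not 0 <= m <= min(n1, n2):
        raise ValueError("overlap must satisfy 0 <= m <= min(n1, n2)")
    return (n1 + 1) * (n2 + 1) / (m + 1) - 1

# Police and treatment counts from the abstract; the overlap of 620
# individuals appearing in both sources is an invented figure
estimate = chapman_estimate(8100, 1655, 620)   # roughly 21600 injectors
```

    The Chapman form assumes the two sources are independent; the log-linear models used in the study exist precisely to relax that assumption by fitting interaction terms between sources.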

  16. Physical Model for the Evolution of the Genetic Code

    NASA Astrophysics Data System (ADS)

    Yamashita, Tatsuro; Narikiyo, Osamu

    2011-12-01

    Using the shape space of codons and tRNAs, we give a physical description of genetic code evolution on the basis of the codon capture and ambiguous intermediate scenarios in a consistent manner. In the lowest dimensional version of our description, a physical quantity, the codon level, is introduced. In terms of codon levels, the two scenarios are classified into two different routes of the evolutionary process. In the case of the ambiguous intermediate scenario, we perform an evolutionary simulation implementing cost selection of amino acids and confirm a rapid transition in the code change. Such rapidness reduces the discomfort of the non-unique translation of the code at the intermediate state, which is the weakness of that scenario. In the case of the codon capture scenario, survival against mutations under the mutational pressure minimizing GC content in genomes is simulated, and it is demonstrated that cells which experience only neutral mutations survive.

  17. TARGET: Rapid Capture of Process Knowledge

    NASA Technical Reports Server (NTRS)

    Ortiz, C. J.; Ly, H. V.; Saito, T.; Loftin, R. B.

    1993-01-01

    TARGET (Task Analysis/Rule Generation Tool) represents a new breed of tool that blends graphical process flow modeling capabilities with the function of a top-down reporting facility. Since NASA personnel frequently perform tasks that are primarily procedural in nature, TARGET models mission or task procedures and generates hierarchical reports as part of the process capture and analysis effort. Historically, capturing knowledge has proven to be one of the greatest barriers to the development of intelligent systems. Current practice generally requires lengthy interactions between the expert whose knowledge is to be captured and the knowledge engineer whose responsibility is to acquire and represent the expert's knowledge in a useful form. Although much research has been devoted to the development of methodologies and computer software to aid in the capture and representation of some types of knowledge, procedural knowledge has received relatively little attention. In essence, TARGET is one of the first tools of its kind, commercial or institutional, that is designed to support this type of knowledge capture undertaking. This paper will describe the design and development of TARGET for the acquisition and representation of procedural knowledge. The strategies employed by TARGET to support use by knowledge engineers, subject matter experts, programmers and managers will be discussed. This discussion includes the method by which the tool employs its graphical user interface to generate a task hierarchy report. Next, the approach to generating production rules for incorporation in and development of a CLIPS-based expert system will be elaborated. TARGET also permits experts to visually describe procedural tasks as a common medium for knowledge refinement by the expert community and knowledge engineer, making knowledge consensus possible. The paper briefly touches on the verification and validation issues facing the CLIPS rule generation aspects of TARGET. 
A description of efforts to support TARGET's interoperability issues on PCs, Macintoshes and UNIX workstations concludes the paper.

  18. Predicting dynamics and rheology of blood flow: A comparative study of multiscale and low-dimensional models of red blood cells

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pan, Wenxiao; Fedosov, Dmitry A.; Caswell, Bruce

In this work we compare the predictive capability of two mathematical models for red blood cells (RBCs), focusing on blood flow in capillaries and arterioles. Both RBC models, as well as their corresponding blood flows, are based on the dissipative particle dynamics (DPD) method, a coarse-grained molecular dynamics approach. The first model employs a multiscale description of the RBC (MS-RBC), with its membrane represented by hundreds or even thousands of DPD particles connected by springs into a triangular network, in combination with out-of-plane elastic bending resistance. Extra dissipation within the network accounts for membrane viscosity, while the characteristic biconcave RBC shape is achieved by imposing constraints of constant membrane area and constant cell volume. The second model is based on a low-dimensional description (LD-RBC) constructed as a closed torus-like ring of only 10 large DPD colloidal particles. They are connected into a ring by worm-like chain (WLC) springs combined with bending resistance. The LD-RBC model can be fitted to represent the entire range of nonlinear elastic deformations as measured by optical tweezers for healthy and for malaria-infected RBCs. MS-RBC suspensions model the dynamics and rheology of blood flow accurately for vessels of any size, but this approach is computationally expensive for vessels above 100 microns in diameter. Surprisingly, the much more economical suspensions of LD-RBCs also capture the blood flow dynamics and rheology accurately, except for vessels with sizes comparable to the RBC diameter. In particular, the LD-RBC suspensions are shown to properly capture the experimental data for the apparent viscosity of blood and its cell-free layer (CFL) in tube flow. 
Taken together, these findings suggest a hierarchical approach to modeling blood flow in the arterial tree, whereby the MS-RBC model should be employed for capillaries and arterioles below 100 microns, the LD-RBC model for arterioles, and the continuum description for arteries.
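The WLC springs mentioned for the LD-RBC model typically follow the Marko-Siggia interpolation formula, whose force diverges near full extension and so reproduces the nonlinear stiffening measured with optical tweezers. The sketch below is illustrative, not the authors' implementation; the function name and parameters are assumptions:

```python
def wlc_force(x: float, L: float, p: float, kBT: float = 1.0) -> float:
    """Marko-Siggia worm-like-chain force for end-to-end distance x,
    contour length L, and persistence length p (in units where kBT = 1
    by default). The force diverges as x approaches L."""
    if not 0.0 <= x < L:
        raise ValueError("extension must satisfy 0 <= x < L")
    r = x / L
    return (kBT / p) * (0.25 / (1.0 - r) ** 2 - 0.25 + r)
```

The 1/(4(1 - x/L)^2) term is what lets a handful of such springs mimic the strain-hardening of the spectrin network at large deformations.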

  19. Gamma-widths, lifetimes and fluctuations in the nuclear quasi-continuum

    NASA Astrophysics Data System (ADS)

    Guttormsen, M.; Larsen, A. C.; Midtbø, J. E.; Crespo Campo, L.; Görgen, A.; Ingeberg, V. W.; Renstrøm, T.; Siem, S.; Tveten, G. M.; Zeiser, F.; Kirsch, L. E.

    2018-05-01

Statistical γ-decay from highly excited states is determined by the nuclear level density (NLD) and the γ-ray strength function (γSF). These average quantities have been measured for several nuclei using the Oslo method. For the first time, we exploit the NLD and γSF to evaluate the γ-width in the energy region below the neutron binding energy, often called the quasi-continuum region. The lifetimes of states in the quasi-continuum are important benchmarks for a theoretical description of nuclear structure and dynamics at high temperature. The lifetimes may also have an impact on reaction rates for the rapid neutron-capture process, now demonstrated to take place in neutron star mergers.
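For context, in the statistical model the average γ-width at excitation energy E_x is commonly built from exactly these two ingredients, the NLD ρ and the γSF f. A hedged sketch of the standard expression (written for dominant dipole radiation; not quoted from this abstract), together with the lifetime it implies:

```latex
\langle \Gamma_\gamma(E_x) \rangle
  = \frac{1}{2\pi\,\rho(E_x)}
    \int_0^{E_x} \mathrm{d}E_\gamma\; E_\gamma^{3}\, f(E_\gamma)\,
    \rho(E_x - E_\gamma),
\qquad
\tau(E_x) = \frac{\hbar}{\langle \Gamma_\gamma(E_x) \rangle}
```

Measuring ρ and f with the Oslo method thus fixes both the average width and the corresponding lifetime in the quasi-continuum.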

  20. [Anthropology, ethnography, and narrative: intersecting paths in understanding the processes of health and sickness].

    PubMed

    Costa, Gabriela M C; Gualda, Dulce M R

    2010-12-01

    The article discusses anthropology, ethnographic method, and narrative as possible ways of coming to know subjects' experiences and the feelings they attribute to them. From an anthropological perspective, the sociocultural universe is taken as a point of reference in understanding the meaning of the processes of health and sickness, using a dense ethnographic description from an interpretivist analytical approach. In this context, narratives afford possible paths to understanding how subjective human experiences are shared and how behavior is organized, with a special focus on meaning, the process by which stories are produced, relations between narrator and other subjects, processes of knowledge, and the manifold ways in which experience can be captured.

  1. An unsupervised method for quantifying the behavior of paired animals

    NASA Astrophysics Data System (ADS)

    Klibaite, Ugne; Berman, Gordon J.; Cande, Jessica; Stern, David L.; Shaevitz, Joshua W.

    2017-02-01

    Behaviors involving the interaction of multiple individuals are complex and frequently crucial for an animal’s survival. These interactions, ranging across sensory modalities, length scales, and time scales, are often subtle and difficult to characterize. Contextual effects on the frequency of behaviors become even more difficult to quantify when physical interaction between animals interferes with conventional data analysis, e.g. due to visual occlusion. We introduce a method for quantifying behavior in fruit fly interaction that combines high-throughput video acquisition and tracking of individuals with recent unsupervised methods for capturing an animal’s entire behavioral repertoire. We find behavioral differences between solitary flies and those paired with an individual of the opposite sex, identifying specific behaviors that are affected by social and spatial context. Our pipeline allows for a comprehensive description of the interaction between two individuals using unsupervised machine learning methods, and will be used to answer questions about the depth of complexity and variance in fruit fly courtship.

  2. Quantification of Finger-Tapping Angle Based on Wearable Sensors

    PubMed Central

    Djurić-Jovičić, Milica; Jovičić, Nenad S.; Roby-Brami, Agnes; Popović, Mirjana B.; Kostić, Vladimir S.; Djordjević, Antonije R.

    2017-01-01

    We propose a novel simple method for quantitative and qualitative finger-tapping assessment based on miniature inertial sensors (3D gyroscopes) placed on the thumb and index-finger. We propose a simplified description of the finger tapping by using a single angle, describing rotation around a dominant axis. The method was verified on twelve subjects, who performed various tapping tasks, mimicking impaired patterns. The obtained tapping angles were compared with results of a motion capture camera system, demonstrating excellent accuracy. The root-mean-square (RMS) error between the two sets of data is, on average, below 4°, and the intraclass correlation coefficient is, on average, greater than 0.972. Data obtained by the proposed method may be used together with scores from clinical tests to enable a better diagnostic. Along with hardware simplicity, this makes the proposed method a promising candidate for use in clinical practice. Furthermore, our definition of the tapping angle can be applied to all tapping assessment systems. PMID:28125051
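The reported agreement metric, the RMS error between the gyroscope-derived and camera-derived angle series, can be computed as below. This is a minimal stdlib sketch with illustrative names, not code from the paper:

```python
from math import sqrt

def rms_error(a, b):
    """Root-mean-square difference between two equally sampled
    angle series (e.g. tapping angles in degrees)."""
    if len(a) != len(b) or not a:
        raise ValueError("series must be non-empty and of equal length")
    return sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))
```

An average RMS error below 4° between the wearable-sensor and motion-capture series is what the authors cite as evidence of accuracy.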

  3. Quantification of Finger-Tapping Angle Based on Wearable Sensors.

    PubMed

    Djurić-Jovičić, Milica; Jovičić, Nenad S; Roby-Brami, Agnes; Popović, Mirjana B; Kostić, Vladimir S; Djordjević, Antonije R

    2017-01-25

    We propose a novel simple method for quantitative and qualitative finger-tapping assessment based on miniature inertial sensors (3D gyroscopes) placed on the thumb and index-finger. We propose a simplified description of the finger tapping by using a single angle, describing rotation around a dominant axis. The method was verified on twelve subjects, who performed various tapping tasks, mimicking impaired patterns. The obtained tapping angles were compared with results of a motion capture camera system, demonstrating excellent accuracy. The root-mean-square (RMS) error between the two sets of data is, on average, below 4°, and the intraclass correlation coefficient is, on average, greater than 0.972. Data obtained by the proposed method may be used together with scores from clinical tests to enable a better diagnostic. Along with hardware simplicity, this makes the proposed method a promising candidate for use in clinical practice. Furthermore, our definition of the tapping angle can be applied to all tapping assessment systems.

  4. Capturing Functional Independence Measure (FIM®) Ratings.

    PubMed

    Torres, Audrey

The aim of the study was to identify interventions to capture admission functional independence measure (FIM®) ratings on the day of admission to an inpatient rehabilitation facility. This was a quantitative, evidence-based practice quality improvement study utilizing descriptive statistics. Admission FIM® ratings from patients discharged in June 2012 (retrospective review) were compared with admission FIM® ratings from patients discharged in June 2014 (prospective review). The logic model was utilized to determine the project inputs, outputs, and outcomes. Interventions to capture admission FIM® ratings on the day of admission are essential to accurately predict the patient's burden of care, length of stay, and reimbursement. Waiting until Day 2 or Day 3 after admission to capture the admission FIM® assessment resulted in inflated admission FIM® ratings and suboptimal quality outcomes. Interventions to capture admission FIM® ratings on the day of admission were successful at improving the quality of care, length-of-stay efficiency, and accurate recording of admission FIM® ratings to determine the patient's burden of care.

  5. Value Added: the Case for Point-of-View Camera use in Orthopedic Surgical Education

    PubMed Central

    Thomas, Geb W.; Taylor, Leah; Liu, Xiaoxing; Anthony, Chris A.; Anderson, Donald D.

    2016-01-01

Background: Orthopedic surgical education is evolving as educators search for new ways to enhance surgical skills training. Orthopedic educators should seek new methods and technologies to augment and add value to real-time orthopedic surgical experience. This paper describes a protocol whereby we have started to capture and evaluate specific orthopedic milestone procedures with a GoPro® point-of-view video camera and a dedicated video reviewing website as a way of supplementing the current paradigm in surgical skills training. We report our experience regarding the details and feasibility of this protocol. Methods: Upon identification of a patient undergoing surgical fixation of a hip or ankle fracture, an orthopedic resident places a GoPro® point-of-view camera on his or her forehead. All fluoroscopic images acquired during the case are saved and later incorporated into a video on the reviewing website. Surgical videos are uploaded to a secure server and are accessible for later review and assessment via a custom-built website. An electronic survey of resident participants was performed utilizing Qualtrics software. Results are reported using descriptive statistics. Results: A total of 51 surgical videos involving 23 different residents have been captured to date. This includes 20 intertrochanteric hip fracture cases and 31 ankle fracture cases. The average duration of each surgical video was 1 hour and 16 minutes (range, 40 minutes to 2 hours and 19 minutes). Of 24 orthopedic resident surgeons surveyed, 88% thought capturing a video portfolio of orthopedic milestones would benefit their education. Conclusions: There is a growing demand in orthopedic surgical education to extract more value from each surgical experience. 
While further work in the development and refinement of such assessments is necessary, we feel that intraoperative video, particularly when captured and presented in a non-threatening, user-friendly manner, can add significant value to the present and future paradigm of orthopedic surgical skills training. PMID:27528828

  6. Implementation of a national anti-tuberculosis drug resistance survey in Tanzania

    PubMed Central

    Chonde, Timothy M; Doulla, Basra; van Leth, Frank; Mfinanga, Sayoki GM; Range, Nyagosya; Lwilla, Fred; Mfaume, Saidi M; van Deun, Armand; Zignol, Matteo; Cobelens, Frank G; Egwaga, Saidi M

    2008-01-01

Background: A drug resistance survey is an essential public health management tool for evaluating and improving the performance of National Tuberculosis control programmes. The current manuscript describes the implementation of the first national drug resistance survey in Tanzania. Methods: Description of the implementation process of a national anti-tuberculosis drug resistance survey in Tanzania, in relation to the study protocol and Standard Operating Procedures. Results: Factors contributing positively to the implementation of the survey were a continuous commitment of the key stakeholders, the existence of a well-organized National Tuberculosis Programme, and a detailed design of cluster-specific arrangements for rapid sputum transportation. Factors contributing negatively to the implementation were a long delay between training and actual survey activities, limited monitoring of activities, and an unclear design of the data capture forms, leading to difficulties in form-filling. Conclusion: Careful preparation of the survey, timing of planned activities, a strong emphasis on data capture tools and data management, and timely supervision are essential for a proper implementation of a national drug resistance survey. PMID:19116022

  7. An analysis of characteristics of post-authorisation studies registered on the ENCePP EU PAS Register

    PubMed Central

    Carroll, Robert; Ramagopalan, Sreeram V.; Cid-Ruzafa, Javier; Lambrelli, Dimitra; McDonald, Laura

    2017-01-01

Background: The objective of this study was to investigate the study design characteristics of Post-Authorisation Studies (PAS) requested by the European Medicines Agency which were recorded on the European Union (EU) PAS Register held by the European Network of Centres for Pharmacoepidemiology and Pharmacovigilance (ENCePP). Methods: We undertook a cross-sectional descriptive analysis of all studies registered on the EU PAS Register as of 18 October 2016. Results: We identified a total of 314 studies on the EU PAS Register, including 81 (26%) finalised, 160 (51%) ongoing and 73 (23%) planned. Of those studies identified, 205 (65%) included risk assessment in their scope, 133 (42%) included drug utilisation and 94 (30%) included effectiveness evaluation. Just over half of the studies (175; 56%) used primary data capture, 135 (43%) used secondary data and 4 (1%) used a hybrid design combining both approaches. Risk assessment and effectiveness studies were more likely to use primary data capture (60% and 85%, respectively, as compared to 39% and 14%, respectively, for secondary). The converse was true for drug utilisation studies, where 59% were secondary vs. 39% primary. For type 2 diabetes mellitus, database studies were more commonly used (80% vs 3% chart review, 3% hybrid and 13% primary data capture study designs), whereas for studies in oncology, primary data capture was more likely to be used (85% vs 4% chart review, and 11% database study designs). Conclusions: Results of this analysis show that PAS design varies according to study objectives and therapeutic area. PMID:29188016

  8. Second Supplement to A Catalog of the Mosquitoes of the World (Diptera: Culicidae)

    DTIC Science & Technology

    1984-01-01

104. Brunhes, J. 1977a. Les moustiques de l’archipel des Comores I. - Inventaire, répartition et description de quatre espèces ou sous-espèces...nouvelles. Cah. O.R.S.T.O.M. Ser. Entomol. Med. Parasitol. 15:131-152. Brunhes, J. 1977b. Les moustiques de l’archipel des Comores II. - Description de...Dieng. 1978. Aedes (Stegomyia) neoafricanus, une nouvelle espèce de moustique capturée au Sénégal Oriental (Diptera: Culicidae). Cah. O.R.S.T.O.M

  9. Expanding potential of radiofrequency nurse call systems to measure nursing time in patient rooms.

    PubMed

    Fahey, Linda; Dunn Lopez, Karen; Storfjell, Judith; Keenan, Gail

    2013-05-01

The objective of this study was to determine the utility and feasibility of using data from a nurse call system equipped with radiofrequency identification (RFID) to measure nursing time spent in patient rooms. Increasing the amount of time nurses spend with hospitalized patients has become a focus after several studies demonstrated that nurses spend most of their time in nondirect care activities rather than delivering patient care. Measurement of nursing time spent in direct care often involves labor-intensive time and motion studies, making frequent or continuous monitoring impractical. Mixed methods were used for this descriptive study. We used 30 days of data from an RFID nurse call system collected on 1 unit in a community hospital to examine nurses' time spent in patient rooms. Descriptive statistics were applied to calculate the percentage of time spent in patient rooms by role and shift. Data technologists were surveyed to assess how practical data access would be in a hospital setting for use in monitoring nursing time spent in patient rooms. The system captured 7393 staff hours. Of that time, 7% did not reflect actual patient care time, so these hours were eliminated from further analysis. The remaining 6880 hours represented 91% of expected worked time. RNs and nursing assistants spent 33% to 36% of their time in patient rooms, presumably providing direct care. RFID technology was found to provide a feasible and accurate means for capturing and evaluating nursing time spent in patient rooms. Depending on the outcomes per unit, leaders should work with staff to maximize patient care time.

  10. ALAMEDA, a Structural–Functional Model for Faba Bean Crops: Morphological Parameterization and Verification

    PubMed Central

    RUIZ-RAMOS, MARGARITA; MÍNGUEZ, M. INÉS

    2006-01-01

    • Background Plant structural (i.e. architectural) models explicitly describe plant morphology by providing detailed descriptions of the display of leaf and stem surfaces within heterogeneous canopies and thus provide the opportunity for modelling the functioning of plant organs in their microenvironments. The outcome is a class of structural–functional crop models that combines advantages of current structural and process approaches to crop modelling. ALAMEDA is such a model. • Methods The formalism of Lindenmayer systems (L-systems) was chosen for the development of a structural model of the faba bean canopy, providing both numerical and dynamic graphical outputs. It was parameterized according to the results obtained through detailed morphological and phenological descriptions that capture the detailed geometry and topology of the crop. The analysis distinguishes between relationships of general application for all sowing dates and stem ranks and others valid only for all stems of a single crop cycle. • Results and Conclusions The results reveal that in faba bean, structural parameterization valid for the entire plant may be drawn from a single stem. ALAMEDA was formed by linking the structural model to the growth model ‘Simulation d'Allongement des Feuilles’ (SAF) with the ability to simulate approx. 3500 crop organs and components of a group of nine plants. Model performance was verified for organ length, plant height and leaf area. The L-system formalism was able to capture the complex architecture of canopy leaf area of this indeterminate crop and, with the growth relationships, generate a 3D dynamic crop simulation. Future development and improvement of the model are discussed. PMID:16390842
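The L-system formalism underlying ALAMEDA can be illustrated with a minimal deterministic context-free (D0L) rewriter. The rules below are Lindenmayer's classic algae example, not ALAMEDA's faba bean productions:

```python
def lsystem(axiom: str, rules: dict, steps: int) -> str:
    """Iteratively rewrite every symbol of the axiom using a deterministic
    (D0L) production rule set; symbols without a rule are copied unchanged."""
    s = axiom
    for _ in range(steps):
        s = "".join(rules.get(c, c) for c in s)
    return s

# Lindenmayer's algae system: A -> AB, B -> A
# lsystem("A", {"A": "AB", "B": "A"}, 3) yields "ABAAB"
```

A crop model like ALAMEDA attaches geometry (internode lengths, leaf angles) and growth functions to each symbol, which is how the string rewriting becomes a 3D dynamic canopy simulation.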

  11. Microscopic study of heavy-ion reactions with n-rich nuclei: dynamic excitation energy and capture

    NASA Astrophysics Data System (ADS)

    Oberacker, Volker; Umar, A. S.

    2010-11-01

Heavy-ion reactions at RIB facilities allow us to form new exotic neutron-rich nuclei. These experiments present numerous challenges for a microscopic theoretical description. We study reactions between neutron-rich ^132Sn nuclei and ^96Zr within a dynamic microscopic theory, and we compare the properties to those of the stable system ^124Sn+^96Zr. The calculations are carried out on a 3-D lattice using the density-constrained Time-Dependent Hartree-Fock (DC-TDHF) method [1-3]. In particular, we calculate the dynamic excitation energy E^*(t) and the quadrupole moment of the dinuclear system Q20(t) during the initial stages of the collision. Regarding the heavy-ion interaction potential V(R), we find that the fusion barrier height and width increase dramatically with increasing beam energy. The fusion barriers of the neutron-rich system ^132Sn+^96Zr are systematically 1-2 MeV higher than those of the stable system. Large differences (9 MeV) are found in the interaction barriers of the two systems. Capture cross sections are analyzed in terms of dynamic effects and a comparison with recently measured capture-fission data is given. [1] Umar and Oberacker, PRC 76, 014614 (2007). [2] Umar, Oberacker, Maruhn, and Reinhard, PRC 80, 041601(R) (2009). [3] Umar, Maruhn, Itagaki, and Oberacker, PRL 104, 212503 (2010).

  12. Geological Sequestration Training and Research Program in Capture and Transport: Development of the Most Economical Separation Method for CO2 Capture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vahdat, Nader

    2013-09-30

The project provided hands-on training and networking opportunities to undergraduate students in the area of carbon dioxide (CO2) capture and transport, through a fundamental research study focused on advanced separation methods that can be applied to the capture of CO2 resulting from the combustion of fossil fuels for power generation. The project team's approach to achieving its objectives was to leverage existing Carbon Capture and Storage (CCS) course materials and teaching methods to create and implement an annual CCS short course for the Tuskegee University community; conduct a survey of CO2 separation and capture methods; utilize data to verify and develop computer models for CO2 capture; and build CCS networks and hands-on training experiences. The objectives accomplished as a result of this project were: (1) a comprehensive survey of CO2 capture methods was conducted and mathematical models were developed to compare the potential economics of the different methods based on the total cost per year per unit of CO2 avoidance; and (2) training was provided to introduce the latest CO2 capture technologies and deployment issues to the university community.
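The abstract's comparison metric ("total cost per year per unit of CO2 avoidance") is not specified in detail. One common formulation in the CCS literature is the cost of CO2 avoided, sketched here with illustrative names and example numbers that are assumptions, not the project's actual model:

```python
def cost_of_co2_avoided(coe_ref: float, coe_cap: float,
                        em_ref: float, em_cap: float) -> float:
    """Cost of CO2 avoided in $/tonne: the increase in the cost of
    electricity (coe_*, $/MWh) divided by the reduction in specific
    emissions (em_*, tonnes CO2/MWh) relative to a reference plant."""
    if em_cap >= em_ref:
        raise ValueError("capture plant must emit less per MWh than reference")
    return (coe_cap - coe_ref) / (em_ref - em_cap)
```

For instance, a plant whose electricity cost rises from $60/MWh to $90/MWh while emissions fall from 0.8 to 0.1 tCO2/MWh has an avoidance cost of about $43/tonne.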

  13. Transport and collision dynamics in periodic asymmetric obstacle arrays: Rational design of microfluidic rare-cell immunocapture devices

    NASA Astrophysics Data System (ADS)

    Gleghorn, Jason P.; Smith, James P.; Kirby, Brian J.

    2013-09-01

Microfluidic obstacle arrays have been used in numerous applications, and their ability to sort particles or capture rare cells from complex samples has broad and impactful applications in biology and medicine. We have investigated the transport and collision dynamics of particles in periodic obstacle arrays to guide the design of convective, rather than diffusive, transport-based immunocapture microdevices. Ballistic and full computational fluid dynamics simulations are used to understand the collision modes that evolve in cylindrical obstacle arrays with various geometries. We identify previously unrecognized collision mode structures and differential size-based collision frequencies that emerge from these arrays. Previous descriptions of transverse displacements that assume unidirectional flow in these obstacle arrays cannot capture mode transitions properly, as they fail to account for the dependence of the mode transitions on column spacing and the attendant change in the flow field. Using these analytical and computational simulations, we elucidate design parameters that induce high collision rates for all particles larger than a threshold size or selectively increase collision frequencies for a narrow range of particle sizes within a polydisperse population. Furthermore, we investigate how the particle Péclet number affects collision dynamics and mode transitions and demonstrate that experimental observations from various obstacle array geometries are well described by our computational model.
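The particle Péclet number referenced above compares convective to diffusive transport, Pe = Ua/D, with the diffusivity D often estimated from the Stokes-Einstein relation for a sphere. A minimal sketch under that assumption (SI units; default temperature and viscosity are illustrative, not values from the paper):

```python
from math import pi

def peclet(U: float, a: float, T: float = 300.0,
           eta: float = 1.0e-3, kB: float = 1.380649e-23) -> float:
    """Particle Peclet number Pe = U*a/D for flow speed U (m/s) and
    particle radius a (m), with D = kB*T / (6*pi*eta*a) from the
    Stokes-Einstein relation (eta: fluid viscosity, Pa*s)."""
    D = kB * T / (6 * pi * eta * a)
    return U * a / D
```

For a cell-sized particle (a ~ 5 µm) at typical microfluidic speeds (~100 µm/s), Pe is of order 10^4, which is why transport in these arrays is convection-dominated.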

  14. SIDR: simultaneous isolation and parallel sequencing of genomic DNA and total RNA from single cells.

    PubMed

    Han, Kyung Yeon; Kim, Kyu-Tae; Joung, Je-Gun; Son, Dae-Soon; Kim, Yeon Jeong; Jo, Areum; Jeon, Hyo-Jeong; Moon, Hui-Sung; Yoo, Chang Eun; Chung, Woosung; Eum, Hye Hyeon; Kim, Sangmin; Kim, Hong Kwan; Lee, Jeong Eon; Ahn, Myung-Ju; Lee, Hae-Ock; Park, Donghyun; Park, Woong-Yang

    2018-01-01

    Simultaneous sequencing of the genome and transcriptome at the single-cell level is a powerful tool for characterizing genomic and transcriptomic variation and revealing correlative relationships. However, it remains technically challenging to analyze both the genome and transcriptome in the same cell. Here, we report a novel method for simultaneous isolation of genomic DNA and total RNA (SIDR) from single cells, achieving high recovery rates with minimal cross-contamination, as is crucial for accurate description and integration of the single-cell genome and transcriptome. For reliable and efficient separation of genomic DNA and total RNA from single cells, the method uses hypotonic lysis to preserve nuclear lamina integrity and subsequently captures the cell lysate using antibody-conjugated magnetic microbeads. Evaluating the performance of this method using real-time PCR demonstrated that it efficiently recovered genomic DNA and total RNA. Thorough data quality assessments showed that DNA and RNA simultaneously fractionated by the SIDR method were suitable for genome and transcriptome sequencing analysis at the single-cell level. The integration of single-cell genome and transcriptome sequencing by SIDR (SIDR-seq) showed that genetic alterations, such as copy-number and single-nucleotide variations, were more accurately captured by single-cell SIDR-seq compared with conventional single-cell RNA-seq, although copy-number variations positively correlated with the corresponding gene expression levels. These results suggest that SIDR-seq is potentially a powerful tool to reveal genetic heterogeneity and phenotypic information inferred from gene expression patterns at the single-cell level. © 2018 Han et al.; Published by Cold Spring Harbor Laboratory Press.

  15. SIDR: simultaneous isolation and parallel sequencing of genomic DNA and total RNA from single cells

    PubMed Central

    Han, Kyung Yeon; Kim, Kyu-Tae; Joung, Je-Gun; Son, Dae-Soon; Kim, Yeon Jeong; Jo, Areum; Jeon, Hyo-Jeong; Moon, Hui-Sung; Yoo, Chang Eun; Chung, Woosung; Eum, Hye Hyeon; Kim, Sangmin; Kim, Hong Kwan; Lee, Jeong Eon; Ahn, Myung-Ju; Lee, Hae-Ock; Park, Donghyun; Park, Woong-Yang

    2018-01-01

    Simultaneous sequencing of the genome and transcriptome at the single-cell level is a powerful tool for characterizing genomic and transcriptomic variation and revealing correlative relationships. However, it remains technically challenging to analyze both the genome and transcriptome in the same cell. Here, we report a novel method for simultaneous isolation of genomic DNA and total RNA (SIDR) from single cells, achieving high recovery rates with minimal cross-contamination, as is crucial for accurate description and integration of the single-cell genome and transcriptome. For reliable and efficient separation of genomic DNA and total RNA from single cells, the method uses hypotonic lysis to preserve nuclear lamina integrity and subsequently captures the cell lysate using antibody-conjugated magnetic microbeads. Evaluating the performance of this method using real-time PCR demonstrated that it efficiently recovered genomic DNA and total RNA. Thorough data quality assessments showed that DNA and RNA simultaneously fractionated by the SIDR method were suitable for genome and transcriptome sequencing analysis at the single-cell level. The integration of single-cell genome and transcriptome sequencing by SIDR (SIDR-seq) showed that genetic alterations, such as copy-number and single-nucleotide variations, were more accurately captured by single-cell SIDR-seq compared with conventional single-cell RNA-seq, although copy-number variations positively correlated with the corresponding gene expression levels. These results suggest that SIDR-seq is potentially a powerful tool to reveal genetic heterogeneity and phenotypic information inferred from gene expression patterns at the single-cell level. PMID:29208629

  16. Monitored Geologic Repository Project Description Document

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    P. M. Curry

    2001-01-30

The primary objective of the Monitored Geologic Repository Project Description Document (PDD) is to allocate the functions, requirements, and assumptions to the systems at Level 5 of the Civilian Radioactive Waste Management System (CRWMS) architecture identified in Section 4. It provides traceability of the requirements to those contained in Section 3 of the ''Monitored Geologic Repository Requirements Document'' (MGR RD) (YMP 2000a) and other higher-level requirements documents. In addition, the PDD allocates design-related assumptions to work products of non-design organizations. The document provides Monitored Geologic Repository (MGR) technical requirements in support of design and performance assessment in preparing for the Site Recommendation (SR) and License Application (LA) milestones. The technical requirements documented in the PDD are to be captured in the System Description Documents (SDDs), which address each of the systems at Level 5 of the CRWMS architecture. The design engineers obtain the technical requirements from the SDDs and by reference from the SDDs to the PDD. The design organizations and other organizations will obtain design-related assumptions directly from the PDD. These organizations may establish additional assumptions for their individual activities, but such assumptions are not to conflict with the assumptions in the PDD. The PDD will serve as the primary link between the technical requirements captured in the SDDs and the design requirements captured in US Department of Energy (DOE) documents. The approved PDD is placed under Level 3 baseline control by the CRWMS Management and Operating Contractor (M and O), and the following portions of the PDD constitute the Technical Design Baseline for the MGR: the design characteristics listed in Table 1-1, the MGR Architecture (Section 4.1), the Technical Requirements (Section 5), and the Controlled Project Assumptions (Section 6).

  17. Reflexive photography: an alternative method for documenting the learning process of cultural competence.

    PubMed

    Amerson, Roxanne; Livingston, Wade G

    2014-04-01

    This qualitative descriptive study used reflexive photography to evaluate the learning process of cultural competence during an international service-learning project in Guatemala. Reflexive photography is an innovative qualitative research technique that examines participants' interactions with their environment through their personal reflections on images that they captured during their experience. A purposive sample of 10 baccalaureate nursing students traveled to Guatemala, where they conducted family and community assessments, engaged in home visits, and provided health education. Data collection involved over 100 photographs and a personal interview with each student. The themes developed from the photographs and interviews provided insight into the activities of an international experience that influence the cognitive, practical, and affective learning of cultural competence. Making home visits and teaching others from a different culture increased students' transcultural self-efficacy. Reflexive photography is a more robust method of self-reflection, especially for visual learners.

  18. Multi-year encoding of daily rainfall and streamflow via the fractal-multifractal method

    NASA Astrophysics Data System (ADS)

    Puente, C. E.; Maskey, M.; Sivakumar, B.

    2017-12-01

    A deterministic geometric approach, the fractal-multifractal (FM) method, which has proven faithful in encoding daily geophysical sets over a year, is used to describe records over multiple years at a time. Looking for FM parameter trends over longer periods, the present study reports FM descriptions of daily rainfall and streamflow gathered over five consecutive years, optimizing deviations on the accumulated sets. The results for 100 and 60 five-year sets of rainfall and streamflow, respectively, near Sacramento, California, illustrate that: (a) encoding of both types of data sets may be accomplished with relatively small errors; and (b) predicting the geometry of both variables appears to be possible, even five years ahead, by training neural networks on the respective FM parameters. It is emphasized that the FM approach not only captures the accumulated sets over successive pentads but also preserves other statistical attributes, including the overall "texture" of the records.

  19. Scattering Removal for Finger-Vein Image Restoration

    PubMed Central

    Yang, Jinfeng; Zhang, Ben; Shi, Yihua

    2012-01-01

    Finger-vein recognition has received increased attention recently. However, captured finger-vein images are often of poor quality, which makes finger-vein feature representation unreliable and further impairs the accuracy of finger-vein recognition. In this paper, we first analyze the intrinsic factors causing finger-vein image degradation, and then propose a simple but effective image restoration method based on scattering removal. To give a proper description of finger-vein image degradation, a biological optical model (BOM) specific to finger-vein imaging is proposed according to the principle of light propagation in biological tissues. Based on the BOM, the light-scattering component is sensibly estimated and properly removed for finger-vein image restoration. Finally, experimental results demonstrate that the proposed method is powerful in enhancing finger-vein image contrast and in improving finger-vein image matching accuracy. PMID:22737028
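The paper's biological optical model is specific to finger-vein imaging, but the core idea — estimate a smooth scattering component and suppress it before re-stretching contrast — can be illustrated generically. The box blur, the `omega` subtraction weight, and the synthetic image below are illustrative assumptions, not the authors' BOM-based estimator:

```python
import numpy as np

def box_blur(img, r):
    """Naive mean filter of radius r: a crude stand-in for a smooth,
    low-frequency (scattering-like) component estimate."""
    h, w = img.shape
    out = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = img[max(0, i - r):i + r + 1,
                            max(0, j - r):j + r + 1].mean()
    return out

def remove_scatter(img, r=3, omega=0.8):
    """Subtract a fraction omega of the estimated scattering component,
    then stretch the residual back to [0, 1]."""
    residual = img - omega * box_blur(img, r)
    residual -= residual.min()
    rng = residual.max()
    return residual / rng if rng > 0 else residual

# synthetic "finger" image: fine vein-like stripes under a smooth bright haze
y, x = np.indices((16, 16))
haze = 0.5 + 0.03 * (x + y)      # smooth scattering gradient
veins = 0.1 * ((x // 2) % 2)     # fine structure of interest
restored = remove_scatter(haze + veins, r=3)
```

With the smooth component suppressed, the fine structure occupies a larger share of the output's dynamic range, which is the contrast-enhancement effect the restoration aims for.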

  20. Nonlinear finite-element analysis of nanoindentation of viral capsids

    NASA Astrophysics Data System (ADS)

    Gibbons, Melissa M.; Klug, William S.

    2007-03-01

    Recent atomic force microscope (AFM) nanoindentation experiments measuring the mechanical response of the protein shells of viruses have provided a quantitative description of their strength and elasticity. To better understand and interpret these measurements, and to elucidate the underlying mechanisms, this paper adopts a coarse-grained modeling approach within the framework of three-dimensional nonlinear continuum elasticity. Homogeneous, isotropic, elastic, thick-shell models are proposed for two capsids: the spherical cowpea chlorotic mottle virus (CCMV) and the ellipsocylindrical bacteriophage ϕ29. As analyzed by the finite-element method, these models enable parametric characterization of the effects of AFM tip geometry, capsid dimensions, and capsid constitutive descriptions. The generally nonlinear force response of capsids to indentation is shown to be insensitive to constitutive particulars and greatly influenced by geometric and kinematic details. Nonlinear stiffening and softening of the force response depend on the AFM tip dimensions and shell thickness. Fits of the models capture the roughly linear behavior observed in experimental measurements and yield estimates of Young's moduli of ≈280-360 MPa for CCMV and ≈4.5 GPa for ϕ29.

  1. Three dimensional shape measurement of wear particle by iterative volume intersection

    NASA Astrophysics Data System (ADS)

    Wu, Hongkun; Li, Ruowei; Liu, Shilong; Rahman, Md Arifur; Liu, Sanchi; Kwok, Ngaiming; Peng, Zhongxiao

    2018-04-01

    The morphology of wear particles is a fundamental indicator from which wear-related machine health can be assessed. Previous research has shown that thorough measurement of particle shape allows more reliable interpretation of the wear mechanisms that have occurred. However, most current particle measurement techniques focus on extracting two-dimensional (2-D) morphology, while other critical particle features, including volume and thickness, are not available. A three-dimensional (3-D) shape measurement method is therefore developed to enable a more comprehensive description of particle features. The developed method is implemented in three steps: (1) particle profiles in multiple views are captured by a camera mounted above a micro fluid channel; (2) a preliminary reconstruction is obtained with the shape-from-silhouette approach using the collected particle contours; and (3) an iterative re-projection process refines this into the final 3-D measurement by minimizing the difference between the original and re-projected contours. Results from real data are presented, demonstrating the feasibility of the proposed method.
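Step (2), shape-from-silhouette, has a compact core: a voxel is kept only if it projects inside every captured silhouette. The sketch below assumes idealized orthographic projections along the coordinate axes (the paper uses camera views of particles in a micro fluid channel) and checks the two properties the re-projection step relies on: the carved hull contains the true shape, and its re-projections never exceed the input silhouettes:

```python
import numpy as np

def visual_hull(silhouettes):
    """Carve a cubic voxel grid: keep a voxel only if it lies inside the
    silhouette of every view. Keys are the orthographic projection axes."""
    n = next(iter(silhouettes.values())).shape[0]
    vol = np.ones((n, n, n), dtype=bool)
    for axis, sil in silhouettes.items():
        vol &= np.expand_dims(sil, axis=axis)   # broadcast mask along axis
    return vol

# ground-truth "particle": a discrete ball on a 32^3 grid
idx = np.indices((32, 32, 32))
truth = ((idx - 15.5) ** 2).sum(axis=0) <= 10 ** 2

# orthographic silhouettes along each axis, then the carved hull
sils = {a: truth.any(axis=a) for a in range(3)}
hull = visual_hull(sils)
```

The hull is an outer bound of the true shape; the paper's step (3) then tightens it further by re-projecting and comparing against the original contours.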

  2. Using Musical Intervals to Demonstrate Superposition of Waves and Fourier Analysis

    ERIC Educational Resources Information Center

    LoPresto, Michael C.

    2013-01-01

    What follows is a description of a demonstration of superposition of waves and Fourier analysis using a set of four tuning forks mounted on resonance boxes and oscilloscope software to create, capture and analyze the waveforms and Fourier spectra of musical intervals.
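The classroom demonstration is easy to mirror numerically: superpose two pure tones at a musical interval and recover their frequencies from the Fourier spectrum. The sampling rate, duration, and the just major third (frequency ratio 5:4, here 440 Hz and 550 Hz) are illustrative choices, not details from the article:

```python
import numpy as np

fs = 8000                         # samples per second
t = np.arange(0, 1.0, 1 / fs)    # one full second -> 1 Hz bin spacing
f1, f2 = 440.0, 550.0            # a just major third (ratio 5:4)

# superposition of the two "tuning forks"
signal = np.sin(2 * np.pi * f1 * t) + np.sin(2 * np.pi * f2 * t)

# Fourier analysis: the two components reappear as two spectral peaks
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(signal), 1 / fs)
peaks = np.sort(freqs[np.argsort(spectrum)[-2:]])   # ≈ [440, 550] Hz
```

Because each tone completes an integer number of cycles in the analysis window, the spectrum shows two clean peaks with no leakage — part of what makes tuning forks a good signal source for this demonstration.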

  3. Rational and Mechanistic Perspectives on Reinforcement Learning

    ERIC Educational Resources Information Center

    Chater, Nick

    2009-01-01

    This special issue describes important recent developments in applying reinforcement learning models to capture neural and cognitive function. But reinforcement learning, as a theoretical framework, can apply at two very different levels of description: "mechanistic" and "rational." Reinforcement learning is often viewed in mechanistic terms--as…

  4. Oxidizing and Scavenging Characteristics of April Rains - OSCAR data report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Benkovitz, C.M.; Evans, V.A.; Tichler, J.L.

    The organization of this report is as follows: Chapter 1 presents a description of the OSCAR experiment, including its objectives, design, and field deployment. Chapter 2 presents the OSCAR Central Data Coordination function and summarizes the tasks needed to compile each data set. Chapters 3 through 6 address each of the four OSCAR events. A synoptic description of each event is presented in these chapters, followed by a summary of the data captured during the event. Chapter 3 and Appendices C-G then present detailed tabular and graphical displays of the data captured during this event by the intermediate-density precipitation chemistry network, the BNL aircraft and the surface air chemistry measurements conducted by BNL and by state/province agency networks. Data from the high-density precipitation chemistry network are being presented in a separate series of reports by Pacific Northwest Laboratory. Detailed displays of the data for events 2 to 4 have not been included in this report; however, selected portions could be developed for interested parties.

  5. NursesforTomorrow: a proactive approach to nursing resource analysis.

    PubMed

    Bournes, Debra A; Plummer, Carolyn; Miller, Robert; Ferguson-Paré, Mary

    2010-03-01

    This paper describes the background, development, implementation and utilization of NursesforTomorrow (N4T), a practical and comprehensive nursing human resources analysis method to capture regional, institutional and patient care unit-specific actual and predicted nurse vacancies, nurse staff characteristics and nurse staffing changes. Reports generated from the process include forecasted shortfalls or surpluses of nurses, percentage of novice nurses, occupancy, sick time, overtime, agency use and other metrics. Readers will benefit from a description of the ways in which the data generated from the nursing resource analysis process are utilized at senior leadership, program and unit levels to support proactive hiring and resource allocation decisions and to predict unit-specific recruitment and retention patterns across multiple healthcare organizations and regions.

  6. A three-dimensional spin-diffusion model for micromagnetics

    PubMed Central

    Abert, Claas; Ruggeri, Michele; Bruckner, Florian; Vogler, Christoph; Hrkac, Gino; Praetorius, Dirk; Suess, Dieter

    2015-01-01

    We solve a time-dependent three-dimensional spin-diffusion model coupled to the Landau-Lifshitz-Gilbert equation numerically. The presented model is validated by comparison to two established spin-torque models: the model of Slonczewski, which describes spin-torque in multi-layer structures in the presence of a fixed layer, and the model of Zhang and Li, which describes current-driven domain-wall motion. It is shown that both models are incorporated by the spin-diffusion description, i.e., the nonlocal effects of the Slonczewski model are captured, as well as the spin-accumulation due to magnetization gradients as described by the model of Zhang and Li. Moreover, the presented method is able to resolve the time dependency of the spin-accumulation. PMID:26442796

  7. Description of a user-oriented geographic information system - The resource analysis program

    NASA Technical Reports Server (NTRS)

    Tilmann, S. E.; Mokma, D. L.

    1980-01-01

    This paper describes the Resource Analysis Program, an applied geographic information system. Several applications are presented which utilized soil and other natural resource data to develop integrated maps and data analyses. These applications demonstrate the methods of analysis and the philosophy of approach used in the mapping system. The applications are evaluated against four major needs of a functional mapping system: data capture, data libraries, data analysis, and mapping and data display. These four criteria are then used to describe an effort to develop the next generation of applied mapping systems. This approach uses inexpensive microcomputers for field applications and should prove to be a viable entry point for users heretofore unable or unwilling to venture into applied computer mapping.

  8. Estimating fish populations by removal methods with minnow traps in southeast Alaska streams.

    Treesearch

    M.D. Bryant

    2002-01-01

    Passive capture methods, such as minnow traps, are commonly used to capture fish for mark-recapture population estimates; however, they have not been used for removal methods. Minnow traps set for 90-min periods during three or four sequential capture occasions during the summer of 1996 were used to capture coho salmon Oncorhynchus kisutch fry and...
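For context, removal methods estimate population size from the decline in catch over successive passes. A minimal sketch of the classic two-pass estimator (Seber-Le Cren), which is generic and not specific to this study's minnow-trap protocol, assumes a closed population and equal capture probability on each pass:

```python
def two_pass_removal(c1, c2):
    """Two-pass removal estimate of population size N and per-pass capture
    probability p from catches c1 (first pass) and c2 (second pass).
    Assumes a closed population and equal effort on both passes."""
    if c1 <= c2:
        raise ValueError("estimator requires a declining catch (c1 > c2)")
    p_hat = (c1 - c2) / c1          # estimated per-pass capture probability
    n_hat = c1 ** 2 / (c1 - c2)     # estimated population size
    return n_hat, p_hat

# example: 60 fry caught on the first pass, 30 on the second
n_hat, p_hat = two_pass_removal(60, 30)
```

With catches of 60 then 30, the estimate is N = 60²/(60−30) = 120 fish at a per-pass capture probability of 0.5; three- and four-pass designs like the one described above fit the same depletion model by maximum likelihood.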

  9. TH-CD-207A-07: Prediction of High Dimensional State Subject to Respiratory Motion: A Manifold Learning Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, W; Sawant, A; Ruan, D

    Purpose: The development of high dimensional imaging systems (e.g. volumetric MRI, CBCT, photogrammetry systems) in image-guided radiotherapy provides important pathways to the ultimate goal of real-time volumetric/surface motion monitoring. This study aims to develop a prediction method for the high dimensional state subject to respiratory motion. Compared to conventional linear dimension reduction based approaches, our method utilizes manifold learning to construct a descriptive feature submanifold, where more efficient and accurate prediction can be performed. Methods: We developed a prediction framework for high-dimensional state subject to respiratory motion. The proposed method performs dimension reduction in a nonlinear setting to permit more descriptive features compared to its linear counterparts (e.g., classic PCA). Specifically, a kernel PCA is used to construct a proper low-dimensional feature manifold, where low-dimensional prediction is performed. A fixed-point iterative pre-image estimation method is applied subsequently to recover the predicted value in the original state space. We evaluated and compared the proposed method with PCA-based method on 200 level-set surfaces reconstructed from surface point clouds captured by the VisionRT system. The prediction accuracy was evaluated with respect to root-mean-squared-error (RMSE) for both 200ms and 600ms lookahead lengths. Results: The proposed method outperformed PCA-based approach with statistically higher prediction accuracy. In one-dimensional feature subspace, our method achieved mean prediction accuracy of 0.86mm and 0.89mm for 200ms and 600ms lookahead lengths respectively, compared to 0.95mm and 1.04mm from PCA-based method. The paired t-tests further demonstrated the statistical significance of the superiority of our method, with p-values of 6.33e-3 and 5.78e-5, respectively. 
Conclusion: The proposed approach benefits from the descriptiveness of a nonlinear manifold and from the reliability of prediction in such a low-dimensional manifold. The fixed-point iterative approach works well in practice for the pre-image recovery. Our approach is particularly suitable for managing respiratory motion in image-guided radiotherapy. This work is supported in part by NIH grant R01 CA169102-02.
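The two ingredients named in the abstract — kernel PCA on a Gram matrix and fixed-point pre-image recovery for a Gaussian kernel — can be sketched compactly. This is a generic illustration on synthetic data (uncentered kernel PCA for brevity), not the authors' implementation; the fixed-point update follows the standard form for RBF kernels:

```python
import numpy as np

def rbf(X, Z, width=1.0):
    """Gaussian (RBF) kernel matrix between row sets X and Z."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-width * d2)

def kpca_fit(X, width=1.0):
    """Uncentered kernel PCA: eigendecomposition of the Gram matrix."""
    lam, U = np.linalg.eigh(rbf(X, X, width))
    keep = lam > 1e-10
    return lam[keep], U[:, keep]

def preimage(X, coef, width=1.0, iters=50):
    """Fixed-point pre-image for the RBF expansion sum_i coef_i phi(x_i):
    z <- sum_i coef_i k(x_i, z) x_i / sum_i coef_i k(x_i, z)."""
    z = X.mean(axis=0)
    for _ in range(iters):
        w = coef * rbf(X, z[None, :], width)[:, 0]
        z = (w[:, None] * X).sum(axis=0) / w.sum()
    return z

# wiring check: project a training point onto the full eigenbasis,
# re-expand it over the training set, and recover its pre-image
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))
lam, U = kpca_fit(X, width=2.0)
m = 4
proj = np.sqrt(lam) * U[m]              # feature-space coordinates of phi(x_m)
coef = U @ (proj / np.sqrt(lam))        # expansion over training points
z = preimage(X, coef, width=2.0)        # recovers x_m
```

Retaining all components makes the recovery exact, which verifies the pipeline; in the predictive setting one instead forecasts the low-dimensional coordinates ahead in time and recovers the pre-image of the predicted feature-space point.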

  10. Avian models for toxicity testing

    USGS Publications Warehouse

    Hill, E.F.; Hoffman, D.J.

    1984-01-01

    The use of birds as test models in experimental and environmental toxicology as related to health effects is reviewed, and an overview of descriptive tests routinely used in wildlife toxicology is provided. Toxicologic research on birds may be applicable to human health both directly by their use as models for mechanistic and descriptive studies and indirectly as monitors of environmental quality. Topics include the use of birds as models for study of teratogenesis and embryotoxicity, neurotoxicity, behavior, trends of environmental pollution, and for use in predictive wildlife toxicology. Uses of domestic and wild-captured birds are discussed.

  11. A methodological pilot: parenting among women in substance abuse treatment.

    PubMed

    Lewin, Linda; Farkas, Kathleen; Niazi, Maryam

    2014-01-01

    Mothers who abuse substances are likely to have insecure emotional attachment with their children, placing their children at risk for social-emotional and psychiatric conditions. Sobriety does not inevitably improve parenting. We tested recruitment methods, audiovisual (AV) recording procedures, the protocol for identifying child abuse risk, the coding of mother-child interactions, and retention of the sample for repeated measures as the first phase in examining the mother-child relational quality of women in substance abuse treatment. This innovative study involved AV recordings to capture in-vivo mother-child interactional behaviors that were later coded and analyzed for mean scores on the 64-item Parent-Child Relational Quality Assessment. Repeated measurement was planned during treatment and two months after discharge from treatment. The pilot involved a small sample (n = 11) of mother-child (<6 years) dyads. Highest and lowest ratings of interaction behaviors were identified. Mothers showed less enthusiasm and creativity but matched their child's emotional state. The children showed appropriate motor skills and attachment behaviors. The dyad coding showed less mutual enjoyment between mother and child. Eight of the participants could not be located for the second measurement despite multiple contact methods. AV recordings capture rich, descriptive information that can be coded for interactional quality analysis. Repeated measurement with this cohort was not feasible, suggesting the need for additional or more frequent contacts to maintain the sample.

  12. Evidence-informed health policy 4 – Case descriptions of organizations that support the use of research evidence

    PubMed Central

    Lavis, John N; Moynihan, Ray; Oxman, Andrew D; Paulsen, Elizabeth J

    2008-01-01

    Background Previous efforts to produce case descriptions have typically not focused on the organizations that produce research evidence and support its use. External evaluations of such organizations have typically not been analyzed as a group to identify the lessons that have emerged across multiple evaluations. Case descriptions offer the potential for capturing the views and experiences of many individuals who are familiar with an organization, including staff, advocates, and critics. Methods We purposively sampled a subgroup of organizations from among those that participated in the second (interview) phase of the study and (once) from among other organizations with which we were familiar. We developed and pilot-tested a case description data collection protocol, and conducted site visits that included both interviews and documentary analyses. Themes were identified from among responses to semi-structured questions using a constant comparative method of analysis. We produced both a brief (one to two pages) written description and a video documentary for each case. Results We conducted 51 interviews as part of the eight site visits. Two organizational strengths were repeatedly cited by individuals participating in the site visits: use of an evidence-based approach (which was identified as being very time-consuming) and existence of a strong relationship between researchers and policymakers (which can be challenged by conflicts of interest). Two organizational weaknesses – a lack of resources and the presence of conflicts of interest – were repeatedly cited by individuals participating in the site visits. Participants offered two main suggestions for the World Health Organization (and other international organizations and networks): 1) mobilize one or more of government support, financial resources, and the participation of both policymakers and researchers; and 2) create knowledge-related global public goods. 
Conclusion The findings from our case descriptions, the first of their kind, intersect in interesting ways with the messages arising from two systematic reviews of the factors that increase the prospects for research use in policymaking. Strong relationships between researchers and policymakers bode well, given that such interactions appear to increase the prospects for research use. The time-consuming nature of an evidence-based approach, on the other hand, suggests the need for more efficient production processes that are 'quick and clean enough.' Our case descriptions and accompanying video documentaries provide a rich description of organizations supporting the use of research evidence, which can be drawn upon by those establishing or leading similar organizations, particularly in low- and middle-income countries. PMID:19091110

  13. Dynamic CDM strategies in an EHR environment.

    PubMed

    Bieker, Michael; Bailey, Spencer

    2012-02-01

    A dynamic charge description master (CDM) integrates information from clinical ancillary systems into the charge-capture process, so an organization can reduce its reliance on the patient accounting system as the sole source of billing information. By leveraging the information from electronic ancillary systems, providers can eliminate the need for paper charge-capture forms and see increased accuracy and efficiency in the maintenance of billing information. Before embarking on a dynamic CDM strategy, organizations should first determine their goals for implementing an EHR system, include revenue cycle leaders on the EHR implementation team, and carefully weigh the pros and cons of CDM design decisions.

  14. Neural activity in relation to empirically derived personality syndromes in depression using a psychodynamic fMRI paradigm

    PubMed Central

    Taubner, Svenja; Wiswede, Daniel; Kessler, Henrik

    2013-01-01

    Objective: The heterogeneity between patients with depression cannot be captured adequately with existing descriptive systems of diagnosis and neurobiological models of depression. Furthermore, considering the highly individual nature of depression, the application of general stimuli in past research efforts may not capture the essence of the disorder. This study aims to identify subtypes of depression by using empirically derived personality syndromes, and to explore neural correlates of the derived personality syndromes. Materials and Methods: In the present exploratory study, an individually tailored and psychodynamically based functional magnetic resonance imaging paradigm using dysfunctional relationship patterns was presented to 20 chronically depressed patients. Results from the Shedler–Westen Assessment Procedure (SWAP-200) were analyzed by Q-factor analysis to identify clinically relevant subgroups of depression and related brain activation. Results: The principal component analysis of SWAP-200 items from all 20 patients led to a two-factor solution: “Depressive Personality” and “Emotional-Hostile-Externalizing Personality.” Both factors were used in a whole-brain correlational analysis but only the second factor yielded significant positive correlations in four regions: a large cluster in the right orbitofrontal cortex (OFC), the left ventral striatum, a small cluster in the left temporal pole, and another small cluster in the right middle frontal gyrus. Discussion: The degree to which patients with depression score high on the factor “Emotional-Hostile-Externalizing Personality” correlated with relatively higher activity in three key areas involved in emotion processing, evaluation of reward/punishment, negative cognitions, depressive pathology, and social knowledge (OFC, ventral striatum, temporal pole). 
Results may contribute to an alternative description of neural correlates of depression showing differential brain activation dependent on the extent of specific personality syndromes in depression. PMID:24363644

  15. Interactomes to Biological Phase Space: a call to begin thinking at a new level in computational biology.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davidson, George S.; Brown, William Michael

    2007-09-01

    Techniques for high-throughput determination of interactomes, together with high-resolution protein colocalization maps within organelles and across membranes, will soon create a vast resource. With these data, biological descriptions akin to the high-dimensional phase spaces familiar to physicists will become possible. These descriptions will capture sufficient information to make possible realistic, system-level models of cells. The descriptions and the computational models they enable will require powerful computing techniques. This report is offered as a call to the computational biology community to begin thinking at this scale and as a challenge to develop the required algorithms and codes to make use of the new data.

  16. The health care and life sciences community profile for dataset descriptions

    PubMed Central

    Alexiev, Vladimir; Ansell, Peter; Bader, Gary; Baran, Joachim; Bolleman, Jerven T.; Callahan, Alison; Cruz-Toledo, José; Gaudet, Pascale; Gombocz, Erich A.; Gonzalez-Beltran, Alejandra N.; Groth, Paul; Haendel, Melissa; Ito, Maori; Jupp, Simon; Juty, Nick; Katayama, Toshiaki; Kobayashi, Norio; Krishnaswami, Kalpana; Laibe, Camille; Le Novère, Nicolas; Lin, Simon; Malone, James; Miller, Michael; Mungall, Christopher J.; Rietveld, Laurens; Wimalaratne, Sarala M.; Yamaguchi, Atsuko

    2016-01-01

    Access to consistent, high-quality metadata is critical to finding, understanding, and reusing scientific data. However, while there are many relevant vocabularies for the annotation of a dataset, none sufficiently captures all the necessary metadata. This prevents uniform indexing and querying of dataset repositories. Towards providing a practical guide for producing a high quality description of biomedical datasets, the W3C Semantic Web for Health Care and the Life Sciences Interest Group (HCLSIG) identified Resource Description Framework (RDF) vocabularies that could be used to specify common metadata elements and their value sets. The resulting guideline covers elements of description, identification, attribution, versioning, provenance, and content summarization. This guideline reuses existing vocabularies, and is intended to meet key functional requirements including indexing, discovery, exchange, query, and retrieval of datasets, thereby enabling the publication of FAIR data. The resulting metadata profile is generic and could be used by other domains with an interest in providing machine readable descriptions of versioned datasets. PMID:27602295

  17. Creating Body Shapes From Verbal Descriptions by Linking Similarity Spaces.

    PubMed

    Hill, Matthew Q; Streuber, Stephan; Hahn, Carina A; Black, Michael J; O'Toole, Alice J

    2016-11-01

    Brief verbal descriptions of people's bodies (e.g., "curvy," "long-legged") can elicit vivid mental images. The ease with which these mental images are created belies the complexity of three-dimensional body shapes. We explored the relationship between body shapes and body descriptions and showed that a small number of words can be used to generate categorically accurate representations of three-dimensional bodies. The dimensions of body-shape variation that emerged in a language-based similarity space were related to major dimensions of variation computed directly from three-dimensional laser scans of 2,094 bodies. This relationship allowed us to generate three-dimensional models of people in the shape space using only their coordinates on analogous dimensions in the language-based description space. Human descriptions of photographed bodies and their corresponding models matched closely. The natural mapping between the spaces illustrates the role of language as a concise code for body shape that captures perceptually salient global and local body features. © The Author(s) 2016.

  18. Describing content in middle school science curricula

    NASA Astrophysics Data System (ADS)

    Schwarz-Ballard, Jennifer A.

    As researchers and designers, we intuitively recognize differences between curricula and describe them in terms of design strategy: project-based, laboratory-based, modular, traditional, and textbook, among others. We assume that practitioners recognize the differences in how each requires that students use knowledge; however, these intuitive differences have not been captured or systematically described by the existing languages for describing learning goals. In this dissertation I argue that we need new ways of capturing relationships among elements of content, and I propose a theory that describes some of the important differences in how students reason in differently designed curricula and activities. Educational researchers and curriculum designers have taken a variety of approaches to laying out learning goals for science. Through an analysis of existing descriptions of learning goals, I argue that to describe differences in the understanding students come away with, such descriptions need to (1) be specific about the form of knowledge, (2) incorporate both the processes through which knowledge is used and its form, and (3) capture content development across a curriculum. To show the value of inquiry curricula, learning goals need to incorporate distinctions among the variety of ways we ask students to use knowledge. Here I propose the Epistemic Structures Framework as one way to describe differences in students' reasoning that are not captured by existing descriptions of learning goals. The usefulness of the Epistemic Structures Framework is demonstrated in the four curriculum case studies in Part II of this work. The curricula in the case studies represent a range of content coverage, curriculum structure, and design rationale. They serve both to illustrate the Epistemic Structures analysis process and to make the case that it does in fact describe learning goals in a way that captures important differences in students' reasoning in differently designed curricula. 
Describing learning goals in terms of Epistemic Structures provides one way to define what we mean when we talk about "project-based" curricula and demonstrate its "value added" to educators, administrators and policy makers.

  19. A New Method for Computing Three-Dimensional Capture Fraction in Heterogeneous Regional Systems using the MODFLOW Adjoint Code

    NASA Astrophysics Data System (ADS)

    Clemo, T. M.; Ramarao, B.; Kelly, V. A.; Lavenue, M.

    2011-12-01

    Capture is a measure of the impact of groundwater pumping upon groundwater and surface water systems. The computation of capture through analytical or numerical methods has been the subject of articles in the literature for several decades (Bredehoeft et al., 1982). Most recently Leake et al. (2010) described a systematic way to produce capture maps in three-dimensional systems using a numerical perturbation approach in which capture from streams was computed using unit rate pumping at many locations within a MODFLOW model. The Leake et al. (2010) method advances the current state of computing capture. A limitation stems from the computational demand of the perturbation approach, wherein days or weeks of computational time might be required to obtain a robust measure of capture. In this paper, we present an efficient method to compute capture in three-dimensional systems based upon adjoint states. The efficiency of the adjoint method will enable uncertainty analysis to be conducted on capture calculations. The USGS and INTERA have collaborated to extend the MODFLOW Adjoint code (Clemo, 2007) to include stream-aquifer interaction and have applied it to one of the examples used in Leake et al. (2010), the San Pedro Basin MODFLOW model. With five layers and 140,800 grid blocks per layer, the San Pedro Basin model provided an ideal example data set to compare the capture computed from the perturbation and the adjoint methods. The capture fraction map produced from the perturbation method for the San Pedro Basin model required significant computational time to compute, and pumping wells were therefore limited to 1530 locations in layer 4. The 1530 direct simulations of capture required approximately 76 CPU hours. Had capture been simulated in each grid block in each layer, as is done in the adjoint method, the CPU time would have been on the order of 4 years. 
The MODFLOW-Adjoint produced the capture fraction map of the San Pedro Basin model at 704,000 grid blocks (140,800 grid blocks x 5 layers) in just 6 minutes. The capture fraction maps from the perturbation and adjoint methods agree closely. The results of this study indicate that the adjoint capture method and its associated computational efficiency will enable scientists and engineers facing water resource management decisions to evaluate the sensitivity and uncertainty of impacts to regional water resource systems as part of groundwater supply strategies. Bredehoeft, J.D., S.S. Papadopulos, and H.H. Cooper Jr, Groundwater: The water budget myth. In Scientific Basis of Water-Resources Management, ed. National Research Council (U.S.), Geophysical Study Committee, 51-57. Washington D.C.: National Academy Press, 1982. Clemo, Tom, MODFLOW-2005 Ground-Water Model-Users Guide to Adjoint State based Sensitivity Process (ADJ), BSU CGISS 07-01, Center for the Geophysical Investigation of the Shallow Subsurface, Boise State University, 2007. Leake, S.A., H.W. Reeves, and J.E. Dickinson, A New Capture Fraction Method to Map How Pumpage Affects Surface Water Flow, Ground Water, 48(5), 670-700, 2010.
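The efficiency argument generalizes beyond MODFLOW: for any linear steady-state model A h = b with a scalar response J = cᵀh (e.g., flow captured by a stream reach), a single adjoint solve with Aᵀ yields dJ/db at every candidate pumping location, whereas perturbation needs one forward solve per location. The toy operator, forcing, and response weights below are hypothetical stand-ins, not the San Pedro Basin model:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50
# a simple diffusion-like (tridiagonal, nonsingular) model operator
A = 4 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = rng.normal(size=n)            # forcing (recharge/pumping) vector
c = rng.normal(size=n)            # weights defining the response J = c @ h

# adjoint method: ONE transposed solve gives the sensitivity at all n nodes
sens_adjoint = np.linalg.solve(A.T, c)

# perturbation method: one forward solve per candidate pumping location
h0 = np.linalg.solve(A, b)
eps = 1e-6
sens_pert = np.empty(n)
for i in range(n):
    bp = b.copy()
    bp[i] += eps
    sens_pert[i] = (c @ np.linalg.solve(A, bp) - c @ h0) / eps
```

Both routes agree for this linear model; the adjoint route simply replaces n forward solves with one transposed solve, which is the source of the minutes-versus-years contrast reported in the abstract.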

  20. Assessment of the rates of injury and mortality in waterfowl captured with five methods of capture and techniques for minimizing risks.

    PubMed

    O'Brien, Michelle F; Lee, Rebecca; Cromie, Ruth; Brown, Martin J

    2016-04-01

    Swan pipes, duck decoys, cage traps, cannon netting, and roundups are widely used to capture waterfowl in order to monitor populations. These methods are often regulated in countries with national ringing or banding programs and are considered to be safe, and thus justifiable given the benefits to conservation. However, few published studies have addressed how frequently injuries and mortalities occur, or the nature of any injuries. In the present study, rates of mortality and injury were assessed for captures carried out with these methods by the Wildfowl & Wetlands Trust as part of its conservation programs. The total rate of injury (including mild dermal abrasions) was 0.42% across all species groups, whereas total mortality was 0.1% across all capture methods. Incidence of injury varied among species groups (ducks, geese, swans, and rails), with some, for example, dabbling ducks, at greater risk than others. We also describe techniques used before, during, and after a capture to reduce stress and injury in captured waterfowl. Projects using these or other capture methods should monitor and publish their performance to allow sharing of experience and to reduce risks further.

  1. Tailoring the Variational Implicit Solvent Method for New Challenges: Biomolecular Recognition and Assembly

    PubMed Central

    Ricci, Clarisse Gravina; Li, Bo; Cheng, Li-Tien; Dzubiella, Joachim; McCammon, J. Andrew

    2018-01-01

    Predicting solvation free energies and describing the complex water behavior that plays an important role in essentially all biological processes is a major challenge from the computational standpoint. While an atomistic, explicit description of the solvent can turn out to be too expensive in large biomolecular systems, most implicit solvent methods fail to capture “dewetting” effects and heterogeneous hydration by relying on a pre-established (i.e., guessed) solvation interface. Here we focus on the Variational Implicit Solvent Method (VISM), an implicit solvent method that adds water “plasticity” back to the picture by formulating the solvation free energy as a functional of all possible solvation interfaces. We survey VISM's applications to the problem of molecular recognition and report some of the most recent efforts to tailor VISM for more challenging scenarios, with the ultimate goal of including thermal fluctuations into the framework. The advances reported herein pave the way to make VISM a uniquely successful approach to characterize complex solvation properties in the recognition and binding of large-scale biomolecular complexes. PMID:29484300

  2. Coupled forward-backward trajectory approach for nonequilibrium electron-ion dynamics

    NASA Astrophysics Data System (ADS)

    Sato, Shunsuke A.; Kelly, Aaron; Rubio, Angel

    2018-04-01

    We introduce a simple ansatz for the wave function of a many-body system based on coupled forward and backward propagating semiclassical trajectories. This method is primarily aimed at, but not limited to, treating nonequilibrium dynamics in electron-phonon systems. The time evolution of the system is obtained from the Euler-Lagrange variational principle, and we show that this ansatz yields Ehrenfest mean-field theory in the limit that the forward and backward trajectories are orthogonal, and in the limit that they coalesce. We investigate accuracy and performance of this method by simulating electronic relaxation in the spin-boson model and the Holstein model. Although this method involves only pairs of semiclassical trajectories, it shows a substantial improvement over mean-field theory, capturing quantum coherence of nuclear dynamics as well as electron-nuclear correlations. This improvement is particularly evident in nonadiabatic systems, where the accuracy of this coupled trajectory method extends well beyond the perturbative electron-phonon coupling regime. This approach thus provides an attractive route forward to the ab initio description of relaxation processes, such as thermalization, in condensed phase systems.

  3. 40 CFR 63.11509 - What are my notification, reporting, and recordkeeping requirements?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... part. (3) The records required to show continuous compliance with each management practice and... management practices and equipment standards. (iii) Description of the capture and emission control systems... that is subject to the requirements in § 63.11507(b), “What are my standards and management practices...

  4. 40 CFR 63.11509 - What are my notification, reporting, and recordkeeping requirements?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... part. (3) The records required to show continuous compliance with each management practice and... management practices and equipment standards. (iii) Description of the capture and emission control systems... that is subject to the requirements in § 63.11507(b), “What are my standards and management practices...

  5. 40 CFR 63.11509 - What are my notification, reporting, and recordkeeping requirements?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... part. (3) The records required to show continuous compliance with each management practice and... management practices and equipment standards. (iii) Description of the capture and emission control systems... that is subject to the requirements in § 63.11507(b), “What are my standards and management practices...

  6. Stories from Hopescapes

    ERIC Educational Resources Information Center

    Harris, Violet J.

    2011-01-01

    Author Virginia Hamilton had the gift of creating lyrical phrases that captured the complexities of life. Among her most notable phrases is the idea of the "hopescape," the metaphoric description of the pains and joys, triumphs and defeats, longing, and dreams that make us human. The publication of an edited volume that compiles a sampling of…

  7. The Futility of Attempting to Codify Academic Achievement Standards

    ERIC Educational Resources Information Center

    Sadler, D. Royce

    2014-01-01

    Internationally, attempts at developing explicit descriptions of academic achievement standards have been steadily intensifying. The aim has been to capture the essence of the standards in words, symbols or diagrams (collectively referred to as codifications) so that standards can be: set and maintained at appropriate levels; made broadly…

  8. Computational modeling of the human auditory periphery: Auditory-nerve responses, evoked potentials and hearing loss.

    PubMed

    Verhulst, Sarah; Altoè, Alessandro; Vasilkov, Viacheslav

    2018-03-01

    Models of the human auditory periphery range from very basic functional descriptions of auditory filtering to detailed computational models of cochlear mechanics, inner-hair cell (IHC), auditory-nerve (AN) and brainstem signal processing. It is challenging to include detailed physiological descriptions of cellular components in human auditory models because single-cell data stems from invasive animal recordings while human reference data only exists in the form of population responses (e.g., otoacoustic emissions, auditory evoked potentials). To embed physiological models within a comprehensive human auditory periphery framework, it is important to capitalize on the success of basic functional models of hearing and render their descriptions more biophysical where possible. At the same time, comprehensive models should capture a variety of key auditory features, rather than fitting their parameters to a single reference dataset. In this study, we review and improve existing models of the IHC-AN complex by updating their equations and expressing their fitting parameters as biophysical quantities. The quality of the model framework for human auditory processing is evaluated using recorded auditory brainstem response (ABR) and envelope-following response (EFR) reference data from normal and hearing-impaired listeners. We present a model with 12 fitting parameters from the cochlea to the brainstem that can be rendered hearing impaired to simulate how cochlear gain loss and synaptopathy affect human population responses. The model description forms a compromise between capturing well-described single-unit IHC and AN properties and human population response features. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.

  9. Implementation and flight tests for the Digital Integrated Automatic Landing System (DIALS). Part 1: Flight software equations, flight test description and selected flight test data

    NASA Technical Reports Server (NTRS)

    Hueschen, R. M.

    1986-01-01

    Five flight tests of the Digital Integrated Automatic Landing System (DIALS) were conducted on the Advanced Transport Operating Systems (ATOPS) Transportation Research Vehicle (TSRV) -- a modified Boeing 737 aircraft for advanced controls and displays research. These flight tests were conducted at NASA's Wallops Flight Center using the microwave landing system (MLS) installation on runway 22. This report describes the flight software equations of the DIALS, which was designed using modern control theory direct-digital design methods and employed a constant gain Kalman filter. Selected flight test performance data is presented for localizer (runway centerline) capture and track at various intercept angles, for glideslope capture and track of 3, 4.5, and 5 degree glideslopes, for the decrab maneuver, and for the flare maneuver. Data is also presented to illustrate the system performance in the presence of cross, gust, and shear winds. The mean and standard deviation of the peak position errors for localizer capture were, respectively, 24 feet and 26 feet. For mild wind conditions, glideslope and localizer tracking position errors did not exceed, respectively, 5 and 20 feet. For gusty wind conditions (8 to 10 knots), these errors were, respectively, 10 and 30 feet. Ten hands-off automatic landings were performed. The standard deviations of the touchdown position and velocity errors from the mean values were, respectively, 244 feet and 0.7 feet/sec.

  10. Production data in media systems and press front ends: capture, formats and database methods

    NASA Astrophysics Data System (ADS)

    Karttunen, Simo

    1997-02-01

    The nature, purpose and data presentation features of media jobs are analyzed in relation to the content, document, process and resource management in media production. Formats are the natural way of presenting, collecting and storing information, contents, document components and final documents. The state of the art and the trends in media formats and production data are reviewed. The types and the amount of production data are listed, e.g. events, schedules, product descriptions, reports, visual support, quality, process states and color data. The data exchange must be vendor-neutral. Adequate infrastructure and system architecture are defined for production and media data. The roles of open servers and intranets are evaluated and their potential roles as future solutions are anticipated. The press front end is the part of print media production where large files dominate. The new output alternatives, i.e. film recorders, direct plate output (CTP and CTP-on-press) and digital, plateless printing lines need new workflow tools and very efficient file and format management. The paper analyzes the capture, formatting and storing of job files and respective production data, such as the event logs of the processes. Intranet, browsers, Java applets and open web servers will be used to capture production data, especially where intranets are used anyhow, or where several companies are networked to plan, design and use documents and printed products. The user aspects of installing intranets are stressed, since there are numerous more traditional and more dedicated networking solutions on the market.

  11. Pre-capture multiplexing improves efficiency and cost-effectiveness of targeted genomic enrichment.

    PubMed

    Shearer, A Eliot; Hildebrand, Michael S; Ravi, Harini; Joshi, Swati; Guiffre, Angelica C; Novak, Barbara; Happe, Scott; LeProust, Emily M; Smith, Richard J H

    2012-11-14

    Targeted genomic enrichment (TGE) is a widely used method for isolating and enriching specific genomic regions prior to massively parallel sequencing. To make effective use of sequencer output, barcoding and sample pooling (multiplexing) after TGE and prior to sequencing (post-capture multiplexing) has become routine. While previous reports have indicated that multiplexing prior to capture (pre-capture multiplexing) is feasible, no thorough examination of the effect of this method has been completed on a large number of samples. Here we compare standard post-capture TGE to two levels of pre-capture multiplexing: 12 or 16 samples per pool. We evaluated these methods using standard TGE metrics and determined the ability to identify several classes of genetic mutations in three sets of 96 samples, including 48 controls. Our overall goal was to maximize cost reduction and minimize experimental time while maintaining a high percentage of reads on target and a high depth of coverage at thresholds required for variant detection. We adapted the standard post-capture TGE method for pre-capture TGE with several protocol modifications, including redesign of blocking oligonucleotides and optimization of enzymatic and amplification steps. Pre-capture multiplexing reduced costs for TGE by at least 38% and significantly reduced hands-on time during the TGE protocol. We found that pre-capture multiplexing reduced capture efficiency by 23 or 31% for pre-capture pools of 12 and 16, respectively. However, efficiency losses at this step can be compensated for by reducing the number of simultaneously sequenced samples. Pre-capture multiplexing and post-capture TGE performed similarly with respect to variant detection of positive control mutations. In addition, we detected no instances of sample switching due to aberrant barcode identification. 
Pre-capture multiplexing improves efficiency of TGE experiments with respect to hands-on time and reagent use compared to standard post-capture TGE. A decrease in capture efficiency is observed when using pre-capture multiplexing; however, it does not negatively impact variant detection and can be accommodated by the experimental design.

  12. Minimizing capture-related stress on white-tailed deer with a capture collar

    USGS Publications Warehouse

    DelGiudice, G.D.; Kunkel, K.E.; Mech, L.D.; Seal, U.S.

    1990-01-01

    We compared the effect of 3 capture methods for white-tailed deer (Odocoileus virginianus) on blood indicators of acute excitement and stress from 1 February to 20 April 1989. Eleven adult females were captured by Clover trap or cannon net between 1 February and 9 April 1989 in northeastern Minnesota [USA]. These deer were fitted with radio-controlled capture collars, and 9 deer were recaptured 7-33 days later. Trapping method affected serum cortisol (P < 0.0001), hemoglobin (Hb) (P < 0.06), and packed cell volume (PCV) (P < 0.07). Cortisol concentrations were lower (P < 0.0001) in capture-collared deer (0.54 ± 0.07 [SE] µg/dL) compared to Clover-trapped (4.37 ± 0.69 µg/dL) and cannon-netted (3.88 ± 0.82 µg/dL) deer. Capture-collared deer were minimally stressed compared to deer captured by traditional methods. Use of the capture collar should permit more accurate interpretation of blood profiles of deer for assessment of condition and general health.

  13. A New Method for Determining Hamaker Constants of Solids Based on the Dynamic Approach Behavior of an Atomic Force Microscope

    NASA Astrophysics Data System (ADS)

    Fronczak, Sean G.

    The Hamaker constant, A, is a quantitative measure of the fundamental attractive van der Waals (vdW) interaction for microscale and nanoscale materials. This parameter captures each material's compositional effects on the vdW force, which is often needed as input for predicting the vdW interactions between particles and surfaces. Experimental attempts to determine A using an atomic force microscope (AFM) are typically hindered by issues inherent to the cantilever-tip-surface contact regime, such as surface roughness and deformation, and contact separation distance. Thus, we developed a new method for estimating Hamaker constants from the non-contact approach regime of an AFM experiment (Fronczak et al., 2017, Langmuir 33, 714-725). This method invokes a quasi-dynamic description of the cantilever tip's approach to contact, in which the inertial effects of the tip motion are accounted for when analyzing the trajectory of the tip's approach towards the substrate. The method was tested experimentally using silica, alumina and polystyrene substrates, and was demonstrated to yield estimates of A for these materials that were in very good agreement with previously published Lifshitz calculations. As with various other approaches to determining A, our new method relies heavily on the accuracy of the geometric model used to predict the interaction between the AFM tip and the substrate. For the initial validation experiments of our new method, we therefore focused on describing the shape of the cantilever tip as closely as possible, utilizing a complex model of a truncated pyramid with a spherical cap. Although this pyramidal geometry can be confirmed and the dimensions estimated via scanning electron microscopy (SEM), even high-resolution SEM images of the tip cannot provide sufficient detail to allow precise enough determination of the tip's geometric parameters. 
Consequently, we also propose an adaptation of the method, in which these difficult-to-quantify geometric effects are still fully captured via the convenient description of the tip as an 'effective' perfect sphere. Hence, the geometric complexity of the cantilever tip is no longer explicitly required for the determination of A. First, a tip is 'calibrated', whereby the deflection at first contact between the cantilever tip and a smooth surface of known vdW properties is determined and an effective radius, Reff, of the tip is calculated. The tip's approach to contact toward other similarly smooth surfaces can then be well described by using only this single geometric parameter. We demonstrate the practicality and accuracy of this updated method by comparing the results with both the original pyramid model and Lifshitz approximations (when available) for flat substrates composed of silica, polystyrene, highly ordered pyrolytic graphite (HOPG), sapphire (alpha-Al2O3), Plexiglas (PMMA), and acrylonitrile butadiene styrene (ABS). Then, the modified quasi-dynamic model was employed to study the strength of the adhesive interaction between TNT and several swab materials which are used as explosive detection devices at security checkpoints. This information is crucial for the development and improvement of next-generation swab detection protocols to further advance this field. Finally, we also include the effects of thermal noise in our quasi-dynamic description of the cantilever motion to better understand how such noise might influence the accuracy of our method. We likewise determine, for the first time, the effects of instrument noise on the accuracy of other approach-to-contact methodologies for determining A.
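Once an effective radius has been calibrated, the payoff of the sphere description is that standard Hamaker-theory expressions apply directly. A minimal sketch of the textbook sphere-plate force law F(D) = A·R/(6·D²) follows; the Hamaker constant and tip radius below are illustrative values (a commonly cited Lifshitz-type value for silica and a plausible tip radius), not numbers taken from the dissertation.

```python
# Sphere-plate van der Waals force in the Hamaker (pairwise) picture:
#   F(D) = A * R_eff / (6 * D**2)
# The numerical inputs are illustrative assumptions, not results from the thesis.

A_SILICA = 6.5e-20   # J, a commonly cited silica-silica Hamaker constant in vacuum
R_EFF = 20e-9        # m, an assumed 'effective sphere' AFM tip radius

def vdw_force(A, R, D):
    """Magnitude of the attractive sphere-plate vdW force at separation D (m)."""
    return A * R / (6.0 * D**2)

F = vdw_force(A_SILICA, R_EFF, 1e-9)  # evaluate at 1 nm separation
print(f"{F:.2e} N")                    # ~0.2 nN scale attraction
```

The inverse-square divergence of F(D) as D shrinks is what makes the approach-to-contact trajectory sensitive to A, which is the effect the quasi-dynamic method exploits.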

  14. Using seemingly unnecessary illustrations to improve the diagnostic usefulness of descriptions in taxonomy–a case study on Perochaeta orientalis (Diptera, Sepsidae)

    PubMed Central

    Ang, Yuchen; Wong, Ling Jing; Meier, Rudolf

    2013-01-01

    Many species descriptions, especially older ones, consist mostly of text and have few illustrations. Only the most conspicuous morphological features needed for species diagnosis and delimitation at the time of description are illustrated. Such descriptions can quickly become inadequate when new species or characters are discovered. We propose that descriptions should become more data-rich by presenting a large number of images and illustrations to cover as much morphology as possible; these descriptions are more likely to remain adequate over time because their large amounts of visual data could capture character systems that may become important in the future. Such an approach can now be quickly and easily achieved given that high-quality digital photography is readily available. Here, we re-describe the sepsid fly Perochaeta orientalis (de Meijere 1913) (Diptera, Sepsidae), which has suffered from inadequate descriptions in the past, and use photomicrography, scanning electron microscopy and videography to document its external morphology and mating behaviour. All images and videos are embedded within the electronic publication. We briefly discuss the benefits of and problems with our approach. PMID:24363567

  15. Application and comparison of large-scale solution-based DNA capture-enrichment methods on ancient DNA

    PubMed Central

    Ávila-Arcos, María C.; Cappellini, Enrico; Romero-Navarro, J. Alberto; Wales, Nathan; Moreno-Mayar, J. Víctor; Rasmussen, Morten; Fordyce, Sarah L.; Montiel, Rafael; Vielle-Calzada, Jean-Philippe; Willerslev, Eske; Gilbert, M. Thomas P.

    2011-01-01

    The development of second-generation sequencing technologies has greatly benefitted the field of ancient DNA (aDNA). Its application can be further exploited by the use of targeted capture-enrichment methods to overcome restrictions posed by low endogenous and contaminating DNA in ancient samples. We tested the performance of Agilent's SureSelect and Mycroarray's MySelect in-solution capture systems on Illumina sequencing libraries built from ancient maize to identify key factors influencing aDNA capture experiments. High levels of clonality as well as the presence of multiple-copy sequences in the capture targets led to biases in the data regardless of the capture method. Neither method consistently outperformed the other in terms of average target enrichment, and no obvious difference was observed either when two tiling designs were compared. In addition to demonstrating the plausibility of capturing aDNA from ancient plant material, our results also enable us to provide useful recommendations for those planning targeted-sequencing on aDNA. PMID:22355593

  16. Automatic Defect Detection for TFT-LCD Array Process Using Quasiconformal Kernel Support Vector Data Description

    PubMed Central

    Liu, Yi-Hung; Chen, Yan-Jen

    2011-01-01

    Defect detection has been considered an efficient way to increase the yield rate of panels in thin film transistor liquid crystal display (TFT-LCD) manufacturing. In this study we focus on the array process since it is the first and key process in TFT-LCD manufacturing. Various defects occur in the array process, and some of them could cause great damage to the LCD panels. Thus, how to design a method that can robustly detect defects from the images captured from the surface of LCD panels has become crucial. Previously, support vector data description (SVDD) has been successfully applied to LCD defect detection. However, its generalization performance is limited. In this paper, we propose a novel one-class machine learning method, called quasiconformal kernel SVDD (QK-SVDD), to address this issue. The QK-SVDD can significantly improve generalization performance of the traditional SVDD by introducing the quasiconformal transformation into a predefined kernel. Experimental results, carried out on real LCD images provided by an LCD manufacturer in Taiwan, indicate that the proposed QK-SVDD not only obtains a high defect detection rate of 96%, but also greatly improves the generalization performance of SVDD; the improvement is shown to be over 30%. In addition, results also show that the QK-SVDD defect detector is able to accomplish the task of defect detection on an LCD image within 60 ms. PMID:22016625
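The "data description" idea underlying SVDD can be illustrated with a much simpler stand-in: enclose the defect-free training samples in a ball and flag anything that falls outside it. The sketch below is a bare centroid-and-radius model, not the kernelized QK-SVDD of the paper (which fits a minimal enclosing ball in a quasiconformally transformed kernel feature space); the 2-D feature vectors are made-up examples.

```python
# Minimal one-class "data description" sketch: fit a ball around normal
# (defect-free) samples, then flag test points outside it as defects.
# This is a simplified centroid/radius model, NOT the kernelized QK-SVDD.

def fit_ball(samples):
    """Return (center, radius) of a ball enclosing all training samples."""
    dim = len(samples[0])
    center = [sum(s[i] for s in samples) / len(samples) for i in range(dim)]
    dist = lambda s: sum((s[i] - center[i]) ** 2 for i in range(dim)) ** 0.5
    radius = max(dist(s) for s in samples)
    return center, radius

def is_defect(sample, center, radius):
    """A sample outside the fitted ball is rejected as a defect."""
    d = sum((sample[i] - center[i]) ** 2 for i in range(len(center))) ** 0.5
    return d > radius

normal = [(1.0, 1.0), (1.2, 0.9), (0.8, 1.1), (1.1, 1.2)]  # made-up feature vectors
center, radius = fit_ball(normal)
print(is_defect((1.0, 1.05), center, radius))  # False: inside the description
print(is_defect((3.0, 3.0), center, radius))   # True: flagged as a defect
```

SVDD improves on this by allowing slack for outliers and by fitting the ball in a kernel-induced feature space, so the decision boundary in the original space can be highly non-spherical.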

  17. Many-body dispersion effects in the binding of adsorbates on metal surfaces

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maurer, Reinhard J.; Ruiz, Victor G.; Tkatchenko, Alexandre

    2015-09-14

    A correct description of electronic exchange and correlation effects for molecules in contact with extended (metal) surfaces is a challenging task for first-principles modeling. In this work, we demonstrate the importance of collective van der Waals dispersion effects beyond the pairwise approximation for organic–inorganic systems on the example of atoms, molecules, and nanostructures adsorbed on metals. We use the recently developed many-body dispersion (MBD) approach in the context of density-functional theory [Tkatchenko et al., Phys. Rev. Lett. 108, 236402 (2012) and Ambrosetti et al., J. Chem. Phys. 140, 18A508 (2014)] and assess its ability to correctly describe the binding of adsorbates on metal surfaces. We briefly review the MBD method and highlight its similarities to quantum-chemical approaches to electron correlation in a quasiparticle picture. In particular, we study the binding properties of xenon, 3,4,9,10-perylene-tetracarboxylic acid, and a graphene sheet adsorbed on the Ag(111) surface. Accounting for MBD effects, we are able to describe changes in the anisotropic polarizability tensor, improve the description of adsorbate vibrations, and correctly capture the adsorbate–surface interaction screening. Comparison to other methods and experiment reveals that inclusion of MBD effects improves adsorption energies and geometries, by reducing the overbinding typically found in pairwise additive dispersion-correction approaches.

  18. Automatic defect detection for TFT-LCD array process using quasiconformal kernel support vector data description.

    PubMed

    Liu, Yi-Hung; Chen, Yan-Jen

    2011-01-01

    Defect detection has been considered an efficient way to increase the yield rate of panels in thin film transistor liquid crystal display (TFT-LCD) manufacturing. In this study we focus on the array process since it is the first and key process in TFT-LCD manufacturing. Various defects occur in the array process, and some of them could cause great damage to the LCD panels. Thus, how to design a method that can robustly detect defects from the images captured from the surface of LCD panels has become crucial. Previously, support vector data description (SVDD) has been successfully applied to LCD defect detection. However, its generalization performance is limited. In this paper, we propose a novel one-class machine learning method, called quasiconformal kernel SVDD (QK-SVDD), to address this issue. The QK-SVDD can significantly improve generalization performance of the traditional SVDD by introducing the quasiconformal transformation into a predefined kernel. Experimental results, carried out on real LCD images provided by an LCD manufacturer in Taiwan, indicate that the proposed QK-SVDD not only obtains a high defect detection rate of 96%, but also greatly improves the generalization performance of SVDD; the improvement is shown to be over 30%. In addition, results also show that the QK-SVDD defect detector is able to accomplish the task of defect detection on an LCD image within 60 ms.

  19. Accurate simulations of helium pick-up experiments using a rejection-free Monte Carlo method

    NASA Astrophysics Data System (ADS)

    Dutra, Matthew; Hinde, Robert

    2018-04-01

    In this paper, we present Monte Carlo simulations of helium droplet pick-up experiments with the intention of developing a robust and accurate theoretical approach for interpreting experimental helium droplet calorimetry data. Our approach is capable of capturing the evaporative behavior of helium droplets following dopant acquisition, allowing for a more realistic description of the pick-up process. Furthermore, we circumvent the traditional assumption of bulk helium behavior by utilizing density functional calculations of the size-dependent helium droplet chemical potential. The results of this new Monte Carlo technique are compared to commonly used Poisson pick-up statistics for simulations that reflect a broad range of experimental parameters. We conclude by offering an assessment of both of these theoretical approaches in the context of our observed results.
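The "Poisson pick-up statistics" that the abstract benchmarks against model the number of dopants k acquired by a droplet traversing the pick-up cell as Poisson-distributed, P(k) = mᵏ e⁻ᵐ / k!, with mean m set by the dopant vapor density and cell length. A short sketch of that baseline (the mean value below is an illustrative assumption):

```python
from math import exp, factorial

# Poisson pick-up statistics: probability that a helium droplet captures
# exactly k dopants while crossing the pick-up cell. The mean number of
# pick-up events, m, scales with dopant pressure and path length; the value
# used below is illustrative, not taken from the paper.

def pickup_probability(k, mean):
    """P(k) = mean**k * exp(-mean) / k!"""
    return mean**k * exp(-mean) / factorial(k)

m = 1.0  # assumed mean number of pick-up events
probs = [pickup_probability(k, m) for k in range(4)]
print([f"{p:.3f}" for p in probs])  # at m = 1, P(0) = P(1) = e**-1 ≈ 0.368
```

The Monte Carlo approach in the paper departs from this baseline by letting droplet size, and hence capture cross-section, shrink through evaporative cooling after each pick-up event, rather than keeping the mean fixed.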

  20. Design Rules and Analysis of a Capture Mechanism for Rendezvous between a Space Tether and Payload

    NASA Technical Reports Server (NTRS)

    Sorensen, Kirk F.; Canfield, Stephen L.; Norris, Marshall A.

    2006-01-01

    Momentum-exchange/electrodynamic reboost (MXER) tether systems have been proposed to serve as an "upper stage in space". A MXER tether station would boost spacecraft from low Earth orbit to a high-energy orbit quickly, like a high-thrust rocket. Then, it would slowly rebuild its orbital momentum through electrodynamic thrust, minimizing the use of propellant. One of the primary challenges in developing a momentum-exchange/electrodynamic reboost tether system, as identified by the 2003 MXER Technology Assessment Group, is the development of a mechanism that will enable the processes of capture, carry, and release of a payload by the rotating tether, as required by the MXER tether approach. This paper presents a concept that achieves the desired goals of the capture system. This solution is presented as a multi-DOF (degree-of-freedom) capture mechanism with nearly passive operation that features matching of the capture space to the expected window of capture error, efficient use of mass, and nearly passive actuation during the capture process. This paper describes the proposed capture mechanism concept and provides an evaluation of the concept through a dynamic model and experimental tests performed on a prototype article of the mechanism in a dynamically similar environment. It also develops a set of rules to guide the design of such a capture mechanism based on analytical and experimental analyses. The primary contributions of this paper are a description of the proposed capture mechanism concept, a collection of rules to guide its design, and empirical and model information that can be used to evaluate the capability of the concept.

  1. Data Sources for Trait Databases: Comparing the Phenomic Content of Monographs and Evolutionary Matrices.

    PubMed

    Dececchi, T Alex; Mabee, Paula M; Blackburn, David C

    2016-01-01

    Databases of organismal traits that aggregate information from one or multiple sources can be leveraged for large-scale analyses in biology. Yet the differences among these data streams and how well they capture trait diversity have never been explored. We present the first analysis of the differences between phenotypes captured in free text of descriptive publications ('monographs') and those used in phylogenetic analyses ('matrices'). We focus our analysis on osteological phenotypes of the limbs of four extinct vertebrate taxa critical to our understanding of the fin-to-limb transition. We find that there is low overlap between the anatomical entities used in these two sources of phenotype data, indicating that phenotypes represented in matrices are not simply a subset of those found in monographic descriptions. Perhaps as expected, compared to characters found in matrices, phenotypes in monographs tend to emphasize descriptive and positional morphology, be somewhat more complex, and relate to fewer additional taxa. While based on a small set of focal taxa, these qualitative and quantitative data suggest that either source of phenotypes alone will result in incomplete knowledge of variation for a given taxon. As a broader community develops to use and expand databases characterizing organismal trait diversity, it is important to recognize the limitations of the data sources and develop strategies to more fully characterize variation both within species and across the tree of life.

  2. Data Sources for Trait Databases: Comparing the Phenomic Content of Monographs and Evolutionary Matrices

    PubMed Central

    Dececchi, T. Alex; Mabee, Paula M.; Blackburn, David C.

    2016-01-01

    Databases of organismal traits that aggregate information from one or multiple sources can be leveraged for large-scale analyses in biology. Yet the differences among these data streams and how well they capture trait diversity have never been explored. We present the first analysis of the differences between phenotypes captured in free text of descriptive publications (‘monographs’) and those used in phylogenetic analyses (‘matrices’). We focus our analysis on osteological phenotypes of the limbs of four extinct vertebrate taxa critical to our understanding of the fin-to-limb transition. We find that there is low overlap between the anatomical entities used in these two sources of phenotype data, indicating that phenotypes represented in matrices are not simply a subset of those found in monographic descriptions. Perhaps as expected, compared to characters found in matrices, phenotypes in monographs tend to emphasize descriptive and positional morphology, be somewhat more complex, and relate to fewer additional taxa. While based on a small set of focal taxa, these qualitative and quantitative data suggest that either source of phenotypes alone will result in incomplete knowledge of variation for a given taxon. As a broader community develops to use and expand databases characterizing organismal trait diversity, it is important to recognize the limitations of the data sources and develop strategies to more fully characterize variation both within species and across the tree of life. PMID:27191170

  3. Discovery as a process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Loehle, C.

    1994-05-01

    The three great myths, which form a sort of triumvirate of misunderstanding, are the Eureka! myth, the hypothesis myth, and the measurement myth. These myths are prevalent among scientists as well as among observers of science. The Eureka! myth asserts that discovery occurs as a flash of insight, and as such is not subject to investigation. This leads to the perception that discovery or deriving a hypothesis is a moment or event rather than a process. Events are singular and not subject to description. The hypothesis myth asserts that proper science is motivated by testing hypotheses, and that if something is not experimentally testable then it is not scientific. This myth leads to absurd posturing by some workers conducting empirical descriptive studies, who dress up their study with a "hypothesis" to obtain funding or get it published. Methods papers are often rejected because they do not address a specific scientific problem. The fact is that many of the great breakthroughs in science involve methods and not hypotheses, or arise from largely descriptive studies. Those captured by this myth also try to block funding for those developing methods. The third myth is the measurement myth, which holds that determining what to measure is straightforward, so one doesn't need a lot of introspection to do science. As one ecologist put it to me: "Don't give me any of that philosophy junk, just let me out in the field. I know what to measure." These myths lead to difficulties for scientists who must face peer review to obtain funding and to get published. These myths also inhibit the study of science as a process. Finally, these myths inhibit creativity and suppress innovation. In this paper I first explore these myths in more detail and then propose a new model of discovery that opens the supposedly miraculous process of discovery to closer scrutiny.

  4. Learning about Ecological Systems by Constructing Qualitative Models with DynaLearn

    ERIC Educational Resources Information Center

    Leiba, Moshe; Zuzovsky, Ruth; Mioduser, David; Benayahu, Yehuda; Nachmias, Rafi

    2012-01-01

    A qualitative model of a system is an abstraction that captures ordinal knowledge and predicts the set of qualitatively possible behaviours of the system, given a qualitative description of its structure and initial state. This paper examines an innovative approach to science education using an interactive learning environment that supports…

  5. Captured by Details: Sense-Making, Language and Communication in Autism

    ERIC Educational Resources Information Center

    Noens, Ilse L. J.; van Berckelaer-Onnes, Ina A.

    2005-01-01

    The communication of people with autism spectrum disorder (ASD) is characterized by a qualitative impairment in verbal and non-verbal communication. In past decades a growing body of descriptive studies has appeared on language and communication problems in ASD. Reviews suggest that the development of formal and semantic aspects is relatively…

  6. A Continuum Description of Nonlinear Elasticity, Slip and Twinning, With Application to Sapphire

    DTIC Science & Technology

    2009-03-01

    Twinning is modelled via the isochoric term FI, and residual volume changes associated with defects are captured by the Jacobian determinant J.

  7. A Phenomenological Study: The Experience of Live Supervision during a Pre-Practicum Counseling Techniques Course

    ERIC Educational Resources Information Center

    Koltz, Rebecca L.; Feit, Stephen S.

    2012-01-01

    The experiences of live supervision for three master's-level, pre-practicum counseling students were explored using a phenomenological methodology. Using semi-structured interviews, this study produced a thick description of the experience of live supervision, capturing participants' thoughts, emotions, and behaviors. Data revealed that live…

  8. s -wave scattering length of a Gaussian potential

    NASA Astrophysics Data System (ADS)

    Jeszenszki, Peter; Cherny, Alexander Yu.; Brand, Joachim

    2018-04-01

    We provide accurate expressions for the s-wave scattering length for a Gaussian potential well in one, two, and three spatial dimensions. The Gaussian potential is widely used as a pseudopotential in the theoretical description of ultracold-atomic gases, where the s-wave scattering length is a physically relevant parameter. We first describe a numerical procedure to compute the value of the s-wave scattering length from the parameters of the Gaussian, but find that its accuracy is limited in the vicinity of singularities that result from the formation of new bound states. We then derive simple analytical expressions that capture the correct asymptotic behavior of the s-wave scattering length near the bound states. Expressions that are increasingly accurate in wide parameter regimes are found by a hierarchy of approximations that capture an increasing number of bound states. The small number of numerical coefficients that enter these expressions is determined from accurate numerical calculations. The approximate formulas combine the advantages of the numerical and approximate expressions, yielding an accurate and simple description from the weakly to the strongly interacting limit.
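    The numerical procedure the abstract alludes to can be sketched in 3D as follows. This is a minimal illustration (units hbar = m = 1, invented parameters), not the authors' code: integrate the zero-energy radial equation u''(r) = 2 V(r) u(r) outward and read the scattering length a off the free asymptote u(r) ~ C (r - a):

```python
import numpy as np

def scattering_length(v0, sigma=1.0, rmax=12.0, n=20000):
    """s-wave scattering length of the well V(r) = -v0 * exp(-(r/sigma)**2)
    in units hbar = m = 1, where the zero-energy radial equation reads
    u''(r) = 2 V(r) u(r).  Integrate outward with a Stoermer-Verlet step,
    then extrapolate the free asymptote u(r) ~ C * (r - a)."""
    h = rmax / n
    r = h * np.arange(n + 1)
    f = -2.0 * v0 * np.exp(-(r / sigma) ** 2)   # u'' = f(r) * u
    u = np.empty(n + 1)
    u[0], u[1] = 0.0, h                          # u(0) = 0, arbitrary unit slope
    for i in range(1, n):
        u[i + 1] = 2.0 * u[i] - u[i - 1] + h * h * f[i] * u[i]
    slope = (u[n] - u[n - 1]) / h                # V has died off by rmax
    return r[n] - u[n] / slope

print(scattering_length(0.1))  # weak well: small negative a (Born limit ~ -0.089)
```

    As the well deepens toward the first bound state, a diverges, which is the numerical difficulty the authors note near the singularities.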

  9. Role of nuclear reactions on stellar evolution of intermediate-mass stars

    NASA Astrophysics Data System (ADS)

    Möller, H.; Jones, S.; Fischer, T.; Martínez-Pinedo, G.

    2018-01-01

    The evolution of intermediate-mass stars (8-12 solar masses) represents one of the most challenging subjects in nuclear astrophysics. Their final fate is highly uncertain and strongly model-dependent: they can become white dwarfs, they can undergo electron-capture or core-collapse supernovae, or they might even proceed to explosive oxygen burning and a subsequent thermonuclear explosion. We believe that an accurate description of nuclear reactions is crucial for determining the pre-supernova structure of these stars. We argue that, owing to the possible development of an oxygen deflagration, a hydrodynamic description has to be used. We implement a nuclear reaction network with ∼200 nuclear species in the implicit hydrodynamic code AGILE. The reaction network considers all relevant nuclear electron captures and beta-decays. For selected relevant nuclear species we include a set of updated reaction rates, whose role in the evolution of the stellar core we discuss using selected stellar models. We find that the final fate of these intermediate-mass stars depends sensitively on the density threshold for the weak processes that deleptonize the core.

  10. Differential estimates of southern flying squirrel (Glaucomys volans) population structure based on capture method

    Treesearch

    Kevin S. Laves; Susan C. Loeb

    2005-01-01

    It is commonly assumed that population estimates derived from trapping small mammals are accurate and unbiased or that estimates derived from different capture methods are comparable. We captured southern flying squirrels (Glaucomys volans) using two methods to study their effect on red-cockaded woodpecker (Picoides borealis) reproductive success. Southern flying...
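    For context on how capture data become population estimates, the classic mark-recapture calculation is the bias-corrected Lincoln-Petersen (Chapman) estimator; the numbers below are invented for illustration and are not from this study:

```python
def chapman_estimate(marked, caught, recaptured):
    """Chapman's bias-corrected Lincoln-Petersen abundance estimate:
    N ~ (M + 1)(C + 1)/(R + 1) - 1, where M animals were marked on the
    first occasion, C were caught on the second, and R of those bore marks."""
    return (marked + 1) * (caught + 1) / (recaptured + 1) - 1

# Invented capture numbers (illustrative only).
print(chapman_estimate(49, 43, 10))  # -> 199.0
```

    Any method-dependent bias in capture probability feeds directly into M, C, and R, which is why estimates from different capture methods need not be comparable.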

  11. Model for transport and reaction of defects and carriers within displacement cascades in gallium arsenide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wampler, William R., E-mail: wrwampl@sandia.gov; Myers, Samuel M.

    A model is presented for recombination of charge carriers at evolving displacement damage in gallium arsenide, which includes clustering of the defects in atomic displacement cascades produced by neutron or ion irradiation. The carrier recombination model is based on an atomistic description of capture and emission of carriers by the defects with time evolution resulting from the migration and reaction of the defects. The physics and equations on which the model is based are presented, along with the details of the numerical methods used for their solution. The model uses a continuum description of diffusion, field-drift and reaction of carriers and defects within a representative spherically symmetric cluster of defects. The initial radial defect profiles within the cluster were determined through pair-correlation-function analysis of the spatial distribution of defects obtained from the binary-collision code MARLOWE, using recoil energies for fission neutrons. Properties of the defects are discussed and values for their parameters are given, many of which were obtained from density functional theory. The model provides a basis for predicting the transient response of III-V heterojunction bipolar transistors to displacement damage from energetic particle irradiation.

  12. COMPLETE DETERMINATION OF POLARIZATION FOR A HIGH-ENERGY DEUTERON BEAM (thesis)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Button, J

    1959-05-01

    The P1 multigroup code was written for the IBM-704 in order to determine the accuracy of the few-group diffusion scheme with various imposed conditions and also to provide an alternate computational method when this scheme fails to be sufficiently accurate. The code solves for the spatially dependent multigroup flux, taking into account such nuclear phenomena as slowing down of neutrons resulting from elastic and inelastic scattering, the removal of neutrons resulting from epithermal capture and fission resonances, and the regeneration of fast neutrons resulting from fissioning, which may occur in any of as many as 80 fast multigroups or in the one thermal group. The code will accept as input a physical description of the reactor (that is: slab, cylindrical, or spherical geometry; number of points and regions; composition description; group-dependent boundary conditions; transverse buckling; and mesh sizes) and a prepared library of nuclear properties of all the isotopes in each composition. The code will produce as output multigroup fluxes, currents, and isotopic slowing-down densities, in addition to pointwise and regionwise few-group macroscopic cross sections. (auth)

  13. Sequence context and crosslinking mechanism affect the efficiency of in vivo capture of a protein-protein interaction

    PubMed Central

    Lancia, Jody K.; Nwokoye, Adaora; Dugan, Amanda; Joiner, Cassandra; Pricer, Rachel; Mapp, Anna K.

    2014-01-01

    Protein-protein interactions (PPIs) are essential for implementing cellular processes and thus methods for the discovery and study of PPIs are highly desirable. An emerging method for capturing PPIs in their native cellular environment is in vivo covalent chemical capture, a method that uses nonsense suppression to site specifically incorporate photoactivable unnatural amino acids in living cells. However, in one study we found that this method did not capture a PPI for which there was abundant functional evidence, a complex formed between the transcriptional activator Gal4 and its repressor protein Gal80. Here we describe the factors that influence the success of covalent chemical capture and show that the innate reactivity of the two unnatural amino acids utilized, (p-benzoylphenylalanine (pBpa) and p-azidophenylalanine (pAzpa)), plays a profound role in the capture of Gal80 by Gal4. Based upon these data, guidelines are outlined for the successful use of in vivo photo-crosslinking to capture novel PPIs and to characterize the interfaces. PMID:24037947

  14. Recommended methods for monitoring change in bird populations by counting and capture of migrants

    Treesearch

    David J. T. Hussell; C. John Ralph

    2005-01-01

    Counts and banding captures of spring or fall migrants can generate useful information on the status and trends of the source populations. To do so, the counts and captures must be taken and recorded in a standardized and consistent manner. We present recommendations for field methods for counting and capturing migrants at intensively operated sites, such as bird...

  15. The Metadata Coverage Index (MCI): A standardized metric for quantifying database metadata richness.

    PubMed

    Liolios, Konstantinos; Schriml, Lynn; Hirschman, Lynette; Pagani, Ioanna; Nosrat, Bahador; Sterk, Peter; White, Owen; Rocca-Serra, Philippe; Sansone, Susanna-Assunta; Taylor, Chris; Kyrpides, Nikos C; Field, Dawn

    2012-07-30

    Variability in the extent of the descriptions of data ('metadata') held in public repositories forces users to assess the quality of records individually, which rapidly becomes impractical. The scoring of records on the richness of their description provides a simple, objective proxy measure for quality that enables filtering that supports downstream analysis. Pivotally, such descriptions should spur on improvements. Here, we introduce such a measure - the 'Metadata Coverage Index' (MCI): the percentage of available fields actually filled in a record or description. MCI scores can be calculated across a database, for individual records or for their component parts (e.g., fields of interest). There are many potential uses for this simple metric: for example, to filter, rank or search for records; to assess the metadata availability of an ad hoc collection; to determine the frequency with which fields in a particular record type are filled, especially with respect to standards compliance; to assess the utility of specific tools and resources, and of data capture practice more generally; to prioritize records for further curation; to serve as performance metrics of funded projects; or to quantify the value added by curation. Here we demonstrate the utility of MCI scores using metadata from the Genomes Online Database (GOLD), including records compliant with the 'Minimum Information about a Genome Sequence' (MIGS) standard developed by the Genomic Standards Consortium. We discuss challenges and address the further application of MCI scores: to show improvements in annotation quality over time, to inform the work of standards bodies and repository providers on the usability and popularity of their products, and to assess and credit the work of curators. Such an index provides a step towards putting metadata capture practices and, in the future, standards compliance into a quantitative and objective framework.
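    As defined above (the percentage of available fields actually filled), the MCI is straightforward to compute. A minimal sketch with a hypothetical field list and record, treating None and empty strings as unfilled:

```python
def mci(record, fields):
    """Metadata Coverage Index: the percentage of the available fields
    that are actually filled in a record (None or "" counts as unfilled)."""
    filled = sum(1 for f in fields if record.get(f) not in (None, ""))
    return 100.0 * filled / len(fields)

# Hypothetical field list and record (illustrative only).
fields = ["organism", "habitat", "latitude", "longitude", "collection_date"]
record = {"organism": "Escherichia coli", "habitat": "soil", "latitude": ""}
print(mci(record, fields))  # 2 of 5 fields filled -> 40.0
```

    Averaging `mci` over all records in a collection gives the database-level score the authors describe.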

  16. Modelling of Sub-daily Hydrological Processes Using Daily Time-Step Models: A Distribution Function Approach to Temporal Scaling

    NASA Astrophysics Data System (ADS)

    Kandel, D. D.; Western, A. W.; Grayson, R. B.

    2004-12-01

    Mismatches in scale between the fundamental processes, the model and supporting data are a major limitation in hydrologic modelling. Surface runoff generation via infiltration excess and the process of soil erosion are fundamentally short time-scale phenomena and their average behaviour is mostly determined by the short time-scale peak intensities of rainfall. Ideally, these processes should be simulated using time-steps of the order of minutes to appropriately resolve the effect of rainfall intensity variations. However, sub-daily data support is often inadequate and the processes are usually simulated by calibrating daily (or even coarser) time-step models. Generally process descriptions are not modified but rather effective parameter values are used to account for the effect of temporal lumping, assuming that the effect of the scale mismatch can be counterbalanced by tuning the parameter values at the model time-step of interest. Often this results in parameter values that are difficult to interpret physically. A similar approach is often taken spatially. This is problematic as these processes generally operate or interact non-linearly. This indicates a need for better techniques to simulate sub-daily processes using daily time-step models while still using widely available daily information. A new method applicable to many rainfall-runoff-erosion models is presented. The method is based on temporal scaling using statistical distributions of rainfall intensity to represent sub-daily intensity variations in a daily time-step model. This allows the effect of short time-scale nonlinear processes to be captured while modelling at a daily time-step, which is often attractive due to the wide availability of daily forcing data. The approach relies on characterising the rainfall intensity variation within a day using a cumulative distribution function (cdf). 
This cdf is then modified by various linear and nonlinear processes typically represented in hydrological and erosion models. The statistical description of sub-daily variability is thus propagated through the model, allowing the effects of variability to be captured in the simulations. This results in cdfs of various fluxes, the integration of which over a day gives respective daily totals. Using 42-plot-years of surface runoff and soil erosion data from field studies in different environments from Australia and Nepal, simulation results from this cdf approach are compared with the sub-hourly (2-minute for Nepal and 6-minute for Australia) and daily models having similar process descriptions. Significant improvements in the simulation of surface runoff and erosion are achieved, compared with a daily model that uses average daily rainfall intensities. The cdf model compares well with a sub-hourly time-step model. This suggests that the approach captures the important effects of sub-daily variability while utilizing commonly available daily information. It is also found that the model parameters are more robustly defined using the cdf approach compared with the effective values obtained at the daily scale. This suggests that the cdf approach may offer improved model transferability spatially (to other areas) and temporally (to other periods).
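    The nonlinearity that motivates this distribution-based approach can be seen in a toy infiltration-excess calculation. The intensities and the constant infiltration capacity below are invented for illustration (a discrete stand-in for the paper's cdf), not the authors' model:

```python
import numpy as np

# Hypothetical sub-daily rainfall intensities for one day (mm/h, 2-hour
# blocks) and a constant infiltration capacity f (mm/h) -- all invented.
intensities = np.array([0, 0, 2, 8, 25, 40, 12, 3, 0, 0, 0, 0], float)
f = 10.0
hours_per_block = 2.0

# Daily-average model: the runoff threshold is applied to the mean intensity.
runoff_daily = max(intensities.mean() - f, 0) * hours_per_block * len(intensities)

# Distribution-based model (the cdf idea, in discrete form): the threshold
# is applied to each intensity, then integrated over the day.
runoff_cdf = np.maximum(intensities - f, 0).sum() * hours_per_block

print(runoff_daily, runoff_cdf)  # 0.0 vs 94.0: bursts above f are lost by averaging
```

    Because max(i - f, 0) is nonlinear, applying it to the daily mean intensity (7.5 mm/h here) predicts zero runoff, while integrating over the intensity distribution captures the high-intensity bursts.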

  17. A real time sorbent based air monitoring system for determining low level airborne exposure levels to Lewisite

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lattin, F.G.; Paul, D.G.; Jakubowski, E.M.

    1994-12-31

    The Real Time Analytical Platform (RTAP) is designed to provide mobile, real-time monitoring support to ensure protection of worker safety in areas where military unique compounds are used and stored, and at disposal sites. Quantitative analysis of low-level vapor concentrations in air is accomplished through sorbent-based collection with subsequent thermal desorption into a gas chromatograph (GC) equipped with a variety of detectors. The monitoring system is characterized by its sensitivity (ability to measure at low concentrations), selectivity (ability to filter out interferences), dynamic range and linearity, real time mode (versus methods requiring extensive sample preparation procedures), and ability to interface with complementary GC detectors. This presentation describes an RTAP analytical method for analyzing lewisite, an arsenical compound, that consists of a GC screening technique with an Electron Capture Detector (ECD), and a confirmation technique using an Atomic Emission Detector (AED). Included in the presentation is a description of quality assurance objectives in the monitoring system, and an assessment of method accuracy, precision and detection levels.

  18. Examination of Scanning Electron Microscope and Computed Tomography Images of PICA

    NASA Technical Reports Server (NTRS)

    Lawson, John W.; Stackpoole, Margaret M.; Shklover, Valery

    2010-01-01

    Micrographs of PICA (Phenolic Impregnated Carbon Ablator) taken using a Scanning Electron Microscope (SEM) and 3D images taken with a Computed Tomography (CT) system are examined. PICA is a carbon-fiber-based composite (Fiberform) with a phenolic polymer matrix. The micrographs are taken at different surface depths and at different magnifications in a sample after arc jet testing and show different levels of oxidative removal of the charred matrix (Figs. 1 through 13). CT scans, courtesy of Xradia, Inc. of Concord, CA, were captured for samples of virgin PICA, charred PICA and raw Fiberform (Fig. 14). We use these images to calculate the thermal conductivity (TC) of these materials using correlation function (CF) methods. CF methods give a mathematical description of how one material is embedded in another and are thus ideally suited for modeling composites like PICA. We will evaluate how the TC of the materials changes as a function of surface depth. This work is in collaboration with ETH-Zurich, which has expertise in high-temperature materials and TC modeling (including CF methods).
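    The correlation-function machinery referred to here starts from the two-point correlation S2 of a segmented (binary) microstructure image. A minimal sketch on synthetic data, assuming periodic boundaries and computing S2 as an FFT autocorrelation (a standard construction, not NASA's code):

```python
import numpy as np

def two_point_correlation(img):
    """Two-point correlation S2 of a binary (0/1) image, computed as a
    periodic autocorrelation via FFT: S2[dx, dy] is the probability that
    two points separated by (dx, dy) both fall in phase 1."""
    F = np.fft.fftn(img)
    return np.fft.ifftn(F * np.conj(F)).real / img.size

rng = np.random.default_rng(0)
img = (rng.random((64, 64)) < 0.3).astype(float)  # synthetic ~30% phase fraction
S2 = two_point_correlation(img)
print(S2[0, 0])  # the zero-lag value equals the phase volume fraction
```

    For a real micrograph, S2 would be computed on the segmented fiber/matrix image; effective-property bounds and estimates for quantities such as thermal conductivity are then built from such statistical descriptors.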

  19. A comparison of hematology, plasma chemistry, and injuries in Hickory shad (Alosa mediocris) captured by electrofishing or angling during a spawning run.

    PubMed

    Matsche, Mark A; Rosemary, Kevin; Stence, Charles P

    2017-09-01

    Declines in Hickory shad (Alosa mediocris) populations in Chesapeake Bay have prompted efforts at captive propagation of wild broodfish for stock enhancement and research. The objectives of this study were to evaluate injuries sustained, and immediate and delayed (24 hours) effects on blood variables, related to two capture methods (electrofishing [EF] and angling). Blood specimens were collected from fish immediately following capture by EF and angling (n = 40 per sex and capture method) from the Susquehanna River (MD, USA). Additional fish (n = 25 per sex and capture method) were collected on the same day, placed in holding tanks and bled 24 hours following capture. Blood data that were non-Gaussian in distribution were transformed (Box-Cox), and effects of sex, method of capture, and holding time were tested using ANOVA with general linear models. Fish were evaluated for injuries by necropsy and radiography. Sex-specific differences were observed for RBC, HGB, PCV, MCH, MCHC, total proteins (TP), globulins, glucose, calcium, AST, CK, and lactate, while RBC, HGB, PCV, MCV, MCH, MCHC, TP, albumin, globulins, glucose, potassium, sodium, AST, CK, and lactate differed significantly by capture method. Electrofishing may have induced greater disruption in blood variables, but mortality (4%) was not significantly different compared to angling. Electrofishing for Hickory shad using a constant DC voltage resulted in numerous hematologic and biochemical changes, with no additional injuries or deaths compared to angling. Capture method must be considered when evaluating fish condition, and blood variables should be partitioned by sex during spawning season. © 2017 American Society for Veterinary Clinical Pathology.
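    The Box-Cox normalization step mentioned above is available off the shelf. A small synthetic sketch (the values below are invented stand-ins, not the study's measurements):

```python
import numpy as np
from scipy import stats

# Synthetic right-skewed, strictly positive values standing in for a
# blood chemistry variable (invented data).
rng = np.random.default_rng(1)
values = rng.lognormal(mean=1.0, sigma=0.8, size=200)

# Box-Cox transform toward normality before linear-model ANOVA;
# scipy estimates the lambda that maximizes the log-likelihood.
transformed, lam = stats.boxcox(values)
print(f"skew before: {stats.skew(values):.2f}, after: {stats.skew(transformed):.2f}")
```

    Box-Cox requires strictly positive data; variables containing zeros would need a shift or a different transform before the ANOVA.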

  20. Three-dimensional visualization of the craniofacial patient: volume segmentation, data integration and animation.

    PubMed

    Enciso, R; Memon, A; Mah, J

    2003-01-01

    The research goal at the Craniofacial Virtual Reality Laboratory of the School of Dentistry in conjunction with the Integrated Media Systems Center, School of Engineering, University of Southern California, is to develop computer methods to accurately visualize patients in three dimensions using advanced imaging and data acquisition devices such as cone-beam computerized tomography (CT) and mandibular motion capture. Data from these devices were integrated for three-dimensional (3D) patient-specific visualization, modeling and animation. Generic methods are in development that can be used with common CT image format (DICOM), mesh format (STL) and motion data (3D position over time). This paper presents preliminary descriptive studies on: 1) segmentation of the lower and upper jaws with two types of CT data--(a) traditional whole head CT data and (b) the new dental Newtom CT; 2) manual integration of accurate 3D tooth crowns with the segmented lower jaw 3D model; 3) realistic patient-specific 3D animation of the lower jaw.

  1. A multiscale product approach for an automatic classification of voice disorders from endoscopic high-speed videos.

    PubMed

    Unger, Jakob; Schuster, Maria; Hecker, Dietmar J; Schick, Bernhard; Lohscheller, Joerg

    2013-01-01

    Direct observation of vocal fold vibration is indispensable for a clinical diagnosis of voice disorders. Among current imaging techniques, high-speed videoendoscopy constitutes a state-of-the-art method capturing several thousand frames per second of the vocal folds during phonation. Recently, a method was presented for extracting descriptive features from phonovibrograms, two-dimensional images containing the spatio-temporal pattern of vocal fold dynamics. The derived features are closely related to a clinically established protocol for functional assessment of pathologic voices. The discriminative power of these features for different pathologic findings and configurations had not yet been assessed. In the current study, a cohort of 220 subjects is considered for two- and multi-class problems of healthy and pathologic findings. The performance of the proposed feature set is compared to conventional feature reduction routines and was found to clearly outperform them. As such, the proposed procedure shows great potential for the diagnostic assessment of vocal fold disorders.

  2. A pairwise maximum entropy model accurately describes resting-state human brain networks

    PubMed Central

    Watanabe, Takamitsu; Hirose, Satoshi; Wada, Hiroyuki; Imai, Yoshio; Machida, Toru; Shirouzu, Ichiro; Konishi, Seiki; Miyashita, Yasushi; Masuda, Naoki

    2013-01-01

    The resting-state human brain networks underlie fundamental cognitive functions and consist of complex interactions among brain regions. However, the level of complexity of the resting-state networks has not been quantified, which has prevented comprehensive descriptions of the brain activity as an integrative system. Here, we address this issue by demonstrating that a pairwise maximum entropy model, which takes into account region-specific activity rates and pairwise interactions, can be robustly and accurately fitted to resting-state human brain activities obtained by functional magnetic resonance imaging. Furthermore, to validate the approximation of the resting-state networks by the pairwise maximum entropy model, we show that the functional interactions estimated by the pairwise maximum entropy model reflect anatomical connexions more accurately than the conventional functional connectivity method. These findings indicate that a relatively simple statistical model not only captures the structure of the resting-state networks but also provides a possible method to derive physiological information about various large-scale brain networks. PMID:23340410
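    A pairwise maximum entropy fit of the kind described can be sketched for a toy problem. The synthetic "activity" data and the plain gradient-ascent fitting loop below are illustrative assumptions (exact enumeration over 0/1 states, feasible only for a few regions), not the authors' pipeline:

```python
import itertools
import numpy as np

def fit_pairwise_maxent(data, iters=5000, lr=0.2):
    """Fit a pairwise maximum entropy model P(s) proportional to
    exp(h.s + 0.5 * s'Js) to binary activity patterns by exact gradient
    ascent (enumerates all 2**n states, so only feasible for small n).
    At convergence the model matches the data's mean activation rates
    and pairwise co-activation rates."""
    n = data.shape[1]
    states = np.array(list(itertools.product([0, 1], repeat=n)), float)
    m_data = data.mean(axis=0)
    c_data = data.T @ data / len(data)
    h, J = np.zeros(n), np.zeros((n, n))
    for _ in range(iters):
        E = states @ h + 0.5 * np.einsum('si,ij,sj->s', states, J, states)
        p = np.exp(E - E.max())
        p /= p.sum()
        m = p @ states                          # model mean activations
        c = states.T @ (states * p[:, None])    # model co-activation matrix
        h += lr * (m_data - m)
        dJ = c_data - c
        np.fill_diagonal(dJ, 0.0)               # s_i**2 = s_i: diagonal folds into h
        J += lr * (dJ + dJ.T) / 2.0
    return h, J, states, p

# Synthetic "binarized ROI activity": a shared drive induces pairwise correlations.
rng = np.random.default_rng(0)
drive = rng.random((500, 1))
data = (rng.random((500, 3)) < 0.2 + 0.5 * drive).astype(float)
h, J, states, p = fit_pairwise_maxent(data)
```

    For real fMRI data, activity is first binarized per region; the fitted couplings J are then the quantities the authors compare against anatomical connexions.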

  3. An improved KCF tracking algorithm based on multi-feature and multi-scale

    NASA Astrophysics Data System (ADS)

    Wu, Wei; Wang, Ding; Luo, Xin; Su, Yang; Tian, Weiye

    2018-02-01

    The purpose of visual tracking is to associate the target object across successive video frames. In recent years, methods based on the kernel correlation filter have become a research hotspot. However, such algorithms still have problems, such as sensitivity to fast jitter of the video capture equipment and to changes in target scale. To improve the handling of scale transformation and the feature description, this paper presents an algorithm based on multi-feature fusion and multi-scale transformation. The experimental results show that our method addresses target-model updating when the target is occluded or its scale changes. The accuracy of the evaluation (OPE) is 77.0% and 75.4%, and the success rate is 69.7% and 66.4%, on the VOT and OTB datasets respectively. Compared with the best of the existing target-based tracking algorithms, the accuracy of the algorithm is improved by 6.7% and 6.3% respectively, and the success rates are improved by 13.7% and 14.2% respectively.

  4. One-nucleon pickup reactions and compound-nuclear decays

    NASA Astrophysics Data System (ADS)

    Escher, J. E.; Burke, J. T.; Casperson, R. J.; Hughes, R. O.; Scielzo, N. D.

    2018-05-01

    One-nucleon transfer reactions, long used as a tool to study the structure of nuclei, are potentially valuable for determining reaction cross sections indirectly. This is significant, as many reactions of interest to astrophysics and other applications involve short-lived isotopes and cannot be measured directly. We describe a procedure for obtaining constraints for calculations of neutron capture cross sections using observables from experiments with transfer reactions. As a first step toward demonstrating the method, we outline the theory developments used to properly describe the production of the compound nucleus 88Y* via the one-nucleon pickup reaction 89Y(p,d)88Y* and test the description with data from a recent experiment. We indicate how this development can be used to extract the unknown 87Y(n,γ) cross section from 89Y(p,dγ) data. The example illustrates a more generally applicable method for determining unknown cross sections via a combination of theory and transfer (or inelastic scattering) experiments.

  5. Center-TRACON Automation System (CTAS) En Route Trajectory Predictor Requirements and Capabilities

    NASA Technical Reports Server (NTRS)

    Vivona, Robert; Cate, Karen Tung

    2013-01-01

    This requirements framework document is designed to support the capture of requirements and capabilities for state-of-the-art trajectory predictors (TPs). This framework has been developed to assist TP experts in capturing a clear, consistent, and cross-comparable set of requirements and capabilities. The goal is to capture capabilities (types of trajectories that can be built), functional requirements (including inputs and outputs), non-functional requirements (including prediction accuracy and computational performance), approaches for constraint relaxation, and input uncertainties. The sections of this framework are based on the Common Trajectory Predictor structure developed by the FAA/Eurocontrol Cooperative R&D Action Plan 16 Committee on Common Trajectory Prediction. It is assumed that the reader is familiar with the Common TP Structure. This initial draft is intended as a first-cut capture of the En Route TS Capabilities and Requirements. As such, it contains many annotations indicating possible logic errors in the CTAS code or in the description provided. The intent is to work out the details of the annotations with NASA and to update this document at a later time.

  6. A method to assess social sustainability of capture fisheries: An application to a Norwegian trawler

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Veldhuizen, L.J.L., E-mail: linda.veldhuizen@wur.nl; Berentsen, P.B.M.; Bokkers, E.A.M.

Social sustainability assessment of capture fisheries is, in terms of both method development and measurement, not well developed. The objective of this study, therefore, was to develop a method consisting of indicators and rubrics (i.e., categories that articulate levels of performance) to assess the social sustainability of capture fisheries. This method was applied to a Norwegian trawler that targets cod and haddock in the northeast Atlantic. Based on previous research, 13 social sustainability issues were selected. To measure the state of these issues, 17 process and outcome indicators were determined. To interpret indicator values, rubrics were developed for each indicator, using standards set by international conventions or data retrieved from national statistics, industry agreements or scientific publications that explore rubric scales. The indicators and rubrics were subsequently used in a social sustainability assessment of a Norwegian trawler. This assessment indicated that overall, the social sustainability of this trawler is relatively high, with high rubric scores, for example, for worker safety, provisions aboard for the crew and companies' salary levels. The assessment also indicated that the trawler could improve on healthy working environment, product freshness and fish welfare during capture. This application demonstrated that our method provides insight into social sustainability at the level of the vessel and can be used to identify potential room for improvement. The method is also promising for social sustainability assessment of other capture fisheries. - Highlights: • A method was developed for social sustainability assessment of capture fisheries. • This method entailed determining outcome and process indicators for important issues. • To interpret indicator values, a rubric was developed for each indicator. • Use of this method gives insight into social sustainability and improvement options. • This method is promising for social sustainability assessment of capture fisheries.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Couture, Aaron Joseph

    This report documents aspects of direct and indirect neutron capture. The importance of neutron capture rates and methods to determine them are presented. The following conclusions are drawn: direct neutron capture measurements remain a backbone of experimental study; work is being done to take increased advantage of indirect methods for neutron capture; both instrumentation and facilities are making new measurements possible; more work is needed on the nuclear theory side to understand what is needed furthest from stability.

  8. Towards refactoring the Molecular Function Ontology with a UML profile for function modeling.

    PubMed

    Burek, Patryk; Loebe, Frank; Herre, Heinrich

    2017-10-04

    Gene Ontology (GO) is the largest resource for cataloging gene products. This resource grows steadily and, naturally, this growth raises issues regarding the structure of the ontology. Moreover, modeling and refactoring large ontologies such as GO is generally far from being simple, as a whole as well as when focusing on certain aspects or fragments. It seems that human-friendly graphical modeling languages such as the Unified Modeling Language (UML) could be helpful in connection with these tasks. We investigate the use of UML for making the structural organization of the Molecular Function Ontology (MFO), a sub-ontology of GO, more explicit. More precisely, we present a UML dialect, called the Function Modeling Language (FueL), which is suited for capturing functions in an ontologically founded way. FueL is equipped, among other features, with language elements that arise from studying patterns of subsumption between functions. We show how to use this UML dialect for capturing the structure of molecular functions. Furthermore, we propose and discuss some refactoring options concerning fragments of MFO. FueL enables the systematic, graphical representation of functions and their interrelations, including making information explicit that is currently either implicit in MFO or is mainly captured in textual descriptions. Moreover, the considered subsumption patterns lend themselves to the methodical analysis of refactoring options with respect to MFO. On this basis we argue that the approach can increase the comprehensibility of the structure of MFO for humans and can support communication, for example, during revision and further development.

  9. Description and application of capture zone delineation for a wellfield at Hilton Head Island, South Carolina

    USGS Publications Warehouse

    Landmeyer, J.E.

    1994-01-01

Ground-water capture zone boundaries for individual pumped wells in a confined aquifer were delineated by using ground-water models. Both analytical and numerical (semi-analytical) models that more accurately represent the ground-water-flow system were used. All models delineated 2-dimensional boundaries (capture zones) that represent the areal extent of ground-water contribution to a pumped well. The resultant capture zones were evaluated on the basis of the ability of each model to realistically represent the part of the ground-water-flow system that contributed water to the pumped wells. Analytical models used were based on a fixed-radius approach and included: an arbitrary radius model, a calculated fixed radius model based on the volumetric-flow equation with a time-of-travel criterion, and a calculated fixed radius model derived from modification of the Theis model with a drawdown criterion. Numerical models used included the 2-dimensional, finite-difference models RESSQC and MWCAP. The arbitrary radius and Theis analytical models delineated capture zone boundaries that compared least favorably with capture zones delineated using the volumetric-flow analytical model and both numerical models. The numerical models produced more hydrologically reasonable capture zones (oriented parallel to the regional flow direction) than the volumetric-flow equation. The RESSQC numerical model computed more hydrologically realistic capture zones than the MWCAP numerical model by accounting for changes in the shape of capture zones caused by multiple-well interference. The capture zone boundaries generated by using both analytical and numerical models indicated that the currently used 100-foot radius of protection around a wellhead in South Carolina underestimates the extent of ground-water capture for pumped wells in this particular wellfield in the Upper Floridan aquifer. The arbitrary fixed radius of 100 feet was shown to underestimate the upgradient contribution of ground-water flow to a pumped well.
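The calculated fixed radius with a time-of-travel criterion mentioned above can be sketched from the volumetric-flow equation. The numbers below (pumping rate, time of travel, porosity, aquifer thickness) are hypothetical, chosen only to illustrate the calculation, not values from the study:

```python
import math

def calculated_fixed_radius(Q, t, n, b):
    """Volumetric-flow fixed radius: the water pumped over time t,
    Q * t, is equated to the pore volume of a cylinder of radius r,
    pi * r**2 * n * b, giving r = sqrt(Q * t / (pi * n * b))."""
    return math.sqrt(Q * t / (math.pi * n * b))

# Hypothetical inputs: Q = 500 m3/day, 5-year time of travel,
# porosity n = 0.25, aquifer thickness b = 30 m.
r = calculated_fixed_radius(Q=500.0, t=5 * 365.0, n=0.25, b=30.0)
print(round(r, 1))  # radius in meters
```

Even with these modest hypothetical inputs the time-of-travel radius comes out several times larger than 100 feet (30.5 m), consistent with the study's conclusion that a fixed 100-foot protection radius underestimates the capture zone.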

  10. Children's Understanding of Large-Scale Mapping Tasks: An Analysis of Talk, Drawings, and Gesture

    ERIC Educational Resources Information Center

    Kotsopoulos, Donna; Cordy, Michelle; Langemeyer, Melanie

    2015-01-01

This research examined how children represent motion in large-scale mapping tasks that we refer to as "motion maps". The underlying mathematical content was transformational geometry. In total, 19 children aged 8 to 10 created motion maps and digitally captured them with accompanying verbal descriptions. Analysis of…

  11. 40 CFR 63.11509 - What are my notification, reporting, and recordkeeping requirements?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... management practices and equipment standards. (iii) Description of the capture and emission control systems... subject to the requirements in § 63.11507(a)(1), “What are my standards and management practices?”, you... that is subject to the requirements in § 63.11507(b), “What are my standards and management practices...

  12. MSFC Propulsion Systems Department Knowledge Management Project

    NASA Technical Reports Server (NTRS)

    Caraccioli, Paul A.

    2007-01-01

    This slide presentation reviews the Knowledge Management (KM) project of the Propulsion Systems Department at Marshall Space Flight Center. KM is needed to support knowledge capture, preservation and to support an information sharing culture. The presentation includes the strategic plan for the KM initiative, the system requirements, the technology description, the User Interface and custom features, and a search demonstration.

  13. Art in Social Studies: Exploring the World and Ourselves with Rembrandt

    ERIC Educational Resources Information Center

    Ahmad, Iftikhar

    2008-01-01

Rembrandt's art offers a fertile resource for teaching and learning social studies. His art not only captures social studies themes relevant to the Dutch Golden Age, but also offers a description of human relations transcending temporal and spatial frontiers. Rembrandt is an imaginative storyteller with a keen insight for minute…

  14. The gas heterogeneous flows cleaning technology from corona discharge field

    NASA Astrophysics Data System (ADS)

    Bogdanov, A.; Tokarev, A.; Judanov, V.; Vinogradov, V.

    2017-11-01

The capture and extraction of nanogold from the combustion products of Kara-Keche coal is described, covering the process as a whole: preparation of the coal for the experiments, introduction of nanogold into its composition, the temperature and time parameters of combustion, the design and function of the experimental apparatus, purification of the gas flow, and recovery of the combustion products (condensate, coke, ash, rags).

  15. 40 CFR 63.4920 - What reports must I submit?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... control device and were diverted to the atmosphere), the semiannual compliance report must contain the... capture systems and add-on control devices, using Equation 1 of § 63.4961, and Equation 3 of § 63.4961 for... malfunction started and stopped. (vii) A brief description of the CPMS. (viii) The date of the latest CPMS...

  16. The California All-sky Meteor Surveillance (CAMS) System

    NASA Astrophysics Data System (ADS)

    Gural, P. S.

    2011-01-01

    A unique next generation multi-camera, multi-site video meteor system is being developed and deployed in California to provide high accuracy orbits of simultaneously captured meteors. Included herein is a description of the goals, concept of operations, hardware, and software development progress. An appendix contains a meteor camera performance trade study made for video systems circa 2010.

  17. Preferred mental models in reasoning about spatial relations.

    PubMed

    Jahn, Georg; Knauff, Markus; Johnson-Laird, P N

    2007-12-01

    The theory of mental models postulates that individuals infer that a spatial description is consistent only if they can construct a model in which all the assertions in the description are true. Individuals prefer a parsimonious representation, and so, when a description is consistent with more than one possible layout of entities on the left-right dimension, individuals in our culture prefer to construct models working from left to right. They also prefer to locate entities referred to in the same assertion as adjacent to one another in a model. And, if possible, they tend to chunk entities into a single unit in order to capture several possibilities in a single model. We report four experiments corroborating these predictions. The results shed light on the integration of relational assertions, and they show that participants exploit implicit constraints in building models of spatial relations.

  18. Self-development of visual space perception by learning from the hand

    NASA Astrophysics Data System (ADS)

    Chung, Jae-Moon; Ohnishi, Noboru

    1998-10-01

Animals are thought to gradually develop, from birth and without an external supervisor, the ability to interpret the images captured on their retinas. We think that this visual function is acquired together with the development of hand reaching and grasping operations, which are executed through active interaction with the environment. From the viewpoint that the hand teaches the eye, this paper shows how visual space perception develops in a simulated robot. The robot has a simplified human-like structure used for hand-eye coordination. The experimental results may help validate this description of how the visual space perception of biological systems develops. In addition, the description suggests a way for an intelligent robot to self-calibrate its vision in a learning-by-doing manner, without external supervision.

  19. The Aluminum Smelting Process and Innovative Alternative Technologies

    PubMed Central

    Drabløs, Per Arne

    2014-01-01

    Objective: The industrial aluminum production process is addressed. The purpose is to give a short but comprehensive description of the electrolysis cell technology, the raw materials used, and the health and safety relevance of the process. Methods: This article is based on a study of the extensive chemical and medical literature on primary aluminum production. Results: At present, there are two main technological challenges for the process—to reduce energy consumption and to mitigate greenhouse gas emissions. A future step may be carbon dioxide gas capture and sequestration related to the electric power generation from fossil sources. Conclusions: Workers' health and safety have now become an integrated part of the aluminum business. Work-related injuries and illnesses are preventable, and the ultimate goal to eliminate accidents with lost-time injuries may hopefully be approached in the future. PMID:24806723

  20. Particle Capture Devices and Methods of Use Thereof

    NASA Technical Reports Server (NTRS)

    Voldman, Joel (Inventor); Skelley, Alison M. (Inventor); Kirak, Oktay (Inventor); Jaenisch, Rudolf (Inventor)

    2015-01-01

The present invention provides a device, and methods of use thereof, for microscale particle capture and particle pairing. The invention provides a particle patterning device that mechanically traps individual particles within the first chambers of capture units, transfers the particles to the second chambers of opposing capture units, and traps a second type of particle in the same second chamber. The device and methods allow for high-yield assaying of trapped cells, high-yield fusion of trapped, paired cells, controlled binding of particles to cells, and specific chemical reactions between particle interfaces and particle contents. The device and methods provide a means of identifying the particle population and a facile route to particle collection.

  1. A method of studying wild bird populations by mist-netting and banding

    USGS Publications Warehouse

    Stamm, D.D.; Davis, D.E.; Robbins, C.S.

    1960-01-01

1. Progress is reported toward development of a method of bird-population study based on mist-netting and banding. A definite pattern of arrangement and schedule of operation are presented. 2. Nets were operated for a total of 4200 net-hours during which 966 captures were made (23.0 birds per 100 net-hours). A total of 431 adult breeding birds were banded and 38 per cent of them were recaptured. 3. A breeding bird census was made simultaneously in the same area by the Williams spot-mapping technique. 4. Estimates of population by recapture agreed closely with the spot-mapping census. 5. Some birds are demonstrated to have overlapping home-ranges much larger than their singing territories. 6. Recruitment and net-shyness distort recapture estimates of population, but the method allows detection and assessment of their influence in the population dealt with here. 7. The method produced integrated information on population density and dynamics, movement and behavior. 8. The procedure is especially well adapted to studies of disease agents in bird populations. 9. A simple scheme for description of the habitat in terms of relative abundance and frequency of occurrence of tree species was used.
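The effort-standardized rate in item 2, and the kind of recapture-based population estimate mentioned in item 4, can be sketched as follows. The Lincoln-Petersen formula shown here is a standard two-sample estimator of this family, not necessarily the exact procedure used in the 1960 study:

```python
def captures_per_100_net_hours(captures, net_hours):
    # Standardized netting effort: captures per 100 net-hours.
    return 100.0 * captures / net_hours

# Figures reported in the abstract: 966 captures over 4200 net-hours.
rate = captures_per_100_net_hours(966, 4200)
print(round(rate, 1))  # 23.0, matching the reported rate

def lincoln_petersen(n1, n2, m2):
    """Two-sample capture-recapture estimate of population size:
    n1 birds banded in the first sample, n2 caught in the second,
    m2 of which carry bands; N is estimated as n1 * n2 / m2."""
    return n1 * n2 / m2
```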

  2. Standardized description of scientific evidence using the Evidence Ontology (ECO)

    PubMed Central

    Chibucos, Marcus C.; Mungall, Christopher J.; Balakrishnan, Rama; Christie, Karen R.; Huntley, Rachael P.; White, Owen; Blake, Judith A.; Lewis, Suzanna E.; Giglio, Michelle

    2014-01-01

    The Evidence Ontology (ECO) is a structured, controlled vocabulary for capturing evidence in biological research. ECO includes diverse terms for categorizing evidence that supports annotation assertions including experimental types, computational methods, author statements and curator inferences. Using ECO, annotation assertions can be distinguished according to the evidence they are based on such as those made by curators versus those automatically computed or those made via high-throughput data review versus single test experiments. Originally created for capturing evidence associated with Gene Ontology annotations, ECO is now used in other capacities by many additional annotation resources including UniProt, Mouse Genome Informatics, Saccharomyces Genome Database, PomBase, the Protein Information Resource and others. Information on the development and use of ECO can be found at http://evidenceontology.org. The ontology is freely available under Creative Commons license (CC BY-SA 3.0), and can be downloaded in both Open Biological Ontologies and Web Ontology Language formats at http://code.google.com/p/evidenceontology. Also at this site is a tracker for user submission of term requests and questions. ECO remains under active development in response to user-requested terms and in collaborations with other ontologies and database resources. Database URL: Evidence Ontology Web site: http://evidenceontology.org PMID:25052702

  3. Application of the laser capture microdissection technique for molecular definition of skeletal cell differentiation in vivo.

    PubMed

    Benayahu, Dafna; Socher, Rina; Shur, Irena

    2008-01-01

    Laser capture microdissection (LCM) method allows selection of individual or clustered cells from intact tissues. This technology enables one to pick cells from tissues that are difficult to study individually, sort the anatomical complexity of these tissues, and make the cells available for molecular analyses. Following the cells' extraction, the nucleic acids and proteins can be isolated and used for multiple applications that provide an opportunity to uncover the molecular control of cellular fate in the natural microenvironment. Utilization of LCM for the molecular analysis of cells from skeletal tissues will enable one to study differential patterns of gene expression in the native intact skeletal tissue with reliable interpretation of function for known genes as well as to discover novel genes. Variability between samples may be caused either by differences in the tissue samples (different areas isolated from the same section) or some variances in sample handling. LCM is a multi-task technology that combines histology, microscopy work, and dedicated molecular biology. The LCM application will provide results that will pave the way toward high throughput profiling of tissue-specific gene expression using Gene Chip arrays. Detailed description of in vivo molecular pathways will make it possible to elaborate on control systems to apply for the repair of genetic or metabolic diseases of skeletal tissues.

  4. Cretaceous origin of the unique prey-capture apparatus in mega-diverse genus: stem lineage of Steninae rove beetles discovered in Burmese amber

    PubMed Central

    Żyła, Dagmara; Yamamoto, Shûhei; Wolf-Schwenninger, Karin; Solodovnikov, Alexey

    2017-01-01

    Stenus is the largest genus of rove beetles and the second largest among animals. Its evolutionary success was associated with the adhesive labial prey-capture apparatus, a unique apomorphy of that genus. Definite Stenus with prey-capture apparatus are known from the Cenozoic fossils, while the age and early evolution of Steninae was hardly ever hypothesized. Our study of several Cretaceous Burmese amber inclusions revealed a stem lineage of Steninae that possibly possesses the Stenus-like prey-capture apparatus. Phylogenetic analysis of extinct and extant taxa of Steninae and putatively allied subfamilies of Staphylinidae with parsimony and Bayesian approaches resolved the Burmese amber lineage as a member of Steninae. It justified the description of a new extinct stenine genus Festenus with two new species, F. robustus and F. gracilis. The Late Cretaceous age of Festenus suggests an early origin of prey-capture apparatus in Steninae that, perhaps, drove the evolution towards the crown Stenus. Our analysis confirmed the well-established sister relationships between Steninae and Euaesthetinae and resolved Scydmaeninae as their next closest relative, the latter having no stable position in recent phylogenetic studies of rove beetles. Close affiliation of Megalopsidiinae, a subfamily often considered as a sister group to Euaesthetinae + Steninae clade, is rejected. PMID:28397786

  5. Simulation of aerosolized oil droplets capture in a range hood exhaust using coupled CFD-population balance method

    NASA Astrophysics Data System (ADS)

    Liu, Shuyuan; Zhang, Yong; Feng, Yu; Shi, Changbin; Cao, Yong; Yuan, Wei

    2018-02-01

A population balance sectional method (PBSM) coupled with computational fluid dynamics (CFD) is presented to simulate the capture of aerosolized oil droplets (AODs) in a range hood exhaust. The homogeneous nucleation and coagulation processes are modeled and simulated with this CFD-PBSM method. As the design angle α of the range hood exhaust decreases from 60° to 30°, AOD capture increases, while the pressure drop between the inlet and the outlet of the range hood increases from 8.38 Pa to 175.75 Pa. Increasing inlet flow velocities result in less AOD capture, although total suction increases due to the higher flow rates through the range hood. The CFD-PBSM method therefore provides insight into the formation and capture of AODs as well as their impact on the operation and design of the range hood exhaust.

  6. Baited lines: An active nondestructive collection method for burrowing crayfish

    USGS Publications Warehouse

    Loughman, Zachary J.; Foltz, David A.; Welsh, Stuart A.

    2013-01-01

A new method (baited lines) is described for the collection of burrowing crayfishes, where fishing hooks baited with earthworms and tied to monofilament leaders are used to lure crayfishes from their burrow entrances. We estimated capture rates using baited lines at four locations across West Virginia for a total of four crayfish taxa; the taxa studied were orange, blue, and blue/orange morphs of Cambarus dubius (Upland Burrowing Crayfish), and C. thomai (Little Brown Mudbug). Baited-line capture rates were lowest for C. thomai (81%; n = 21 attempts) and highest for the orange morph of C. dubius (99%; n = 13 attempts). The pooled capture rate across all taxa was 91.5% (n = 50 attempts). Baited lines represent an environmentally nondestructive method to capture burrowing crayfishes without harm to individuals, and without disturbing burrows or the surrounding area. This novel method allows for repeat captures and long-term studies, providing a useful sampling method for ecological studies of burrowing crayfishes.
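Pooled capture rates of the kind reported above are simply total successes over total attempts across taxa. The per-taxon counts below are hypothetical, chosen only to illustrate the pooling, not the study's raw data:

```python
def pooled_capture_rate(per_taxon):
    """per_taxon: list of (successful_captures, attempts) pairs.
    The pooled rate weights each taxon by its number of attempts."""
    successes = sum(s for s, _ in per_taxon)
    attempts = sum(a for _, a in per_taxon)
    return 100.0 * successes / attempts

# Hypothetical counts for four taxa:
print(round(pooled_capture_rate([(17, 21), (13, 13), (8, 8), (8, 8)]), 1))
```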

  7. Effect of capture stress on plasma enzyme activities in rainbow trout (Salmo gairdneri)

    USGS Publications Warehouse

    Bouck, G.R.; Cairns, M. A.; Christian, A. R.

    1978-01-01

    Four capture methods were used to collect domesticated rainbow trout (Salmo gairdneri): angling, electroshocking, seining, and direct netting (control). Blood was sampled rapidly upon capture, usually within 2 min. No significant differences were noted within the time frame of the experiment between the four capture groups for plasma protein concentration, lactate dehydrogenase activity, or leucine aminonaphthylamidase activity. Creatine phosphokinase activity was elevated among electroshocked fish. Acid phosphatase activity was too low for accurate measurement. Hematocrits were significantly elevated by capture struggles. These results indicate that these capture methods do not preclude the use of plasma enzyme levels for investigating the health of wild fish. Key words: plasma enzyme, capture stress, physiology, plasma protein, rainbow trout, lactate dehydrogenase, leucine aminonaphthylamidase, creatine phosphokinase

  8. The Accuracy of Shock Capturing in Two Spatial Dimensions

    NASA Technical Reports Server (NTRS)

    Carpenter, Mark H.; Casper, Jay H.

    1997-01-01

    An assessment of the accuracy of shock capturing schemes is made for two-dimensional steady flow around a cylindrical projectile. Both a linear fourth-order method and a nonlinear third-order method are used in this study. It is shown, contrary to conventional wisdom, that captured two-dimensional shocks are asymptotically first-order, regardless of the design accuracy of the numerical method. The practical implications of this finding are discussed in the context of the efficacy of high-order numerical methods for discontinuous flows.
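The asymptotic order discussed above is typically measured by comparing errors on successively refined grids; a minimal sketch of that observed-order computation, with illustrative error values:

```python
import math

def observed_order(err_coarse, err_fine, refinement_ratio=2.0):
    """Observed order of accuracy p from errors on two grids related
    by a refinement ratio r: since the error scales like h**p,
    p = log(err_coarse / err_fine) / log(r)."""
    return math.log(err_coarse / err_fine) / math.log(refinement_ratio)

# If halving the mesh only halves the error, the scheme is observed
# to be first order, whatever its design order:
print(round(observed_order(1.0e-2, 5.0e-3), 2))  # 1.0
# A genuinely fourth-order result would cut the error by a factor of 16:
print(round(observed_order(1.6e-3, 1.0e-4), 2))  # 4.0
```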

  9. Motion capture for human motion measuring by using single camera with triangle markers

    NASA Astrophysics Data System (ADS)

    Takahashi, Hidenori; Tanaka, Takayuki; Kaneko, Shun'ichi

    2005-12-01

This study aims to realize motion capture for measuring 3D human motion with a single camera. Although motion capture using multiple cameras is widely applied in sports, medical, and engineering fields, an optical motion capture method with one camera has not been established. In this paper, the authors achieve 3D motion capture using one camera, named Mono-MoCap (MMC), on the basis of two calibration methods and triangle markers in which the length of each side is known. The camera calibration methods produce 3D coordinate transformation parameters and a lens distortion parameter with the Modified DLT method. The triangle markers make it possible to calculate a coordinate value in the depth direction of the camera coordinate system. Experiments on 3D position measurement with the MMC in a cubic measurement space 2 m on each side showed that the average error in measuring the center of gravity of a triangle marker was less than 2 mm. Compared with conventional motion capture using multiple cameras, the MMC thus has sufficient accuracy for 3D measurement. Also, by putting a triangle marker on each human joint, the MMC was able to capture a walking motion, a standing-up motion, and a bending and stretching motion. In addition, a method using a triangle marker together with conventional spherical markers was proposed. Finally, a method to estimate the position of a marker from its measured velocity was proposed in order to improve the accuracy of the MMC.
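The depth recovery enabled by a marker of known size can be sketched with a pinhole-camera model. This is a simplified illustration of the idea, not the paper's actual formulation, and the focal length and marker dimensions below are made up:

```python
def depth_from_known_edge(focal_length_px, edge_length_m, imaged_length_px):
    """Pinhole projection: an edge of true length L at depth Z images
    to l = f * L / Z pixels, so Z = f * L / l. A triangle marker with
    known side lengths therefore fixes the depth coordinate that a
    single camera otherwise cannot observe."""
    return focal_length_px * edge_length_m / imaged_length_px

# Hypothetical camera (f = 800 px) viewing a 10 cm marker edge that
# spans 40 px in the image:
print(depth_from_known_edge(800.0, 0.10, 40.0))  # 2.0 (meters)
```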

  10. Spatial capture-recapture

    USGS Publications Warehouse

    Royle, J. Andrew; Chandler, Richard B.; Sollmann, Rahel; Gardner, Beth

    2013-01-01

    Spatial Capture-Recapture provides a revolutionary extension of traditional capture-recapture methods for studying animal populations using data from live trapping, camera trapping, DNA sampling, acoustic sampling, and related field methods. This book is a conceptual and methodological synthesis of spatial capture-recapture modeling. As a comprehensive how-to manual, this reference contains detailed examples of a wide range of relevant spatial capture-recapture models for inference about population size and spatial and temporal variation in demographic parameters. Practicing field biologists studying animal populations will find this book to be a useful resource, as will graduate students and professionals in ecology, conservation biology, and fisheries and wildlife management.
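One core ingredient of the spatial capture-recapture models the book covers is a detection function that decays with distance from an animal's activity center. A commonly used half-normal form, with illustrative parameter values, can be sketched as:

```python
import math

def half_normal_detection(distance, g0=0.8, sigma=25.0):
    """Half-normal detection function common in spatial
    capture-recapture models: g0 is the detection probability at
    distance zero, and sigma controls how quickly detection falls
    off with trap-to-activity-center distance (same units as
    `distance`, e.g. meters). Parameter values here are illustrative."""
    return g0 * math.exp(-distance**2 / (2.0 * sigma**2))

# Detection probability drops smoothly with distance:
for d in (0.0, 25.0, 50.0):
    print(round(half_normal_detection(d), 3))
```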

  11. Interplay between strong correlation and adsorption distances: Co on Cu(001)

    NASA Astrophysics Data System (ADS)

    Bahlke, Marc Philipp; Karolak, Michael; Herrmann, Carmen

    2018-01-01

Adsorbed transition metal atoms can have partially filled d or f shells due to strong on-site Coulomb interaction. Capturing all effects originating from electron correlation in such strongly correlated systems is a challenge for electronic structure methods. It requires a sufficiently accurate description of the atomistic structure (in particular bond distances and angles), which is usually obtained from first-principles Kohn-Sham density functional theory (DFT), which due to the approximate nature of the exchange-correlation functional may provide an unreliable description of strongly correlated systems. To elucidate the consequences of this popular procedure, we apply a combination of DFT with the Anderson impurity model (AIM), as well as DFT+U, for a calculation of the potential energy surface along the Co/Cu(001) adsorption coordinate, and compare the results with those obtained from DFT. The adsorption minimum is shifted towards larger distances by applying DFT+AIM, or the much cheaper DFT+U method, compared to the corresponding spin-polarized DFT results, by a magnitude comparable to variations between different approximate exchange-correlation functionals (0.08 to 0.12 Å). This shift originates from an increasing correlation energy at larger adsorption distances, which can be traced back to the Co 3d_xy and 3d_z2 orbitals being more correlated as the adsorption distance is increased. We can show that such considerations are important, as they may strongly affect electronic properties such as the Kondo temperature.

  12. Sustaining innovations in complex healthcare environments: A multiple-case study of rapid response teams

    PubMed Central

    Stolldorf, Deonni P; Havens, Donna S.; Jones, Cheryl B

    2015-01-01

    Objectives Rapid response teams are one innovation previously deployed in U.S. hospitals with the goal to improve the quality of care. Sustaining rapid response teams is important to achieve the desired implementation outcomes, reduce the risk of program investments losses, and prevent employee disillusionment and dissatisfaction. This study sought to examine factors that do and do not support the sustainability of Rapid Response Teams. Methods The study was conceptually guided by an adapted version of the Planning Model of Sustainability. A multiple-case study was conducted using a purposive sample of two hospitals with high RRT sustainability scores and two hospitals with low RRT sustainability scores. Data collection methods included: (a) a hospital questionnaire that was completed by a nurse administrator at each hospital; (b) semi-structured interviews with leaders, RRT members, and those activating RRT calls; and, (c) review of internal documents. Quantitative data were analyzed using descriptive statistics; qualitative data were analyzed using content analysis. Results Few descriptive differences were found between hospitals. However, there were notable differences in the operationalization of certain factors between high- and low-sustainability hospitals. Additional sustainability factors other than those captured by the Planning Model of Sustainability were also identified. Conclusions The sustainability of rapid response teams is optimized through effective operationalization of organizational and project design and implementation factors. Two additional factors—individual and team characteristics—should be included in the Planning Model of Sustainability and considered as potential facilitators (or inhibitors) of RRT sustainability. PMID:26756725

  13. Value Added: the Case for Point-of-View Camera use in Orthopedic Surgical Education.

    PubMed

    Karam, Matthew D; Thomas, Geb W; Taylor, Leah; Liu, Xiaoxing; Anthony, Chris A; Anderson, Donald D

    2016-01-01

    Orthopedic surgical education is evolving as educators search for new ways to enhance surgical skills training. Orthopedic educators should seek new methods and technologies to augment and add value to real-time orthopedic surgical experience. This paper describes a protocol whereby we have started to capture and evaluate specific orthopedic milestone procedures with a GoPro® point-of-view video camera and a dedicated video reviewing website as a way of supplementing the current paradigm in surgical skills training. We report our experience regarding the details and feasibility of this protocol. Upon identification of a patient undergoing surgical fixation of a hip or ankle fracture, an orthopedic resident places a GoPro® point-of-view camera on his or her forehead. All fluoroscopic images acquired during the case are saved and later incorporated into a video on the reviewing website. Surgical videos are uploaded to a secure server and are accessible for later review and assessment via a custom-built website. An electronic survey of resident participants was performed utilizing Qualtrics software. Results are reported using descriptive statistics. A total of 51 surgical videos involving 23 different residents have been captured to date. This includes 20 intertrochanteric hip fracture cases and 31 ankle fracture cases. The average duration of each surgical video was 1 hour and 16 minutes (range 40 minutes to 2 hours and 19 minutes). Of 24 orthopedic resident surgeons surveyed, 88% thought capturing a video portfolio of orthopedic milestones would benefit their education. There is a growing demand in orthopedic surgical education to extract more value from each surgical experience. 
While further work in the development and refinement of such assessments is necessary, we feel that intraoperative video, particularly when captured and presented in a non-threatening, user-friendly manner, can add significant value to the present and future paradigm of orthopedic surgical skills training.

  14. Decoy trapping and rocket-netting for northern pintails in spring

    USGS Publications Warehouse

    Grand, James B.; Fondell, Thomas F.

    1994-01-01

Decoy traps and rocket-nets were compared for capturing Northern Pintails (Anas acuta; hereafter pintails) during May 1991 on the Yukon Flats, Alaska. Males were captured at similar rates using both methods (1.38 vs. 1.07 males/trap-day, respectively), but baited rocket-nets were more efficient than decoy traps for capturing females (0.52 vs. 0.12 females/trap-day). There were no significant differences in the masses of pintails captured by each method.

  15. Comparative study on antibody immobilization strategies for efficient circulating tumor cell capture.

    PubMed

    Ates, Hatice Ceren; Ozgur, Ebru; Kulah, Haluk

    2018-03-23

Methods for the isolation and quantification of circulating tumor cells (CTCs) are attracting more attention every day, as the data for their unprecedented clinical utility continue to grow. However, the challenge is that CTCs are extremely rare (as low as 1 in a billion blood cells), and a highly sensitive and specific technology is required to isolate CTCs from blood cells. Methods utilizing microfluidic systems for immunoaffinity-based CTC capture are preferred, especially when purity is the prime requirement. However, the antibody immobilization strategy significantly affects the efficiency of such systems. In this study, two covalent and two bioaffinity antibody immobilization methods were assessed with respect to their CTC capture efficiency and selectivity, using an anti-epithelial cell adhesion molecule (EpCAM) antibody as the capture antibody. Surface functionalization was realized on plain SiO2 surfaces, as well as in microfluidic channels. Surfaces functionalized with the different antibody immobilization methods were physically and chemically characterized at each step of functionalization. MCF-7 breast cancer and CCRF-CEM acute lymphoblastic leukemia cell lines were used as EpCAM-positive and EpCAM-negative cell models, respectively, to assess CTC capture efficiency and selectivity. Comparisons reveal that bioaffinity-based antibody immobilization involving streptavidin attachment with a glutaraldehyde linker gave the highest cell capture efficiency. On the other hand, a covalent antibody immobilization method involving direct antibody binding by N-(3-dimethylaminopropyl)-N'-ethylcarbodiimide hydrochloride (EDC)-N-hydroxysuccinimide (NHS) reaction was found to be more time- and cost-efficient, with a similar cell capture efficiency. All methods provided very high selectivity for CTCs with EpCAM expression. It was also demonstrated that antibody immobilization via the EDC-NHS reaction in a microfluidic channel leads to high capture efficiency and selectivity.

  16. Structures for capturing CO2, methods of making the structures, and methods of capturing CO2

    DOEpatents

    Jones, Christopher W; Hicks, Jason C; Fauth, Daniel J; McMahan, Gray

    2012-10-30

Briefly described, embodiments of this disclosure, among others, include carbon dioxide (CO2) sorption structures, methods of making CO2 sorption structures, and methods of using CO2 sorption structures.

  17. A New Browser-based, Ontology-driven Tool for Generating Standardized, Deep Descriptions of Geoscience Models

    NASA Astrophysics Data System (ADS)

    Peckham, S. D.; Kelbert, A.; Rudan, S.; Stoica, M.

    2016-12-01

Standardized metadata for models is the key to reliable and greatly simplified coupling in model coupling frameworks like CSDMS (Community Surface Dynamics Modeling System). This model metadata also helps model users to understand the important details that underpin computational models and to compare the capabilities of different models. These details include simplifying assumptions on the physics, governing equations and the numerical methods used to solve them, discretization of space (the grid) and time (the time-stepping scheme), state variables (input or output), and model configuration parameters. This kind of metadata provides a "deep description" of a computational model that goes well beyond other types of metadata (e.g. author, purpose, scientific domain, programming language, digital rights, provenance, execution) and captures the science that underpins a model. While having this kind of standardized metadata for each model in a repository opens up a wide range of exciting possibilities, it is difficult to collect this information, and a carefully conceived "data model" or schema is needed to store it. Automated harvesting and scraping methods can provide some useful information, but they often result in metadata that is inaccurate or incomplete, and this is not sufficient to enable the desired capabilities. To address this problem, we have developed a browser-based tool called the MCM Tool (Model Component Metadata) which runs on notebooks, tablets, and smartphones. This tool was partially inspired by the TurboTax software, which greatly simplifies the necessary task of preparing tax documents. It allows a model developer or advanced user to provide a standardized, deep description of a computational geoscience model, including hydrologic models. Under the hood, the tool uses a new ontology for models built on the CSDMS Standard Names, expressed as a collection of RDF files (Resource Description Framework). 
This ontology is based on core concepts such as variables, objects, quantities, operations, processes and assumptions. The purpose of this talk is to present details of the new ontology and to then demonstrate the MCM Tool for several hydrologic models.

  18. A Novel Method to Compute Breathing Volumes via Motion Capture Systems: Design and Experimental Trials.

    PubMed

    Massaroni, Carlo; Cassetta, Eugenio; Silvestri, Sergio

    2017-10-01

Respiratory assessment can be carried out using motion capture systems. A geometrical model is mandatory in order to compute the breathing volume as a function of time from the markers' trajectories. This study describes a novel model to compute volume changes and calculate respiratory parameters using a motion capture system. The novel method, i.e., the prism-based method, computes the volume enclosed within the chest by defining 82 prisms from the 89 markers attached to the subject's chest. Volumes computed with this method are compared to spirometry volumes and to volumes computed by a conventional method based on tetrahedral decomposition of the chest wall and integrated in a commercial motion capture system. Eight healthy volunteers were enrolled and 30 seconds of quiet breathing data were collected from each of them. Results show a better agreement between volumes computed by the prism-based method and spirometry (discrepancy of 2.23%, R² = 0.94) compared to the agreement between volumes computed by the conventional method and spirometry (discrepancy of 3.56%, R² = 0.92). The proposed method also showed better performance in the calculation of respiratory parameters. Our findings open up prospects for the further use of the new method in breathing assessment via motion capture systems.
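The abstract does not specify how each prism volume is computed; a minimal sketch of the general idea (vertical prisms between chest-surface triangles and a flat reference plane, with hypothetical marker coordinates) might look like:

```python
import numpy as np

def prism_volume(tri_xyz, z0=0.0):
    """Volume of the vertical prism between one chest-surface triangle
    and a flat reference (back) plane z = z0.

    tri_xyz: (3, 3) array of marker coordinates (x, y, z).
    The prism volume is the triangle's projected (x, y) area times the
    mean height of its vertices above the reference plane.
    """
    (x1, y1, z1), (x2, y2, z2), (x3, y3, z3) = tri_xyz
    area_xy = 0.5 * abs((x2 - x1) * (y3 - y1) - (x3 - x1) * (y2 - y1))
    mean_h = (z1 + z2 + z3) / 3.0 - z0
    return area_xy * mean_h

def chest_volume(markers, triangles):
    """Sum prism volumes over a triangulation of the marker grid."""
    return sum(prism_volume(markers[list(t)]) for t in triangles)

# Toy check: a unit square surface at height 1, split into 2 triangles,
# encloses a unit volume above the z = 0 reference plane.
markers = np.array([[0, 0, 1], [1, 0, 1], [1, 1, 1], [0, 1, 1]], float)
tris = [(0, 1, 2), (0, 2, 3)]
print(chest_volume(markers, tris))  # 1.0
```

Tracking the summed volume over time then yields a breathing-volume signal comparable to spirometry.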

  19. Implementation of a Learning Design Run-Time Environment for the .LRN Learning Management System

    ERIC Educational Resources Information Center

    del Cid, Jose Pablo Escobedo; de la Fuente Valentin, Luis; Gutierrez, Sergio; Pardo, Abelardo; Kloos, Carlos Delgado

    2007-01-01

The IMS Learning Design specification aims at capturing the complete learning flow of courses, without being restricted to a particular pedagogical model. Such a flow description for a course, called a Unit of Learning, must be able to be reproduced in different systems using a so-called run-time environment. In the last few years there has been…

  20. Clinical report: Detection and management of bovine viral diarrhea virus Type 1b in a large dairy herd

    USDA-ARS?s Scientific Manuscript database

Case Description: 1,081 newborn calves from a commercial dairy were tested for bovine viral diarrhea virus antigen by pooled RT-PCR as part of a screening program. Ear tissue from twenty-six calves initially tested positive, and 14 were confirmed positive by antigen-capture ELISA two weeks later (1.3...

  1. Recent European Developments in Helicopters

    NASA Technical Reports Server (NTRS)

    1921-01-01

Descriptions are given of two captured helicopters, one driven by electric power, the other by a gasoline engine. An account is given of flight tests of the gasoline-powered vehicle. After 15 successful flight tests, the gasoline-powered vehicle crashed due to insufficient thrust. Also discussed here are the applications of helicopters for military observation, for meteorological work, and for carrying radio antennas.

  2. 40 CFR 63.3920 - What reports must I submit?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... limitation (including any periods when emissions bypassed the add-on control device and were diverted to the... month by emission capture systems and add-on control devices using Equations 1 and 1A through 1D of § 63... description of the CPMS. (v) The date of the latest CPMS certification or audit. (vi) The date and time that...

  3. Feasibility of automated speech sample collection with stuttering children using interactive voice response (IVR) technology.

    PubMed

    Vogel, Adam P; Block, Susan; Kefalianos, Elaina; Onslow, Mark; Eadie, Patricia; Barth, Ben; Conway, Laura; Mundt, James C; Reilly, Sheena

    2015-04-01

To investigate the feasibility of adopting automated interactive voice response (IVR) technology for remotely capturing standardized speech samples from stuttering children. Participants were ten 6-year-old stuttering children. Their parents called a toll-free number from their homes and were prompted to elicit speech from their children using a standard protocol involving conversation, picture description, and games. The automated IVR system was implemented using an off-the-shelf telephony software program and delivered by a standard desktop computer. The software infrastructure utilizes voice over internet protocol. Speech samples were automatically recorded during the calls. Video recordings were simultaneously acquired in the home at the time of the call to evaluate the fidelity of the telephone-collected samples. Key outcome measures included syllables spoken, percentage of syllables stuttered, and an overall rating of stuttering severity using a 10-point scale. Data revealed a high level of relative reliability, in terms of intra-class correlation between the video- and telephone-acquired samples, on all outcome measures during the conversation task. Findings were less consistent for speech samples during picture description and games. Results suggest that IVR technology can be used successfully to automate remote capture of child speech samples.

  4. Differential Estimates of Southern Flying Squirrel (Glaucomys volans) Population Structure Based on Capture Method.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Laves, Kevin S.; Loeb, Susan C.

    2006-01-01

It is commonly assumed that population estimates derived from trapping small mammals are accurate and unbiased, or that estimates derived from different capture methods are comparable. We captured southern flying squirrels (Glaucomys volans) using two methods to study their effect on red-cockaded woodpecker (Picoides borealis) reproductive success. Southern flying squirrels were captured at, and removed from, 30 red-cockaded woodpecker cluster sites during March to July 1994 and 1995 using Sherman traps placed in a grid encompassing a red-cockaded woodpecker nest tree, and by hand from red-cockaded woodpecker cavities. Totals of 195 (1994) and 190 (1995) red-cockaded woodpecker cavities were examined at least three times each year. Trappability of southern flying squirrels in Sherman traps was significantly greater in 1995 (1.18%; 22,384 trap nights) than in 1994 (0.42%; 20,384 trap nights), and the capture rate of southern flying squirrels in cavities was significantly greater in 1994 (22.7%; 502 cavity inspections) than in 1995 (10.8%; 555 cavity inspections). However, more southern flying squirrels were captured per cavity inspection than per Sherman trap night in both years. Male southern flying squirrels were more likely to be captured from cavities than in Sherman traps in 1994, but not in 1995. Both male and female juveniles were more likely to be captured in cavities than in traps in both years. In 1994, males in reproductive condition were more likely to be captured in cavities than in traps, and in 1995 we captured significantly more reproductive females in cavities than in traps. Our data suggest that population estimates based solely on one trapping method may not represent the true population size or structure of southern flying squirrels.
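Significance claims like the year-to-year difference in Sherman-trap trappability (1.18% vs. 0.42% of trap nights) can be checked with a standard two-proportion z-test. A minimal sketch (Python); the capture counts are reconstructed from the reported rates rather than taken from the paper, so treat them as an assumption:

```python
from math import sqrt, erf

def two_proportion_z(x1, n1, x2, n2):
    """Two-proportion z-test: compare capture rates from two trapping
    efforts (x captures out of n trap nights each)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Counts reconstructed from the reported rates (an assumption):
# 0.42% of 20,384 trap nights in 1994; 1.18% of 22,384 in 1995.
x94 = round(0.0042 * 20384)   # about 86 captures
x95 = round(0.0118 * 22384)   # about 264 captures
z, p = two_proportion_z(x94, 20384, x95, 22384)
print(f"z = {z:.1f}, p = {p:.1g}")
```

With these reconstructed counts the difference is many standard errors wide, consistent with the paper's "significantly greater" finding.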

  5. 40 CFR 63.9322 - How do I determine the emission capture system efficiency?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... capture system efficiency? 63.9322 Section 63.9322 Protection of Environment ENVIRONMENTAL PROTECTION... capture system efficiency? You must use the procedures and test methods in this section to determine capture efficiency as part of the performance test required by § 63.9310. (a) Assuming 100 percent capture...

  6. 40 CFR 63.9322 - How do I determine the emission capture system efficiency?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... capture system efficiency? 63.9322 Section 63.9322 Protection of Environment ENVIRONMENTAL PROTECTION... capture system efficiency? You must use the procedures and test methods in this section to determine capture efficiency as part of the performance test required by § 63.9310. (a) Assuming 100 percent capture...

  7. [Estimation with the capture-recapture method of the number of economic immigrants in Mallorca].

    PubMed

    Ramos Monserrat, M; March Cerdá, J C

    2002-05-15

To estimate the number of irregular economic immigrants in Mallorca, we used the capture-recapture method, an indirect method based on contrasting data from two or more sources. Data were obtained from the Delegación de Gobierno (police and immigration authority), Comisiones Obreras (labor union), and institutions that provide health-related services to immigrants. Individuals were identified by birth date and country of origin. The total number of economic immigrants estimated with this method was 39,392. According to the Delegación de Gobierno data, the number of regular immigrants on the date of our inquiry was 9,000. With the capture-recapture method, the number of irregular immigrants in Mallorca was therefore estimated at 30,000. The capture-recapture method can be useful for estimating the population of irregular immigrants in a given area at a given time, if sufficiently precise information on the identity of each individual can be obtained.
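With two overlapping lists of identified individuals, this kind of estimate reduces to the classic Lincoln-Petersen capture-recapture estimator. A minimal sketch (Python) with hypothetical counts, since the abstract does not report the per-source list sizes or their overlap:

```python
def lincoln_petersen(n1, n2, m):
    """Classic two-source capture-recapture estimate of total population
    size: n1 individuals on the first list, n2 on the second, and m
    matched on both (here, matching would be by birth date and country
    of origin)."""
    if m == 0:
        raise ValueError("need at least one individual matched on both lists")
    return n1 * n2 / m

def chapman(n1, n2, m):
    """Chapman's nearly unbiased variant, preferable when m is small."""
    return (n1 + 1) * (n2 + 1) / (m + 1) - 1

# Hypothetical list sizes (the abstract reports only the final estimate):
print(lincoln_petersen(200, 150, 50))   # 600.0
```

The intuition: if the second list recaptures a fraction m/n1 of the first, the first list must itself be that same fraction of the total population.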

  8. ESIP Information Quality Cluster (IQC)

    NASA Technical Reports Server (NTRS)

    Ramapriyan, H. K.; Peng, Ge; Moroni, David F.

    2016-01-01

The Information Quality Cluster (IQC) within the Federation of Earth Science Information Partners (ESIP) was initially formed in 2011 and has evolved significantly over time. The current objectives of the IQC are to: 1. Actively evaluate community data quality best practices and standards; 2. Improve the capture, description, discovery, and usability of information about data quality in Earth science data products; 3. Ensure that producers of data products are aware of standards and best practices for conveying data quality, and that data providers, distributors, and intermediaries establish, improve, and evolve mechanisms to assist users in discovering and understanding data quality information; and 4. Consistently provide guidance to data managers and stewards on how best to implement data quality standards and best practices to ensure and improve the maturity of their data products. The activities of the IQC include: 1. Identification of additional needs for consistently capturing, describing, and conveying quality information through use case studies with broad and diverse applications; 2. Establishing and providing community-wide guidance on the roles and responsibilities of key players and stakeholders, including users and management; 3. Prototyping the conveyance of quality information to users in a more consistent, transparent, and digestible manner; 4. Establishing a baseline of standards and best practices for data quality; 5. Evaluating recommendations from NASA's DQWG in a broader context and proposing possible implementations; and 6. Engaging data providers, data managers, and data user communities as resources to improve our standards and best practices. Following the principles of openness of the ESIP Federation, the IQC invites all individuals interested in improving the capture, description, discovery, and usability of information about data quality in Earth science data products to participate in its activities.

  9. High-Alpha Research Vehicle (HARV) longitudinal controller: Design, analyses, and simulation results

    NASA Technical Reports Server (NTRS)

    Ostroff, Aaron J.; Hoffler, Keith D.; Proffitt, Melissa S.; Brown, Philip W.; Phillips, Michael R.; Rivers, Robert A.; Messina, Michael D.; Carzoo, Susan W.; Bacon, Barton J.; Foster, John F.

    1994-01-01

This paper describes the design, analysis, and nonlinear simulation results (batch and piloted) for a longitudinal controller which is scheduled to be flight-tested on the High-Alpha Research Vehicle (HARV). The HARV is an F-18 airplane modified for and equipped with multi-axis thrust vectoring. The paper includes a description of the facilities, a detailed review of the feedback controller design, linear analysis results of the feedback controller, a description of the feed-forward controller design, nonlinear batch simulation results, and piloted simulation results. Batch simulation results include maximum pitch stick agility responses, angle-of-attack (alpha) captures, and alpha regulation for full lateral stick rolls at several alphas. Piloted simulation results include task descriptions for several types of maneuvers, task guidelines, the corresponding Cooper-Harper ratings from three test pilots, and some pilot comments. The ratings show that desirable criteria are achieved for almost all of the piloted simulation tasks.

  10. Space Shuttle Guidance, Navigation, and Rendezvous Knowledge Capture Reports. Revision 1

    NASA Technical Reports Server (NTRS)

    Goodman, John L.

    2011-01-01

This document is a catalog and reader's guide to lessons learned, experience, and technical history reports, as well as compilation volumes prepared by United Space Alliance personnel for the NASA/Johnson Space Center (JSC) Flight Dynamics Division. It is intended to make it easier for future generations of engineers to locate knowledge capture documentation from the Shuttle Program. The first chapter covers observations on documentation quality and research challenges encountered during the Space Shuttle and Orion programs. The second chapter covers the knowledge capture approach used to create many of the reports covered in this document. These chapters are intended to provide future flight programs with insight that could be used to formulate knowledge capture and management strategies. The following chapters contain descriptions of each knowledge capture report. The majority of the reports concern the Space Shuttle. Three are included that were written in support of the Orion Program. Most of the reports were written from the years 2001 to 2011. Lessons learned reports concern primarily the shuttle Global Positioning System (GPS) upgrade and the knowledge capture process. Experience reports on navigation and rendezvous provide examples of how challenges were overcome and how best practices were identified and applied. Some reports are of a more technical history nature covering navigation and rendezvous. They provide an overview of mission activities and the evolution of operations concepts and trajectory design. The lessons learned, experience, and history reports would be considered secondary sources by historians and archivists.

  11. Method of capturing or trapping zinc using zinc getter materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hunyadi Murph, Simona E.; Korinko, Paul S.

    2017-07-11

    A method of trapping or capturing zinc is disclosed. In particular, the method comprises a step of contacting a zinc vapor with a zinc getter material. The zinc getter material comprises nanoparticles and a metal substrate.

  12. Nonlocal and Mixed-Locality Multiscale Finite Element Methods

    DOE PAGES

    Costa, Timothy B.; Bond, Stephen D.; Littlewood, David J.

    2018-03-27

In many applications the resolution of small-scale heterogeneities remains a significant hurdle to robust and reliable predictive simulations. In particular, while material variability at the mesoscale plays a fundamental role in processes such as material failure, the resolution required to capture mechanisms at this scale is often computationally intractable. Multiscale methods aim to overcome this difficulty through judicious choice of a subscale problem and a robust manner of passing information between scales. One promising approach is the multiscale finite element method, which increases the fidelity of macroscale simulations by solving lower-scale problems that produce enriched multiscale basis functions. Here, in this study, we present the first work toward application of the multiscale finite element method to the nonlocal peridynamic theory of solid mechanics. This is achieved within the context of a discontinuous Galerkin framework that facilitates the description of material discontinuities and does not assume the existence of spatial derivatives. Analysis of the resulting nonlocal multiscale finite element method is achieved using the ambulant Galerkin method, developed here with sufficient generality to allow for application to multiscale finite element methods for both local and nonlocal models that satisfy minimal assumptions. Finally, we conclude with preliminary results on a mixed-locality multiscale finite element method in which a nonlocal model is applied at the fine scale and a local model at the coarse scale.
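The "lower-scale problems that produce enriched multiscale basis functions" can be illustrated in the simplest local 1D setting (a sketch of the general multiscale FEM idea, not the paper's peridynamic, discontinuous Galerkin formulation): on each coarse element one solves (a(x) u')' = 0 so the basis bends with the fine-scale coefficient, and the coarse problem then inherits the harmonic rather than arithmetic average of a(x).

```python
import numpy as np

def multiscale_basis(a_fine, H=1.0):
    """1D multiscale FEM basis on one coarse element [0, H]:
    solve (a(x) u')' = 0 with u(0) = 0, u(H) = 1 on the fine grid.
    The exact solution is u(x) = int_0^x dt/a / int_0^H dt/a, so the
    basis bends to resolve the fine-scale coefficient a(x)."""
    n = len(a_fine)
    h = H / n
    inv_cum = np.concatenate([[0.0], np.cumsum(h / a_fine)])
    return inv_cum / inv_cum[-1]

def effective_coefficient(a_fine):
    """Upscaled coefficient carried by the multiscale basis: the
    harmonic mean of a(x), not the arithmetic mean implied by a plain
    linear (single-scale) basis."""
    return len(a_fine) / np.sum(1.0 / a_fine)

# Rapidly alternating two-phase material: a = 1 and a = 100 in equal parts.
a = np.tile([1.0, 100.0], 50)
print(effective_coefficient(a))   # harmonic mean ~1.98, vs arithmetic mean 50.5
phi = multiscale_basis(a)         # piecewise basis, steep where a is small
```

The large gap between 1.98 and 50.5 is exactly the fidelity a single-scale coarse basis loses and a multiscale basis recovers.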

  13. Nonlocal and Mixed-Locality Multiscale Finite Element Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Costa, Timothy B.; Bond, Stephen D.; Littlewood, David J.

In many applications the resolution of small-scale heterogeneities remains a significant hurdle to robust and reliable predictive simulations. In particular, while material variability at the mesoscale plays a fundamental role in processes such as material failure, the resolution required to capture mechanisms at this scale is often computationally intractable. Multiscale methods aim to overcome this difficulty through judicious choice of a subscale problem and a robust manner of passing information between scales. One promising approach is the multiscale finite element method, which increases the fidelity of macroscale simulations by solving lower-scale problems that produce enriched multiscale basis functions. Here, in this study, we present the first work toward application of the multiscale finite element method to the nonlocal peridynamic theory of solid mechanics. This is achieved within the context of a discontinuous Galerkin framework that facilitates the description of material discontinuities and does not assume the existence of spatial derivatives. Analysis of the resulting nonlocal multiscale finite element method is achieved using the ambulant Galerkin method, developed here with sufficient generality to allow for application to multiscale finite element methods for both local and nonlocal models that satisfy minimal assumptions. Finally, we conclude with preliminary results on a mixed-locality multiscale finite element method in which a nonlocal model is applied at the fine scale and a local model at the coarse scale.

  14. Probability Theory Plus Noise: Descriptive Estimation and Inferential Judgment.

    PubMed

    Costello, Fintan; Watts, Paul

    2018-01-01

We describe a computational model of two central aspects of people's probabilistic reasoning: descriptive probability estimation and inferential probability judgment. This model assumes that people's reasoning follows standard frequentist probability theory, but it is subject to random noise. This random noise has a regressive effect in descriptive probability estimation, moving probability estimates away from normative probabilities and toward the center of the probability scale. This random noise has an anti-regressive effect in inferential judgment, however. These regressive and anti-regressive effects explain various reliable and systematic biases seen in people's descriptive probability estimation and inferential probability judgment. This model predicts that these contrary effects will tend to cancel out in tasks that involve both descriptive estimation and inferential judgment, leading to unbiased responses in those tasks. We test this model by applying it to one such task, described by Gallistel et al. Participants' median responses in this task were unbiased, agreeing with normative probability theory over the full range of responses. Our model captures the pattern of unbiased responses in this task, while simultaneously explaining systematic biases away from normatively correct probabilities seen in other tasks. Copyright © 2018 Cognitive Science Society, Inc.
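The regressive effect of random noise on descriptive estimation can be simulated directly. A minimal sketch (Python); the sample-and-flip mechanism is one common way to formalize the noise account described in the abstract, and the parameter names are mine:

```python
import random

def noisy_estimate(p, d, n_samples=1000, trials=200, seed=0):
    """Mean probability estimate under a 'probability theory plus noise'
    account: each of n_samples remembered events is read correctly with
    probability 1 - d and misread (flipped) with probability d; the
    estimate is the proportion of samples read as 'event occurred'."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        hits = 0
        for _ in range(n_samples):
            occurred = rng.random() < p    # true event frequency p
            misread = rng.random() < d     # symmetric read noise d
            hits += occurred != misread    # XOR: noise flips the sample
        total += hits / n_samples
    return total / trials

# Regression toward the center of the scale:
# E[estimate] = p(1 - d) + (1 - p)d, so p = 0.9 with noise d = 0.1
# is reported, on average, as 0.9 * 0.9 + 0.1 * 0.1 = 0.82.
print(round(noisy_estimate(0.9, 0.1), 2))
```

Running the same simulation for p below 0.5 shows the mirror-image bias upward, i.e. the regression toward 0.5 the abstract describes.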

  15. Case-Based Capture and Reuse of Aerospace Design Rationale

    NASA Technical Reports Server (NTRS)

    Leake, David B.

    2001-01-01

The goal of this project was to apply artificial intelligence techniques to facilitate capture and reuse of aerospace design rationale. The project combined case-based reasoning (CBR) and concept maps (CMaps) to develop methods for capturing, organizing, and interactively accessing records of experiences encapsulating the methods and rationale underlying expert aerospace design, in order to bring the captured knowledge to bear to support future reasoning. The project's results contribute both principles and methods for effective design-aiding systems that aid capture and access of useful design knowledge. The project has been guided by the tenets that design-aiding systems must: (1) Leverage a designer's knowledge, rather than attempting to replace it; (2) Be able to reflect different designers' differing conceptualizations of the design task, and to clarify those conceptualizations to others; (3) Include capabilities to capture information both by interactive knowledge modeling and during normal use; and (4) Integrate into normal designer tasks as naturally and unobtrusively as possible.

  16. In situ characterization of the brain-microdevice interface using Device Capture Histology

    PubMed Central

    Woolley, Andrew J.; Desai, Himanshi A.; Steckbeck, Mitchell A.; Patel, Neil K.; Otto, Kevin J.

    2011-01-01

Accurate assessment of brain-implantable microdevice bio-integration remains a formidable challenge. Prevailing histological methods require device extraction prior to tissue processing, often disrupting and removing the tissue of interest which had been surrounding the device. The Device-Capture Histology method, presented here, overcomes many limitations of the conventional Device-Explant Histology method, by collecting the device and surrounding tissue intact for subsequent labeling. With the implant remaining in situ, accurate and precise imaging of the morphologically preserved tissue at the brain/microdevice interface can then be collected and quantified. First, this article presents the Device-Capture Histology method for obtaining and processing the intact, undisturbed microdevice-tissue interface, and images using fluorescent labeling and confocal microscopy. Second, this article gives examples of how to quantify features found in the captured peri-device tissue. We also share histological data capturing 1) the impact of microdevice implantation on tissue, 2) the effects of an experimental anti-inflammatory coating, 3) a dense grouping of cell nuclei encapsulating a long-term implant, and 4) atypical oligodendrocyte organization neighboring a long-term implant. Data sets collected using the Device-Capture Histology method are presented to demonstrate the significant advantages of processing the intact microdevice-tissue interface, and to underscore the utility of the method in understanding the effects of brain-implantable microdevices on nearby tissue. PMID:21802446

  17. Radiative capture reactions via indirect methods

    NASA Astrophysics Data System (ADS)

    Mukhamedzhanov, A. M.; Rogachev, G. V.

    2017-10-01

Many radiative capture reactions of astrophysical interest occur at such low energies that their direct measurement is hardly possible. Until now the only indirect method used to determine the astrophysical factor of a radiative capture process was Coulomb dissociation. In this paper we address another indirect method, which can provide information about resonant radiative capture reactions at astrophysically relevant energies. This method can be considered an extension of the Trojan horse method to resonant radiative capture reactions. The idea of the suggested indirect method is to use the indirect reaction A(a, sγ)F to obtain information about the radiative capture reaction A(x, γ)F, where a = (s x) and F = (x A). The main advantage of using indirect reactions is the absence of the penetrability factor in the channel x + A, which suppresses the low-energy cross sections of the A(x, γ)F reaction and does not allow one to measure these reactions at astrophysical energies. A general formalism to treat indirect resonant radiative capture reactions is developed for the case when only a few intermediate states contribute and a statistical approach cannot be applied. The indirect method requires coincidence measurements of the triple differential cross section, which is a function of the photon scattering angle, the photon energy, and the scattering angle of the outgoing spectator particle s. The angular dependence of the triple differential cross section at fixed scattering angle of the spectator s is the angular γ-s correlation function. Using indirect resonant radiative capture reactions, one can obtain information about important astrophysical resonant radiative capture reactions such as (p, γ), (α, γ), and (n, γ) on stable and unstable isotopes. The indirect technique makes accessible low-lying resonances close to the threshold, and even subthreshold bound states located at negative energies. In this paper, after developing the general formalism, we demonstrate the application of the indirect reaction 12C(6Li, dγ)16O, proceeding through 1- and 2+ subthreshold bound states and resonances, to obtain information about the 12C(α, γ)16O radiative capture at the astrophysically most effective energy of 0.3 MeV, which is impossible using standard direct measurements. The feasibility of the suggested approach is discussed.

  18. Establishing a theory for deuteron-induced surrogate reactions

    NASA Astrophysics Data System (ADS)

    Potel, G.; Nunes, F. M.; Thompson, I. J.

    2015-09-01

Background: Deuteron-induced reactions serve as surrogates for neutron capture into compound states. Although these reactions are of great applicability, no theoretical efforts have been invested in this direction over the last decade. Purpose: The goal of this work is to establish on firm grounds a theory for deuteron-induced neutron-capture reactions. This includes formulating elastic and inelastic breakup in a consistent manner. Method: We describe this process both in post- and prior-form distorted wave Born approximation following previous works and discuss the differences in the formulation. While the convergence issues arising in the post formulation can be overcome in the prior formulation, in this case one still needs to take into account additional terms due to nonorthogonality. Results: We apply our method to 93Nb(d,p)X at Ed=15 and 25 MeV and are able to obtain a good description of the data. We look at the various partial wave contributions, as well as elastic versus inelastic contributions. We also connect our formulation with transfer to neutron bound states. Conclusions: Our calculations demonstrate that the nonorthogonality term arising in the prior formulation is significant and is at the heart of the long-standing controversy between the post and the prior formulations of the theory. We also show that the cross sections for these reactions are angular-momentum dependent and therefore the commonly used Weisskopf limit is inadequate. Finally, we make important predictions for the relative contributions of elastic breakup and nonelastic breakup and call for elastic-breakup measurements to further constrain our model.

  19. Establishing a theory for deuteron induced surrogate reactions

    DOE PAGES

    Potel, G.; Nunes, F. M.; Thompson, I. J.

    2015-09-18

Background: Deuteron-induced reactions serve as surrogates for neutron capture into compound states. Although these reactions are of great applicability, no theoretical efforts have been invested in this direction over the last decade. Purpose: The goal of this work is to establish on firm grounds a theory for deuteron-induced neutron-capture reactions. This includes formulating elastic and inelastic breakup in a consistent manner. Method: We describe this process both in post- and prior-form distorted wave Born approximation following previous works and discuss the differences in the formulation. While the convergence issues arising in the post formulation can be overcome in the prior formulation, in this case one still needs to take into account additional terms due to nonorthogonality. Results: We apply our method to 93Nb(d,p)X at Ed=15 and 25 MeV and are able to obtain a good description of the data. We then look at the various partial wave contributions, as well as elastic versus inelastic contributions. We also connect our formulation with transfer to neutron bound states. Conclusions: Our calculations demonstrate that the nonorthogonality term arising in the prior formulation is significant and is at the heart of the long-standing controversy between the post and the prior formulations of the theory. We also show that the cross sections for these reactions are angular-momentum dependent and therefore the commonly used Weisskopf limit is inadequate. We finally make important predictions for the relative contributions of elastic breakup and nonelastic breakup and call for elastic-breakup measurements to further constrain our model.

  20. Approximate solutions for radial travel time and capture zone in unconfined aquifers.

    PubMed

    Zhou, Yangxiao; Haitjema, Henk

    2012-01-01

Radial time-of-travel (TOT) capture zones have been evaluated for unconfined aquifers with and without recharge. The solutions of travel time for unconfined aquifers are rather complex and have been replaced with much simpler approximate solutions without significant loss of accuracy in most practical cases. The current "volumetric method" for calculating the radius of a TOT capture zone assumes no recharge and a constant aquifer thickness. It was found that for unconfined aquifers without recharge, the volumetric method leads to a smaller and less protective wellhead protection zone when drawdowns are ignored. However, if the saturated thickness near the well is used in the volumetric method, a larger, more protective TOT capture zone is obtained. The same is true when the volumetric method is used in the presence of recharge; in that case, however, it leads to an unreasonable overprediction of TOT capture zones of 5 years or more.
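The volumetric method referred to in this abstract equates the volume pumped over the travel time with the pore volume of a cylinder around the well. A minimal sketch of that calculation (not the paper's improved approximate solutions; the pumping rate, porosity, and thickness values below are hypothetical):

```python
import math

def tot_capture_radius(Q, t, n, H):
    """Radius of a time-of-travel (TOT) capture zone via the volumetric method.

    Equates the volume pumped at rate Q over travel time t with the pore
    volume of a cylinder of saturated thickness H and porosity n:
        Q * t = pi * r**2 * H * n
    """
    return math.sqrt(Q * t / (math.pi * n * H))

# Hypothetical inputs: Q = 1000 m^3/day, 5-year TOT, porosity 0.25, H = 20 m
r = tot_capture_radius(Q=1000.0, t=5 * 365.0, n=0.25, H=20.0)  # ~341 m
```

Using the drawn-down saturated thickness near the well for H, as the paper suggests, shrinks the denominator and yields the larger, more protective radius.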

  1. On the theoretical description of weakly charged surfaces.

    PubMed

    Wang, Rui; Wang, Zhen-Gang

    2015-03-14

    It is widely accepted that the Poisson-Boltzmann (PB) theory provides a valid description for charged surfaces in the so-called weak coupling limit. Here, we show that the image charge repulsion creates a depletion boundary layer that cannot be captured by a regular perturbation approach. The correct weak-coupling theory must include the self-energy of the ion due to the image charge interaction. The image force qualitatively alters the double layer structure and properties, and gives rise to many non-PB effects, such as nonmonotonic dependence of the surface energy on concentration and charge inversion. In the presence of dielectric discontinuity, there is no limiting condition for which the PB theory is valid.

  2. Radical “Visual Capture” Observed in a Patient with Severe Visual Agnosia

    PubMed Central

    Takaiwa, Akiko; Yoshimura, Hirokazu; Abe, Hirofumi; Terai, Satoshi

    2003-01-01

    We report the case of a 79-year-old female with visual agnosia due to brain infarction in the left posterior cerebral artery. She could recognize objects used in daily life rather well by touch (the number of objects correctly identified was 16 out of 20 presented objects), but she could not recognize them as well by vision (6 out of 20). In this case, it was expected that she would recognize them well when permitted to use touch and vision simultaneously. Our patient, however, performed poorly, producing 5 correct answers out of 20 in the Vision-and-Touch condition. It would be natural to think that visual capture functions when vision and touch provide contradictory information on concrete positions and shapes. However, in the present case, it functioned in spite of the visual deficit in recognizing objects. This should be called radical visual capture. By presenting detailed descriptions of her symptoms and neuropsychological and neuroradiological data, we clarify the characteristics of this type of capture. PMID:12719638

  3. Historic Methods for Capturing Magnetic Field Images

    ERIC Educational Resources Information Center

    Kwan, Alistair

    2016-01-01

    I investigated two late 19th-century methods for capturing magnetic field images from iron filings for historical insight into the pedagogy of hands-on physics education methods, and to flesh out teaching and learning practicalities tacit in the historical record. Both methods offer opportunities for close sensory engagement in data-collection…

  4. Understanding Treatment Burden and Quality of Life Impact of Participating in an Early Phase Pediatric Oncology Clinical Trial: A Pilot Study

    PubMed Central

    BACKUS, LORI; STOCKMAN, BETH; CARPENTER, JANET S.; LIN, LI; HAASE, JOAN

    2017-01-01

    Purpose Early phase clinical trials (EPTs) have led to new, more effective treatment options for children with cancer. Despite the extensive use of EPTs in pediatric oncology, little is known about parent and child experiences during EPT participation. The purposes of this pilot study were to assess the feasibility and preliminary results of having children with cancer and their parents complete measures of treatment burden and quality of life (QOL) concurrent with EPT participation. Methods In this descriptive, longitudinal, pilot study, parents and children were followed for the first 60 days of an EPT. Feasibility was assessed by participant enrollment and retention, and completion of measures. Measures completed included: Demographic form (completed at baseline); Diary of Trial Experiences to capture treatment burden (completed ongoing); and PedsQL™ Quality of Life Inventories, Cancer Modules, and Family Impact Module (completed at baseline, post-first disease evaluation, and off-study). Data were analyzed using descriptive statistics. Results Feasibility goals of enrollment, retention, and measure completion were partially met. Preliminary treatment burden and QOL results are provided. Conclusions While QOL assessments may provide insight into EPT experiences, future studies need to be conducted at multiple sites and enrollment goals must account for participant attrition. PMID:28849701

  5. Free-Form Region Description with Second-Order Pooling.

    PubMed

    Carreira, João; Caseiro, Rui; Batista, Jorge; Sminchisescu, Cristian

    2015-06-01

    Semantic segmentation and object detection are nowadays dominated by methods operating on regions obtained as a result of a bottom-up grouping process (segmentation) but use feature extractors developed for recognition on fixed-form (e.g. rectangular) patches, with full images as a special case. This is most likely suboptimal. In this paper we focus on feature extraction and description over free-form regions and study the relationship with their fixed-form counterparts. Our main contributions are novel pooling techniques that capture the second-order statistics of local descriptors inside such free-form regions. We introduce second-order generalizations of average and max-pooling that together with appropriate non-linearities, derived from the mathematical structure of their embedding space, lead to state-of-the-art recognition performance in semantic segmentation experiments without any type of local feature coding. In contrast, we show that codebook-based local feature coding is more important when feature extraction is constrained to operate over regions that include both foreground and large portions of the background, as typical in image classification settings, whereas for high-accuracy localization setups, second-order pooling over free-form regions produces results superior to those of the winning systems in the contemporary semantic segmentation challenges, with models that are much faster in both training and testing.

  6. Model-Based Referenceless Quality Metric of 3D Synthesized Images Using Local Image Description.

    PubMed

    Gu, Ke; Jakhetiya, Vinit; Qiao, Jun-Fei; Li, Xiaoli; Lin, Weisi; Thalmann, Daniel

    2017-07-28

New challenges have been brought out along with the emergence of 3D-related technologies such as virtual reality (VR), augmented reality (AR), and mixed reality (MR). Free viewpoint video (FVV), due to its applications in remote surveillance, remote education, etc., based on the flexible selection of direction and viewpoint, has been perceived as the development direction of next-generation video technologies and has drawn a wide range of researchers' attention. Since FVV images are synthesized via a depth image-based rendering (DIBR) procedure in the "blind" environment (without reference images), a reliable real-time blind quality evaluation and monitoring system is urgently required. However, existing assessment metrics do not reflect human judgments faithfully, mainly because of the geometric distortions generated by DIBR. To this end, this paper proposes a novel referenceless quality metric of DIBR-synthesized images using autoregression (AR)-based local image description. It was found that, after the AR prediction, the reconstruction error between a DIBR-synthesized image and its AR-predicted image can accurately capture the geometric distortion. Visual saliency is then leveraged to further improve the proposed blind quality metric by a sizable margin. Experiments validate the superiority of our no-reference quality method as compared with prevailing full-, reduced- and no-reference models.

  7. Multiple data sources improve DNA-based mark-recapture population estimates of grizzly bears.

    PubMed

    Boulanger, John; Kendall, Katherine C; Stetz, Jeffrey B; Roon, David A; Waits, Lisette P; Paetkau, David

    2008-04-01

    A fundamental challenge to estimating population size with mark-recapture methods is heterogeneous capture probabilities and subsequent bias of population estimates. Confronting this problem usually requires substantial sampling effort that can be difficult to achieve for some species, such as carnivores. We developed a methodology that uses two data sources to deal with heterogeneity and applied this to DNA mark-recapture data from grizzly bears (Ursus arctos). We improved population estimates by incorporating additional DNA "captures" of grizzly bears obtained by collecting hair from unbaited bear rub trees concurrently with baited, grid-based, hair snag sampling. We consider a Lincoln-Petersen estimator with hair snag captures as the initial session and rub tree captures as the recapture session and develop an estimator in program MARK that treats hair snag and rub tree samples as successive sessions. Using empirical data from a large-scale project in the greater Glacier National Park, Montana, USA, area and simulation modeling we evaluate these methods and compare the results to hair-snag-only estimates. Empirical results indicate that, compared with hair-snag-only data, the joint hair-snag-rub-tree methods produce similar but more precise estimates if capture and recapture rates are reasonably high for both methods. Simulation results suggest that estimators are potentially affected by correlation of capture probabilities between sample types in the presence of heterogeneity. Overall, closed population Huggins-Pledger estimators showed the highest precision and were most robust to sparse data, heterogeneity, and capture probability correlation among sampling types. Results also indicate that these estimators can be used when a segment of the population has zero capture probability for one of the methods. We propose that this general methodology may be useful for other species in which mark-recapture data are available from multiple sources.
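The two-session structure described above (hair-snag captures as the initial session, rub-tree captures as the recapture session) can be illustrated with the textbook Lincoln-Petersen estimator. This is a minimal sketch only, not the Huggins-Pledger models evaluated in the paper, and the counts below are hypothetical:

```python
def lincoln_petersen(n1, n2, m):
    """Simple Lincoln-Petersen abundance estimate.

    n1: individuals detected in the first session (e.g., hair-snag captures)
    n2: individuals detected in the second session (e.g., rub-tree captures)
    m : individuals detected in both sessions (recaptures)
    """
    if m == 0:
        raise ValueError("no recaptures: estimator undefined")
    return n1 * n2 / m

def chapman(n1, n2, m):
    """Chapman's bias-corrected variant, defined even when m == 0."""
    return (n1 + 1) * (n2 + 1) / (m + 1) - 1

# Hypothetical counts: 100 bears at hair snags, 80 at rub trees, 20 in both
estimate = lincoln_petersen(100, 80, 20)  # 400.0
```

Heterogeneous capture probabilities violate the equal-catchability assumption behind this estimator, which is why the paper turns to models that accommodate heterogeneity.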

  8. Hybrid asymptotic-numerical approach for estimating first-passage-time densities of the two-dimensional narrow capture problem.

    PubMed

    Lindsay, A E; Spoonmore, R T; Tzou, J C

    2016-10-01

    A hybrid asymptotic-numerical method is presented for obtaining an asymptotic estimate for the full probability distribution of capture times of a random walker by multiple small traps located inside a bounded two-dimensional domain with a reflecting boundary. As motivation for this study, we calculate the variance in the capture time of a random walker by a single interior trap and determine this quantity to be comparable in magnitude to the mean. This implies that the mean is not necessarily reflective of typical capture times and that the full density must be determined. To solve the underlying diffusion equation, the method of Laplace transforms is used to obtain an elliptic problem of modified Helmholtz type. In the limit of vanishing trap sizes, each trap is represented as a Dirac point source that permits the solution of the transform equation to be represented as a superposition of Helmholtz Green's functions. Using this solution, we construct asymptotic short-time solutions of the first-passage-time density, which captures peaks associated with rapid capture by the absorbing traps. When numerical evaluation of the Helmholtz Green's function is employed followed by numerical inversion of the Laplace transform, the method reproduces the density for larger times. We demonstrate the accuracy of our solution technique with a comparison to statistics obtained from a time-dependent solution of the diffusion equation and discrete particle simulations. In particular, we demonstrate that the method is capable of capturing the multimodal behavior in the capture time density that arises when the traps are strategically arranged. The hybrid method presented can be applied to scenarios involving both arbitrary domains and trap shapes.
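As a toy analogue of the observation that the spread of capture times can be comparable to their mean, one can simulate first-passage times of a symmetric random walk to a single trap on a 1D ring. This is a sketch only; the paper treats the continuous 2D narrow-capture problem with matched asymptotics and Laplace transforms:

```python
import random
import statistics

def capture_time(n_sites, trap, start, rng):
    """Steps until a symmetric random walk on a ring of n_sites hits the trap."""
    pos, steps = start, 0
    while pos != trap:
        pos = (pos + rng.choice((-1, 1))) % n_sites
        steps += 1
    return steps

rng = random.Random(0)  # fixed seed for reproducibility
times = [capture_time(20, 0, 10, rng) for _ in range(2000)]
mean = statistics.mean(times)
var = statistics.pvariance(times)
# Starting at the antipode of a 20-site ring, the exact mean first-passage
# time is 10 * (20 - 10) = 100; the standard deviation is of the same order,
# so the mean alone is a poor summary of typical capture times.
```

Because the standard deviation is comparable to the mean, the full first-passage-time density carries information that the mean alone misses, which motivates the paper's approach.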

  9. The minimum information about a genome sequence (MIGS) specification

    PubMed Central

    Field, Dawn; Garrity, George; Gray, Tanya; Morrison, Norman; Selengut, Jeremy; Sterk, Peter; Tatusova, Tatiana; Thomson, Nicholas; Allen, Michael J; Angiuoli, Samuel V; Ashburner, Michael; Axelrod, Nelson; Baldauf, Sandra; Ballard, Stuart; Boore, Jeffrey; Cochrane, Guy; Cole, James; Dawyndt, Peter; De Vos, Paul; dePamphilis, Claude; Edwards, Robert; Faruque, Nadeem; Feldman, Robert; Gilbert, Jack; Gilna, Paul; Glöckner, Frank Oliver; Goldstein, Philip; Guralnick, Robert; Haft, Dan; Hancock, David; Hermjakob, Henning; Hertz-Fowler, Christiane; Hugenholtz, Phil; Joint, Ian; Kagan, Leonid; Kane, Matthew; Kennedy, Jessie; Kowalchuk, George; Kottmann, Renzo; Kolker, Eugene; Kravitz, Saul; Kyrpides, Nikos; Leebens-Mack, Jim; Lewis, Suzanna E; Li, Kelvin; Lister, Allyson L; Lord, Phillip; Maltsev, Natalia; Markowitz, Victor; Martiny, Jennifer; Methe, Barbara; Mizrachi, Ilene; Moxon, Richard; Nelson, Karen; Parkhill, Julian; Proctor, Lita; White, Owen; Sansone, Susanna-Assunta; Spiers, Andrew; Stevens, Robert; Swift, Paul; Taylor, Chris; Tateno, Yoshio; Tett, Adrian; Turner, Sarah; Ussery, David; Vaughan, Bob; Ward, Naomi; Whetzel, Trish; Gil, Ingio San; Wilson, Gareth; Wipat, Anil

    2008-01-01

    With the quantity of genomic data increasing at an exponential rate, it is imperative that these data be captured electronically, in a standard format. Standardization activities must proceed within the auspices of open-access and international working bodies. To tackle the issues surrounding the development of better descriptions of genomic investigations, we have formed the Genomic Standards Consortium (GSC). Here, we introduce the minimum information about a genome sequence (MIGS) specification with the intent of promoting participation in its development and discussing the resources that will be required to develop improved mechanisms of metadata capture and exchange. As part of its wider goals, the GSC also supports improving the ‘transparency’ of the information contained in existing genomic databases. PMID:18464787

  10. Retarding friction versus white noise in the description of heavy ion fusion

    NASA Astrophysics Data System (ADS)

    Chushnyakova, Maria; Gontchar, Igor

    2014-03-01

We performed modeling of the collision of two spherical nuclei resulting in capture. For this aim, stochastic differential equations are used with white or colored noise and with instant or retarding friction, respectively. The dissipative forces are proportional to the squared derivative of the strong nucleus-nucleus interaction potential (SnnP). The SnnP is calculated in the framework of the double folding approach with the density-dependent M3Y NN-forces. Calculations performed for the 28Si+144Sm reaction show that accounting for the fluctuations typically reduces the capture cross sections by not more than 10%. In contrast, the memory effects are found to result in an enhancement of the cross section by about 20%.

  11. An innovative permanent total enclosure for blast cleaning and painting ships in drydock

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garland, C.; Lukey, M.

    1997-12-31

This paper describes a new innovative Permanent Total Enclosure, or CAPE system, which encloses and captures emissions from blast cleaning and painting ship hulls in drydock. A description of the modular enclosure towers with unique seals is shown with several figures. The support barge with its environmental control equipment, which includes a dust collector, VOC thermal oxidizer, dehumidifier, boiler, heating coils, air flow fans, and system controls, is also described. Data measurements from the first two applications rate this system at 100 percent capture efficiency, 99 percent VOC destruction efficiency, and 99.9 percent dust collection efficiency. Ships can be blast cleaned and painted using noncompliant paints and meet all state and federal standards for air emissions.

  12. Piezoelectric and Magnetoelectric Thick Films for Fabricating Power Sources in Wireless Sensor Nodes

    PubMed Central

    Priya, Shashank; Ryu, Jungho; Park, Chee-Sung; Oliver, Josiah; Choi, Jong-Jin; Park, Dong-Soo

    2009-01-01

    In this manuscript, we review the progress made in the synthesis of thick film-based piezoelectric and magnetoelectric structures for harvesting energy from mechanical vibrations and magnetic field. Piezoelectric compositions in the system Pb(Zr,Ti)O3–Pb(Zn1/3Nb2/3)O3 (PZNT) have shown promise for providing enhanced efficiency due to higher energy density and thus form the base of transducers designed for capturing the mechanical energy. Laminate structures of PZNT with magnetostrictive ferrite materials provide large magnitudes of magnetoelectric coupling and are being targeted to capture the stray magnetic field energy. We analyze the models used to predict the performance of the energy harvesters and present a full system description. PMID:22454590

  13. Designing berthing mechanisms for international compatibility

    NASA Technical Reports Server (NTRS)

    Winch, John; Gonzalez-Vallejo, Juan J.

    1991-01-01

    The paper examines the technological issues regarding common berthing interfaces for the Space Station Freedom and pressurized modules from U.S., European, and Japanese space programs. The development of the common berthing mechanism (CBM) is based on common requirements concerning specifications, launch environments, and the unique requirements of ESA's Man-Tended Free Flyer. The berthing mechanism is composed of an active and a passive half, a remote manipulator system, 4 capture-latch assemblies, 16 structural bolts, and a pressure gage to verify equalization. Extensive graphic and verbal descriptions of each element are presented emphasizing the capture-latch motion and powered-bolt operation. The support systems to complete the interface are listed, and the manufacturing requirements for consistent fabrication are discussed to ensure effective international development.

  14. 21 CFR 172.860 - Fatty acids.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... the gas chromatographic-electron capture method prescribed in paragraph (c)(3) of this section. If..._locations.html. (3) The gas chromatographic-electron capture method for testing fatty acids for chick-edema...

  15. 21 CFR 172.860 - Fatty acids.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... the gas chromatographic-electron capture method prescribed in paragraph (c)(3) of this section. If..._locations.html. (3) The gas chromatographic-electron capture method for testing fatty acids for chick-edema...

  16. Using liquid desiccant as a regenerable filter for capturing and deactivating contaminants

    DOEpatents

    Slayzak, Steven J.; Anderson, Ren S.; Judkoff, Ronald D.; Blake, Daniel M.; Vinzant, Todd B.; Ryan, Joseph P.

    2007-12-11

A method, and systems for implementing such method, for purifying and conditioning air of weaponized contaminants. The method includes wetting a filter packing media with a salt-based liquid desiccant, such as water with a high concentration of lithium chloride. Air is passed through the wetted filter packing media, and the contaminants in the air are captured by the liquid desiccant while the liquid desiccant dehumidifies the air. The captured contaminants are then deactivated in the liquid desiccant, which may include heating the liquid desiccant. The liquid desiccant is regenerated by applying heat to the liquid desiccant and then removing moisture. The method includes repeating the wetting with the regenerated liquid desiccant, which provides a regenerable filtering process that captures and deactivates contaminants on an ongoing basis while also conditioning the air. The method may include filtration effectiveness enhancement by electrostatic or inertial means.

  17. Methods of capturing and immobilizing radioactive nuclei with metal fluorite-based inorganic materials

    DOEpatents

    Wang, Yifeng; Miller, Andy; Bryan, Charles R.; Kruichak, Jessica Nicole

    2015-11-17

    Methods of capturing and immobilizing radioactive nuclei with metal fluorite-based inorganic materials are described. For example, a method of capturing and immobilizing radioactive nuclei includes flowing a gas stream through an exhaust apparatus. The exhaust apparatus includes a metal fluorite-based inorganic material. The gas stream includes a radioactive species. The radioactive species is removed from the gas stream by adsorbing the radioactive species to the metal fluorite-based inorganic material of the exhaust apparatus.

  18. Adapting Local Features for Face Detection in Thermal Image.

    PubMed

    Ma, Chao; Trung, Ngo Thanh; Uchiyama, Hideaki; Nagahara, Hajime; Shimada, Atsushi; Taniguchi, Rin-Ichiro

    2017-11-27

A thermal camera captures the temperature distribution of a scene as a thermal image. In thermal images, the facial appearances of different people under different lighting conditions are similar, because facial temperature distribution is generally constant and not affected by lighting conditions. This similarity in face appearance is advantageous for face detection. To detect faces in thermal images, cascade classifiers with Haar-like features are generally used. However, there are few studies exploring local features for face detection in thermal images. In this paper, we introduce two approaches relying on local features for face detection in thermal images. First, we create new feature types by extending Multi-Block LBP. We consider a margin around the reference and the generally constant distribution of facial temperature. In this way, we make the features more robust to image noise and more effective for face detection in thermal images. Second, we propose an AdaBoost-based training method to obtain cascade classifiers with multiple types of local features. These feature types have different advantages, and in this way we enhance the description power of the local features. We conducted a hold-out validation experiment and a field experiment. In the hold-out validation experiment, we captured a dataset from 20 participants, comprising 14 males and 6 females. For each participant, we captured 420 images with 10 variations in camera distance, 21 poses, and 2 appearances (with and without glasses). We compared the performance of cascade classifiers trained with different sets of features. The experimental results showed that the proposed approaches effectively improve the performance of face detection in thermal images. In the field experiment, we compared face detection performance in realistic scenes using thermal and RGB images and discussed the results.
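Multi-Block LBP, which the abstract extends, builds on the basic 8-neighbor local binary pattern. A minimal sketch of the plain per-pixel LBP code (not the margin-augmented features proposed in the paper):

```python
def lbp_code(patch):
    """8-neighbor local binary pattern code for the center of a 3x3 patch.

    Each neighbor that is >= the center pixel contributes one bit,
    taken clockwise starting from the top-left neighbor.
    """
    center = patch[1][1]
    # Clockwise neighbor order starting at top-left
    neighbors = [patch[0][0], patch[0][1], patch[0][2], patch[1][2],
                 patch[2][2], patch[2][1], patch[2][0], patch[1][0]]
    code = 0
    for i, value in enumerate(neighbors):
        if value >= center:
            code |= 1 << i
    return code

# Bright top edge over a dark bottom: bits 0-2 and 7 are set -> 135
code = lbp_code([[9, 9, 9],
                 [9, 5, 1],
                 [1, 1, 1]])
```

Multi-Block LBP replaces single pixels with average intensities over rectangular blocks; the margin proposed in the paper additionally requires a neighbor to exceed the reference by a threshold before its bit is set, making the code less sensitive to noise.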

  19. The State and Trends of Barcode, RFID, Biometric and Pharmacy Automation Technologies in US Hospitals.

    PubMed

    Uy, Raymonde Charles Y; Kury, Fabricio P; Fontelo, Paul A

    2015-01-01

The standard of safe medication practice requires strict observance of the five rights of medication administration: the right patient, drug, time, dose, and route. Despite adherence to these guidelines, medication errors remain a public health concern that has generated health policies and hospital processes that leverage automation and computerization to reduce these errors. Barcode, RFID, biometric, and pharmacy automation technologies have been demonstrated in the literature to decrease the incidence of medication errors by minimizing the human factors involved in the process. Despite evidence suggesting the effectiveness of these technologies, adoption rates and trends vary across hospital systems. The objective of this study is to examine the state and adoption trends of automatic identification and data capture (AIDC) methods and pharmacy automation technologies in U.S. hospitals. A retrospective descriptive analysis of survey data from the HIMSS Analytics® Database was done, demonstrating optimistic growth in the adoption of these patient safety solutions.

  20. Structure of a low-population intermediate state in the release of an enzyme product.

    PubMed

    De Simone, Alfonso; Aprile, Francesco A; Dhulesia, Anne; Dobson, Christopher M; Vendruscolo, Michele

    2015-01-09

    Enzymes can increase the rate of biomolecular reactions by several orders of magnitude. Although the steps of substrate capture and product release are essential in the enzymatic process, complete atomic-level descriptions of these steps are difficult to obtain because of the transient nature of the intermediate conformations, which makes them largely inaccessible to standard structure determination methods. We describe here the determination of the structure of a low-population intermediate in the product release process by human lysozyme through a combination of NMR spectroscopy and molecular dynamics simulations. We validate this structure by rationally designing two mutations, the first engineered to destabilise the intermediate and the second to stabilise it, thus slowing down or speeding up, respectively, product release. These results illustrate how product release by an enzyme can be facilitated by the presence of a metastable intermediate with transient weak interactions between the enzyme and product.

  1. Modelling runway incursion severity.

    PubMed

    Wilke, Sabine; Majumdar, Arnab; Ochieng, Washington Y

    2015-06-01

Analysis of the causes underlying runway incursions is fundamental for the development of effective mitigation measures. However, there are significant weaknesses in the current methods to model these factors. This paper proposes a structured framework for modelling causal factors and their relationship to severity, which includes a description of the airport surface system architecture, establishment of terminological definitions, the determination and collection of appropriate data, the analysis of occurrences for severity and causes, and the execution of a statistical analysis framework. It is implemented in the context of U.S. airports, enabling the identification of a number of priority interventions, including the need for better investigation and causal factor capture, recommendations for airfield design, operating scenarios and technologies, and better training for human operators in the system. The framework is recommended for the analysis of runway incursions to support safety improvements and the methodology is transferable to other areas of aviation safety risk analysis.

  2. Economic Analysis of Centralized vs. Decentralized Electronic Data Capture in Multi-Center Clinical Studies

    PubMed Central

    Walden, Anita; Nahm, Meredith; Barnett, M. Edwina; Conde, Jose G.; Dent, Andrew; Fadiel, Ahmed; Perry, Theresa; Tolk, Chris; Tcheng, James E.; Eisenstein, Eric L.

    2012-01-01

Background: New data management models are emerging in multi-center clinical studies. Methods: We developed clinical research network economic models to evaluate three data management models: centralized, decentralized with local software, and decentralized with shared database. Descriptive information from three clinical research studies served as inputs for these models. Main Outcome Measures: The primary outcome was total data management costs. Secondary outcomes included data management costs for sites, local data centers, and central coordinating centers. Results: Both decentralized models were more costly than the centralized model for each clinical research study; the decentralized with local software model was the most expensive. Decreasing the number of local data centers and case book pages reduced cost differentials between models. Conclusion: Decentralized vs. centralized data management in multi-center clinical research studies is associated with increases in data management costs. PMID:21335692

  3. Establishing a national resource: a health informatics collection to maintain the legacy of health informatics development.

    PubMed

    Ellis, Beverley; Roberts, Jean; Cooper, Helen

    2007-01-01

    This case study report of the establishment of a national repository of multi-media materials describes the creation process, the challenges faced in putting it into operation and the opportunities for the future. The initial resource has been incorporated under standard library and knowledge management practices. A collaborative action research method was used with active experts in the domain to determine the requirements and priorities for further development. The National Health Informatics Collection (NatHIC) is now accessible and the further issues are being addressed by inclusion in future University and NHS strategic plans. Ultimately the Collection will link with other facilities that contribute to the description and maintenance of effective informatics in support of health globally. The issues raised about the National Health Informatics Collection as established in the UK have resonance with the challenges of capturing the overall historic development of an emerging discipline in any country.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tokár, K.; Derian, R.; Mitas, L.

Using explicitly correlated fixed-node quantum Monte Carlo and density functional theory (DFT) methods, we study electronic properties, ground-state multiplets, ionization potentials, electron affinities, and low-energy fragmentation channels of charged half-sandwich and multidecker vanadium-benzene systems with up to 3 vanadium atoms, including both anions and cations. It is shown that, particularly in anions, electronic correlations play a crucial role; these effects are not systematically captured with any commonly used DFT functionals such as gradient corrected, hybrids, and range-separated hybrids. On the other hand, tightly bound cations can be described qualitatively by DFT. A comparison of DFT and quantum Monte Carlo provides an in-depth understanding of the electronic structure and properties of these correlated systems. The calculations also serve as a benchmark study of 3d molecular anions that require a balanced many-body description of correlations at both short- and long-range distances.

  5. Weak Fault Feature Extraction of Rolling Bearings Based on an Improved Kurtogram.

    PubMed

    Chen, Xianglong; Feng, Fuzhou; Zhang, Bingzhi

    2016-09-13

Kurtograms have been verified to be an efficient tool in bearing fault detection and diagnosis because of their superiority in extracting transient features. However, the short-time Fourier transform is insufficient for time-frequency analysis, and kurtosis is deficient in detecting cyclic transients. These factors weaken the performance of the original kurtogram in extracting weak fault features. Correlated Kurtosis (CK) is therefore adopted as a more effective statistic for detecting cyclic transients. The Redundant Second Generation Wavelet Packet Transform (RSGWPT) is deemed effective in capturing a more detailed local time-frequency description of the signal and in restricting the frequency-aliasing components of the analysis results. Combining CK with the RSGWPT, the authors propose an improved kurtogram to extract weak fault features from bearing vibration signals. The analysis of simulated signals and real application cases demonstrates that the proposed method is more accurate and effective in extracting weak fault features.
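As an illustration of the correlated-kurtosis statistic referenced in this abstract, the following is a minimal sketch (not the authors' implementation; the period T and the impulse-train test signal are assumed purely for illustration):

```python
import numpy as np

def correlated_kurtosis(y, T, M=1):
    """M-shift correlated kurtosis of signal y for a candidate period T (samples).

    CK_M(T) = sum_n ( prod_{m=0..M} y[n - m*T] )^2 / ( sum_n y[n]^2 )^(M+1)
    Larger values indicate transients repeating with period T.
    """
    y = np.asarray(y, dtype=float)
    prod = y.copy()
    for m in range(1, M + 1):
        shifted = np.zeros_like(y)
        shifted[m * T:] = y[:-m * T]   # y delayed by m*T samples, zero-padded
        prod = prod * shifted
    return np.sum(prod ** 2) / np.sum(y ** 2) ** (M + 1)

# A periodic impulse train scores higher at its true period than at a wrong one.
y = np.zeros(1000)
y[::50] = 1.0
assert correlated_kurtosis(y, 50) > correlated_kurtosis(y, 57)
```

The statistic rewards energy that recurs at the candidate period, which is why it suits the cyclic impacts of a bearing fault better than plain kurtosis.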

  6. Human mobility: Models and applications

    NASA Astrophysics Data System (ADS)

    Barbosa, Hugo; Barthelemy, Marc; Ghoshal, Gourab; James, Charlotte R.; Lenormand, Maxime; Louail, Thomas; Menezes, Ronaldo; Ramasco, José J.; Simini, Filippo; Tomasini, Marcello

    2018-03-01

    Recent years have witnessed an explosion of extensive geolocated datasets related to human movement, enabling scientists to quantitatively study individual and collective mobility patterns, and to generate models that can capture and reproduce the spatiotemporal structures and regularities in human trajectories. The study of human mobility is especially important for applications such as estimating migratory flows, traffic forecasting, urban planning, and epidemic modeling. In this survey, we review the approaches developed to reproduce various mobility patterns, with the main focus on recent developments. This review can be used both as an introduction to the fundamental modeling principles of human mobility, and as a collection of technical methods applicable to specific mobility-related problems. The review organizes the subject by differentiating between individual and population mobility and also between short-range and long-range mobility. Throughout the text the description of the theory is intertwined with real-world applications.
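As one concrete example of the population-level models surveyed in this line of work, a simple gravity model of mobility flows can be sketched as follows (the city populations, coordinates, and distance exponent below are hypothetical, chosen only for illustration):

```python
import numpy as np

# Gravity model: flow T_ij proportional to m_i * m_j / d_ij^gamma.
pop = np.array([100_000.0, 50_000.0, 10_000.0])       # hypothetical populations
xy = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 50.0]])  # locations (km)
gamma = 2.0                                            # assumed distance exponent

d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
with np.errstate(divide="ignore"):
    T = pop[:, None] * pop[None, :] / d ** gamma
np.fill_diagonal(T, 0.0)                               # no self-flows

# Flows decay with distance: city 0 exchanges more with nearby city 1
# than with the smaller, more distant city 2.
assert T[0, 1] > T[0, 2]
```

Fitting gamma (or replacing the model with the parameter-free radiation model) is where the real modeling choices reviewed in the survey come in.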

  7. Mapping transiently formed and sparsely populated conformations on a complex energy landscape.

    PubMed

    Wang, Yong; Papaleo, Elena; Lindorff-Larsen, Kresten

    2016-08-23

    Determining the structures, kinetics, thermodynamics and mechanisms that underlie conformational exchange processes in proteins remains extremely difficult. Only in favourable cases is it possible to provide atomic-level descriptions of sparsely populated and transiently formed alternative conformations. Here we benchmark the ability of enhanced-sampling molecular dynamics simulations to determine the free energy landscape of the L99A cavity mutant of T4 lysozyme. We find that the simulations capture key properties previously measured by NMR relaxation dispersion methods including the structure of a minor conformation, the kinetics and thermodynamics of conformational exchange, and the effect of mutations. We discover a new tunnel that involves the transient exposure towards the solvent of an internal cavity, and show it to be relevant for ligand escape. Together, our results provide a comprehensive view of the structural landscape of a protein, and point forward to studies of conformational exchange in systems that are less characterized experimentally.

  8. Diffraction of a shock wave by a compression corner; regular and single Mach reflection

    NASA Technical Reports Server (NTRS)

    Vijayashankar, V. S.; Kutler, P.; Anderson, D.

    1976-01-01

The two-dimensional, time-dependent Euler equations which govern the flow field resulting from the interaction of a planar shock with a compression corner are solved with initial conditions that result in either regular reflection or single Mach reflection of the incident planar shock. The Euler equations, which are hyperbolic, are transformed to include the self-similarity of the problem. A normalization procedure is employed to align the reflected shock and the Mach stem as computational boundaries to implement the shock-fitting procedure. A special floating fitting scheme is developed in conjunction with the method of characteristics to fit the slip surface. The reflected shock, the Mach stem, and the slip surface are all treated as sharp discontinuities, thus resulting in a more accurate description of the inviscid flow field. The resulting numerical solutions are compared with available experimental data and existing first-order, shock-capturing numerical solutions.

  9. Standardization proposal of soft tissue artefact description for data sharing in human motion measurements.

    PubMed

    Cereatti, Andrea; Bonci, Tecla; Akbarshahi, Massoud; Aminian, Kamiar; Barré, Arnaud; Begon, Mickael; Benoit, Daniel L; Charbonnier, Caecilia; Dal Maso, Fabien; Fantozzi, Silvia; Lin, Cheng-Chung; Lu, Tung-Wu; Pandy, Marcus G; Stagni, Rita; van den Bogert, Antonie J; Camomilla, Valentina

    2017-09-06

Soft tissue artefact (STA) represents one of the main obstacles for obtaining accurate and reliable skeletal kinematics from motion capture. Many studies have addressed this issue, yet there is no consensus on the best available bone pose estimator and the expected errors associated with relevant results. Furthermore, results obtained by different authors are difficult to compare due to the high variability and specificity of the phenomenon and the different metrics used to represent these data. Therefore, the aim of this study was twofold: firstly, to propose standards for the description of STA; and secondly, to provide illustrative STA data samples for body segments in the upper and lower extremities and for a range of motor tasks, specifically level walking, stair ascent, sit-to-stand, hip- and knee-joint functional movements, cutting motion, running, hopping, arm elevation and functional upper-limb movements. The STA dataset includes motion of the skin markers measured in vivo and ex vivo using stereophotogrammetry as well as motion of the underlying bones measured using invasive or bio-imaging techniques (i.e., X-ray fluoroscopy or MRI). The data are accompanied by a detailed description of the methods used for their acquisition, with information given about their quality as well as characterization of the STA using the proposed standards. The availability of open-access and standard-format STA data will be useful for the evaluation and development of bone pose estimators, thus contributing to the advancement of three-dimensional human movement analysis and its translation into clinical practice and other applications. Copyright © 2017 Elsevier Ltd. All rights reserved.

  10. Fluctuating cognition in dementia with Lewy bodies and Alzheimer's disease is qualitatively distinct

    PubMed Central

    Bradshaw, J; Saling, M; Hopwood, M; Anderson, V; Brodtmann, A

    2004-01-01

Objectives: To document and illustrate qualitative features of fluctuating cognition as described by care givers of patients with probable dementia with Lewy bodies (DLB) and Alzheimer's disease (AD). To determine whether the quality of the fluctuations differs between DLB and AD. To examine the clinical utility of two recently developed rating scales. Methods: Care givers of 13 patients with early probable DLB and 12 patients with early probable AD were interviewed using the Clinician Assessment of Fluctuation and the One Day Fluctuation Assessment Scale, both developed recently. Descriptions of fluctuating cognition were recorded verbatim, analysed, and rated. Results: Descriptions of fluctuating cognition in DLB had a spontaneous, periodic, transient quality, which appeared to reflect an interruption in the ongoing flow of awareness or attention that impacted on functional abilities. Descriptions of fluctuations in AD frequently highlighted episodes of memory failure, or a more enduring state shift in the form of "good" and "bad" days, typically occurring in response to the cognitive demands of the immediate environment. These qualitative differences could be detected reliably by independent raters, but were not always captured in standard severity scores. Conclusion: Fluctuations occurring in DLB have particular characteristics that are distinguishable from fluctuations occurring in AD. Interpretation and application of the fluctuation criterion continues to limit the diagnostic sensitivity of the consensus criteria for DLB. Findings suggest that explicit documentation and a wider appreciation of these distinctions could improve the reliability with which less experienced clinicians identify this core diagnostic feature in the clinical setting. PMID:14966152

  11. Spatial Harmonic Decomposition as a tool for unsteady flow phenomena analysis

    NASA Astrophysics Data System (ADS)

    Duparchy, A.; Guillozet, J.; De Colombel, T.; Bornard, L.

    2014-03-01

Hydropower is already the largest single renewable electricity source today, but its further development will face new deployment constraints such as large-scale projects in emerging economies and the growth of intermittent renewable energy technologies. The potential role of hydropower as a grid stabilizer leads to operating hydro power plants in "off-design" zones. As a result, new methods of analyzing the associated unsteady phenomena are needed to improve the design of hydraulic turbines. The key idea of the development is to compute a spatial description of a phenomenon by using a combination of several sensor signals. The spatial harmonic decomposition (SHD) extends the concept of so-called synchronous and asynchronous pulsations by projecting sensor signals on a linearly independent set of a modal scheme. This mathematical approach is very generic, as it can be applied to any linear distribution of a scalar quantity defined on a closed curve. After a mathematical description of SHD, this paper will discuss the impact of instrumentation and provide tools to understand SHD signals. Then, as an example of a practical application, SHD is applied to a model test measurement in order to capture and describe dynamic pressure fields. In particular, the spatial description of the phenomena provides new tools to separate the part of pressure fluctuations that contributes to output power instability or mechanical stresses. The study of machine stability in the partial load operating range in turbine mode and the comparison between the gap pressure field and radial thrust behavior during turbine brake operation are both relevant illustrations of the SHD contribution.
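For equally spaced sensors on a closed curve, the projection onto spatial modes described above amounts to a spatial DFT across the sensor axis. A minimal sketch, with a synthetic pressure field and an assumed sensor count (not the instrumentation of the paper):

```python
import numpy as np

# Hypothetical setup: N sensors equally spaced on a closed curve record
# pressure p[sensor, time]. A DFT across the sensor axis gives the amplitude
# of each spatial harmonic ("mode") at every time step.
N, nt = 8, 1024
theta = 2 * np.pi * np.arange(N) / N            # sensor angular positions
t = np.linspace(0.0, 1.0, nt, endpoint=False)

# Synthetic field: a synchronous pulsation (mode 0) plus a rotating mode 2.
p = (0.5 * np.sin(2 * np.pi * 3 * t)[None, :]
     + np.cos(2 * theta[:, None] - 2 * np.pi * 10 * t[None, :]))

modes = np.fft.fft(p, axis=0) / N               # spatial harmonic amplitudes vs time
energy = np.mean(np.abs(modes) ** 2, axis=1)    # time-averaged energy per mode

# Mode 2 carries the rotating component; mode 0 the synchronous pulsation.
assert energy[2] > energy[1] and energy[2] > energy[3]
```

Separating mode 0 (synchronous, contributing to output power fluctuation) from higher modes (asynchronous, contributing to mechanical loading) is exactly the kind of decomposition the abstract describes.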

  12. A Broadband High Dynamic Range Digital Receiving System for Electromagnetic Signals

    DTIC Science & Technology

    2010-08-26

dB. [0014] In Steinbrecher (United States Patent No. 7,250,920), an air interface metasurface is described that efficiently captures incident ... broadband electromagnetic energy and provides a method for segmenting the total metasurface capture area into a plurality of smaller capture areas ... such that the sum of the capture areas is equal to the total capture area of the metasurface. The segmentation of the electromagnetic capture area is

  13. Orbital electron capture by the nucleus

    NASA Technical Reports Server (NTRS)

    Bambynek, W.; Behrens, H.; Chen, M. H.; Crasemann, B.; Fitzpatrick, M. L.; Ledingham, K. W. D.; Genz, H.; Mutterer, M.; Intemann, R. L.

    1976-01-01

    The theory of nuclear electron capture is reviewed in the light of current understanding of weak interactions. Experimental methods and results regarding capture probabilities, capture ratios, and EC/Beta(+) ratios are summarized. Radiative electron capture is discussed, including both theory and experiment. Atomic wave function overlap and electron exchange effects are covered, as are atomic transitions that accompany nuclear electron capture. Tables are provided to assist the reader in determining quantities of interest for specific cases.

  14. System Modeling of Metabolic Heat Regenerated Temperature Swing Adsorption (MTSA) Subassembly for Prototype Design

    NASA Technical Reports Server (NTRS)

    Bower, Chad; Padilla, Sebastian; Iacomini, Christie; Paul, Heather L.

    2009-01-01

This paper describes modeling methods for the three core components of a Metabolic heat regenerated Temperature Swing Adsorption (MTSA) subassembly: the sorbent bed, a sublimation (cooling) heat exchanger (SHX), and a condensing icing (warming) heat exchanger (CIHX). The primary function of the MTSA, removing carbon dioxide from a ventilation loop, is performed via the sorbent bed. The CIHX is used to heat the sorbent bed for desorption and to remove moisture from the ventilation loop, while the SHX is alternately employed to cool the sorbent bed via sublimation of a spray of water at low pressure to prepare the reconditioned bed for the next cycle. This paper describes a system level model of the MTSA as developed in Thermal Desktop and SINDA/FLUINT, including assumptions on geometry and physical phenomena, modeling methodology, and relevant parameterizations. Several areas of particular modeling interest are discussed. In the sorbent bed, capture of the translating CO2 saturation front and the associated local energy and mass balance in both adsorbing and desorbing modes is covered. The CIHX poses particular challenges for modeling in SINDA/FLUINT, as accounting for solid states in fluid submodels is not a native capability. Methods for capturing phase change and the latent heat of ice as well as the transport properties across a layer of low density accreted frost are developed. This extended modeling capacity is applicable to temperatures greater than 258 K. To extend applicability to the minimum device temperature of 235 K, a method for a mapped transformation of temperatures from below the limit temperature to some value above is given, along with descriptions of the associated material property transformations and the resulting impacts to total heat and mass transfer. Similar considerations are shown for the SHX, along with assumptions for flow mechanics and resulting model methods for sublimation in a flow.

  15. A numerical study of adaptive space and time discretisations for Gross–Pitaevskii equations

    PubMed Central

    Thalhammer, Mechthild; Abhau, Jochen

    2012-01-01

As a basic principle, benefits of adaptive discretisations are an improved balance between required accuracy and efficiency as well as an enhancement of the reliability of numerical computations. In this work, the capacity of locally adaptive space and time discretisations for the numerical solution of low-dimensional nonlinear Schrödinger equations is investigated. The considered model equation is related to the time-dependent Gross–Pitaevskii equation arising in the description of Bose–Einstein condensates in dilute gases. The performance of the Fourier pseudo-spectral method constrained to uniform meshes versus the locally adaptive finite element method, and of higher-order exponential operator splitting methods with variable time stepsizes, is studied. Numerical experiments confirm that a local time stepsize control based on a posteriori local error estimators or embedded splitting pairs, respectively, is effective in different situations, with an enhancement either in efficiency or reliability. As expected, adaptive time-splitting schemes combined with fast Fourier transform techniques are favourable regarding accuracy and efficiency when applied to Gross–Pitaevskii equations with a defocusing nonlinearity and a mildly varying regular solution. However, the numerical solution of nonlinear Schrödinger equations in the semi-classical regime becomes a demanding task. Due to the highly oscillatory and nonlinear nature of the problem, the spatial mesh size and the time increments need to be of the size of the decisive parameter 0<ε≪1, especially when it is desired to capture correctly the quantitative behaviour of the wave function itself. The required high resolution in space constricts the feasibility of numerical computations for both the Fourier pseudo-spectral and the finite element methods. 
Nevertheless, for smaller parameter values, locally adaptive time discretisations make it possible to choose the time stepsizes small enough that the numerical approximation correctly captures the behaviour of the analytical solution. Further illustrations for Gross–Pitaevskii equations with a focusing nonlinearity or a sharp Gaussian as initial condition, respectively, complement the numerical study. PMID:25550676

  16. A numerical study of adaptive space and time discretisations for Gross-Pitaevskii equations.

    PubMed

    Thalhammer, Mechthild; Abhau, Jochen

    2012-08-15

As a basic principle, benefits of adaptive discretisations are an improved balance between required accuracy and efficiency as well as an enhancement of the reliability of numerical computations. In this work, the capacity of locally adaptive space and time discretisations for the numerical solution of low-dimensional nonlinear Schrödinger equations is investigated. The considered model equation is related to the time-dependent Gross-Pitaevskii equation arising in the description of Bose-Einstein condensates in dilute gases. The performance of the Fourier pseudo-spectral method constrained to uniform meshes versus the locally adaptive finite element method, and of higher-order exponential operator splitting methods with variable time stepsizes, is studied. Numerical experiments confirm that a local time stepsize control based on a posteriori local error estimators or embedded splitting pairs, respectively, is effective in different situations, with an enhancement either in efficiency or reliability. As expected, adaptive time-splitting schemes combined with fast Fourier transform techniques are favourable regarding accuracy and efficiency when applied to Gross-Pitaevskii equations with a defocusing nonlinearity and a mildly varying regular solution. However, the numerical solution of nonlinear Schrödinger equations in the semi-classical regime becomes a demanding task. Due to the highly oscillatory and nonlinear nature of the problem, the spatial mesh size and the time increments need to be of the size of the decisive parameter 0<ε≪1, especially when it is desired to capture correctly the quantitative behaviour of the wave function itself. The required high resolution in space constricts the feasibility of numerical computations for both the Fourier pseudo-spectral and the finite element methods. 
Nevertheless, for smaller parameter values, locally adaptive time discretisations make it possible to choose the time stepsizes small enough that the numerical approximation correctly captures the behaviour of the analytical solution. Further illustrations for Gross-Pitaevskii equations with a focusing nonlinearity or a sharp Gaussian as initial condition, respectively, complement the numerical study.
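The time-splitting schemes discussed in this abstract can be illustrated with a single Strang splitting step for the 1D Gross-Pitaevskii equation, i ψ_t = -½ ψ_xx + V(x) ψ + g |ψ|² ψ, using the FFT for the kinetic part. The grid, trap potential, and parameters below are assumptions for illustration, not the authors' setup:

```python
import numpy as np

L, n, g, dt = 16.0, 256, 1.0, 1e-3
x = np.linspace(-L / 2, L / 2, n, endpoint=False)
k = 2 * np.pi * np.fft.fftfreq(n, d=L / n)     # angular wavenumbers
V = 0.5 * x ** 2                               # harmonic trap

def strang_step(psi, dt):
    # half step of the potential/nonlinear part (a pure phase rotation)
    psi = psi * np.exp(-0.5j * dt * (V + g * np.abs(psi) ** 2))
    # full step of the kinetic part, exact in Fourier space
    psi = np.fft.ifft(np.exp(-0.5j * dt * k ** 2) * np.fft.fft(psi))
    # second half step of the potential/nonlinear part
    return psi * np.exp(-0.5j * dt * (V + g * np.abs(psi) ** 2))

psi = np.exp(-x ** 2) * (2 / np.pi) ** 0.25    # Gaussian initial state
norm0 = np.sum(np.abs(psi) ** 2) * (L / n)
for _ in range(100):
    psi = strang_step(psi, dt)
norm1 = np.sum(np.abs(psi) ** 2) * (L / n)
assert abs(norm1 - norm0) < 1e-10              # each sub-step is unitary
```

Because every sub-step is a unitary operation, the discrete L² norm is conserved to round-off; an adaptive variant would compare this second-order step against an embedded higher-order one to control the stepsize.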

  17. Protein-protein docking using region-based 3D Zernike descriptors

    PubMed Central

    2009-01-01

    Background Protein-protein interactions are a pivotal component of many biological processes and mediate a variety of functions. Knowing the tertiary structure of a protein complex is therefore essential for understanding the interaction mechanism. However, experimental techniques to solve the structure of the complex are often found to be difficult. To this end, computational protein-protein docking approaches can provide a useful alternative to address this issue. Prediction of docking conformations relies on methods that effectively capture shape features of the participating proteins while giving due consideration to conformational changes that may occur. Results We present a novel protein docking algorithm based on the use of 3D Zernike descriptors as regional features of molecular shape. The key motivation of using these descriptors is their invariance to transformation, in addition to a compact representation of local surface shape characteristics. Docking decoys are generated using geometric hashing, which are then ranked by a scoring function that incorporates a buried surface area and a novel geometric complementarity term based on normals associated with the 3D Zernike shape description. Our docking algorithm was tested on both bound and unbound cases in the ZDOCK benchmark 2.0 dataset. In 74% of the bound docking predictions, our method was able to find a near-native solution (interface C-αRMSD ≤ 2.5 Å) within the top 1000 ranks. For unbound docking, among the 60 complexes for which our algorithm returned at least one hit, 60% of the cases were ranked within the top 2000. Comparison with existing shape-based docking algorithms shows that our method has a better performance than the others in unbound docking while remaining competitive for bound docking cases. Conclusion We show for the first time that the 3D Zernike descriptors are adept in capturing shape complementarity at the protein-protein interface and useful for protein docking prediction. 
Rigorous benchmark studies show that our docking approach has a superior performance compared to existing methods. PMID:20003235

  18. Protein-protein docking using region-based 3D Zernike descriptors.

    PubMed

    Venkatraman, Vishwesh; Yang, Yifeng D; Sael, Lee; Kihara, Daisuke

    2009-12-09

Protein-protein interactions are a pivotal component of many biological processes and mediate a variety of functions. Knowing the tertiary structure of a protein complex is therefore essential for understanding the interaction mechanism. However, experimental techniques to solve the structure of the complex are often found to be difficult. To this end, computational protein-protein docking approaches can provide a useful alternative to address this issue. Prediction of docking conformations relies on methods that effectively capture shape features of the participating proteins while giving due consideration to conformational changes that may occur. We present a novel protein docking algorithm based on the use of 3D Zernike descriptors as regional features of molecular shape. The key motivation of using these descriptors is their invariance to transformation, in addition to a compact representation of local surface shape characteristics. Docking decoys are generated using geometric hashing, which are then ranked by a scoring function that incorporates a buried surface area and a novel geometric complementarity term based on normals associated with the 3D Zernike shape description. Our docking algorithm was tested on both bound and unbound cases in the ZDOCK benchmark 2.0 dataset. In 74% of the bound docking predictions, our method was able to find a near-native solution (interface C-α RMSD ≤ 2.5 Å) within the top 1000 ranks. For unbound docking, among the 60 complexes for which our algorithm returned at least one hit, 60% of the cases were ranked within the top 2000. Comparison with existing shape-based docking algorithms shows that our method has a better performance than the others in unbound docking while remaining competitive for bound docking cases. We show for the first time that the 3D Zernike descriptors are adept in capturing shape complementarity at the protein-protein interface and useful for protein docking prediction. 
Rigorous benchmark studies show that our docking approach has a superior performance compared to existing methods.

  19. 3D Data Acquisition Platform for Human Activity Understanding

    DTIC Science & Technology

    2016-03-02

3D data. The support for the acquisition of such research instrumentation has significantly facilitated our current and future research and education ... In this project, we incorporated motion capture devices, 3D vision sensors, and EMG sensors to cross-validate ... multimodality data acquisition, and address fundamental research problems of representation and invariant description of 3D data, human motion modeling and

  20. Metabolic Host Responses to Malarial Infection during the Intraerythrocytic Developmental Cycle

    DTIC Science & Technology

    2016-08-08

by reproducing the experimentally determined 1) stage-specific production of biomass components and their precursors in the parasite and 2) metabolite ... uptake, allow for the prediction of cellular growth (biomass accumulation) and other phenotypic functions related to metabolism [9]. For example ... our group to capture stage-specific growth phenotypes and biomass metabolite production [15]. Among these metabolic descriptions, only the network

  1. Relativistic Photoionization Computations with the Time Dependent Dirac Equation

    DTIC Science & Technology

    2016-10-12

fields often occurs in the relativistic regime. A complete description of this phenomenon requires both relativistic and quantum mechanical treatment ... photoionization, or other relativistic quantum electronics problems. While the Klein-Gordon equation captures much of the relevant physics, especially ... for moderately heavy ions (Z 137), it does neglect the spin polarization of the electron. This memo parallels [1], but replaces the Klein-Gordon

  2. Handbook of capture-recapture analysis

    USGS Publications Warehouse

    Amstrup, Steven C.; McDonald, Trent L.; Manly, Bryan F.J.

    2005-01-01

Every day, biologists in parkas, raincoats, and rubber boots go into the field to capture and mark a variety of animal species. Back in the office, statisticians create analytical models for the field biologists' data. But many times, representatives of the two professions do not fully understand one another's roles. This book bridges this gap by helping biologists understand state-of-the-art statistical methods for analyzing capture-recapture data. In so doing, statisticians will also become more familiar with the design of field studies and with the real-life issues facing biologists. Reliable outcomes of capture-recapture studies are vital to answering key ecological questions. Is the population increasing or decreasing? Do more or fewer animals have a particular characteristic? In answering these questions, biologists cannot hope to capture and mark entire populations. And frequently, the populations change unpredictably during a study. Thus, increasingly sophisticated models have been employed to convert data into answers to ecological questions. This book, by experts in capture-recapture analysis, introduces the most up-to-date methods for data analysis while explaining the theory behind those methods. Thorough, concise, and portable, it will be immensely useful to biologists, biometricians, and statisticians, students in both fields, and anyone else engaged in the capture-recapture process.
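The simplest model in the capture-recapture family the handbook builds on is the two-sample Lincoln-Petersen estimator; a minimal sketch of its bias-corrected (Chapman) form, with made-up numbers rather than data from the book:

```python
def chapman_estimate(n1, n2, m2):
    """Chapman's bias-corrected form of the Lincoln-Petersen estimator.

    n1 -- animals captured and marked in the first session
    n2 -- animals captured in the second session
    m2 -- marked animals among the second-session captures
    """
    return (n1 + 1) * (n2 + 1) / (m2 + 1) - 1

# Example: 200 animals marked; 150 caught later, 30 of them carrying marks.
n_hat = chapman_estimate(200, 150, 30)
assert 970 < n_hat < 985   # near the naive n1*n2/m2 = 1000, corrected downward
```

The naive ratio n1·n2/m2 assumes equal catchability and a closed population; the more sophisticated models the book covers relax exactly those assumptions.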

  3. Remote chemical immobilisation method for free-ranging Australian cattle.

    PubMed

    Hampton, J O; Skroblin, A; Perry, A L; De Ridder, T R

    2016-12-01

Many situations are encountered in Australia where the capture and restraint of free-ranging cattle (Bos taurus/Bos indicus) is required. Chemical immobilisation via darting is a potentially useful tool for managing and researching large wild herbivores; however, there is no reliable method for its application to Australian cattle. The aim of this study was to develop an efficacious, humane, cost-effective ground darting method for free-ranging cattle. Thirty female cattle were darted and captured from a vehicle on a pastoral station in north-west Australia. Xylazine (0.59 mg/kg) and ketamine (3.59 mg/kg) were used to capture animals, and yohimbine (0.10 mg/kg) was used as an antagonist to xylazine to reduce recumbent time. Cattle became recumbent at a mean time of 8 min and a mean distance of 260 m from darting. The mortality rate was zero on the day of capture and 7% at 14 days post-capture. The majority of darted cattle were successfully immobilised with one dart and recovered within 30 min, with consumables costing approximately A$30 per captured animal. The technique developed represents a rapid and humane method for capturing free-ranging cattle and, with consideration for legislation surrounding the use of veterinary chemicals, could be applied in many contexts across Australia. © 2016 Australian Veterinary Association.

  4. Hydrodynamic description of spin Calogero-Sutherland model

    NASA Astrophysics Data System (ADS)

    Abanov, Alexander; Kulkarni, Manas; Franchini, Fabio

    2009-03-01

    We study a non-linear collective field theory for an integrable spin-Calogero-Sutherland model. The hydrodynamic description of this SU(2) model in terms of charge density, charge velocity and spin currents is used to study non-perturbative solutions (solitons) and examine their correspondence with known quantum numbers of elementary excitations [1]. A conventional linear bosonization or harmonic approximation is not sufficient to describe, for example, the physics of spin-charge (non)separation. Therefore, we need this new collective bosonic field description that captures the effects of the band curvature. In the strong coupling limit [2] this model reduces to the integrable SU(2) Haldane-Shastry model. We study a non-linear coupling of left and right spin currents which form a Kac-Moody algebra. Our quantum hydrodynamic description for the spin case is an extension of the one found for the spinless version in [3]. References: [1] Y. Kato, T. Yamamoto, and M. Arikawa, J. Phys. Soc. Jpn. 66, 1954-1961 (1997). [2] A. Polychronakos, Phys. Rev. Lett. 70, 2329-2331 (1993). [3] A. G. Abanov and P. B. Wiegmann, Phys. Rev. Lett. 95, 076402 (2005).

  5. Minimally inconsistent reasoning in Semantic Web.

    PubMed

    Zhang, Xiaowang

    2017-01-01

    Reasoning with inconsistencies is an important issue for the Semantic Web, as imperfect information is unavoidable in real applications. For this, different paraconsistent approaches, which can draw nontrivial conclusions while tolerating inconsistencies, have been proposed to reason with inconsistent description logic knowledge bases. However, existing paraconsistent approaches are often criticized for being too skeptical. To this end, this paper presents a non-monotonic paraconsistent version of description logic reasoning, called minimally inconsistent reasoning, in which the inconsistencies tolerated in the reasoning are minimized so that more reasonable conclusions can be inferred. Several desirable properties are studied, showing that the new semantics inherits the advantages of both non-monotonic reasoning and paraconsistent reasoning. A sound and complete tableau-based algorithm, called multi-valued tableaux, is developed to capture minimally inconsistent reasoning. The tableaux algorithm is designed as a framework for multi-valued DLs, accommodating different underlying paraconsistent semantics whose only difference lies in the clash conditions. Finally, the complexity of minimally inconsistent description logic reasoning is shown to be at the same level as that of (classical) description logic reasoning.
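
As a toy illustration of the kind of multi-valued semantics that can underlie such paraconsistent reasoning (the classical Belnap four-valued lattice, which is a common basis for paraconsistent DLs; this is not the paper's tableau algorithm), conjunction can be computed as the meet in the truth order:

```python
# Belnap's four-valued logic FOUR, a common basis for paraconsistent
# semantics (illustrative only; not the paper's multi-valued tableaux).
# T = true, F = false, B = both (contradictory), N = neither (unknown).

def conj(a: str, b: str) -> str:
    """Conjunction = meet in the truth order of FOUR."""
    if a == b:
        return a
    if "F" in (a, b):          # F is the bottom of the truth order
        return "F"
    if "T" in (a, b):          # T is the top: T meet x = x
        return a if b == "T" else b
    return "F"                 # B and N are incomparable; their meet is F

# A contradictory atom (B) does not trivialize reasoning: conjunction
# with ordinary truth values remains informative.
print(conj("B", "T"), conj("B", "F"), conj("B", "N"))  # B F F
```

This is the sense in which paraconsistent semantics "tolerate" inconsistency: a B-valued atom does not make every formula derivable, unlike classical explosion.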

  6. Minimally inconsistent reasoning in Semantic Web

    PubMed Central

    Zhang, Xiaowang

    2017-01-01

    Reasoning with inconsistencies is an important issue for the Semantic Web, as imperfect information is unavoidable in real applications. For this, different paraconsistent approaches, which can draw nontrivial conclusions while tolerating inconsistencies, have been proposed to reason with inconsistent description logic knowledge bases. However, existing paraconsistent approaches are often criticized for being too skeptical. To this end, this paper presents a non-monotonic paraconsistent version of description logic reasoning, called minimally inconsistent reasoning, in which the inconsistencies tolerated in the reasoning are minimized so that more reasonable conclusions can be inferred. Several desirable properties are studied, showing that the new semantics inherits the advantages of both non-monotonic reasoning and paraconsistent reasoning. A sound and complete tableau-based algorithm, called multi-valued tableaux, is developed to capture minimally inconsistent reasoning. The tableaux algorithm is designed as a framework for multi-valued DLs, accommodating different underlying paraconsistent semantics whose only difference lies in the clash conditions. Finally, the complexity of minimally inconsistent description logic reasoning is shown to be at the same level as that of (classical) description logic reasoning. PMID:28750030

  7. Halogenated sulfidohydroboranes for nuclear medicine and boron neutron capture therapy

    DOEpatents

    Miura, Michiko; Slatkin, Daniel N.

    1997-03-18

    A method for performing boron neutron capture therapy for the treatment of tumors is disclosed. The method includes administering to a patient an iodinated sulfidohydroborane, a boron-10-containing compound. The site of the tumor is localized by visualizing the increased concentration of the iodine labelled compound at the tumor. The targeted tumor is then irradiated with a beam of neutrons having an energy distribution effective for neutron capture. Destruction of the tumor occurs due to high LET particle irradiation of the tissue secondary to the incident neutrons being captured by the boron-10 nuclei. Iodinated sulfidohydroboranes are disclosed which are especially suitable for the method of the invention. In a preferred embodiment, a compound having the formula Na₄B₁₂I₁₁SSB₁₂I₁₁, or another pharmaceutically acceptable salt of the compound, may be administered to a cancer patient for boron neutron capture therapy.

  8. Halogenated sulfidohydroboranes for nuclear medicine and boron neutron capture therapy

    DOEpatents

    Miura, Michiko; Slatkin, Daniel N.

    1995-10-03

    A method for performing boron neutron capture therapy for the treatment of tumors is disclosed. The method includes administering to a patient an iodinated sulfidohydroborane, a boron-10-containing compound. The site of the tumor is localized by visualizing the increased concentration of the iodine labelled compound at the tumor. The targeted tumor is then irradiated with a beam of neutrons having an energy distribution effective for neutron capture. Destruction of the tumor occurs due to high LET particle irradiation of the tissue secondary to the incident neutrons being captured by the boron-10 nuclei. Iodinated sulfidohydroboranes are disclosed which are especially suitable for the method of the invention. In a preferred embodiment, a compound having the formula Na₄B₁₂I₁₁SSB₁₂I₁₁, or another pharmaceutically acceptable salt of the compound, may be administered to a cancer patient for boron neutron capture therapy.

  9. Halogenated sulfidohydroboranes for nuclear medicine and boron neutron capture therapy

    DOEpatents

    Miura, Michiko; Slatkin, Daniel N.

    1997-08-05

    A method for performing boron neutron capture therapy for the treatment of tumors is disclosed. The method includes administering to a patient an iodinated sulfidohydroborane, a boron-10-containing compound. The site of the tumor is localized by visualizing the increased concentration of the iodine labelled compound at the tumor. The targeted tumor is then irradiated with a beam of neutrons having an energy distribution effective for neutron capture. Destruction of the tumor occurs due to high LET particle irradiation of the tissue secondary to the incident neutrons being captured by the boron-10 nuclei. Iodinated sulfidohydroboranes are disclosed which are especially suitable for the method of the invention. In a preferred embodiment, a compound having the formula Na₄B₁₂I₁₁SSB₁₂I₁₁, or another pharmaceutically acceptable salt of the compound, may be administered to a cancer patient for boron neutron capture therapy.

  10. Halogenated sulfidohydroboranes for nuclear medicine and boron neutron capture therapy

    DOEpatents

    Miura, M.; Slatkin, D.N.

    1995-10-03

    A method for performing boron neutron capture therapy for the treatment of tumors is disclosed. The method includes administering to a patient an iodinated sulfidohydroborane, a boron-10-containing compound. The site of the tumor is localized by visualizing the increased concentration of the iodine labelled compound at the tumor. The targeted tumor is then irradiated with a beam of neutrons having an energy distribution effective for neutron capture. Destruction of the tumor occurs due to high LET particle irradiation of the tissue secondary to the incident neutrons being captured by the boron-10 nuclei. Iodinated sulfidohydroboranes are disclosed which are especially suitable for the method of the invention. In a preferred embodiment, a compound having the formula Na₄B₁₂I₁₁SSB₁₂I₁₁, or another pharmaceutically acceptable salt of the compound, may be administered to a cancer patient for boron neutron capture therapy. 1 fig.

  11. Halogenated sulfidohydroboranes for nuclear medicine and boron neutron capture therapy

    DOEpatents

    Miura, M.; Slatkin, D.N.

    1997-03-18

    A method for performing boron neutron capture therapy for the treatment of tumors is disclosed. The method includes administering to a patient an iodinated sulfidohydroborane, a boron-10-containing compound. The site of the tumor is localized by visualizing the increased concentration of the iodine labelled compound at the tumor. The targeted tumor is then irradiated with a beam of neutrons having an energy distribution effective for neutron capture. Destruction of the tumor occurs due to high LET particle irradiation of the tissue secondary to the incident neutrons being captured by the boron-10 nuclei. Iodinated sulfidohydroboranes are disclosed which are especially suitable for the method of the invention. In a preferred embodiment, a compound having the formula Na₄B₁₂I₁₁SSB₁₂I₁₁, or another pharmaceutically acceptable salt of the compound, may be administered to a cancer patient for boron neutron capture therapy. 1 fig.

  12. Halogenated sulfidohydroboranes for nuclear medicine and boron neutron capture therapy

    DOEpatents

    Miura, M.; Slatkin, D.N.

    1997-08-05

    A method for performing boron neutron capture therapy for the treatment of tumors is disclosed. The method includes administering to a patient an iodinated sulfidohydroborane, a boron-10-containing compound. The site of the tumor is localized by visualizing the increased concentration of the iodine labelled compound at the tumor. The targeted tumor is then irradiated with a beam of neutrons having an energy distribution effective for neutron capture. Destruction of the tumor occurs due to high LET particle irradiation of the tissue secondary to the incident neutrons being captured by the boron-10 nuclei. Iodinated sulfidohydroboranes are disclosed which are especially suitable for the method of the invention. In a preferred embodiment, a compound having the formula Na₄B₁₂I₁₁SSB₁₂I₁₁, or another pharmaceutically acceptable salt of the compound, may be administered to a cancer patient for boron neutron capture therapy. 1 fig.

  13. Does the use of vaginal-implant transmitters affect neonate survival rate of white-tailed deer Odocoileus virginianus?

    USGS Publications Warehouse

    Swanson, C.C.; Jenks, J.A.; DePerno, C.S.; Klaver, R.W.; Osborn, R.G.; Tardiff, J.A.

    2008-01-01

    We compared survival of neonate white-tailed deer Odocoileus virginianus captured using vaginal-implant transmitters (VITs) and traditional ground searches to determine if capture method affects neonate survival. During winter 2003, 14 adult female radio-collared deer were fitted with VITs to aid in the spring capture of neonates; neonates were captured using VITs (N=14) and traditional ground searches (N=7). Of the VITs, seven (50%) resulted in the location of birth sites and the capture of 14 neonates. However, seven (50%) VITs were prematurely expelled prior to parturition. Predation accounted for seven neonate mortalities, and of these, five were neonates captured using VITs. During summer 2003, survival for neonates captured using VITs one, two, and three months post capture was 0.76 (SE=0.05; N=14), 0.64 (SE=0.07; N=11) and 0.64 (SE=0.08; N=9), respectively. Neonate survival one, two and three months post capture for neonates captured using ground searches was 0.71 (SE=0.11; N=7), 0.71 (SE=0.15; N=5) and 0.71 (SE=0.15; N=5), respectively. Although 71% of neonates that died were captured <24 hours after birth using VITs, survival did not differ between capture methods. Therefore, use of VITs to capture neonate white-tailed deer did not influence neonate survival. VITs enabled us to capture neonates in dense habitats which would have been difficult to locate using traditional ground searches.

  14. A level set method for determining critical curvatures for drainage and imbibition.

    PubMed

    Prodanović, Masa; Bryant, Steven L

    2006-12-15

    An accurate description of the mechanics of pore level displacement of immiscible fluids could significantly improve the predictions from pore network models of capillary pressure-saturation curves, interfacial areas and relative permeability in real porous media. If we assume quasi-static displacement, at constant pressure and surface tension, pore scale interfaces are modeled as constant mean curvature surfaces, which are not easy to calculate. Moreover, the extremely irregular geometry of natural porous media makes it difficult to evaluate surface curvature values and corresponding geometric configurations of two fluids. Finally, accounting for the topological changes of the interface, such as splitting or merging, is nontrivial. We apply the level set method for tracking and propagating interfaces in order to robustly handle topological changes and to obtain geometrically correct interfaces. We describe a simple but robust model for determining critical curvatures for throat drainage and pore imbibition. The model is set up for quasi-static displacements but it nevertheless captures both reversible and irreversible behavior (Haines jump, pore body imbibition). The pore scale grain boundary conditions are extracted from model porous media and from imaged geometries in real rocks. The method gives quantitative agreement with measurements and with other theories and computational approaches.
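
The curvature evaluation at the heart of this approach can be sketched in a few lines. The following is a minimal illustration (not the authors' code) of computing the interface curvature kappa = div(grad phi / |grad phi|) from a level set function on a grid, checked against a circular interface where the exact value is 1/R:

```python
import numpy as np

# Minimal level set sketch: represent the interface as the zero set of
# phi and evaluate its curvature kappa = div(grad phi / |grad phi|)
# with second-order finite differences.

n, L = 201, 2.0
x = np.linspace(-L, L, n)
h = x[1] - x[0]
X, Y = np.meshgrid(x, x, indexing="ij")

R = 1.0
phi = np.sqrt(X**2 + Y**2) - R          # signed distance to a circle

phix, phiy = np.gradient(phi, h, h)     # grad phi
norm = np.sqrt(phix**2 + phiy**2) + 1e-12
nx, ny = phix / norm, phiy / norm       # unit normal field
kappa = np.gradient(nx, h, axis=0) + np.gradient(ny, h, axis=1)

# On the interface the exact curvature is 1/R = 1; sample near (R, 0).
i, j = np.argmin(np.abs(x - R)), np.argmin(np.abs(x))
print(kappa[i, j])  # close to 1.0
```

In the paper's setting, the same machinery runs inside the irregular pore geometry, where critical curvatures for drainage and imbibition are found by tracking when such interfaces jump or merge.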

  15. Gradient Augmented Level Set Method for Two Phase Flow Simulations with Phase Change

    NASA Astrophysics Data System (ADS)

    Anumolu, C. R. Lakshman; Trujillo, Mario F.

    2016-11-01

    A sharp interface capturing approach is presented for two-phase flow simulations with phase change. The Gradient Augmented Levelset method is coupled with the two-phase momentum and energy equations to advect the liquid-gas interface and predict heat transfer with phase change. The Ghost Fluid Method (GFM) is adopted for velocity to discretize the advection and diffusion terms in the interfacial region. Furthermore, the GFM is employed to treat the discontinuity in the stress tensor, velocity, and temperature gradient yielding an accurate treatment in handling jump conditions. Thermal convection and diffusion terms are approximated by explicitly identifying the interface location, resulting in a sharp treatment for the energy solution. This sharp treatment is extended to estimate the interfacial mass transfer rate. At the computational cell, a d-cubic Hermite interpolating polynomial is employed to describe the interface location, which is locally fourth-order accurate. This extent of subgrid level description provides an accurate methodology for treating various interfacial processes with a high degree of sharpness. The ability to predict the interface and temperature evolutions accurately is illustrated by comparing numerical results with existing 1D to 3D analytical solutions.

  16. Participatory evaluation (I)--sharing lessons from fieldwork in Asia.

    PubMed

    Crishna, B

    2007-05-01

    There is a need to study methodologies for evaluating social development projects. Traditional methods of evaluation are often not able to capture or measure the 'spirit of change' in people, which is the very essence of human development. Using participatory methodologies is a positive way to ensure that evaluations encourage an understanding of the value of critical analysis among service providers and other stakeholders. Participatory evaluation provides a systematic process of learning through experience. Practical experience of conducting a number of evaluation studies in social development projects has led the author to develop four basic principles of participatory evaluation strategies, further conceptualized through an extensive literature search. The article develops and shares these principles through descriptions of field experiences in Asia. It illustrates that any evaluation remains a learning process, one which promotes a climate of reflection and self-assessment, and it shows how using participatory methods can create this environment of learning. However, one needs to keep in mind that participatory evaluation takes time, and that the role and calibre of the facilitator are crucial. Participatory evaluation methods have been recommended for social development projects to ensure that stakeholders remain in control of their own lives and decisions.

  17. Qualitative Analysis of E-Liquid Emissions as a Function of Flavor Additives Using Two Aerosol Capture Methods.

    PubMed

    Eddingsaas, Nathan; Pagano, Todd; Cummings, Cody; Rahman, Irfan; Robinson, Risa; Hensel, Edward

    2018-02-13

    This work investigates emissions sampling methods employed for qualitative identification of compounds in e-liquids and their resultant aerosols, to assess which capture methods may be sufficient to identify the harmful and potentially harmful constituents present. Three popular e-liquid flavors (cinnamon, mango, vanilla) were analyzed using qualitative gas chromatography-mass spectrometry (GC-MS) in the un-puffed state. Each liquid was also machine-puffed under realistic-use flow rate conditions and emissions were captured using two techniques: filter pads and methanol impingers. GC-MS analysis was conducted on the emissions captured using both techniques from all three e-liquids. The e-liquid GC-MS analysis resulted in positive identification of 13 compounds from the cinnamon flavor e-liquid, 31 from mango, and 19 from vanilla, including a number of compounds observed in all e-liquid experiments. Nineteen compounds were observed in emissions that were not present in the un-puffed e-liquid. Qualitative GC-MS analysis of the emissions samples identified compounds observed in all three sample types: e-liquid, impinger, and filter pad, and in each subset thereof. A limited number of compounds were observed in emissions captured with impingers but not in emissions captured using filter pads; a larger number of compounds were observed in emissions collected from the filter pads but not in those captured with impingers. It is demonstrated that the sampling methods have different sampling efficiencies and that some compounds might be missed using only one method. It is recommended to investigate filter pads, impingers, thermal desorption tubes, and solvent extraction resins to establish robust sampling methods for e-cigarette emissions testing.

  18. Qualitative Analysis of E-Liquid Emissions as a Function of Flavor Additives Using Two Aerosol Capture Methods

    PubMed Central

    Eddingsaas, Nathan; Pagano, Todd; Cummings, Cody; Rahman, Irfan; Robinson, Risa

    2018-01-01

    This work investigates emissions sampling methods employed for qualitative identification of compounds in e-liquids and their resultant aerosols, to assess which capture methods may be sufficient to identify the harmful and potentially harmful constituents present. Three popular e-liquid flavors (cinnamon, mango, vanilla) were analyzed using qualitative gas chromatography-mass spectrometry (GC-MS) in the un-puffed state. Each liquid was also machine-puffed under realistic-use flow rate conditions and emissions were captured using two techniques: filter pads and methanol impingers. GC-MS analysis was conducted on the emissions captured using both techniques from all three e-liquids. The e-liquid GC-MS analysis resulted in positive identification of 13 compounds from the cinnamon flavor e-liquid, 31 from mango, and 19 from vanilla, including a number of compounds observed in all e-liquid experiments. Nineteen compounds were observed in emissions that were not present in the un-puffed e-liquid. Qualitative GC-MS analysis of the emissions samples identified compounds observed in all three sample types: e-liquid, impinger, and filter pad, and in each subset thereof. A limited number of compounds were observed in emissions captured with impingers but not in emissions captured using filter pads; a larger number of compounds were observed in emissions collected from the filter pads but not in those captured with impingers. It is demonstrated that the sampling methods have different sampling efficiencies and that some compounds might be missed using only one method. It is recommended to investigate filter pads, impingers, thermal desorption tubes, and solvent extraction resins to establish robust sampling methods for e-cigarette emissions testing. PMID:29438289

  19. Rare Cell Capture in Microfluidic Devices

    PubMed Central

    Pratt, Erica D.; Huang, Chao; Hawkins, Benjamin G.; Gleghorn, Jason P.; Kirby, Brian J.

    2010-01-01

    This article reviews existing methods for the isolation, fractionation, or capture of rare cells in microfluidic devices. Rare cell capture devices face the challenge of maintaining the efficiency standard of traditional bulk separation methods such as flow cytometers and immunomagnetic separators while requiring very high purity of the target cell population, which is typically already at very low starting concentrations. Two major classifications of rare cell capture approaches are covered: (1) non-electrokinetic methods (e.g., immobilization via antibody or aptamer chemistry, size-based sorting, and sheath flow and streamline sorting) are discussed for applications using blood cells, cancer cells, and other mammalian cells, and (2) electrokinetic (primarily dielectrophoretic) methods using both electrode-based and insulative geometries are presented with a view towards pathogen detection, blood fractionation, and cancer cell isolation. The included methods were evaluated based on performance criteria including cell type modeled and used, number of steps/stages, cell viability, and enrichment, efficiency, and/or purity. Major areas for improvement are increasing viability and capture efficiency/purity of directly processed biological samples, as a majority of current studies only process spiked cell lines or pre-diluted/lysed samples. Despite these current challenges, multiple advances have been made in the development of devices for rare cell capture and the subsequent elucidation of new biological phenomena; this article serves to highlight this progress as well as the electrokinetic and non-electrokinetic methods that can potentially be combined to improve performance in future studies. PMID:21532971
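
The performance criteria named above (capture efficiency, purity, enrichment) have conventional definitions that are easy to state concretely; the following sketch uses illustrative counts, not data from the review:

```python
# Common rare cell capture performance metrics, using the conventional
# definitions (illustrative; the counts below are not from the review).

def capture_metrics(target_in, nontarget_in, target_cap, nontarget_cap):
    efficiency = target_cap / target_in                 # fraction of targets recovered
    purity = target_cap / (target_cap + nontarget_cap)  # on-target fraction of captures
    input_purity = target_in / (target_in + nontarget_in)
    enrichment = purity / input_purity                  # fold purification over the input
    return efficiency, purity, enrichment

# 100 rare cells spiked into a million background cells; 80 captured,
# along with 20 contaminating background cells.
eff, pur, enr = capture_metrics(100, 1_000_000, 80, 20)
print(eff, pur, enr)  # 0.8 efficiency, 0.8 purity, ~8000-fold enrichment
```

The tension the review highlights is visible here: at very low starting concentrations even a high-purity capture corresponds to enormous required enrichment factors.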

  20. Specification and Design of Electrical Flight System Architectures with SysML

    NASA Technical Reports Server (NTRS)

    McKelvin, Mark L., Jr.; Jimenez, Alejandro

    2012-01-01

    Modern space flight systems are required to perform more complex functions than previous generations to support space missions. This demand is driving the trend to deploy more electronics to realize system functionality. The traditional approach for the specification, design, and deployment of electrical system architectures in space flight systems includes the use of informal definitions and descriptions that are often embedded within loosely coupled but highly interdependent design documents. Traditional methods become inefficient to cope with increasing system complexity, evolving requirements, and the ability to meet project budget and time constraints. Thus, there is a need for more rigorous methods to capture the relevant information about the electrical system architecture as the design evolves. In this work, we propose a model-centric approach to support the specification and design of electrical flight system architectures using the System Modeling Language (SysML). In our approach, we develop a domain specific language for specifying electrical system architectures, and we propose a design flow for the specification and design of electrical interfaces. Our approach is applied to a practical flight system.

  1. A soft-rigid contact model of MPM for granular flow impact on retaining structures

    NASA Astrophysics Data System (ADS)

    Li, Xinpo; Xie, Yanfang; Gutierrez, Marte

    2018-02-01

    Protective measures against hazards associated with rapid debris avalanches include a variety of retaining structures such as rock/boulder fences, gabions, earthfill barriers and retaining walls. However, the development of analytical and numerical methods for the rational assessment of the impact force generated by granular flows is still a challenge. In this work, a soft-rigid contact model is built within the coding framework of the material point method (MPM), a hybrid method with an Eulerian-Lagrangian description. The soft bodies are discretized into particles (material points), and the rigid bodies are represented by rigid node-based surfaces. A Coulomb friction model is used to implement the contact mechanics, and a velocity-dependent friction coefficient is coupled into the model. Simulations of a physical experiment show that the peak and residual values of the impact force are well captured by the MPM model. An idealized scenario of a debris avalanche flowing down a hillslope and impacting a retaining wall is then analyzed using the MPM model. The calculated forces can provide a quantitative estimate from which mound design could proceed for practical implementation in the field.
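
The contact rule described above can be sketched as a Coulomb friction cap whose coefficient depends on slip speed. The mu(v) form and all parameter values below are assumed for illustration, not taken from the paper:

```python
import math

# Hypothetical sketch of a Coulomb friction contact rule with a
# velocity-dependent coefficient (illustrative parameters only).

def mu_of_speed(v: float, mu_s=0.5, mu_k=0.3, c=2.0) -> float:
    """Friction coefficient decaying from static mu_s to kinetic mu_k."""
    return mu_k + (mu_s - mu_k) * math.exp(-c * abs(v))

def contact_friction(t_trial: float, normal_force: float, slip_speed: float) -> float:
    """Tangential contact force capped by the Coulomb cone mu(v) * N."""
    cap = mu_of_speed(slip_speed) * normal_force
    if abs(t_trial) <= cap:               # stick: the trial force is admissible
        return t_trial
    return math.copysign(cap, t_trial)    # slip: clamp onto the friction cone

print(contact_friction(0.2, 1.0, 0.0))  # 0.2 (stick: below mu_s * N = 0.5)
print(contact_friction(0.8, 1.0, 5.0))  # ~0.3 (slip: clamped near mu_k * N)
```

In an MPM contact algorithm this cap would be applied at each rigid-surface node when correcting the material point velocities against the node-based surface.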

  2. A New Self-Consistent Field Model of Polymer/Nanoparticle Mixture

    NASA Astrophysics Data System (ADS)

    Chen, Kang; Li, Hui-Shu; Zhang, Bo-Kai; Li, Jian; Tian, Wen-De

    2016-02-01

    Field-theoretical methods are efficient in predicting the assembled structures of polymeric systems. However, it is challenging to generalize these methods to polymer/nanoparticle mixtures due to their multi-scale nature. Here, we develop a new field-based model which unifies the nanoparticle description with the polymer field within self-consistent field theory. Instead of being an “ensemble-averaged” continuous distribution, the particle density in the final morphology can represent individual particles located at preferred positions. The discreteness of the particle density allows our model to properly address the polymer-particle interface and the excluded-volume interaction. We use this model to study the simplest system: nanoparticles immersed in a dense homopolymer solution. The flexibility of tuning the interfacial details allows our model to capture rich phenomena such as bridging aggregation and depletion attraction. Insights are obtained on the enthalpic and/or entropic origin of the structural variation due to the competition between depletion and interfacial interaction. This approach is readily extendable to the study of more complex polymer-based nanocomposites or biology-related systems, such as dendrimer/drug encapsulation and membrane/particle assembly.

  3. THE ROLE OF SELF-INJURY IN THE ORGANIZATION OF BEHAVIOUR

    PubMed Central

    Sandman, Curt A.; Kemp, Aaron S.; Mabini, Christopher; Pincus, David; Magnusson, Magnus

    2012-01-01

    Background Self-injuring acts are among the most dramatic behaviours exhibited by human beings. There is no known single cause and there is no universally agreed upon treatment. Sophisticated sequential and temporal analysis of behaviour has provided alternative descriptions of self-injury that offer new insights into its initiation and maintenance. Method Forty hours of observations for each of 32 participants were collected in a contiguous two-week period. Twenty categories of behavioural and environmental events were recorded electronically, capturing the precise time each observation occurred. Temporal behavioural/environmental patterns associated with self-injurious events were revealed with a method (t-patterns; THEME) for detecting non-linear, real-time patterns. Results Results indicated that acts of self-injury contributed both to more patterns and to more complex patterns. Moreover, self-injury left its imprint on the organization of behaviour even when counts of self-injury were expelled from the continuous record. Conclusions The behaviour of participants was organized in a more diverse array of patterns when self-injurious behaviour (SIB) was present. Self-injuring acts may function as singular points, increasing coherence within self-organizing patterns of behaviour. PMID:22452417

  4. An approximate methods approach to probabilistic structural analysis

    NASA Technical Reports Server (NTRS)

    Mcclung, R. C.; Millwater, H. R.; Wu, Y.-T.; Thacker, B. H.; Burnside, O. H.

    1989-01-01

    A major research and technology program in Probabilistic Structural Analysis Methods (PSAM) is currently being sponsored by the NASA Lewis Research Center with Southwest Research Institute as the prime contractor. This program is motivated by the need to accurately predict structural response in an environment where the loadings, the material properties, and even the structure may be considered random. The heart of PSAM is a software package which combines advanced structural analysis codes with a fast probability integration (FPI) algorithm for the efficient calculation of stochastic structural response. The basic idea of PSAM is simple: make an approximate calculation of system response, including calculation of the associated probabilities, with minimal computation time and cost, based on a simplified representation of the geometry, loads, and material. The resulting deterministic solution should give a reasonable and realistic description of performance-limiting system responses, although some error will be inevitable. If the simple model has correctly captured the basic mechanics of the system, however, including the proper functional dependence of stress, frequency, etc. on design parameters, then the response sensitivities calculated may be of significantly higher accuracy.
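
The "fast probability" idea can be illustrated with a mean-value first-order sketch: linearize the response about the input means and read the exceedance probability off the standard normal tail. This is a simplified stand-in, not the actual FPI algorithm, and the response function and parameter values below are assumed for illustration:

```python
import math

# Mean-value first-order second-moment (MVFOSM) sketch of fast
# probability integration (illustrative; not the PSAM/FPI code).

def mvfosm_pof(g, means, stds, limit, h=1e-6):
    """First-order estimate of P[g(X) > limit] for independent normal X."""
    g0 = g(means)
    var = 0.0
    for i, (m, s) in enumerate(zip(means, stds)):
        xp = list(means)
        xp[i] = m + h
        dgdx = (g(xp) - g0) / h                # finite-difference sensitivity
        var += (dgdx * s) ** 2                 # first-order variance propagation
    beta = (limit - g0) / math.sqrt(var)       # reliability index
    return 0.5 * math.erfc(beta / math.sqrt(2.0))  # = 1 - Phi(beta)

# Example response: stress = load / area; "failure" when stress > 100.
stress = lambda x: x[0] / x[1]
p = mvfosm_pof(stress, means=[900.0, 10.0], stds=[90.0, 0.5], limit=100.0)
print(p)  # roughly 0.16: the chance the stress limit is exceeded
```

The sensitivities dgdx computed along the way are exactly the "response sensitivities" the abstract notes can be more accurate than the response magnitudes themselves.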

  5. Controlled viable release of selectively captured label-free cells in microchannels.

    PubMed

    Gurkan, Umut Atakan; Anand, Tarini; Tas, Huseyin; Elkan, David; Akay, Altug; Keles, Hasan Onur; Demirci, Utkan

    2011-12-07

    Selective capture of cells from bodily fluids in microchannels has broadly transformed medicine enabling circulating tumor cell isolation, rapid CD4(+) cell counting for HIV monitoring, and diagnosis of infectious diseases. Although cell capture methods have been demonstrated in microfluidic systems, the release of captured cells remains a significant challenge. Viable retrieval of captured label-free cells in microchannels will enable a new era in biological sciences by allowing cultivation and post-processing. The significant challenge in release comes from the fact that the cells adhere strongly to the microchannel surface, especially when immuno-based immobilization methods are used. Even though fluid shear and enzymes have been used to detach captured cells in microchannels, these methods are known to harm cells and affect cellular characteristics. This paper describes a new technology to release the selectively captured label-free cells in microchannels without the use of fluid shear or enzymes. We have successfully released the captured CD4(+) cells (3.6% of the mononuclear blood cells) from blood in microfluidic channels with high specificity (89% ± 8%), viability (94% ± 4%), and release efficiency (59% ± 4%). We have further validated our system by specifically capturing and controllably releasing the CD34(+) stem cells from whole blood, which were quantified to be 19 cells per million blood cells in the blood samples used in this study. Our results also indicated that both CD4(+) and CD34(+) cells released from the microchannels were healthy and amenable for in vitro culture. This manual-flow-based microfluidic method utilizes inexpensive, easy-to-fabricate microchannels, allows selective label-free cell capture and release in less than 10 minutes, and can also be used at the point-of-care. The presented technology can be used to isolate and purify a broad spectrum of cells from mixed populations, offering widespread applications in applied biological sciences, such as tissue engineering, regenerative medicine, rare cell and stem cell isolation, proteomic/genomic research, and clonal/population analyses.

  6. Effects of capturing and collaring on polar bears: findings from long-term research on the southern Beaufort Sea population

    USGS Publications Warehouse

    Rode, Karyn D.; Pagano, Anthony M.; Bromaghin, Jeffrey F.; Atwood, Todd C.; Durner, George M.; Simac, Kristin S.; Amstrup, Steven C.

    2014-01-01

Context: The potential for research methods to affect wildlife is an increasing concern among both scientists and the public. This topic has a particular urgency for polar bears because additional research is needed to monitor and understand population responses to rapid loss of sea ice habitat. Aims: This study used data collected from polar bears sampled in the Alaska portion of the southern Beaufort Sea to investigate the potential for capture to adversely affect behaviour and vital rates. We evaluated the extent to which capture, collaring and handling may influence activity and movement days to weeks post-capture, and body mass, body condition, reproduction and survival over 6 months or more. Methods: We compared post-capture activity and movement rates, and relationships between prior capture history and body mass, body condition and reproductive success. We also summarised data on capture-related mortality. Key results: Individual-based estimates of activity and movement rates reached near-normal levels within 2–3 days and fully normal levels within 5 days post-capture. Models of activity and movement rates among all bears had poor fit, but suggested potential for prolonged, lower-level rate reductions. Repeated capture was not related to negative effects on body condition, reproduction, or cub growth or survival. Capture-related mortality was substantially reduced after 1986, when immobilisation drugs were changed, with only 3 mortalities in 2517 captures from 1987–2013. Conclusions: Polar bears in the southern Beaufort Sea exhibited the greatest reductions in activity and movement rates 3.5 days post-capture. These shorter-term, post-capture effects do not appear to have translated into any long-term effects on body condition, reproduction, or cub survival.
Additionally, collaring had no effect on polar bear recovery rates, body condition, reproduction or cub survival. Implications: This study provides empirical evidence that current capture-based research methods do not have long-term implications and are not contributing to observed changes in body condition, reproduction or survival in the southern Beaufort Sea. Continued refinement of capture protocols, such as the use of low-impact dart rifles and reversible drug combinations, might improve polar bear response to capture and abate short-term reductions in activity and movement post-capture.

  7. Capturing and modelling high-complex alluvial topography with UAS-borne laser scanning

    NASA Astrophysics Data System (ADS)

    Mandlburger, Gottfried; Wieser, Martin; Pfennigbauer, Martin

    2015-04-01

Due to fluvial activity, alluvial forests are zones of the highest complexity and relief energy. Alluvial forests are dominated by new and pristine channels formed as a consequence of current and historic flood events. Apart from topographic features, the vegetation structure is typically very complex, featuring both dense understory and high trees. Furthermore, deadwood and debris carried from upstream during periods of high discharge within the river channel are deposited in these areas. Therefore, precise modelling of the micro relief of alluvial forests using standard tools like Airborne Laser Scanning (ALS) is hardly feasible. Terrestrial Laser Scanning (TLS), in turn, is very time consuming for capturing larger areas, as many scan positions are necessary to obtain complete coverage due to view occlusions in the forest. In the recent past, the technological development of Unmanned Aerial Systems (UAS) has reached a level at which light-weight survey-grade laser scanners can be operated from these platforms. For capturing alluvial topography this could bridge the gap between ALS and TLS by providing a very detailed description of the topography and the vegetation structure due to the achievable very high point density of >100 points per m2. In our contribution we demonstrate the feasibility of applying UAS-borne laser scanning for capturing and modelling the complex topography of the study area Neubacher Au, an alluvial forest at the pre-alpine River Pielach (Lower Austria). The area was captured with Riegl's VUX-1 compact time-of-flight laser scanner mounted on a RiCopter (X-8 array octocopter). The scanner features an effective scan rate of 500 kHz and was flown at 50-100 m above ground. At this flying height the laser footprint is 25-50 mm, allowing mapping of very small surface details.
Furthermore, online waveform processing of the backscattered laser energy enables the retrieval of multiple targets per laser shot, resulting in a dense point cloud of both the ground surface and the alluvial vegetation. From the acquired point cloud the following products could be derived: (i) a very high resolution Digital Terrain Model (10 cm raster), (ii) a high resolution model of the water surface of the River Pielach (especially useful for validation of topo-bathymetric LiDAR data) and (iii) a detailed description of the complex vegetation structure.
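    The footprint figures quoted in this record follow directly from beam divergence times range. A quick back-of-envelope check, assuming a 0.5 mrad divergence (a typical value for compact survey-grade scanners; the divergence is not stated in the record):

    ```python
    def footprint_diameter(altitude_m, divergence_rad=0.5e-3):
        """Approximate nadir laser footprint diameter (m): divergence * range."""
        return divergence_rad * altitude_m

    # 50 m and 100 m above ground, as in the abstract
    for h in (50.0, 100.0):
        print(f"{h:5.0f} m AGL -> footprint ~ {footprint_diameter(h) * 1000:.0f} mm")
    ```

    With this assumed divergence the 50-100 m flying heights reproduce the stated 25-50 mm footprints.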

  8. A mobile app for securely capturing and transferring clinical images to the electronic health record: description and preliminary usability study.

    PubMed

    Landman, Adam; Emani, Srinivas; Carlile, Narath; Rosenthal, David I; Semakov, Simon; Pallin, Daniel J; Poon, Eric G

    2015-01-02

Photographs are important tools to record, track, and communicate clinical findings. Mobile devices with high-resolution cameras are now ubiquitous, giving clinicians the opportunity to capture and share images from the bedside. However, secure and efficient ways to manage and share digital images are lacking. The aim of this study is to describe the implementation of a secure application for capturing and storing clinical images in the electronic health record (EHR), and to describe initial user experiences. We developed CliniCam, a secure Apple iOS (iPhone, iPad) application that allows for user authentication, patient selection, image capture, image annotation, and storage of images as a Portable Document Format (PDF) file in the EHR. We leveraged our organization's enterprise service-oriented architecture to transmit the image file from CliniCam to our enterprise clinical data repository. There is no permanent storage of protected health information on the mobile device. CliniCam also required connection to our organization's secure WiFi network. Resident physicians from emergency medicine, internal medicine, and dermatology used CliniCam in clinical practice for one month. They were then asked to complete a survey on their experience. We analyzed the survey results using descriptive statistics. Twenty-eight physicians participated, and 19/28 (68%) completed the survey. Of the respondents who used CliniCam, 89% found it useful or very useful for clinical practice and easy to use, and wanted to continue using the app. Respondents provided constructive feedback on the location of the photos in the EHR, preferring to have photos embedded in (or linked to) clinical notes instead of storing them as separate PDFs within the EHR. Some users experienced difficulty with WiFi connectivity, which was addressed by enhancing CliniCam to check for connectivity on launch. CliniCam was implemented successfully and found to be easy to use and useful for clinical practice.
CliniCam is now available to all clinical users in our hospital, providing a secure and efficient way to capture clinical images and to insert them into the EHR. Future clinical image apps should more closely link clinical images and clinical documentation and consider enabling secure transmission over public WiFi or cellular networks.

  9. Characterizing protein conformations by correlation analysis of coarse-grained contact matrices.

    PubMed

    Lindsay, Richard J; Siess, Jan; Lohry, David P; McGee, Trevor S; Ritchie, Jordan S; Johnson, Quentin R; Shen, Tongye

    2018-01-14

We have developed a method to capture the essential conformational dynamics of folded biopolymers using statistical analysis of coarse-grained segment-segment contacts. Previously, the residue-residue contact analysis of simulation trajectories was successfully applied to the detection of conformational switching motions in biomolecular complexes. However, the application to large protein systems (larger than 1000 amino acid residues) is challenging using the description of residue contacts. Also, the residue-based method cannot be used to compare proteins with different sequences. To expand the scope of the method, we have tested several coarse-graining schemes that group a collection of consecutive residues into a segment. The definition of these segments may be derived from structural and sequence information, while the interaction strength of the coarse-grained segment-segment contacts is a function of the residue-residue contacts. We then perform covariance calculations on these coarse-grained contact matrices. We monitored how well the principal components of the contact matrices are preserved using various rendering functions. The new method was demonstrated to assist the reduction of the degrees of freedom for describing the conformation space, and it potentially allows for the analysis of a system that is approximately tenfold larger compared with the corresponding residue contact-based method. This method can also render a family of similar proteins into the same conformational space, and thus can be used to compare the structures of proteins with different sequences.
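    The pipeline this abstract describes (coarse-grain residue contacts into segment contacts, then run covariance/principal-component analysis over a trajectory) can be sketched as follows. The block-sum coarse-graining, the segment boundaries, and the use of plain PCA are simplifying assumptions for illustration, not the authors' exact rendering functions:

    ```python
    import numpy as np

    def coarse_grain(contact, seg_bounds):
        """Sum a residue-residue contact matrix into segment-segment contacts.
        seg_bounds: (start, stop) residue index ranges, one pair per segment."""
        n = len(seg_bounds)
        cg = np.zeros((n, n))
        for i, (a0, a1) in enumerate(seg_bounds):
            for j, (b0, b1) in enumerate(seg_bounds):
                cg[i, j] = contact[a0:a1, b0:b1].sum()
        return cg

    def contact_pca(frames, seg_bounds, n_components=2):
        """Covariance/PCA over flattened coarse-grained contact matrices of a
        trajectory; returns the leading variances and per-frame projections."""
        X = np.array([coarse_grain(f, seg_bounds).ravel() for f in frames])
        X = X - X.mean(axis=0)             # center each contact feature
        cov = np.cov(X, rowvar=False)      # covariance across frames
        vals, vecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
        order = np.argsort(vals)[::-1][:n_components]
        return vals[order], X @ vecs[:, order]

    # Toy trajectory: five random 6-residue contact matrices, two segments.
    frames = [np.random.default_rng(i).random((6, 6)) for i in range(5)]
    variances, projections = contact_pca(frames, [(0, 3), (3, 6)])
    ```

    The projections place each trajectory frame in a low-dimensional conformational space, which is what enables comparison across proteins that share a segment layout but not a sequence.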

  10. Characterizing protein conformations by correlation analysis of coarse-grained contact matrices

    NASA Astrophysics Data System (ADS)

    Lindsay, Richard J.; Siess, Jan; Lohry, David P.; McGee, Trevor S.; Ritchie, Jordan S.; Johnson, Quentin R.; Shen, Tongye

    2018-01-01

We have developed a method to capture the essential conformational dynamics of folded biopolymers using statistical analysis of coarse-grained segment-segment contacts. Previously, the residue-residue contact analysis of simulation trajectories was successfully applied to the detection of conformational switching motions in biomolecular complexes. However, the application to large protein systems (larger than 1000 amino acid residues) is challenging using the description of residue contacts. Also, the residue-based method cannot be used to compare proteins with different sequences. To expand the scope of the method, we have tested several coarse-graining schemes that group a collection of consecutive residues into a segment. The definition of these segments may be derived from structural and sequence information, while the interaction strength of the coarse-grained segment-segment contacts is a function of the residue-residue contacts. We then perform covariance calculations on these coarse-grained contact matrices. We monitored how well the principal components of the contact matrices are preserved using various rendering functions. The new method was demonstrated to assist the reduction of the degrees of freedom for describing the conformation space, and it potentially allows for the analysis of a system that is approximately tenfold larger compared with the corresponding residue contact-based method. This method can also render a family of similar proteins into the same conformational space, and thus can be used to compare the structures of proteins with different sequences.

  11. Implementation of novel statistical procedures and other advanced approaches to improve analysis of CASA data.

    PubMed

    Ramón, M; Martínez-Pastor, F

    2018-04-23

Computer-aided sperm analysis (CASA) produces a wealth of data that is frequently ignored. The use of multiparametric statistical methods can help explore these datasets, unveiling the subpopulation structure of sperm samples. In this review we analyse the internal heterogeneity of sperm samples and its significance. We also provide a brief description of the statistical tools used for extracting sperm subpopulations from the datasets, namely unsupervised clustering (with non-hierarchical, hierarchical and two-step methods) and the most advanced supervised methods, based on machine learning. The former methods have allowed exploration of subpopulation patterns in many species, whereas the latter offer further possibilities, especially considering functional studies and the practical use of subpopulation analysis. We also consider novel approaches, such as the use of geometric morphometrics or imaging flow cytometry. Finally, although the data provided by CASA systems yield valuable information on sperm samples when clustering analyses are applied, there are several caveats. Protocols for capturing and analysing motility or morphometry should be standardised and adapted to each experiment, and the algorithms should be open in order to allow comparison of results between laboratories. Moreover, we must be aware of new technology that could change the paradigm for studying sperm motility and morphology.
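    The non-hierarchical clustering step mentioned here can be sketched with a minimal hand-rolled k-means applied to two hypothetical CASA kinematic parameters (VCL and VSL are standard CASA outputs, but the numbers, the two-cluster structure, and the use of raw k-means are illustrative assumptions only):

    ```python
    import numpy as np

    def kmeans(X, k, n_iter=100, seed=0):
        """Minimal k-means for extracting subpopulations from CASA kinematics
        (in real use, standardise features and validate the cluster count)."""
        rng = np.random.default_rng(seed)
        centers = X[rng.choice(len(X), size=k, replace=False)]
        for _ in range(n_iter):
            dist = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
            labels = dist.argmin(axis=1)
            # keep the old center if a cluster ever becomes empty
            new = np.array([X[labels == i].mean(axis=0) if np.any(labels == i)
                            else centers[i] for i in range(k)])
            if np.allclose(new, centers):
                break
            centers = new
        return labels, centers

    # Hypothetical two-subpopulation sample: fast/linear vs. slow tracks,
    # described by (VCL, VSL) in um/s.  Values are illustrative only.
    rng = np.random.default_rng(1)
    fast = rng.normal([150.0, 120.0], 10.0, size=(50, 2))
    slow = rng.normal([60.0, 30.0], 10.0, size=(50, 2))
    labels, centers = kmeans(np.vstack([fast, slow]), k=2)
    ```

    Each returned label assigns a track to a subpopulation; the cluster centers summarise each subpopulation's kinematic profile.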

  12. The Dynamic Photometric Stereo Method Using a Multi-Tap CMOS Image Sensor.

    PubMed

    Yoda, Takuya; Nagahara, Hajime; Taniguchi, Rin-Ichiro; Kagawa, Keiichiro; Yasutomi, Keita; Kawahito, Shoji

    2018-03-05

The photometric stereo method enables estimation of surface normals from images that have been captured using different but known lighting directions. The classical photometric stereo method requires at least three images to determine the normals in a given scene. However, this method cannot be applied to dynamic scenes because it assumes that the scene remains static while the required images are captured. In this work, we present a dynamic photometric stereo method for estimation of the surface normals in a dynamic scene. We use a multi-tap complementary metal-oxide-semiconductor (CMOS) image sensor to capture the input images required for the proposed photometric stereo method. This image sensor can divide the electrons generated by the photodiode of a single pixel among the taps of different exposures, and can thus capture multiple images under different lighting conditions with almost identical timing. We implemented a camera lighting system and created a software application to enable estimation of the normal map in real time. We also evaluated the accuracy of the estimated surface normals and demonstrated that our proposed method can estimate the surface normals of dynamic scenes.
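    The classical per-pixel estimation step referred to in this abstract is a least-squares solve under a Lambertian reflectance assumption: with intensities I from k ≥ 3 known unit lighting directions L, solve L·g = I, where g is the albedo-scaled normal. A minimal sketch with synthetic values (the directions and albedo are made up for the check):

    ```python
    import numpy as np

    def photometric_stereo(I, L):
        """Recover surface normal and albedo at one pixel.
        I: (k,) intensities under k lighting directions (k >= 3).
        L: (k, 3) unit lighting directions (Lambertian assumption)."""
        g, *_ = np.linalg.lstsq(L, I, rcond=None)  # solve L @ g = I
        albedo = np.linalg.norm(g)
        normal = g / albedo
        return normal, albedo

    # Synthetic check: render a known normal, then recover it.
    L = np.array([[0.0, 0.0, 1.0], [1.0, 0.0, 1.0], [0.0, 1.0, 1.0]])
    L /= np.linalg.norm(L, axis=1, keepdims=True)
    true_n = np.array([0.0, 0.0, 1.0])
    I = 0.8 * (L @ true_n)          # Lambertian intensities, albedo 0.8
    n, rho = photometric_stereo(I, L)
    ```

    The multi-tap sensor in the paper supplies the k differently lit exposures with almost identical timing, which is what lets this static-scene solve be applied to dynamic scenes.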

  13. An evaluation of multipass electrofishing for estimating the abundance of stream-dwelling salmonids

    Treesearch

    James T. Peterson; Russell F. Thurow; John W. Guzevich

    2004-01-01

    Failure to estimate capture efficiency, defined as the probability of capturing individual fish, can introduce a systematic error or bias into estimates of fish abundance. We evaluated the efficacy of multipass electrofishing removal methods for estimating fish abundance by comparing estimates of capture efficiency from multipass removal estimates to capture...
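    A standard closed-form instance of the multipass removal approach this abstract evaluates is the two-pass removal estimator (Seber-Le Cren form), which assumes a closed population and a constant capture probability across passes; the catch numbers below are hypothetical:

    ```python
    def two_pass_removal(c1, c2):
        """Two-pass removal estimates of abundance N and per-pass capture
        probability p, assuming constant p and a closed population.
        Requires c1 > c2 (the catch must decline between passes)."""
        if c1 <= c2:
            raise ValueError("removal estimator needs c1 > c2")
        p = (c1 - c2) / c1           # estimated capture efficiency
        n = c1 * c1 / (c1 - c2)      # estimated abundance
        return n, p

    # Example: 60 fish caught on the first pass, 20 on the second.
    n_hat, p_hat = two_pass_removal(60, 20)
    ```

    Consistency check: with N = 90 and p = 2/3, the expected catches are E[C1] = Np = 60 and E[C2] = N(1-p)p = 20, matching the inputs. Bias enters exactly when capture efficiency varies between passes, which is the failure mode the study examines.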

  14. A pilot project in distance education: nurse practitioner students' experience of personal video capture technology as an assessment method of clinical skills.

    PubMed

    Strand, Haakan; Fox-Young, Stephanie; Long, Phil; Bogossian, Fiona

    2013-03-01

    This paper reports on a pilot project aimed at exploring postgraduate distance students' experiences using personal video capture technology to complete competency assessments in physical examination. A pre-intervention survey gathered demographic data from nurse practitioner students (n=31) and measured their information communication technology fluency. Subsequently, thirteen (13) students were allocated a hand held video camera to use in their clinical setting. Those participating in the trial completed a post-intervention survey and further data were gathered using semi-structured interviews. Data were analysed by descriptive statistics and deductive content analysis, and the Unified Theory of Acceptance and Use of Technology (Venkatesh et al., 2003) were used to guide the project. Uptake of the intervention was high (93%) as students recognised the potential benefit. Students were video recorded while performing physical examinations. They described high level of stress and some anxiety, which decreased rapidly while assessment was underway. Barriers experienced were in the areas of facilitating conditions (technical character e.g. upload of files) and social influence (e.g. local ethical approval). Students valued the opportunity to reflect on their recorded performance with their clinical mentors and by themselves. This project highlights the demands and difficulties of introducing technology to support work-based learning. Copyright © 2011 Elsevier Ltd. All rights reserved.

  15. Understanding Emergency Medicine Physicians' Multitasking Behaviors Around Interruptions.

    PubMed

    Fong, Allan; Ratwani, Raj M

    2018-06-11

Interruptions can adversely impact human performance, particularly in fast-paced and high-risk environments such as the emergency department (ED). Understanding physician behaviors before, during, and after interruptions is important to the design and promotion of safe and effective workflow solutions. However, traditional human-factors-based interruption models do not accurately reflect the complexities of real-world environments like the ED and may not capture multiple interruptions and multitasking. We present a more comprehensive framework for understanding interruptions that is composed of three phases, each with multiple levels: Interruption Start Transition, Interruption Engagement, and Interruption End Transition. This three-phase framework is not constrained to discrete task transitions, providing a robust method to categorize multitasking behaviors around interruptions. We applied this framework to 457 interruption episodes captured during 36 hours of observation. The interrupted task was immediately suspended 348 (76.1%) times. Participants engaged in new self-initiated tasks during the interrupting task 164 (35.9%) times and did not directly resume the interrupted task in 284 (62.1%) interruption episodes. Using this framework provides a more detailed description of the types of physician behaviors in complex environments. Understanding the different types of interruption and resumption patterns, which may have a different impact on performance, can support the design of interruption mitigation strategies. This article is protected by copyright. All rights reserved.

  16. A model to estimate the size of nanoparticle agglomerates in gas-solid fluidized beds

    NASA Astrophysics Data System (ADS)

    de Martín, Lilian; van Ommen, J. Ruud

    2013-11-01

The estimation of the size of nanoparticle agglomerates in fluidized beds remains an open challenge, mainly due to the difficulty of characterizing the inter-agglomerate van der Waals force. The current approach is to describe micron-sized nanoparticle agglomerates as micron-sized particles with 0.1-0.2-μm asperities. This simplification does not capture the influence of the particle size on the van der Waals attraction between agglomerates. In this paper, we propose a new description where the agglomerates are micron-sized particles with nanoparticles on the surface, acting as asperities. As opposed to previous models, here the van der Waals force between agglomerates decreases with an increase in the particle size. We have also included an additional force due to hydrogen bond formation between the surfaces of hydrophilic and dry nanoparticles. The average size of the fluidized agglomerates has been estimated by equating the attractive force obtained from this method to the weight of the individual agglomerates. The results have been compared to 54 experimental values, most of them collected from the literature. Our model approximates the size of most of the nanopowders without systematic error, both in conventional and centrifugal fluidized beds, outperforming current models. Although simple, the model is able to capture the influence of the nanoparticle size, particle density, and Hamaker coefficient on the inter-agglomerate forces.
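    The force balance described here (van der Waals attraction mediated by surface nanoparticles acting as asperities, set equal to the agglomerate weight) can be illustrated with an order-of-magnitude sketch. All parameter values below are assumed for illustration, not taken from the paper, and the hydrogen-bond term is omitted:

    ```python
    import math

    # Assumed, illustrative parameters (not from the paper):
    A = 1e-19        # Hamaker coefficient (J)
    r_np = 10e-9     # surface nanoparticle (asperity) radius (m)
    d = 0.4e-9       # surface separation at contact (m)
    rho = 50.0       # effective agglomerate density (kg/m^3), highly porous
    g = 9.81         # gravity (m/s^2)

    # van der Waals force between two equal spheres of radius r at gap d:
    #   F = A r / (12 d^2)
    f_vdw = A * r_np / (12.0 * d * d)

    # Weight of a spherical agglomerate of radius R: (4/3) pi R^3 rho g.
    # Setting weight = f_vdw and solving for R:
    R = (3.0 * f_vdw / (4.0 * math.pi * rho * g)) ** (1.0 / 3.0)
    print(f"estimated agglomerate radius ~ {R * 1e6:.0f} um")
    ```

    With these toy numbers the balance lands in the tens-of-microns range, i.e. micron-sized agglomerates, consistent with the regime the abstract describes; note that because the attraction scales with the asperity radius rather than the agglomerate radius, larger nanoparticles shift the balance toward larger agglomerates.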

  17. Evaluating Integrative Cancer Clinics With the Claim Assessment Profile: An Example With the InspireHealth Clinic

    PubMed Central

    Hilton, Lara; Elfenbaum, Pamela; Jain, Shamini; Sprengel, Meredith; Jonas, Wayne B.

    2016-01-01

Background: The evaluation of freestanding integrative cancer clinical programs is challenging and is rarely done. We have developed an approach called the Claim Assessment Profile (CAP) to identify whether evaluation of a practice is justified, feasible, and likely to provide useful information. Objectives: A CAP was performed in order to (1) clarify the healing claims at InspireHealth, an integrative oncology treatment program, by defining the most important impacts on its clients; (2) gather information about current research capacity at the clinic; and (3) create a program theory and path model for use in prospective research. Study Design/Methods: This case study design incorporates methods from a variety of rapid assessment approaches. Procedures included site visits to observe the program, structured qualitative interviews with 26 providers and staff, surveys to capture descriptive data about the program, and observational data on program implementation. Results: The InspireHealth program is a well-established, multi-site, thriving integrative oncology clinical practice that focuses on patient support, motivation, and health behavior engagement. It delivers patient-centered care via a standardized treatment protocol. There are high levels of research interest among staff and resources with which to conduct research. Conclusions: This analysis provides the primary descriptive and claims clarification of an integrative oncology treatment program, an evaluation readiness report, a detailed logic model explicating program theory, and a clinical outcomes path model for conducting prospective research. Prospective evaluation of this program would be feasible and valuable, adding to our knowledge base of integrative cancer therapies. PMID:29444602

  18. Using Policy-Capturing to Measure Attitudes in Organizational Diagnosis.

    ERIC Educational Resources Information Center

    Madden, Joseph M.

    1981-01-01

Discusses an indirect method of attitude measurement, policy capturing, which can be applied on an individual basis. In three experiments this method detected prejudicial attitudes toward females that were not detected with traditional methods. It can be used as a self-improvement diagnostic tool for developing awareness of behavior influences. (JAC)

  19. Thiourea derivatives, methods of their preparation and their use in neutron capture therapy of malignant melanoma

    DOEpatents

    Gabel, D.

    1991-06-04

The present invention pertains to boron-containing thiouracil derivatives, their methods of preparation, and their use in the therapy of malignant melanoma using boron neutron capture therapy.

  20. Filament capturing with the multimaterial moment-of-fluid method*

    DOE PAGES

    Jemison, Matthew; Sussman, Mark; Shashkov, Mikhail

    2015-01-15

A novel method for capturing two-dimensional, thin, under-resolved material configurations, known as “filaments,” is presented in the context of interface reconstruction. This technique uses a partitioning procedure to detect disconnected regions of material in the advective preimage of a cell (indicative of a filament) and makes use of the existing functionality of the Multimaterial Moment-of-Fluid interface reconstruction method to accurately capture the under-resolved feature, while exactly conserving volume. An algorithm for Adaptive Mesh Refinement in the presence of filaments is developed so that refinement is introduced only near the tips of filaments and where the Moment-of-Fluid reconstruction error is still large. Comparison to the standard Moment-of-Fluid method is made. As a result, it is demonstrated that using filament capturing at a given resolution yields gains in accuracy comparable to introducing an additional level of mesh refinement at significantly lower cost.

  1. A Quantitative Investigation of the Relationship between Adult Attachment and the Leadership Styles of Florida's Public Service Leaders

    ERIC Educational Resources Information Center

    White, April L.

    2013-01-01

Many organizations find selecting a leader to be highly challenging. Investigators acknowledge that the study of leadership is a very complex phenomenon that cannot be easily captured and explained in a manner that could lead to a final description of leadership or offer clear steps on how to choose the right leader. Among the many…

  2. Advanced Extravehicular Mobility Unit Informatics Software Design

    NASA Technical Reports Server (NTRS)

    Wright, Theodore

    2014-01-01

    This is a description of the software design for the 2013 edition of the Advanced Extravehicular Mobility Unit (AEMU) Informatics computer assembly. The Informatics system is an optional part of the space suit assembly. It adds a graphical interface for displaying suit status, timelines, procedures, and caution and warning information. In the future it will display maps with GPS position data, and video and still images captured by the astronaut.

  3. Noisy Oscillations in the Actin Cytoskeleton of Chemotactic Amoeba.

    PubMed

    Negrete, Jose; Pumir, Alain; Hsu, Hsin-Fang; Westendorf, Christian; Tarantola, Marco; Beta, Carsten; Bodenschatz, Eberhard

    2016-09-30

    Biological systems with their complex biochemical networks are known to be intrinsically noisy. Here we investigate the dynamics of actin polymerization of amoeboid cells, which are close to the onset of oscillations. We show that the large phenotypic variability in the polymerization dynamics can be accurately captured by a generic nonlinear oscillator model in the presence of noise. We determine the relative role of the noise with a single dimensionless, experimentally accessible parameter, thus providing a quantitative description of the variability in a population of cells. Our approach, which rests on a generic description of a system close to a Hopf bifurcation and includes the effect of noise, can characterize the dynamics of a large class of noisy systems close to an oscillatory instability.
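    A generic noisy oscillator of the kind this abstract invokes, a Hopf normal form (Stuart-Landau oscillator) with additive noise, can be integrated with the Euler-Maruyama scheme. The parameters below are illustrative, not fitted to the actin data:

    ```python
    import numpy as np

    def noisy_hopf(mu=0.1, omega=2.0 * np.pi, sigma=0.2,
                   dt=1e-3, n_steps=20000, seed=0):
        """Euler-Maruyama integration of a Hopf normal form with additive
        complex noise:  dz = ((mu + i*omega) z - |z|^2 z) dt + sigma dW.
        mu < 0: noisy excitable regime; mu > 0: noisy limit cycle."""
        rng = np.random.default_rng(seed)
        z = np.zeros(n_steps, dtype=complex)
        for k in range(1, n_steps):
            drift = (mu + 1j * omega) * z[k - 1] - (abs(z[k - 1]) ** 2) * z[k - 1]
            noise = sigma * np.sqrt(dt) * (rng.normal() + 1j * rng.normal())
            z[k] = z[k - 1] + drift * dt + noise
        return z

    z = noisy_hopf()  # above onset (mu > 0): noisy oscillations, |z| near sqrt(mu)
    ```

    The ratio of the noise strength to the distance from the bifurcation plays the role of the single dimensionless parameter the abstract mentions: sweeping mu through zero at fixed sigma moves the trajectory from noise-dominated fluctuations to nearly deterministic oscillations.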

  4. Geometric Brownian Motion with Tempered Stable Waiting Times

    NASA Astrophysics Data System (ADS)

    Gajda, Janusz; Wyłomańska, Agnieszka

    2012-08-01

One of the earliest models used for asset price description is the Black-Scholes model. It is based on geometric Brownian motion and has been used as a tool for pricing various financial instruments. However, when it comes to data description, geometric Brownian motion is not able to capture many properties of present-day financial markets, for instance periods of constant values. Therefore we propose an alternative approach based on subordinated tempered stable geometric Brownian motion, which is a combination of the popular geometric Brownian motion and an inverse tempered stable subordinator. In this paper we introduce the mentioned process and present its main properties. We also propose an estimation procedure and calibrate the analyzed model to real data.
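    The mechanism behind the constant-value periods can be illustrated with a toy time change: wherever a subordinator jumps, its inverse is flat, and a GBM run in that operational time stays constant over the corresponding calendar interval. In this sketch a drift-plus-exponential-jumps subordinator stands in for the tempered stable subordinator of the paper (a simplifying assumption made so the example stays short):

    ```python
    import numpy as np

    def subordinated_gbm(mu=0.05, sigma=0.2, s0=1.0, t_max=1.0, n=2000, seed=0):
        """Toy subordinated geometric Brownian motion on a calendar grid.
        The subordinator T has drift plus occasional exponential jumps;
        its inverse (computed via searchsorted) is flat across jumps,
        so the resulting price path has constant-value periods."""
        rng = np.random.default_rng(seed)
        dt = t_max / n
        tau = np.linspace(0.0, t_max, n)                  # calendar time
        jumps = rng.exponential(0.1, n) * (rng.random(n) < 0.02)
        T = np.cumsum(0.5 * dt + jumps)                   # subordinator path
        k = np.minimum(np.searchsorted(T, tau), n - 1)    # inverse time index
        w = np.cumsum(rng.normal(0.0, np.sqrt(dt), n))    # Brownian motion
        s = s0 * np.exp((mu - 0.5 * sigma**2) * (k * dt) + sigma * w[k])
        return tau, s

    tau, s = subordinated_gbm()  # s is piecewise constant across time-change jumps
    ```

    Replacing the toy jump law with tempered stable increments recovers the model of the paper; the flat-period mechanism itself is unchanged.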

  5. Noisy Oscillations in the Actin Cytoskeleton of Chemotactic Amoeba

    NASA Astrophysics Data System (ADS)

    Negrete, Jose; Pumir, Alain; Hsu, Hsin-Fang; Westendorf, Christian; Tarantola, Marco; Beta, Carsten; Bodenschatz, Eberhard

    2016-09-01

    Biological systems with their complex biochemical networks are known to be intrinsically noisy. Here we investigate the dynamics of actin polymerization of amoeboid cells, which are close to the onset of oscillations. We show that the large phenotypic variability in the polymerization dynamics can be accurately captured by a generic nonlinear oscillator model in the presence of noise. We determine the relative role of the noise with a single dimensionless, experimentally accessible parameter, thus providing a quantitative description of the variability in a population of cells. Our approach, which rests on a generic description of a system close to a Hopf bifurcation and includes the effect of noise, can characterize the dynamics of a large class of noisy systems close to an oscillatory instability.

  6. Demographic estimation methods for plants with unobservable life-states

    USGS Publications Warehouse

    Kery, M.; Gregg, K.B.; Schaub, M.

    2005-01-01

Demographic estimation of vital parameters in plants with an unobservable dormant state is complicated, because time of death is not known. Conventional methods assume that death occurs at a particular time after a plant has last been seen aboveground, but the consequences of assuming a particular duration of dormancy have never been tested. Capture-recapture methods do not make assumptions about time of death; however, problems with parameter estimability have not yet been resolved. To date, a critical comparative assessment of these methods is lacking. We analysed data from a 10 year study of Cleistes bifaria, a terrestrial orchid with frequent dormancy, and compared demographic estimates obtained by five varieties of the conventional methods and two capture-recapture methods. All conventional methods produced spurious unity survival estimates for some years or for some states, and estimates of demographic rates sensitive to the time-of-death assumption. In contrast, capture-recapture methods are more parsimonious in terms of assumptions, are based on well-founded theory, and did not produce spurious estimates. In Cleistes, dormant episodes lasted for 1-4 years (mean 1.4, SD 0.74). The capture-recapture models estimated ramet survival rate at 0.86 (SE ≈ 0.01), ranging from 0.77 to 0.94 (SEs ≤ 0.1) in any one year. The average fraction dormant was estimated at 30% (SE 1.5), ranging from 16% to 47% (SEs ≤ 5.1) in any one year. Multistate capture-recapture models showed that survival rates were positively related to precipitation in the current year, but transition rates were more strongly related to precipitation in the previous than in the current year, with more ramets going dormant following dry years. Not all capture-recapture models of interest have estimable parameters; for instance, without excavating plants in years when they do not appear aboveground, it is not possible to obtain independent time-specific survival estimates for dormant plants.
We introduce rigorous computer algebra methods to identify the parameters that are estimable in principle. As life-states are a prominent feature of plant life cycles, multistate capture-recapture models are a natural framework for analysing the population dynamics of plants with dormancy.

  7. Age-structured mark-recapture analysis: A virtual-population-analysis-based model for analyzing age-structured capture-recapture data

    USGS Publications Warehouse

    Coggins, L.G.; Pine, William E.; Walters, C.J.; Martell, S.J.D.

    2006-01-01

    We present a new model to estimate capture probabilities, survival, abundance, and recruitment using traditional Jolly-Seber capture-recapture methods within a standard fisheries virtual population analysis framework. This approach compares the numbers of marked and unmarked fish at age captured in each year of sampling with predictions based on estimated vulnerabilities and abundance in a likelihood function. Recruitment to the earliest age at which fish can be tagged is estimated by using a virtual population analysis method to back-calculate the expected numbers of unmarked fish at risk of capture. By using information from both marked and unmarked animals in a standard fisheries age structure framework, this approach is well suited to the sparse data situations common in long-term capture-recapture programs with variable sampling effort. © Copyright by the American Fisheries Society 2006.
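The back-calculation step above can be illustrated with Pope's cohort-analysis approximation, a standard virtual population analysis recursion (not necessarily the exact formulation used by the authors). The catch-at-age vector, natural mortality M, and terminal abundance below are hypothetical values chosen only for the sketch.

```python
import math

def pope_back_calculate(catch, M, N_terminal):
    """Pope's cohort-analysis approximation: reconstruct numbers-at-age
    backwards from a terminal abundance, given catches-at-age for a single
    cohort and instantaneous natural mortality M. catch[a] is the catch
    taken from the cohort at age index a (assumed mid-year)."""
    n_ages = len(catch)
    N = [0.0] * (n_ages + 1)
    N[n_ages] = N_terminal
    for a in range(n_ages - 1, -1, -1):
        # survivors scaled up by natural mortality, plus catch removed mid-year
        N[a] = N[a + 1] * math.exp(M) + catch[a] * math.exp(M / 2)
    return N[:n_ages]

# hypothetical cohort: catches at four ages, M = 0.2, 100 fish surviving at the end
print(pope_back_calculate([300, 250, 180, 90], M=0.2, N_terminal=100))
```

Running the recursion backwards in this way yields the expected numbers at risk of capture at each earlier age, which is the quantity the unmarked-fish comparison requires.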

  8. Fusion cross sections for reactions involving medium and heavy nucleus-nucleus systems

    NASA Astrophysics Data System (ADS)

    Atta, Debasis; Basu, D. N.

    2014-12-01

    Existing data on near-barrier fusion excitation functions of medium and heavy nucleus-nucleus systems have been analyzed by using a simple diffused-barrier formula derived by assuming a Gaussian shape for the barrier-height distribution. The fusion cross section is obtained by folding the Gaussian barrier distribution with the classical expression for the fusion cross section at a fixed barrier. The energy dependence of the fusion cross section thus obtained provides a good description of the existing data on near-barrier fusion and capture excitation functions for medium and heavy nucleus-nucleus systems. Theoretical values for the parameters of the barrier distribution are estimated; these can be used for fusion or capture cross-section predictions, which are especially important for planning experiments to synthesize new superheavy elements.
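The folding described above can be sketched numerically: the classical sharp-barrier cross section πR²(1 − B/E) for E > B is averaged over a Gaussian distribution of barrier heights with mean B0 and width w. The values of E, B0, w, and R below are illustrative, not fitted parameters from the paper, and simple midpoint quadrature stands in for whatever integration scheme the authors used.

```python
import math

def fusion_cross_section(E, B0, w, R):
    """Classical fusion cross section folded with a Gaussian distribution
    of barrier heights (mean B0, width w) for an effective radius R.
    Midpoint quadrature over +/- 6 widths of the barrier distribution;
    units need only be mutually consistent (e.g. MeV and fm -> fm^2)."""
    nB, span = 2000, 6.0 * w
    dB = 2 * span / nB
    total = 0.0
    for i in range(nB):
        B = B0 - span + (i + 0.5) * dB
        if 0.0 < B < E:  # classically, fusion only above the barrier
            gauss = math.exp(-((B - B0) ** 2) / (2 * w ** 2)) / (w * math.sqrt(2 * math.pi))
            total += math.pi * R ** 2 * (1.0 - B / E) * gauss * dB
    return total

# well above the barrier the folded result approaches the sharp-barrier
# limit pi * R^2 * (1 - B0/E); well below it, the cross section vanishes
print(fusion_cross_section(E=120.0, B0=100.0, w=2.0, R=10.0))
```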

  9. A questionnaire measuring staff perceptions of Lean adoption in healthcare: development and psychometric testing.

    PubMed

    Kaltenbrunner, Monica; Bengtsson, Lars; Mathiassen, Svend Erik; Engström, Maria

    2017-03-24

    During the past decade, the concept of Lean has spread rapidly within the healthcare sector, but there is a lack of instruments that can measure staff's perceptions of Lean adoption. Thus, the aim of the present study was to develop a questionnaire measuring Lean in healthcare, based on Liker's description of Lean, by adapting an existing instrument developed for the service sector. A mixed-method design was used. Initially, items from the service sector instrument were categorized according to Liker's 14 principles describing Lean within four domains: philosophy, processes, people and partners and problem-solving. Items were lacking for three of Liker's principles and were therefore developed de novo. Think-aloud interviews were conducted with 12 healthcare staff from different professions to contextualize and examine the face validity of the questionnaire prototype. Thereafter, the adjusted questionnaire's psychometric properties were assessed on the basis of a cross-sectional survey among 386 staff working in primary care. The think-aloud interviews led to adjustments in the questionnaire to better suit a healthcare context, and the number of items was reduced. Confirmatory factor analysis of the adjusted questionnaire showed a generally acceptable correspondence with Liker's description of Lean. Internal consistency, measured using Cronbach's alpha, for the factors in Liker's description of Lean was 0.60 for the factor people and partners, and over 0.70 for the three other factors. Test-retest reliability measured by the intra-class correlation coefficient ranged from 0.77 to 0.88 for the four factors. We designed a questionnaire capturing staff's perceptions of Lean adoption in healthcare on the basis of Liker's description. This Lean in Healthcare Questionnaire (LiHcQ) showed generally acceptable psychometric properties, which supports its usability for measuring Lean adoption in healthcare. 
We suggest that further research focus on verifying the usability of LiHcQ in other healthcare settings, and on adjusting the instrument if needed.
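Cronbach's alpha, the internal-consistency statistic reported above, can be computed directly from item-level scores. The sketch below uses the standard formula alpha = k/(k−1) · (1 − Σ item variances / variance of totals); the 3-item, 5-respondent data set is hypothetical, not drawn from the LiHcQ study.

```python
def cronbach_alpha(items):
    """items: list of per-item score lists, one entry per item, with the
    same respondents in the same order in each. Returns Cronbach's alpha
    = k/(k-1) * (1 - sum(item variances) / variance(total scores))."""
    k = len(items)
    n = len(items[0])

    def var(xs):  # sample variance (n - 1 denominator), used consistently
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))

# hypothetical 3-item scale answered by 5 respondents
print(round(cronbach_alpha([[3, 4, 5, 2, 4], [3, 5, 4, 2, 5], [4, 4, 5, 3, 4]]), 2))
```

Values above roughly 0.70, as for three of the four LiHcQ factors, are conventionally taken as acceptable internal consistency.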

  10. Capturing Public Opinion on Public Health Topics: A Comparison of Experiences from a Systematic Review, Focus Group Study, and Analysis of Online, User-Generated Content.

    PubMed

    Giles, Emma Louise; Adams, Jean M

    2015-01-01

    Capturing public opinion toward public health topics is important to ensure that services, policy, and research are aligned with the beliefs and priorities of the general public. A number of approaches can be used to capture public opinion. We are conducting a program of work on the effectiveness and acceptability of health promoting financial incentive interventions. We have captured public opinion on financial incentive interventions using three methods: a systematic review, focus group study, and analysis of online user-generated comments to news media reports. In this short editorial-style piece, we compare and contrast our experiences with these three methods. Each of these methods had their advantages and disadvantages. Advantages include tailoring of the research question for systematic reviews, probing of answers during focus groups, and the ability to aggregate a large data set using online user-generated content. However, disadvantages include needing to update systematic reviews, participants conforming to a dominant perspective in focus groups, and being unable to collect respondent characteristics during analysis of user-generated online content. That said, analysis of user-generated online content offers additional time and resource advantages, and we found it elicited similar findings to those obtained via more traditional methods, such as systematic reviews and focus groups. A number of methods for capturing public opinions on public health topics are available. Public health researchers, policy makers, and practitioners should choose methods appropriate to their aims. Analysis of user-generated online content, especially in the context of news media reports, may be a quicker and cheaper alternative to more traditional methods, without compromising on the breadth of opinions captured.

  11. Caught Ya! A School-Based Practical Activity to Evaluate the Capture-Mark-Release-Recapture Method

    ERIC Educational Resources Information Center

    Kingsnorth, Crawford; Cruickshank, Chae; Paterson, David; Diston, Stephen

    2017-01-01

    The capture-mark-release-recapture method provides a simple way to estimate population size. However, when used as part of ecological sampling, this method does not easily allow an opportunity to evaluate the accuracy of the calculation because the actual population size is unknown. Here, we describe a method that can be used to measure the…
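The population estimate that such a classroom activity evaluates is the classical Lincoln-Petersen index: mark M animals, take a second sample of C, count the R recaptures, and estimate N ≈ MC/R. The counts below are hypothetical.

```python
def lincoln_petersen(marked, captured, recaptured):
    """Lincoln-Petersen estimate of population size: N = M * C / R,
    where M animals were marked and released, C were captured in a
    second sample, and R of those C carried marks."""
    return marked * captured / recaptured

# hypothetical classroom run: 50 marked, second sample of 40 with 10 marked
print(lincoln_petersen(50, 40, 10))  # -> 200.0
```

Because the true population size is known in a school-based simulation (e.g. a counted pool of tokens), the estimate can be compared directly against it, which is exactly the evaluation the activity enables.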

  12. How many atoms are required to characterize accurately trajectory fluctuations of a protein?

    NASA Astrophysics Data System (ADS)

    Cukier, Robert I.

    2010-06-01

    Large molecules, whose thermal fluctuations sample a complex energy landscape, exhibit motions on an extended range of space and time scales. Principal component analysis (PCA) is often used to extract dominant motions that in proteins are typically domain motions. These motions are captured in the large eigenvalue (leading) principal components. There is also information in the small eigenvalues, arising from approximate linear dependencies among the coordinates. These linear dependencies suggest that instead of using all the atom coordinates to represent a trajectory, it should be possible to use a reduced set of coordinates with little loss in the information captured by the large eigenvalue principal components. In this work, methods that can monitor the correlation (overlap) between a reduced set of atoms and any number of retained principal components are introduced. For application to trajectory data generated by simulations, where the overall translational and rotational motion needs to be eliminated before PCA is carried out, some difficulties with the overlap measures arise and methods are developed to overcome them. The overlap measures are evaluated for a trajectory generated by molecular dynamics for the protein adenylate kinase, which consists of a stable, core domain, and two more mobile domains, referred to as the LID domain and the AMP-binding domain. The use of reduced sets corresponding, for the smallest set, to one-eighth of the alpha carbon (CA) atoms relative to using all the CA atoms is shown to predict the dominant motions of adenylate kinase. The overlap between using all the CA atoms and all the backbone atoms is essentially unity for a sum over PCA modes that effectively capture the exact trajectory. A reduction to a few atoms (three in the LID and three in the AMP-binding domain) shows that at least the first principal component, characterizing a large part of the LID-binding and AMP-binding motion, is well described. 
Based on these results, the overlap criterion should be applicable as a guide to postulating and validating coarse-grained descriptions of generic biomolecular assemblies.
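One simple version of such an overlap measure, not necessarily the exact one introduced in the paper, compares the leading PCA mode of the full coordinate set (restricted and renormalized to the kept coordinates) with the leading mode computed from the reduced set alone. The synthetic one-mode trajectory below is illustrative, and NumPy is assumed.

```python
import numpy as np

def pc_overlap(traj_full, keep, n_modes=1):
    """Overlap between the leading PCA modes of a full trajectory
    (frames x coordinates) and those of the reduced trajectory keeping
    only the columns in `keep`: the RMS inner product between reduced-set
    modes and full-set modes restricted (and renormalized) to `keep`."""
    def top_modes(X, n):
        Xc = X - X.mean(axis=0)
        w, v = np.linalg.eigh(np.cov(Xc, rowvar=False))
        return v[:, np.argsort(w)[::-1][:n]]  # columns = leading modes

    full = top_modes(traj_full, n_modes)[keep, :]
    full = full / np.linalg.norm(full, axis=0)   # renormalize restricted modes
    red = top_modes(traj_full[:, keep], n_modes)
    dots = np.abs(np.einsum('ij,ij->j', full, red))  # |cos| per mode pair
    return float(np.sqrt((dots ** 2).mean()))

# synthetic trajectory dominated by one collective mode plus small noise
rng = np.random.default_rng(0)
mode = np.array([1.0, 1.0, 1.0, 1.0]) / 2.0
traj = rng.normal(size=(500, 1)) @ mode[None, :] + 0.05 * rng.normal(size=(500, 4))
print(pc_overlap(traj, keep=[0, 1]))  # close to 1 for a well-chosen subset
```

An overlap near unity, as here, indicates that the kept coordinates suffice to reproduce the dominant motion, mirroring the CA-subset result reported for adenylate kinase.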

  13. Reducing Energy-Related CO2 Emissions Using Accelerated Limestone Weathering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rau, G H; Knauss, K G; Langer, W H

    2004-04-27

    Following earlier descriptions, the use and impacts of accelerated weathering of limestone (AWL; reaction: CO₂ + H₂O + CaCO₃ → Ca²⁺ + 2HCO₃⁻) as a CO₂ capture and sequestration method are further explored. Since ready access to the ocean is likely an essential requirement for AWL, it is shown that significant limestone resources are relatively close to a majority of CO₂-emitting power plants along the coastal US. Furthermore, waste fines, representing more than 20% of current US crushed limestone production (>10⁹ tonnes/yr), could in many instances be used as an inexpensive or free source of AWL carbonate. With limestone transportation to coastal sites then the dominant cost variable, CO₂ sequestration (plus capture) costs of $3-$4/tonne are achievable in certain locations. While there is vastly more limestone and water on earth than required for AWL to capture and sequester all fossil fuel CO₂ production, the transportation cost of bringing limestone, seawater, and waste CO₂ into contact likely limits the method's applicability to perhaps 10-20% of US point-source emissions. Using a bench-scale laboratory reactor, it is shown that CO₂ sequestration rates of 10⁻⁶ to 10⁻⁵ moles/sec per m² of limestone surface area are readily achievable using seawater. This translates into reaction densities as high as 2 × 10⁻² tonnes CO₂ m⁻³ day⁻¹, highly dependent on limestone particle size, solution turbulence and flow, and CO₂ concentration. Modeling of AWL end-solution disposal in the ocean shows significantly reduced effects on ocean pH and carbonate chemistry relative to those caused by direct CO₂ disposal into the atmosphere or ocean.
In fact, the increase in ocean Ca²⁺ and bicarbonate offered by AWL should significantly enhance the growth of corals and other marine calcifiers whose health is currently threatened by anthropogenic CO₂ invasion and pH reduction in the ocean.

  14. Relative contributions of three descriptive methods: implications for behavioral assessment.

    PubMed

    Pence, Sacha T; Roscoe, Eileen M; Bourret, Jason C; Ahearn, William H

    2009-01-01

    This study compared the outcomes of three descriptive analysis methods-the ABC method, the conditional probability method, and the conditional and background probability method-to each other and to the results obtained from functional analyses. Six individuals who had been diagnosed with developmental delays and exhibited problem behavior participated. Functional analyses indicated that participants' problem behavior was maintained by social positive reinforcement (n = 2), social negative reinforcement (n = 2), or automatic reinforcement (n = 2). Results showed that for all but 1 participant, descriptive analysis outcomes were similar across methods. In addition, for all but 1 participant, the descriptive analysis outcome differed substantially from the functional analysis outcome. This supports the general finding that descriptive analysis is a poor means of determining functional relations.

  15. A comparison of radiative capture with decay gamma-ray method in bore hole logging for economic minerals

    USGS Publications Warehouse

    Senftle, F.E.; Moxham, R.M.; Tanner, A.B.

    1972-01-01

    The recent availability of borehole logging sondes employing a source of neutrons and a Ge(Li) detector opens up the possibility of analyzing either decay or capture gamma rays. The most efficient method for a given element can be predicted by calculating the decay-to-capture count ratio for the most prominent peaks in the respective spectra. From a practical point of view, such a calculation must be slanted toward short irradiation and count times at each station in a borehole. A simplified method of computation is shown, and the decay-to-capture count ratio has been calculated and tabulated for the optimum value in the decay mode irrespective of the irradiation time, and also for a ten-minute irradiation time. Based on analysis of a single peak in each spectrum, the results indicate the preferred technique and the best decay or capture peak to observe for those elements of economic interest. © 1972.

  16. Bridging gaps in handoffs: a continuity of care based approach.

    PubMed

    Abraham, Joanna; Kannampallil, Thomas G; Patel, Vimla L

    2012-04-01

    Handoff among healthcare providers has been recognized as a major source of medical errors. Most prior research has focused on the communication aspects of handoff, with limited emphasis on the overall handoff process, especially from a clinician workflow perspective. Such a workflow perspective, based on the continuity of care model, provides the framework required to identify and support an interconnected trajectory of care events affecting handoff communication. To this end, we propose a new methodology, referred to as the clinician-centered approach, that allows us to investigate and represent the entire clinician workflow prior to, during, and after handoff communication. This representation of clinician activities supports a comprehensive analysis of the interdependencies in the handoff process across the care continuum, as opposed to a single discrete information-sharing activity. The clinician-centered approach is supported by multifaceted methods for data collection, such as observations, shadowing of clinicians, audio recording of handoff communication, semi-structured interviews, and artifact identification and collection. The analysis followed a two-stage mixed inductive-deductive method. The iterative development of the clinician-centered approach was realized in a multifaceted study conducted in the Medical Intensive Care Unit (MICU) of an academic hospital. Using the clinician-centered approach, we (a) identify the nature, inherent characteristics, and interdependencies of the three phases of the handoff process and (b) develop a descriptive framework of handoff communication in critical care that captures the non-linear, recursive, and interactive nature of collaboration and decision-making.
The results reported in this paper serve as a "proof of concept" of our approach, emphasizing the importance of capturing a coordinated and uninterrupted succession of clinician information management and transfer activities in relation to patient care events. Copyright © 2011 Elsevier Inc. All rights reserved.

  17. The ecology and larval habitats characteristics of anopheline mosquitoes (Diptera: Culicidae) in Aligudarz County (Luristan province, western Iran)

    PubMed Central

    Amani, Hamid; Yaghoobi-Ershadi, Mohammad Reza; Kassiri, Hamid

    2014-01-01

    Objective To determine the ecology and characteristics of the larval habitats of the genus Anopheles (Diptera: Culicidae) in Aligudarz County, western Iran. Methods This descriptive cross-sectional research was carried out to study anopheline larval ecology in seven rural districts of Aligudarz County from late April to late November 1997. Larvae were captured using the dipping method. Characteristics of the larval breeding places were noted according to water condition (turbid or clean, stagnant or running), substrate type, site type (man-made or natural), sunlight exposure, and site status (transient or permanent, with or without vegetation). Results A total of 9,620 3rd and 4th instar Anopheles larvae from 115 breeding places in 22 villages were captured, belonging to the following species: Anopheles stephensi, Anopheles d'thali, Anopheles apoci, Anopheles superpictus (forms A and B), Anopheles marterii sogdianus, Anopheles turkhodi, Anopheles maculipennis s.l. and Anopheles claviger. Anopheles stephensi, Anopheles maculipennis s.l. and Anopheles apoci were collected for the first time in this county. Anopheles superpictus (93.18%) was the most prevalent species and was dispersed over the entire region. Larval habitats comprised nine natural and three artificial types. The most important larval habitats were river edges (54.8%), rice fields (12.2%), and grassland (8.7%), with permanent or transient, stagnant or running, clean water, with or without vegetation, and sand or mud substrate in full sunlight. Conclusions According to this research, river edges and rice fields are the most important breeding places of malaria vectors in Aligudarz County, which is worthy of note in larviciding programs. PMID:25183088

  18. The use of mechanistic descriptions of algal growth and zooplankton grazing in an estuarine eutrophication model

    NASA Astrophysics Data System (ADS)

    Baird, M. E.; Walker, S. J.; Wallace, B. B.; Webster, I. T.; Parslow, J. S.

    2003-03-01

    A simple model of estuarine eutrophication is built on biomechanical (or mechanistic) descriptions of a number of the key ecological processes in estuaries. Mechanistically described processes include the nutrient uptake and light capture of planktonic and benthic autotrophs, and the encounter rates of planktonic predators and prey. Other more complex processes, such as sediment biogeochemistry, detrital processes and phosphate dynamics, are modelled using empirical descriptions from the Port Phillip Bay Environmental Study (PPBES) ecological model. A comparison is made between the mechanistically determined rates of ecological processes and the analogous empirically determined rates in the PPBES ecological model. The rates generally agree, with a few significant exceptions. Model simulations were run at a range of estuarine depths and nutrient loads, with outputs presented as the annually averaged biomass of autotrophs. The simulations followed a simple conceptual model of eutrophication, suggesting a simple biomechanical understanding of estuarine processes can provide a predictive tool for ecological processes in a wide range of estuarine ecosystems.

  19. What's Next in Complex Networks? Capturing the Concept of Attacking Play in Invasive Team Sports.

    PubMed

    Ramos, João; Lopes, Rui J; Araújo, Duarte

    2018-01-01

    The evolution of performance analysis within sports sciences is tied to technology development and practitioner demands. However, how individual and collective patterns self-organize and interact in invasive team sports remains elusive. Social network analysis has recently been proposed to resolve some aspects of this problem; it has proven successful in capturing collective features resulting from the interactions between team members and has served as a powerful communication tool. Despite these advances, some fundamental team sports concepts, such as an attacking play, have not been properly captured by the more common applications of social network analysis to team sports performance. In this article, we propose a novel approach to team sports performance centered on sport concepts, namely that of an attacking play. Network theory and tools, including temporal and bipartite or multilayered networks, were used to capture this concept. We put forward eight questions directly related to team performance to discuss how common pitfalls in the use of network tools for capturing sports concepts can be avoided. Some answers are advanced in an attempt to be more precise in the description of team dynamics and to uncover other metrics directly applicable to sport concepts, such as the structure and dynamics of attacking plays. Finally, we propose that, at this stage of knowledge, it may be advantageous to build up from fundamental sport concepts toward complex network theory and tools, and not the other way around.

  20. Precise calculation of neutron-capture reactions contribution in energy release for different types of VVER-1000 fuel assemblies

    NASA Astrophysics Data System (ADS)

    Tikhomirov, Georgy; Bahdanovich, Rynat; Pham, Phu

    2017-09-01

    Precise calculation of energy release in a nuclear reactor is necessary to obtain the correct spatial power distribution and to predict the characteristics of burned nuclear fuel. In this work, a previously developed method for calculating the contribution of neutron-capture reactions (the capture component) to the effective energy release in the core of a nuclear reactor is discussed. The method was improved and applied to different models of the VVER-1000 reactor developed for the MCU 5 and MCNP 4 computer codes. Different models of an equivalent cell and a fuel assembly at the beginning of the fuel cycle were calculated. These models differ in geometry, fuel enrichment, and the presence of burnable absorbers. It is shown that the capture component depends on fuel enrichment and the presence of burnable absorbers. Its value varies for different types of hot fuel assemblies from 3.35% to 3.85% of the effective energy release. The average capture-component contribution to the effective energy release for typical serial fresh fuel of the VVER-1000 is 3.5%, which corresponds to 7 MeV/fission. The method will be used in the future to estimate the dependence of the capture energy on fuel density, burn-up, etc.
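The quoted figures are mutually consistent, as a quick check shows: a capture component that is 3.5% of the effective energy release and equals 7 MeV/fission implies a total effective energy release of about 200 MeV per fission, the conventional order of magnitude for fission energy.

```python
# Consistency check of the quoted numbers: capture component fraction and
# its value in MeV/fission imply the total effective energy release.
capture_fraction = 0.035       # 3.5% of effective energy release
capture_energy_mev = 7.0       # MeV per fission attributed to capture
total_mev_per_fission = capture_energy_mev / capture_fraction
print(round(total_mev_per_fission))  # -> 200
```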

  1. Novel shortcut estimation method for regeneration energy of amine solvents in an absorption-based carbon capture process.

    PubMed

    Kim, Huiyong; Hwang, Sung June; Lee, Kwang Soon

    2015-02-03

    Among various CO2 capture processes, the aqueous amine-based absorption process is considered the most promising for near-term deployment. However, performance evaluation of newly developed solvents still requires complex and time-consuming procedures, such as pilot plant tests or the development of a rigorous simulator. The absence of accurate yet simple methods for calculating energy performance at an early stage of process development has lengthened, and increased the expense of, developing economically feasible CO2 capture processes. In this paper, a novel but simple method to reliably calculate the regeneration energy in a standard amine-based carbon capture process is proposed. Careful examination of stripper behavior and exploitation of energy balance equations around the stripper allow the regeneration energy to be calculated using only vapor-liquid equilibrium and caloric data. The reliability of the proposed method was confirmed by comparison with rigorous simulations for two well-known solvents, monoethanolamine (MEA) and piperazine (PZ). The proposed method can predict the regeneration energy at various operating conditions with greater simplicity, greater speed, and higher accuracy than methods proposed in previous studies. This enables faster and more precise screening of solvents and faster optimization of process variables, and can ultimately accelerate the development of economically deployable CO2 capture processes.

  2. Multiscale fracture network characterization and impact on flow: A case study on the Latemar carbonate platform

    NASA Astrophysics Data System (ADS)

    Hardebol, N. J.; Maier, C.; Nick, H.; Geiger, S.; Bertotti, G.; Boro, H.

    2015-12-01

    A fracture network arrangement is quantified across an isolated carbonate platform from outcrop and aerial imagery to address its impact on fluid flow. The network is described in terms of fracture density, orientation, and length distribution parameters. Of particular interest is the role of fracture cross connections and abutments in the effective permeability. Hence, the flow simulations explicitly account for network topology by adopting a Discrete-Fracture-and-Matrix description. The interior of the Latemar carbonate platform (Dolomites, Italy) is taken as an outcrop analogue for subsurface reservoirs in isolated carbonate build-ups that exhibit fracture-dominated permeability. Novel here is our dual strategy of describing the fracture network with both deterministic and stochastic inputs for flow simulations. The fracture geometries are captured explicitly and form a multiscale data set through the integration of interpretations from outcrops, airborne imagery, and lidar. The deterministic network descriptions form the basis for descriptive rules that are diagnostic of the complex natural fracture arrangement. The fracture networks exhibit a variable degree of multitier hierarchy, with smaller fractures abutting against larger fractures at both right and oblique angles. The influence of network topology on connectivity is quantified using discrete-fracture single-phase fluid flow simulations. The simulation results show that the effective permeability of the fracture-and-matrix ensemble can be 50 to 400 times higher than the matrix permeability of 1.0 × 10⁻¹⁴ m². The permeability enhancement is strongly controlled by the connectivity of the fracture network. Therefore, the degree of intersecting and abutting fractures should be captured accurately from outcrops to be of value as an analogue.

  3. Perturbative Gaussianizing transforms for cosmological fields

    NASA Astrophysics Data System (ADS)

    Hall, Alex; Mead, Alexander

    2018-01-01

    Constraints on cosmological parameters from large-scale structure have traditionally been obtained from two-point statistics. However, non-linear structure formation renders these statistics insufficient in capturing the full information content available, necessitating the measurement of higher order moments to recover information which would otherwise be lost. We construct quantities based on non-linear and non-local transformations of weakly non-Gaussian fields that Gaussianize the full multivariate distribution at a given order in perturbation theory. Our approach does not require a model of the fields themselves and takes as input only the first few polyspectra, which could be modelled or measured from simulations or data, making our method particularly suited to observables lacking a robust perturbative description such as the weak-lensing shear. We apply our method to simulated density fields, finding a significantly reduced bispectrum and an enhanced correlation with the initial field. We demonstrate that our method reconstructs a large proportion of the linear baryon acoustic oscillations, improving the information content over the raw field by 35 per cent. We apply the transform to toy 21 cm intensity maps, showing that our method still performs well in the presence of complications such as redshift-space distortions, beam smoothing, pixel noise and foreground subtraction. We discuss how this method might provide a route to constructing a perturbative model of the fully non-Gaussian multivariate likelihood function.

  4. ODE constrained mixture modelling: a method for unraveling subpopulation structures and dynamics.

    PubMed

    Hasenauer, Jan; Hasenauer, Christine; Hucho, Tim; Theis, Fabian J

    2014-07-01

    Functional cell-to-cell variability is ubiquitous in multicellular organisms as well as bacterial populations. Even genetically identical cells of the same cell type can respond differently to identical stimuli. Methods have been developed to analyse heterogeneous populations, e.g., mixture models and stochastic population models. The available methods are, however, either incapable of simultaneously analysing different experimental conditions or are computationally demanding and difficult to apply. Furthermore, they do not account for biological information available in the literature. To overcome disadvantages of existing methods, we combine mixture models and ordinary differential equation (ODE) models. The ODE models provide a mechanistic description of the underlying processes while mixture models provide an easy way to capture variability. In a simulation study, we show that the class of ODE constrained mixture models can unravel the subpopulation structure and determine the sources of cell-to-cell variability. In addition, the method provides reliable estimates for kinetic rates and subpopulation characteristics. We use ODE constrained mixture modelling to study NGF-induced Erk1/2 phosphorylation in primary sensory neurones, a process relevant in inflammatory and neuropathic pain. We propose a mechanistic pathway model for this process and reconstructed static and dynamical subpopulation characteristics across experimental conditions. We validate the model predictions experimentally, which verifies the capabilities of ODE constrained mixture models. These results illustrate that ODE constrained mixture models can reveal novel mechanistic insights and possess a high sensitivity.

  5. Estimating Incidence Rate of Hospital-Treated Self-Harm in Hong Kong Using Capture-Recapture Approach.

    PubMed

    Leung Kwok, Chi; Yip, Paul S F

    2018-05-01

    A surveillance system for self-harm has not been established in Hong Kong. The existing data source has an unknown degree of underreporting, and therefore a capture-recapture method has been proposed to correct for the incompleteness. To assess the underestimation of the incidence of self-harm cases presenting to hospitals in Hong Kong using a capture-recapture method. Two different yet overlapping hospital administrative datasets of self-harm were obtained from all public hospitals in Hong Kong. From 2002 to 2011, 59,473 distinct episodes involving 36,411 patients were identified. A capture-recapture model accommodating heterogeneous capture probabilities was applied to estimate the number of self-harm episodes. The estimated number of self-harm episodes was 79,923, shared equally between females and males. Cases of self-harm by females were more likely to be ascertained than those by males. The estimated annual incidence rate of self-harm in Hong Kong from 2002 to 2011 ranged from 96.4 in 2010 to 132.7 in 2002. The proposed method does not include patients who required no medical attention or those who consulted private doctors. The capture-recapture model is a useful method for adjusting for the underestimation of self-harm cases in existing databases when a surveillance system is not available, and it can reveal otherwise hidden patterns.
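The study applies a model allowing heterogeneous capture probabilities; as a simpler hedged sketch of the underlying principle, the classical two-source (Chapman) estimator below combines the counts from two overlapping lists. The counts are hypothetical, not the Hong Kong data.

```python
def chapman_estimate(n1, n2, m):
    """Chapman's nearly unbiased two-source capture-recapture estimator:
    N_hat = (n1 + 1)(n2 + 1)/(m + 1) - 1, where n1 and n2 are the case
    counts in the two lists and m is the number appearing in both.
    Assumes homogeneous capture probabilities and independent lists."""
    return (n1 + 1) * (n2 + 1) / (m + 1) - 1

# hypothetical: 4,000 episodes in dataset A, 3,500 in B, 2,800 in both
print(round(chapman_estimate(4000, 3500, 2800)))  # -> 5000
```

The gap between the estimate and the union of the two lists quantifies the underreporting that neither administrative source captures on its own; heterogeneous-probability models refine this by relaxing the homogeneity assumption.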

  6. Simulating Thermal Cycling and Isothermal Deformation Response of Polycrystalline NiTi

    NASA Technical Reports Server (NTRS)

    Manchiraju, Sivom; Gaydosh, Darrell J.; Noebe, Ronald D.; Anderson, Peter M.

    2011-01-01

    A microstructure-based FEM model that couples crystal plasticity, crystallographic descriptions of the B2-B19' martensitic phase transformation, and anisotropic elasticity is used to simulate thermal cycling and isothermal deformation in polycrystalline NiTi (49.9 at.% Ni). The model inputs include anisotropic elastic properties, polycrystalline texture, DSC data, and a subset of isothermal deformation and load-biased thermal cycling data. A key experimental trend is captured, namely that the transformation strain during thermal cycling is predicted to reach a peak with increasing bias stress, due to the onset of plasticity at larger bias stress. Plasticity induces internal stress that affects both the thermal cycling and isothermal deformation responses. Affected thermal cycling features include hysteretic width, the two-way shape memory effect, and the evolution of texture with increasing bias stress. Affected isothermal deformation features include increased hardening during loading and retained martensite after unloading. These trends are not captured by microstructural models that lack plasticity, nor are they all captured in a robust manner by phenomenological approaches. Despite this advance in microstructural modeling, quantitative differences exist, such as underprediction of the open-loop strain during thermal cycling.

  7. FBI Fingerprint Image Capture System High-Speed-Front-End throughput modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rathke, P.M.

    1993-09-01

    The Federal Bureau of Investigation (FBI) has undertaken a major modernization effort called the Integrated Automated Fingerprint Identification System (IAFIS). This system will provide centralized identification services using automated fingerprint, subject descriptor, mugshot, and document processing. A high-speed Fingerprint Image Capture System (FICS) is under development as part of the IAFIS program. The FICS will capture digital and microfilm images of FBI fingerprint cards for input into a central database. One FICS design supports two front-end scanning subsystems, known as the High-Speed-Front-End (HSFE) and Low-Speed-Front-End, to supply image data to a common data processing subsystem. The production rate of the HSFE is critical to meeting the FBI's fingerprint card processing schedule. A model of the HSFE has been developed to help identify the issues driving the production rate, assist in the development of component specifications, and guide the evolution of an operations plan. A description of the model development is given, the assumptions are presented, and some HSFE throughput analysis is performed.

  8. A study on validating KinectV2 in comparison of Vicon system as a motion capture system for using in Health Engineering in industry

    NASA Astrophysics Data System (ADS)

    Jebeli, Mahvash; Bilesan, Alireza; Arshi, Ahmadreza

    2017-06-01

    The currently available commercial motion capture systems are constrained by space requirements and thus pose difficulties when used to develop kinematic descriptions of human movements within existing manufacturing and production cells. The Kinect sensor does not share these limitations, but it is not as accurate. The proposition made in this article is to adopt the Kinect sensor to facilitate the implementation of Health Engineering concepts in industrial environments. This article is an evaluation of the accuracy of the Kinect sensor when providing three-dimensional kinematic data. The sensor is thus utilized to assist in modeling and simulation of worker performance within an industrial cell. For this purpose, Kinect 3D data were compared to those of a Vicon motion capture system in a gait analysis laboratory. Results indicated that the Kinect sensor exhibited a coefficient of determination of 0.9996 on the depth axis, 0.9849 along the horizontal axis, and 0.2767 on the vertical axis. The results support the competency of the Kinect sensor for use in industrial environments.
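    The record does not state how the coefficients of determination were computed; a minimal sketch of one common convention (R² of the Kinect trace evaluated against the Vicon trace as reference) is below, with invented trajectory values.

```python
import numpy as np

def r_squared(reference, measured):
    """Coefficient of determination, 1 - SS_res / SS_tot, of `measured`
    evaluated against `reference` (here the Vicon trace)."""
    reference = np.asarray(reference, dtype=float)
    measured = np.asarray(measured, dtype=float)
    ss_res = np.sum((reference - measured) ** 2)          # residual error
    ss_tot = np.sum((reference - reference.mean()) ** 2)  # total variance
    return 1.0 - ss_res / ss_tot

# Hypothetical joint positions along one axis (metres), Vicon as reference.
vicon = np.array([0.10, 0.12, 0.15, 0.19, 0.24])
kinect = np.array([0.11, 0.12, 0.14, 0.20, 0.24])
r2 = r_squared(vicon, kinect)
```

    Note that an alternative convention squares the Pearson correlation instead; the two conventions agree only when a linear model is fitted to the data.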

  9. Identifying sources of heterogeneity in capture probabilities: An example using the Great Tit Parus major

    USGS Publications Warehouse

    Senar, J.C.; Conroy, M.J.; Carrascal, L.M.; Domenech, J.; Mozetich, I.; Uribe, F.

    1999-01-01

    Heterogeneous capture probabilities are a common problem in many capture-recapture studies. Several methods of detecting the presence of such heterogeneity are currently available, and stratification of data has been suggested as the standard method to avoid its effects. However, few studies have tried to identify sources of heterogeneity, or whether there are interactions among sources. The aim of this paper is to suggest an analytical procedure to identify sources of capture heterogeneity. We use data on the sex and age of Great Tits captured in baited funnel traps at two localities differing in average temperature. We additionally use 'recapture' data obtained by videotaping at a feeder (with no associated trap), where tits ringed with different colours were recorded. This allowed us to test whether individuals in different classes (age, sex and condition) are not trapped because of trap shyness or because of a reduced use of the bait. We used logistic regression analysis of the capture probabilities to test for the effects of age, sex, condition, location and 'recapture' method. The results showed a higher recapture probability in the colder locality. Yearling birds (either males or females) had the highest recapture probabilities, followed by adult males, while adult females had the lowest recapture probabilities. There was no effect of the method of 'recapture' (trap or videotape), which suggests that adult females are less often captured in traps not because of trap-shyness but because of less dependence on supplementary food. The potential use of this methodological approach in other studies is discussed.
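    As a rough illustration of the logistic-regression step (not the authors' dataset or software; the covariate coding and outcomes below are invented), capture can be modelled as a Bernoulli outcome on individual covariates:

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, n_iter=5000):
    """Plain gradient ascent on the Bernoulli log-likelihood; returns
    weights with the intercept first."""
    X = np.hstack([np.ones((len(X), 1)), np.asarray(X, dtype=float)])
    w = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ w))   # predicted capture probability
        w += lr * X.T @ (y - p) / len(y)   # average log-likelihood gradient
    return w

# Invented covariates: [age (0 = yearling, 1 = adult), sex (0 = female, 1 = male)]
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]] * 25)
y = np.array([1, 1, 0, 1] * 25)  # 1 = recaptured
coef = fit_logistic(X, y)
```

    In this toy data the fitted age coefficient comes out negative (adults recaptured less often) and the sex coefficient positive, mirroring the kind of age-by-sex contrast the paper tests.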

  10. Fluorescent detection of C-reactive protein using polyamide beads

    NASA Astrophysics Data System (ADS)

    Jagadeesh, Shreesha; Chen, Lu; Aitchison, Stewart

    2016-03-01

    Bacterial infection causes sepsis, which is one of the leading causes of mortality in hospitals. This infection can be quantified from blood plasma using C-reactive protein (CRP). A quick diagnosis at the patient's location through Point-of-Care (POC) testing could give doctors the confidence to prescribe antibiotics. In this paper, the development and testing of a bead-based procedure for CRP quantification is described. The size of the beads enables them to be trapped in wells without the need for magnetic methods of immobilization. Large (1.5 mm diameter) polyamide nylon beads were used as the substrate for capturing CRP from pure analyte samples. The beads captured CRP either directly through adsorption or indirectly by having specific capture antibodies on their surface. Both methods used fluorescent imaging techniques to quantify the protein. The amount of CRP needed to give a sufficient fluorescent signal through the direct capture method was found suitable for identifying bacterial causes of infection. Similarly, viral infections could be quantified by the more sensitive indirect capture method. This bead-based assay can potentially be integrated as a disposable cartridge in a POC device due to its passive nature and the small quantities needed.

  11. Method for the capture and storage of waste

    DOEpatents

    None

    2017-01-24

    Systems and methods for capturing waste are disclosed. The systems and methods provide for a high level of confinement and long-term stability. The systems and methods include adsorbing waste into a metal-organic framework (MOF) and applying pressure to the MOF material to crystallize or amorphize it, thereby changing the MOF's pore structure and sorption characteristics without collapsing the MOF framework.

  12. The Dynamic Photometric Stereo Method Using a Multi-Tap CMOS Image Sensor †

    PubMed Central

    Yoda, Takuya; Nagahara, Hajime; Taniguchi, Rin-ichiro; Kagawa, Keiichiro; Yasutomi, Keita; Kawahito, Shoji

    2018-01-01

    The photometric stereo method enables estimation of surface normals from images that have been captured using different but known lighting directions. The classical photometric stereo method requires at least three images to determine the normals in a given scene. However, this method cannot be applied to dynamic scenes because it is assumed that the scene remains static while the required images are captured. In this work, we present a dynamic photometric stereo method for estimation of the surface normals in a dynamic scene. We use a multi-tap complementary metal-oxide-semiconductor (CMOS) image sensor to capture the input images required for the proposed photometric stereo method. This image sensor can divide the electrons generated at a single pixel's photodiode among different exposure taps, and can thus capture multiple images under different lighting conditions with almost identical timing. We implemented a camera lighting system and created a software application to enable estimation of the normal map in real time. We also evaluated the accuracy of the estimated surface normals and demonstrated that our proposed method can estimate the surface normals of dynamic scenes. PMID:29510599
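    The per-pixel estimation step of classical photometric stereo can be sketched as a least-squares solve. This is a generic Lambertian formulation, not the authors' implementation; the lighting directions and albedo below are made up.

```python
import numpy as np

def estimate_normal(light_dirs, intensities):
    """Solve L @ (albedo * n) = I in the least-squares sense and return
    the unit surface normal (Lambertian reflectance assumed)."""
    L = np.asarray(light_dirs, dtype=float)   # one row per light direction
    I = np.asarray(intensities, dtype=float)  # one intensity per light
    g, *_ = np.linalg.lstsq(L, I, rcond=None) # g = albedo * n
    return g / np.linalg.norm(g)

# Hypothetical setup: three linearly independent unit-ish lights and a
# surface normal pointing straight at the camera, albedo 0.7.
lights = np.array([[0.0, 0.0, 1.0],
                   [0.8, 0.0, 0.6],
                   [0.0, 0.8, 0.6]])
true_n = np.array([0.0, 0.0, 1.0])
measured = lights @ (0.7 * true_n)  # synthetic Lambertian intensities
n = estimate_normal(lights, measured)
```

    With three or more linearly independent lights the scaled normal g = albedo·n is recovered per pixel, and its magnitude gives the albedo; the multi-tap sensor's contribution in the paper is capturing those differently lit intensities almost simultaneously.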

  13. Capture Versus Capture Zones: Clarifying Terminology Related to Sources of Water to Wells.

    PubMed

    Barlow, Paul M; Leake, Stanley A; Fienen, Michael N

    2018-03-15

    The term capture, related to the source of water derived from wells, has been used in two distinct yet related contexts by the hydrologic community. The first is a water-budget context, in which capture refers to decreases in the rates of groundwater outflow and (or) increases in the rates of recharge along head-dependent boundaries of an aquifer in response to pumping. The second is a transport context, in which capture zone refers to the specific flowpaths that define the three-dimensional, volumetric portion of a groundwater flow field that discharges to a well. A closely related issue that has become associated with the source of water to wells is streamflow depletion, which refers to the reduction in streamflow caused by pumping, and is a type of capture. Rates of capture and streamflow depletion are calculated by use of water-budget analyses, most often with groundwater-flow models. Transport models, particularly particle-tracking methods, are used to determine capture zones to wells. In general, however, transport methods are not useful for quantifying actual or potential streamflow depletion or other types of capture along aquifer boundaries. To clarify the sometimes subtle differences among these terms, we describe the processes and relations among capture, capture zones, and streamflow depletion, and provide proposed terminology to distinguish among them. Published 2018. This article is a U.S. Government work and is in the public domain in the USA. Groundwater published by Wiley Periodicals, Inc. on behalf of National Ground Water Association.

  14. Introduction to the special collection of papers on the San Luis Basin Sustainability Metrics Project: a methodology for evaluating regional sustainability.

    PubMed

    Heberling, Matthew T; Hopton, Matthew E

    2012-11-30

    This paper introduces a collection of four articles describing the San Luis Basin Sustainability Metrics Project. The Project developed a methodology for evaluating regional sustainability. This introduction provides the necessary background information for the project, description of the region, overview of the methods, and summary of the results. Although there are a multitude of scientifically based sustainability metrics, many are data intensive, difficult to calculate, and fail to capture all aspects of a system. We wanted to see if we could develop an approach that decision-makers could use to understand if their system was moving toward or away from sustainability. The goal was to produce a scientifically defensible, but straightforward and inexpensive methodology to measure and monitor environmental quality within a regional system. We initiated an interdisciplinary pilot project in the San Luis Basin, south-central Colorado, to test the methodology. The objectives were: 1) determine the applicability of using existing datasets to estimate metrics of sustainability at a regional scale; 2) calculate metrics through time from 1980 to 2005; and 3) compare and contrast the results to determine if the system was moving toward or away from sustainability. The sustainability metrics, chosen to represent major components of the system, were: 1) Ecological Footprint to capture the impact and human burden on the system; 2) Green Net Regional Product to represent economic welfare; 3) Emergy to capture the quality-normalized flow of energy through the system; and 4) Fisher information to capture the overall dynamic order and to look for possible regime changes. The methodology, data, and results of each metric are presented in the remaining four papers of the special collection. Based on the results of each metric and our criteria for understanding the sustainability trends, we find that the San Luis Basin is moving away from sustainability. Although we understand there are strengths and limitations of the methodology, we argue that each metric identifies changes to major components of the system. Published by Elsevier Ltd.

  15. Nystagmus Assessments Documented by Emergency Physicians in Acute Dizziness Presentations: A Target for Decision Support?

    PubMed Central

    Kerber, Kevin A.; Morgenstern, Lewis B.; Meurer, William J.; McLaughlin, Thomas; Hall, Pamela A.; Forman, Jane; Fendrick, A. Mark; Newman-Toker, David E.

    2011-01-01

    Objectives Dizziness is a common presenting complaint to the emergency department (ED), and emergency physicians (EPs) consider these presentations a priority for decision support. Assessing for nystagmus and defining its features are important steps for any acute dizziness decision algorithm. The authors sought to describe nystagmus documentation in routine ED care to determine if nystagmus assessments might be an important target in decision support efforts. Methods Medical records from ED visits for dizziness were captured as part of a surveillance study embedded within an ongoing population-based cohort study. Visits with documentation of a nystagmus assessment were reviewed and coded for presence or absence of nystagmus, ability to draw a meaningful inference from the description, and coherence with the final EP diagnosis when a peripheral vestibular diagnosis was made. Results Of 1,091 visits for dizziness, 887 (81.3%) documented a nystagmus assessment. Nystagmus was present in 185 out of 887 (20.9%) visits. When nystagmus was present, no further characteristics were recorded in 48 of the 185 visits (26%). The documentation of nystagmus (including all descriptors recorded) enabled a meaningful inference about the localization or cause in only 10 of the 185 (5.4%) visits. The nystagmus description conflicted with the EP diagnosis in 113 (80.7%) of the 140 visits that received a peripheral vestibular diagnosis. Conclusions Nystagmus assessments are frequently documented in acute dizziness presentations, but details do not generally enable a meaningful inference. Recorded descriptions usually conflict with the diagnosis when a peripheral vestibular diagnosis is rendered. Nystagmus assessments might be an important target in developing decision support for dizziness presentations. PMID:21676060

  16. Variable High Order Multiblock Overlapping Grid Methods for Mixed Steady and Unsteady Multiscale Viscous Flows

    NASA Technical Reports Server (NTRS)

    Sjogreen, Bjoern; Yee, H. C.

    2007-01-01

    Flows containing steady or nearly steady strong shocks in parts of the flow field, and unsteady turbulence with shocklets on other parts of the flow field, are difficult to capture accurately and efficiently employing the same numerical scheme even under the multiblock grid or adaptive grid refinement framework. On one hand, sixth-order or higher shock-capturing methods are appropriate for unsteady turbulence with shocklets. On the other hand, lower order shock-capturing methods are more effective for strong steady shocks in terms of convergence. In order to minimize the shortcomings of low order and high order shock-capturing schemes for the subject flows, a multiblock overlapping grid with different orders of accuracy on different blocks is proposed. Test cases to illustrate the performance of the new solver are included.

  17. Design Evaluation for Personnel, Training and Human Factors (DEPTH) Final Report.

    DTIC Science & Technology

    1998-01-17

    human activity was primarily intended to facilitate man-machine design analyses of complex systems. By importing computer aided design (CAD) data, the human figure models and analysis algorithms can help to ensure components can be seen, reached, lifted and removed by most maintainers. These simulations are also useful for logistics data capture, training, and task analysis. DEPTH was also found to be useful in obtaining task descriptions for technical

  18. Optical holography applications for the zero-g Atmospheric Cloud Physics Laboratory

    NASA Technical Reports Server (NTRS)

    Kurtz, R. L.

    1974-01-01

    A complete description of holography is provided, both for the time-dependent case of moving scene holography and for the time-independent case of stationary holography. Further, a specific holographic arrangement is presented for application to the detection of particle size distribution in an atmospheric simulation cloud chamber. In this chamber particle growth rate is investigated; therefore, the proposed holographic system must capture continuous particle motion in real time. Such a system is described.

  19. Validity of Eye Movement Methods and Indices for Capturing Semantic (Associative) Priming Effects

    ERIC Educational Resources Information Center

    Odekar, Anshula; Hallowell, Brooke; Kruse, Hans; Moates, Danny; Lee, Chao-Yang

    2009-01-01

    Purpose: The purpose of this investigation was to evaluate the usefulness of eye movement methods and indices as a tool for studying priming effects by verifying whether eye movement indices capture semantic (associative) priming effects in a visual cross-format (written word to semantically related picture) priming paradigm. Method: In the…

  20. Laser capture microdissection: Arcturus(XT) infrared capture and UV cutting methods.

    PubMed

    Gallagher, Rosa I; Blakely, Steven R; Liotta, Lance A; Espina, Virginia

    2012-01-01

    Laser capture microdissection (LCM) is a technique that allows the precise procurement of enriched cell populations from a heterogeneous tissue under direct microscopic visualization. LCM can be used to harvest the cells of interest directly or can be used to isolate specific cells by ablating the unwanted cells, resulting in histologically enriched cell populations. The fundamental components of laser microdissection technology are (a) visualization of the cells of interest via microscopy, (b) transfer of laser energy to a thermolabile polymer with either the formation of a polymer-cell composite (capture method) or transfer of laser energy via an ultraviolet laser to photovolatilize a region of tissue (cutting method), and (c) removal of cells of interest from the heterogeneous tissue section. Laser energy supplied by LCM instruments can be infrared (810 nm) or ultraviolet (355 nm). Infrared lasers melt thermolabile polymers for cell capture, whereas ultraviolet lasers ablate cells for either removal of unwanted cells or excision of a defined area of cells. LCM technology is applicable to an array of applications including mass spectrometry, DNA genotyping and loss-of-heterozygosity analysis, RNA transcript profiling, cDNA library generation, proteomics discovery, and signal kinase pathway profiling. This chapter describes the unique features of the Arcturus(XT) laser capture microdissection instrument, which incorporates both infrared capture and ultraviolet cutting technology in one instrument, using a proteomic downstream assay as a model.

  1. Correlation of Descriptive Analysis and Instrumental Puncture Testing of Watermelon Cultivars.

    PubMed

    Shiu, J W; Slaughter, D C; Boyden, L E; Barrett, D M

    2016-06-01

    The textural properties of 5 seedless watermelon cultivars were assessed by descriptive analysis and the standard puncture test using a hollow probe with increased shearing properties. The use of descriptive analysis methodology was an effective means of quantifying watermelon sensory texture profiles for characterizing specific cultivars' characteristics. Of the 10 cultivars screened, 71% of the variation in the sensory attributes was explained by the first two principal components. Pairwise correlation of the hollow puncture probe and sensory parameters determined that initial slope, maximum force, and work after maximum force measurements all correlated well to the sensory attributes crisp and firm. These findings confirm that maximum force correlates well with not only firmness in watermelon, but crispness as well. The initial slope parameter also captures the sensory crispness of watermelon, but is not as practical to measure in the field as maximum force. The work after maximum force parameter is thought to reflect cellular arrangement and membrane integrity that in turn impact sensory firmness and crispness. Watermelon cultivar types were correctly predicted by puncture test measurements in heart tissue 87% of the time, whereas descriptive analysis was correct 54% of the time. © 2016 Institute of Food Technologists®

  2. First record of Mollisquama sp. (Chondrichthyes: Squaliformes: Dalatiidae) from the Gulf of Mexico, with a morphological comparison to the holotype description of Mollisquama parini Dolganov.

    PubMed

    Grace, Mark A; Doosey, Michael H; Bart, Henry L; Naylor, Gavin J P

    2015-04-22

    The description of the pocket shark genus Mollisquama (M. parini Dolganov, 1984) is based on a single known specimen collected from the Nazca Ridge of the southeast Pacific Ocean. A second Mollisquama specimen has been captured in the central Gulf of Mexico establishing a considerable range extension and a parturition locality because the specimen has a healed vitelline scar. Both the holotype of M. parini and the Gulf of Mexico specimen possess the remarkable pocket gland with its large slit-like external opening located just above the pectoral fin. Features found on the Gulf of Mexico specimen that were not noted in the description of M. parini include a series of ventral abdominal photophore agglomerations and a modified dermal denticle surrounded by a radiating arrangement of denticles just posterior to the mouth. Based on a morphometric and meristic comparison of the Gulf of Mexico specimen with information in the description of M. parini, the Gulf of Mexico specimen is identified as Mollisquama sp. due to differences in tooth morphology and vertebral counts. Phylogenetic analysis of NADH2 gene sequences places Mollisquama sister to Dalatias plus Isistius within the family Dalatiidae.

  3. On the upscaling of process-based models in deltaic applications

    NASA Astrophysics Data System (ADS)

    Li, L.; Storms, J. E. A.; Walstra, D. J. R.

    2018-03-01

    Process-based numerical models are increasingly used to study the evolution of marine and terrestrial depositional environments. Whilst a detailed description of small-scale processes provides an accurate representation of reality, application on geological timescales is restrained by the associated increase in computational time. In order to reduce the computational time, a number of acceleration methods are combined and evaluated for a schematic supply-driven delta (static base level) and an accommodation-driven delta (variable base level). The performance of the combined acceleration methods is evaluated by comparing the morphological indicators such as distributary channel networking and delta volumes derived from the model predictions for various levels of acceleration. The results of the accelerated models are compared to the outcomes from a series of simulations to capture autogenic variability. Autogenic variability is quantified by re-running identical models on an initial bathymetry with 1 cm added noise. The overall results show that the variability of the accelerated models fall within the autogenic variability range, suggesting that the application of acceleration methods does not significantly affect the simulated delta evolution. The Time-scale compression method (the acceleration method introduced in this paper) results in an increased computational efficiency of 75% without adversely affecting the simulated delta evolution compared to a base case. The combination of the Time-scale compression method with the existing acceleration methods has the potential to extend the application range of process-based models towards geologic timescales.

  4. Estimation and modeling of electrofishing capture efficiency for fishes in wadeable warmwater streams

    USGS Publications Warehouse

    Price, A.; Peterson, James T.

    2010-01-01

    Stream fish managers often use fish sample data to inform management decisions affecting fish populations. Fish sample data, however, can be biased by the same factors affecting fish populations. To minimize the effect of sample biases on decision making, biologists need information on the effectiveness of fish sampling methods. We evaluated single-pass backpack electrofishing and seining combined with electrofishing by following a dual-gear, mark–recapture approach in 61 blocknetted sample units within first- to third-order streams. We also estimated fish movement out of unblocked units during sampling. Capture efficiency and fish abundances were modeled for 50 fish species by use of conditional multinomial capture–recapture models. The best-approximating models indicated that capture efficiencies were generally low and differed among species groups based on family or genus. Efficiencies of single-pass electrofishing and seining combined with electrofishing were greatest for Catostomidae and lowest for Ictaluridae. Fish body length and stream habitat characteristics (mean cross-sectional area, wood density, mean current velocity, and turbidity) also were related to capture efficiency of both methods, but the effects differed among species groups. We estimated that, on average, 23% of fish left the unblocked sample units, but net movement varied among species. Our results suggest that (1) common warmwater stream fish sampling methods have low capture efficiency and (2) failure to adjust for incomplete capture may bias estimates of fish abundance. We suggest that managers minimize bias from incomplete capture by adjusting data for site- and species-specific capture efficiency and by choosing sampling gear that provide estimates with minimal bias and variance. Furthermore, if block nets are not used, we recommend that managers adjust the data based on unconditional capture efficiency.

  5. Urinary corticosterone responses to capture and toe-clipping in the cane toad (Rhinella marina) indicate that toe-clipping is a stressor for amphibians.

    PubMed

    Narayan, Edward J; Molinia, Frank C; Kindermann, Christina; Cockrem, John F; Hero, Jean-Marc

    2011-11-01

    Toe-clipping, the removal of one or more toes, is a common method used to individually mark free-living animals. Whilst this method is widely used in studies of amphibians, the appropriateness of the method and its potential detrimental effects have been the subject of debate. Here, we provide for the first time evidence that toe-clipping is a stressor in a wild amphibian. We measured urinary corticosterone responses of male cane toads (Rhinella marina) to capture and handling only, and to toe-clipping under field conditions. Urinary testosterone concentrations and white blood cell proportions were also measured. Urinary corticosterone metabolite concentrations increased 6 h after capture and handling only and remained high for 24 h; corticosterone returned to baseline levels after 48 h and remained low at 72 h post capture and handling. Corticosterone concentrations in toads subjected to toe-clipping increased at 6 h to significantly higher concentrations than after capture and handling only, then decreased more slowly than after capture and handling, and were still elevated (approximately double the basal level) 72 h after toe-clipping. Testosterone did not change significantly after capture and handling only, whereas after toe-clipping testosterone decreased at 6 h and remained low at 72 h. There were weak short-term effects of toe-clipping compared with capture and handling only on white blood cell proportions. We have clearly shown that toe-clipping is a distinctly stronger stressor than capture and handling alone. This indicates that there is an ethical cost of toe-clipping, and this should be considered when planning studies of amphibians. Copyright © 2011 Elsevier Inc. All rights reserved.

  6. Interviewing patients and practitioners working together in teams. A multi-layered puzzle: putting the pieces together.

    PubMed

    Ringstad, Oystein

    2010-08-01

    This paper presents and evaluates a methodological approach aimed at analysing some of the complex interactions between patients and different health care practitioners working together in teams. Qualitative health care research describes the values, perceptions and conceptions of patients and practitioners. In modern clinical work, patients and professional practitioners often work together on complex cases involving different kinds of knowledge and values, each of them representing different perspectives. We need studies designed to capture this complexity. The methodological approach presented here is exemplified with a study in rehabilitation medicine. In this part of the health care system the clinical work is organized in multi-professional clinical teams, including patients, handling complex rehabilitation processes. In the presented approach, data are collected in individual in-depth interviews to obtain thorough descriptions of each individual perspective. The interaction in the teams is analysed by comparing different descriptions of the same situations from the involved individuals. We may then discuss how these perceptions relate to each other and how the individuals in the team interact. Two examples from an empirical study are presented and discussed, illustrating how communication, differences in evaluations and the interpretation of incidents, arguments, emotions and interpersonal relations may be discussed. It is argued that this approach may give information that can supplement the methods commonly applied in qualitative health care research today.

  7. Non-integer viscoelastic constitutive law to model soft biological tissues to in-vivo indentation.

    PubMed

    Demirci, Nagehan; Tönük, Ergin

    2014-01-01

    During the last decades, derivatives and integrals of non-integer order have been used increasingly for the description of the constitutive behavior of various viscoelastic materials, including soft biological tissues. Compared to integer-order constitutive relations, non-integer order viscoelastic material models of soft biological tissues are capable of capturing a wider range of viscoelastic behavior obtained from experiments. Although integer-order models may yield comparably accurate results, non-integer order material models have fewer parameters to identify and describe an intermediate material that can be adjusted monotonically and continuously between an ideal elastic solid and an ideal viscous fluid. In this work, starting with some preliminaries on non-integer (fractional) calculus, the "spring-pot" (an intermediate mechanical element between a solid and a fluid) and the non-integer order three-element (Zener) solid model are introduced; finally, a user-defined large-strain non-integer order viscoelastic constitutive model was constructed for use in finite element simulations. Using the constitutive equation developed, soft tissue material identification was performed by means of the inverse finite element method and in vivo indentation experiments. The results indicate that material coefficients obtained from relaxation experiments, when optimized with creep experimental data, could simulate relaxation, creep, and cyclic loading and unloading experiments accurately. Non-integer calculus viscoelastic constitutive models, which have a physical interpretation and model experimental data accurately, are a good alternative to classical phenomenological viscoelastic constitutive equations.
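    For reference, a generic textbook form of the elements mentioned above (not necessarily the exact parameterization identified in the paper): the spring-pot stress-strain law, and the fractional Zener solid obtained by replacing the dashpot of the standard linear solid with a spring-pot.

```latex
% Spring-pot element: interpolates between a spring (\alpha = 0) and a
% dashpot (\alpha = 1); D^{\alpha} denotes a fractional time derivative.
\sigma(t) = E\,\tau^{\alpha} D^{\alpha}\varepsilon(t), \qquad 0 \le \alpha \le 1

% Fractional (non-integer order) Zener solid: spring E_0 in parallel with
% a Maxwell arm of spring E_1 and a spring-pot, relaxation time \tau.
\sigma(t) + \tau^{\alpha} D^{\alpha}\sigma(t)
  = E_0\,\varepsilon(t) + (E_0 + E_1)\,\tau^{\alpha} D^{\alpha}\varepsilon(t)
```

    Setting \alpha = 1 recovers the classical integer-order Zener model, which is the sense in which the fractional order monotonically tunes the material between solid-like and fluid-like behavior.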

  8. Initial proposition of kinematics model for selected karate actions analysis

    NASA Astrophysics Data System (ADS)

    Hachaj, Tomasz; Koptyra, Katarzyna; Ogiela, Marek R.

    2017-03-01

    The motivation for this paper is to propose and initially evaluate two new kinematics models developed to describe motion capture (MoCap) data of karate techniques. We developed this novel proposition to create a model capable of handling action descriptions both from multimedia and from professional MoCap hardware. For the evaluation we used 25-joint recordings of karate techniques acquired with Kinect version 2, consisting of MoCap recordings of two professional sport (black belt) instructors and masters of Oyama Karate. We selected the following actions for initial analysis: the left-handed furi-uchi punch, the right-leg hiza-geri kick, the right-leg yoko-geri kick and the left-handed jodan-uke block. Based on our evaluation, we conclude that both proposed kinematics models seem to be convenient methods for describing karate actions. Of the two proposed variable models, the global one seems more useful for further work: in the case of the considered punches its variables seem less correlated, and they may also be easier to interpret because of the single reference coordinate system. Principal component analysis also proved to be a reliable way to examine the quality of the kinematics models, and plots of the variables in principal-component space nicely present the dependences between variables.

  9. Transport-reaction model for defect and carrier behavior within displacement cascades in gallium arsenide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wampler, William R.; Myers, Samuel M.

    2014-02-01

    A model is presented for recombination of charge carriers at displacement damage in gallium arsenide, which includes clustering of the defects in atomic displacement cascades produced by neutron or ion irradiation. The carrier recombination model is based on an atomistic description of capture and emission of carriers by the defects with time evolution resulting from the migration and reaction of the defects. The physics and equations on which the model is based are presented, along with details of the numerical methods used for their solution. The model uses a continuum description of diffusion, field-drift and reaction of carriers and defects within a representative spherically symmetric cluster. The initial radial defect profiles within the cluster were chosen through pair-correlation-function analysis of the spatial distribution of defects obtained from the binary-collision code MARLOWE, using recoil energies for fission neutrons. Charging of the defects can produce high electric fields within the cluster which may influence transport and reaction of carriers and defects, and which may enhance carrier recombination through band-to-trap tunneling. Properties of the defects are discussed and values for their parameters are given, many of which were obtained from density functional theory. The model provides a basis for predicting the transient response of III-V heterojunction bipolar transistors to pulsed neutron irradiation.

  10. Judgment and decision making.

    PubMed

    Fischhoff, Baruch

    2010-09-01

    The study of judgment and decision making entails three interrelated forms of research: (1) normative analysis, identifying the best courses of action, given decision makers' values; (2) descriptive studies, examining actual behavior in terms comparable to the normative analyses; and (3) prescriptive interventions, helping individuals to make better choices, bridging the gap between the normative ideal and the descriptive reality. The research is grounded in analytical foundations shared by economics, psychology, philosophy, and management science. Those foundations provide a framework for accommodating affective and social factors that shape and complement the cognitive processes of decision making. The decision sciences have grown through applications requiring collaboration with subject matter experts, familiar with the substance of the choices and the opportunities for interventions. Over the past half century, the field has shifted its emphasis from predicting choices, which can be successful without theoretical insight, to understanding the processes shaping them. Those processes are often revealed through biases that suggest non-normative processes. The practical importance of these biases depends on the sensitivity of specific decisions and the support that individuals have in making them. As a result, the field offers no simple summary of individuals' competence as decision makers, but a suite of theories and methods suited to capturing these sensitivities. Copyright © 2010 John Wiley & Sons, Ltd. For further resources related to this article, please visit the WIREs website. Copyright © 2010 John Wiley & Sons, Ltd.

  11. Simplifying the Reuse and Interoperability of Geoscience Data Sets and Models with Semantic Metadata that is Human-Readable and Machine-actionable

    NASA Astrophysics Data System (ADS)

    Peckham, S. D.

    2017-12-01

    Standardized, deep descriptions of digital resources (e.g. data sets, computational models, software tools and publications) make it possible to develop user-friendly software systems that assist scientists with the discovery and appropriate use of these resources. Semantic metadata makes it possible for machines to take actions on behalf of humans, such as automatically identifying the resources needed to solve a given problem, retrieving them and then automatically connecting them (despite their heterogeneity) into a functioning workflow. Standardized model metadata also helps model users to understand the important details that underpin computational models and to compare the capabilities of different models. These details include simplifying assumptions on the physics, governing equations and the numerical methods used to solve them, discretization of space (the grid) and time (the time-stepping scheme), state variables (input or output), and model configuration parameters. This kind of metadata provides a "deep description" of a computational model that goes well beyond other types of metadata (e.g. author, purpose, scientific domain, programming language, digital rights, provenance, execution) and captures the science that underpins a model. A carefully constructed, unambiguous, rules-based schema to address this problem, called the Geoscience Standard Names ontology, will be presented; it utilizes Semantic Web best practices and technologies. It has also been designed to work across science domains and to be readable by both humans and machines.

  12. Toward a metric for patterned injury analysis

    NASA Astrophysics Data System (ADS)

    Oliver, William R.; Fritsch, Daniel S.

    1997-02-01

    An intriguing question in the matching of objects with patterned injuries in two and three dimensions is that of an appropriate metric for closeness -- is it possible to objectively measure how well an object 'fits' a patterned injury? Many investigators have suggested an energy-based metric, and have used such metrics to analyze craniofacial growth and anatomic variation. A strict dependence on homology is the primary disadvantage of this energy functional for generalized biological structures; many shapes do not have obvious landmarks. Some tentative solutions to the problem of landmark dependency for patterned injury analysis are presented. One intriguing approach comes from recent work in axiomatic vision. This approach has resulted in the development of a multiresolution medial axis for the extraction of shape primitives which can be used as the basis for registration. A scale-based description of this process can be captured in structures called cores, which can describe object shape and position in a highly compact manner. Cores may provide a scale- and shape-based method of determining correspondences necessary for determining the number and position of landmarks for some patterned injuries. Each of the approaches described is generalizable to higher dimensions, and can thus be used to analyze both two- and three-dimensional data. Together, they may represent a reasonable way of measuring shape distance for the purpose of matching objects and wounds, and can be combined with texture measures for a complete description.

  13. Skill components of task analysis

    PubMed Central

    Rogers, Wendy A.; Fisk, Arthur D.

    2017-01-01

    Some task analysis methods break down a task into a hierarchy of subgoals. Although an important tool of many fields of study, learning to create such a hierarchy (redescription) is not trivial. To further the understanding of what makes task analysis a skill, the present research examined novices’ problems with learning Hierarchical Task Analysis and captured practitioners’ performance. All participants received a task description and analyzed three cooking and three communication tasks by drawing on their knowledge of those tasks. Thirty-six younger adults (18–28 years) in Study 1 analyzed one task before training and five afterwards. Training consisted of a general handout that all participants received and an additional handout that differed between three conditions: a list of steps, a flow diagram, and a concept map. In Study 2, eight experienced task analysts received the same task descriptions as in Study 1 and demonstrated their understanding of task analysis while thinking aloud. Novices’ initial task analyses scored low on all coding criteria. Performance improved on some criteria but was well below 100% on others. Practitioners’ task analyses were 2–3 levels deep but also scored low on some criteria. A task analyst’s purpose of analysis may be the reason for higher specificity of analysis. This research furthers the understanding of Hierarchical Task Analysis and provides insights into the varying nature of task analyses as a function of experience. The derived skill components can inform training objectives. PMID:29075044

  14. Spatial Pyramid Covariance based Compact Video Code for Robust Face Retrieval in TV-series.

    PubMed

    Li, Yan; Wang, Ruiping; Cui, Zhen; Shan, Shiguang; Chen, Xilin

    2016-10-10

    We address the problem of face video retrieval in TV-series, which searches video clips based on the presence of a specific character, given one face track of that character. This is tremendously challenging because, on one hand, faces in TV-series are captured in largely uncontrolled conditions with complex appearance variations, and on the other hand the retrieval task typically needs an efficient representation with low time and space complexity. To handle this problem, we propose a compact and discriminative representation for the huge body of video data, named Compact Video Code (CVC). Our method first models the face track by its sample (i.e., frame) covariance matrix to capture the video data variations in a statistical manner. To incorporate discriminative information and obtain a more compact video signature suitable for retrieval, the high-dimensional covariance representation is further encoded as a much lower-dimensional binary vector, which finally yields the proposed CVC. Specifically, each bit of the code, i.e., each dimension of the binary vector, is produced via supervised learning in a max-margin framework, which aims to strike a balance between the discriminability and stability of the code. Besides, we further extend the descriptive granularity of the covariance matrix from the traditional pixel level to the more general patch level, and proceed to propose a novel hierarchical video representation named Spatial Pyramid Covariance (SPC) along with a fast calculation method. Face retrieval experiments on two challenging TV-series video databases, i.e., the Big Bang Theory and Prison Break, demonstrate the competitiveness of the proposed CVC over state-of-the-art retrieval methods. In addition, as a general video matching algorithm, CVC is also evaluated in the traditional video face recognition task on a standard Internet database, i.e., YouTube Celebrities, showing quite promising performance using an extremely compact code with only 128 bits.
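
    A rough sketch of the pipeline the abstract describes, under simplifying assumptions: a face track is summarized by its frame covariance, flattened via the matrix logarithm so Euclidean hashing applies, then binarized by hyperplanes. In the paper the hyperplanes are learned in a max-margin framework; here `W` and `b` are random placeholders, and the 16-D frame features are synthetic.

```python
import numpy as np

def face_track_covariance(frames):
    # frames: (n_frames, d) feature vectors for one face track
    X = frames - frames.mean(axis=0)
    return (X.T @ X) / (len(frames) - 1)

def log_euclidean_vec(C, eps=1e-6):
    # map the SPD covariance to a flat vector via the matrix logarithm
    vals, vecs = np.linalg.eigh(C + eps * np.eye(len(C)))
    L = vecs @ np.diag(np.log(vals)) @ vecs.T
    iu = np.triu_indices(len(C))
    return L[iu]

def binary_code(x, W, b):
    # each hyperplane (row of W) contributes one bit of the video signature
    return (x @ W.T + b > 0).astype(np.uint8)

rng = np.random.default_rng(1)
frames = rng.normal(size=(40, 16))            # one face track, 16-D features
v = log_euclidean_vec(face_track_covariance(frames))
W = rng.normal(size=(128, v.size))            # placeholder for learned hyperplanes
b = rng.normal(size=128)
code = binary_code(v, W, b)                   # 128-bit signature, as in CVC
```

    Retrieval then reduces to Hamming distance between such codes, which is what makes the representation cheap in both time and space.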

  15. Nest visits and capture events affect breeding success of Yellow-billed and Pacific loons

    USGS Publications Warehouse

    Uher-Koch, Brian D.; Schmutz, Joel A.; Wright, Kenneth G.

    2015-01-01

    Accurate estimates of breeding success are essential for understanding population dynamics and for managing populations. Unfortunately, research activities to collect these data can negatively impact the breeding success of the study species and bias estimates of breeding success. Despite the potential for negative impacts, few studies have documented the effect of capturing incubating adults on nest survival or compared nest survival following different capture methods. In this study we evaluate the impacts of investigator disturbance associated with captures and nest visits on nest survival of Yellow-billed Loons (Gavia adamsii) and Pacific Loons (Gavia pacifica) in the National Petroleum Reserve-Alaska (NPR-A), an area of conservation concern, in 2011–2013. In an effort to reduce capture-related nest failures, we developed a new suspended dive net technique to catch territorial aquatic birds while off their nests. We then compared nest survival following suspended dive net captures to bow-net trap captures of breeding adult loons. Daily nest survival following bow-net trap or suspended dive net capture was about 30% lower than when adults were not captured. The effect of captures on nest survival was similar between bow-net trap and suspended dive net capture methods. Nest visits without captures also negatively impacted nest survival, although less than captures. If not accounted for, nest visitation biased daily survival rates of nests downward 6%. Effects of investigator disturbance did not differ by species or between years. Our results suggest that any source of disturbance that displaces incubating adult loons could potentially reduce nest survival. To maximize breeding success, human disturbance factors should be limited near loon nests.
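
    The kind of effect reported above can be illustrated with the classical Mayfield estimator of daily nest survival. This is a generic sketch, not the authors' analysis; the DSR values and the 27-day incubation length are hypothetical round numbers.

```python
def mayfield_dsr(failures, exposure_days):
    # Mayfield estimator of daily survival rate (DSR) from nest-check data:
    # one minus failures per nest-day of observed exposure
    return 1.0 - failures / exposure_days

def nest_success(dsr, period_days):
    # probability a nest survives the entire incubation period
    return dsr ** period_days

# hypothetical illustration: a 30% drop in DSR after a capture event
# compounds dramatically over a ~27-day incubation period
baseline = nest_success(0.99, 27)
captured = nest_success(0.99 * 0.70, 27)
```

    Because overall success is DSR raised to the length of the nesting period, even a modest downward bias in DSR (the 6% visitation effect reported above) translates into a substantial bias in estimated breeding success.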

  16. A Descriptive and Interpretative Information System for the IODP

    NASA Astrophysics Data System (ADS)

    Blum, P.; Foster, P. A.; Mateo, Z.

    2006-12-01

    The ODP/IODP has a long and rich history of collecting descriptive and interpretative information (DESCINFO) from rock and sediment cores from the world's oceans. Unlike instrumental data, DESCINFO generated by subject experts is biased by the scientific and cultural background of the observers and their choices of classification schemes. As a result, global searches of DESCINFO and its integration with other data are problematic. To address this issue, the IODP-USIO is in the process of designing and implementing a DESCINFO system for IODP Phase 2 (2007-2013) that meets the user expectations expressed over the past decade. The requirements include support of (1) detailed, material property-based descriptions as well as classification-based descriptions; (2) global searches by physical sample and digital data sources as well as any of the descriptive parameters; (3) user-friendly data capture tools for a variety of workflows; (4) extensive visualization of DESCINFO data along with instrumental data and images; and (5) portability/interoperability such that the system can work with database schemas of other organizations - a specific challenge given the schema and semantic heterogeneity not only among the three IODP operators but within the geosciences in general. The DESCINFO approach is based on the definition of a set of generic observable parameters that are populated with numeric or text values. Text values are derived from controlled, extensible hierarchical value lists that allow descriptions at the appropriate level of detail and ensure successful data searches. Material descriptions can be completed independently of domain-specific classifications, genetic concepts, and interpretative frameworks.

  17. A scalable, fully automated process for construction of sequence-ready human exome targeted capture libraries

    PubMed Central

    2011-01-01

    Genome targeting methods enable cost-effective capture of specific subsets of the genome for sequencing. We present here an automated, highly scalable method for carrying out the Solution Hybrid Selection capture approach that provides a dramatic increase in scale and throughput of sequence-ready libraries produced. Significant process improvements and a series of in-process quality control checkpoints are also added. These process improvements can also be used in a manual version of the protocol. PMID:21205303

  18. A Method to Capture Macroslip at Bolted Interfaces

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hopkins, Ronald Neil; Heitman, Lili Anne Akin

    2015-10-01

    Relative motion at bolted connections can occur for large shock loads as the internal shear force in the bolted connection overcomes the frictional resistive force. This macroslip in a structure dissipates energy and reduces the response of the components above the bolted connection. There is a need to be able to capture macroslip behavior in a structural dynamics model. A linear model and many nonlinear models are not able to predict macroslip effectively. The proposed method to capture macroslip is to use the multi-body dynamics code ADAMS to model joints with 3-D contact at the bolted interfaces. This model includes both static and dynamic friction. The joints are preloaded and the pinning effect when a bolt shank impacts a through hole inside diameter is captured. Substructure representations of the components are included to account for component flexibility and dynamics. This method was applied to a simplified model of an aerospace structure and validation experiments were performed to test the adequacy of the method.
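
    The stick/slip behavior described above is often idealized, outside of full multi-body contact models like the ADAMS one used here, by a Jenkins element (a spring in series with a Coulomb slider). The following is a minimal sketch of that idealization with hypothetical stiffness and slip-force values, not the authors' ADAMS model.

```python
import math

def jenkins_step(u, u_slider, k, f_slip):
    # one quasi-static update of a Jenkins element: the joint responds
    # elastically until the interface shear reaches the frictional capacity
    # f_slip, after which it slips (macroslip) at constant force
    f = k * (u - u_slider)
    if abs(f) > f_slip:
        u_slider = u - math.copysign(f_slip / k, f)  # slider absorbs the excess
        f = math.copysign(f_slip, f)
    return f, u_slider

k, f_slip = 1e6, 500.0                        # hypothetical N/m and N
f1, s1 = jenkins_step(1e-4, 0.0, k, f_slip)   # demand ~100 N: sticks
f2, s2 = jenkins_step(1e-3, 0.0, k, f_slip)   # demand ~1000 N: slips at 500 N
```

    The hysteresis traced by cycling such an element is what dissipates energy at the joint, which is the effect the abstract says reduces the response of components above the connection.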

  19. A Method to Capture Macroslip at Bolted Interfaces [PowerPoint]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hopkins, Ronald Neil; Heitman, Lili Anne Akin

    2016-01-01

    Relative motion at bolted connections can occur for large shock loads as the internal shear force in the bolted connection overcomes the frictional resistive force. This macroslip in a structure dissipates energy and reduces the response of the components above the bolted connection. There is a need to be able to capture macroslip behavior in a structural dynamics model. A linear model and many nonlinear models are not able to predict macroslip effectively. The proposed method to capture macroslip is to use the multi-body dynamics code ADAMS to model joints with 3-D contact at the bolted interfaces. This model includes both static and dynamic friction. The joints are preloaded and the pinning effect when a bolt shank impacts a through hole inside diameter is captured. Substructure representations of the components are included to account for component flexibility and dynamics. This method was applied to a simplified model of an aerospace structure and validation experiments were performed to test the adequacy of the method.

  20. Colorimetric characterization of digital cameras with unrestricted capture settings applicable for different illumination circumstances

    NASA Astrophysics Data System (ADS)

    Fang, Jingyu; Xu, Haisong; Wang, Zhehong; Wu, Xiaomin

    2016-05-01

    With colorimetric characterization, digital cameras can be used as image-based tristimulus colorimeters for color communication. To overcome the restriction of fixed capture settings adopted in conventional colorimetric characterization procedures, a novel method was proposed that takes the capture settings into account. The method for calculating the colorimetric values of a measured image comprises five main steps. These include converting the RGB values to their equivalents under the training settings, using factors derived from an imaging-system model to bridge the different settings, and applying scaling factors in the preparation steps of the transformation mapping to avoid errors resulting from the nonlinearity of the polynomial mapping across different ranges of illumination levels. The experimental results indicate that the prediction error of the proposed method, measured by the CIELAB color-difference formula, is less than 2 CIELAB units under different illumination levels and different correlated color temperatures. This prediction accuracy across capture settings remains at the same level as that of the conventional method under a fixed lighting condition.
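
    The polynomial-mapping step at the core of such characterizations can be sketched as follows. This is a generic illustration, not the paper's method: the second-order basis is one common choice, the `scale` argument merely stands in for the setting-equivalence factors described above, and the synthetic "camera" is a plain linear transform.

```python
import numpy as np

def poly_expand(rgb):
    # second-order polynomial expansion of camera RGB (one common basis choice)
    r, g, b = rgb.T
    return np.stack([np.ones_like(r), r, g, b,
                     r * g, r * b, g * b, r * r, g * g, b * b], axis=1)

def fit_mapping(rgb_train, xyz_train):
    # least-squares polynomial mapping from training RGB to tristimulus XYZ
    M, *_ = np.linalg.lstsq(poly_expand(rgb_train), xyz_train, rcond=None)
    return M

def predict_xyz(rgb, M, scale=1.0):
    # 'scale' stands in for the factor that converts RGB captured under
    # arbitrary settings into the range of the training settings
    return poly_expand(rgb * scale) @ M

rng = np.random.default_rng(2)
rgb = rng.uniform(0.05, 0.95, size=(60, 3))
T = np.array([[0.41, 0.36, 0.18], [0.21, 0.72, 0.07], [0.02, 0.12, 0.95]])
xyz = rgb @ T.T                    # synthetic linear camera for the demo
M = fit_mapping(rgb, xyz)
pred = predict_xyz(rgb, M)
```

    Scaling the RGB before expansion, rather than scaling the polynomial output, is what keeps the nonlinear terms consistent across illumination levels.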

  1. Development of Holmium-163 electron-capture spectroscopy with transition-edge sensors

    DOE PAGES

    Croce, Mark Philip; Rabin, Michael W.; Mocko, Veronika; ...

    2016-08-01

    Calorimetric decay energy spectroscopy of electron-capture-decaying isotopes is a promising method to achieve the sensitivity required for electron neutrino mass measurement. The very low total nuclear decay energy (Q EC < 3 keV) and short half-life (4570 years) of 163Ho make it attractive for high-precision electron-capture spectroscopy (ECS) near the kinematic endpoint, where the neutrino momentum goes to zero. In the ECS approach, an electron-capture-decaying isotope is embedded inside a microcalorimeter designed to capture and measure the energy of all the decay radiation except that of the escaping neutrino. We have developed a complete process for proton irradiation-based isotope production, isolation, and purification of 163Ho. We have developed transition-edge sensors for this measurement and methods for incorporating 163Ho into high-resolution microcalorimeters, and have measured the electron-capture spectrum of 163Ho. Finally, we present our work in these areas and discuss the measured spectrum and its comparison to current theory.

  2. Development of Holmium-163 electron-capture spectroscopy with transition-edge sensors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Croce, Mark Philip; Rabin, Michael W.; Mocko, Veronika

    Calorimetric decay energy spectroscopy of electron-capture-decaying isotopes is a promising method to achieve the sensitivity required for electron neutrino mass measurement. The very low total nuclear decay energy (Q EC < 3 keV) and short half-life (4570 years) of 163Ho make it attractive for high-precision electron-capture spectroscopy (ECS) near the kinematic endpoint, where the neutrino momentum goes to zero. In the ECS approach, an electron-capture-decaying isotope is embedded inside a microcalorimeter designed to capture and measure the energy of all the decay radiation except that of the escaping neutrino. We have developed a complete process for proton irradiation-based isotope production, isolation, and purification of 163Ho. We have developed transition-edge sensors for this measurement and methods for incorporating 163Ho into high-resolution microcalorimeters, and have measured the electron-capture spectrum of 163Ho. Finally, we present our work in these areas and discuss the measured spectrum and its comparison to current theory.

  3. Back-pack unit for capturing waterfowl and upland game by night-lighting

    USGS Publications Warehouse

    Drewien, R.C.; Reeves, H.M.; Springer, P.F.; Kuck, T.L.

    1967-01-01

    A night-lighting unit, designed as a light weight back-pack, proved successful for capturing waterfowl pairs, pheasants (Phasianus colchicus), and cottontail rabbits (Sylvilagus floridanus) during the spring and summer when most breeding populations are widely dispersed. Eighty ducks of seven species were captured in 48 hours (1.7 ducks per hour) of night-lighting in marsh habitat. Similarly, 30 pheasants were trapped in 25 hours (1.2 birds per hour) and 63 cottontail rabbits were either observed at close range (6-12 ft) or captured during night-lighting operations in upland habitat. Catch per hour of effort increased for all species as their night habitat requirements and reaction to night-lights became known. The mobile unit proved well suited for intensive use on small areas where other methods of capture were unfeasible and where representative coverage of various habitat types was desired. Besides its utility for capturing animals, the unit provided a method for studying nocturnal movements, behavior, and habitat use of marked animals.

  4. Monitoring Species of Concern Using Noninvasive Genetic Sampling and Capture-Recapture Methods

    DTIC Science & Technology

    2016-11-01

    ABBREVIATIONS AICc Akaike’s Information Criterion with small sample size correction AZGFD Arizona Game and Fish Department BMGR Barry M. Goldwater...MNKA Minimum Number Known Alive N Abundance Ne Effective Population Size NGS Noninvasive Genetic Sampling NGS-CR Noninvasive Genetic...parameter estimates from capture-recapture models require sufficient sample sizes, capture probabilities and low capture biases. For NGS-CR, sample
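
    A minimal example of the abundance (N) estimation underlying two-session capture-recapture, shown here with Chapman's bias-corrected form of the Lincoln-Petersen estimator. The sample counts are hypothetical; the report itself uses more elaborate capture-recapture models.

```python
def chapman_estimate(n1, n2, m2):
    # Chapman's bias-corrected Lincoln-Petersen abundance estimator:
    # n1 individuals identified (e.g., genotyped from noninvasive samples)
    # in session 1, n2 in session 2, m2 detected in both sessions
    return (n1 + 1) * (n2 + 1) / (m2 + 1) - 1

# hypothetical counts: 50 genotypes, then 40, with 20 shared
N_hat = chapman_estimate(50, 40, 20)
```

    The estimator's sensitivity to m2 shows why sufficient sample sizes and capture probabilities matter: with few recaptures, small changes in m2 swing the abundance estimate widely.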

  5. 14 CFR 161.9 - Designation of noise description methods.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 3 2011-01-01 2011-01-01 false Designation of noise description methods... TRANSPORTATION (CONTINUED) AIRPORTS NOTICE AND APPROVAL OF AIRPORT NOISE AND ACCESS RESTRICTIONS General Provisions § 161.9 Designation of noise description methods. For purposes of this part, the following...

  6. 14 CFR 161.9 - Designation of noise description methods.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 3 2010-01-01 2010-01-01 false Designation of noise description methods... TRANSPORTATION (CONTINUED) AIRPORTS NOTICE AND APPROVAL OF AIRPORT NOISE AND ACCESS RESTRICTIONS General Provisions § 161.9 Designation of noise description methods. For purposes of this part, the following...

  7. Wireless on-demand and networking of Puritan Bennett 840 ventilators for direct data capture.

    PubMed

    Howard, William R

    2007-11-01

    Manual transcription inaccuracies have been reported to be a frequent occurrence in clinical practice, which has been confirmed in my institution. I explored alternative methods of obtaining data directly from the Puritan Bennett ventilator. The aim of this project was to record all of the ventilator settings and monitored data with wireless technology. I evaluated 2 data-capture methods: on-demand data capture into a personal digital assistant, and continuous ventilator networking with a stand-alone computer. I was successful in both the intensive care unit and laboratory environment in transferring ventilator data, for up to several days, and with a data-capture interval as short as 60 seconds.

  8. Neutron Resonance Spin Determination Using Multi-Segmented Detector DANCE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baramsai, B.; Mitchell, G. E.; Chyzh, A.

    2011-06-01

    A sensitive method to determine the spin of neutron resonances is introduced based on the statistical pattern recognition technique. The new method was used to assign the spins of s-wave resonances in 155Gd. The experimental neutron capture data for these nuclei were measured with the DANCE (Detector for Advanced Neutron Capture Experiment) calorimeter at the Los Alamos Neutron Science Center. The highly segmented calorimeter provided detailed multiplicity distributions of the capture γ-rays. Using this information, the spins of the neutron capture resonances were determined. With these new spin assignments, level spacings are determined separately for s-wave resonances with Jπ = 1− and 2−.
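
    A toy version of the pattern-recognition idea, not the DANCE analysis itself: a resonance's γ-ray multiplicity distribution is compared against per-spin template distributions and assigned the nearest one. The templates and counts below are invented for illustration.

```python
import numpy as np

def assign_spin(mult_counts, templates):
    # normalize the observed multiplicity histogram, then return the spin
    # whose template distribution is closest in a chi-square-like distance
    p = mult_counts / mult_counts.sum()
    distances = {}
    for spin, tmpl in templates.items():
        q = tmpl / tmpl.sum()
        distances[spin] = float(((p - q) ** 2 / (q + 1e-12)).sum())
    return min(distances, key=distances.get)

# hypothetical multiplicity templates for J = 1 and J = 2 s-wave resonances
templates = {1: np.array([0.10, 0.40, 0.30, 0.20]),
             2: np.array([0.05, 0.20, 0.40, 0.35])}
spin = assign_spin(np.array([12.0, 41.0, 29.0, 18.0]), templates)
```

    In practice the templates come from resonances of known spin, and the segmentation of the calorimeter is what makes the multiplicity distributions distinct enough to separate.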

  9. GN/C translation and rotation control parameters for AR/C (category 2)

    NASA Technical Reports Server (NTRS)

    Henderson, David M.

    1991-01-01

    Detailed analysis of the Automatic Rendezvous and Capture problem indicates a need for three different regions of mathematical description for the GN&C algorithms: (1) multi-vehicle orbital mechanics to the rendezvous interface point, i.e., within 100 n.; (2) relative motion solutions (such as Clohessy-Wiltshire type) from the far-field to the near-field interface, i.e., within 1 nm; and (3) close proximity motion, the near-field motion where the relative differences in the gravitational and orbit inertial accelerations can be neglected from the equations of motion. This paper defines the reference coordinate frames and control parameters necessary to model the relative motion and attitude of spacecraft in the close proximity of another space system (Regions 2 and 3) during the Automatic Rendezvous and Capture phase of an orbit operation.
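
    The Clohessy-Wiltshire relative-motion solution mentioned for Region 2 has a well-known closed form, sketched below (x radial, y along-track, z cross-track, n the target's orbit rate). The numeric orbit rate is just a rough ISS-like placeholder.

```python
import math

def cw_position(t, n, x0, y0, z0, vx0, vy0, vz0):
    # closed-form Clohessy-Wiltshire relative position about a circular orbit,
    # valid for small separations (the near-field regime described above)
    s, c = math.sin(n * t), math.cos(n * t)
    x = (4 - 3 * c) * x0 + (s / n) * vx0 + (2 / n) * (1 - c) * vy0
    y = 6 * (s - n * t) * x0 + y0 + (2 / n) * (c - 1) * vx0 \
        + ((4 * s - 3 * n * t) / n) * vy0
    z = c * z0 + (s / n) * vz0
    return x, y, z

n = 0.00113  # rad/s, roughly an ISS-altitude orbit rate (placeholder)
# a pure along-track offset with zero relative velocity is an equilibrium:
pos = cw_position(600.0, n, 0.0, 100.0, 0.0, 0.0, 0.0, 0.0)  # -> (0.0, 100.0, 0.0)
```

    The secular term 6(sin nt − nt)x0 in the along-track component is why an uncorrected radial offset causes the chaser to drift, a central concern in the capture phase.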

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Samedov, V. V., E-mail: v-samedov@yandex.ru

    Fluctuations of charge induced by charge carriers on the detector electrodes make a significant contribution to the energy resolution of ionization detectors, namely, semiconductor detectors and gas and liquid ionization chambers. These fluctuations are determined by the capture of charge carriers, as they drift in the bulk of the detector under the action of an electric field, by traps. In this study, we give a correct mathematical description of charge induction on electrodes of an ionization detector for an arbitrary electric field distribution in the detector with consideration of charge carrier capture by traps. The characteristic function obtained in this study yields the general expression for the distribution function of the charge induced on the detector electrodes. The formulas obtained in this study are useful for analysis of the influence of charge carrier transport on energy resolution of ionization detectors.
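
    For the special case of a uniform field, the mean induced charge with trapping reduces to the classical Hecht relation, a useful point of comparison for the general treatment described above. This sketch is that textbook special case, not the paper's arbitrary-field result.

```python
import math

def hecht_cce(drift_length, trapping_length):
    # Hecht relation for a uniform field: the fraction of the full charge
    # induced on the electrodes when a carrier must drift 'drift_length'
    # but is trapped with mean free path 'trapping_length'
    r = trapping_length / drift_length
    return r * (1.0 - math.exp(-1.0 / r))
```

    As the trapping length grows relative to the drift length the efficiency approaches 1, and the induced-charge fluctuations that limit energy resolution vanish with it.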

  11. Methods of analysis by the U.S. Geological Survey National Water Quality Laboratory; determination of chlorinated pesticides in aquatic tissue by capillary-column gas chromatography with electron-capture detection

    USGS Publications Warehouse

    Leiker, Thomas J.; Madsen, J.E.; Deacon, J.R.; Foreman, W.T.

    1995-01-01

    A method for the determination of chlorinated organic compounds in aquatic tissue by dual capillary-column gas chromatography with electron-capture detection is described. Whole-body-fish or corbicula tissue is homogenized, Soxhlet extracted, lipid removed by gel permeation chromatography, and fractionated using alumina/silica adsorption chromatography. The extracts are analyzed by dissimilar capillary-column gas chromatography with electron-capture detection. The method reporting limits are 5 micrograms per kilogram (μg/kg) for chlorinated compounds, 50 μg/kg for polychlorinated biphenyls, and 200 μg/kg for toxaphene.

  12. Amine enriched solid sorbents for carbon dioxide capture

    DOEpatents

    Gray, McMahan L.; Soong, Yee; Champagne, Kenneth J.

    2003-04-15

    A new method for making low-cost CO2 sorbents that can be used in large-scale gas-solid processes. The new method entails treating a solid substrate with acid or base and simultaneous or subsequent treatment with a substituted amine salt. The method eliminates the need for organic solvents and polymeric materials for the preparation of CO2 capture systems.

  13. Comparative capture rate responses of mosquito vectors to light trap and human landing collection methods

    USDA-ARS?s Scientific Manuscript database

    Capture rate responses of female Aedes albopictus Skuse, Anopheles quadrimaculatus Say, Culex nigripalpus Theobald, Culex quinquefasciatus Say, and Ochlerotatus triseriatus (Wiedemann) to CDC-type light trap (LT) and human landing (HL) collection methods were observed and evaluated for congruency wi...

  14. Method and apparatus to monitor a beam of ionizing radiation

    DOEpatents

    Blackburn, Brandon W.; Chichester, David L.; Watson, Scott M.; Johnson, James T.; Kinlaw, Mathew T.

    2015-06-02

    Methods and apparatus to capture images of fluorescence generated by ionizing radiation and determine a position of a beam of ionizing radiation generating the fluorescence from the captured images. In one embodiment, the fluorescence is the result of ionization and recombination of nitrogen in air.

  15. Technology Tips

    ERIC Educational Resources Information Center

    Mathematics Teacher, 2004

    2004-01-01

    Some inexpensive or free ways to capture and use images in one's work are described. The first tip demonstrates methods that use some of the built-in capabilities of the Macintosh and Windows-based PC operating systems, and the second tip describes methods for capturing and creating images using SnagIt.

  16. Photoplasma of optically excited metal vapors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bezuglov, N.N.; Llyucharev, A.N.; Stacewicz, T.

    1994-09-01

    A wide range of questions associated with various aspects of photoplasma physics is considered. A comprehensive analysis of processes of optical excitation and de-excitation depending on optical characteristics of an absorbing gas medium is given. Analytical methods used for determining the excitation degree of photoresonance plasma in conditions of resonance radiation transfer are described. The accuracy of the Biberman approximation for effective lifetimes in population kinetics of resonance plasma states is analyzed for many experimental conditions. A detailed discussion of primary ionization mechanisms in photoplasma is given; the kinetics of ionization processes is discussed; and a systematization of various types of photoresonance plasma is presented. Basic aspects of the LIBORS model, which is widely used for studying ionization kinetics of laser photoresonance plasma, and its limitations are considered. An ingenious method used to analytically solve a class of decay-type nonlinear problems, which arise for the capture equation in the case of noticeable saturation of a resonance transition by a short laser pulse, is described. A reliable quantitative description of fluorescence decay curve peculiarities that are associated with the bleaching of gases at resonance line frequencies can be obtained by this method. Some possible applications of photoplasma in problems of optics and spectroscopy are considered. 75 refs., 24 figs., 1 tab.

  17. Approach to magnetic neutron capture therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kuznetsov, Anatoly A.; Podoynitsyn, Sergey N.; Filippov, Victor I.

    2005-11-01

    Purpose: The method of magnetic neutron capture therapy can be described as a combination of two methods: magnetic localization of drugs using magnetically targeted carriers and neutron capture therapy itself. Methods and Materials: In this work, we produced and tested two types of particles for such therapy. Composite ultradispersed ferro-carbon (Fe-C) and iron-boron (Fe-B) particles were formed from vapors of the respective materials. Results: Two-component ultradispersed particles, containing Fe and C, were tested as magnetic adsorbents of L-boronophenylalanine and borax, and it was shown that borax sorption could be effective for creating a high concentration of boron atoms in the area of the tumor. The kinetics of boron release into physiologic solution demonstrate that ultradispersed Fe-B (10%) could be applied for effective magnetic neutron capture therapy. Conclusion: Both types of particles have high magnetization and magnetic homogeneity, form stable magnetic suspensions, and have low toxicity.

  18. An air-liquid contactor for large-scale capture of CO2 from air.

    PubMed

    Holmes, Geoffrey; Keith, David W

    2012-09-13

    We present a conceptually simple method for optimizing the design of a gas-liquid contactor for capture of carbon dioxide from ambient air, or 'air capture'. We apply the method to a slab geometry contactor that uses components, design and fabrication methods derived from cooling towers. We use mass transfer data appropriate for capture using a strong NaOH solution, combined with engineering and cost data derived from engineering studies performed by Carbon Engineering Ltd, and find that the total costs for air contacting alone (no regeneration) can be of the order of $60 per tonne CO2. We analyse the reasons why our cost estimate diverges from that of other recent reports and conclude that the divergence arises from fundamental design choices rather than from differences in costing methodology. Finally, we review the technology risks and conclude that they can be readily addressed by prototype testing.

  19. Capture and dissociation in the complex-forming CH + H2 → CH2 + H, CH + H2 reactions.

    PubMed

    González, Miguel; Saracibar, Amaia; Garcia, Ernesto

    2011-02-28

    The rate coefficients for the capture process CH + H2 → CH3 and the reactions CH + H2 → CH2 + H (abstraction), CH + H2 (exchange) have been calculated in the 200-800 K temperature range, using the quasiclassical trajectory (QCT) method and the most recent global potential energy surface. The reactions, which are of interest in combustion and in astrochemistry, proceed via the formation of long-lived CH3 collision complexes, and the three H atoms become equivalent. QCT rate coefficients for capture are in quite good agreement with experiments. However, an important zero point energy (ZPE) leakage problem occurs in the QCT calculations for the abstraction, exchange and inelastic exit channels. To account for this issue, a pragmatic but accurate approach has been applied, leading to a good agreement with experimental abstraction rate coefficients. Exchange rate coefficients have also been calculated using this approach. Finally, calculations employing QCT capture/phase space theory (PST) models have been carried out, leading to similar values for the abstraction rate coefficients as the QCT and previous quantum mechanical capture/PST methods. This suggests that QCT capture/PST models are a good alternative to the QCT method for this and similar systems.

  20. Capturing method for integral three-dimensional imaging using multiviewpoint robotic cameras

    NASA Astrophysics Data System (ADS)

    Ikeya, Kensuke; Arai, Jun; Mishina, Tomoyuki; Yamaguchi, Masahiro

    2018-03-01

    Integral three-dimensional (3-D) technology for next-generation 3-D television must be able to capture dynamic moving subjects with pan, tilt, and zoom camerawork as good as that in current TV program production. We propose a capturing method for integral 3-D imaging using multiviewpoint robotic cameras. The cameras are controlled through a cooperative synchronous system composed of a master camera controlled by a camera operator and other reference cameras that are utilized for 3-D reconstruction. When the operator captures a subject using the master camera, the region reproduced by the integral 3-D display is regulated in real space according to the subject's position and the view angle of the master camera. Using the cooperative control function, the reference cameras can capture images at the narrowest view angle that does not lose any part of the object region, thereby maximizing the resolution of the image. 3-D models are reconstructed by estimating the depth from complementary multiviewpoint images captured by the robotic cameras arranged in a two-dimensional array. The model is converted into elemental images to generate the integral 3-D images. In experiments, we reconstructed integral 3-D images of karate players and confirmed that the proposed method satisfied the above requirements.

  1. Effectiveness of bowl trapping and netting for inventory of a bee community

    USGS Publications Warehouse

    Grundel, R.; Frohnapple, K.J.; Jean, R.P.; Pavlovic, N.B.

    2011-01-01

    Concern over the status of bees has increased the need to inventory bee communities and, consequently, has increased the need to understand effectiveness of different bee sampling methods. We sampled bees using bowl traps and netting at 25 northwest Indiana sites ranging from open grasslands to forests. Assemblages of bees captured in bowl traps and by netting were very similar, but this similarity was driven by similar relative abundances of commonly captured species. Less common species were often not shared between collection methods (bowls, netting) and only about half of the species were shared between methods. About one-quarter of species were more often captured by one of the two collection methods. Rapid accumulation of species was aided by sampling at temporal and habitat extremes. In particular, collecting samples early and late in the adult flight season and in open and forest habitats was effective in capturing the most species with the fewest samples. The number of samples estimated necessary to achieve a complete inventory using bowls and netting together was high. For example, ≈72% of species estimated capturable in bowls were captured among the 3,159 bees collected in bowls in this study, but ≈30,000–35,000 additional bees would need to be collected to achieve a 100% complete inventory. For bowl trapping, increasing the number of sampling dates or sampling sites was more effective than adding more bowls per sampling date in completing the inventory with the fewest specimens collected.
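    The inventory-completeness figures quoted above (e.g., ≈72% of capturable species observed from 3,159 bees) come from species-richness estimation on abundance data. As a minimal illustration of the idea, the sketch below uses the classic Chao1 estimator; the abstract does not state which estimator the authors used, and the sample counts here are invented:

    ```python
    from collections import Counter

    def chao1(abundances):
        """Chao1 lower-bound estimate of total species richness.

        abundances: per-species capture counts (species with zero
        captures are unobserved and therefore excluded).
        """
        counts = Counter(abundances)
        s_obs = len(abundances)   # observed species
        f1 = counts.get(1, 0)     # singletons (seen exactly once)
        f2 = counts.get(2, 0)     # doubletons (seen exactly twice)
        if f2 == 0:
            return s_obs + f1 * (f1 - 1) / 2.0  # bias-corrected form
        return s_obs + f1 * f1 / (2.0 * f2)

    # Illustrative data: 10 species, several seen only once or twice
    sample = [25, 18, 9, 5, 3, 2, 2, 1, 1, 1]
    est = chao1(sample)
    completeness = len(sample) / est  # fraction of estimated richness observed
    ```

    For this toy sample, 10 species are observed while the estimator suggests about 12.25 are present, i.e., the inventory is roughly 82% complete, which is the same kind of completeness figure the abstract reports.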

  2. Laser Capture Microdissection for Protein and NanoString RNA analysis

    PubMed Central

    Golubeva, Yelena; Salcedo, Rosalba; Mueller, Claudius; Liotta, Lance A.; Espina, Virginia

    2013-01-01

    Laser capture microdissection (LCM) allows the precise procurement of enriched cell populations from a heterogeneous tissue, or live cell culture, under direct microscopic visualization. Histologically enriched cell populations can be procured by harvesting cells of interest directly, or isolating specific cells by ablating unwanted cells. The basic components of laser microdissection technology are a) visualization of cells via light microscopy, b) transfer of laser energy to a thermolabile polymer with either the formation of a polymer-cell composite (capture method) or transfer of laser energy via an ultraviolet laser to photovolatize a region of tissue (cutting method), and c) removal of cells of interest from the heterogeneous tissue section. The capture and cutting methods (instruments) for laser microdissection differ in the manner by which cells of interest are removed from the heterogeneous sample. Laser energy in the capture method is infrared (810 nm), while in the cutting mode the laser is ultraviolet (355 nm). Infrared lasers melt a thermolabile polymer that adheres to the cells of interest, whereas ultraviolet lasers ablate cells for either removal of unwanted cells or excision of a defined area of cells. LCM technology is applicable to an array of applications including mass spectrometry, DNA genotyping and loss-of-heterozygosity analysis, RNA transcript profiling, cDNA library generation, proteomics discovery, and signal kinase pathway profiling. This chapter describes laser capture microdissection using an ArcturusXT instrument for protein LCM sample analysis, and using a mmi CellCut Plus® instrument for RNA analysis via NanoString technology. PMID:23027006

  3. Characteristics of work-related fatal and hospitalised injuries not captured in workers' compensation data.

    PubMed

    Koehoorn, M; Tamburic, L; Xu, F; Alamgir, H; Demers, P A; McLeod, C B

    2015-06-01

    (1) To identify work-related fatal and non-fatal hospitalised injuries using multiple data sources, (2) to compare case-ascertainment from external data sources with accepted workers' compensation claims and (3) to investigate the characteristics of work-related fatal and hospitalised injuries not captured by workers' compensation. Work-related fatal injuries were ascertained from vital statistics, coroners and hospital discharge databases using payment and diagnosis codes and injury and work descriptions; and work-related (non-fatal) injuries were ascertained from the hospital discharge database using admission, diagnosis and payment codes. Injuries for British Columbia residents aged 15-64 years from 1991 to 2009 ascertained from the above external data sources were compared to accepted workers' compensation claims using per cent captured, validity analyses and logistic regression. The majority of work-related fatal injuries identified in the coroners data (83%) and the majority of work-related hospitalised injuries (95%) were captured as an accepted workers' compensation claim. A work-related coroner report was a positive predictor (88%), and the responsibility of payment field in the hospital discharge record a sensitive indicator (94%), for a workers' compensation claim. Injuries not captured by workers' compensation were associated with female gender, type of work (natural resources and other unspecified work) and injury diagnosis (eg, airway-related, dislocations and undetermined/unknown injury). Some work-related injuries captured by external data sources were not found in workers' compensation data in British Columbia. This may be the result of capturing injuries or workers that are ineligible for workers' compensation, or the result of injuries that go unreported to the compensation system. 
Hospital discharge records and coroner reports may provide opportunities to identify workers (or family members) with an unreported work-related injury and to provide them with information for submitting a workers' compensation claim.

  4. The Cooking and Pneumonia Study (CAPS) in Malawi: Implementation of Remote Source Data Verification.

    PubMed

    Weston, William; Smedley, James; Bennett, Andrew; Mortimer, Kevin

    2016-01-01

    Source data verification (SDV) is a data monitoring procedure which compares the original records with the Case Report Form (CRF). Traditionally, on-site SDV relies on monitors making multiple visits to study sites, requiring extensive resources. The Cooking And Pneumonia Study (CAPS) is a 24-month village-level cluster randomized controlled trial assessing the effectiveness of an advanced cook-stove intervention in preventing pneumonia in children under five in rural Malawi (www.capstudy.org). CAPS used smartphones to capture digital images of the original records on an electronic CRF (eCRF). In the present study, descriptive statistics are used to report the experience of electronic data capture with remote SDV in a challenging research setting in rural Malawi. At three-monthly intervals, fieldworkers, who were employed by CAPS, captured pneumonia data from the original records onto the eCRF. Fieldworkers also captured digital images of the original records. Once Internet connectivity was available, the data captured on the eCRF and the digital images of the original records were uploaded to a web-based SDV application. This enabled SDV to be conducted remotely from the UK. We conducted SDV of the pneumonia data (occurrence, severity, and clinical indicators) recorded in the eCRF against the data in the digital images of the original records. 664 episodes of pneumonia were recorded after 6 months of follow-up. Of these 664 episodes, 611 (92%) had a finding of pneumonia in the original records. All digital images of the original records were clear and legible. Electronic data capture using eCRFs on mobile technology is feasible in rural Malawi. Capturing digital images of the original records in the field allows remote SDV to be conducted efficiently and securely without requiring additional field visits. We recommend these approaches in similar settings, especially those with health endpoints.
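    The 92% figure above is a simple concordance ratio between eCRF entries and the source-document images. A minimal sketch of that calculation follows; the record structure and field values are hypothetical, not taken from the CAPS database:

    ```python
    def sdv_concordance(ecrf_records, source_records):
        """Fraction of eCRF entries confirmed by the source documents.

        Both arguments are dicts keyed by episode id; an entry counts as
        confirmed when the transcription from the source-document image
        matches the eCRF finding exactly.
        """
        confirmed = sum(1 for episode_id, finding in ecrf_records.items()
                        if source_records.get(episode_id) == finding)
        return confirmed / len(ecrf_records)

    # Hypothetical example: 2 of 3 episodes confirmed -> concordance 2/3
    ecrf = {1: "pneumonia", 2: "pneumonia", 3: "severe pneumonia"}
    source = {1: "pneumonia", 2: "no pneumonia", 3: "severe pneumonia"}
    rate = sdv_concordance(ecrf, source)
    ```

    With 611 of 664 episodes confirmed, the same ratio gives 611/664 ≈ 0.92, matching the abstract.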

  5. Influence of trap modifications and environmental predictors on capture success of southern flying squirrels

    USGS Publications Warehouse

    Jacques, Christopher N.; Zweep, James S.; Scheihing, Mary E.; Rechkemmer, Will T.; Jenkins, Sean E.; Klaver, Robert W.; Dubay, Shelli A.

    2017-01-01

    Sherman traps are the most commonly used live traps in studies of small mammals and have been successfully used in the capture of arboreal species such as the southern flying squirrel (Glaucomys volans). However, southern flying squirrels spend proportionately less time foraging on the ground, which necessitates above-ground trapping methods and modifications of capture protocols. Further, quantitative estimates of the factors affecting capture success of flying squirrel populations have focused solely on effects of trapping methodologies. We developed and evaluated the efficacy of a portable Sherman trap design for capturing southern flying squirrels during 2015–2016 at the Alice L. Kibbe Field Station, Illinois, USA. Additionally, we used logistic regression to quantify potential effects of time-dependent (e.g., weather) and time-independent (e.g., habitat, extrinsic) factors on capture success of southern flying squirrels. We recorded 165 capture events (119 F, 44 M, 2 unknown) using our modified Sherman trap design. Probability of capture success decreased by 0.10 for each 1 °C increase in daily maximum temperature and by 0.09 for each unit (km/hr) increase in wind speed. Conversely, probability of capture success increased by 1.2 for each 1 °C increase in daily minimum temperature. The probability of capturing flying squirrels was negatively associated with trap orientation. When tree-mounted traps are required, our modified trap design is a safe, efficient, and cost-effective method of capturing animals when moderate weather (temperature and wind speed) conditions prevail. Further, we believe that strategic placement of traps (e.g., northeast side of tree) and quantitative information on site-specific (e.g., trap location) characteristics (e.g., topographical features, slope, aspect, climatologic factors) could increase southern flying squirrel capture success. © 2017 The Wildlife Society.
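    The logistic-regression results above can be read as a model of capture probability. The sketch below shows the general shape of such a model; the intercept and coefficient values are hypothetical placeholders, chosen only so that the signs of the effects match the abstract (success falling with daily maximum temperature and with wind speed), not taken from the paper:

    ```python
    import math

    def capture_probability(tmax_c, wind_kmh,
                            beta0=1.5, b_tmax=-0.10, b_wind=-0.09):
        """Logistic model of trap success:
        logit(p) = beta0 + b_tmax * tmax_c + b_wind * wind_kmh.
        All coefficients here are hypothetical illustrations.
        """
        logit = beta0 + b_tmax * tmax_c + b_wind * wind_kmh
        return 1.0 / (1.0 + math.exp(-logit))

    # Warmer maximum temperatures and stronger wind both lower the
    # predicted probability of a capture, as reported in the abstract.
    cool_calm = capture_probability(tmax_c=20.0, wind_kmh=5.0)
    hot_windy = capture_probability(tmax_c=32.0, wind_kmh=20.0)
    ```

    Note that the per-degree and per-km/hr changes quoted in the abstract are on the fitted model's own scale; here they simply enter as log-odds slopes for illustration.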

  6. 40 CFR 63.4361 - How do I determine the emission capture system efficiency?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... from the web coating/printing operation surfaces they are applied to occurs within the capture system... system efficiency? 63.4361 Section 63.4361 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... emission capture system efficiency? You must use the procedures and test methods in this section to...

  7. 40 CFR 63.4765 - How do I determine the emission capture system efficiency?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ..., substitute TVH for each occurrence of the term volatile organic compounds (VOC) in the methods. (3) Use... building enclosure. During the capture efficiency measurement, all organic compound emitting operations... enclosure is a building enclosure. During the capture efficiency measurement, all organic compound emitting...

  8. 40 CFR 63.4765 - How do I determine the emission capture system efficiency?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ..., substitute TVH for each occurrence of the term volatile organic compounds (VOC) in the methods. (3) Use... building enclosure. During the capture efficiency measurement, all organic compound emitting operations... enclosure is a building enclosure. During the capture efficiency measurement, all organic compound emitting...

  9. Apparatus and method for detecting full-capture radiation events

    DOEpatents

    Odell, D.M.C.

    1994-10-11

    An apparatus and method are disclosed for sampling the output signal of a radiation detector and distinguishing full-capture radiation events from Compton scattering events. The output signal of a radiation detector is continuously sampled. The samples are converted to digital values and input to a discriminator where samples that are representative of events are identified. The discriminator transfers only event samples, that is, samples representing full-capture events and Compton events, to a signal processor where the samples are saved in a three-dimensional count matrix with time (from the time of onset of the pulse) on the first axis, sample pulse current amplitude on the second axis, and number of samples on the third axis. The stored data are analyzed to separate the Compton events from full-capture events, and the energy of the full-capture events is determined without having determined the energies of any of the individual radiation detector events. 4 figs.
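    The count matrix described above is, in effect, a two-dimensional histogram over (time-since-onset, amplitude) bins whose stored value is the sample count, i.e., the abstract's third axis. A minimal sketch under that reading, with the binning scheme itself a hypothetical choice:

    ```python
    from collections import Counter

    def accumulate_events(event_samples):
        """Accumulate discriminator-passed samples into a count matrix.

        Axis conventions follow the abstract: the first index is time
        since pulse onset (bin number), the second is sample pulse
        current amplitude (bin number), and the stored value is the
        number of samples observed in that (time, amplitude) cell.
        `event_samples` is an iterable of (time_bin, amplitude_bin)
        pairs for full-capture and Compton event samples.
        """
        counts = Counter()
        for time_bin, amp_bin in event_samples:
            counts[(time_bin, amp_bin)] += 1
        return counts
    ```

    Because Compton events deposit only part of the photon energy, their samples populate different amplitude cells than full-capture events, which is what allows the stored matrix to be separated in the subsequent analysis.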

  10. FlyCap: Markerless Motion Capture Using Multiple Autonomous Flying Cameras.

    PubMed

    Xu, Lan; Liu, Yebin; Cheng, Wei; Guo, Kaiwen; Zhou, Guyue; Dai, Qionghai; Fang, Lu

    2017-07-18

    Aiming at automatic, convenient and non-intrusive motion capture, this paper presents a new-generation markerless motion capture technique, the FlyCap system, to capture surface motions of moving characters using multiple autonomous flying cameras (autonomous unmanned aerial vehicles (UAVs), each integrated with an RGBD video camera). During data capture, three cooperative flying cameras automatically track and follow the moving target, who performs large-scale motions in a wide space. We propose a novel non-rigid surface registration method to track and fuse the depth data of the three flying cameras for surface motion tracking of the moving target, and simultaneously calculate the pose of each flying camera. We leverage the visual-odometry information provided by the UAV platform, and formulate the surface tracking problem as a non-linear objective function that can be linearized and effectively minimized through a Gauss-Newton method. Quantitative and qualitative experimental results demonstrate the plausible surface and motion reconstruction results.

  11. Apparatus and method for detecting full-capture radiation events

    DOEpatents

    Odell, Daniel M. C.

    1994-01-01

    An apparatus and method for sampling the output signal of a radiation detector and distinguishing full-capture radiation events from Compton scattering events. The output signal of a radiation detector is continuously sampled. The samples are converted to digital values and input to a discriminator where samples that are representative of events are identified. The discriminator transfers only event samples, that is, samples representing full-capture events and Compton events, to a signal processor where the samples are saved in a three-dimensional count matrix with time (from the time of onset of the pulse) on the first axis, sample pulse current amplitude on the second axis, and number of samples on the third axis. The stored data are analyzed to separate the Compton events from full-capture events, and the energy of the full-capture events is determined without having determined the energies of any of the individual radiation detector events.

  12. Description and evaluation of a peracetic acid air sampling and analysis method.

    PubMed

    Nordling, John; Kinsky, Owen R; Osorio, Magdalena; Pechacek, Nathan

    2017-12-01

    Peracetic acid (PAA) is a corrosive chemical with a pungent odor, which is extensively used in occupational settings and poses various health hazards to exposed workers. Currently, there is no US government agency recommended method that could be applied universally for the sampling and analysis of PAA. Legacy methods for determining airborne PAA vapor levels frequently suffered from cross-reactivity with other chemicals, particularly hydrogen peroxide (H2O2). Therefore, to remove the confounding factor of cross-reactivity, a new viable, sensitive method was developed for assessment of PAA exposure levels, based on the differential reaction kinetics of PAA with methyl p-tolylsulfide (MTS), relative to H2O2, to preferentially derive methyl p-tolylsulfoxide (MTSO). By quantifying the MTSO concentration produced in the liquid capture solution from an air sampler, using an internal standard, and utilizing the reaction stoichiometry of PAA and MTS, the original airborne concentration of PAA is determined. After refining this liquid trap high-performance liquid chromatography (HPLC) method in the laboratory, it was tested in five workplace settings where PAA products were used. PAA levels ranged from the detection limit of 0.013 parts per million (ppm) to 0.4 ppm. The results indicate a viable and potentially dependable method to assess the concentrations of PAA vapors under occupational exposure scenarios, though only a small number of field measurements were taken while field testing this method. However, the low limit of detection and precision offered by this method make it a strong candidate for further testing and validation to expand the uses of this liquid trap HPLC method.
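    The conversion from quantified MTSO back to an airborne PAA concentration follows from the 1:1 reaction stoichiometry of PAA with MTS plus the sampled air volume. A minimal sketch of that arithmetic, assuming the ideal-gas molar volume at 25 °C and 1 atm; the paper's exact sampling conditions and volumes are not given in the abstract, so the numbers below are illustrative:

    ```python
    MOLAR_VOLUME_25C = 24.45  # L/mol, ideal gas at 25 degC and 1 atm

    def paa_ppm(mtso_umol, air_volume_l):
        """Airborne PAA concentration (ppm by volume) from the MTSO
        formed in the capture solution, assuming the 1:1
        PAA + MTS -> MTSO stoichiometry described in the abstract.

        mtso_umol:    micromoles of MTSO quantified by HPLC
        air_volume_l: litres of air drawn through the sampler
        """
        moles_paa = mtso_umol * 1e-6            # 1 mol MTSO per mol PAA
        moles_air = air_volume_l / MOLAR_VOLUME_25C
        return (moles_paa / moles_air) * 1e6    # ppm by volume
    ```

    For example, 0.1 µmol of MTSO collected from 60 L of air corresponds to about 0.041 ppm PAA, within the range reported above.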

  13. Nuclear Astrophysics At ISAC With DRAGON

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    D'Auria, John M.

    2005-05-24

    The unique DRAGON (recoil mass separator) facility is now available to provide measurements of radiative capture reactions involving short-lived exotic reactants which are considered important in explosive stellar scenarios such as novae and X-ray bursts. A description of the first study completed, the 1H(21Na,γ)22Mg reaction, will be summarized and updated. In addition, the planned program for DRAGON will be presented along with a summary of the upgrade of the ISAC Radioactive Beams laboratory.

  14. Autonomous subpixel satellite track end point determination for space-based images.

    PubMed

    Simms, Lance M

    2011-08-01

    An algorithm for determining satellite track end points with subpixel resolution in space-based images is presented. The algorithm allows for significant curvature in the imaged track due to rotation of the spacecraft capturing the image. The motivation behind the subpixel end point determination is first presented, followed by a description of the methodology used. Results from running the algorithm on real ground-based and simulated space-based images are shown to highlight its effectiveness.

  15. Hall viscosity and geometric response in the Chern-Simons matrix model of the Laughlin states

    NASA Astrophysics Data System (ADS)

    Lapa, Matthew F.; Hughes, Taylor L.

    2018-05-01

    We study geometric aspects of the Laughlin fractional quantum Hall (FQH) states using a description of these states in terms of a matrix quantum mechanics model known as the Chern-Simons matrix model (CSMM). This model was proposed by Polychronakos as a regularization of the noncommutative Chern-Simons theory description of the Laughlin states proposed earlier by Susskind. Both models can be understood as describing the electrons in a FQH state as forming a noncommutative fluid, i.e., a fluid occupying a noncommutative space. Here, we revisit the CSMM in light of recent work on geometric response in the FQH effect, with the goal of determining whether the CSMM captures this aspect of the physics of the Laughlin states. For this model, we compute the Hall viscosity, Hall conductance in a nonuniform electric field, and the Hall viscosity in the presence of anisotropy (or intrinsic geometry). Our calculations show that the CSMM captures the guiding center contribution to the known values of these quantities in the Laughlin states, but lacks the Landau orbit contribution. The interesting correlations in a Laughlin state are contained entirely in the guiding center part of the state/wave function, and so we conclude that the CSMM accurately describes the most important aspects of the physics of the Laughlin FQH states, including the Hall viscosity and other geometric properties of these states, which are of current interest.

  16. fMRI capture of auditory hallucinations: Validation of the two-steps method.

    PubMed

    Leroy, Arnaud; Foucher, Jack R; Pins, Delphine; Delmaire, Christine; Thomas, Pierre; Roser, Mathilde M; Lefebvre, Stéphanie; Amad, Ali; Fovet, Thomas; Jaafari, Nemat; Jardri, Renaud

    2017-10-01

    Our purpose was to validate a reliable method to capture brain activity concomitant with hallucinatory events, which constitute frequent and disabling experiences in schizophrenia. Capturing hallucinations using functional magnetic resonance imaging (fMRI) remains very challenging. We previously developed a method based on a two-steps strategy including (1) multivariate data-driven analysis of per-hallucinatory fMRI recording and (2) selection of the components of interest based on a post-fMRI interview. However, two tests still need to be conducted to rule out critical pitfalls of conventional fMRI capture methods before this two-steps strategy can be adopted in hallucination research: replication of these findings on an independent sample and assessment of the reliability of the hallucination-related patterns at the subject level. To do so, we recruited a sample of 45 schizophrenia patients suffering from frequent hallucinations, 20 schizophrenia patients without hallucinations and 20 matched healthy volunteers; all participants underwent four different experiments. The main findings are (1) high accuracy in reporting unexpected sensory stimuli in an MRI setting; (2) good detection concordance between hypothesis-driven and data-driven analysis methods (as used in the two-steps strategy) when controlled unexpected sensory stimuli are presented; (3) good agreement of the two-steps method with the online button-press approach to capture hallucinatory events; (4) high spatial consistency of hallucinatory-related networks detected using the two-steps method on two independent samples. By validating the two-steps method, we advance toward the possible transfer of such technology to new image-based therapies for hallucinations. Hum Brain Mapp 38:4966-4979, 2017. © 2017 Wiley Periodicals, Inc.

  17. Adiabatically describing rare earths using microscopic deformations

    NASA Astrophysics Data System (ADS)

    Nobre, Gustavo; Dupuis, Marc; Herman, Michal; Brown, David

    2017-09-01

    Recent works showed that reactions on well-deformed nuclei in the rare-earth region are very well described by an adiabatic method. This assumes a spherical optical potential (OP) accounting for non-rotational degrees of freedom while the deformed configuration is described by couplings to states of the g.s. rotational band. This method has, apart from the global OP, only the deformation parameters as inputs, with no additional fitted variables. For this reason, it has only been applied to nuclei with well-measured deformations. With the new computational capabilities, microscopic large-scale calculations of deformation parameters within the HFB method based on the D1S Gogny force are available in the literature. We propose to use such microscopic deformations in our adiabatic method, allowing us to reproduce the cross-section agreement observed in stable nuclei, and to reliably extend this description to nuclei far from stability, describing the whole rare-earth region. Since all cross sections, such as capture and charge exchange, strongly depend on the correct calculation of absorption from the incident channel (from direct reaction mechanisms), this approach significantly improves the accuracy of cross sections and transitions relevant to astrophysical studies. The work at BNL was sponsored by the Office of Nuclear Physics, Office of Science of the US Department of Energy, under Contract No. DE-AC02-98CH10886 with Brookhaven Science Associates, LLC.

  18. Acceleration of integral imaging based incoherent Fourier hologram capture using graphic processing unit.

    PubMed

    Jeong, Kyeong-Min; Kim, Hee-Seung; Hong, Sung-In; Lee, Sung-Keun; Jo, Na-Young; Kim, Yong-Soo; Lim, Hong-Gi; Park, Jae-Hyeung

    2012-10-08

    Speed enhancement of integral imaging based incoherent Fourier hologram capture using a graphic processing unit is reported. The integral imaging based method enables exact hologram capture of real-existing three-dimensional objects under regular incoherent illumination. In our implementation, we apply a parallel computation scheme using the graphic processing unit, accelerating the processing speed. Using the enhanced speed of hologram capture, we also implement a pseudo real-time hologram capture and optical reconstruction system. The overall operation speed is measured to be 1 frame per second.

  19. [INVITED] Computational intelligence for smart laser materials processing

    NASA Astrophysics Data System (ADS)

    Casalino, Giuseppe

    2018-03-01

    Computational intelligence (CI) involves using a computer algorithm to capture hidden knowledge from data and to use it for training an 'intelligent machine' to make complex decisions without human intervention. As simulation is becoming more prevalent from design and planning to manufacturing and operations, laser material processing can also benefit from computer-generated knowledge through soft computing. This work is a review of the state of the art of the methodology and applications of CI in laser materials processing (LMP), which is nowadays receiving increasing interest from world-class manufacturers and Industry 4.0. The focus is on the methods that have been proven effective and robust in solving several problems in welding, cutting, drilling, surface treating and additive manufacturing using the laser beam. After a basic description of the most common computational intelligence techniques employed in manufacturing, four sections, namely laser joining, machining, surface treatment, and additive manufacturing, cover the most recent applications in the already extensive literature regarding CI in LMP. Finally, emerging trends and future challenges are identified and discussed.

  20. The State and Trends of Barcode, RFID, Biometric and Pharmacy Automation Technologies in US Hospitals

    PubMed Central

    Uy, Raymonde Charles Y.; Kury, Fabricio P.; Fontelo, Paul A.

    2015-01-01

    The standard of safe medication practice requires strict observance of the five rights of medication administration: the right patient, drug, time, dose, and route. Despite adherence to these guidelines, medication errors remain a public health concern that has generated health policies and hospital processes that leverage automation and computerization to reduce these errors. Barcode, RFID, biometric and pharmacy automation technologies have been demonstrated in the literature to decrease the incidence of medication errors by minimizing the human factors involved in the process. Despite evidence suggesting the effectiveness of these technologies, adoption rates and trends vary across hospital systems. The objective of this study is to examine the state and adoption trends of automatic identification and data capture (AIDC) methods and pharmacy automation technologies in U.S. hospitals. A retrospective descriptive analysis of survey data from the HIMSS Analytics® Database was done, demonstrating optimistic growth in the adoption of these patient safety solutions. PMID:26958264

  1. Collision cross sections of N2 by H+ impact at keV energies within time-dependent density-functional theory

    NASA Astrophysics Data System (ADS)

    Yu, W.; Gao, C.-Z.; Zhang, Y.; Zhang, F. S.; Hutton, R.; Zou, Y.; Wei, B.

    2018-03-01

    We calculate electron capture and ionization cross sections of N2 impacted by the H+ projectile at keV energies. To this end, we employ the time-dependent density-functional theory coupled nonadiabatically to molecular dynamics. To avoid the explicit treatment of the complex density matrix in the calculation of cross sections, we propose an approximate method based on the assumption of constant ionization rate over the period of the projectile passing the absorbing boundary. Our results agree reasonably well with experimental data and semi-empirical results within the measurement uncertainties in the considered energy range. The discrepancies are mainly attributed to the inadequate description of exchange-correlation functional and the crude approximation for constant ionization rate. Although the present approach does not predict the experiments quantitatively for collision energies below 10 keV, it is still helpful to calculate total cross sections of ion-molecule collisions within a certain energy range.
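    In impact-parameter calculations of this kind, total cross sections are commonly assembled by weighting the per-trajectory electron-loss probability over impact parameters (a standard construction; the paper's exact expression may differ):

```latex
\sigma \;=\; 2\pi \int_0^\infty b\, P(b)\, \mathrm{d}b ,
```

    where P(b) is the probability of capture or ionization at impact parameter b, obtained in this context from the electron density lost through the absorbing boundary (under the constant-ionization-rate assumption described above) or transferred to the projectile during each TDDFT trajectory.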

  2. Collective coordinates theory for discrete soliton ratchets in the sine-Gordon model

    NASA Astrophysics Data System (ADS)

    Sánchez-Rey, Bernardo; Quintero, Niurka R.; Cuevas-Maraver, Jesús; Alejo, Miguel A.

    2014-10-01

    A collective coordinate theory is developed for soliton ratchets in the damped discrete sine-Gordon model driven by a biharmonic force. An ansatz with two collective coordinates, namely the center and the width of the soliton, is assumed as an approximated solution of the discrete nonlinear equation. The dynamical equations of these two collective coordinates, obtained by means of the generalized travelling wave method, explain the mechanism underlying the soliton ratchet and capture qualitatively all the main features of this phenomenon. The numerical simulation of these equations accounts for the existence of a nonzero depinning threshold, the nonsinusoidal behavior of the average velocity as a function of the relative phase between the harmonics of the driver, the nonmonotonic dependence of the average velocity on the damping, and the existence of nontransporting regimes beyond the depinning threshold. In particular, it provides a good description of the intriguing and complex pattern of subspaces corresponding to different dynamical regimes in parameter space.
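    A minimal numerical sketch of the underlying model may help: the damped discrete sine-Gordon chain driven by a biharmonic force, with the kink center tracked via the spatial gradient. All parameter values here are illustrative only, not taken from the paper.

```python
import numpy as np

def simulate_kink(N=100, dt=0.01, steps=20000, alpha=0.15,
                  eps=0.2, omega=0.1, phase=np.pi / 2):
    """Integrate the damped discrete sine-Gordon chain driven by a
    biharmonic force f(t) = eps*[cos(w t) + cos(2 w t + phase)]
    using a semi-implicit Euler scheme, and track the kink center.
    Illustrative sketch only; parameters are not from the paper."""
    x = np.arange(N)
    u = 4.0 * np.arctan(np.exp(x - N / 2))   # static kink initial profile
    v = np.zeros(N)
    centers = []
    for k in range(steps):
        t = k * dt
        lap = np.zeros(N)
        lap[1:-1] = u[2:] - 2 * u[1:-1] + u[:-2]   # discrete Laplacian
        f = eps * (np.cos(omega * t) + np.cos(2 * omega * t + phase))
        a = lap - np.sin(u) - alpha * v + f
        v += dt * a
        u += dt * v
        # kink center: gradient-weighted mean position
        g = np.abs(np.diff(u))
        centers.append((x[:-1] * g).sum() / g.sum())
    return np.array(centers)
```

    The slope of the returned center trajectory over time gives the average ratchet velocity, whose dependence on the relative phase and damping is what the collective coordinate theory explains.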

  3. Collective coordinates theory for discrete soliton ratchets in the sine-Gordon model.

    PubMed

    Sánchez-Rey, Bernardo; Quintero, Niurka R; Cuevas-Maraver, Jesús; Alejo, Miguel A

    2014-10-01

    A collective coordinate theory is developed for soliton ratchets in the damped discrete sine-Gordon model driven by a biharmonic force. An ansatz with two collective coordinates, namely the center and the width of the soliton, is assumed as an approximated solution of the discrete nonlinear equation. The dynamical equations of these two collective coordinates, obtained by means of the generalized travelling wave method, explain the mechanism underlying the soliton ratchet and capture qualitatively all the main features of this phenomenon. The numerical simulation of these equations accounts for the existence of a nonzero depinning threshold, the nonsinusoidal behavior of the average velocity as a function of the relative phase between the harmonics of the driver, the nonmonotonic dependence of the average velocity on the damping, and the existence of nontransporting regimes beyond the depinning threshold. In particular, it provides a good description of the intriguing and complex pattern of subspaces corresponding to different dynamical regimes in parameter space.

  4. Environmental DNA sampling protocol - filtering water to capture DNA from aquatic organisms

    USGS Publications Warehouse

    Laramie, Matthew B.; Pilliod, David S.; Goldberg, Caren S.; Strickler, Katherine M.

    2015-09-29

    Environmental DNA (eDNA) analysis is an effective method of determining the presence of aquatic organisms such as fish, amphibians, and other taxa. This publication is meant to guide researchers and managers in the collection, concentration, and preservation of eDNA samples from lentic and lotic systems. A sampling workflow diagram and three sampling protocols are included as well as a list of suggested supplies. Protocols include filter and pump assembly using: (1) a hand-driven vacuum pump, ideal for sample collection in remote sampling locations where no electricity is available and when equipment weight is a primary concern; (2) a peristaltic pump powered by a rechargeable battery-operated driver/drill, suitable for remote sampling locations when weight consideration is less of a concern; (3) a 120-volt alternating current (AC) powered peristaltic pump suitable for any location where 120-volt AC power is accessible, or for roadside sampling locations. Images and detailed descriptions are provided for each step in the sampling and preservation process.

  5. Feasibility and acceptability of cell phone diaries to measure HIV risk behavior among female sex workers.

    PubMed

    Roth, Alexis M; Hensel, Devon J; Fortenberry, J Dennis; Garfein, Richard S; Gunn, Jayleen K L; Wiehe, Sarah E

    2014-12-01

    Individual, social, and structural factors affecting HIV risk behaviors among female sex workers (FSWs) are difficult to assess using retrospective survey methods. To test the feasibility and acceptability of cell phone diaries to collect information about sexual events, we recruited 26 FSWs in Indianapolis, Indiana (US). Over 4 weeks, FSWs completed twice-daily digital diaries about their mood, drug use, sexual interactions, and daily activities. Feasibility was assessed using repeated-measures general linear modeling, and descriptive statistics were used to examine event-level contextual information and acceptability. Of 1,420 diaries expected, 90.3 % were completed by participants and compliance was stable over time (p > .05 for linear trend). Sexual behavior was captured in 22 % of diaries and participant satisfaction with diary data collection was high. These data provide insight into event-level factors impacting HIV risk among FSWs. We discuss implications for models of sexual behavior and individually tailored interventions to prevent HIV in this high-risk group.

  6. Fast-ion Dα spectrum diagnostic in the EAST

    NASA Astrophysics Data System (ADS)

    Hou, Y. M.; Wu, C. R.; Huang, J.; Heidbrink, W. W.; von Hellermann, M. G.; Xu, Z.; Jin, Z.; Chang, J. F.; Zhu, Y. B.; Gao, W.; Chen, Y. J.; Lyu, B.; Hu, R. J.; Zhang, P. F.; Zhang, L.; Gao, W.; Wu, Z. W.; Yu, Y.; Ye, M. Y.

    2016-11-01

    In toroidal magnetic fusion devices, the fast-ion D-alpha (FIDA) diagnostic is a powerful method for studying fast-ion behavior. Fast-ion characteristics can be inferred from the Doppler-shifted spectrum of Dα light produced by the charge exchange recombination process between fast ions and the probe beam. Since the conceptual design was presented at the last HTPD conference, significant progress has been made in applying FIDA systems to the Experimental Advanced Superconducting Tokamak (EAST). Both co-current and counter-current neutral beam injectors are available, and each can deliver 2-4 MW of beam power at 50-80 keV beam energy. Presently, two sets of high-throughput spectrometer systems have been installed on EAST, allowing passing and trapped fast-ion characteristics to be captured simultaneously, using a Kaiser HoloSpec transmission grating spectrometer and a Bunkoukeiki FLP-200 volume phase holographic spectrometer coupled with Princeton Instruments ProEM 1024B eXcelon and Andor DU-888 iXon3 1024 CCD cameras, respectively. This paper presents the details of the hardware and the experimental spectra.

  7. The impact of nurse prescribing on the clinical setting.

    PubMed

    Creedon, Rena; Byrne, Stephen; Kennedy, Julia; McCarthy, Suzanne

    To investigate the impact nurse prescribing has on the organisation, patient and health professional, and to identify factors associated with the growth of nurse prescribing. Systematic search and narrative review. Data obtained through CINAHL, PubMed, ScienceDirect, Online Computer Library Centre (OCLC), databases/websites, and hand searching. English peer-reviewed quantitative, qualitative and mixed-method articles published from September 2009 through to August 2014 exploring nurse prescribing from the perspective of the organisation, health professional and patient were included. Following a systematic selection process, the studies identified were also assessed for quality by applying Cardwell's framework. From the initial 443 citations, 37 studies were included in the review. Most studies were descriptive in nature. Commonalities addressed were stakeholders' views, prescribing in practice, jurisdiction, education and benefits/barriers. Prescriptive authority for nurses continues to be a positive addition to clinical practice. However, concerns have emerged regarding appropriate support, relationships and jurisdictional issues. A more comprehensive understanding of nurse and midwife prescribing workloads is required to capture the true impact and cost-effectiveness of the initiative.

  8. Mapping transiently formed and sparsely populated conformations on a complex energy landscape

    PubMed Central

    Wang, Yong; Papaleo, Elena; Lindorff-Larsen, Kresten

    2016-01-01

    Determining the structures, kinetics, thermodynamics and mechanisms that underlie conformational exchange processes in proteins remains extremely difficult. Only in favourable cases is it possible to provide atomic-level descriptions of sparsely populated and transiently formed alternative conformations. Here we benchmark the ability of enhanced-sampling molecular dynamics simulations to determine the free energy landscape of the L99A cavity mutant of T4 lysozyme. We find that the simulations capture key properties previously measured by NMR relaxation dispersion methods including the structure of a minor conformation, the kinetics and thermodynamics of conformational exchange, and the effect of mutations. We discover a new tunnel that involves the transient exposure towards the solvent of an internal cavity, and show it to be relevant for ligand escape. Together, our results provide a comprehensive view of the structural landscape of a protein, and point forward to studies of conformational exchange in systems that are less characterized experimentally. DOI: http://dx.doi.org/10.7554/eLife.17505.001 PMID:27552057

  9. Function-Space-Based Solution Scheme for the Size-Modified Poisson-Boltzmann Equation in Full-Potential DFT.

    PubMed

    Ringe, Stefan; Oberhofer, Harald; Hille, Christoph; Matera, Sebastian; Reuter, Karsten

    2016-08-09

    The size-modified Poisson-Boltzmann (MPB) equation is an efficient implicit solvation model which also captures electrolytic solvent effects. It combines an account of the dielectric solvent response with a mean-field description of solvated finite-sized ions. We present a general solution scheme for the MPB equation based on a fast function-space-oriented Newton method and a Green's function preconditioned iterative linear solver. In contrast to popular multigrid solvers, this approach allows us to fully exploit specialized integration grids and optimized integration schemes. We describe a corresponding numerically efficient implementation for the full-potential density-functional theory (DFT) code FHI-aims. We show that together with an additional Stern layer correction the DFT+MPB approach can describe the mean activity coefficient of a KCl aqueous solution over a wide range of concentrations. The high sensitivity of the calculated activity coefficient to the ionic parameters employed suggests using extensively tabulated experimental activity coefficients of salt solutions in a systematic parametrization protocol.
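    To illustrate the Newton linearization at the core of such solvers, here is a textbook finite-difference sketch for the ordinary (not size-modified) 1D dimensionless Poisson-Boltzmann equation phi'' = sinh(phi) with Dirichlet boundary conditions. This is not the paper's function-space method, and all parameters are illustrative.

```python
import numpy as np

def solve_pb_1d(phi0=2.0, L=10.0, n=200, tol=1e-10, max_iter=50):
    """Newton's method for phi'' = sinh(phi), phi(0)=phi0, phi(L)=0,
    discretized by second-order finite differences on n interior points.
    Each iteration solves J * delta = -F with the tridiagonal Jacobian
    J = D2/h^2 - diag(cosh(phi)). Illustrative sketch only."""
    h = L / (n + 1)
    phi = np.linspace(phi0, 0.0, n + 2)[1:-1]   # linear initial guess
    for _ in range(max_iter):
        left = np.concatenate(([phi0], phi[:-1]))
        right = np.concatenate((phi[1:], [0.0]))
        # residual of the discretized equation
        F = (left - 2 * phi + right) / h**2 - np.sinh(phi)
        if np.max(np.abs(F)) < tol:
            break
        J = (np.diag(-2.0 / h**2 - np.cosh(phi))
             + np.diag(np.full(n - 1, 1.0 / h**2), 1)
             + np.diag(np.full(n - 1, 1.0 / h**2), -1))
        phi += np.linalg.solve(J, -F)   # Newton update
    return phi
```

    The solution decays from phi0 at the charged boundary toward zero in the bulk, the familiar screened double-layer profile.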

  10. Vibration of a spatial elastica constrained inside a straight tube

    NASA Astrophysics Data System (ADS)

    Chen, Jen-San; Fang, Joyce

    2014-04-01

    In this paper we study the dynamic behavior of a clamped-clamped spatial elastica under edge thrust constrained inside a straight cylindrical tube. Attention is focused on the calculation of the natural frequencies and mode shapes of the planar and spatial one-point-contact deformations. The main issue in determining the natural frequencies of a constrained rod is the movement of the contact point during vibration. In order to capture the physical essence of the contact-point movement, an Eulerian description of the equations of motion based on director theory is formulated. After proper linearization of the equations of motion, boundary conditions, and contact conditions, the natural frequencies and mode shapes of the elastica can be obtained by solving a system of eighteen first-order differential equations with the shooting method. It is concluded that the planar one-point-contact deformation becomes unstable and evolves to a spatial deformation at a bifurcation point in both displacement and force control procedures.
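    The shooting method mentioned above can be illustrated on a much smaller problem than the paper's eighteen-equation system. This hypothetical example solves the boundary value problem y'' = -y, y(0) = 0, y(pi/2) = 1 by integrating with RK4 from a guessed initial slope and correcting the guess with secant iteration; the exact slope is 1.

```python
import math

def integrate(slope, h=1e-3):
    """RK4 integration of y'' = -y from x=0 to x=pi/2,
    with y(0)=0 and y'(0)=slope; returns y(pi/2)."""
    def f(y, v):               # first-order system: (y', v') = (v, -y)
        return v, -y
    x, y, v = 0.0, 0.0, slope
    end = math.pi / 2
    while x < end - 1e-12:
        s = min(h, end - x)
        k1y, k1v = f(y, v)
        k2y, k2v = f(y + s / 2 * k1y, v + s / 2 * k1v)
        k3y, k3v = f(y + s / 2 * k2y, v + s / 2 * k2v)
        k4y, k4v = f(y + s * k3y, v + s * k3v)
        y += s / 6 * (k1y + 2 * k2y + 2 * k3y + k4y)
        v += s / 6 * (k1v + 2 * k2v + 2 * k3v + k4v)
        x += s
    return y

def shoot(target=1.0, s0=0.5, s1=2.0, tol=1e-8):
    """Secant iteration on the unknown initial slope until the
    endpoint value y(pi/2) matches the boundary condition."""
    r0, r1 = integrate(s0) - target, integrate(s1) - target
    while abs(r1) > tol:
        s0, s1 = s1, s1 - r1 * (s1 - s0) / (r1 - r0)
        r0, r1 = r1, integrate(s1) - target
    return s1
```

    For the constrained elastica, the same idea applies with eighteen state variables and contact conditions replacing the single endpoint condition.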

  11. Differential characterization of emerging skin diseases of rainbow trout--a standardized approach to capturing disease characteristics and development of case definitions.

    PubMed

    Oidtmann, B; Lapatra, S E; Verner-Jeffreys, D; Pond, M; Peeler, E J; Noguera, P A; Bruno, D W; St-Hilaire, S; Schubiger, C B; Snekvik, K; Crumlish, M; Green, D M; Metselaar, M; Rodger, H; Schmidt-Posthaus, H; Galeotti, M; Feist, S W

    2013-11-01

    Farmed and wild salmonids are affected by a variety of skin conditions, some of which have significant economic and welfare implications. In many cases, the causes are not well understood, and one example is cold water strawberry disease of rainbow trout, also called red mark syndrome, which has been recorded in the UK since 2003. To date, there are no internationally agreed methods for describing these conditions, which has caused confusion for farmers and health professionals, who are often unclear as to whether they are dealing with a new or a previously described condition. This has resulted, inevitably, in delays to both accurate diagnosis and effective treatment regimes. Here, we provide a standardized methodology for the description of skin conditions of rainbow trout of uncertain aetiology. We demonstrate how the approach can be used to develop case definitions, using coldwater strawberry disease as an example. © 2013 Crown copyright.

  12. A maximum entropy thermodynamics of small systems.

    PubMed

    Dixit, Purushottam D

    2013-05-14

    We present a maximum entropy approach to analyze the state space of a small system in contact with a large bath, e.g., a solvated macromolecular system. For the solute, the fluctuations around the mean values of observables are not negligible and the probability distribution P(r) of the state space depends on the intricate details of the interaction of the solute with the solvent. Here, we employ a superstatistical approach: P(r) is expressed as a marginal distribution summed over the variation in β, the inverse temperature of the solute. The joint distribution P(β, r) is estimated by maximizing its entropy. We also calculate the first order system-size corrections to the canonical ensemble description of the state space. We test the development on a simple harmonic oscillator interacting with two baths with very different chemical identities, viz., (a) Lennard-Jones particles and (b) water molecules. In both cases, our method captures the state space of the oscillator sufficiently well. Future directions and connections with traditional statistical mechanics are discussed.
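    The superstatistical construction described above can be written schematically (notation assumed here, not taken verbatim from the paper) as

```latex
P(\mathbf{r}) \;=\; \int_0^\infty \mathrm{d}\beta \; P(\beta, \mathbf{r}),
\qquad
P(\beta, \mathbf{r}) \;\propto\; P(\beta)\,\frac{e^{-\beta E(\mathbf{r})}}{Z(\beta)} ,
```

    with the joint distribution P(β, r) fixed by maximizing its entropy subject to the relevant constraints; the canonical ensemble is recovered when P(β) collapses to a delta function at the bath's inverse temperature.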

  13. Quality improvement and practice-based research in neurology using the electronic medical record

    PubMed Central

    Frigerio, Roberta; Kazmi, Nazia; Meyers, Steven L.; Sefa, Meredith; Walters, Shaun A.; Silverstein, Jonathan C.

    2015-01-01

    Abstract We describe quality improvement and practice-based research using the electronic medical record (EMR) in a community health system–based department of neurology. Our care transformation initiative targets 10 neurologic disorders (brain tumors, epilepsy, migraine, memory disorders, mild traumatic brain injury, multiple sclerosis, neuropathy, Parkinson disease, restless legs syndrome, and stroke) and brain health (risk assessments and interventions to prevent Alzheimer disease and related disorders in targeted populations). Our informatics methods include building and implementing structured clinical documentation support tools in the EMR; electronic data capture; enrollment, data quality, and descriptive reports; quality improvement projects; clinical decision support tools; subgroup-based adaptive assignments and pragmatic trials; and DNA biobanking. We are sharing EMR tools and deidentified data with other departments toward the creation of a Neurology Practice-Based Research Network. We discuss practical points to assist other clinical practices to make quality improvements and practice-based research in neurology using the EMR a reality. PMID:26576324

  14. An Integrated Approach to Swept Wing Icing Simulation

    NASA Technical Reports Server (NTRS)

    Potapczuk, Mark G.; Broeren, Andy P.

    2017-01-01

    This paper describes the various elements of a simulation approach used to develop a database of ice shape geometries and the resulting aerodynamic performance data for a representative commercial transport wing model exposed to a variety of icing conditions. This effort included testing in the NASA Icing Research Tunnel, the Wichita State University Walter H. Beech Wind Tunnel, and the ONERA F1 Subsonic Wind Tunnel as well as the use of ice accretion codes, an inviscid design code, and computational fluid dynamics codes. Additionally, methods for capturing full three-dimensional ice shape geometries, geometry interpolation along the span of the wing, and creation of artificial ice shapes based upon that geometric data were developed for this effort. The icing conditions used for this effort were representative of actual ice shape encounter scenarios and run the gamut from ice roughness to full three-dimensional scalloped ice shapes. The effort is still underway so this paper is a status report of work accomplished to date and a description of the remaining elements of the effort.

  15. A Descriptive Morphometric Approach of the Skull and Mandible of the Common Opossum (Didelphis Marsupialis Linnaeus, 1758) in the Caribbean and its Clinical Application during Regional Anaesthesia.

    PubMed

    Mohamed, Reda

    2018-03-09

    The aim of this study was to determine the morphometric values of the skull and the mandible of the common opossum from the Caribbean island of Trinidad and Tobago. The skulls and mandibles were obtained from ten opossums captured for research purposes and were prepared and cleaned using standard methods. Selected anatomical landmarks of the skulls and mandibles were identified and measured. The results are useful for identification of the common opossum through comparison of our measurements with those of other marsupial species. Furthermore, the results have clinical importance for regional nerve blocks of the infraorbital, inferior alveolar, and mental nerves during dental extraction and head surgery. This study concluded that using the anatomical landmarks of the infraorbital and mental foramina will make it easier for veterinary surgeons to apply local anaesthetic agents for infraorbital, inferior alveolar, and mental nerve blocks.

  16. Weak Fault Feature Extraction of Rolling Bearings Based on an Improved Kurtogram

    PubMed Central

    Chen, Xianglong; Feng, Fuzhou; Zhang, Bingzhi

    2016-01-01

    Kurtograms have been verified to be an efficient tool in bearing fault detection and diagnosis because of their superiority in extracting transient features. However, the short-time Fourier Transform is insufficient for time-frequency analysis and kurtosis is deficient in detecting cyclic transients. Those factors weaken the performance of the original kurtogram in extracting weak fault features. Correlated Kurtosis (CK) was then designed as a more effective solution for detecting cyclic transients. The Redundant Second Generation Wavelet Packet Transform (RSGWPT) is deemed effective in capturing a more detailed local time-frequency description of the signal and in restricting the frequency aliasing components of the analysis results. In this manuscript, the authors combine CK with the RSGWPT to propose an improved kurtogram for extracting weak fault features from bearing vibration signals. Analysis of simulated signals and real application cases demonstrates that the proposed method is more accurate and effective in extracting weak fault features. PMID:27649171
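    A minimal sketch of Correlated Kurtosis may clarify why it responds to cyclic transients. The definition below follows the form commonly used in correlated-kurtosis deconvolution (shift order M, candidate period T in samples) and is not necessarily the authors' exact variant; the signals are synthetic.

```python
import numpy as np

def correlated_kurtosis(y, T, M=1):
    """M-shift Correlated Kurtosis of signal y at candidate period T:
    CK_M(T) = sum_n (prod_{m=0}^{M} y[n - m*T])^2 / (sum_n y[n]^2)^(M+1).
    Large values indicate impulsive transients repeating every T samples.
    Illustrative sketch only."""
    y = np.asarray(y, dtype=float)
    prod = y.copy()
    for m in range(1, M + 1):
        shifted = np.zeros_like(y)
        shifted[m * T:] = y[:-m * T]     # y delayed by m*T samples
        prod = prod * shifted
    return (prod ** 2).sum() / (y ** 2).sum() ** (M + 1)

rng = np.random.default_rng(0)
n, T = 2000, 50
noise = rng.standard_normal(n)
impulses = np.zeros(n)
impulses[::T] = 5.0                      # periodic transients, period T
ck_fault = correlated_kurtosis(noise + impulses, T)
ck_noise = correlated_kurtosis(noise, T)
```

    Unlike plain kurtosis, CK rewards transients aligned at the candidate period, which is why scanning T (and the filter band, in the kurtogram setting) localizes cyclic fault signatures.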

  17. A Descriptive Morphometric Approach of the Skull and Mandible of the Common Opossum (Didelphis Marsupialis Linnaeus, 1758) in the Caribbean and its Clinical Application during Regional Anaesthesia

    PubMed Central

    Mohamed, Reda

    2018-01-01

    The aim of this study was to determine the morphometric values of the skull and the mandible of the common opossum from the Caribbean island of Trinidad and Tobago. The skulls and mandibles were obtained from ten opossums captured for research purposes and were prepared and cleaned using standard methods. Selected anatomical landmarks of the skulls and mandibles were identified and measured. The results are useful for identification of the common opossum through comparison of our measurements with those of other marsupial species. Furthermore, the results have clinical importance for regional nerve blocks of the infraorbital, inferior alveolar, and mental nerves during dental extraction and head surgery. This study concluded that using the anatomical landmarks of the infraorbital and mental foramina will make it easier for veterinary surgeons to apply local anaesthetic agents for infraorbital, inferior alveolar, and mental nerve blocks. PMID:29522485

  18. A System for Rapidly and Accurately Collecting Patients’ Race and Ethnicity

    PubMed Central

    Baker, David W.; Cameron, Kenzie A.; Feinglass, Joseph; Thompson, Jason A.; Georgas, Patricia; Foster, Shawn; Pierce, Deborah; Hasnain-Wynia, Romana

    2006-01-01

    Objectives. We assessed the feasibility of collecting race/ethnicity data from patients using their own preferred racial/ethnic terms. Methods. A total of 424 patients described their race/ethnicity using their own categories, and we compared their descriptions with their responses to the questions (1) “Do you consider yourself Latino or Hispanic?” and (2) “Which category best describes your race?” (7 response options in our computer interview). We also determined patients’ preferences between the 2 approaches. Results. Rates of missing values and categorization as “other” race were lower than with the closed questions. Agreement between racial/ethnic categorization with open-ended and closed responses was 93% (κ = 0.88). Latino/Hispanic and multiracial/multiethnic individuals were more likely to prefer using their own categories to describe their race/ethnicity. Conclusions. Collecting race/ethnicity data using patients’ own racial/ethnic categories is feasible with the use of computerized systems to capture verbatim responses and results in lower rates of missing and unusable data than do standard questions. PMID:16449590
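    The κ = 0.88 agreement statistic reported above is Cohen's kappa, which corrects raw percent agreement for the agreement expected by chance. A short sketch with hypothetical category labels:

```python
from collections import Counter

def cohens_kappa(a, b):
    """Cohen's kappa for two equal-length lists of categorical codes:
    (observed agreement - chance agreement) / (1 - chance agreement)."""
    assert len(a) == len(b)
    n = len(a)
    p_obs = sum(x == y for x, y in zip(a, b)) / n
    ca, cb = Counter(a), Counter(b)
    # chance agreement from the two raters' marginal category frequencies
    p_exp = sum(ca[k] * cb.get(k, 0) for k in ca) / n ** 2
    return (p_obs - p_exp) / (1 - p_exp)
```

    Here the two lists would hold each patient's categorization from the open-ended and closed questions, respectively.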

  19. Transparency, usability, and reproducibility: Guiding principles for improving comparative databases using primates as examples.

    PubMed

    Borries, Carola; Sandel, Aaron A; Koenig, Andreas; Fernandez-Duque, Eduardo; Kamilar, Jason M; Amoroso, Caroline R; Barton, Robert A; Bray, Joel; Di Fiore, Anthony; Gilby, Ian C; Gordon, Adam D; Mundry, Roger; Port, Markus; Powell, Lauren E; Pusey, Anne E; Spriggs, Amanda; Nunn, Charles L

    2016-09-01

    Recent decades have seen rapid development of new analytical methods to investigate patterns of interspecific variation. Yet these cutting-edge statistical analyses often rely on data of questionable origin, varying accuracy, and weak comparability, which seem to have reduced the reproducibility of studies. It is time to improve the transparency of comparative data while also making these improved data more widely available. We, the authors, met to discuss how transparency, usability, and reproducibility of comparative data can best be achieved. We propose four guiding principles: 1) data identification with explicit operational definitions and complete descriptions of methods; 2) inclusion of metadata that capture key characteristics of the data, such as sample size, geographic coordinates, and nutrient availability (for example, captive versus wild animals); 3) documentation of the original reference for each datum; and 4) facilitation of effective interactions with the data via user friendly and transparent interfaces. We urge reviewers, editors, publishers, database developers and users, funding agencies, researchers publishing their primary data, and those performing comparative analyses to embrace these standards to increase the transparency, usability, and reproducibility of comparative studies. © 2016 Wiley Periodicals, Inc.

  20. A survey of the prevalence of fatigue, its precursors and individual coping mechanisms among U.S. manufacturing workers.

    PubMed

    Lu, Lin; Megahed, Fadel M; Sesek, Richard F; Cavuoto, Lora A

    2017-11-01

    Advanced manufacturing has resulted in significant changes on the shop floor, influencing work demands and the working environment. The corresponding safety-related effects, including fatigue, have not been captured on an industry-wide scale. This paper presents results of a survey of U.S. manufacturing workers on the prevalence of fatigue, its root causes and significant factors, and the individual fatigue-coping methods adopted. The responses from 451 manufacturing employees were analyzed using descriptive data analysis, bivariate analysis and Market Basket Analysis. Of the respondents, 57.9% indicated that they were somewhat fatigued during the past week. They reported the ankles/feet, lower back and eyes as frequently affected body parts, and a lack of sleep, work stress and shift schedule as the top selected root causes of fatigue. In order to respond to fatigue when it is present, respondents reported coping by drinking caffeinated drinks, stretching/doing exercises and talking with coworkers. Frequent combinations of fatigue causes and individual coping methods were identified. These results may inform the design of fatigue monitoring and mitigation strategies and future research related to fatigue development. Copyright © 2017 Elsevier Ltd. All rights reserved.
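    Market Basket Analysis, used above to find frequent cause/coping combinations, reduces to computing support and confidence for candidate rules. A brute-force pairwise sketch with hypothetical item names (not the survey's actual coding):

```python
from itertools import combinations
from collections import Counter

def association_rules(transactions, min_support=0.3, min_conf=0.6):
    """Brute-force pairwise Market Basket Analysis: find rules A -> B
    among single items whose support (co-occurrence frequency) and
    confidence (co-occurrence / antecedent frequency) clear the given
    thresholds. Suitable only for small item sets; illustrative sketch."""
    n = len(transactions)
    item_counts = Counter()
    pair_counts = Counter()
    for t in transactions:
        items = sorted(set(t))
        item_counts.update(items)
        pair_counts.update(combinations(items, 2))
    rules = []
    for (a, b), c in pair_counts.items():
        support = c / n
        if support < min_support:
            continue
        for x, y in ((a, b), (b, a)):
            conf = c / item_counts[x]
            if conf >= min_conf:
                rules.append((x, y, support, conf))
    return rules

surveys = [
    {"lack_of_sleep", "caffeine"},
    {"lack_of_sleep", "caffeine", "stretching"},
    {"work_stress", "talking"},
    {"lack_of_sleep", "caffeine"},
]
rules = association_rules(surveys)
```

    Each returned tuple (antecedent, consequent, support, confidence) corresponds to a reported combination such as "lack of sleep -> caffeinated drinks".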

  1. Hot QCD equations of state and relativistic heavy ion collisions

    NASA Astrophysics Data System (ADS)

    Chandra, Vinod; Kumar, Ravindra; Ravishankar, V.

    2007-11-01

    We study two recently proposed equations of state obtained from high-temperature QCD and show how they can be adapted for making predictions for relativistic heavy ion collisions. The method involves extracting equilibrium distribution functions for quarks and gluons from the equation of state (EOS), which in turn allows a determination of the transport and other bulk properties of the quark-gluon plasma. Simultaneously, the method also yields a quasiparticle description of interacting quarks and gluons. The first EOS is perturbative in the QCD coupling constant and has contributions of O(g5). The second EOS is an improvement over the first, with contributions up to O[g6ln(1/g)]; it incorporates the nonperturbative hard thermal contributions. The interaction effects are shown to be captured entirely by the effective chemical potentials for the gluons and the quarks, in both cases. The chemical potential is seen to be highly sensitive to the EOS. As an application, we determine the screening lengths, which are, indeed, the most important diagnostics for QGP. The screening lengths are seen to behave drastically differently depending on the EOS considered and therefore yield a way to distinguish the two equations of state in heavy ion collisions.
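    In quasiparticle models of this type, the interacting distribution functions are commonly encoded through an effective fugacity (equivalently, an effective chemical potential); schematically, and not necessarily in the paper's exact notation,

```latex
f_{g,q}(p) \;=\; \frac{z_{g,q}\, e^{-\beta p}}{1 \mp z_{g,q}\, e^{-\beta p}},
\qquad z_{g,q} \;\equiv\; e^{\beta \mu_{g,q}} ,
```

    with the upper (Bose) sign for gluons and the lower (Fermi) sign for quarks. The Debye screening length then follows from momentum integrals over these distributions, which is why it is so sensitive to the EOS from which μ is extracted.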

  2. Utilising three-dimensional printing techniques when providing unique assistive devices: A case report.

    PubMed

    Day, Sarah Jane; Riley, Shaun Patrick

    2018-02-01

    The evolution of three-dimensional printing into prosthetics has opened conversations about the availability and cost of prostheses. This report will discuss how a prosthetic team incorporated additive manufacture techniques into the treatment of a patient with a partial hand amputation to create and test a unique assistive device which he could use to hold his French horn. Case description and methods: Using a process of shape capture, photogrammetry, computer-aided design and finite element analysis, a suitable assistive device was designed and tested. The design was fabricated using three-dimensional printing. Patient satisfaction was measured using a Pugh's Matrix™, and a cost comparison was made between the process used and traditional manufacturing. Findings and outcomes: Patient satisfaction was high. The three-dimensional printed devices were 56% cheaper to fabricate than a similar laminated device. Computer-aided design and three-dimensional printing proved to be an effective method for designing, testing and fabricating a unique assistive device. Clinical relevance CAD and 3D printing techniques can enable devices to be designed, tested and fabricated cheaper than when using traditional techniques. This may lead to improvements in quality and accessibility.

  3. Automated Microfluidic Filtration and Immunocytochemistry Detection System for Capture and Enumeration of Circulating Tumor Cells and Other Rare Cell Populations in Blood.

    PubMed

    Pugia, Michael; Magbanua, Mark Jesus M; Park, John W

    2017-01-01

    Isolation by size using a filter membrane offers an antigen-independent method for capturing rare cells present in blood of cancer patients. Multiple cell types, including circulating tumor cells (CTCs), captured on the filter membrane can be simultaneously identified via immunocytochemistry (ICC) analysis of specific cellular biomarkers. Here, we describe an automated microfluidic filtration method combined with a liquid handling system for sequential ICC assays to detect and enumerate non-hematologic rare cells in blood.

  4. Variable pressure ionization detector for gas chromatography

    DOEpatents

    Buchanan, Michelle V.; Wise, Marcus B.

    1988-01-01

    Method and apparatus for differentiating organic compounds based on their electron affinity. An electron capture detector cell (ECD) is operated at pressures ranging from atmospheric to less than 1 torr. Through variation of the pressure within the ECD cell, the organic compounds are induced to either capture or emit electrons. Differentiation of isomeric compounds can be obtained when, at a given pressure, one isomer is in the emission mode and the other is in the capture mode. Output of the ECD is recorded by chromatogram. The invention also includes a method for obtaining the zero-crossing pressure of a compound, defined as the pressure at which the competing emission and capture reactions are balanced and which may be correlated to the electron affinity of a compound.
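    In practice, a zero-crossing pressure like the one defined above can be estimated from a series of (pressure, detector response) measurements by locating the sign change and interpolating. A hypothetical sketch (the patent does not specify this procedure):

```python
def zero_crossing_pressure(pressures, signals):
    """Estimate the pressure at which the detector response changes
    sign (emission vs. capture modes balanced) by linear interpolation
    between the two bracketing measurements. Inputs must be sorted by
    pressure; returns None if no sign change is found."""
    for (p0, s0), (p1, s1) in zip(zip(pressures, signals),
                                  zip(pressures[1:], signals[1:])):
        if s0 == 0:
            return p0
        if s0 * s1 < 0:   # sign change between consecutive points
            return p0 + (p1 - p0) * (-s0) / (s1 - s0)
    return None
```

    For example, responses of -2 and +2 at 1 and 2 torr would place the zero-crossing pressure at 1.5 torr.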

  5. CO2 Capture Using Electric Fields: Low-Cost Electrochromic Film on Plastic for Net-Zero Energy Building

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2010-01-01

    Broad Funding Opportunity Announcement Project: Two faculty members at Lehigh University created a new technique called supercapacitive swing adsorption (SSA) that uses electrical charges to encourage materials to capture and release CO2. Current CO2 capture methods include expensive processes that involve changes in temperature or pressure. Lehigh University’s approach uses electric fields to improve the ability of inexpensive carbon sorbents to trap CO2. Because this process uses electric fields and not electric current, the overall energy consumption is projected to be much lower than conventional methods. Lehigh University is now optimizing the materials to maximize CO2 capture and minimize the energy needed for the process.

  6. Aspects of Scale Invariance in Physics and Biology

    NASA Astrophysics Data System (ADS)

    Alba, Vasyl

    We study three systems that have scale invariance. The first system is a conformal field theory in d > 3 dimensions. We prove that if there is a unique stress-energy tensor and at least one higher-spin conserved current in the theory, then the correlation functions of the stress-energy tensors and the conserved currents of higher-spin must coincide with one of the following possibilities: a) a theory of n free bosons, b) a theory of n free fermions or c) a theory of n (d-2)/2-forms. The second system is the primordial gravitational wave background in a theory with inflation. We show that the scale invariant spectrum of primordial gravitational waves is isotropic only in the zero-order approximation, and it gets a small correction due to the primordial scalar fluctuations. When anisotropy is measured experimentally, our result will allow us to distinguish between different inflationary models. The third system is a biological system. The question we are asking is whether there is some simplicity or universality underlying the complexities of natural animal behavior. We use the walking fruit fly (Drosophila melanogaster) as a model system. Based on the result that unsupervised flies' behaviors can be categorized into one hundred twenty-two discrete states (stereotyped movements), which all individuals from a single species visit repeatedly, we demonstrated that the sequences of states are strongly non-Markovian. In particular, correlations persist for an order of magnitude longer than expected from a model of random state-to-state transitions. The correlation function has a power-law decay, which is a hint of some kind of criticality in the system. We develop a generalization of the information bottleneck method that allows us to cluster these states into a small number of clusters. This more compact description preserves a lot of temporal correlation. We found that it is enough to use a two-cluster representation of the data to capture long-range correlations, which opens a way for a more quantitative description of the system. Use of the maximum entropy method allowed us to find a description that closely resembles the famous inverse-square Ising model in 1D in a small magnetic field.

  7. Policy capturing as a method of quantifying the determinants of landscape preference

    Treesearch

    Dennis B. Propst

    1979-01-01

    Policy Capturing, a potential methodology for evaluating landscape preference, was described and tested. This methodology results in a mathematical model that theoretically represents the human decision-making process. Under experimental conditions, judges were asked to express their preferences for scenes of the Blue Ridge Parkway. An equation which "captures,...

  8. A new capture fraction method to map how pumpage affects surface water flow.

    PubMed

    Leake, Stanley A; Reeves, Howard W; Dickinson, Jesse E

    2010-01-01

    All groundwater pumped is balanced by removal of water somewhere, initially from storage in the aquifer and later from capture in the form of increase in recharge and decrease in discharge. Capture that results in a loss of water in streams, rivers, and wetlands now is a concern in many parts of the United States. Hydrologists commonly use analytical and numerical approaches to study temporal variations in sources of water to wells for select points of interest. Much can be learned about coupled surface/groundwater systems, however, by looking at the spatial distribution of theoretical capture for select times of interest. Development of maps of capture requires (1) a reasonably well-constructed transient or steady state model of an aquifer with head-dependent flow boundaries representing surface water features or evapotranspiration and (2) an automated procedure to run the model repeatedly and extract results, each time with a well in a different location. This paper presents new methods for simulating and mapping capture using three-dimensional groundwater flow models and presents examples from Arizona, Oregon, and Michigan.
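
    The mapping idea above can be illustrated with a much simpler analytical stand-in for the paper's numerical models: the classical Glover-Balmer solution for the fraction of a well's pumping rate captured from a nearby stream in an idealized, homogeneous aquifer. The sketch below is our own illustration (not the authors' code), and the parameter values are invented.

```python
from math import erfc, sqrt

def stream_depletion_fraction(d, T, S, t):
    """Glover-Balmer analytical solution: fraction of a well's pumping
    rate that is captured from a fully penetrating stream.
    d: distance from well to stream (m)
    T: aquifer transmissivity (m^2/day)
    S: storativity (dimensionless)
    t: time since pumping began (days)
    """
    return erfc(sqrt(d * d * S / (4.0 * T * t)))

# A well close to the stream captures streamflow sooner than a distant one.
near = stream_depletion_fraction(d=100.0, T=500.0, S=0.1, t=365.0)
far = stream_depletion_fraction(d=2000.0, T=500.0, S=0.1, t=365.0)

# A capture "map" in this toy setting is just the fraction evaluated for a
# grid of candidate well locations (here, distances along a transect).
capture_map = [stream_depletion_fraction(d, 500.0, 0.1, 365.0)
               for d in (0.0, 500.0, 1000.0, 2000.0)]
```

    The paper's contribution is doing this with full three-dimensional models and head-dependent boundaries rather than this idealized geometry, but the output has the same interpretation: a capture fraction per candidate well location.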

  9. Synthesis of amino-rich silica coated magnetic nanoparticles and their application in the capture of DNA for PCR

    USDA-ARS?s Scientific Manuscript database

    Magnetic separation has great advantages over traditional bioseparation methods and has become popular in the development of methods for the detection of bacterial pathogens, viruses, and transgenic crops. Functionalization of magnetic nanoparticles is a key factor in allowing efficient capture of t...

  10. Examining the Use of Lecture Capture Technology: Implications for Teaching and Learning

    ERIC Educational Resources Information Center

    Groen, Jovan F.; Quigley, Brenna; Herry, Yves

    2016-01-01

    This study sought to provide a better understanding of how lecture capture technology is used by students and how its use is related to student satisfaction, attendance, and academic performance. Using a mixed method design with both quantitative and qualitative methods to collect data, instruments included a student questionnaire, interviews and…

  11. A successful trap design for capturing large terrestrial snakes

    Treesearch

    Shirley J. Burgdorf; D. Craig Rudolph; Richard N. Conner; Daniel Saenz; Richard R. Schaefer

    2005-01-01

    Large scale trapping protocols for snakes can be expensive and require large investments of personnel and time. Typical methods, such as pitfall and small funnel traps, are not useful or suitable for capturing large snakes. A method was needed to survey multiple blocks of habitat for the Louisiana Pine Snake (Pituophis ruthveni), throughout its...

  12. High purity microfluidic sorting and analysis of circulating tumor cells: towards routine mutation detection.

    PubMed

    Autebert, Julien; Coudert, Benoit; Champ, Jérôme; Saias, Laure; Guneri, Ezgi Tulukcuoglu; Lebofsky, Ronald; Bidard, François-Clément; Pierga, Jean-Yves; Farace, Françoise; Descroix, Stéphanie; Malaquin, Laurent; Viovy, Jean-Louis

    2015-05-07

    A new generation of the Ephesia cell capture technology optimized for CTC capture and genetic analysis is presented, characterized in depth and compared with the CellSearch system as a reference. This technology uses magnetic particles bearing tumour-cell specific EpCAM antibodies, self-assembled in a regular array in a microfluidic flow cell. 48,000 high aspect-ratio columns are generated using a magnetic field in a high throughput (>3 ml h⁻¹) device and act as sieves to specifically capture the cells of interest through antibody-antigen interactions. Using this device optimized for CTC capture and analysis, we demonstrated the capture of epithelial cells with capture efficiency above 90% for concentrations as low as a few cells per ml. We showed the high specificity of capture with only 0.26% of non-epithelial cells captured for concentrations above 10 million cells per ml. We investigated the capture behavior of cells in the device, and correlated the cell attachment rate with the EpCAM expression on the cell membranes for six different cell lines. We developed and characterized a two-step blood processing method to allow for rapid processing of 10 ml blood tubes in less than 4 hours, and showed a capture rate of 70% for as low as 25 cells spiked in 10 ml blood tubes, with less than 100 contaminating hematopoietic cells. Using this device and procedure, we validated our system on patient samples using an automated cell immunostaining procedure and a semi-automated cell counting method. Our device captured CTCs in 75% of metastatic prostate cancer patients and 80% of metastatic breast cancer patients, and showed similar or better results than the CellSearch device in 10 out of 13 samples. Finally, we demonstrated the possibility of detecting cancer-related PIK3CA gene mutation in 20 cells captured in the chip with a good correlation between the cell count and the quantitation value Cq of the post-capture qPCR.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arbanas, G; Dietrich, F S; Kerman, A K

    A method for computing direct-semidirect (DSD) neutron radiative capture is presented and applied to thermal neutron capture on ¹⁹F, ²⁷Al, ²⁸,²⁹,³⁰Si, ³⁵,³⁷Cl, ³⁹,⁴¹K, ⁵⁶Fe, and ²³⁸U, in support of the data evaluation effort at ORNL. The DSD method includes both direct and semidirect capture; the latter is a core-polarization term in which the giant dipole resonance is formed. We study the effects of a commonly used "density" approximation to the EM operator and find it to be unsatisfactory for the nuclei considered here. We also study the magnitude of semidirect capture relative to the pure direct capture. Furthermore, we compare our results with those obtained from another direct capture code (Tedca [17]). We also compare our results with those obtained from the analytical expression for external capture derived by Lane and Lynn [3], and its extension to include internal capture [7]. To estimate the effect of nuclear deformation on direct capture, we computed direct thermal capture on ²³⁸U with and without imposition of spherical symmetry. Direct capture for a spherically symmetric ²³⁸U was approximately 6 mb, while a quadrupole deformation of 0.215 on the shape of ²³⁸U lowers this cross section to approximately 2 mb. This result suggests that effects of nuclear deformation on direct capture warrant further study. We also find that the contribution to direct capture on ²³⁸U from the nuclear interior significantly cancels that coming from the exterior region, and hence both contributions must be taken into account. We reproduced a well-known discrepancy between the computed and observed branching ratios in ⁵⁶Fe(n,γ). This will lead us to revisit the concept of doorway states in the particle-hole model.

  14. From blackbirds to black holes: Investigating capture-recapture methods for time domain astronomy

    NASA Astrophysics Data System (ADS)

    Laycock, Silas G. T.

    2017-07-01

    In time domain astronomy, recurrent transients present a special problem: how to infer total populations from limited observations. Monitoring observations may give a biased view of the underlying population due to limitations on observing time, visibility and instrumental sensitivity. A similar problem exists in the life sciences, where animal populations (such as migratory birds) or disease prevalence must be estimated from sparse and incomplete data. The class of methods termed Capture-Recapture is used to reconstruct population estimates from time-series records of encounters with the study population. This paper investigates the performance of Capture-Recapture methods in astronomy via a series of numerical simulations. The Blackbirds code simulates monitoring of populations of transients, in this case accreting binary stars (neutron star or black hole accreting from a stellar companion), under a range of observing strategies. We first generate realistic light-curves for populations of binaries with contrasting orbital period distributions. These models are then randomly sampled at observing cadences typical of existing and planned monitoring surveys. The classical capture-recapture methods, the Lincoln-Petersen and Schnabel estimators, related techniques, and newer methods implemented in the Rcapture package are compared. A general exponential model based on the radioactive decay law is introduced which is demonstrated to recover (at 95% confidence) the underlying population abundance and duty cycle, in a fraction of the observing visits (10-50%) required to discover all the sources in the simulation. Capture-Recapture is a promising addition to the toolbox of time domain astronomy, and methods implemented in R by the biostats community can be readily called from within Python.
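
    The multi-visit (Schnabel) estimator mentioned above has a compact closed form: it pools catch and recapture counts over all sampling occasions. The sketch below is our own illustration with invented encounter counts, not output from the Blackbirds code.

```python
def schnabel_estimate(occasions):
    """Multi-sample capture-recapture (Schnabel) population estimate.
    occasions: list of (catch, recaptures) per sampling visit, where
    `recaptures` counts how many of the catch were already marked."""
    marked = 0            # individuals marked before the current visit (M_t)
    num, den = 0.0, 0.0
    for catch, recaptures in occasions:
        num += catch * marked         # sum of C_t * M_t
        den += recaptures             # sum of R_t
        marked += catch - recaptures  # newly marked individuals join the pool
    return num / den

# Invented encounter history: (catch, recaptures) per visit.
visits = [(30, 0), (40, 5), (35, 10), (45, 18)]
estimate = schnabel_estimate(visits)  # ≈ 228
```

    In the astronomical setting a "visit" is an observing epoch and a "capture" is a detection of a transient; the same arithmetic then estimates the total source population.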

  15. Delineating baseflow contribution areas for streams - A model and methods comparison

    NASA Astrophysics Data System (ADS)

    Chow, Reynold; Frind, Michael E.; Frind, Emil O.; Jones, Jon P.; Sousa, Marcelo R.; Rudolph, David L.; Molson, John W.; Nowak, Wolfgang

    2016-12-01

    This study addresses the delineation of areas that contribute baseflow to a stream reach, also known as stream capture zones. Such areas can be delineated using standard well capture zone delineation methods, with three important differences: (1) natural gradients are smaller compared to those produced by supply wells and are therefore subject to greater numerical errors, (2) stream discharge varies seasonally, and (3) stream discharge varies spatially. This study focuses on model-related uncertainties due to model characteristics, discretization schemes, delineation methods, and particle tracking algorithms. The methodology is applied to the Alder Creek watershed in southwestern Ontario. Four different model codes are compared: HydroGeoSphere, WATFLOW, MODFLOW, and FEFLOW. In addition, two delineation methods are compared: reverse particle tracking and reverse transport, where the latter considers local-scale parameter uncertainty by using a macrodispersion term to produce a capture probability plume. The results from this study indicate that different models can calibrate acceptably well to the same data and produce very similar distributions of hydraulic head, but can produce different capture zones. The stream capture zone is found to be highly sensitive to the particle tracking algorithm. It was also found that particle tracking by itself, if applied to complex systems such as the Alder Creek watershed, would require considerable subjective judgement in the delineation of stream capture zones. Reverse transport is an alternative and more reliable approach that provides probability intervals for the baseflow contribution areas, taking uncertainty into account. The two approaches can be used together to enhance the confidence in the final outcome.
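
    Reverse particle tracking, one of the two delineation methods compared above, amounts to stepping a particle backward through the model's velocity field from the stream reach. A toy sketch (our illustration; the uniform velocity field and step sizes are invented, whereas real delineations use the calibrated model's velocity solution):

```python
def velocity(x, y):
    """Hypothetical steady, uniform regional flow field (m/s)."""
    return (1.0e-5, 5.0e-6)

def track_backward(x, y, dt, n_steps):
    """Trace where water arriving at (x, y) came from by stepping
    against the flow (explicit Euler with negated velocity)."""
    path = [(x, y)]
    for _ in range(n_steps):
        vx, vy = velocity(x, y)
        x, y = x - vx * dt, y - vy * dt  # minus sign: reverse tracking
        path.append((x, y))
    return path

# Ten daily steps backward from a point on the stream at the origin.
path = track_backward(0.0, 0.0, dt=86400.0, n_steps=10)
```

    The endpoints of many such paths, seeded along the stream reach, outline the baseflow contribution area; the paper's point is that this outline is sensitive to the tracking algorithm, which motivates the complementary reverse-transport approach.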

  16. High capacity immobilized amine sorbents

    DOEpatents

    Gray, McMahan L [Pittsburgh, PA; Champagne, Kenneth J [Fredericktown, PA; Soong, Yee [Monroeville, PA; Filburn, Thomas [Granby, CT

    2007-10-30

    A method is provided for making low-cost CO2 sorbents that can be used in large-scale gas-solid processes. The improved method entails treating an amine to increase the number of secondary amine groups and impregnating the amine in a porous solid support. The method increases the CO2 capture capacity and decreases the cost of utilizing an amine-enriched solid sorbent in CO2 capture systems.

  17. RELATIVE CONTRIBUTIONS OF THREE DESCRIPTIVE METHODS: IMPLICATIONS FOR BEHAVIORAL ASSESSMENT

    PubMed Central

    Pence, Sacha T; Roscoe, Eileen M; Bourret, Jason C; Ahearn, William H

    2009-01-01

    This study compared the outcomes of three descriptive analysis methods—the ABC method, the conditional probability method, and the conditional and background probability method—to each other and to the results obtained from functional analyses. Six individuals who had been diagnosed with developmental delays and exhibited problem behavior participated. Functional analyses indicated that participants' problem behavior was maintained by social positive reinforcement (n = 2), social negative reinforcement (n = 2), or automatic reinforcement (n = 2). Results showed that for all but 1 participant, descriptive analysis outcomes were similar across methods. In addition, for all but 1 participant, the descriptive analysis outcome differed substantially from the functional analysis outcome. This supports the general finding that descriptive analysis is a poor means of determining functional relations. PMID:19949536
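
    The core logic of the conditional versus background probability comparison can be shown with a toy event log (the data and field names below are invented for illustration, not taken from the study):

```python
# Each tuple records one observation interval:
# (problem_behavior_occurred, attention_followed)
log = [
    (True, True), (True, True), (True, False), (True, True),
    (False, False), (False, True), (False, False), (False, False),
]

# Conditional probability: how often attention follows problem behavior.
behavior_intervals = [attn for (beh, attn) in log if beh]
conditional_p = sum(behavior_intervals) / len(behavior_intervals)

# Background probability: how often attention occurs at all.
background_p = sum(attn for (_, attn) in log) / len(log)
```

    A conditional probability well above the background rate suggests (but, as the study shows, does not demonstrate) a contingency between the behavior and the consequence.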

  18. Estimating the number of HIV-infected injection drug users in Bangkok: a capture--recapture method.

    PubMed

    Mastro, T D; Kitayaporn, D; Weniger, B G; Vanichseni, S; Laosunthorn, V; Uneklabh, T; Uneklabh, C; Choopanya, K; Limpakarnjanarat, K

    1994-07-01

    The purpose of the study was to estimate the number of injection drug users infected with the human immunodeficiency virus (HIV) in Bangkok to allow planning for health services for this population. A two-sample capture-recapture method was used. The first capture listed all persons on methadone treatment for opiate addiction from April 17 through May 17, 1991, at 18 facilities in Bangkok. The second capture involved urine testing of persons held at 72 Bangkok police stations from June 3 through September 30, 1991. Persons whose urine tests were positive for opiate metabolites or methadone were included on the second list. The first capture comprised 4064 persons and the recapture 1540 persons. There were 171 persons included on both lists, yielding an estimate of 36,600 opiate users in Bangkok. Existing data indicate that 89% of opiate users in Bangkok inject drugs and that about one third are infected with HIV, yielding an estimate of approximately 12,000 HIV-infected injection drug users in Bangkok in 1991. During the 1990s the number of cases of acquired immunodeficiency syndrome (AIDS) and other HIV-related diseases, including tuberculosis, in the population of HIV-infected injection drug users in Bangkok will increase dramatically, placing new demands on existing health care facilities. The capture-recapture method may be useful in estimating difficult-to-count populations, including injection drug users.
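
    The arithmetic of the two-sample method is the standard Lincoln-Petersen estimator; a minimal sketch using the counts published in the abstract (the function name is ours):

```python
def lincoln_petersen(n1, n2, m):
    """Two-sample capture-recapture estimate of total population size.
    n1: size of the first capture (marked) sample
    n2: size of the second (recapture) sample
    m:  number of individuals appearing in both samples"""
    return n1 * n2 / m

# Counts reported in the study: 4064 persons on methadone rolls,
# 1540 opiate-positive detainees, 171 persons on both lists.
estimate = lincoln_petersen(4064, 1540, 171)  # ≈ 36,600 opiate users
```

    The estimate rests on the usual assumptions of a closed population and equal, independent capture probability in both samples, which the authors address when extrapolating to HIV-infected injection drug users.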

  19. A Class of High-Resolution Explicit and Implicit Shock-Capturing Methods

    NASA Technical Reports Server (NTRS)

    Yee, H. C.

    1994-01-01

    The development of shock-capturing finite difference methods for hyperbolic conservation laws has been a rapidly growing area for the last decade. Many of the fundamental concepts, state-of-the-art developments and applications to fluid dynamics problems can only be found in meeting proceedings, scientific journals and internal reports. This paper attempts to give a unified and generalized formulation of a class of high-resolution, explicit and implicit shock-capturing methods, and to illustrate their versatility in various steady and unsteady complex shock waves, perfect gases, equilibrium real gases and nonequilibrium flow computations. These numerical methods are formulated for the purpose of easy and efficient implementation into a practical computer code. The various constructions of high-resolution shock-capturing methods fall nicely into the present framework, and a computer code can be implemented with the various methods as separate modules. Included is a systematic overview of the basic design principles of the various related numerical methods. Special emphasis is on the construction of the basic nonlinear, spatially second- and third-order schemes for nonlinear scalar hyperbolic conservation laws and the methods of extending these nonlinear scalar schemes to nonlinear systems via approximate Riemann solvers and flux-vector splitting approaches. Generalization of these methods to efficiently include real gases and large systems of nonequilibrium flows is discussed. Some extensions of these methods to hyperbolic conservation laws containing stiff source terms and shock waves are also included. The performance of some of these schemes is illustrated by numerical examples for one-, two- and three-dimensional gas-dynamics problems. The use of the Lax-Friedrichs numerical flux to obtain high-resolution shock-capturing schemes is generalized. This method can be extended to nonlinear systems of equations without the use of Riemann solvers or flux-vector splitting approaches and thus provides large savings for multidimensional, equilibrium real gas and nonequilibrium flow computations.
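
    As a toy illustration of the Lax-Friedrichs numerical flux mentioned above (our own first-order sketch, not the paper's high-resolution formulation), here is one step of a conservative scheme for the inviscid Burgers equation on a periodic domain. Because the scheme is in conservation form, the discrete integral of u is preserved.

```python
import math

def lax_friedrichs_step(u, dx, dt):
    """One explicit step of u_t + f(u)_x = 0 with f(u) = u^2/2 (Burgers),
    periodic boundaries, using the Lax-Friedrichs numerical flux."""
    n = len(u)
    f = [0.5 * v * v for v in u]
    # Flux at interface i+1/2: average of the physical fluxes minus a
    # dissipation term proportional to the jump in u.
    flux = [0.5 * (f[i] + f[(i + 1) % n])
            - 0.5 * (dx / dt) * (u[(i + 1) % n] - u[i]) for i in range(n)]
    # Conservative update: cell i loses flux[i] and gains flux[i-1].
    return [u[i] - (dt / dx) * (flux[i] - flux[i - 1]) for i in range(n)]

n, dx, dt = 100, 0.01, 0.004           # CFL number max|u|*dt/dx = 0.4
u = [math.sin(2 * math.pi * i * dx) for i in range(n)]
total0 = sum(u) * dx                   # discrete integral before stepping
for _ in range(50):
    u = lax_friedrichs_step(u, dx, dt)
```

    This first-order scheme is very dissipative at shocks; the high-resolution schemes surveyed in the paper add limited second- and third-order corrections while keeping the same conservative structure.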

  20. Bacteriophage-based nanoprobes for rapid bacteria separation

    NASA Astrophysics Data System (ADS)

    Chen, Juhong; Duncan, Bradley; Wang, Ziyuan; Wang, Li-Sheng; Rotello, Vincent M.; Nugen, Sam R.

    2015-10-01

    The lack of practical methods for bacterial separation remains a hindrance for the low-cost and successful development of rapid detection methods from complex samples. Antibody-tagged magnetic particles are commonly used to pull analytes from a liquid sample. While this method is well-established, improvements in capture efficiencies would result in an increase of the overall detection assay performance. Bacteriophages represent a low-cost and more consistent biorecognition element as compared to antibodies. We have developed nanoscale bacteriophage-tagged magnetic probes, where T7 bacteriophages were bound to magnetic nanoparticles. The nanoprobe allowed the specific recognition and attachment to E. coli cells. The phage magnetic nanoprobes were directly compared to antibody-conjugated magnetic nanoprobes. The capture efficiencies of bacteriophages and antibodies on nanoparticles for the separation of E. coli K12 at varying concentrations were determined. The results indicated a similar bacteria capture efficiency between the two nanoprobes.

  1. The ambiguity of simplicity in quantum and classical simulation

    NASA Astrophysics Data System (ADS)

    Aghamohammadi, Cina; Mahoney, John R.; Crutchfield, James P.

    2017-04-01

    A system's perceived simplicity depends on whether it is represented classically or quantally. This is not so surprising, as classical and quantum physics are descriptive frameworks built on different assumptions that capture, emphasize, and express different properties and mechanisms. What is surprising is that, as we demonstrate, simplicity is ambiguous: the relative simplicity between two systems can change sign when moving between classical and quantum descriptions. Here, we associate simplicity with small model-memory. We see that the notions of absolute physical simplicity at best form a partial, not a total, order. This suggests that appeals to principles of physical simplicity, via Ockham's Razor or to the "elegance" of competing theories, may be fundamentally subjective. Recent rapid progress in quantum computation and quantum simulation suggests that the ambiguity of simplicity will strongly impact statistical inference and, in particular, model selection.

  2. Scene incongruity and attention.

    PubMed

    Mack, Arien; Clarke, Jason; Erol, Muge; Bert, John

    2017-02-01

    Does scene incongruity (a mismatch between scene gist and a semantically incongruent object) capture attention and lead to conscious perception? We explored this question using four different procedures: inattention (Experiment 1), scene description (Experiment 2), change detection (Experiment 3), and iconic memory (Experiment 4). We found no differences between scene incongruity and scene congruity in Experiments 1, 2, and 4, although in Experiment 3 change detection was faster for scenes containing an incongruent object. We offer an explanation for why the change detection results differ from the results of the other three experiments. In all four experiments, participants invariably failed to report the incongruity and routinely mis-described it by normalizing the incongruent object. None of the results supports the claim that semantic incongruity within a scene invariably captures attention; rather, they provide strong evidence of the dominant role of scene gist in determining what is perceived. Copyright © 2016 Elsevier Inc. All rights reserved.

  3. Confirmation of the identity of the type host of the louse Halipeurus fallacis (Phthiraptera: Philopteridae).

    PubMed

    Palma, Ricardo L

    2018-04-09

    Alexander (1954: 489) recorded a petrel (Aves: Procellariiformes) captured alive on board a ship in the Indian Ocean by Mr W.W.A. Phillips who, after removing some lice, liberated it the following morning. Alexander (1954) identified that petrel as the species "Pterodroma aterrima Bonaparte", now placed in the genus Pseudobulweria. The lice were kept in the collection of the then British Museum (Natural History), now the Natural History Museum, London, England. Jouanin (1955) published a new species of petrel from the Indian Ocean as Bulweria fallax. Jouanin (1957: 19) discussed the identity of the petrel identified by Alexander (1954) as Pterodroma aterrima, stating that the descriptive data given by Alexander (1954) did not clearly fit either P. aterrima or B. fallax. However, considering the geographical coordinates where the bird was captured, Jouanin (1957) believed it was more likely Bulweria fallax.

  4. What is the valence of Mn in GaMnN?

    NASA Astrophysics Data System (ADS)

    Nelson, Ryky; Berlijn, Tom; Moreno, Juana; Jarrell, Mark; Ku, Wei

    2014-03-01

    Motivated by the potential high Curie temperature of GaMnN, we investigate the controversial Mn valence in this diluted magnetic semiconductor. From a first-principles Wannier function analysis of the high-energy Hilbert space we find unambiguously the charge state of Mn to be close to 2+ (d5), but in a mixed spin configuration with an average magnetic moment of 4 μB. Using more extended Wannier orbitals to capture the lower-energy physics, we further demonstrate the feasibility of both the effective d4 description (appropriate to deal with the local magnetic moment and Jahn-Teller distortion) and the effective d5 description (relevant to study long-range magnetic order). Our derivation highlights the general richness of low-energy sectors in interacting many-body systems and the generic need for multiple effective descriptions, and advocates for a diminished relevance of atomic valence measured by various experimental probes. This research is supported in part by LA-SiGMA, NSF Award Number #EPS-1003897. TB was supported by DOE CMCSN and as a Wigner Fellow at the Oak Ridge National Laboratory.

  5. Relative efficiency of anuran sampling methods in a restinga habitat (Jurubatiba, Rio de Janeiro, Brazil).

    PubMed

    Rocha, C F D; Van Sluys, M; Hatano, F H; Boquimpani-Freitas, L; Marra, R V; Marques, R V

    2004-11-01

    Studies on anurans in restinga habitats are few and, as a result, there is little information on which methods are more efficient for sampling them in this environment. Ten methods are usually used for sampling anuran communities in tropical and sub-tropical areas. In this study we evaluate which methods are more appropriate for this purpose in the restinga environment of Parque Nacional da Restinga de Jurubatiba. We analyzed six methods among those usually used for anuran sampling. For each method, we recorded the total amount of time spent (in min.), the number of researchers involved, and the number of species captured. We calculated a capture efficiency index (time necessary for a researcher to capture an individual frog) in order to make the data obtained comparable. Of the methods analyzed, the species inventory (9.7 min/searcher/individual [MSI]; richness = 6; abundance = 23) and the breeding site survey (9.5 MSI; richness = 4; abundance = 22) were the most efficient. The visual encounter inventory (45.0 MSI) and patch sampling (65.0 MSI) methods were of comparatively lower efficiency in the restinga, whereas the plot sampling and the pit-fall traps with drift-fence methods resulted in no frog capture. We conclude that there is a considerable difference in the efficiency of methods used in the restinga environment and that the complete species inventory method is highly efficient for sampling frogs in the restinga studied and may be so in other restinga environments. Methods that are usually efficient in forested areas seem to be of little value in open restinga habitats.

  6. Microscopic approach based on a multiscale algebraic version of the resonating group model for radiative capture reactions

    NASA Astrophysics Data System (ADS)

    Solovyev, Alexander S.; Igashov, Sergey Yu.

    2017-12-01

    A microscopic approach to the description of radiative capture reactions, based on a multiscale algebraic version of the resonating group model, is developed. The main idea of the approach is to expand the wave functions of the discrete spectrum and continuum for a nuclear system over different bases of the algebraic version of the resonating group model. These bases differ from each other by the values of the oscillator radius, which plays the role of a scale parameter. This allows us, in a unified way, to calculate total and partial cross sections (astrophysical S factors) as well as the branching ratio for the radiative capture reaction, to describe phase shifts for the colliding nuclei in the initial channel of the reaction, and at the same time to reproduce breakup thresholds of the final nucleus. The approach is applied to the theoretical study of the mirror 3H(α,γ)7Li and 3He(α,γ)7Be reactions, which are of great interest to nuclear astrophysics. The calculated results are compared with existing experimental data and with our previous calculations in the framework of the single-scale algebraic version of the resonating group model.

  7. Characterization of chromosomal architecture in Arabidopsis by chromosome conformation capture

    PubMed Central

    2013-01-01

    Background The packaging of long chromatin fibers in the nucleus poses a major challenge, as it must fulfill both physical and functional requirements. Until recently, insights into the chromosomal architecture of plants were mainly provided by cytogenetic studies. Complementary to these analyses, chromosome conformation capture technologies promise to refine and improve our view on chromosomal architecture and to provide a more generalized description of nuclear organization. Results Employing circular chromosome conformation capture, this study describes chromosomal architecture in Arabidopsis nuclei from a genome-wide perspective. Surprisingly, the linear organization of chromosomes is reflected in the genome-wide interactome. In addition, we study the interplay of the interactome and epigenetic marks and report that the heterochromatic knob on the short arm of chromosome 4 maintains a pericentromere-like interaction profile and interactome despite its euchromatic surrounding. Conclusion Despite the extreme condensation that is necessary to pack the chromosomes into the nucleus, the Arabidopsis genome appears to be packed in a predictive manner, according to the following criteria: heterochromatin and euchromatin represent two distinct interactomes; interactions between chromosomes correlate with the linear position on the chromosome arm; and distal chromosome regions have a higher potential to interact with other chromosomes. PMID:24267747

  8. ODE Constrained Mixture Modelling: A Method for Unraveling Subpopulation Structures and Dynamics

    PubMed Central

    Hasenauer, Jan; Hasenauer, Christine; Hucho, Tim; Theis, Fabian J.

    2014-01-01

    Functional cell-to-cell variability is ubiquitous in multicellular organisms as well as bacterial populations. Even genetically identical cells of the same cell type can respond differently to identical stimuli. Methods have been developed to analyse heterogeneous populations, e.g., mixture models and stochastic population models. The available methods are, however, either incapable of simultaneously analysing different experimental conditions or are computationally demanding and difficult to apply. Furthermore, they do not account for biological information available in the literature. To overcome disadvantages of existing methods, we combine mixture models and ordinary differential equation (ODE) models. The ODE models provide a mechanistic description of the underlying processes while mixture models provide an easy way to capture variability. In a simulation study, we show that the class of ODE constrained mixture models can unravel the subpopulation structure and determine the sources of cell-to-cell variability. In addition, the method provides reliable estimates for kinetic rates and subpopulation characteristics. We use ODE constrained mixture modelling to study NGF-induced Erk1/2 phosphorylation in primary sensory neurones, a process relevant in inflammatory and neuropathic pain. We propose a mechanistic pathway model for this process and reconstructed static and dynamical subpopulation characteristics across experimental conditions. We validate the model predictions experimentally, which verifies the capabilities of ODE constrained mixture models. These results illustrate that ODE constrained mixture models can reveal novel mechanistic insights and possess a high sensitivity. PMID:24992156
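    The core idea, component locations pinned by an ODE solution while a mixture weight captures the subpopulation structure, can be sketched as follows. This is a minimal hypothetical illustration, not the paper's model: a first-order decay ODE with two invented rate constants, Gaussian measurement noise, and an EM update for the single unknown subpopulation weight.

```python
import numpy as np

rng = np.random.default_rng(0)

# ODE: dx/dt = -k*x, with analytic solution x(t) = x0 * exp(-k*t)
def ode_solution(t, k, x0=1.0):
    return x0 * np.exp(-k * t)

# Two hypothetical subpopulations with different kinetic rates
k1, k2 = 0.5, 2.0
w_true = 0.3          # fraction of cells in subpopulation 1 (invented)
t_snap = 1.0          # snapshot measurement time
sigma = 0.05          # assumed Gaussian measurement noise

# Simulate single-cell measurements at the snapshot time
n = 5000
labels = rng.random(n) < w_true
x = np.where(labels, ode_solution(t_snap, k1), ode_solution(t_snap, k2))
data = x + sigma * rng.standard_normal(n)

# Mixture model: component means are fixed by the ODE solutions,
# so only the subpopulation weight w is estimated (EM iterations).
mu1, mu2 = ode_solution(t_snap, k1), ode_solution(t_snap, k2)

def gauss(d, mu):
    return np.exp(-0.5 * ((d - mu) / sigma) ** 2)

w = 0.5
for _ in range(200):
    r = w * gauss(data, mu1)
    r = r / (r + (1 - w) * gauss(data, mu2))  # E-step: responsibilities
    w = r.mean()                              # M-step: weight update

print(f"true weight {w_true}, estimated {w:.3f}")
```

    In the full method, the rate constants would be estimated jointly with the weights rather than fixed, and multiple experimental conditions would share the same mechanistic parameters.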

  9. Understanding and Leveraging a Supplier’s CMMI Efforts: A Guidebook for Acquirers (Revised for V1.3)

    DTIC Science & Technology

    2011-09-01

    and SCAMPI (e.g., ISO/IEC 15288, 12207, 15504; ISO 9001, EIA 632, and IEEE 1220), or if there are processes to be implemented that are not captured...process descriptions and tailoring as well as any formal audit results. ISO 9001:2008 is a quality management standard for development created and...maintained by the International Organisation for Standardisation (ISO). The American National Standard equivalent is ANSI/ISO/ASQ Q9001-2008

  10. History of neuropsychological study of sport-related concussion.

    PubMed

    Webbe, Frank M; Zimmer, Adam

    2015-01-01

    Although the medical literature has a long history of description and comment on concussion, the occurrence of concussion within the context of sports other than boxing was not judged to be problematic until the 1980s. Neuropsychological assessment played a critical and integral role in identifying the cognitive sequelae of concussion and mapping out the short- and long-term vagaries in recovery. This paper captures that history and expands upon current applications of neuropsychological assessment in the diagnosis and management of sport-related concussion.

  11. Complex Critical Exponents for Percolation Transitions in Josephson-Junction Arrays, Antiferromagnets, and Interacting Bosons

    NASA Astrophysics Data System (ADS)

    Fernandes, Rafael M.; Schmalian, Jörg

    2011-02-01

    We show that the critical behavior of the XY quantum-rotor model undergoing a percolation transition is dramatically affected by its topological Berry phase 2πρ. In particular, for irrational ρ, its low-energy excitations emerge as spinless fermions with fractal spectrum. As a result, critical properties not captured by the usual Ginzburg-Landau-Wilson description of phase transitions arise, such as complex critical exponents, log-periodic oscillations and dynamically broken scale invariance.

  12. Monte Carlo Treatment of Displacement Damage in Bandgap Engineered HgCdTe Detectors

    NASA Technical Reports Server (NTRS)

    Fodness, Bryan C.; Marshall, Paul W.; Reed, Robert A.; Jordan, Thomas M.; Pickel, James C.; Jun, Insoo; Xapsos, Michael A.; Burke, Edward A.

    2003-01-01

    The conclusions are: 1. Description of NIEL calculations for short-, mid-, and longwave HgCdTe material compositions. 2. Full recoil spectra details captured and analyzed; the importance of variance in high-Z materials is noted. 3. Can be applied directly to calculate damage distributions in arrays. 4. Future work will provide comparisons of measured array damage with calculated NIEL and damage energy distributions. 5. The technique for assessing the full recoil spectrum behavior is extendable to other materials.

  13. Inpatient Behavioral Health Recapture: A Business Case Analysis at Evans Army Community Hospital, Fort Carson, Colorado

    DTIC Science & Technology

    2009-07-20

    behavioral health unit. Description / Sq. Ft. / Quantity / Total Sq. Ft.: Patient Rooms (2 Bed), 400, 3, 1200; Patient Rooms (1 Bed), 305, 4, 1220; Staff Lounge ...overhead expense as captured in EAS IV repository (support services E account). According to the medical EACH Inpt Psych 38 expense and performance...Retrieved on May 7, 2009 from http://www.nytimes.com/2008/11/21/us/21army.html. Hart, S. E., & Connors, R. E. (1996). Resourcing decision model for

  14. Hydrodynamics of the Dirac spectrum

    DOE PAGES

    Liu, Yizhuang; Warchoł, Piotr; Zahed, Ismail

    2015-12-15

    We discuss a hydrodynamical description of the eigenvalues of the Dirac spectrum in even dimensions in the vacuum and in the large N (volume) limit. The linearized hydrodynamics supports sound waves. The hydrodynamical relaxation of the eigenvalues is captured by a hydrodynamical (tunneling) minimum configuration which follows from a pertinent form of Euler equation. As a result, the relaxation from a phase of unbroken chiral symmetry to a phase of broken chiral symmetry occurs over a time set by the speed of sound.

  15. The Physician's Workstation: Recording a Physical Examination Using a Controlled Vocabulary

    PubMed Central

    Cimino, James J.; Barnett, G. Octo

    1987-01-01

    A system has been developed which runs on MS-DOS personal computers and serves as an experimental model of a physician's workstation. The program provides an interface to a controlled vocabulary which allows rapid selection of appropriate terms and modifiers for entry of clinical information. Because it captures patient descriptions, it has the ability to serve as an intermediary between the physician and computer-based medical knowledge resources. At present, the vocabulary permits rapid, reliable representation of cardiac physical examination findings.

  16. System and process for capture of acid gasses at elevated pressure from gaseous process streams

    DOEpatents

    Heldebrant, David J.; Koech, Phillip K.; Linehan, John C.; Rainbolt, James E.; Bearden, Mark D.; Zheng, Feng

    2016-09-06

    A system, method, and material that enable the pressure-activated, reversible chemical capture of acid gasses such as CO2 from gas volumes such as streams, flows, or any other volume. Once the acid gas is chemically captured, the resulting product, typically a zwitterionic salt, can be subjected to a reduced pressure, whereupon it will release the captured acid gas and the capture material will be regenerated. The invention includes this process as well as the materials and systems for carrying out and enabling this process.

  17. Numerical study on determining formation porosity using a boron capture gamma ray technique and MCNP.

    PubMed

    Liu, Juntao; Zhang, Feng; Wang, Xinguang; Han, Fei; Yuan, Zhelong

    2014-12-01

    Formation porosity can be determined from the near-to-far detector counting ratio of boron capture gamma rays in a pulsed neutron-gamma element logging tool. The thermal neutron distribution, boron capture gamma spectroscopy, and porosity response for formations with different water salinities and wellbore diameters were simulated using the Monte Carlo method. We found that a boron lining improves the signal-to-noise ratio and that the boron capture gamma ray counting ratio has a higher sensitivity for determining porosity than the total capture gamma ray count. Copyright © 2014 Elsevier Ltd. All rights reserved.
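    As a hypothetical sketch of how such a counting ratio translates into porosity, a calibration table can be interpolated; the ratio values below are invented for illustration (in practice the calibration would come from Monte Carlo simulation or laboratory formation models, as in the study):

```python
import numpy as np

# Hypothetical calibration: near/far boron capture gamma counting ratio
# versus formation porosity (percent), monotonically increasing.
porosity_cal = np.array([0.0, 5.0, 10.0, 20.0, 30.0, 40.0])   # porosity, %
ratio_cal    = np.array([1.8, 2.1, 2.5, 3.4, 4.5, 5.8])       # near/far counts

def porosity_from_ratio(ratio):
    """Interpolate porosity from a measured near/far counting ratio."""
    return np.interp(ratio, ratio_cal, porosity_cal)

measured_ratio = 3.0
print(f"estimated porosity: {porosity_from_ratio(measured_ratio):.1f}%")
```

    A real tool would also correct the ratio for wellbore diameter and water salinity before the lookup, which is why those effects are simulated in the study.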

  18. Experimental validation of docking and capture using space robotics testbeds

    NASA Technical Reports Server (NTRS)

    Spofford, John

    1991-01-01

    Docking concepts include capture, berthing, and docking. The definitions of these terms, consistent with AIAA, are as follows: (1) capture (grasping)--the use of a manipulator to make initial contact and attachment between transfer vehicle and a platform; (2) berthing--positioning of a transfer vehicle or payload into platform restraints using a manipulator; and (3) docking--propulsive mechanical connection between vehicle and platform. The combination of the capture and berthing operations is effectively the same as docking; i.e., capture (grasping) + berthing = docking. These concepts are discussed in terms of Martin Marietta's ability to develop validation methods using robotics testbeds.

  19. Estimation of the size of the female sex worker population in Rwanda using three different methods.

    PubMed

    Mutagoma, Mwumvaneza; Kayitesi, Catherine; Gwiza, Aimé; Ruton, Hinda; Koleros, Andrew; Gupta, Neil; Balisanga, Helene; Riedel, David J; Nsanzimana, Sabin

    2015-10-01

    HIV prevalence is disproportionately high among female sex workers compared to the general population. Many African countries lack useful data on the size of female sex worker populations to inform national HIV programmes. A female sex worker size estimation exercise using three different venue-based methodologies was conducted among female sex workers in all provinces of Rwanda in August 2010. The female sex worker national population size was estimated using capture-recapture and enumeration methods, and the multiplier method was used to estimate the size of the female sex worker population in Kigali. A structured questionnaire was also used to supplement the data. The estimated number of female sex workers by the capture-recapture method was 3205 (95% confidence interval: 2998-3412). The female sex worker population size was estimated at 3348 using the enumeration method. In Kigali, the female sex worker population size was estimated at 2253 (95% confidence interval: 1916-2524) using the multiplier method. Nearly 80% of all female sex workers in Rwanda were found to be based in the capital, Kigali. This study provided a first-time estimate of the female sex worker population size in Rwanda using capture-recapture, enumeration, and multiplier methods. The capture-recapture and enumeration methods provided similar estimates of the female sex worker population in Rwanda. Combining such size estimation methods is feasible and productive in low-resource settings and should be considered vital to inform national HIV programmes. © The Author(s) 2015.
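    Two of the estimators used above are simple enough to sketch directly. The counts below are invented for illustration (they are not the study's raw data, though they are chosen to land near its reported estimates):

```python
# Lincoln-Petersen capture-recapture estimator and the multiplier method,
# two standard population size estimation techniques.

def lincoln_petersen(n1, n2, m):
    """n1 individuals marked on the first capture, n2 caught on the
    second capture, of whom m were recaptured (already marked)."""
    return n1 * n2 / m

def multiplier(service_count, proportion_using_service):
    """Population size = count from a service register divided by the
    proportion of the population reporting use of that service."""
    return service_count / proportion_using_service

# Hypothetical counts for illustration
print(lincoln_petersen(400, 500, 62))   # ≈ 3226
print(multiplier(900, 0.4))             # ≈ 2250
```

    The Lincoln-Petersen estimator assumes a closed population and equal capture probability across both rounds; violations of either assumption bias the estimate, which is one reason the study triangulated across three methods.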

  20. Effective Fingerprint Quality Estimation for Diverse Capture Sensors

    PubMed Central

    Xie, Shan Juan; Yoon, Sook; Shin, Jinwook; Park, Dong Sun

    2010-01-01

    Recognizing the quality of fingerprints in advance can be beneficial for improving the performance of fingerprint recognition systems. The representative features for assessing the quality of fingerprint images are known to vary across types of capture sensors. In this paper, an effective quality estimation system that can be adapted for different types of capture sensors is designed by modifying and combining a set of features including orientation certainty, local orientation quality, and consistency. The proposed system extracts basic features, and generates next-level features which are applicable to various types of capture sensors. The system then uses a Support Vector Machine (SVM) classifier to determine whether or not an image should be accepted as input to the recognition system. The experimental results show that the proposed method performs better than previous methods in terms of accuracy. In addition, the proposed method is able to eliminate residual images from optical and capacitive sensors, and coarse images from thermal sensors. PMID:22163632
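    As an illustration of one such basic feature, the sketch below computes orientation certainty as the eigenvalue ratio of the gradient covariance matrix of an image block. This is a generic formulation, not the paper's implementation, and the SVM classification stage is omitted:

```python
import numpy as np

def orientation_certainty(block):
    """Orientation certainty of an image block: ratio of the smaller to
    the larger eigenvalue of the gradient covariance matrix. Values near
    0 indicate one strongly dominant orientation (clear ridge flow);
    values near 1 indicate no dominant orientation (noise)."""
    gy, gx = np.gradient(block.astype(float))   # gradients along rows, cols
    a = (gx * gx).mean()
    b = (gy * gy).mean()
    c = (gx * gy).mean()
    trace, det = a + b, a * b - c * c
    disc = np.sqrt(max(trace * trace / 4 - det, 0.0))
    lmax, lmin = trace / 2 + disc, trace / 2 - disc
    return lmin / lmax if lmax > 0 else 1.0

# A synthetic "ridge" block (single strong orientation) vs. random noise
y = np.arange(32)
ridges = np.tile(np.sin(y / 2.0), (32, 1)).T    # horizontal ridge pattern
noise = np.random.default_rng(0).random((32, 32))
print(orientation_certainty(ridges), orientation_certainty(noise))
```

    A full quality estimator would compute this per block over the whole image and combine it with the other features before classification.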
