Science.gov

Sample records for multiple method integration

  1. Multiple methods integration for structural mechanics analysis and design

    NASA Technical Reports Server (NTRS)

    Housner, J. M.; Aminpour, M. A.

    1991-01-01

A new research area of multiple methods integration is proposed for joining diverse methods of structural mechanics analysis which interact with one another. Three categories of multiple methods are defined: those in which a physical interface is well defined; those in which a physical interface is not well defined, but selected; and those in which the interface is a mathematical transformation. Two fundamental integration procedures are presented that can be extended to integrate various methods (e.g., finite elements, Rayleigh-Ritz, Galerkin, and integral methods) with one another. Since the finite element method will likely be the major method to be integrated, its enhanced robustness under element distortion is also examined and a new robust shell element is demonstrated.

  2. Method and system of integrating information from multiple sources

    DOEpatents

    Alford, Francine A.; Brinkerhoff, David L.

    2006-08-15

    A system and method of integrating information from multiple sources in a document centric application system. A plurality of application systems are connected through an object request broker to a central repository. The information may then be posted on a webpage. An example of an implementation of the method and system is an online procurement system.

  3. Integrating Multiple Teaching Methods into a General Chemistry Classroom.

    ERIC Educational Resources Information Center

    Francisco, Joseph S.; Nicoll, Gayle; Trautmann, Marcella

    1998-01-01

    Four different methods of teaching--cooperative learning, class discussions, concept maps, and lectures--were integrated into a freshman-level general chemistry course to compare students' levels of participation. Findings support the idea that multiple modes of learning foster the metacognitive skills necessary for mastering general chemistry.…

  4. A multistage gene normalization system integrating multiple effective methods.

    PubMed

    Li, Lishuang; Liu, Shanshan; Li, Lihua; Fan, Wenting; Huang, Degen; Zhou, Huiwei

    2013-01-01

    Gene/protein recognition and normalization is an important preliminary step for many biological text mining tasks. In this paper, we present a multistage gene normalization system which consists of four major subtasks: pre-processing, dictionary matching, ambiguity resolution and filtering. For the first subtask, we apply the gene mention tagger developed in our earlier work, which achieves an F-score of 88.42% on the BioCreative II GM testing set. In the stage of dictionary matching, the exact matching and approximate matching between gene names and the EntrezGene lexicon have been combined. For the ambiguity resolution subtask, we propose a semantic similarity disambiguation method based on Munkres' Assignment Algorithm. At the last step, a filter based on Wikipedia has been built to remove the false positives. Experimental results show that the presented system can achieve an F-score of 90.1%, outperforming most of the state-of-the-art systems. PMID:24349160
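The ambiguity-resolution step described above can be viewed as an assignment problem. A minimal sketch, assuming SciPy's `linear_sum_assignment` (an implementation of the Munkres/Hungarian algorithm) and a toy similarity matrix in place of the paper's semantic-similarity scores:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Toy similarity matrix: rows = ambiguous gene mentions, columns = candidate
# EntrezGene identifiers; the scores are hypothetical, for illustration only.
similarity = np.array([
    [0.9, 0.2, 0.1],
    [0.3, 0.8, 0.4],
    [0.1, 0.5, 0.7],
])

# Munkres' algorithm finds the one-to-one assignment maximizing total
# similarity (scipy minimizes cost, so negate the scores).
rows, cols = linear_sum_assignment(-similarity)
for mention, candidate in zip(rows, cols):
    print(f"mention {mention} -> candidate {candidate} "
          f"(similarity {similarity[mention, candidate]:.1f})")
```

Here the diagonal assignment wins because it maximizes the summed similarity (2.4) over all permutations.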

  6. Numerical solution of optimal control problems using multiple-interval integral Gegenbauer pseudospectral methods

    NASA Astrophysics Data System (ADS)

    Tang, Xiaojun

    2016-04-01

    The main purpose of this work is to provide multiple-interval integral Gegenbauer pseudospectral methods for solving optimal control problems. The latest developed single-interval integral Gauss/(flipped Radau) pseudospectral methods can be viewed as special cases of the proposed methods. We present an exact and efficient approach to compute the mesh pseudospectral integration matrices for the Gegenbauer-Gauss and flipped Gegenbauer-Gauss-Radau points. Numerical results on benchmark optimal control problems confirm the ability of the proposed methods to obtain highly accurate solutions.
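The multiple-interval idea can be sketched with Gauss-Legendre nodes (the Gegenbauer family with parameter 1/2): quadrature is applied per mesh interval and the pieces are summed. The mesh, node count, and test integrand below are arbitrary illustrative choices, not those of the paper:

```python
import numpy as np

def multi_interval_gauss(f, mesh, n):
    """Integrate f over the consecutive intervals defined by `mesh`
    using n-point Gauss-Legendre quadrature on each interval."""
    nodes, weights = np.polynomial.legendre.leggauss(n)
    total = 0.0
    for a, b in zip(mesh[:-1], mesh[1:]):
        # affine map of the standard nodes from [-1, 1] to [a, b]
        x = 0.5 * (b - a) * nodes + 0.5 * (a + b)
        total += 0.5 * (b - a) * np.dot(weights, f(x))
    return total

# Integrate sin over [0, pi] on a 3-interval mesh: the exact value is 2.
approx = multi_interval_gauss(np.sin, np.linspace(0.0, np.pi, 4), 8)
print(approx)  # agrees with 2 to near machine precision
```

For smooth integrands the error decays spectrally in the per-interval node count, which is the appeal of pseudospectral discretizations of the control problem's integral operators.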

  7. Jacobian integration method increases the statistical power to measure gray matter atrophy in multiple sclerosis☆

    PubMed Central

    Nakamura, Kunio; Guizard, Nicolas; Fonov, Vladimir S.; Narayanan, Sridar; Collins, D. Louis; Arnold, Douglas L.

    2013-01-01

    Gray matter atrophy provides important insights into neurodegeneration in multiple sclerosis (MS) and can be used as a marker of neuroprotection in clinical trials. Jacobian integration is a method for measuring volume change that uses integration of the local Jacobian determinants of the nonlinear deformation field registering two images, and is a promising tool for measuring gray matter atrophy. Our main objective was to compare the statistical power of the Jacobian integration method to commonly used methods in terms of the sample size required to detect a treatment effect on gray matter atrophy. We used multi-center longitudinal data from relapsing–remitting MS patients and evaluated combinations of cross-sectional and longitudinal pre-processing with SIENAX/FSL, SPM, and FreeSurfer, as well as the Jacobian integration method. The Jacobian integration method outperformed these other commonly used methods, reducing the required sample size by a factor of 4–5. The results demonstrate the advantage of using the Jacobian integration method to assess neuroprotection in MS clinical trials. PMID:24266007

  8. Noisy Speech Recognition Based on Integration/Selection of Multiple Noise Suppression Methods Using Noise GMMs

    NASA Astrophysics Data System (ADS)

    Kitaoka, Norihide; Hamaguchi, Souta; Nakagawa, Seiichi

To achieve high recognition performance for a wide variety of noise and for a wide range of signal-to-noise ratios, this paper presents methods for integrating four noise reduction algorithms: spectral subtraction with smoothing in the time direction, temporal-domain SVD-based speech enhancement, GMM-based speech estimation, and KLT-based comb filtering. We propose two ways of combining the noise suppression algorithms: selection of the front-end processor and combination of results from multiple recognition processes. Recognition results on the CENSREC-1 task show the effectiveness of the proposed methods.

  9. System and method for integrating and accessing multiple data sources within a data warehouse architecture

    DOEpatents

    Musick, Charles R.; Critchlow, Terence; Ganesh, Madhaven; Slezak, Tom; Fidelis, Krzysztof

    2006-12-19

    A system and method is disclosed for integrating and accessing multiple data sources within a data warehouse architecture. The metadata formed by the present method provide a way to declaratively present domain specific knowledge, obtained by analyzing data sources, in a consistent and useable way. Four types of information are represented by the metadata: abstract concepts, databases, transformations and mappings. A mediator generator automatically generates data management computer code based on the metadata. The resulting code defines a translation library and a mediator class. The translation library provides a data representation for domain specific knowledge represented in a data warehouse, including "get" and "set" methods for attributes that call transformation methods and derive a value of an attribute if it is missing. The mediator class defines methods that take "distinguished" high-level objects as input and traverse their data structures and enter information into the data warehouse.

  10. A graphical model method for integrating multiple sources of genome-scale data

    PubMed Central

    Dvorkin, Daniel; Biehs, Brian; Kechris, Katerina

    2016-01-01

Making effective use of multiple data sources is a major challenge in modern bioinformatics. Genome-wide data such as measures of transcription factor binding, gene expression, and sequence conservation are used to identify binding regions and genes that are important to major biological processes such as development and disease. These heterogeneous data types can be difficult to use together because of their different biological meanings and statistical distributions, yet each can provide valuable information for understanding the processes under study. Here we present methods for integrating multiple data sources to gain a more complete picture of gene regulation and expression. Our goal is to identify genes and cis-regulatory regions which play specific biological roles. We describe a graphical mixture model approach for data integration, examine the effect of using different model topologies, and discuss methods for evaluating the effectiveness of the models. Model fitting is computationally efficient and produces results which have clear biological and statistical interpretations. The Hedgehog and Dorsal signaling pathways in Drosophila, which are critical in embryonic development, are used as examples. PMID:23934610
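In its simplest form, a graphical mixture model for one data type reduces to EM for a two-component Gaussian mixture. A minimal one-dimensional sketch with synthetic scores (the two-component setup, the seed, and all parameters are illustrative assumptions, not the paper's model topology):

```python
import numpy as np

rng = np.random.default_rng(42)
# Synthetic genome-wide scores: "background" regions near 0 and
# "functional" regions near 3 (both values hypothetical).
data = np.concatenate([rng.normal(0.0, 1.0, 800), rng.normal(3.0, 1.0, 200)])

# EM for a two-component Gaussian mixture.
w, mu, sigma = np.array([0.5, 0.5]), np.array([-1.0, 1.0]), np.array([1.0, 1.0])
for _ in range(200):
    # E-step: posterior responsibility of each component for each point.
    dens = w * np.exp(-0.5 * ((data[:, None] - mu) / sigma) ** 2) / sigma
    resp = dens / dens.sum(axis=1, keepdims=True)
    # M-step: update weights, means, and standard deviations.
    nk = resp.sum(axis=0)
    w = nk / len(data)
    mu = (resp * data[:, None]).sum(axis=0) / nk
    sigma = np.sqrt((resp * (data[:, None] - mu) ** 2).sum(axis=0) / nk)

print(sorted(mu))  # component means recovered near 0 and 3
```

The paper's models couple several such likelihoods (binding, expression, conservation) through a shared latent "functional" indicator; the E/M updates generalize directly.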

  11. Determination of cloud effective particle size from the multiple-scattering effect on lidar integration-method temperature measurements.

    PubMed

    Reichardt, Jens; Reichardt, Susanne

    2006-04-20

A method is presented that permits the determination of cloud effective particle size from Raman- or Rayleigh-integration temperature measurements; it exploits the dependence of the multiple-scattering contributions to the lidar signals from heights above the cloud on the particle size of the cloud. Independent temperature information is needed for the determination of size. Using Raman-integration temperatures, the technique is applied to cirrus measurements. The magnitude of the multiple-scattering effect and the above-cloud lidar signal strength limit the method's range of applicability to cirrus optical depths from 0.1 to 0.5. Our work implies that records of stratospheric temperature obtained with lidar may be affected by multiple scattering in clouds up to heights of 30 km and beyond. PMID:16633433

  12. Method for Visually Integrating Multiple Data Acquisition Technologies for Real Time and Retrospective Analysis

    NASA Technical Reports Server (NTRS)

    Bogart, Edward H. (Inventor); Pope, Alan T. (Inventor)

    2000-01-01

A system for displaying multiple physiological measurements on a single video display terminal is provided. A subject is monitored by a plurality of instruments which feed data to a computer programmed to receive the data, calculate data products such as index of engagement and heart rate, and display the data simultaneously in a graphical format on a single video display terminal. In addition, live video representing the view of the subject and the experimental setup may be integrated into the single data display. The display may be recorded on a standard video tape recorder for retrospective analysis.

  13. Accurate and efficient Nyström volume integral equation method for the Maxwell equations for multiple 3-D scatterers

    NASA Astrophysics Data System (ADS)

    Chen, Duan; Cai, Wei; Zinser, Brian; Cho, Min Hyung

    2016-09-01

In this paper, we develop an accurate and efficient Nyström volume integral equation (VIE) method for the Maxwell equations for a large number of 3-D scatterers. The Cauchy principal values that arise from the VIE are computed accurately using a finite-size exclusion volume together with explicit correction integrals consisting of removable singularities. Also, the hyper-singular integrals are computed using interpolated quadrature formulae with tensor-product quadrature nodes for cubes, spheres, and cylinders, which are frequently encountered in the design of meta-materials. The resulting Nyström VIE method is shown to have high accuracy with a small number of collocation points and demonstrates p-convergence for computing the electromagnetic scattering of these objects. Numerical calculations of multiple scatterers of cubic, spherical, and cylindrical shapes validate the efficiency and accuracy of the proposed method.

  14. Hybrid MCDA Methods to Integrate Multiple Ecosystem Services in Forest Management Planning: A Critical Review

    NASA Astrophysics Data System (ADS)

    Uhde, Britta; Andreas Hahn, W.; Griess, Verena C.; Knoke, Thomas

    2015-08-01

Multi-criteria decision analysis (MCDA) is a decision aid frequently used in the field of forest management planning. It includes the evaluation of multiple criteria such as the production of timber and non-timber forest products and tangible as well as intangible values of ecosystem services (ES). Hence, it is beneficial compared to methods that take a purely financial perspective. Accordingly, MCDA methods are increasingly popular in the wide field of sustainability assessment. Hybrid approaches aggregate MCDA with, potentially, other decision-making techniques to combine their individual benefits, leading to a more holistic view of the actual consequences of certain decisions. This review provides a comprehensive overview of hybrid approaches used in forest management planning. Today, the scientific world faces increasing challenges regarding the evaluation of ES and the trade-offs between them, for example between provisioning and regulating services. As the preferences of multiple stakeholders are essential to improve the decision process in multi-purpose forestry, participatory and hybrid approaches turn out to be of particular importance. Accordingly, hybrid methods show great potential for becoming most relevant in future decision making. Based on the review presented here, the development of models for use in planning processes should focus on participatory modeling and the consideration of uncertainty regarding available information.

  16. Integration of Multiple Field Methods in Characterizing a Field Site with Bayesian Inverse Modeling

    NASA Astrophysics Data System (ADS)

    Savoy, H.; Dietrich, P.; Osorio-Murillo, C. A.; Kalbacher, T.; Kolditz, O.; Ames, D. P.; Rubin, Y.

    2014-12-01

    A hydraulic property of a field can be expressed as a space random function (SRF), and the parameters of that SRF can be constrained by the Method of Anchored Distributions (MAD). MAD is a general Bayesian inverse modeling technique that quantifies the uncertainty of SRF parameters by integrating various direct local data along with indirect non-local data. An example is given with a high-resolution 3D aquifer analog with known hydraulic conductivity (K) and porosity (n) at every location. MAD is applied using different combinations of simulated measurements of K, n, and different scales of hydraulic head that represent different field methods. The ln(K) and n SRF parameters are characterized with each of the method combinations to assess the influence of the methods on the SRFs and their implications. The forward modeling equations are solved by the numerical modeling software OpenGeoSys (opengeosys.org) and MAD is applied with the software MAD# (mad.codeplex.com). The inverse modeling results are compared to the aquifer analog for success evaluation. The goal of the study is to show how integrating combinations of multi-scale and multi-type measurements from the field via MAD can be used to reduce the uncertainty in field-scale SRFs, as well as point values, of hydraulic properties.

  17. Numerical integration of a relativistic two-body problem via a multiple scales method

    NASA Astrophysics Data System (ADS)

    Abouelmagd, Elbaz I.; Elshaboury, S. M.; Selim, H. H.

    2016-01-01

    We offer an analytical study on the dynamics of a two-body problem perturbed by small post-Newtonian relativistic term. We prove that, while the angular momentum is not conserved, the motion is planar. We also show that the energy is subject to small changes due to the relativistic effect. We also offer a periodic solution to this problem, obtained by a method based on the separation of time scales. We demonstrate that our solution is more general than the method developed in the book by Brumberg (Essential Relativistic Celestial Mechanics, Hilger, Bristol, 1991). The practical applicability of this model may be in studies of the long-term evolution of relativistic binaries (neutron stars or black holes).

  18. Formulation of an explicit-multiple-time-step time integration method for use in a global primitive equation grid model

    NASA Technical Reports Server (NTRS)

    Chao, W. C.

    1982-01-01

    With appropriate modifications, a recently proposed explicit-multiple-time-step scheme (EMTSS) is incorporated into the UCLA model. In this scheme, the linearized terms in the governing equations that generate the gravity waves are split into different vertical modes. Each mode is integrated with an optimal time step, and at periodic intervals these modes are recombined. The other terms are integrated with a time step dictated by the CFL condition for low-frequency waves. This large time step requires a special modification of the advective terms in the polar region to maintain stability. Test runs for 72 h show that EMTSS is a stable, efficient and accurate scheme.
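The split-by-mode, per-mode-time-step idea can be illustrated with a toy subcycling scheme; the two harmonic modes, the symplectic-Euler updates, and the step ratio below are illustrative assumptions, not the UCLA model's formulation:

```python
import numpy as np

def subcycled_oscillators(w_slow, w_fast, dt, steps, ratio):
    """Integrate two harmonic modes x'' = -w**2 * x with symplectic Euler,
    giving the fast mode a time step dt/ratio -- a minimal stand-in for
    splitting gravity-wave modes by speed and recombining periodically."""
    xs, vs = 1.0, 0.0   # slow mode state
    xf, vf = 1.0, 0.0   # fast mode state
    for _ in range(steps):
        # slow mode: one large step, limited only by the slow-wave CFL
        vs -= w_slow**2 * xs * dt
        xs += vs * dt
        # fast mode: `ratio` small steps, after which the modes are
        # "recombined" (here trivially, since the toy modes are uncoupled)
        for _ in range(ratio):
            vf -= w_fast**2 * xf * (dt / ratio)
            xf += vf * (dt / ratio)
    return xs, xf

xs, xf = subcycled_oscillators(w_slow=1.0, w_fast=20.0, dt=0.05,
                               steps=200, ratio=20)
# The exact solutions are cos(w * t) at t = 10; both modes stay bounded.
print(xs, xf)
```

Without subcycling, the fast mode would need the small step globally; with it, the expensive small step is confined to the fast vertical modes, which is the efficiency argument of the scheme.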

  19. Multiple detectors "Influence Method".

    PubMed

    Rios, I J; Mayer, R E

    2016-05-01

The "Influence Method" is conceived for the absolute determination of a nuclear particle flux in the absence of known detector efficiency and without the need to register coincidences of any kind. The method exploits the influence of the presence of one detector on the count rate of another detector when they are placed one behind the other, and defines statistical estimators for the absolute number of incident particles and for the efficiency (Rios and Mayer, 2015a). Its detailed mathematical description was recently published (Rios and Mayer, 2015b), and its practical implementation in the measurement of a moderated neutron flux arising from an isotopic neutron source was exemplified in (Rios and Mayer, 2016). With the objective of further reducing the measurement uncertainties, in this article we extend the method to the case of multiple detectors placed one behind the other. The new estimators for the number of particles and the detection efficiency are herein derived. PMID:26943904
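The flavor of such estimators can be sketched for an idealized two-detector tandem (an illustrative assumption, not the authors' exact derivation): if the front detector registers and absorbs each particle with probability eff, and the rear detector sees only the survivors with the same efficiency, the two count rates determine both the flux and the efficiency:

```python
import numpy as np

rng = np.random.default_rng(7)
N_true, eff_true = 1_000_000, 0.3  # hypothetical incident particles, efficiency

# Idealized tandem geometry: front detector registers and absorbs each
# particle with probability eff; the rear detector sees the survivors.
c1 = rng.binomial(N_true, eff_true)           # front-detector counts
c2 = rng.binomial(N_true - c1, eff_true)      # rear-detector counts

# Moment estimators from E[c1] = N*eff and E[c2] = N*(1 - eff)*eff:
eff_hat = 1.0 - c2 / c1
N_hat = c1 / eff_hat

print(eff_hat, N_hat)  # close to 0.3 and 1,000,000
```

No coincidence logic is needed: the "influence" of the front detector on the rear count rate is what carries the efficiency information.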

  20. Fast and Broadband Signal Integrity Analysis of Multiple Vias in Heterogeneous 3D IC and Die-Level Packaging by Using Generalized Foldy-Lax Scattering Method

    NASA Astrophysics Data System (ADS)

    Chang, Xin

This dissertation proposal concerns the use of fast and broadband full-wave electromagnetic methods for modeling high-speed interconnects (e.g., vertical vias and horizontal traces) and passive components (e.g., decoupling capacitors) in PCB and package structures, in 3D IC, die-level packaging, and SIW-based devices, to effectively model the signal integrity (SI) and power integrity (PI) aspects of the designs. The main contribution of this thesis is a novel methodology that hybridizes a fast full-wave method based on the Foldy-Lax multiple scattering equations, method of moments (MoM) based 1D technology, geometry decomposition based on mode decoupling, and cavity mode expansions, to model and simulate electromagnetic scattering effects of irregular power/ground planes, multiple vias, and traces, for fast and accurate link-level simulation of multilayer electronic structures. In this modeling, the interior massively coupled multiple-via problem is modeled nearly analytically using the Foldy-Lax multiple scattering equations. The dyadic Green's functions of the magnetic field are expressed in terms of waveguide modes in the vertical direction and vector cylindrical wave expansions or cavity mode expansions in the horizontal direction, combined with 2D MoM realized by 1D technology. For the incident field in the case of vias in an arbitrarily shaped antipad in a finite large cavity/waveguide, the exciting and scattering field coefficients are calculated based on a transformation that converts the surface integration of magnetic surface currents in the antipad into a 1D line integration of surface charges on the vias and on the ground plane. The geometry decomposition method is applied to model and integrate both the vertical and horizontal interconnects/traces in arbitrarily shaped power/ground planes. Moreover, a new form of the multiple scattering equations is derived for solving coupling effects among mixed metallic

  1. Integral 3D display using multiple LCDs

    NASA Astrophysics Data System (ADS)

    Okaichi, Naoto; Miura, Masato; Arai, Jun; Mishina, Tomoyuki

    2015-03-01

The quality of the integral 3D images created by a 3D imaging system was improved by combining multiple LCDs to utilize a greater number of pixels than is possible with one LCD. A prototype of the display device was constructed using four HD LCDs. An integral photography (IP) image displayed by the prototype is four times larger than that reconstructed by a single display. The pixel pitch of the HD display used is 55.5 μm, and the number of elemental lenses is 212 horizontally and 119 vertically. The 3D image pixel count is 25,228, and the viewing angle is 28°. Since this method is extensible, it is possible to display an integral 3D image of higher quality by increasing the number of LCDs. Using this integral 3D display structure makes it possible to make the whole device thinner than a projector-based display system. It is therefore expected to be applied to home television in the future.

  2. REVA DATA INTEGRATION METHODS

    EPA Science Inventory

    The core of the research effort in the Regional Vulnerability Assessment Program (ReVA) is a set of data integration methods ranging from simple overlays to complex multivariate statistics. These methods are described in the EPA publication titled, "Regional Vulnerability Assess...

  3. Multiple-stage integrating accelerometer

    DOEpatents

    Devaney, H.F.

    1984-06-27

    An accelerometer assembly is provided for use in activating a switch in response to multiple acceleration pulses in series. The accelerometer includes a housing forming a chamber. An inertial mass or piston is slidably disposed in the chamber and spring biased toward a first or reset position. A damping system is also provided to damp piston movement in response to first and subsequent acceleration pulses. Additionally, a cam, including a Z-shaped slot, and cooperating follower pin slidably received therein are mounted to the piston and the housing. The middle or cross-over leg of the Z-shaped slot cooperates with the follower pin to block or limit piston movement and prevent switch activation in response to a lone acceleration pulse. The switch of the assembly is only activated after two or more separate acceleration pulses are sensed and the piston reaches the end of the chamber opposite the reset position.

  4. Multiple-stage integrating accelerometer

    DOEpatents

    Devaney, Howard F.

    1986-01-01

    An accelerometer assembly is provided for use in activating a switch in response to multiple acceleration pulses in series. The accelerometer includes a housing forming a chamber. An inertial mass or piston is slidably disposed in the chamber and spring biased toward a first or reset position. A damping system is also provided to damp piston movement in response to first and subsequent acceleration pulses. Additionally, a cam, including a Z-shaped slot, and cooperating follower pin slidably received therein are mounted to the piston and the housing. The middle or cross-over leg of the Z-shaped slot cooperates with the follower pin to block or limit piston movement and prevent switch activation in response to a lone acceleration pulse. The switch of the assembly is only activated after two or more separate acceleration pulses are sensed and the piston reaches the end of the chamber opposite the reset position.

  5. Interstitial integrals in the multiple-scattering model

    SciTech Connect

    Swanson, J.R.; Dill, D.

    1982-08-15

    We present an efficient method for the evaluation of integrals involving multiple-scattering wave functions over the interstitial region. Transformation of the multicenter interstitial wave functions to a single center representation followed by a geometric projection reduces the integrals to products of analytic angular integrals and numerical radial integrals. The projection function, which has the value 1 in the interstitial region and 0 elsewhere, has a closed-form partial-wave expansion. The method is tested by comparing its results with exact normalization and dipole integrals; the differences are 2% at worst and typically less than 1%. By providing an efficient means of calculating Coulomb integrals, the method allows treatment of electron correlations using a multiple scattering basis set.

  6. Applying Quadrature Rules with Multiple Nodes to Solving Integral Equations

    SciTech Connect

    Hashemiparast, S. M.; Avazpour, L.

    2008-09-01

There are many procedures for the numerical solution of Fredholm integral equations, the main concern being the accuracy of the solution. In this paper, we use Gaussian quadrature with multiple nodes to improve the solution of these integral equations. The application of the method is illustrated via some examples; the related tables are given at the end.
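The quadrature-based solution procedure can be sketched with the classical Nyström method, using standard Gauss-Legendre quadrature as a simple stand-in for the paper's multiple-node rules; the kernel and right-hand side below are chosen so the exact solution is known:

```python
import numpy as np

# Solve the Fredholm equation of the second kind
#     u(x) = f(x) + \int_0^1 K(x, t) u(t) dt
# by collocating at quadrature nodes and replacing the integral
# with the quadrature sum (the Nystrom method).
K = lambda x, t: x * t
f = lambda x: 2.0 * x / 3.0   # chosen so that the exact solution is u(x) = x

n = 10
nodes, weights = np.polynomial.legendre.leggauss(n)
x = 0.5 * (nodes + 1.0)       # map nodes from [-1, 1] to [0, 1]
w = 0.5 * weights

# Linear system (I - K W) u = f at the nodes.
A = np.eye(n) - K(x[:, None], x[None, :]) * w[None, :]
u = np.linalg.solve(A, f(x))

print(np.max(np.abs(u - x)))  # ~0: the quadrature is exact for this kernel
```

For this degenerate polynomial kernel the Gauss rule integrates the exact solution's integrand exactly, so the Nyström solution matches u(x) = x to machine precision; richer quadrature rules improve accuracy for harder kernels.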

  7. Improving Inferences from Multiple Methods.

    ERIC Educational Resources Information Center

    Shotland, R. Lance; Mark, Melvin M.

    1987-01-01

    Multiple evaluation methods (MEMs) can cause an inferential challenge, although there are strategies to strengthen inferences. Practical and theoretical issues involved in the use by social scientists of MEMs, three potential problems in drawing inferences from MEMs, and short- and long-term strategies for alleviating these problems are outlined.…

  8. Method for deploying multiple spacecraft

    NASA Technical Reports Server (NTRS)

    Sharer, Peter J. (Inventor)

    2007-01-01

    A method for deploying multiple spacecraft is disclosed. The method can be used in a situation where a first celestial body is being orbited by a second celestial body. The spacecraft are loaded onto a single spaceship that contains the multiple spacecraft and the spacecraft is launched from the second celestial body towards a third celestial body. The spacecraft are separated from each other while in route to the third celestial body. Each of the spacecraft is then subjected to the gravitational field of the third celestial body and each of the spacecraft assumes a different, independent orbit about the first celestial body. In those situations where the spacecraft are launched from Earth, the Sun can act as the first celestial body, the Earth can act as the second celestial body and the Moon can act as the third celestial body.

  9. Multiplication method for sparse interferometric fringes.

    PubMed

    Liu, Cong; Zhang, Xingyi; Zhou, Youhe

    2016-04-01

Fringe analysis in interferometry has been of long-standing interest to the academic community. However, processing sparse fringes is always troublesome in measurement, especially when the specimen is very small. Through theoretical derivation and experimental measurements, our work demonstrates a new method for fringe multiplication. Theoretically, arbitrary integral-multiple fringe multiplication can be achieved by using the interferogram phase as the parameter. We simulate digital images accordingly and find that not only are the skeleton lines of the multiplied fringes very convenient to extract, but also the main frequency can be easily separated from the DC component. Meanwhile, the experimental results agree well with the theoretical ones in a validation using classical photoelasticity. PMID:27137055
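Phase-parameterized multiplication can be sketched as follows, assuming a known synthetic phase map rather than one extracted from a measured interferogram (the ramp phase and threshold are illustrative choices):

```python
import numpy as np

# Synthetic interferogram phase over a 1-D field: a linear ramp giving
# two fringes across the field (hypothetical, for illustration).
x = np.linspace(0.0, 1.0, 256)
phase = 2.0 * np.pi * 2.0 * x

def multiply_fringes(phase, k):
    """k-fold fringe multiplication: intensity cos(k * phase) has dark
    fringes k times as often as the original cos(phase) pattern."""
    return 0.5 * (1.0 + np.cos(k * phase))

original = multiply_fringes(phase, 1)
tripled = multiply_fringes(phase, 3)

# Count dark fringes: strict local minima whose intensity is near zero.
count = lambda I: np.sum((I[1:-1] < I[:-2]) & (I[1:-1] < I[2:]) & (I[1:-1] < 0.1))
print(count(original), count(tripled))  # the tripled pattern has 3x the fringes
```

With k-fold multiplication the skeleton lines occur k times as densely, which is what makes sparse-fringe specimens tractable.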

  10. Multiple ray cluster rendering for interactive integral imaging system.

    PubMed

    Jiao, Shaohui; Wang, Xiaoguang; Zhou, Mingcai; Li, Weiming; Hong, Tao; Nam, Dongkyung; Lee, Jin-Ho; Wu, Enhua; Wang, Haitao; Kim, Ji-Yeun

    2013-04-22

In this paper, we present an efficient Computer Generated Integral Imaging (CGII) method, called multiple ray cluster rendering (MRCR). Based on MRCR, an interactive integral imaging system is realized, which provides accurate 3D images satisfying changeable observer positions in real time. The MRCR method can generate all the elemental image pixels within only one rendering pass by ray reorganization of multiple ray clusters and 3D content duplication. It is compatible with various types of graphical content, including meshes, point clouds, and medical data. Moreover, a multi-sampling method is embedded in MRCR for acquiring anti-aliased 3D images. To the best of our knowledge, the MRCR method outperforms existing CGII methods in both speed and display quality. Experimental results show that the proposed CGII method can achieve real-time computational speed for large-scale 3D data with about 50,000 points. PMID:23609712

  11. Multiple identities and the integration of personality.

    PubMed

    Gregg, G S

    1995-09-01

    Life-history interviews show that narrators shift among multiple, often contradictory self-representations. This article outlines a model that accounts for how a relatively small set of self-symbols and metaphors can form a grammar-like system that simultaneously defines and integrates multiple identities. Drawing on generative theories from linguistics, anthropology, and music, the model proposes that this system provides a unitary deep structure that can be configured in various arrangements to yield multiple surface structures. Each "surface" identity constructs an individual's emotions and social relations--and what he or she accepts as "Me" and rejects as "not-Me"--into a distinct pattern, with identity per se appearing as a dialogic or fugue-like structure of opposed voices. Study-of-lives interviews conducted by the author in urban America and rural Morocco are used to present the model and to demonstrate the pivotal role played by multistable or "structurally ambiguous" symbols in anchoring reversible self-representations which integrate personality as a system of organized contradiction. The musical analogy is emphasized in order to build a bridge toward current research in cognitive science and toward efforts to formulate a "state integration" theory of personality development. PMID:7562365

  12. Accelerated adaptive integration method.

    PubMed

    Kaus, Joseph W; Arrar, Mehrnoosh; McCammon, J Andrew

    2014-05-15

    Conformational changes that occur upon ligand binding may be too slow to observe on the time scales routinely accessible using molecular dynamics simulations. The adaptive integration method (AIM) leverages the notion that when a ligand is either fully coupled or decoupled, according to λ, barrier heights may change, making some conformational transitions more accessible at certain λ values. AIM adaptively changes the value of λ in a single simulation so that conformations sampled at one value of λ seed the conformational space sampled at another λ value. Adapting the value of λ throughout a simulation, however, does not resolve issues in sampling when barriers remain high regardless of the λ value. In this work, we introduce a new method, called Accelerated AIM (AcclAIM), in which the potential energy function is flattened at intermediate values of λ, promoting the exploration of conformational space as the ligand is decoupled from its receptor. We show, with both a simple model system (Bromocyclohexane) and the more complex biomolecule Thrombin, that AcclAIM is a promising approach to overcome high barriers in the calculation of free energies, without the need for any statistical reweighting or additional processors. PMID:24780083
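    The λ-coupling and flattening ideas described above can be sketched in a few lines. The linear coupling form and the symmetric flattening factor below are illustrative guesses at the qualitative shapes only, not the actual functional forms used by AIM or AcclAIM:

    ```python
    def coupled_potential(u_env, u_inter, lam):
        """Illustrative linear alchemical coupling: the ligand-receptor
        interaction term is scaled by lambda (0 = decoupled, 1 = coupled)."""
        return u_env + lam * u_inter

    def flattening_factor(lam, s_min=0.5):
        """Illustrative AcclAIM-style scale factor: strongest flattening
        (s_min) at lambda = 0.5, no flattening at the physical end points.
        The quadratic shape is an assumption for illustration."""
        return 1.0 - (1.0 - s_min) * 4.0 * lam * (1.0 - lam)
    ```

    A simulation would multiply barrier-forming energy terms by `flattening_factor(lam)` so that conformational space is explored most freely at intermediate λ, where neither end-point ensemble is being measured.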

  13. Accelerated Adaptive Integration Method

    PubMed Central

    2015-01-01

    Conformational changes that occur upon ligand binding may be too slow to observe on the time scales routinely accessible using molecular dynamics simulations. The adaptive integration method (AIM) leverages the notion that when a ligand is either fully coupled or decoupled, according to λ, barrier heights may change, making some conformational transitions more accessible at certain λ values. AIM adaptively changes the value of λ in a single simulation so that conformations sampled at one value of λ seed the conformational space sampled at another λ value. Adapting the value of λ throughout a simulation, however, does not resolve issues in sampling when barriers remain high regardless of the λ value. In this work, we introduce a new method, called Accelerated AIM (AcclAIM), in which the potential energy function is flattened at intermediate values of λ, promoting the exploration of conformational space as the ligand is decoupled from its receptor. We show, with both a simple model system (Bromocyclohexane) and the more complex biomolecule Thrombin, that AcclAIM is a promising approach to overcome high barriers in the calculation of free energies, without the need for any statistical reweighting or additional processors. PMID:24780083

  14. Integrated management of multiple reservoir field developments

    SciTech Connect

    Lyons, S.L.; Chan, H.M.; Harper, J.L.; Boyett, B.A.; Dowson, P.R.; Bette, S.

    1995-10-01

    This paper consists of two sections. The authors first describe the coupling of a pipeline network model to a reservoir simulator and then the application of this new simulator to optimize the production strategy of two Mobil field developments. Mobil's PEGASUS simulator is an integrated all-purpose reservoir simulator that handles black-oil, compositional, faulted, and naturally fractured reservoirs. The authors have extended the simulator to simultaneously model multiple reservoirs coupled with surface pipeline networks and processes. This allows them to account for the effects of geology, well placement, and surface production facilities on well deliverability in a fully integrated fashion. They have also developed a gas contract allocation system that takes the user-specified constraints, target rates, and swing factors and automatically assigns rates to the individual wells of each reservoir. This algorithm calculates the overall deliverability and automatically reduces the user-specified target rates to meet the deliverability constraints. The algorithm and solution technique are described. This enhanced simulator has been applied to model a Mobil field development in the Southern Gas Basin, offshore United Kingdom, which consists of three separate gas reservoirs connected via a pipeline network. The simulator allowed the authors to accurately determine the impact on individual reservoir and total field performance of varying the development timing of these reservoirs. Several development scenarios are shown to illustrate the capabilities of PEGASUS. Another application of this technology is in the field developments in North Sumatra, Indonesia. Here the objective is to economically optimize the development of multiple fields to feed the PT Arun LNG facility. Consideration of a range of gas compositions, well productivities, and facilities constraints in an integrated fashion results in improved management of these assets. Model specifics are discussed.
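    The rate-reduction step described above (scaling user-specified target rates down to meet a deliverability constraint) can be sketched as a simple proportional allocation. The function name and the uniform scaling rule are illustrative assumptions, not the actual PEGASUS algorithm:

    ```python
    def allocate_rates(target_rates, field_deliverability):
        """Scale well target rates down uniformly when their sum exceeds
        the field deliverability; otherwise return them unchanged.
        (Illustrative sketch; real allocators also honor per-well
        constraints and swing factors.)"""
        total = sum(target_rates)
        if total <= field_deliverability:
            return list(target_rates)
        scale = field_deliverability / total
        return [r * scale for r in target_rates]
    ```

    For example, targets of 30, 20, and 50 units against a deliverability of 50 would each be halved.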

  15. Comparing three feedback internal multiple elimination methods

    NASA Astrophysics Data System (ADS)

    Song, Jiawen; Verschuur, Eric; Chen, Xiaohong

    2013-08-01

    Multiple reflections have posed a great challenge for current seismic imaging and inversion methods. Compared to surface multiples, internal multiples are more difficult to remove due to poorer move-out discrimination with primaries, and we are left with wave equation-based prediction and subtraction methods. In this paper, we focus on the comparison of three data-driven internal multiple elimination (IME) methods based on the feedback model: two are well-established prediction-and-subtraction methods using back-propagated data and surface data, referred to as the CFP-based method and the surface-based method, respectively, and the third, an inversion-based method, has recently been extended from estimation of primaries by sparse inversion (EPSI). All three methods are based on the separation of events from above and below a certain level, after which internal multiples are predicted by convolutions and correlations. We begin with a review of the theory of layer-related feedback IME methods, where implementation steps for each method are discussed and the event separation involved is further analyzed. Then, recursive application of the three IME methods is demonstrated on synthetic data and field data. It shows that the two well-established prediction-and-subtraction methods provide similar primary estimation results, with most of the internal multiples being removed, while multiple leakage and primary distortion have been observed where primaries and internal multiples interfere. In contrast, generalized EPSI provides reduced multiple leakage and better primary restoration, which is of great value for current amplitude-preserved seismic processing. As a main conclusion, with adaptive subtraction avoided, the inversion-based method is more effective than the prediction-and-subtraction methods for internal multiple elimination when primaries and internal multiples overlap. However, the inversion-based method is quite computationally intensive, and more research on

  16. Multiple network interface core apparatus and method

    SciTech Connect

    Underwood, Keith D.; Hemmert, Karl Scott

    2011-04-26

    A network interface controller and network interface control method comprising providing a single integrated circuit as a network interface controller and employing a plurality of network interface cores on the single integrated circuit.

  17. Multiple protocol fluorometer and method

    DOEpatents

    Kolber, Zbigniew S.; Falkowski, Paul G.

    2000-09-19

    A multiple protocol fluorometer measures photosynthetic parameters of phytoplankton and higher plants using actively stimulated fluorescence protocols. The measured parameters include spectrally-resolved functional and optical absorption cross sections of PSII, extent of energy transfer between reaction centers of PSII, F.sub.0 (minimal), F.sub.m (maximal) and F.sub.v (variable) components of PSII fluorescence, photochemical and non-photochemical quenching, size of the plastoquinone (PQ) pool, and the kinetics of electron transport between Q.sub.a and PQ pool and between PQ pool and PSI. The multiple protocol fluorometer, in one embodiment, is equipped with an excitation source having a controlled spectral output range between 420 nm and 555 nm and capable of generating flashlets having a duration of 0.125-32 .mu.s, an interval between 0.5 .mu.s and 2 seconds, and peak optical power of up to 2 W/cm.sup.2. The excitation source is also capable of generating, simultaneous with the flashlets, a controlled continuous, background illumination.

  18. Predicting Protein Function via Semantic Integration of Multiple Networks.

    PubMed

    Yu, Guoxian; Fu, Guangyuan; Wang, Jun; Zhu, Hailong

    2016-01-01

    Determining the biological functions of proteins is one of the key challenges in the post-genomic era. The rapidly accumulating large volumes of proteomic and genomic data drive the development of computational models for automatically predicting protein function at large scale. Recent approaches focus on integrating multiple heterogeneous data sources, and they often get better results than methods that use a single data source alone. In this paper, we investigate how to integrate multiple biological data sources with biological knowledge, i.e., the Gene Ontology (GO), for protein function prediction. We propose a method, called SimNet, to Semantically integrate multiple functional association Networks derived from heterogeneous data sources. SimNet first utilizes GO annotations of proteins to capture the semantic similarity between proteins and introduces a semantic kernel based on this similarity. Next, SimNet constructs a composite network, obtained as a weighted summation of individual networks, and aligns the network with the kernel to get the weights assigned to individual networks. Then, it applies a network-based classifier on the composite network to predict protein function. Experimental results on heterogeneous proteomic data sources of Yeast, Human, Mouse, and Fly show that SimNet not only achieves better (or comparable) results than other related competitive approaches, but also takes much less time. The Matlab codes of SimNet are available at https://sites.google.com/site/guoxian85/simnet. PMID:26800544
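    The composite-network step described above (a weighted summation of individual association networks, aligned with a semantic kernel) can be sketched as follows. The function names and the cosine-style alignment score are illustrative assumptions, not SimNet's actual implementation:

    ```python
    import math

    def composite_network(networks, weights):
        """Weighted summation of square adjacency matrices
        (each given as a list of lists of floats)."""
        n = len(networks[0])
        comp = [[0.0] * n for _ in range(n)]
        for w, net in zip(weights, networks):
            for i in range(n):
                for j in range(n):
                    comp[i][j] += w * net[i][j]
        return comp

    def alignment(a, b):
        """Normalized Frobenius inner product between two matrices;
        1.0 means perfectly aligned. Used here as a stand-in for
        kernel-network alignment."""
        rows, cols = len(a), len(a[0])
        dot = sum(a[i][j] * b[i][j] for i in range(rows) for j in range(cols))
        na = math.sqrt(sum(x * x for row in a for x in row))
        nb = math.sqrt(sum(x * x for row in b for x in row))
        return dot / (na * nb)
    ```

    One simple (hypothetical) weighting scheme would set each network's weight proportional to its alignment with the semantic kernel before forming the composite.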

  19. Convergence and Discriminant: Assessing Multiple Traits Using Multiple Methods

    ERIC Educational Resources Information Center

    Pae, Hye K.

    2012-01-01

    Multiple traits of language proficiency as well as test method effects were concurrently analyzed to investigate interrelations of construct validity, convergent validity, and discriminant validity using multitrait-multimethod (MTMM) matrices. A total of 585 test takers' scores were derived from the field test of the "Pearson Test of English…

  20. Complementary and Integrative Medicine - Multiple Languages: MedlinePlus

    MedlinePlus


  1. Code Division Multiple Access system candidate for integrated modular avionics

    NASA Astrophysics Data System (ADS)

    Mendez, Antonio J.; Gagliardi, Robert M.

    1991-02-01

    There are government and industry trends towards avionics modularity and integrated avionics. Key requirements implicit in these trends are suitable data communication concepts compatible with the integration concept. In this paper we explore the use of Code Division Multiple Access (CDMA) techniques as an alternative to collision detection and collision avoidance multiple access techniques.

  2. Integrating Multiple Intelligences in EFL/ESL Classrooms

    ERIC Educational Resources Information Center

    Bas, Gokhan

    2008-01-01

    This article deals with the integration of the theory of Multiple Intelligences in EFL/ESL classrooms. In this study, after the theory of multiple intelligences is presented briefly, its integration into English classrooms is discussed. Intelligence types in MI theory are examined, along with possible ways of applying these intelligence types…

  3. Complementary and Integrative Medicine - Multiple Languages: MedlinePlus

    MedlinePlus


  4. Integrated Instruction: Multiple Intelligences and Technology

    ERIC Educational Resources Information Center

    McCoog, Ian J.

    2007-01-01

    Advancements in technology have changed the day-to-day operation of society. The ways in which we teach and learn are undergoing the same process of change. For this reason, we must reexamine instruction. In this article, the author analyzes the changing environment of educational technology and how to incorporate the theory of multiple intelligences. The…

  5. Integrating Learning Styles and Multiple Intelligences.

    ERIC Educational Resources Information Center

    Silver, Harvey; Strong, Richard; Perini, Matthew

    1997-01-01

    Multiple-intelligences theory (MI) explores how cultures and disciplines shape human potential. Both MI and learning-style theories reject dominant ideologies of intelligence. Whereas learning styles are concerned with differences in the learning process, MI centers on learning content and products. Blending learning styles and MI theories via…

  6. Integrating Qualitative and Quantitative Evaluation Methods in Substance Abuse Research.

    ERIC Educational Resources Information Center

    Dennis, Michael L.; And Others

    1994-01-01

    Some specific opportunities and techniques are described for combining and integrating qualitative and quantitative methods from the design stage of a substance abuse program evaluation through implementation and reporting. The multiple problems and requirements of such an evaluation make integrated methods essential. (SLD)

  7. Building a cognitive map by assembling multiple path integration systems.

    PubMed

    Wang, Ranxiao Frances

    2016-06-01

    Path integration and cognitive mapping are two of the most important mechanisms for navigation. Path integration is a primitive navigation system which computes a homing vector based on an animal's self-motion estimation, while a cognitive map is an advanced spatial representation containing richer spatial information about the environment that is persistent and can be used to guide flexible navigation to multiple locations. Most theories of navigation conceptualize them as two distinct, independent mechanisms, although the path integration system may provide useful information for the integration of cognitive maps. This paper demonstrates a fundamentally different scenario, where a cognitive map is constructed in three simple steps by assembling multiple path integrators and extending their basic features. The fact that a collection of path integration systems can be turned into a cognitive map suggests the possibility that cognitive maps may have evolved directly from the path integration system. PMID:26442503
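    The basic path integration computation described above (a homing vector maintained from self-motion estimates) can be illustrated in a few lines. This is a minimal 2D sketch under the usual textbook formulation, not code from the paper:

    ```python
    def update(home_vector, step):
        """Fold one self-motion step (dx, dy) into the homing vector:
        every step forward moves 'home' the opposite way."""
        return (home_vector[0] - step[0], home_vector[1] - step[1])

    def integrate_path(steps):
        """Accumulate a sequence of self-motion steps; the result is the
        vector pointing from the current position back to the origin."""
        hv = (0.0, 0.0)
        for s in steps:
            hv = update(hv, s)
        return hv
    ```

    A cognitive-map construction in the paper's sense would maintain several such integrators (e.g. one per landmark) rather than a single homing vector.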

  8. Integral Methodological Pluralism in Science Education Research: Valuing Multiple Perspectives

    ERIC Educational Resources Information Center

    Davis, Nancy T.; Callihan, Laurie P.

    2013-01-01

    This article examines the multiple methodologies used in educational research and proposes a model that includes all of them as contributing to understanding educational contexts and research from multiple perspectives. The model, based on integral theory (Wilber in a theory of everything. Shambhala, Boston, 2000) values all forms of research as…

  9. Integral methodological pluralism in science education research: valuing multiple perspectives

    NASA Astrophysics Data System (ADS)

    Davis, Nancy T.; Callihan, Laurie P.

    2013-09-01

    This article examines the multiple methodologies used in educational research and proposes a model that includes all of them as contributing to understanding educational contexts and research from multiple perspectives. The model, based on integral theory (Wilber in a theory of everything. Shambhala, Boston, 2000) values all forms of research as true, but partial. Consideration of objective (exterior) forms of research and data and subjective (interior) forms of research and data are further divided into individual and collective domains. Taking this categorization system one step further reveals eight indigenous perspectives that form a framework for considering research methodologies. Each perspective has unique questions, data sources, methods and quality criteria designed to reveal what is "true" from that view. As science educators who guide our students' research, this framework offers a useful guide to explain differences in types of research, the purpose and validity of each. It allows professional science educators to appreciate multiple forms of research while maintaining rigorous quality criteria. Use of this framework can also help avoid problems of imposing quality criteria of one methodology on research data and questions gathered using another methodology. This model is explored using the second author's dissertation research. Finally a decision chart is provided to use with those who are starting inquiries to guide their thinking and choice of appropriate methodologies to use when conducting research.

  10. The Effects of Tasks on Integrating Information from Multiple Documents

    ERIC Educational Resources Information Center

    Cerdan, Raquel; Vidal-Abarca, Eduardo

    2008-01-01

    The authors examine 2 issues: (a) how students integrate information from multiple scientific documents to describe and explain a physical phenomenon that represents a subset of the information in the documents; and (b) the role of 2 sorts of tasks to achieve this type of integration, either writing an essay on a question requiring integration…

  11. Temporal Characterization of Hydrates System Dynamics beneath Seafloor Mounds. Integrating Time-Lapse Electrical Resistivity Methods and In Situ Observations of Multiple Oceanographic Parameters

    SciTech Connect

    Lutken, Carol; Macelloni, Leonardo; D'Emidio, Marco; Dunbar, John; Higley, Paul

    2015-01-31

    detect short-term changes within the hydrates system, identify relationships/impacts of local oceanographic parameters on the hydrates system, and improve our understanding of how seafloor instability is affected by hydrates-driven changes. A 2009 DCR survey of MC118 demonstrated that we could image resistivity anomalies to a depth of 75m below the seafloor in water depths of 1km. We reconfigured this system to operate autonomously on the seafloor in a pre-programmed mode, for periods of months. We designed and built a novel seafloor lander and deployment capability that would allow us to investigate the seafloor at potential deployment sites and deploy instruments only when conditions met our criteria. This lander held the DCR system, controlling computers, and battery power supply, as well as instruments to record oceanographic parameters. During the first of two cruises to the study site, we conducted resistivity surveying, selected a monitoring site, and deployed the instrumented lander and DCR, centered on what appeared to be the most active locations within the site, programmed to collect a DCR profile, weekly. After a 4.5-month residence on the seafloor, the team recovered all equipment. Unfortunately, several equipment failures occurred prior to recovery of the instrument packages. Prior to the failures, however, two resistivity profiles were collected together with oceanographic data. Results show, unequivocally, that significant changes can occur in both hydrate volume and distribution during time periods as brief as one week. Occurrences appear to be controlled by both deep and near-surface structure. Results have been integrated with seismic data from the area and show correspondence in space of hydrate and structures, including faults and gas chimneys.

  12. A multiple index integrating different levels of organization.

    PubMed

    Cortes, Rui; Hughes, Samantha; Coimbra, Ana; Monteiro, Sandra; Pereira, Vítor; Lopes, Marisa; Pereira, Sandra; Pinto, Ana; Sampaio, Ana; Santos, Cátia; Carrola, João; de Jesus, Joaquim; Varandas, Simone

    2016-10-01

    Many methods in freshwater biomonitoring tend to be restricted to a few levels of biological organization, limiting the potential spectrum of measurable cause-effect responses to different anthropogenic impacts. We combined distinct organisational levels, covering biological biomarkers (histopathological and biochemical reactions in liver and fish gills), community-based bioindicators (fish guilds, invertebrate metrics/traits and chironomid pupal exuviae) and ecosystem functional indicators (decomposition rates) to assess ecological status at designated Water Framework Directive monitoring sites, covering a gradient of human impact across several rivers in northern Portugal. We used Random Forest to rank the variables that contributed most significantly to successfully predicting the different classes of ecological status and also to provide specific cut levels to discriminate each WFD class based on reference conditions. A total of 59 Biological Quality Elements and functional indicators were determined using this procedure and subsequently applied to develop the integrated Multiple Ecological Level Index (MELI Index), a potentially powerful bioassessment tool. PMID:27344015

  13. A selective integrated tempering method.

    PubMed

    Yang, Lijiang; Qin Gao, Yi

    2009-12-01

    In this paper, based on integrated tempering sampling, we introduce a selective integrated tempering sampling (SITS) method for efficient conformation sampling and thermodynamics calculations for a subsystem within a larger system, such as biomolecules solvated in aqueous solutions. By introducing a potential surface scaled with temperature, the sampling over the configuration space of interest (e.g., the solvated biomolecule) is selectively enhanced while the rest of the system (e.g., the solvent) stays largely unperturbed. The application of this method to biomolecular systems allows highly efficient sampling over both the energy and configuration spaces of interest. Compared to the popular and powerful replica exchange molecular dynamics (REMD), the method presented in this paper is significantly more efficient in yielding relevant thermodynamic quantities (such as the potential of mean force for biomolecular conformational changes in aqueous solutions). More importantly, SITS, but not REMD, yielded results consistent with traditional umbrella sampling free energy calculations when an explicit solvent model is used, since SITS avoids sampling the irrelevant phase space (such as boiling water at high temperatures). PMID:19968339

  14. Methods for comparing multiple digital PCR experiments.

    PubMed

    Burdukiewicz, Michał; Rödiger, Stefan; Sobczyk, Piotr; Menschikowski, Mario; Schierack, Peter; Mackiewicz, Paweł

    2016-09-01

    The estimated mean copy number per partition (λ) is the essential information from a digital PCR (dPCR) experiment because λ can be used to calculate the target concentration in a sample. However, little information is available on how to statistically compare multiple dPCR runs or replicates. The comparison of λ values from several runs is a multiple comparison problem, which can be solved using the binary structure of dPCR data. We propose and evaluate two novel methods based on Generalized Linear Models (GLM) and Multiple Ratio Tests (MRT) for comparison of digital PCR experiments. We enriched our MRT framework with computation of simultaneous confidence intervals suitable for comparing multiple dPCR runs. The evaluation of both statistical methods supports that MRT is faster and more robust for dPCR experiments performed at large scale. Our theoretical results were confirmed by the analysis of dPCR measurements of dilution series. Both methods were implemented in the dpcR package (v. 0.2) for the open source R statistical computing environment. PMID:27551672
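    The λ statistic discussed above is the standard Poisson estimate from binary partition counts. A minimal sketch follows; the paper's GLM and MRT machinery is not reproduced here, and `lambda_ratio` is only an illustrative way to compare two runs:

    ```python
    import math

    def lambda_hat(positive_partitions, total_partitions):
        """Mean copies per partition from the fraction of positive
        partitions, assuming Poisson loading: lambda = -ln(1 - k/n)."""
        return -math.log(1.0 - positive_partitions / total_partitions)

    def lambda_ratio(run_a, run_b):
        """Crude comparison of two runs, each given as (positives, total):
        the ratio of their lambda estimates (1.0 means identical loading).
        A proper test would attach confidence intervals, as MRT does."""
        return lambda_hat(*run_a) / lambda_hat(*run_b)
    ```

    For example, a run with half its partitions positive gives λ = ln 2 ≈ 0.693 copies per partition.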

  15. Adaptive wavelet methods - Matrix-vector multiplication

    NASA Astrophysics Data System (ADS)

    Černá, Dana; Finěk, Václav

    2012-12-01

    The design of most adaptive wavelet methods for elliptic partial differential equations follows a general concept proposed by A. Cohen, W. Dahmen and R. DeVore in [3, 4]. The essential steps are: transformation of the variational formulation into a well-conditioned infinite-dimensional l2 problem, finding a convergent iteration process for the l2 problem, and finally derivation of its finite-dimensional version which works with an inexact right-hand side and approximate matrix-vector multiplications. In our contribution, we briefly review all these parts and mainly pay attention to approximate matrix-vector multiplications. Effective approximation of matrix-vector multiplications is enabled by the off-diagonal decay of entries of the wavelet stiffness matrix. We propose here a new approach which better utilizes the actual decay of matrix entries.
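    The idea of exploiting off-diagonal decay can be illustrated by the simplest possible scheme: drop entries below a tolerance when forming y = A x. This thresholding sketch is an illustration of the general principle, not the authors' new approach:

    ```python
    def approx_matvec(a, x, tol):
        """Approximate matrix-vector product that skips entries smaller
        in magnitude than tol. With strong off-diagonal decay, most
        entries are skipped at little cost to accuracy."""
        n = len(a)
        y = [0.0] * n
        for i in range(n):
            for j in range(n):
                if abs(a[i][j]) >= tol:
                    y[i] += a[i][j] * x[j]
        return y
    ```

    Practical schemes instead exploit the known decay estimates to decide a priori which entries to compute at all, rather than testing each one.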

  16. Prioritizing Cancer Therapeutic Small Molecules by Integrating Multiple OMICS Datasets

    PubMed Central

    Lv, Sali; Xu, Yanjun; Chen, Xin; Li, Yan; Li, Ronghong; Wang, Qianghu

    2012-01-01

    Drug design is crucial for the effective discovery of anti-cancer drugs. The success or failure of drug design often depends on the leading compounds screened in pre-clinical studies. Many efforts, such as in vivo animal experiments and in vitro drug screening, have improved this process, but these methods are usually expensive and laborious. In the post-genomics era, it is possible to seek leading compounds for large-scale candidate small-molecule screening with multiple OMICS datasets. In the present study, we developed a computational method of prioritizing small molecules as leading compounds by integrating transcriptomics and toxicogenomics data. This method provides priority lists for the selection of leading compounds, thereby reducing the time required for drug design. We found 11 known therapeutic small molecules for breast cancer in the top 100 candidates in our list, 2 of which were in the top 10. Furthermore, another 3 of the top 10 small molecules were recorded as closely related to cancer treatment in the DrugBank database. A comparison of the results of our approach with permutation tests and shared gene methods demonstrated that our OMICS data-based method is quite competitive. In addition, we applied our method to a prostate cancer dataset. The results of this analysis indicated that our method surpasses both the shared gene method and random selection. These analyses suggest that our method may be a valuable tool for directing experimental studies in cancer drug design, and we believe this time- and cost-effective computational strategy will be helpful in future studies in cancer therapy. PMID:22917481

  17. From multiple unitarity cuts to the coproduct of Feynman integrals

    NASA Astrophysics Data System (ADS)

    Abreu, Samuel; Britto, Ruth; Duhr, Claude; Gardi, Einan

    2014-10-01

    We develop techniques for computing and analyzing multiple unitarity cuts of Feynman integrals, and reconstructing the integral from these cuts. We study the relations among unitarity cuts of a Feynman integral computed via diagrammatic cutting rules, the discontinuity across the corresponding branch cut, and the coproduct of the integral. For single unitarity cuts, these relations are familiar. Here we show that they can be generalized to sequences of unitarity cuts in different channels. Using concrete one- and two-loop scalar integral examples we demonstrate that it is possible to reconstruct a Feynman integral from either single or double unitarity cuts. Our results offer insight into the analytic structure of Feynman integrals as well as a new approach to computing them.

  18. Research on model of combining multiple neural networks by fuzzy integral-MNNF

    NASA Astrophysics Data System (ADS)

    Fu, Yue; Chai, Bianfang

    2013-03-01

    The multiple neural network fusion using fuzzy integral (MNNF) method presented in this paper aims to improve the detection performance of data mining-based intrusion detection systems. The basic idea of MNNF is to train neural networks separately on distinct feature training datasets, detect TCP/IP data with the different neural networks, and then nonlinearly combine the results from the multiple neural networks by fuzzy integral. The experimental results show that this technique is superior to single neural networks for intrusion detection in terms of classification accuracy. Compared with other combination methods such as majority vote, average, and Borda count, the fuzzy integral performs better than each of them.
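    The nonlinear combination step described above can be illustrated with a Sugeno fuzzy integral over per-network confidence scores. The simple additive fuzzy measure used here is an assumption for illustration; MNNF's actual measure construction is not reproduced:

    ```python
    def sugeno_integral(scores, densities):
        """Sugeno fuzzy integral of classifier confidence scores.
        scores: per-network confidence for one class.
        densities: per-network importance (a simple additive fuzzy
        measure, capped at 1.0, is assumed here for illustration)."""
        pairs = sorted(zip(scores, densities), key=lambda p: -p[0])
        g = 0.0     # measure of the set of networks seen so far
        best = 0.0
        for h, d in pairs:
            g = min(1.0, g + d)
            best = max(best, min(h, g))
        return best
    ```

    The class with the largest integral over all networks would be taken as the fused decision.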

  19. A Fuzzy Logic Framework for Integrating Multiple Learned Models

    SciTech Connect

    Bobi Kai Den Hartog

    1999-03-01

    The Artificial Intelligence field of Integrating Multiple Learned Models (IMLM) explores ways to combine results from sets of trained programs. Aroclor Interpretation is an ill-conditioned problem in which trained programs must operate in scenarios outside their training ranges because it is intractable to train them completely. Consequently, they fail in ways related to the scenarios. We developed a general-purpose IMLM solution, the Combiner, and applied it to Aroclor Interpretation. The Combiner's first step, Scenario Identification (SI), learns rules from very sparse, synthetic training data consisting of results from a suite of trained programs called Methods. SI produces fuzzy belief weights for each scenario by approximately matching the rules. The Combiner's second step, Aroclor Presence Detection (AP), classifies each of three Aroclors as present or absent in a sample. The third step, Aroclor Quantification (AQ), produces quantitative values for the concentration of each Aroclor in a sample. AP and AQ use automatically learned empirical biases for each of the Methods in each scenario. Through fuzzy logic, AP and AQ combine scenario weights, the automatically learned biases for each of the Methods in each scenario, and the Methods' results to determine results for a sample.

  20. Lamp method and apparatus using multiple reflections

    DOEpatents

    MacLennan, D.A.; Turner, B.; Kipling, K.

    1999-05-11

    A method wherein the light in a sulfur or selenium lamp is reflected through the fill a multiplicity of times to convert ultraviolet radiation to visible is disclosed. A light emitting device comprised of an electrodeless envelope which bears a light reflecting covering around a first portion which does not crack due to differential thermal expansion and which has a second portion which comprises a light transmissive aperture. 20 figs.

  1. Lamp method and apparatus using multiple reflections

    DOEpatents

    MacLennan, Donald A.; Turner, Brian; Kipling, Kent

    1999-01-01

    A method wherein the light in a sulfur or selenium lamp is reflected through the fill a multiplicity of times to convert ultraviolet radiation to visible. A light emitting device comprised of an electrodeless envelope which bears a light reflecting covering around a first portion which does not crack due to differential thermal expansion and which has a second portion which comprises a light transmissive aperture.

  2. An integrated map correlation method and multiple-source sites drainage-area ratio method for estimating streamflows at ungauged catchments: A case study of the Western Black Sea Region, Turkey.

    PubMed

    Ergen, Kayra; Kentel, Elcin

    2016-01-15

    Stream gauges measure the temporal variation of water quantity; thus they are vital in managing water resources. The stream gauge network in Turkey includes a limited number of gauges and often streamflow estimates need to be generated at ungauged locations where reservoirs, small hydropower plants, weirs, etc. are planned. Prediction of streamflows at ungauged locations generally relies on donor gauges where flow is assumed to be similar to that at the ungauged location. Generally, donor stream gauges are selected based on geographical proximity. However, closer stream gauges are not always the most-correlated ones. The Map Correlation Method (MCM) enables development of a map that shows the spatial distribution of the correlation between a selected stream gauge and any other location within the study region. In this study, a new approach which combines MCM with the multiple-source site drainage-area ratio (DAR) method is used to estimate daily streamflows at ungauged catchments in the Western Black Sea Region. Daily streamflows predicted by the combined three-source sites DAR with MCM approach give higher Nash-Sutcliffe Efficiency (NSE) values than those predicted using the nearest stream gauge as the donor stream gauge, for most of the trial cases. Hydrographs and flow duration curves predicted using this approach are usually in better agreement with the observed hydrographs and flow duration curves than those predicted using the nearest catchment. PMID:26520038
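The multiple-source DAR step can be sketched as follows: each donor gauge's flow is scaled by the ratio of the ungauged catchment's drainage area to the donor's, and the scaled estimates are combined with weights (in the paper, derived from MCM correlation maps). The function name, flow values, and weights below are illustrative only:

```python
import numpy as np

def dar_multi_source(donor_flows, donor_areas, target_area, weights):
    """Multiple-source drainage-area ratio (DAR) streamflow estimate.

    Each donor's daily flow is scaled by the ratio of catchment areas;
    the scaled series are then combined with weights summing to 1
    (e.g. based on map-correlation values at the ungauged site).
    """
    donor_flows = np.asarray(donor_flows, dtype=float)   # shape (n_donors, n_days)
    donor_areas = np.asarray(donor_areas, dtype=float)
    weights = np.asarray(weights, dtype=float)
    scaled = donor_flows * (target_area / donor_areas)[:, None]
    return weights @ scaled

# Three donor gauges, two days of flow (m^3/s); numbers are made up
flows = [[10.0, 12.0], [8.0, 9.0], [20.0, 24.0]]
areas = [100.0, 80.0, 200.0]          # donor drainage areas (km^2)
weights = [0.5, 0.3, 0.2]             # e.g. from MCM correlation maps
est = dar_multi_source(flows, areas, target_area=120.0, weights=weights)
```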

  3. Multiple time scale methods in tokamak magnetohydrodynamics

    SciTech Connect

    Jardin, S.C.

    1984-01-01

    Several methods are discussed for integrating the magnetohydrodynamic (MHD) equations in tokamak systems on other than the fastest time scale. The dynamical grid method for simulating ideal MHD instabilities utilizes a natural nonorthogonal time-dependent coordinate transformation based on the magnetic field lines. The coordinate transformation is chosen to be free of the fast time scale motion itself, and to yield a relatively simple scalar equation for the total pressure, P = p + B²/(2μ₀), which can be integrated implicitly to average over the fast time scale oscillations. Two methods are described for the resistive time scale. The zero-mass method uses a reduced set of two-fluid transport equations obtained by expanding in the inverse magnetic Reynolds number, and in the small ratio of perpendicular to parallel mobilities and thermal conductivities. The momentum equation becomes a constraint equation that forces the pressure and magnetic fields and currents to remain in force balance equilibrium as they evolve. The large mass method artificially scales up the ion mass and viscosity, thereby reducing the severe time scale disparity between wavelike and diffusionlike phenomena, but not changing the resistive time scale behavior. Other methods addressing the intermediate time scales are discussed.

  4. An Alternative Method for Multiplication of Rhotrices. Classroom Notes

    ERIC Educational Resources Information Center

    Sani, B.

    2004-01-01

    In this article, an alternative multiplication method for rhotrices is proposed. The modified method establishes some relationships between rhotrices and matrices; it has a direct relationship with matrix multiplication, and so rhotrices under this multiplication procedure…

  5. Multiple frequency method for operating electrochemical sensors

    SciTech Connect

    Martin, Louis P.

    2012-05-15

    A multiple frequency method for the operation of a sensor to measure a parameter of interest using calibration information, including the steps of: exciting the sensor at a first frequency to provide a first sensor response; exciting the sensor at a second frequency to provide a second sensor response; using the second sensor response and the calibration information to produce a calculated concentration of the interfering parameters; and using the first sensor response, the calculated concentration of the interfering parameters, and the calibration information to measure the parameter of interest.
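The two-frequency correction can be sketched under a simple assumption the patent abstract does not spell out: that both responses are linear in the concentrations involved. All constants and names below are hypothetical illustrations, not the patented calibration:

```python
def two_frequency_measurement(r1, r2, cal):
    """Two-frequency interference correction (linear calibration assumed).

    r2, taken at the second frequency, is assumed to respond only to the
    interfering species; r1 responds to both the parameter of interest
    and the interferent, with sensitivities stored in the calibration.
    """
    c_interf = r2 / cal["b2"]                            # interferent from f2 response
    c_target = (r1 - cal["x12"] * c_interf) / cal["a1"]  # corrected parameter of interest
    return c_target, c_interf

cal = {"a1": 2.0, "b2": 4.0, "x12": 0.5}   # hypothetical calibration constants
target, interf = two_frequency_measurement(r1=7.0, r2=8.0, cal=cal)
# interferent = 8/4 = 2.0; target = (7 - 0.5*2)/2 = 3.0
```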

  6. Multiple predictor smoothing methods for sensitivity analysis.

    SciTech Connect

    Helton, Jon Craig; Storlie, Curtis B.

    2006-08-01

    The use of multiple predictor smoothing methods in sampling-based sensitivity analyses of complex models is investigated. Specifically, sensitivity analysis procedures based on smoothing methods employing the stepwise application of the following nonparametric regression techniques are described: (1) locally weighted regression (LOESS), (2) additive models, (3) projection pursuit regression, and (4) recursive partitioning regression. The indicated procedures are illustrated with both simple test problems and results from a performance assessment for a radioactive waste disposal facility (i.e., the Waste Isolation Pilot Plant). As shown by the example illustrations, the use of smoothing procedures based on nonparametric regression techniques can yield more informative sensitivity analysis results than can be obtained with more traditional sensitivity analysis procedures based on linear regression, rank regression or quadratic regression when nonlinear relationships between model inputs and model predictions are present.

  7. High integrity carrier phase navigation using multiple civil GPS signals

    NASA Astrophysics Data System (ADS)

    Jung, Jaewoo

    2000-11-01

    A navigation system should guide users to their destinations accurately and reliably. Among the many available navigation aids, the Global Positioning System stands out due to its unique capabilities. It is a satellite-based navigation system which covers the entire Earth with horizontal accuracy of 20 meters for stand alone civil users. Today, the GPS provides only one civil signal, but two more signals will be available in the near future. GPS will provide a second signal at 1227.60 MHz (L2) and a third signal at 1176.45 MHz (Lc), in addition to the current signal at 1575.42 MHz (L1). The focus of this thesis is exploring the possibility of using beat frequencies of these signals to provide navigation aid to users with high accuracy and integrity. To achieve high accuracy, carrier phase differential GPS is used. The integer ambiguity is resolved using the Cascade Integer Resolution (CIR), which is defined in this thesis. The CIR is an instantaneous, geometry-free integer resolution method utilizing beat frequencies of GPS signals. To ensure high integrity, the probability of incorrect integer ambiguity resolution using the CIR is analyzed. The CIR can immediately resolve the Lc integer ambiguity up to 2.4 km from the reference receiver, the Widelane (L1-L2) integer ambiguity up to 22 km, and the Extra Widelane (L2-Lc) integer ambiguity from there on, with probability of incorrect integer resolution of 10-4. The optimal use of algebraic combinations of multiple GPS signals is also investigated in this thesis. Finally, the gradient of residual differential ionospheric error is estimated to increase performance of the CIR.
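The carrier frequencies quoted above fix the beat-frequency (widelane) wavelengths that make stepwise ambiguity resolution tractable; the longer the effective wavelength, the easier the integer is to fix. A minimal computation:

```python
C = 299_792_458.0  # speed of light, m/s

F_L1 = 1575.42e6   # current civil signal (L1), Hz
F_L2 = 1227.60e6   # second civil signal (L2), Hz
F_LC = 1176.45e6   # third civil signal, called "Lc" in the thesis, Hz

def beat_wavelength(f_a, f_b):
    """Wavelength of the beat (difference) frequency of two carriers."""
    return C / abs(f_a - f_b)

wl  = beat_wavelength(F_L1, F_L2)   # Widelane (L1 - L2): about 0.86 m
ewl = beat_wavelength(F_L2, F_LC)   # Extra Widelane (L2 - Lc): about 5.86 m
print(f"Widelane: {wl:.3f} m, Extra Widelane: {ewl:.3f} m")
```

The Extra Widelane's ~5.86 m wavelength is why its integer can still be resolved far from the reference receiver, after which the result cascades down to the shorter wavelengths.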

  8. Integration of multiple sensor fusion in controller design.

    PubMed

    Abdelrahman, Mohamed; Kandasamy, Parameshwaran

    2003-04-01

    The main focus of this research is to reduce the risk of a catastrophic response of a feedback control system when some of the feedback data from the system sensors are not reliable, while maintaining a reasonable performance of the control system. In this paper a methodology for integrating multiple sensor fusion into the controller design is presented. The multiple sensor fusion algorithm produces, in addition to the estimate of the measurand, a parameter that measures the confidence in the estimated value. This confidence is integrated as a parameter into the controller to produce fast system response when the confidence in the estimate is high, and a slow response when the confidence in the estimate is low. Conditions for the stability of the system with the developed controller are discussed. This methodology is demonstrated on a cupola furnace model. The simulations illustrate the advantages of the new methodology. PMID:12708539

  9. Downsizing of an integrated tracking unit for multiple applications

    NASA Astrophysics Data System (ADS)

    Steinway, William J.; Thomas, James E.; Nicoloff, Michael J.; Patz, Mark D.

    1997-02-01

    This paper describes the specifications and capabilities of the integrated tracking unit (ITU) and presents its multiple applications. The original ITU was developed by Coleman Research Corporation (CRC) for several federal law enforcement agencies over a four-year period and it has been used for friendly and unfriendly vehicle and person position tracking. The ITU has been down-sized to reduce its physical size, weight, and power requirements with respect to the first generation unit. The ITU consists of a global positioning system (GPS) receiver for precise position location and a cellular phone to transmit voice and data to a PC base station with a modem interface. This paper describes the down-sizing of the unit introduced in CRC's 'An Integrated Tracking Unit for Multiple Applications' paper presented at the 1995 Counterdrug Technology Assessment Center's symposium in Nashua, NH, and provides a description of the ITU and tested applications.

  10. Deconstructing Calculation Methods, Part 3: Multiplication

    ERIC Educational Resources Information Center

    Thompson, Ian

    2008-01-01

    In this third of a series of four articles, the author deconstructs the primary national strategy's approach to written multiplication. The approach to multiplication, as set out on pages 12 to 15 of the primary national strategy's "Guidance paper" "Calculation" (DfES, 2007), is divided into six stages: (1) mental multiplication using…

  11. Integrated multiple-input multiple-output visible light communications systems: recent progress and results

    NASA Astrophysics Data System (ADS)

    O'Brien, Dominic; Haas, Harald; Rajbhandari, Sujan; Chun, Hyunchae; Faulkner, Grahame; Cameron, Katherine; Jalajakumari, Aravind V. N.; Henderson, Robert; Tsonev, Dobroslav; Ijaz, Muhammad; Chen, Zhe; Xie, Enyuan; McKendry, Jonathan J. D.; Herrnsdorf, Johannes; Gu, Erdan; Dawson, Martin D.

    2015-01-01

    Solid state lighting systems typically use multiple Light Emitting Diode (LED) die within a single lamp, and multiple lamps within a coverage space. This infrastructure forms the transmitters for Visible Light Communications (VLC), and the availability of low-cost detector arrays offers the possibility of building Multiple Input Multiple Output (MIMO) transmission systems. Different approaches to optical MIMO are being investigated as part of a UK government funded research programme, `Ultra-Parallel Visible Light Communications' (UPVLC). In this paper we present a brief review of the area and report results from systems that use integrated subsystems developed as part of the project. The scalability of these approaches and future directions will also be discussed.

  12. Case studies: Soil mapping using multiple methods

    NASA Astrophysics Data System (ADS)

    Petersen, Hauke; Wunderlich, Tina; Hagrey, Said A. Al; Rabbel, Wolfgang; Stümpel, Harald

    2010-05-01

    Soil is a non-renewable resource with fundamental functions such as filtering (e.g. water), storing (e.g. carbon), transforming (e.g. nutrients) and buffering (e.g. contamination). Degradation of soils is by now a well-known fact not only to scientists; decision makers in politics have also accepted it as a serious problem for several environmental aspects. National and international authorities have already worked out preservation and restoration strategies for soil degradation, though how to put these strategies into practice is still a matter of active research. Common to all strategies, however, is that a description of soil state and dynamics is required as a base step. This includes collecting information from soils with methods ranging from direct soil sampling to remote applications. At an intermediate scale, mobile geophysical methods are applied, with the advantage of fast working progress but the disadvantage of site-specific calibration and interpretation issues. In the framework of the iSOIL project we present here some case studies of soil mapping performed using multiple geophysical methods. We present examples of combined field measurements with EMI, GPR, magnetic and gamma-spectrometric techniques carried out with the mobile multi-sensor system of Kiel University (GER). Depending on soil type and actual environmental conditions, different methods yield information of different quality. By applying diverse methods we want to determine which method, or combination of methods, gives the most reliable information concerning soil state and properties. To investigate the influence of varying material we performed mapping campaigns on field sites with sandy, loamy and loessy soils. Classification of measured or derived attributes shows not only the lateral variability but also gives hints about variation in the vertical distribution of soil material. For all soils, of course, soil water content can be a critical factor concerning a successful

  13. Automatic numerical integration methods for Feynman integrals through 3-loop

    NASA Astrophysics Data System (ADS)

    de Doncker, E.; Yuasa, F.; Kato, K.; Ishikawa, T.; Olagbemi, O.

    2015-05-01

    We give numerical integration results for Feynman loop diagrams through 3-loop such as those covered by Laporta [1]. The methods are based on automatic adaptive integration, using iterated integration and extrapolation with programs from the QUADPACK package, or multivariate techniques from the PARINT package. The DQAGS algorithm from QUADPACK accommodates boundary singularities of fairly general types. PARINT is a package for multivariate integration layered over MPI (Message Passing Interface), which runs on clusters and incorporates advanced parallel/distributed techniques such as load balancing among processes that may be distributed over a network of nodes. Results are included for 3-loop self-energy diagrams without IR (infra-red) or UV (ultra-violet) singularities. A procedure based on iterated integration and extrapolation yields a novel method of numerical regularization for integrals with UV terms, and is applied to a set of 2-loop self-energy diagrams with UV singularities.
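The extrapolation idea behind the UV regularization can be caricatured as follows: sample a regulated integral at several values of the regulator ε, fit the ansatz a/ε + b + cε, and read off the finite part b. This toy (with a made-up integrand) stands in for, but does not reproduce, the paper's iterated-integration-plus-extrapolation procedure:

```python
import numpy as np

def finite_part(integral, eps_values):
    """Extract the finite part of I(eps) ~ a/eps + b + c*eps by fitting
    the sampled values to that ansatz and reading off b."""
    eps = np.asarray(eps_values, dtype=float)
    A = np.column_stack([1.0 / eps, np.ones_like(eps), eps])  # basis: 1/eps, 1, eps
    samples = np.array([integral(e) for e in eps])
    coeffs, *_ = np.linalg.lstsq(A, samples, rcond=None)
    return coeffs[1]   # the eps-independent term

# Toy "regulated integral" with a known 1/eps pole: a=2, b=-0.5, c=3
toy = lambda e: 2.0 / e - 0.5 + 3.0 * e
b = finite_part(toy, [0.1, 0.05, 0.025, 0.0125])
# recovers the finite part b = -0.5
```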

  14. Surveillance systems integrating multiple sensors for enhanced situational awareness

    NASA Astrophysics Data System (ADS)

    Van Anda, J. B.; Van Anda, J. D.

    2005-05-01

    In the modern world of high value security systems a successful installation requires the sensors to produce more than just good IR images, preprocessed data from these images, imagery in multiple bands fused in intelligent ways with each other and with non imaging information such as Laser ranging is required. This paper describes a system where LW uncooled, color TV, low light level TV, and laser ranging information are fused in a integral Pan and Tilt system to provide a sensor suite with exceptional capabilities for seamlessly integration into an advanced security system. Advances integrated in this system includes the advances sensor suite, sensible symbology for situational awareness in case of operator intervention, parallax and focus tracking through zoom and sensor changes to enhance auto tracking and motion detection algorithms.

  15. Method of descent for integrable lattices

    NASA Astrophysics Data System (ADS)

    Bogoyavlensky, Oleg

    2009-05-01

    A method of descent for constructing integrable Hamiltonian systems is introduced. The derived periodic and nonperiodic lattices possess Lax representations with spectral parameter and have plenty of first integrals. Examples of Liouville-integrable four-dimensional Hamiltonian Lotka-Volterra systems are presented.

  16. Integrating Multiple Evidence Sources to Predict Adverse Drug Reactions Based on a Systems Pharmacology Model

    PubMed Central

    Cao, D-S; Xiao, N; Li, Y-J; Zeng, W-B; Liang, Y-Z; Lu, A-P; Xu, Q-S; Chen, AF

    2015-01-01

    Identifying potential adverse drug reactions (ADRs) is critically important for drug discovery and public health. Here we developed a multiple evidence fusion (MEF) method for the large-scale prediction of drug ADRs that can handle both approved drugs and novel molecules. MEF is based on the similarity reference by collaborative filtering, and integrates multiple similarity measures from various data types, taking advantage of the complementarity in the data. We used MEF to integrate drug-related and ADR-related data from multiple levels, including the network structural data formed by known drug–ADR relationships, for predicting likely unknown ADRs. On cross-validation, it obtains high sensitivity and specificity, substantially outperforming existing methods that utilize only one or a few data types. We validated our predictions by their overlap with drug–ADR associations that are known in databases. The proposed computational method could be used for complementary hypothesis generation and rapid analysis of potential drug–ADR interactions. PMID:26451329

  17. Integration methods for molecular dynamics

    SciTech Connect

    Leimkuhler, B.J.; Reich, S.; Skeel, R.D.

    1996-12-31

    Classical molecular dynamics simulation of a macromolecule requires the use of an efficient time-stepping scheme that can faithfully approximate the dynamics over many thousands of timesteps. Because these problems are highly nonlinear, accurate approximation of a particular solution trajectory on meaningful time intervals is neither obtainable nor desired, but some restrictions, such as symplecticness, can be imposed on the discretization which tend to imply good long term behavior. The presence of a variety of types and strengths of interatom potentials in standard molecular models places severe restrictions on the timestep for numerical integration used in explicit integration schemes, so much recent research has concentrated on the search for alternatives that possess (1) proper dynamical properties, and (2) a relative insensitivity to the fastest components of the dynamics. We survey several recent approaches. 48 refs., 2 figs.
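The symplecticness restriction mentioned above is exactly what the standard velocity-Verlet scheme provides: its energy error stays bounded over long runs instead of drifting. A minimal sketch on a harmonic oscillator (a stand-in, not a macromolecular force field):

```python
def velocity_verlet(x, v, force, mass, dt, steps):
    """Velocity-Verlet integration: kick-drift-kick, a symplectic scheme."""
    f = force(x)
    for _ in range(steps):
        v += 0.5 * dt * f / mass   # half-kick
        x += dt * v                # drift
        f = force(x)
        v += 0.5 * dt * f / mass   # half-kick
    return x, v

# Harmonic oscillator with k = m = 1; energy E = (v^2 + x^2)/2 starts at 0.5
x, v = velocity_verlet(1.0, 0.0, lambda x: -x, mass=1.0, dt=0.01, steps=10_000)
energy = 0.5 * (v * v + x * x)   # stays near 0.5 even after 10,000 steps
```

An explicit Euler step, by contrast, would gain energy every period, which is why symplecticness rather than raw accuracy is the property emphasized here.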

  18. NEXT Propellant Management System Integration With Multiple Ion Thrusters

    NASA Technical Reports Server (NTRS)

    Sovey, James S.; Soulas, George C.; Herman, Daniel A.

    2011-01-01

    As a critical part of the NEXT test validation process, a multiple-string integration test was performed on the NEXT propellant management system and ion thrusters. The objectives of this test were to verify that the PMS is capable of providing stable flow control to multiple thrusters operating over the NEXT system throttling range and to demonstrate to potential users that the NEXT PMS is ready for transition to flight. A test plan was developed for the sub-system integration test for verification of PMS and thruster system performance and functionality requirements. Propellant management system calibrations were checked during the single and multi-thruster testing. The low pressure assembly total flow rates to the thruster(s) were within 1.4 percent of the calibrated support equipment flow rates. The inlet pressures to the main, cathode, and neutralizer ports of Thruster PM1R were measured as the PMS operated in 1-thruster, 2-thruster, and 3-thruster configurations. It was found that the inlet pressures to Thruster PM1R for 2-thruster and 3-thruster operation as well as single thruster operation with the PMS compare very favorably, indicating that flow rates to Thruster PM1R were similar in all cases. Characterizations of discharge losses, accelerator grid current, and neutralizer performance were performed as more operating thrusters were added to the PMS. There were no variations in these parameters as thrusters were throttled and single and multiple thruster operations were conducted. The propellant management system power consumption was measured at a fixed voltage to the DCIU and a fixed thermal throttle temperature of 75 °C. The total power consumed by the PMS was 10.0, 17.9, and 25.2 W, respectively, for single, 2-thruster, and 3-thruster operation. These sub-system integration tests of the PMS, the DCIU Simulator, and multiple thrusters addressed, in part, the NEXT PMS and propulsion system performance and functionality requirements.

  19. A Collocation Method for Volterra Integral Equations

    NASA Astrophysics Data System (ADS)

    Kolk, Marek

    2010-09-01

    We propose a piecewise polynomial collocation method for solving linear Volterra integral equations of the second kind with logarithmic kernels which, in addition to a diagonal singularity, may have a singularity at the initial point of the interval of integration. An attainable order of the convergence of the method is studied. We illustrate our results with a numerical example.
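The scheme below is a simpler relative of piecewise-polynomial collocation: trapezoidal marching for a second-kind Volterra equation with a smooth kernel. It does not treat the logarithmic or diagonal singularities the paper addresses; the kernel and test problem are illustrative only:

```python
import math

def volterra_trapezoid(f, K, T, n):
    """Solve u(t) = f(t) + ∫_0^t K(t,s) u(s) ds on [0, T] by marching
    with the trapezoidal rule on n equal steps (smooth kernels only)."""
    h = T / n
    t = [i * h for i in range(n + 1)]
    u = [f(t[0])]                      # at t=0 the integral vanishes
    for i in range(1, n + 1):
        ti = t[i]
        s = 0.5 * K(ti, t[0]) * u[0]   # trapezoid endpoints get half weight
        s += sum(K(ti, t[j]) * u[j] for j in range(1, i))
        # the implicit u_i term is moved to the left-hand side and solved for
        u.append((f(ti) + h * s) / (1.0 - 0.5 * h * K(ti, ti)))
    return t, u

# Test problem u(t) = 1 + ∫_0^t u(s) ds with exact solution u(t) = e^t
t, u = volterra_trapezoid(lambda t: 1.0, lambda t, s: 1.0, T=1.0, n=200)
err = abs(u[-1] - math.e)   # trapezoidal accuracy is O(h^2)
```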

  20. Research in Mathematics Education: Multiple Methods for Multiple Uses

    ERIC Educational Resources Information Center

    Battista, Michael; Smith, Margaret S.; Boerst, Timothy; Sutton, John; Confrey, Jere; White, Dorothy; Knuth, Eric; Quander, Judith

    2009-01-01

    Recent federal education policies and reports have generated considerable debate about the meaning, methods, and goals of "scientific research" in mathematics education. Concentrating on the critical problem of determining which educational programs and practices reliably improve students' mathematics achievement, these policies and reports focus…

  1. Multiple cue use and integration in pigeons (Columba livia).

    PubMed

    Legge, Eric L G; Madan, Christopher R; Spetch, Marcia L; Ludvig, Elliot A

    2016-05-01

    Encoding multiple cues can improve the accuracy and reliability of navigation and goal localization. Problems may arise, however, if one cue is displaced and provides information which conflicts with other cues. Here we investigated how pigeons cope with cue conflict by training them to locate a goal relative to two landmarks and then varying the amount of conflict between the landmarks. When the amount of conflict was small, pigeons tended to integrate both cues in their search patterns. When the amount of conflict was large, however, pigeons used information from both cues independently. This context-dependent strategy for resolving spatial cue conflict agrees with Bayes optimal calculations for using information from multiple sources. PMID:26908004
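The Bayes-optimal calculation mentioned above is inverse-variance weighting; the hard conflict threshold below is a simplification of the pigeons' context-dependent strategy, and all numbers are illustrative:

```python
def combine_cues(mu1, var1, mu2, var2, conflict_threshold):
    """Inverse-variance (Bayes-optimal) cue integration with a fallback:
    integrate when the cues roughly agree, keep them separate otherwise."""
    if abs(mu1 - mu2) > conflict_threshold:
        return None  # large conflict: treat each landmark independently
    w1 = (1 / var1) / (1 / var1 + 1 / var2)
    mu = w1 * mu1 + (1 - w1) * mu2           # reliability-weighted position
    var = 1 / (1 / var1 + 1 / var2)          # combined estimate is more precise
    return mu, var

# Small conflict, equally reliable landmarks: search midway, with halved variance
result = combine_cues(0.0, 4.0, 1.0, 4.0, conflict_threshold=5.0)
# result == (0.5, 2.0)
```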

  2. Integrated control system and method

    SciTech Connect

    Wang, Paul Sai Keat; Baldwin, Darryl; Kim, Myoungjin

    2013-10-29

    An integrated control system for use with an engine connected to a generator providing electrical power to a switchgear is disclosed. The engine receives gas produced by a gasifier. The control system includes an electronic controller associated with the gasifier, engine, generator, and switchgear. A gas flow sensor monitors a gas flow from the gasifier to the engine through an engine gas control valve and provides a gas flow signal to the electronic controller. A gas oversupply sensor monitors a gas oversupply from the gasifier and provides an oversupply signal indicative of gas not provided to the engine. A power output sensor monitors a power output of the switchgear and provides a power output signal. The electronic controller changes gas production of the gasifier and the power output rating of the switchgear based on the gas flow signal, the oversupply signal, and the power output signal.

  3. Robust rotational-velocity-Verlet integration methods

    NASA Astrophysics Data System (ADS)

    Rozmanov, Dmitri; Kusalik, Peter G.

    2010-05-01

    Two rotational integration algorithms for rigid-body dynamics are proposed in velocity-Verlet formulation. The first method uses quaternion dynamics and was derived from the original rotational leap-frog method by Svanberg [Mol. Phys. 92, 1085 (1997)]; it produces time-consistent positions and momenta. The second method is also formulated in terms of quaternions, but it is not quaternion specific and can be easily adapted for any other orientational representation. Both methods are tested extensively and compared to existing rotational integrators. The proposed integrators demonstrated performance at least at the level of previously reported rotational algorithms. The choice of simulation parameters is also discussed.

  4. Fast integral methods for integrated optical systems simulations: a review

    NASA Astrophysics Data System (ADS)

    Kleemann, Bernd H.

    2015-09-01

    Boundary integral equation methods (BIM) or simply integral methods (IM) in the context of optical design and simulation are rigorous electromagnetic methods solving Helmholtz or Maxwell equations on the boundary (surface or interface of the structures between two materials) for scattering or/and diffraction purposes. This work is mainly restricted to integral methods for diffracting structures such as gratings, kinoforms, diffractive optical elements (DOEs), micro Fresnel lenses, computer generated holograms (CGHs), holographic or digital phase holograms, periodic lithographic structures, and the like. In most cases all of the mentioned structures have dimensions of thousands of wavelengths in diameter. Therefore, the basic methods necessary for the numerical treatment are locally applied electromagnetic grating diffraction algorithms. Interestingly, integral methods belong to the first electromagnetic methods investigated for grating diffraction. The development started in the mid-1960s for gratings with infinite conductivity, and it was mainly due to the good convergence of the integral methods, especially for TM polarization. The first integral equation methods (IEM) for finite conductivity were the methods by D. Maystre at Fresnel Institute in Marseille: in 1972/74 for dielectric and metallic gratings, and later for multiprofile and other types of gratings and for photonic crystals. Other methods such as differential and modal methods suffered from unstable behaviour and slow convergence compared to BIMs for metallic gratings in TM polarization from the beginning to the mid-1990s. The first BIM for gratings using a parametrization of the profile was developed at Karl-Weierstrass Institute in Berlin under a contract with Carl Zeiss Jena works in 1984-1986 by A. Pomp, J. Creutziger, and the author. Due to the parametrization, this method was able to deal with any kind of surface grating from the beginning: whether profiles with edges, overhanging non

  5. Integrative and regularized principal component analysis of multiple sources of data.

    PubMed

    Liu, Binghui; Shen, Xiaotong; Pan, Wei

    2016-06-15

    Integration of data of disparate types has become increasingly important to enhancing the power for new discoveries by combining complementary strengths of multiple types of data. One application is to uncover tumor subtypes in human cancer research in which multiple types of genomic data are integrated, including gene expression, DNA copy number, and DNA methylation data. In spite of their successes, existing approaches based on joint latent variable models require stringent distributional assumptions and may suffer from unbalanced scales (or units) of different types of data and non-scalability of the corresponding algorithms. In this paper, we propose an alternative based on integrative and regularized principal component analysis, which is distribution-free, computationally efficient, and robust against unbalanced scales. The new method performs dimension reduction simultaneously on multiple types of data, seeking data-adaptive sparsity and scaling. As a result, in addition to feature selection for each type of data, integrative clustering is achieved. Numerically, the proposed method compares favorably against its competitors in terms of accuracy (in identifying hidden clusters), computational efficiency, and robustness against unbalanced scales. In particular, compared with a popular method, the new method was competitive in identifying tumor subtypes associated with distinct patient survival patterns when applied to a combined analysis of DNA copy number, mRNA expression, and DNA methylation data in a glioblastoma multiforme study. Copyright © 2016 John Wiley & Sons, Ltd. PMID:26756854
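The core pipeline can be caricatured with fixed scaling and a fixed sparsity threshold; the paper's method learns both adaptively, so the sketch below is illustrative only, with made-up data blocks standing in for expression and copy-number matrices:

```python
import numpy as np

def integrative_pca(blocks, n_components=2, threshold=0.1):
    """Distribution-free sketch of integrative sparse PCA: rescale each
    data type to unit Frobenius norm (guarding against unbalanced scales),
    concatenate features over the shared samples, then take sparse
    principal components by soft-thresholding the SVD loadings."""
    scaled = [b / np.linalg.norm(b) for b in blocks]       # same rows (samples)
    X = np.hstack(scaled)
    X = X - X.mean(axis=0)                                 # center features
    U, S, Vt = np.linalg.svd(X, full_matrices=False)
    V = Vt[:n_components].T
    V = np.sign(V) * np.maximum(np.abs(V) - threshold, 0)  # sparsify loadings
    return X @ V   # low-dimensional scores used for integrative clustering

rng = np.random.default_rng(0)
expr = rng.normal(size=(20, 50))          # e.g. mRNA expression (arbitrary units)
cnv  = 1000 * rng.normal(size=(20, 30))   # e.g. copy number on a very different scale
scores = integrative_pca([expr, cnv])     # one 2-D embedding for both data types
```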

  6. Integrative Data Analysis: The Simultaneous Analysis of Multiple Data Sets

    PubMed Central

    Curran, Patrick J.; Hussong, Andrea M.

    2009-01-01

    Both quantitative and methodological techniques exist that foster the development and maintenance of a cumulative knowledge base within the psychological sciences. Most noteworthy of these techniques is meta-analysis, which allows for the synthesis of summary statistics drawn from multiple studies when the original data are not available. However, when the original data can be obtained from multiple studies, many advantages stem from the statistical analysis of the pooled data. The authors define integrative data analysis (IDA) as the analysis of multiple data sets that have been pooled into one. Although variants of IDA have been incorporated into other scientific disciplines, the use of these techniques is much less evident in psychology. In this paper the authors present an overview of IDA as it may be applied within the psychological sciences, a discussion of the relative advantages and disadvantages of IDA, a description of analytic strategies for analyzing pooled individual data, and recommendations for the use of IDA in practice. PMID:19485623

  7. EMERGY METHODS: VALUABLE INTEGRATED ASSESSMENT TOOLS

    EPA Science Inventory

    NHEERL's Atlantic Ecology Division is investigating emergy methods as tools for integrated assessment in several projects evaluating environmental impacts, policies, and alternatives for remediation and intervention. Emergy accounting is a methodology that provides a quantitative...

  8. Multiple attenuation to reflection seismic data using Radon filter and Wave Equation Multiple Rejection (WEMR) method

    SciTech Connect

    Erlangga, Mokhammad Puput

    2015-04-16

    Separation between signal and noise, incoherent or coherent, is important in seismic data processing. Even after processing, coherent noise may remain mixed with the primary signal; multiple reflections are one kind of coherent noise. In this research, we processed seismic data to attenuate multiple reflections in both synthetic and real seismic data from Mentawai. There are several methods to attenuate multiple reflections; one of them is the Radon filter method, which discriminates between primary and multiple reflections in the τ-p domain based on the move-out difference between them. However, where the move-out difference is too small, the Radon filter is not enough to attenuate the multiple reflections, and it also produces artifacts on the gathers. In addition to the Radon filter, we use the Wave Equation Multiple Rejection (WEMR) method to attenuate long-period multiple reflections based on wave-equation inversion. From the inversion of the wave equation and the magnitude of the seismic wave amplitude observed at the free surface, we obtain the water-bottom reflectivity, which is used to eliminate the multiple reflections. The WEMR method does not depend on the move-out difference to attenuate long-period multiple reflections, and can therefore be applied to seismic data with small move-out differences such as the Mentawai data, where the small move-out difference is caused by the restricted far offset of only 705 meters. We compared the real multiple-free stacked data after processing with the Radon filter and with WEMR. The conclusion is that the WEMR method attenuates long-period multiple reflections better than the Radon filter method on the real (Mentawai) seismic data.

  9. Integrability: mathematical methods for studying solitary waves theory

    NASA Astrophysics Data System (ADS)

    Wazwaz, Abdul-Majid

    2014-03-01

    In recent decades, substantial experimental research efforts have been devoted to linear and nonlinear physical phenomena. In particular, studies of integrable nonlinear equations in solitary waves theory have attracted intensive interest from mathematicians, with the principal goal of fostering the development of new methods, and physicists, who are seeking solutions that represent physical phenomena and to form a bridge between mathematical results and scientific structures. The aim for both groups is to build up our current understanding and facilitate future developments, develop more creative results and create new trends in the rapidly developing field of solitary waves. The notion of the integrability of certain partial differential equations occupies an important role in current and future trends, but a unified rigorous definition of the integrability of differential equations still does not exist. For example, an integrable model in the Painlevé sense may not be integrable in the Lax sense. The Painlevé sense indicates that the solution can be represented as a Laurent series in powers of some function that vanishes on an arbitrary surface with the possibility of truncating the Laurent series at finite powers of this function. The concept of Lax pairs introduces another meaning of the notion of integrability. The Lax pair formulates the integrability of nonlinear equation as the compatibility condition of two linear equations. However, it was shown by many researchers that the necessary integrability conditions are the existence of an infinite series of generalized symmetries or conservation laws for the given equation. The existence of multiple soliton solutions often indicates the integrability of the equation but other tests, such as the Painlevé test or the Lax pair, are necessary to confirm the integrability for any equation. In the context of completely integrable equations, studies are flourishing because these equations are able to describe the
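    The Lax-pair notion of integrability mentioned above admits a compact "zero-curvature" formulation; the following is a standard textbook identity, not specific to any particular equation discussed here:

```latex
% Auxiliary linear problems for the wavefunction \psi,
% with spectral parameter \lambda:
\[
  \psi_x = X(x,t;\lambda)\,\psi, \qquad \psi_t = T(x,t;\lambda)\,\psi .
\]
% Demanding \psi_{xt} = \psi_{tx} for all \lambda yields the
% zero-curvature (compatibility) condition
\[
  X_t - T_x + [X, T] = 0 ,
\]
% a nonlinear PDE for the entries of X and T; equivalently, in operator
% form, L_t = [A, L] for a Lax pair (L, A). The KdV equation, for
% instance, arises from a suitable 2x2 matrix pair (X, T).
```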

  10. Achieving Integration in Mixed Methods Designs—Principles and Practices

    PubMed Central

    Fetters, Michael D; Curry, Leslie A; Creswell, John W

    2013-01-01

    Mixed methods research offers powerful tools for investigating complex processes and systems in health and health care. This article describes integration principles and practices at three levels in mixed methods research and provides illustrative examples. Integration at the study design level occurs through three basic mixed method designs—exploratory sequential, explanatory sequential, and convergent—and through four advanced frameworks—multistage, intervention, case study, and participatory. Integration at the methods level occurs through four approaches. In connecting, one database links to the other through sampling. With building, one database informs the data collection approach of the other. When merging, the two databases are brought together for analysis. With embedding, data collection and analysis link at multiple points. Integration at the interpretation and reporting level occurs through narrative, data transformation, and joint display. The fit of integration describes the extent to which the qualitative and quantitative findings cohere. Understanding these principles and practices of integration can help health services researchers leverage the strengths of mixed methods. PMID:24279835

  11. Methods for biological data integration: perspectives and challenges

    PubMed Central

    Gligorijević, Vladimir; Pržulj, Nataša

    2015-01-01

    Rapid technological advances have led to the production of different types of biological data and enabled construction of complex networks with various types of interactions between diverse biological entities. Standard network data analysis methods were shown to be limited in dealing with such heterogeneous networked data and consequently, new methods for integrative data analyses have been proposed. The integrative methods can collectively mine multiple types of biological data and produce more holistic, systems-level biological insights. We survey recent methods for collective mining (integration) of various types of networked biological data. We compare different state-of-the-art methods for data integration and highlight their advantages and disadvantages in addressing important biological problems. We identify the important computational challenges of these methods and provide a general guideline for which methods are suited for specific biological problems, or specific data types. Moreover, we propose that recent non-negative matrix factorization-based approaches may become the integration methodology of choice, as they are well suited and accurate in dealing with heterogeneous data and have many opportunities for further development. PMID:26490630
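    The integrative non-negative matrix factorization idea the survey highlights can be sketched by jointly factorizing two data matrices over the same set of entities with a shared basis. The `joint_nmf` function below uses generic multiplicative updates on the summed Frobenius objective; it is a common textbook formulation, not code from any surveyed method:

```python
import numpy as np

rng = np.random.default_rng(0)

def joint_nmf(X1, X2, k, n_iter=1000, eps=1e-9):
    """Factor X1 ~ W @ H1 and X2 ~ W @ H2 with a shared basis W.

    Multiplicative updates minimize ||X1 - W H1||_F^2 + ||X2 - W H2||_F^2
    subject to non-negativity; the shared W couples the two data types.
    """
    n = X1.shape[0]
    W = rng.random((n, k)) + eps
    H1 = rng.random((k, X1.shape[1])) + eps
    H2 = rng.random((k, X2.shape[1])) + eps
    for _ in range(n_iter):
        H1 *= (W.T @ X1) / (W.T @ W @ H1 + eps)
        H2 *= (W.T @ X2) / (W.T @ W @ H2 + eps)
        # W update aggregates evidence from both data matrices
        num = X1 @ H1.T + X2 @ H2.T
        den = W @ (H1 @ H1.T + H2 @ H2.T) + eps
        W *= num / den
    return W, H1, H2
```

The shared factor W plays the role of an integrated, low-dimensional description of the entities across both data types.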

  12. Quadrature rules with multiple nodes for evaluating integrals with strong singularities

    NASA Astrophysics Data System (ADS)

    Milovanovic, Gradimir V.; Spalevic, Miodrag M.

    2006-05-01

    We present a method based on the Chakalov-Popoviciu quadrature formula of Lobatto type, a rather general case of quadrature with multiple nodes, for approximating integrals defined by Cauchy principal values or by Hadamard finite parts. As a starting point we use the results obtained by L. Gori and E. Santi (cf. On the evaluation of Hilbert transforms by means of a particular class of Turan quadrature rules, Numer. Algorithms 10 (1995), 27-39; Quadrature rules based on s-orthogonal polynomials for evaluating integrals with strong singularities, Oberwolfach Proceedings: Applications and Computation of Orthogonal Polynomials, ISNM 131, Birkhauser, Basel, 1999, pp. 109-119). We generalize their results by using some of our numerical procedures for the stable calculation of quadrature formulae with multiple nodes of Gaussian type, together with our proposed methods for estimating the remainder term in this type of quadrature formula. Numerical examples, illustrations and comparisons are also shown.
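    The kind of integral involved here can be made concrete with the classical singularity-subtraction evaluation of a Cauchy principal value; this is a generic illustration of handling the strong singularity, not the Chakalov-Popoviciu rule with multiple nodes studied in the paper:

```python
import numpy as np

def cauchy_pv(f, c, n=2001):
    """PV integral of f(x)/(x - c) over [-1, 1], with -1 < c < 1.

    Subtracting f(c) regularizes the integrand; the subtracted part has
    the closed form f(c) * log((1 - c)/(1 + c)), and the smooth
    remainder is integrated with the composite trapezoidal rule.
    """
    x = np.linspace(-1.0, 1.0, n)
    g = np.empty_like(x)
    mask = np.abs(x - c) > 1e-12
    g[mask] = (f(x[mask]) - f(c)) / (x[mask] - c)
    # the regularized integrand tends to f'(c) at x = c; a central
    # difference quotient is accurate enough for this sketch
    h = 1e-6
    g[~mask] = (f(c + h) - f(c - h)) / (2 * h)
    dx = x[1] - x[0]
    trap = 0.5 * np.sum(g[:-1] + g[1:]) * dx
    return trap + f(c) * np.log((1 - c) / (1 + c))
```

For example, with f(x) = x and c = 0 the regularized integrand is identically 1, so the principal value is exactly 2.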

  13. Multiple Integrated Complementary Healing Approaches: Energetics & Light for bone.

    PubMed

    Gray, Michael G; Lackey, Brett R; Patrick, Evelyn F; Gray, Sandra L; Hurley, Susan G

    2016-01-01

    A synergistic-healing strategy that combines molecular targeting within a system-wide perspective is presented as the Multiple Integrated Complementary Healing Approaches: Energetics And Light (MICHAEL). The basis of the MICHAEL approach is the realization that environmental, nutritional and electromagnetic factors form a regulatory framework involved in bone and nerve healing. The interactions of light, energy, and nutrition with neural, hormonal and cellular pathways will be presented. Energetic therapies including electrical, low-intensity pulsed ultrasound and light based treatments affect growth, differentiation and proliferation of bone and nerve and can be utilized for their healing benefits. However, the benefits of these therapies can be impaired by the absence of nutritional, hormonal and organismal factors. For example, lack of sleep, disrupted circadian rhythms and vitamin-D deficiency can impair healing. Molecular targets, such as the Wnt pathway, protein kinase B and glucocorticoid signaling systems can be modulated by nutritional components, including quercetin, curcumin and Mg(2+) to enhance the healing process. The importance of water and water-regulation will be presented as an integral component. The effects of exercise and acupuncture on bone healing will also be discussed within the context of the MICHAEL approach. PMID:26804592

  14. Tools and Models for Integrating Multiple Cellular Networks

    SciTech Connect

    Gerstein, Mark

    2015-11-06

    In this grant, we have systematically investigated the integrated networks, which are responsible for the coordination of activity between metabolic pathways in prokaryotes. We have developed several computational tools to analyze the topology of the integrated networks consisting of metabolic, regulatory, and physical interaction networks. The tools are all open-source, available for download from GitHub, and can be incorporated in the Knowledgebase. Here, we summarize our work as follows. Understanding the topology of the integrated networks is the first step toward understanding their dynamics and evolution. For Aim 1 of this grant, we have developed a novel algorithm to determine and measure the hierarchical structure of transcriptional regulatory networks [1]. The hierarchy captures the direction of information flow in the network. The algorithm is generally applicable to regulatory networks in prokaryotes, yeast and higher organisms. Integrated datasets are extremely beneficial in understanding the biology of a system in a compact manner due to the conflation of multiple layers of information. Therefore, for Aim 2 of this grant, we have developed several tools and carried out analyses for integrating system-wide genomic information. To make use of the structural data, we have developed DynaSIN for protein-protein interaction networks with various dynamical interfaces [2]. We then examined the association between network topology and phenotypic effects such as gene essentiality. In particular, we have organized the E. coli and S. cerevisiae transcriptional regulatory networks into hierarchies. We then correlated gene phenotypic effects by tinkering with different layers to elucidate which layers were more tolerant to perturbations [3].
In the context of evolution, we also developed a workflow to guide the comparison between different types of biological networks across various species using the concept of rewiring [4]. Furthermore, we have developed
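    The hierarchical organization of a regulatory network described in Aim 1 can be sketched, under the simplifying assumption of an acyclic toy network, by assigning each gene the length of the longest regulatory path from a top-level regulator; the published algorithm [1] is more general (it must handle cycles, among other things), so this is only an illustration of the layering idea:

```python
from collections import defaultdict

def hierarchy_levels(edges):
    """Assign each node a level: the longest regulatory path from a
    top regulator (a node with no incoming edges). Assumes a DAG."""
    nodes = {u for e in edges for u in e}
    indeg = {v: 0 for v in nodes}
    out = defaultdict(list)
    for u, v in edges:
        out[u].append(v)
        indeg[v] += 1
    level = {v: 0 for v in nodes}
    # Kahn's topological order, tracking the longest path to each node
    queue = [v for v in nodes if indeg[v] == 0]
    while queue:
        u = queue.pop()
        for v in out[u]:
            level[v] = max(level[v], level[u] + 1)
            indeg[v] -= 1
            if indeg[v] == 0:
                queue.append(v)
    return level
```

Levels then read as top (master regulators), middle, and bottom (regulated targets), with edges predominantly pointing down the hierarchy.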

  15. Integral Deferred Correction methods for scientific computing

    NASA Astrophysics Data System (ADS)

    Morton, Maureen Marilla

    Since high order numerical methods frequently can attain accurate solutions more efficiently than low order methods, we develop and analyze new high order numerical integrators for the time discretization of ordinary and partial differential equations. Our novel methods address some of the issues surrounding high order numerical time integration, such as the difficulty of many popular methods' construction and handling the effects of disparate behaviors produced by different terms in the equations to be solved. We are motivated by the simplicity of how Deferred Correction (DC) methods achieve high order accuracy [72, 27]. DC methods are numerical time integrators that, rather than calculating tedious coefficients for order conditions, instead construct high order accurate solutions by iteratively improving a low order preliminary numerical solution. With each iteration, an error equation is solved, the error decreases, and the order of accuracy increases. Later, DC methods were adjusted to include an integral formulation of the residual, which stabilizes the method. These Spectral Deferred Correction (SDC) methods [25] motivated Integral Deferred Correction (IDC) methods. Typically, SDC methods are limited to increasing the order of accuracy by one with each iteration due to smoothness properties imposed by the grid spacing. However, under mild assumptions, explicit IDC methods allow for any explicit rth order Runge-Kutta (RK) method to be used within each iteration, and then an order of accuracy increase of r is attained after each iteration [18]. We extend these results to the construction of implicit IDC methods that use implicit RK methods, and we prove analogous results for order of convergence. One means of solving equations with disparate parts is by semi-implicit integrators, handling a "fast" part implicitly and a "slow" part explicitly.
We incorporate additive RK (ARK) integrators into the iterations of IDC methods in order to construct new arbitrary order
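    The correction sweep at the heart of SDC/IDC methods can be sketched for the scalar test problem u' = λu: a forward-Euler provisional solution is improved by repeatedly solving the integral form of the error equation, raising the order with each sweep. This is a minimal explicit illustration, not the implicit or ARK-based variants the thesis constructs:

```python
import numpy as np

def integration_matrix(nodes):
    """S[m, j] = integral of the j-th Lagrange basis polynomial over
    [nodes[m], nodes[m+1]] (spectral integration of the interpolant)."""
    M = len(nodes)
    S = np.zeros((M - 1, M))
    for j in range(M):
        card = np.zeros(M)
        card[j] = 1.0
        P = np.polyint(np.polyfit(nodes, card, M - 1))
        for m in range(M - 1):
            S[m, j] = np.polyval(P, nodes[m + 1]) - np.polyval(P, nodes[m])
    return S

def sdc(lam, nodes, sweeps):
    """Solve u' = lam*u, u(nodes[0]) = 1, by forward-Euler SDC sweeps."""
    h = np.diff(nodes)
    S = integration_matrix(nodes)
    u = np.ones(len(nodes))
    for m in range(len(h)):                  # provisional forward Euler
        u[m + 1] = u[m] + h[m] * lam * u[m]
    for _ in range(sweeps):                  # each sweep raises the order
        I = S @ (lam * u)                    # integrals of the interpolated RHS
        unew = u.copy()
        for m in range(len(h)):
            unew[m + 1] = unew[m] + h[m] * lam * (unew[m] - u[m]) + I[m]
        u = unew
    return u
```

The sweeps converge to the collocation solution on the chosen nodes; replacing the Euler substep by an rth-order RK method is what lets IDC gain r orders per sweep.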

  16. A survey of payload integration methods

    NASA Technical Reports Server (NTRS)

    Engels, R. C.; Harcrow, H. W.

    1981-01-01

    The most prominent payload integration methods are presented and evaluated. The paper outlines the problem and some of the difficulties encountered when analyzing a coupled booster/payload system. Descriptions of both full-scale and short-cut methods are given together with an assessment of their strengths and weaknesses. Finally, an extensive list of references is included.

  17. Integrity of hypothalamic fibers and cognitive fatigue in multiple sclerosis.

    PubMed

    Hanken, Katrin; Eling, Paul; Kastrup, Andreas; Klein, Jan; Hildebrandt, Helmut

    2015-01-01

    Cognitive fatigue is a common and disabling symptom of multiple sclerosis (MS), but little is known about its pathophysiology. The present study investigated whether the posterior hypothalamus, which is considered the waking center, is associated with MS-related cognitive fatigue. We analyzed the integrity of posterior hypothalamic fibers in 49 patients with relapsing-remitting MS and 14 healthy controls. Diffusion tensor imaging (DTI) parameters were calculated for fibers between the posterior hypothalamus and, respectively, the mesencephalon, pons and prefrontal cortex. In addition, DTI parameters were computed for fibers between the anterior hypothalamus and these regions and for the corpus callosum. Cognitive fatigue was assessed using the Fatigue Scale for Motor and Cognitive Functions. Analyses of variance with repeated measures were performed to investigate the impact of cognitive fatigue on diffusion parameters. Cognitively fatigued patients (75.5%) showed a significantly lower mean axial and radial diffusivity for fibers between the posterior hypothalamus and the mesencephalon than cognitively non-fatigued patients (Group(⁎)Target area(⁎)Diffusion orientation: F=4.047; p=0.023). For fibers of the corpus callosum, MS patients presented significantly higher axial and radial diffusivity than healthy controls (Group(⁎)Diffusion orientation: F=9.904; p<0.001). Depressive mood, used as covariate, revealed significant interaction effects for anterior hypothalamic fibers (Target area(⁎)Diffusion orientation(⁎)Depression: F=5.882; p=0.021; Hemisphere(⁎)Diffusion orientation(⁎)Depression: F=8.744; p=0.008). Changes in integrity of fibers between the posterior hypothalamus and the mesencephalon appear to be associated with MS-related cognitive fatigue. These changes might cause an altered modulation of hypothalamic centers responsible for wakefulness. Furthermore, integrity of anterior hypothalamic fibers might be related to depression in MS. PMID

  18. Multiple Shooting-Local Linearization method for the identification of dynamical systems

    NASA Astrophysics Data System (ADS)

    Carbonell, F.; Iturria-Medina, Y.; Jimenez, J. C.

    2016-08-01

    The combination of the multiple shooting strategy with the generalized Gauss-Newton algorithm has become a recognized method for estimating parameters in ordinary differential equations (ODEs) from noisy discrete observations. A key issue for an efficient implementation of this method is the accurate integration of the ODE and the evaluation of the derivatives involved in the optimization algorithm. In this paper, we study the feasibility of the Local Linearization (LL) approach for the simultaneous numerical integration of the ODE and the evaluation of such derivatives. This integration approach results in a stable method for the accurate approximation of the derivatives with no more computational cost than that involved in the integration of the ODE. The numerical simulations show that the proposed Multiple Shooting-Local Linearization method recovers the true parameter values under different scenarios of noisy data.
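    The multiple shooting plus Gauss-Newton scheme can be sketched for the linear test equation u' = θu, whose flow is known in closed form; the unknowns are the parameter θ and the segment initial states, and the residuals combine data mismatch with continuity defects between segments. The data, noise level, and segmentation below are invented for the illustration, and a real implementation would integrate the ODE and its sensitivities numerically (e.g., with the Local Linearization scheme the paper studies):

```python
import numpy as np

rng = np.random.default_rng(1)
theta_true, dt, K = -0.7, 0.5, 8
t = np.arange(K + 1) * dt
y = 2.0 * np.exp(theta_true * t) + 0.01 * rng.standard_normal(K + 1)

def residuals(p):
    theta, s = p[0], p[1:]
    g = np.exp(theta * dt)                   # one-step flow of u' = theta*u
    r_data = np.concatenate([s - y[:K], [s[K - 1] * g - y[K]]])
    r_cont = s[:K - 1] * g - s[1:]           # continuity defects
    return np.concatenate([r_data, r_cont])

def jacobian(p):
    theta, s = p[0], p[1:]
    g = np.exp(theta * dt)
    J = np.zeros((2 * K, K + 1))
    for i in range(K):                       # d(s_i - y_i)/ds_i
        J[i, 1 + i] = 1.0
    J[K, 0] = s[K - 1] * dt * g              # last data point vs. theta
    J[K, K] = g
    for i in range(K - 1):                   # continuity rows
        J[K + 1 + i, 0] = s[i] * dt * g
        J[K + 1 + i, 1 + i] = g
        J[K + 1 + i, 2 + i] = -1.0
    return J

p = np.concatenate([[-0.3], y[:K]])          # crude initial guess
for _ in range(20):                          # Gauss-Newton iterations
    p -= np.linalg.lstsq(jacobian(p), residuals(p), rcond=None)[0]
```

Because each segment starts from its own unknown state, the residuals stay well behaved even when the initial parameter guess is poor, which is the practical advantage of multiple shooting over single shooting.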

  19. Upcoming challenges for multiple sequence alignment methods in the high-throughput era

    PubMed Central

    Kemena, Carsten; Notredame, Cedric

    2009-01-01

    This review focuses on recent trends in multiple sequence alignment tools. It describes the latest algorithmic improvements including the extension of consistency-based methods to the problem of template-based multiple sequence alignments. Some results are presented suggesting that template-based methods are significantly more accurate than simpler alternative methods. The validation of existing methods is also discussed at length with the detailed description of recent results and some suggestions for future validation strategies. The last part of the review addresses future challenges for multiple sequence alignment methods in the genomic era, most notably the need to cope with very large sequences, the need to integrate large amounts of experimental data, the need to accurately align non-coding and non-transcribed sequences and finally, the need to integrate many alternative methods and approaches. Contact: cedric.notredame@crg.es PMID:19648142

  20. Efficient integration method for fictitious domain approaches

    NASA Astrophysics Data System (ADS)

    Duczek, Sascha; Gabbert, Ulrich

    2015-10-01

    In the current article, we present an efficient and accurate numerical method for the integration of the system matrices in fictitious domain approaches such as the finite cell method (FCM). In the framework of the FCM, the physical domain is embedded in a geometrically larger domain of simple shape which is discretized using a regular Cartesian grid of cells. Therefore, a spacetree-based adaptive quadrature technique is normally deployed to resolve the geometry of the structure. Depending on the complexity of the structure under investigation, this method accounts for most of the computational effort. To reduce the computational costs of computing the system matrices, an efficient quadrature scheme based on the divergence theorem (Gauß-Ostrogradsky theorem) is proposed. Using this theorem, the dimension of the integral is reduced by one, i.e., instead of integrating over the whole domain, only its contour needs to be considered. In the current paper, we present the general principles of the integration method and its implementation. The results for several two-dimensional benchmark problems highlight its properties. The efficiency of the proposed method is compared to conventional spacetree-based integration techniques.
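    The dimension-reduction idea behind this quadrature can be illustrated in two dimensions: choosing F = (F_x, 0) with ∂F_x/∂x = f turns the area integral of f into a contour integral of F_x dy over the boundary alone. The discretization below is a simple midpoint rule on a polygonal boundary, a sketch of the principle rather than the FCM scheme itself:

```python
import numpy as np

def boundary_integral(fx_anti, xs, ys):
    """Integrate f over the region enclosed by a closed polyline
    (counter-clockwise) using the divergence theorem: with F = (Fx, 0)
    and dFx/dx = f, the area integral of f equals the contour integral
    of Fx dy. fx_anti is the antiderivative of f in x (f=1 -> Fx=x)."""
    xm = 0.5 * (xs[1:] + xs[:-1])            # midpoint rule per segment
    dy = np.diff(ys)
    return np.sum(fx_anti(xm) * dy)

theta = np.linspace(0.0, 2 * np.pi, 4001)    # unit circle, CCW
xs, ys = np.cos(theta), np.sin(theta)
area = boundary_integral(lambda x: x, xs, ys)            # integral of 1
moment = boundary_integral(lambda x: x**2 / 2, xs, ys)   # integral of x
```

For f = 1 this reduces to the shoelace formula for polygon area, which is why only the contour, not a volumetric spacetree decomposition, needs to be resolved.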

  1. Students' Use of "Look Back" Strategies in Multiple Solution Methods

    ERIC Educational Resources Information Center

    Lee, Shin-Yi

    2016-01-01

    The purpose of this study was to investigate the relationship between both 9th-grade and 1st-year undergraduate students' use of "look back" strategies and problem solving performance in multiple solution methods, the difference in their use of look back strategies and problem solving performance in multiple solution methods, and the…

  2. Multiple tag labeling method for DNA sequencing

    DOEpatents

    Mathies, R.A.; Huang, X.C.; Quesada, M.A.

    1995-07-25

    A DNA sequencing method is described which uses single lane or channel electrophoresis. Sequencing fragments are separated in the lane and detected using a laser-excited, confocal fluorescence scanner. Each set of DNA sequencing fragments is separated in the same lane and then distinguished using a binary coding scheme employing only two different fluorescent labels. Also described is a method of using radioisotope labels. 5 figs.
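    The two-label binary coding idea can be sketched as a lookup table; the particular base-to-dye assignment below is hypothetical (the patent's chemistry determines the real mapping, and a base encoded as "no label" would in practice be inferred from band position rather than detected directly):

```python
# Hypothetical two-dye binary code: each base-specific sequencing
# reaction is tagged with a presence/absence pattern of two fluorophores,
# so four bases are distinguished with only two labels in one lane.
CODE = {"A": (1, 0), "C": (0, 1), "G": (1, 1), "T": (0, 0)}
DECODE = {v: k for k, v in CODE.items()}

def decode_lane(detections):
    """detections: list of (dye1_present, dye2_present) per band, in
    electrophoretic (fragment-size) order; returns the base calls."""
    return "".join(DECODE[d] for d in detections)
```

Two bits per band suffice for a four-letter alphabet, which is what lets all four termination reactions share a single electrophoresis lane.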

  3. Multiple tag labeling method for DNA sequencing

    DOEpatents

    Mathies, Richard A.; Huang, Xiaohua C.; Quesada, Mark A.

    1995-01-01

    A DNA sequencing method is described which uses single lane or channel electrophoresis. Sequencing fragments are separated in said lane and detected using a laser-excited, confocal fluorescence scanner. Each set of DNA sequencing fragments is separated in the same lane and then distinguished using a binary coding scheme employing only two different fluorescent labels. Also described is a method of using radioisotope labels.

  4. A method for assurance of image integrity in CAD-PACS integration

    NASA Astrophysics Data System (ADS)

    Zhou, Zheng

    2007-03-01

    Computer Aided Detection/Diagnosis (CAD) can greatly assist in the clinical decision making process, and therefore, has drawn tremendous research efforts. However, integrating independent CAD workstation results with the clinical diagnostic workflow still remains challenging. We have presented a CAD-PACS integration toolkit that complies with DICOM standard and IHE profiles. One major issue in CAD-PACS integration is the security of the images used in CAD post-processing and the corresponding CAD result images. In this paper, we present a method for assuring the integrity of both DICOM images used in CAD post-processing and the CAD image results that are in BMP or JPEG format. The method is evaluated in a PACS simulator that simulates clinical PACS workflow. It can also be applied to multiple CAD applications that are integrated with the PACS simulator. The successful development and evaluation of this method will provide a useful approach for assuring image integrity of the CAD-PACS integration in clinical diagnosis.
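    One minimal way to assure image integrity, sketched below, is to store a cryptographic digest with each image and re-verify it downstream; this illustrates only the integrity-checking idea, and the paper's method (like DICOM's own digital-signature machinery) involves more than a bare hash:

```python
import hashlib

def integrity_record(image_bytes):
    """SHA-256 digest to be stored alongside the CAD input or result
    image (DICOM, BMP, or JPEG alike, since we hash raw bytes)."""
    return hashlib.sha256(image_bytes).hexdigest()

def verify(image_bytes, stored_digest):
    """Recompute the digest at the receiving workstation and compare;
    any modification of the image invalidates the stored digest."""
    return integrity_record(image_bytes) == stored_digest
```

A full solution would additionally sign the digest so that an attacker cannot simply replace both the image and its hash.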

  5. Evaluation of Scheduling Methods for Multiple Runways

    NASA Technical Reports Server (NTRS)

    Bolender, Michael A.; Slater, G. L.

    1996-01-01

    Several scheduling strategies are analyzed in order to determine the most efficient means of scheduling aircraft when multiple runways are operational and the airport is operating at different utilization rates. The study compares simulation data for two and three runway scenarios to results from queuing theory for an M/D/n queue. The direction taken, however, is not to do a steady-state, or equilibrium, analysis since this is not the case during a rush period at a typical airport. Instead, a transient analysis of the delay per aircraft is performed. It is shown that the scheduling strategy that reduces the delay depends upon the density of the arrival traffic. For light traffic, scheduling aircraft to their preferred runways is sufficient; however, as the arrival rate increases, it becomes more important to separate traffic by weight class. Significant delay reduction is realized when aircraft that belong to the heavy and small weight classes are sent to separate runways with large aircraft put into the 'best' landing slot.
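    The transient delay measure described above can be reproduced with a small event-driven simulation of an M/D/n queue (Poisson arrivals, deterministic service, n parallel runways); the rates and horizon below are arbitrary illustration values, not the study's traffic scenarios:

```python
import heapq
import random

def mdn_delay(arrival_rate, service_time, n_servers, horizon, seed=0):
    """Mean waiting time (delay before service) for an M/D/n queue
    simulated over a finite 'rush period' -- a transient measure,
    not the steady-state queueing-theory value."""
    rng = random.Random(seed)
    free_at = [0.0] * n_servers      # next time each runway is free
    heapq.heapify(free_at)
    t, delays = 0.0, []
    while True:
        t += rng.expovariate(arrival_rate)   # Poisson arrival stream
        if t > horizon:
            break
        earliest = heapq.heappop(free_at)    # first runway to come free
        start = max(t, earliest)
        delays.append(start - t)
        heapq.heappush(free_at, start + service_time)
    return sum(delays) / len(delays) if delays else 0.0
```

Raising the arrival rate toward the n/service_time capacity makes the transient delay grow sharply, which is the regime where runway assignment strategy starts to matter.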

  6. 77 FR 39735 - Certain Integrated Circuit Packages Provided With Multiple Heat-Conducting Paths and Products...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-05

    ... COMMISSION Certain Integrated Circuit Packages Provided With Multiple Heat- Conducting Paths and Products... the sale within the United States after importation of certain integrated circuit packages provided... integrated circuit packages provided with multiple heat-conducting paths and products containing same...

  7. MSblender: a probabilistic approach for integrating peptide identifications from multiple database search engines

    PubMed Central

    Kwon, Taejoon; Choi, Hyungwon; Vogel, Christine; Nesvizhskii, Alexey I.; Marcotte, Edward M.

    2011-01-01

    Shotgun proteomics using mass spectrometry is a powerful method for protein identification but suffers limited sensitivity in complex samples. Integrating peptide identifications from multiple database search engines is a promising strategy to increase the number of peptide identifications and reduce the volume of unassigned tandem mass spectra. Existing methods pool statistical significance scores such as p-values or posterior probabilities of peptide-spectrum matches (PSMs) from multiple search engines after high scoring peptides have been assigned to spectra, but these methods lack reliable control of identification error rates as data are integrated from different search engines. We developed a statistically coherent method for integrative analysis, termed MSblender. MSblender converts raw search scores from search engines into a probability score for all possible PSMs and properly accounts for the correlation between search scores. The method reliably estimates false discovery rates and identifies more PSMs than any single search engine at the same false discovery rate. Increased identifications increment spectral counts for all detected proteins and allow quantification of proteins that would not have been quantified by individual search engines. We also demonstrate that enhanced quantification contributes to improve sensitivity in differential expression analyses. PMID:21488652
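    The false discovery rate control that integrative methods must get right can be sketched with the standard target-decoy approach; the naive z-score combination shown here is only a baseline illustration, since MSblender's actual contribution is a probabilistic model that accounts for the correlation between engine scores:

```python
import numpy as np

def combine(scores_a, scores_b):
    """Naive score combination: average of per-engine z-scores.
    (MSblender instead fits a joint probabilistic model; this baseline
    ignores the correlation between the two engines' scores.)"""
    za = (scores_a - scores_a.mean()) / scores_a.std()
    zb = (scores_b - scores_b.mean()) / scores_b.std()
    return (za + zb) / 2.0

def fdr_at_threshold(scores, is_decoy, threshold):
    """Estimate the FDR among PSMs scoring >= threshold by the
    target-decoy approach: decoy hits estimate the false-match rate."""
    sel = scores >= threshold
    n_decoy = int(np.sum(is_decoy[sel]))
    n_target = int(np.sum(~is_decoy[sel]))
    return n_decoy / max(n_target, 1)
```

Sweeping the threshold and reporting the identification count at a fixed estimated FDR is how one checks whether a combined score really yields more PSMs than either engine alone.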

  8. MSblender: A probabilistic approach for integrating peptide identifications from multiple database search engines.

    PubMed

    Kwon, Taejoon; Choi, Hyungwon; Vogel, Christine; Nesvizhskii, Alexey I; Marcotte, Edward M

    2011-07-01

    Shotgun proteomics using mass spectrometry is a powerful method for protein identification but suffers limited sensitivity in complex samples. Integrating peptide identifications from multiple database search engines is a promising strategy to increase the number of peptide identifications and reduce the volume of unassigned tandem mass spectra. Existing methods pool statistical significance scores such as p-values or posterior probabilities of peptide-spectrum matches (PSMs) from multiple search engines after high scoring peptides have been assigned to spectra, but these methods lack reliable control of identification error rates as data are integrated from different search engines. We developed a statistically coherent method for integrative analysis, termed MSblender. MSblender converts raw search scores from search engines into a probability score for every possible PSM and properly accounts for the correlation between search scores. The method reliably estimates false discovery rates and identifies more PSMs than any single search engine at the same false discovery rate. Increased identifications increment spectral counts for most proteins and allow quantification of proteins that would not have been quantified by individual search engines. We also demonstrate that enhanced quantification contributes to improve sensitivity in differential expression analyses. PMID:21488652

  9. Differential temperature integrating diagnostic method and apparatus

    DOEpatents

    Doss, James D.; McCabe, Charles W.

    1976-01-01

    A method and device for detecting the presence of breast cancer in women by integrating the temperature difference between the temperature of a normal breast and that of a breast having a malignant tumor. The breast-receiving cups of a brassiere are each provided with thermally conductive material next to the skin, with a thermistor attached to the thermally conductive material in each cup. The thermistors are connected to adjacent arms of a Wheatstone bridge. Unbalance currents in the bridge are integrated with respect to time by means of an electrochemical integrator. In the absence of a tumor, both breasts maintain substantially the same temperature, and the bridge remains balanced. If the tumor is present in one breast, a higher temperature in that breast unbalances the bridge and the electrochemical cells integrate the temperature difference with respect to time.

  10. Integrated force method versus displacement method for finite element analysis

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Berke, Laszlo; Gallagher, Richard H.

    1990-01-01

    A novel formulation termed the integrated force method (IFM) has been developed in recent years for analyzing structures. In this method all the internal forces are taken as independent variables, and the system equilibrium equations (EE's) are integrated with the global compatibility conditions (CC's) to form the governing set of equations. In IFM the CC's are obtained from the strain formulation of St. Venant, and no choices of redundant load systems have to be made, in contrast to the standard force method (SFM). This property of IFM allows the generation of the governing equation to be automated straightforwardly, as it is in the popular stiffness method (SM). In this report IFM and SM are compared relative to the structure of their respective equations, their conditioning, required solution methods, overall computational requirements, and convergence properties as these factors influence the accuracy of the results. Overall, this new version of the force method produces more accurate results than the stiffness method for comparable computational cost.

  11. Integrated force method versus displacement method for finite element analysis

    NASA Technical Reports Server (NTRS)

    Patnaik, S. N.; Berke, L.; Gallagher, R. H.

    1991-01-01

    A novel formulation termed the integrated force method (IFM) has been developed in recent years for analyzing structures. In this method all the internal forces are taken as independent variables, and the system equilibrium equations (EEs) are integrated with the global compatibility conditions (CCs) to form the governing set of equations. In IFM the CCs are obtained from the strain formulation of St. Venant, and no choices of redundant load systems have to be made, in contrast to the standard force method (SFM). This property of IFM allows the generation of the governing equation to be automated straightforwardly, as it is in the popular stiffness method (SM). In this report IFM and SM are compared relative to the structure of their respective equations, their conditioning, required solution methods, overall computational requirements, and convergence properties as these factors influence the accuracy of the results. Overall, this new version of the force method produces more accurate results than the stiffness method for comparable computational cost.

  12. Methods for monitoring multiple gene expression

    DOEpatents

    Berka, Randy; Bachkirova, Elena; Rey, Michael

    2008-06-01

    The present invention relates to methods for monitoring differential expression of a plurality of genes in a first filamentous fungal cell relative to expression of the same genes in one or more second filamentous fungal cells using microarrays containing Trichoderma reesei ESTs or SSH clones, or a combination thereof. The present invention also relates to computer readable media and substrates containing such array features for monitoring expression of a plurality of genes in filamentous fungal cells.

  13. Methods for monitoring multiple gene expression

    DOEpatents

    Berka, Randy; Bachkirova, Elena; Rey, Michael

    2012-05-01

    The present invention relates to methods for monitoring differential expression of a plurality of genes in a first filamentous fungal cell relative to expression of the same genes in one or more second filamentous fungal cells using microarrays containing Trichoderma reesei ESTs or SSH clones, or a combination thereof. The present invention also relates to computer readable media and substrates containing such array features for monitoring expression of a plurality of genes in filamentous fungal cells.

  14. Methods for monitoring multiple gene expression

    DOEpatents

    Berka, Randy; Bachkirova, Elena; Rey, Michael

    2013-10-01

    The present invention relates to methods for monitoring differential expression of a plurality of genes in a first filamentous fungal cell relative to expression of the same genes in one or more second filamentous fungal cells using microarrays containing Trichoderma reesei ESTs or SSH clones, or a combination thereof. The present invention also relates to computer readable media and substrates containing such array features for monitoring expression of a plurality of genes in filamentous fungal cells.

  15. An evaluation of sampling effects on multiple DNA barcoding methods leads to an integrative approach for delimiting species: a case study of the North American tarantula genus Aphonopelma (Araneae, Mygalomorphae, Theraphosidae).

    PubMed

    Hamilton, Chris A; Hendrixson, Brent E; Brewer, Michael S; Bond, Jason E

    2014-02-01

    The North American tarantula genus Aphonopelma provides one of the greatest challenges to species delimitation and downstream identification in spiders because traditional morphological characters appear ineffective for evaluating limits of intra- and interspecific variation in the group. We evaluated the efficacy of numerous molecular-based approaches to species delimitation within Aphonopelma based upon the most extensive sampling of theraphosids to date, while also investigating the sensitivity of randomized taxon sampling on the reproducibility of species boundaries. Mitochondrial DNA (cytochrome c oxidase subunit I) sequences were sampled from 682 specimens spanning the genetic, taxonomic, and geographic breadth of the genus within the United States. To assess the effects of random taxon sampling, traditional Neighbor-Joining was compared with three modern quantitative species delimitation approaches (ABGD, P ID(Liberal), and GMYC). Our findings reveal remarkable consistency and congruence across various approaches and sampling regimes, while highlighting highly divergent outcomes in GMYC. Our investigation allowed us to integrate methodologies into an efficient, consistent, and more effective general methodological workflow for estimating species boundaries within the mygalomorph spider genus Aphonopelma. Taken alone, these approaches are not particularly useful - especially in the absence of prior knowledge of the focal taxa. Only through the incorporation of multiple lines of evidence, employed in a hypothesis-testing framework, can the identification and delimitation of confident species boundaries be determined.
A key point in studying closely related species, and perhaps one of the most important aspects of DNA barcoding, is to combine a sampling strategy that broadly identifies the extent of genetic diversity across the distributions of the species of interest and incorporates previous knowledge into the "species equation" (morphology, molecules, and natural history
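
    The distance-threshold logic behind barcode-gap methods such as ABGD can be illustrated with a minimal sketch: single-linkage clustering on uncorrected p-distances (hypothetical sequences and threshold, not the authors' pipeline).

    ```python
    from itertools import combinations

    def p_distance(a, b):
        """Proportion of differing sites between two aligned sequences."""
        assert len(a) == len(b)
        return sum(x != y for x, y in zip(a, b)) / len(a)

    def delimit(seqs, threshold):
        """Single-linkage clustering: sequences closer than `threshold`
        are merged into the same putative species (union-find)."""
        parent = list(range(len(seqs)))
        def find(i):
            while parent[i] != i:
                parent[i] = parent[parent[i]]   # path halving
                i = parent[i]
            return i
        for i, j in combinations(range(len(seqs)), 2):
            if p_distance(seqs[i], seqs[j]) < threshold:
                ri, rj = find(i), find(j)
                if ri != rj:
                    parent[rj] = ri
        return [find(i) for i in range(len(seqs))]

    seqs = ["ACGTACGT", "ACGTACGA", "TTGTGCCA", "TTGTGCCT"]
    labels = delimit(seqs, threshold=0.2)
    # seqs 0-1 and 2-3 differ at 1/8 of sites (0.125 < 0.2) and cluster together
    ```

    Real delimitation methods infer the threshold from the data (the "barcode gap") rather than fixing it, which is one reason the abstract stresses integrating multiple lines of evidence.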

  16. Impaired functional integration in multiple sclerosis: a graph theory study.

    PubMed

    Rocca, Maria A; Valsasina, Paola; Meani, Alessandro; Falini, Andrea; Comi, Giancarlo; Filippi, Massimo

    2016-01-01

    The aim of this study was to explore the topological organization of functional brain network connectivity in a large cohort of multiple sclerosis (MS) patients and to assess whether its disruption contributes to disease clinical manifestations. Graph theoretical analysis was applied to resting state fMRI data from 246 MS patients and 55 matched healthy controls (HC). Functional connectivity between 116 cortical and subcortical brain regions was estimated using a bivariate correlation analysis. Global network properties (network degree, global efficiency, hierarchy, path length and assortativity) were abnormal in MS patients vs HC, and contributed to distinguishing cognitively impaired MS patients (34%) from HC, but not the main MS clinical phenotypes. Compared to HC, MS patients also showed: (1) a loss of hubs in the superior frontal gyrus, precuneus and anterior cingulum in the left hemisphere; (2) a different lateralization of basal ganglia hubs (mostly located in the left hemisphere in HC, and in the right hemisphere in MS patients); and (3) a formation of hubs, not seen in HC, in the left temporal pole and cerebellum. MS patients also experienced a decreased nodal degree in the bilateral caudate nucleus and right cerebellum. Such a modification of regional network properties contributed to cognitive impairment and phenotypic variability of MS. An impairment of global integration (likely to reflect a reduced competence in information exchange between distant brain areas) occurs in MS and is associated with cognitive deficits. A regional redistribution of network properties contributes to cognitive status and phenotypic variability of these patients. PMID:25257603
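
    Global efficiency, one of the network properties examined, is the mean inverse shortest-path length over all node pairs. A minimal sketch on a toy unweighted graph (not the study's 116-region network):

    ```python
    from collections import deque

    def bfs_dists(adj, src):
        """Shortest-path lengths from src in an unweighted graph (BFS)."""
        dist = {src: 0}
        q = deque([src])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        return dist

    def global_efficiency(adj):
        """Mean of 1/d(u, v) over all ordered pairs of distinct nodes."""
        n = len(adj)
        total = 0.0
        for u in adj:
            d = bfs_dists(adj, u)
            total += sum(1.0 / d[v] for v in d if v != u)
        return total / (n * (n - 1))

    # Toy 4-node path graph 0-1-2-3
    adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
    eff = global_efficiency(adj)
    ```

    In connectivity studies the graph is usually built by thresholding the region-by-region correlation matrix before computing such metrics.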

  17. Multiple-analyte fluoroimmunoassay using an integrated optical waveguide sensor.

    PubMed

    Plowman, T E; Durstchi, J D; Wang, H K; Christensen, D A; Herron, J N; Reichert, W M

    1999-10-01

    A silicon oxynitride integrated optical waveguide was used to evanescently excite fluorescence from a multianalyte sensor surface in a rapid, sandwich immunoassay format. Multiple analyte immunoassay (MAIA) results for two sets of three different analytes, one employing polyclonal and the other monoclonal capture antibodies, were compared with results for identical analytes performed in a single-analyte immunoassay (SAIA) format. The MAIA protocol was applied in both phosphate-buffered saline and simulated serum solutions. Point-to-point correlation values between the MAIA and SAIA results varied widely for the polyclonal antibodies (R2 = 0.42-0.98) and were acceptable for the monoclonal antibodies (R2 = 0.93-0.99). Differences in calculated receptor affinities were also evident with polyclonal antibodies, but not with monoclonal antibodies. Polyclonal antibody capture layers tended to demonstrate departure from ideal receptor-ligand binding while monoclonal antibodies generally displayed monovalent binding. A third set of three antibodies, specific for three cardiac proteins routinely used to categorize myocardial infarction, were also evaluated with the two assay protocols. MAIA responses, over clinically significant ranges for creatine kinase MB, cardiac troponin I, and myoglobin agreed well with responses generated with SAIA protocols (R2 = 0.97-0.99). PMID:10517150

  18. Implicit integration methods for dislocation dynamics

    SciTech Connect

    Gardner, D. J.; Woodward, C. S.; Reynolds, D. R.; Hommes, G.; Aubry, S.; Arsenlis, A.

    2015-01-20

    In dislocation dynamics simulations, strain hardening simulations require integrating stiff systems of ordinary differential equations in time with expensive force calculations, discontinuous topological events, and rapidly changing problem size. Current solvers in use often result in small time steps and long simulation times. Faster solvers may help dislocation dynamics simulations accumulate plastic strains at strain rates comparable to experimental observations. Here, this paper investigates the viability of high order implicit time integrators and robust nonlinear solvers to reduce simulation run times while maintaining the accuracy of the computed solution. In particular, implicit Runge-Kutta time integrators are explored as a way of providing greater accuracy over a larger time step than is typically done with the standard second-order trapezoidal method. In addition, both accelerated fixed point and Newton's method are investigated to provide fast and effective solves for the nonlinear systems that must be resolved within each time step. Results show that integrators of third order are the most effective, while accelerated fixed point and Newton's method both improve solver performance over the standard fixed point method used for the solution of the nonlinear systems.
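
    The trade-off the paper examines can be sketched on a scalar stiff test equation: an implicit trapezoidal step whose nonlinear equation is solved either by plain fixed-point iteration or by Newton's method (illustrative problem, step size, and tolerances; not the dislocation-dynamics code).

    ```python
    import math

    def f(t, y):                      # stiff test problem: y' = -50*(y - cos t)
        return -50.0 * (y - math.cos(t))

    def dfdy(t, y):                   # analytic Jacobian df/dy
        return -50.0

    def trapezoidal_step(t, y, h, solver="newton", iters=50):
        """One implicit trapezoidal step. The nonlinear equation
        g(z) = z - y - (h/2)*(f(t,y) + f(t+h,z)) = 0 is solved either by
        plain fixed-point iteration or by Newton's method."""
        z = y                              # initial guess
        fy = f(t, y)
        for _ in range(iters):
            if solver == "fixed_point":
                z_new = y + 0.5 * h * (fy + f(t + h, z))
            else:                          # Newton (one iteration suffices here:
                g = z - y - 0.5 * h * (fy + f(t + h, z))   # f is linear in y)
                dg = 1.0 - 0.5 * h * dfdy(t + h, z)
                z_new = z - g / dg
            if abs(z_new - z) < 1e-12:
                break
            z = z_new
        return z_new

    # Integrate from (t, y) = (0, 0) to t = 1 with h = 0.01
    y, t, h = 0.0, 0.0, 0.01
    for _ in range(100):
        y = trapezoidal_step(t, y, h)
        t += h
    # the solution relaxes onto the slowly varying forced response near cos t
    ```

    For this mildly stiff choice the fixed-point iteration still contracts (|h*lambda/2| = 0.25); for stiffer systems it diverges and Newton-type solves become necessary, which is the regime the paper targets.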

  19. Implicit integration methods for dislocation dynamics

    DOE PAGESBeta

    Gardner, D. J.; Woodward, C. S.; Reynolds, D. R.; Hommes, G.; Aubry, S.; Arsenlis, A.

    2015-01-20

    In dislocation dynamics simulations, strain hardening simulations require integrating stiff systems of ordinary differential equations in time with expensive force calculations, discontinuous topological events, and rapidly changing problem size. Current solvers in use often result in small time steps and long simulation times. Faster solvers may help dislocation dynamics simulations accumulate plastic strains at strain rates comparable to experimental observations. Here, this paper investigates the viability of high order implicit time integrators and robust nonlinear solvers to reduce simulation run times while maintaining the accuracy of the computed solution. In particular, implicit Runge-Kutta time integrators are explored as a way of providing greater accuracy over a larger time step than is typically done with the standard second-order trapezoidal method. In addition, both accelerated fixed point and Newton's method are investigated to provide fast and effective solves for the nonlinear systems that must be resolved within each time step. Results show that integrators of third order are the most effective, while accelerated fixed point and Newton's method both improve solver performance over the standard fixed point method used for the solution of the nonlinear systems.

  20. Bioluminescent bioreporter integrated circuit detection methods

    DOEpatents

    Simpson, Michael L.; Paulus, Michael J.; Sayler, Gary S.; Applegate, Bruce M.; Ripp, Steven A.

    2005-06-14

    Disclosed are monolithic bioelectronic devices comprising a bioreporter and an OASIC. These bioluminescent bioreporter integrated circuits are useful in detecting substances such as pollutants, explosives, and heavy metals residing in inhospitable areas such as groundwater, industrial process vessels, and battlefields. Also disclosed are methods and apparatus for detection of particular analytes, including ammonia and estrogen compounds.

  1. Implicit integration methods for dislocation dynamics

    NASA Astrophysics Data System (ADS)

    Gardner, D. J.; Woodward, C. S.; Reynolds, D. R.; Hommes, G.; Aubry, S.; Arsenlis, A.

    2015-03-01

    In dislocation dynamics simulations, strain hardening simulations require integrating stiff systems of ordinary differential equations in time with expensive force calculations, discontinuous topological events and rapidly changing problem size. Current solvers in use often result in small time steps and long simulation times. Faster solvers may help dislocation dynamics simulations accumulate plastic strains at strain rates comparable to experimental observations. This paper investigates the viability of high-order implicit time integrators and robust nonlinear solvers to reduce simulation run times while maintaining the accuracy of the computed solution. In particular, implicit Runge-Kutta time integrators are explored as a way of providing greater accuracy over a larger time step than is typically done with the standard second-order trapezoidal method. In addition, both accelerated fixed point and Newton's method are investigated to provide fast and effective solves for the nonlinear systems that must be resolved within each time step. Results show that integrators of third order are the most effective, while accelerated fixed point and Newton's method both improve solver performance over the standard fixed point method used for the solution of the nonlinear systems.

  2. Integration of multiple view plus depth data for free viewpoint 3D display

    NASA Astrophysics Data System (ADS)

    Suzuki, Kazuyoshi; Yoshida, Yuko; Kawamoto, Tetsuya; Fujii, Toshiaki; Mase, Kenji

    2014-03-01

    This paper proposes a method for constructing a reasonably scaled end-to-end free-viewpoint video system that captures multiple view and depth data, reconstructs three-dimensional polygon models of objects, and displays them in virtual 3D CG spaces. The system consists of a desktop PC and four Kinect sensors. First, multiple view plus depth data at four viewpoints are captured by the Kinect sensors simultaneously. The captured data are then integrated into point cloud data using the camera parameters. The point cloud data are resampled into volume data consisting of voxels. Since volume data generated from point cloud data are sparse, they are densified using a global optimization algorithm. The final step reconstructs surfaces on the dense volume data with the discrete marching cubes method. Since the accuracy of the depth maps affects the quality of the 3D polygon model, a simple inpainting method for improving depth maps is also presented.
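
    The voxel-sampling step can be sketched as a simple quantization of point coordinates into a sparse set of occupied cells (hypothetical coordinates and voxel size; the densification and marching-cubes stages are omitted):

    ```python
    def voxelize(points, voxel_size):
        """Quantize 3-D points into occupied voxel indices (a sparse set),
        a minimal version of the point-cloud-to-volume sampling step."""
        occupied = set()
        for x, y, z in points:
            occupied.add((int(x // voxel_size),
                          int(y // voxel_size),
                          int(z // voxel_size)))
        return occupied

    pts = [(0.02, 0.03, 0.01), (0.04, 0.01, 0.02), (0.31, 0.32, 0.28)]
    vox = voxelize(pts, voxel_size=0.05)
    # the first two points fall in voxel (0, 0, 0); the third in (6, 6, 5)
    ```

    The resulting occupancy set is sparse by construction, which is why the paper follows it with a global optimization step to fill in the volume before surface extraction.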

  3. Multiple time step integrators in ab initio molecular dynamics

    SciTech Connect

    Luehr, Nathan; Martínez, Todd J.; Markland, Thomas E.

    2014-02-28

    Multiple time-scale algorithms exploit the natural separation of time-scales in chemical systems to greatly accelerate the efficiency of molecular dynamics simulations. Although the utility of these methods in systems where the interactions are described by empirical potentials is now well established, their application to ab initio molecular dynamics calculations has been limited by difficulties associated with splitting the ab initio potential into fast and slowly varying components. Here we present two schemes that enable efficient time-scale separation in ab initio calculations: one based on fragment decomposition and the other on range separation of the Coulomb operator in the electronic Hamiltonian. We demonstrate for both water clusters and a solvated hydroxide ion that multiple time-scale molecular dynamics allows for outer time steps of 2.5 fs, which are as large as those obtained when such schemes are applied to empirical potentials, while still allowing for bonds to be broken and reformed throughout the dynamics. This permits computational speedups of up to 4.4x, compared to standard Born-Oppenheimer ab initio molecular dynamics with a 0.5 fs time step, while maintaining the same energy conservation and accuracy.
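
    The time-scale splitting described above follows the reversible RESPA pattern: slow-force half-kicks wrapped around several fast-force substeps. A minimal one-dimensional sketch with a hypothetical fast/slow force split (not the fragment-decomposition or range-separation schemes of the paper):

    ```python
    def respa_step(x, v, dt, n_inner, f_fast, f_slow, m=1.0):
        """One reversible RESPA step: the slow force is applied as half-kicks
        around n_inner velocity-Verlet substeps driven by the fast force."""
        v += 0.5 * dt * f_slow(x) / m          # slow half-kick
        h = dt / n_inner
        for _ in range(n_inner):               # inner loop: fast dynamics
            v += 0.5 * h * f_fast(x) / m
            x += h * v
            v += 0.5 * h * f_fast(x) / m
        v += 0.5 * dt * f_slow(x) / m          # slow half-kick
        return x, v

    # Stiff harmonic "fast" force plus a weak cubic "slow" force (illustrative)
    f_fast = lambda x: -100.0 * x              # from U_fast = 50 x^2
    f_slow = lambda x: -0.1 * x ** 3           # from U_slow = 0.025 x^4

    x, v = 1.0, 0.0
    for _ in range(1000):
        x, v = respa_step(x, v, dt=0.05, n_inner=10,
                          f_fast=f_fast, f_slow=f_slow)
    # total energy v^2/2 + 50 x^2 + 0.025 x^4 stays near its initial 50.025
    ```

    The outer step evaluates the expensive slow force ten times less often than the fast force, which is the source of the speedups quoted in the abstract.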

  4. The onion method for multiple perturbation theory

    NASA Astrophysics Data System (ADS)

    Cross, R. J.

    1988-04-01

    We develop a method of successive approximations for molecular scattering theory. This consists of a recipe for removing from the Schrödinger equation, one by one, the wave functions of a set of approximate solutions. The radial wave function is expressed as a linear combination of the well-behaved and singular solutions of the first approximation, and a set of coupled differential equations is obtained for the coefficients of the approximate solutions. A similar set of coefficients is obtained for the next approximation, and the exact coefficients are expressed in terms of the approximate coefficients to yield a set of second-level coefficients. The process can be continued like peeling off the layers of an onion. At each stage the coupled differential equations for the coefficients are equivalent to the Schrödinger equation. Finally, one can either ignore the remaining coefficients or approximate the coupled equations by a simple perturbation theory.
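
    The peeling-off step can be written in standard variation-of-parameters form; the following is a plausible sketch of the first layer (our notation, not necessarily the paper's):

    ```latex
    % Full radial equation and its first-level approximation (potential V_0):
    u''(r) + \left[k^{2} - V(r)\right] u(r) = 0, \qquad
    f'' + \left[k^{2} - V_{0}\right] f = 0, \quad
    g'' + \left[k^{2} - V_{0}\right] g = 0 .
    % Expand u in the well-behaved (f) and singular (g) approximate solutions
    % with r-dependent coefficients, imposing a' f + b' g = 0:
    u(r) = a(r)\, f(r) + b(r)\, g(r)
    \;\Longrightarrow\;
    a'(r) = -\frac{g\,\Delta V\, u}{W}, \qquad
    b'(r) = +\frac{f\,\Delta V\, u}{W},
    % where \Delta V = V - V_0 and W = f g' - f' g is the constant Wronskian.
    % Repeating the construction on (a, b) with a second-level approximate
    % equation peels off the next layer.
    ```

    The coupled equations for (a, b) are driven only by the residual potential ΔV, so each layer removes the part of the problem the previous approximation already captured.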

  5. Method and apparatus for controlling multiple motors

    DOEpatents

    Jones, Rollin G.; Kortegaard, Bert L.; Jones, David F.

    1987-01-01

    A method and apparatus are provided for simultaneously controlling a plurality of stepper motors. Addressing circuitry generates address data for each motor in a periodic address sequence. Memory circuits respond to the address data for each motor by accessing a corresponding memory location containing a first operational data set functionally related to a direction for moving the motor, speed data, and rate of speed change. First logic circuits respond to the first data set to generate a motor step command. Second logic circuits respond to the command from the first logic circuits to generate a third data set for replacing the first data set in memory with a current operational motor status, which becomes the first data set when the motor is next addressed.

  6. Fidelity of the Integrated Force Method Solution

    NASA Technical Reports Server (NTRS)

    Hopkins, Dale; Halford, Gary; Coroneos, Rula; Patnaik, Surya

    2002-01-01

    The theory of strain compatibility of the solid mechanics discipline had been incomplete since St. Venant's 'strain formulation' in 1876. We have addressed the compatibility condition both in the continuum and the discrete system. This has led to the formulation of the Integrated Force Method. A dual Integrated Force Method with displacement as the primal variable has also been formulated. A modest finite element code (IFM/Analyzers) based on the IFM theory has been developed. For a set of standard test problems the IFM results were compared with the stiffness method solutions and the MSC/Nastran code. For these problems, IFM outperformed the existing methods. Superior IFM performance is attributed to simultaneous compliance of the equilibrium equation and the compatibility condition. The MSC/Nastran organization expressed reluctance to accept the high fidelity IFM solutions. This report discusses the solutions to the examples. No inaccuracy was detected in the IFM solutions. A stiffness method code can be improved with a small programming effort to reap the many IFM benefits when implemented with the IFMD elements. Dr. Halford conducted a peer-review on the Integrated Force Method. Reviewers' response is included.

  7. Orthogonal matrix factorization enables integrative analysis of multiple RNA binding proteins

    PubMed Central

    Stražar, Martin; Žitnik, Marinka; Zupan, Blaž; Ule, Jernej; Curk, Tomaž

    2016-01-01

    Motivation: RNA binding proteins (RBPs) play important roles in post-transcriptional control of gene expression, including splicing, transport, polyadenylation and RNA stability. To model protein–RNA interactions by considering all available sources of information, it is necessary to integrate the rapidly growing RBP experimental data with the latest genome annotation, gene function, RNA sequence and structure. Such integration is possible by matrix factorization, where current approaches have an undesired tendency to identify only a small number of the strongest patterns with overlapping features. Because protein–RNA interactions are orchestrated by multiple factors, methods that identify discriminative patterns of varying strengths are needed. Results: We have developed an integrative orthogonality-regularized nonnegative matrix factorization (iONMF) to integrate multiple data sources and discover non-overlapping, class-specific RNA binding patterns of varying strengths. The orthogonality constraint halves the effective size of the factor model and outperforms other NMF models in predicting RBP interaction sites on RNA. We have integrated the largest data compendium to date, which includes 31 CLIP experiments on 19 RBPs involved in splicing (such as hnRNPs, U2AF2, ELAVL1, TDP-43 and FUS) and processing of 3’UTR (Ago, IGF2BP). We show that the integration of multiple data sources improves the predictive accuracy of retrieval of RNA binding sites. In our study the key predictive factors of protein–RNA interactions were the position of RNA structure and sequence motifs, RBP co-binding and gene region type. We report on a number of protein-specific patterns, many of which are consistent with experimentally determined properties of RBPs. Availability and implementation: The iONMF implementation and example datasets are available at https://github.com/mstrazar/ionmf. Contact: tomaz.curk@fri.uni-lj.si Supplementary information: Supplementary data are available
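
    The flavor of an orthogonality-regularized NMF can be sketched with multiplicative updates and an off-diagonal penalty on HHᵀ. This is a toy stand-in, not the iONMF implementation: `alpha`, the penalty form, and the data are illustrative.

    ```python
    import numpy as np

    def orthogonal_nmf(X, rank, alpha=0.1, iters=500, seed=0):
        """Minimal sketch of NMF (X ~ W @ H, all factors nonnegative) with a
        penalty on off-diagonal entries of H @ H.T that pushes the discovered
        patterns toward non-overlapping supports."""
        rng = np.random.default_rng(seed)
        n, m = X.shape
        W = rng.random((n, rank))
        H = rng.random((rank, m))
        J = np.ones((rank, rank)) - np.eye(rank)   # couples distinct rows of H
        eps = 1e-9
        for _ in range(iters):
            W *= (X @ H.T) / (W @ H @ H.T + eps)
            H *= (W.T @ X) / (W.T @ W @ H + alpha * J @ H + eps)
        return W, H

    # Two clearly separable nonnegative patterns
    X = np.array([[1.0, 1.0, 0.0, 0.0],
                  [1.0, 1.0, 0.0, 0.0],
                  [0.0, 0.0, 1.0, 1.0],
                  [0.0, 0.0, 1.0, 1.0]])
    W, H = orthogonal_nmf(X, rank=2)
    err = np.linalg.norm(X - W @ H)
    ```

    The penalty term enters only the denominator of the multiplicative update, so nonnegativity is preserved while overlapping rows of H are shrunk, mirroring the "non-overlapping, class-specific patterns" goal described in the abstract.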

  8. One-step integration of multiple genes into the oleaginous yeast Yarrowia lipolytica.

    PubMed

    Gao, Shuliang; Han, Linna; Zhu, Li; Ge, Mei; Yang, Sheng; Jiang, Yu; Chen, Daijie

    2014-12-01

    Yarrowia lipolytica is an unconventional yeast, and is generally recognized as safe (GRAS). It provides a versatile fermentation platform that is used commercially to produce many added-value products. Here we report a multiple fragment assembly method that allows one-step integration of an entire β-carotene biosynthesis pathway (~11 kb, consisting of four genes) via in vivo homologous recombination into the rDNA locus of the Y. lipolytica chromosome. The highest efficiency was 21%, and the highest production of β-carotene was 2.2 ± 0.3 mg per g dry cell weight. The total procedure was completed in less than one week, as compared to a previously reported sequential gene integration method that required n weeks for n genes. This time-saving method will facilitate synthetic biology, metabolic engineering and functional genomics studies of Y. lipolytica. PMID:25216641

  9. Numerical methods for engine-airframe integration

    SciTech Connect

    Murthy, S.N.B.; Paynter, G.C.

    1986-01-01

    Various papers on numerical methods for engine-airframe integration are presented. The individual topics considered include: scientific computing environment for the 1980s, overview of prediction of complex turbulent flows, numerical solutions of the compressible Navier-Stokes equations, elements of computational engine/airframe integrations, computational requirements for efficient engine installation, application of CAE and CFD techniques to complete tactical missile design, CFD applications to engine/airframe integration, and application of a second-generation low-order panel method to powerplant installation studies. Also addressed are: three-dimensional flow analysis of turboprop inlet and nacelle configurations, application of computational methods to the design of large turbofan engine nacelles, comparison of full potential and Euler solution algorithms for aeropropulsive flow field computations, subsonic/transonic and supersonic nozzle flows and nozzle integration, subsonic/transonic prediction capabilities for nozzle/afterbody configurations, three-dimensional viscous design methodology of supersonic inlet systems for advanced technology aircraft, and a user's technology assessment.

  10. A parallel multiple path tracing method based on OptiX for infrared image generation

    NASA Astrophysics Data System (ADS)

    Wang, Hao; Wang, Xia; Liu, Li; Long, Teng; Wu, Zimu

    2015-12-01

    Infrared image generation technology is widely used in infrared imaging system performance evaluation, battlefield environment simulation, and military personnel training, all of which require a more physically accurate and efficient method for infrared scene simulation. A parallel multiple path tracing method based on OptiX was proposed to solve this problem; it not only increases computational efficiency compared with serial CPU ray tracing but also produces relatively accurate results. First, the flaws of current ray tracing methods in infrared simulation were analyzed, and a multiple path tracing method based on OptiX was developed accordingly. Furthermore, Monte Carlo integration was employed to solve the radiation transfer equation, with importance sampling applied to accelerate the convergence of the integral. After that, the framework of the simulation platform and its sensor effects simulation diagram were given. Finally, the results showed that the method could generate relatively accurate radiation images when a precise importance sampling method was available.
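
    The importance-sampling idea used to accelerate the Monte Carlo integration can be sketched on a one-dimensional integral (toy integrand, not the radiative transfer kernel): drawing samples from a density that resembles the integrand reduces estimator variance at fixed sample count.

    ```python
    import random

    def mc_uniform(n, seed=1):
        """Plain Monte Carlo estimate of I = integral of 3x^2 on [0,1] = 1."""
        rng = random.Random(seed)
        return sum(3 * rng.random() ** 2 for _ in range(n)) / n

    def mc_importance(n, seed=1):
        """Importance sampling with p(x) = 2x (inverse CDF: x = sqrt(u)),
        which resembles the integrand and lowers the estimator variance
        (per-sample variance drops from 0.8 to 0.125 for this integrand)."""
        rng = random.Random(seed)
        total = 0.0
        for _ in range(n):
            x = rng.random() ** 0.5        # sample from p(x) = 2x
            total += 3 * x ** 2 / (2 * x)  # weight by f(x) / p(x)
        return total / n

    est_u = mc_uniform(100_000)
    est_i = mc_importance(100_000)
    # both estimates approach 1; the importance-sampled one fluctuates less
    ```

    In rendering, the same principle is applied by sampling ray directions in proportion to an approximation of the radiance integrand.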

  11. Package for integrated optic circuit and method

    DOEpatents

    Kravitz, S.H.; Hadley, G.R.; Warren, M.E.; Carson, R.F.; Armendariz, M.G.

    1998-08-04

    A structure and method are disclosed for packaging an integrated optic circuit. The package comprises a first wall having a plurality of microlenses formed therein to establish channels of optical communication with an integrated optic circuit within the package. A first registration pattern is provided on an inside surface of one of the walls of the package for alignment and attachment of the integrated optic circuit. The package in one embodiment may further comprise a fiber holder for aligning and attaching a plurality of optical fibers to the package and extending the channels of optical communication to the fibers outside the package. In another embodiment, a fiber holder may be used to hold the fibers and align the fibers to the package. The fiber holder may be detachably connected to the package. 6 figs.

  12. Package for integrated optic circuit and method

    DOEpatents

    Kravitz, Stanley H.; Hadley, G. Ronald; Warren, Mial E.; Carson, Richard F.; Armendariz, Marcelino G.

    1998-01-01

    A structure and method for packaging an integrated optic circuit. The package comprises a first wall having a plurality of microlenses formed therein to establish channels of optical communication with an integrated optic circuit within the package. A first registration pattern is provided on an inside surface of one of the walls of the package for alignment and attachment of the integrated optic circuit. The package in one embodiment may further comprise a fiber holder for aligning and attaching a plurality of optical fibers to the package and extending the channels of optical communication to the fibers outside the package. In another embodiment, a fiber holder may be used to hold the fibers and align the fibers to the package. The fiber holder may be detachably connected to the package.

  13. A Methodology for Multiple Rule System Integration and Resolution Within a Singular Knowledge Base

    NASA Technical Reports Server (NTRS)

    Kautzmann, Frank N., III

    1988-01-01

    Expert Systems which support knowledge representation by qualitative modeling techniques experience problems, when called upon to support integrated views embodying description and explanation, especially when other factors such as multiple causality, competing rule model resolution, and multiple uses of knowledge representation are included. A series of prototypes are being developed to demonstrate the feasibility of automating the process of systems engineering, design and configuration, and diagnosis and fault management. A study involves not only a generic knowledge representation; it must also support multiple views at varying levels of description and interaction between physical elements, systems, and subsystems. Moreover, it will involve models of description and explanation for each level. This multiple model feature requires the development of control methods between rule systems and heuristics on a meta-level for each expert system involved in an integrated and larger class of expert system. The broadest possible category of interacting expert systems is described along with a general methodology for the knowledge representation and control of mutually exclusive rule systems.

  14. In silico gene prioritization by integrating multiple data sources.

    PubMed

    Chen, Yixuan; Wang, Wenhui; Zhou, Yingyao; Shields, Robert; Chanda, Sumit K; Elston, Robert C; Li, Jing

    2011-01-01

    Identifying disease genes is crucial to the understanding of disease pathogenesis, and to the improvement of disease diagnosis and treatment. In recent years, many researchers have proposed approaches to prioritize candidate genes by considering the relationship of candidate genes and existing known disease genes, reflected in other data sources. In this paper, we propose an expandable framework for gene prioritization that can integrate multiple heterogeneous data sources by taking advantage of a unified graphic representation. Gene-gene relationships and gene-disease relationships are then defined based on the overall topology of each network using a diffusion kernel measure. These relationship measures are in turn normalized to derive an overall measure across all networks, which is utilized to rank all candidate genes. Based on the informativeness of available data sources with respect to each specific disease, we also propose an adaptive threshold score to select a small subset of candidate genes for further validation studies. We performed large scale cross-validation analysis on 110 disease families using three data sources. Results have shown that our approach consistently outperforms the two other state-of-the-art programs. A case study using Parkinson disease (PD) has identified four candidate genes (UBB, SEPT5, GPR37 and TH) that ranked higher than our adaptive threshold, all of which are involved in the PD pathway. In particular, a very recent study has observed a deletion of TH in a patient with PD, which supports the importance of the TH gene in PD pathogenesis. A web tool has been implemented to assist scientists in their genetic studies. PMID:21731658
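
    A diffusion-kernel relationship measure can be sketched as K = exp(-beta * L), where L = D - A is the graph Laplacian; entries of K decay with network distance. The adjacency matrix and beta below are illustrative, not the paper's networks or parameters.

    ```python
    import numpy as np

    def diffusion_kernel(A, beta=0.5):
        """Diffusion kernel K = exp(-beta * L) on an undirected graph,
        computed via eigendecomposition of the Laplacian L = D - A."""
        L = np.diag(A.sum(axis=1)) - A
        w, V = np.linalg.eigh(L)
        return V @ np.diag(np.exp(-beta * w)) @ V.T

    # Toy gene network: chain 0-1-2 plus an isolated node 3
    A = np.array([[0, 1, 0, 0],
                  [1, 0, 1, 0],
                  [0, 1, 0, 0],
                  [0, 0, 0, 0]], dtype=float)
    K = diffusion_kernel(A)
    # K[0,1] > K[0,2] > K[0,3] = 0: affinity to node 0 (a known disease
    # gene, say) decays with distance, which is the signal used for ranking
    ```

    Candidate genes are then scored by their kernel affinity to the set of known disease genes, with per-network scores normalized and combined across data sources.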

  15. Multiple Frequency Contrast Source Inversion Method for Vertical Electromagnetic Profiling: 2D Simulation Results and Analyses

    NASA Astrophysics Data System (ADS)

    Li, Jinghe; Song, Linping; Liu, Qing Huo

    2016-02-01

    A simultaneous multiple frequency contrast source inversion (CSI) method is applied to reconstructing hydrocarbon reservoir targets in a complex multilayered medium in two dimensions. It simulates the effects of a salt dome sedimentary formation in the context of reservoir monitoring. In this method, the stabilized biconjugate-gradient fast Fourier transform (BCGS-FFT) algorithm is applied as a fast solver for the 2D volume integral equation for the forward computation. The inversion technique with CSI combines the efficient FFT algorithm to speed up the matrix-vector multiplication with the stable convergence of the simultaneous multiple frequency CSI in the iteration process. As a result, this method can effectively perform quantitative conductivity image reconstruction for large-scale electromagnetic oil exploration problems, including the vertical electromagnetic profiling (VEP) survey investigated here. A number of numerical examples have been demonstrated to validate the effectiveness and capacity of the simultaneous multiple frequency CSI method for a limited array view in VEP.

  16. Generating nonlinear FM chirp radar signals by multiple integrations

    DOEpatents

    Doerry, Armin W.

    2011-02-01

    A phase component of a nonlinear frequency modulated (NLFM) chirp radar pulse can be produced by performing digital integration operations over a time interval defined by the pulse width. Each digital integration operation includes applying to a respectively corresponding input parameter value a respectively corresponding number of instances of digital integration.
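
    The principle of producing a chirp phase by cascaded digital integrations can be sketched as follows (illustrative chirp-rate profile, sample interval, and pulse width; not the patented circuit): integrating a chirp-rate profile once gives the instantaneous frequency, and integrating again gives the phase.

    ```python
    import math

    def cumsum_integrate(samples, dt):
        """Discrete integrator: running sum scaled by the sample interval."""
        out, acc = [], 0.0
        for s in samples:
            acc += s * dt
            out.append(acc)
        return out

    # Build an NLFM pulse by two cascaded integrations:
    # chirp rate k(t) (tapered, hence *nonlinear* FM) -> frequency f(t) -> phase
    n, dt = 1000, 1e-6                         # 1 ms pulse, 1 us samples
    rate = [2e9 * math.sin(math.pi * i / n) for i in range(n)]      # Hz/s
    freq = cumsum_integrate(rate, dt)          # first integration: Hz
    phase = cumsum_integrate([2 * math.pi * f for f in freq], dt)   # radians
    iq = [complex(math.cos(p), math.sin(p)) for p in phase]  # unit-amplitude pulse
    ```

    Because the rate profile is nonnegative and tapered, the frequency sweep is monotone but slows near the pulse edges, the spectral-shaping effect for which NLFM waveforms are used.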

  17. High-precision, automated integration of multiple isothermal titration calorimetric thermograms: new features of NITPIC.

    PubMed

    Scheuermann, Thomas H; Brautigam, Chad A

    2015-04-01

    Isothermal titration calorimetry (ITC) has become a standard and widely available tool to measure the thermodynamic parameters of macromolecular associations. Modern applications of the method, including global analysis and drug screening, require the acquisition of multiple sets of data; sometimes these data sets number in the hundreds. Therefore, there is a need for quick, precise, and automated means to process the data, particularly at the first step of data analysis, which is commonly the integration of the raw data to yield an interpretable isotherm. Herein, we describe enhancements to an algorithm that previously has been shown to provide an automated, unbiased, and high-precision means to integrate ITC data. These improvements allow for the speedy and precise serial integration of an unlimited number of ITC data sets, and they have been implemented in the freeware program NITPIC, version 1.1.0. We present a comprehensive comparison of the performance of this software against an older version of NITPIC and a current version of Origin, which is commonly used for integration. The new methods recapitulate the excellent performance of the previous versions of NITPIC while speeding it up substantially, and their precision is significantly better than that of Origin. This new version of NITPIC is therefore well suited to the serial integration of many ITC data sets. PMID:25524420
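
    The first step NITPIC automates, baseline-corrected integration of each injection peak, can be sketched with a straight-line baseline and the trapezoid rule. This is a deliberate simplification of NITPIC's automated baseline model; the window bounds and the five-sample baseline averages are illustrative choices.

    ```python
    def integrate_injection(times, power, t_start, t_end):
        """Integrate one baseline-corrected thermogram peak by the trapezoid
        rule. The baseline is a line through the mean power of the last five
        samples before t_start and the first five after t_end."""
        idx = [i for i, t in enumerate(times) if t_start <= t <= t_end]
        pre = [power[i] for i, t in enumerate(times) if t < t_start][-5:]
        post = [power[i] for i, t in enumerate(times) if t > t_end][:5]
        b0, b1 = sum(pre) / len(pre), sum(post) / len(post)
        heat = 0.0
        for a, b in zip(idx, idx[1:]):
            base_a = b0 + (b1 - b0) * (times[a] - t_start) / (t_end - t_start)
            base_b = b0 + (b1 - b0) * (times[b] - t_start) / (t_end - t_start)
            heat += 0.5 * ((power[a] - base_a) + (power[b] - base_b)) \
                        * (times[b] - times[a])
        return heat

    # Synthetic injection: triangular peak (height 2, half-width 2.5 s)
    # on a flat 5.0 baseline, sampled at 10 Hz for 20 s
    times = [i / 10 for i in range(200)]
    power = [5.0 + max(0.0, 2.0 * (1 - abs(t - 7.5) / 2.5)) for t in times]
    heat = integrate_injection(times, power, t_start=5.0, t_end=10.0)
    # heat equals the triangle area, 5.0
    ```

    Serial processing of many data sets then reduces to calling such a routine once per injection window, which is the loop NITPIC performs with a far more careful, unbiased baseline estimate.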

  18. A fast and high performance multiple data integration algorithm for identifying human disease genes

    PubMed Central

    2015-01-01

    Background Integrating multiple data sources is indispensable in improving disease gene identification. It is not only due to the fact that disease genes associated with similar genetic diseases tend to lie close with each other in various biological networks, but also due to the fact that gene-disease associations are complex. Although various algorithms have been proposed to identify disease genes, their prediction performances and the computational time still should be further improved. Results In this study, we propose a fast and high performance multiple data integration algorithm for identifying human disease genes. A posterior probability of each candidate gene associated with individual diseases is calculated by using a Bayesian analysis method and a binary logistic regression model. Two prior probability estimation strategies and two feature vector construction methods are developed to test the performance of the proposed algorithm. Conclusions The proposed algorithm not only generates predictions with high AUC scores, but also runs very fast. When only a single PPI network is employed, the AUC score is 0.769 by using F2 as feature vectors. The average running time for each leave-one-out experiment is only around 1.5 seconds. When three biological networks are integrated, the AUC score using F3 as feature vectors increases to 0.830, and the average running time for each leave-one-out experiment takes only about 12.54 seconds. It is better than many existing algorithms. PMID:26399620
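
    The logistic-regression component can be sketched with a single hypothetical feature (closeness to known disease genes in a PPI network). This is a stand-in for the paper's Bayesian analysis with its two prior strategies; the feature values, learning rate, and epoch count are all illustrative.

    ```python
    import math

    def train_logistic(X, y, lr=0.5, epochs=500):
        """Fit w, b for P(disease gene | features) = sigmoid(w.x + b)
        by batch gradient descent on the log-loss."""
        w, b = [0.0] * len(X[0]), 0.0
        for _ in range(epochs):
            gw, gb = [0.0] * len(w), 0.0
            for xi, yi in zip(X, y):
                z = sum(wj * xj for wj, xj in zip(w, xi)) + b
                p = 1 / (1 + math.exp(-z))
                for j, xj in enumerate(xi):
                    gw[j] += (p - yi) * xj
                gb += p - yi
            w = [wj - lr * gwj / len(X) for wj, gwj in zip(w, gw)]
            b -= lr * gb / len(X)
        return w, b

    # Hypothetical training data: network closeness of known positives/negatives
    X = [[0.9], [0.8], [0.75], [0.2], [0.1], [0.15]]
    y = [1, 1, 1, 0, 0, 0]
    w, b = train_logistic(X, y)
    score = lambda x: 1 / (1 + math.exp(-(w[0] * x + b)))
    # a candidate with closeness 0.85 scores above 0.5; one at 0.12 scores below
    ```

    In the paper this scalar is replaced by the F2/F3 feature vectors built from one or three biological networks, and the fitted probability serves as the posterior used for ranking.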

  19. Information Integration in Multiple Cue Judgment: A Division of Labor Hypothesis

    ERIC Educational Resources Information Center

    Juslin, Peter; Karlsson, Linnea; Olsson, Henrik

    2008-01-01

    There is considerable evidence that judgment is constrained to additive integration of information. The authors propose an explanation of why serial and additive cognitive integration can produce accurate multiple cue judgment both in additive and non-additive environments in terms of an adaptive division of labor between multiple representations.…

  20. 77 FR 33486 - Certain Integrated Circuit Packages Provided With Multiple Heat-Conducting Paths and Products...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-06

    ... COMMISSION Certain Integrated Circuit Packages Provided With Multiple Heat- Conducting Paths and Products... With Multiple Heat-Conducting Paths and Products Containing Same, DN 2899; the Commission is soliciting... multiple heat-conducting paths and products containing same. The complaint names as respondents...

  1. Integrating stakeholder values with multiple attributes to quantify watershed performance

    NASA Astrophysics Data System (ADS)

    Shriver, Deborah M.; Randhir, Timothy O.

    2006-08-01

    Integrating stakeholder values into the process of quantifying impairment of ecosystem functions is an important aspect of watershed assessment and planning. This study develops a classification and prioritization model to assess potential impairment in watersheds. A systematic evaluation of a broad set of abiotic, biotic, and human indicators of watershed structure and function was used to identify the level of degradation at a subbasin scale. Agencies and communities can use the method to effectively target and allocate resources to areas of greatest restoration need. The watershed performance measure (WPM) developed in this study is composed of three major components: (1) hydrologic processes (water quantity and quality), (2) biodiversity at a species scale (core and priority habitat for rare and endangered species and species richness) and landscape scale (impacts of fragmentation), and (3) urban impacts as assessed in the built environment (effective impervious area) and population effects (densities and density of toxic waste sites). Simulation modeling using the Soil and Water Assessment Tool (SWAT), monitoring information, and spatial analysis with GIS were used to assess each criterion in developing this model. Weights for attributes of potential impairment were determined through the use of the attribute prioritization procedure with a panel of expert stakeholders. This procedure uses preselected attributes and corresponding stakeholder values and is data intensive. The model was applied to all subbasins of the Chicopee River Watershed of western Massachusetts, an area with a mixture of rural, heavily forested lands, suburban, and urbanized areas. Highly impaired subbasins in one community were identified using this methodology and evaluated for principal forms of degradation and potential restoration policies and BMPs. This attribute-based prioritization method could be used in identifying baselines, prioritization policies, and adaptive community

  2. Integrated Force Method for Indeterminate Structures

    NASA Technical Reports Server (NTRS)

    Hopkins, Dale A.; Halford, Gary R.; Patnaik, Surya N.

    2008-01-01

    Two methods of solving indeterminate structural-mechanics problems have been developed as products of research on the theory of strain compatibility. In these methods, stresses are considered to be the primary unknowns (in contrast to strains and displacements being considered as the primary unknowns in some prior methods). One of these methods, denoted the integrated force method (IFM), makes it possible to compute stresses, strains, and displacements with high fidelity by use of modest finite-element models that entail relatively small amounts of computation. The other method, denoted the completed Beltrami-Michell formulation (CBMF), enables direct determination of stresses in an elastic continuum with general boundary conditions, without the need to first calculate displacements as in traditional methods. The equilibrium equation, the compatibility condition, and the material law are the three fundamental concepts of the theory of structures. For almost 150 years, it has been commonly supposed that the theory is complete. However, until now, the understanding of the compatibility condition remained incomplete, and the compatibility condition was confused with the continuity condition. Furthermore, the compatibility condition as applied to structures in its previous incomplete form was inconsistent with the strain formulation in elasticity.

  3. Real object-based 360-degree integral-floating display using multiple depth camera

    NASA Astrophysics Data System (ADS)

    Erdenebat, Munkh-Uchral; Dashdavaa, Erkhembaatar; Kwon, Ki-Chul; Wu, Hui-Ying; Yoo, Kwan-Hee; Kim, Young-Seok; Kim, Nam

    2015-03-01

    A novel 360-degree integral-floating display based on a real object is proposed. The general procedure of the display system is similar to that of conventional 360-degree integral-floating displays. Unlike previously presented 360-degree displays, however, the proposed system displays a 3D image generated from a real object in a 360-degree viewing zone. To do so, multiple depth cameras are utilized to acquire depth information around the object, and 3D point cloud representations of the real object are reconstructed from the acquired depth information. Using a special point cloud registration method, the multiple virtual 3D point cloud representations captured by the individual depth cameras are combined into a single synthetic 3D point cloud model, and elemental image arrays are generated for the newly synthesized model at the angular step of the given anamorphic optic system. The theory has been verified experimentally, showing that the proposed 360-degree integral-floating display is an excellent way to display a real object in a 360-degree viewing zone.

  4. Identification of Functional Modules by Integration of Multiple Data Sources Using a Bayesian Network Classifier

    PubMed Central

    Wang, Jinlian; Zuo, Yiming; Liu, Lun; Man, Yangao; Tadesse, Mahlet G.; Ressom, Habtom W

    2014-01-01

    Background Prediction of functional modules is indispensable for detecting protein deregulation in human complex diseases such as cancer. Bayesian network (BN) is one of the most commonly used models to integrate heterogeneous data from multiple sources such as protein domain, interactome, functional annotation, genome-wide gene expression, and the literature. Methods and Results In this paper, we present a BN classifier that is customized to: 1) increase the ability to integrate diverse information from different sources, 2) effectively predict protein-protein interactions, 3) infer aberrant networks with scale-free and small world properties, and 4) group molecules into functional modules or pathways based on the primary function and biological features. Application of this model on discovering protein biomarkers of hepatocelluar carcinoma (HCC) leads to the identification of functional modules that provide insights into the mechanism of the development and progression of HCC. These functional modules include cell cycle deregulation, increased angiogenesis (e.g., vascular endothelial growth factor, blood vessel morphogenesis), oxidative metabolic alterations, and aberrant activation of signaling pathways involved in cellular proliferation, survival, and differentiation. Conclusion The discoveries and conclusions derived from our customized BN classifier are consistent with previously published results. The proposed approach for determining BN structure facilitates the integration of heterogeneous data from multiple sources to elucidate the mechanisms of complex diseases. PMID:24736851

  5. Methods of Genomic Competency Integration in Practice

    PubMed Central

    Jenkins, Jean; Calzone, Kathleen A.; Caskey, Sarah; Culp, Stacey; Weiner, Marsha; Badzek, Laurie

    2015-01-01

    Purpose Genomics is increasingly relevant to health care, necessitating support for nurses to incorporate genomic competencies into practice. The primary aim of this project was to develop, implement, and evaluate a year-long genomic education intervention that trained, supported, and supervised institutional administrator and educator champion dyads to increase nursing capacity to integrate genomics through assessments of program satisfaction and institutional achieved outcomes. Design Longitudinal study of 23 Magnet Recognition Program® Hospitals (21 intervention, 2 controls) participating in a 1-year new competency integration effort aimed at increasing genomic nursing competency and overcoming barriers to genomics integration in practice. Methods Champion dyads underwent genomic training consisting of one in-person kick-off training meeting followed by monthly education webinars. Champion dyads designed institution-specific action plans detailing objectives, methods or strategies used to engage and educate nursing staff, timeline for implementation, and outcomes achieved. Action plans focused on a minimum of seven genomic priority areas: champion dyad personal development; practice assessment; policy content assessment; staff knowledge needs assessment; staff development; plans for integration; and anticipated obstacles and challenges. Action plans were updated quarterly, outlining progress made as well as inclusion of new methods or strategies. Progress was validated through virtual site visits with the champion dyads and chief nursing officers. Descriptive data were collected on all strategies or methods utilized, and timeline for achievement. Descriptive data were analyzed using content analysis. Findings The complexity of the competency content and the uniqueness of social systems and infrastructure resulted in a significant variation of champion dyad interventions. Conclusions Nursing champions can facilitate change in genomic nursing capacity through

  6. Curriculum Integration in Arts Education: Connecting Multiple Art Forms through the Idea of "Space"

    ERIC Educational Resources Information Center

    Bautista, Alfredo; Tan, Liang See; Ponnusamy, Letchmi Devi; Yau, Xenia

    2016-01-01

    Arts integration research has focused on documenting how the teaching of specific art forms can be integrated with "core" academic subject matters (e.g. science, mathematics and literacy). However, the question of how the teaching of multiple art forms themselves can be integrated in schools remains to be explored by educational…

  7. A Novel Method of Line Detection using Image Integration Method

    NASA Astrophysics Data System (ADS)

    Lin, Daniel; Sun, Bo

    2015-03-01

    We developed a novel line detection algorithm based on an image integration method. The Hough transform detects lines on an image using a spatial image gradient method. This is problematic because, if the image contains a region of high noise intensity, the gradient points towards the noisy region. Denoising the image requires applying a sophisticated noise-reduction algorithm, which increases computational complexity. Our algorithm remedies this problem by averaging the pixels around the image region of interest. We were able to detect collagen fiber lines on an image produced by a confocal microscope.
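    The abstract does not spell out the integration step; one natural reading is a summed-area table (integral image), which yields the mean of any pixel window in four lookups and so makes the averaging cheap. The sketch below is an assumption in that spirit, run on a synthetic noisy image containing one bright diagonal line.

```python
import numpy as np

# Synthetic noisy image with one bright diagonal line (an illustrative
# stand-in for a confocal image of a collagen fiber).
rng = np.random.default_rng(1)
img = rng.normal(0.0, 0.2, size=(64, 64))
rr = np.arange(64)
img[rr, rr] += 1.0  # the line

# Integral image (summed-area table): S[i, j] = sum of img[:i, :j].
S = np.zeros((65, 65))
S[1:, 1:] = np.cumsum(np.cumsum(img, axis=0), axis=1)

def box_mean(i, j, r=2):
    """Mean of the (2r+1)x(2r+1) window centred at (i, j), via 4 lookups."""
    i0, i1 = max(i - r, 0), min(i + r + 1, 64)
    j0, j1 = max(j - r, 0), min(j + r + 1, 64)
    area = (i1 - i0) * (j1 - j0)
    return (S[i1, j1] - S[i0, j1] - S[i1, j0] + S[i0, j0]) / area

# Averaging suppresses pixel noise while line pixels keep a raised mean.
smoothed = np.array([[box_mean(i, j) for j in range(64)] for i in range(64)])
on_line = smoothed[rr, rr].mean()
off_line = smoothed[rr, (rr + 20) % 64].mean()
```

    A line detector can then threshold `smoothed` and vote in Hough space without a separate denoising pass.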

  8. Using Images, Metaphor, and Hypnosis in Integrating Multiple Personality and Dissociative States: A Review of the Literature.

    ERIC Educational Resources Information Center

    Crawford, Carrie L.

    1990-01-01

    Reviews literature on hypnosis, imagery, and metaphor as applied to the treatment and integration of those with multiple personality disorder (MPD) and dissociative states. Considers diagnostic criteria of MPD; explores current theories of etiology and treatment; and suggests specific examples of various clinical methods of treatment using…

  9. Solution methods for very highly integrated circuits.

    SciTech Connect

    Nong, Ryan; Thornquist, Heidi K.; Chen, Yao; Mei, Ting; Santarelli, Keith R.; Tuminaro, Raymond Stephen

    2010-12-01

    While advances in manufacturing enable the fabrication of integrated circuits containing tens-to-hundreds of millions of devices, the time-sensitive modeling and simulation necessary to design these circuits poses a significant computational challenge. This is especially true for mixed-signal integrated circuits where detailed performance analyses are necessary for the individual analog/digital circuit components as well as the full system. When the integrated circuit has millions of devices, performing a full system simulation is practically infeasible using currently available Electrical Design Automation (EDA) tools. The principal reason for this is the time required for the nonlinear solver to compute the solutions of large linearized systems during the simulation of these circuits. The research presented in this report aims to address the computational difficulties introduced by these large linearized systems by using Model Order Reduction (MOR) to (i) generate specialized preconditioners that accelerate the computation of the linear system solution and (ii) reduce the overall dynamical system size. MOR techniques attempt to produce macromodels that capture the desired input-output behavior of larger dynamical systems and enable substantial speedups in simulation time. Several MOR techniques that have been developed under the LDRD on 'Solution Methods for Very Highly Integrated Circuits' will be presented in this report. Among those presented are techniques for linear time-invariant dynamical systems that either extend current approaches or improve the time-domain performance of the reduced model using novel error bounds and a new approach for linear time-varying dynamical systems that guarantees dimension reduction, which has not been proven before. Progress on preconditioning power grid systems using multi-grid techniques will be presented as well as a framework for delivering MOR techniques to the user community using Trilinos and the Xyce circuit simulator.

  10. Multiple imputation methods for bivariate outcomes in cluster randomised trials.

    PubMed

    DiazOrdaz, K; Kenward, M G; Gomes, M; Grieve, R

    2016-09-10

    Missing observations are common in cluster randomised trials. The problem is exacerbated when modelling bivariate outcomes jointly, as the proportion of complete cases is often considerably smaller than the proportion having either of the outcomes fully observed. Approaches taken to handling such missing data include the following: complete case analysis, single-level multiple imputation that ignores the clustering, multiple imputation with a fixed effect for each cluster and multilevel multiple imputation. We contrasted the alternative approaches to handling missing data in a cost-effectiveness analysis that uses data from a cluster randomised trial to evaluate an exercise intervention for care home residents. We then conducted a simulation study to assess the performance of these approaches on bivariate continuous outcomes, in terms of confidence interval coverage and empirical bias in the estimated treatment effects. Missing-at-random clustered data scenarios were simulated following a full-factorial design. Across all the missing data mechanisms considered, the multiple imputation methods provided estimators with negligible bias, while complete case analysis resulted in biased treatment effect estimates in scenarios where the randomised treatment arm was associated with missingness. Confidence interval coverage was generally in excess of nominal levels (up to 99.8%) following fixed-effects multiple imputation and too low following single-level multiple imputation. Multilevel multiple imputation led to coverage levels of approximately 95% throughout. © 2016 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd. PMID:26990655
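    The fixed-effects imputation strategy compared above can be illustrated with a toy sketch: draw each missing value from its own cluster's observed distribution, repeat M times, and pool the point estimates across imputed data sets in the spirit of Rubin's rules. All data here are simulated and univariate; this is not the trial's actual analysis, which was bivariate and used full multilevel models.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy clustered outcome: 10 clusters x 20 residents, with cluster effects.
n_clust, n_per = 10, 20
cluster = np.repeat(np.arange(n_clust), n_per)
y = rng.normal(5.0, 1.0, n_clust)[cluster] + rng.normal(0, 1, n_clust * n_per)

# Make ~30% of observations missing (completely at random, for simplicity).
miss = rng.random(y.size) < 0.3

# Fixed-effects multiple imputation sketch: draw each missing value from a
# normal centred on its cluster's observed mean, M times, then pool.
M, estimates = 20, []
for _ in range(M):
    y_imp = y.copy()
    for c in range(n_clust):
        obs = y[(cluster == c) & ~miss]
        idx = (cluster == c) & miss
        y_imp[idx] = rng.normal(obs.mean(), obs.std(ddof=1), idx.sum())
    estimates.append(y_imp.mean())

pooled = np.mean(estimates)              # pooled point estimate
between_var = np.var(estimates, ddof=1)  # between-imputation variance
```

    Rubin's rules would combine `between_var` with the average within-imputation variance to get standard errors; the coverage differences reported in the abstract come from how well each imputation model respects the clustering.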

  11. Integrability: mathematical methods for studying solitary waves theory

    NASA Astrophysics Data System (ADS)

    Wazwaz, Abdul-Majid

    2014-03-01

    In recent decades, substantial experimental research efforts have been devoted to linear and nonlinear physical phenomena. In particular, studies of integrable nonlinear equations in solitary waves theory have attracted intensive interest from mathematicians, with the principal goal of fostering the development of new methods, and physicists, who are seeking solutions that represent physical phenomena and to form a bridge between mathematical results and scientific structures. The aim for both groups is to build up our current understanding and facilitate future developments, develop more creative results and create new trends in the rapidly developing field of solitary waves. The notion of the integrability of certain partial differential equations occupies an important role in current and future trends, but a unified rigorous definition of the integrability of differential equations still does not exist. For example, an integrable model in the Painlevé sense may not be integrable in the Lax sense. The Painlevé sense indicates that the solution can be represented as a Laurent series in powers of some function that vanishes on an arbitrary surface with the possibility of truncating the Laurent series at finite powers of this function. The concept of Lax pairs introduces another meaning of the notion of integrability. The Lax pair formulates the integrability of nonlinear equation as the compatibility condition of two linear equations. However, it was shown by many researchers that the necessary integrability conditions are the existence of an infinite series of generalized symmetries or conservation laws for the given equation. The existence of multiple soliton solutions often indicates the integrability of the equation but other tests, such as the Painlevé test or the Lax pair, are necessary to confirm the integrability for any equation. In the context of completely integrable equations, studies are flourishing because these equations are able to describe the

  12. Method for measuring multiple scattering corrections between liquid scintillators

    NASA Astrophysics Data System (ADS)

    Verbeke, J. M.; Glenn, A. M.; Keefer, G. J.; Wurtz, R. E.

    2016-07-01

    A time-of-flight method is proposed to experimentally quantify the fractions of neutrons scattering between scintillators. An array of scintillators is characterized in terms of crosstalk with this method by measuring a californium source, for different neutron energy thresholds. The spectral information recorded by the scintillators can be used to estimate the fractions of neutrons multiple scattering. With the help of a correction to Feynman's point model theory to account for multiple scattering, these fractions can in turn improve the mass reconstruction of fissile materials under investigation.
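    The time-of-flight idea can be sketched on simulated data: a neutron that scatters from one scintillator into another arrives with a time difference near the inter-detector flight time, whereas true source coincidences cluster near zero. The geometry, neutron speed, and event counts below are all assumptions for illustration, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy event list: time differences (ns) between hits in two scintillators.
d = 0.5                   # detector separation in metres (assumed)
v = 2.0e7                 # ~2 MeV neutron speed in m/s (assumed)
t_flight = d / v * 1e9    # expected crosstalk flight time, ns (25 ns here)

dt = np.concatenate([
    rng.normal(0.0, 2.0, 800),        # true source coincidences
    rng.normal(t_flight, 2.0, 200),   # neutrons scattering between detectors
])

# Time-of-flight cut: count pairs within +/- 3 ns of the flight time.
crosstalk = np.abs(dt - t_flight) < 3.0
fraction = crosstalk.mean()
```

    The measured `fraction` is what feeds the multiple-scattering correction to the point model when reconstructing fissile mass.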

  13. Method for measuring multiple scattering corrections between liquid scintillators

    DOE PAGESBeta

    Verbeke, J. M.; Glenn, A. M.; Keefer, G. J.; Wurtz, R. E.

    2016-04-11

    In this study, a time-of-flight method is proposed to experimentally quantify the fractions of neutrons scattering between scintillators. An array of scintillators is characterized in terms of crosstalk with this method by measuring a californium source, for different neutron energy thresholds. The spectral information recorded by the scintillators can be used to estimate the fractions of neutrons multiple scattering. With the help of a correction to Feynman's point model theory to account for multiple scattering, these fractions can in turn improve the mass reconstruction of fissile materials under investigation.

  14. Generalized multiple internal standard method for quantitative liquid chromatography mass spectrometry.

    PubMed

    Hu, Yuan-Liang; Chen, Zeng-Ping; Chen, Yao; Shi, Cai-Xia; Yu, Ru-Qin

    2016-05-01

    In this contribution, a multiplicative effects model for the generalized multiple-internal-standard method (MEMGMIS) is proposed to solve the signal instability problem of LC-MS over time. The MEMGMIS model seamlessly integrates the multiple-internal-standard strategy with a multivariate calibration method and makes full use of all the information carried by multiple internal standards during the quantification of target analytes. Unlike existing methods based on multiple internal standards, MEMGMIS does not require selecting an optimal internal standard for the quantification of each specific analyte. MEMGMIS was applied to a proof-of-concept model system: the simultaneous quantitative analysis of five edible artificial colorants in two kinds of cocktail drinks. Experimental results demonstrated that MEMGMIS models established on LC-MS data of calibration samples prepared with ultrapure water provided quite satisfactory concentration predictions for colorants in cocktail samples from LC-MS data measured 10 days after the LC-MS analysis of the calibration samples. The average relative prediction errors of MEMGMIS models did not exceed 6.0%, considerably better than the corresponding values of commonly used univariate calibration models combined with multiple internal standards. Good performance and simple implementation render the MEMGMIS model a promising alternative tool in quantitative LC-MS assays. PMID:27072522
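    The pooling idea can be illustrated with a drastically simplified stand-in for the model: if run-to-run drift multiplies every signal by an unknown factor, all internal standards observe that factor jointly, so a least-squares estimate over all of them is steadier than any single standard's ratio. The numbers below are invented, and this is not the MEMGMIS model itself.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy LC-MS drift: each run k multiplies all signals by an unknown g_k.
true_g = np.array([1.00, 0.85, 0.70, 1.10])   # per-run drift (assumed)
std_ref = np.array([100.0, 200.0, 150.0])     # standard signals at calibration
analyte_ref = 50.0                            # analyte signal per unit conc.

# Observed signals = reference * drift * small multiplicative noise.
std_obs = std_ref[None, :] * true_g[:, None] * rng.normal(1, 0.01, (4, 3))
analyte_obs = analyte_ref * 2.0 * true_g * rng.normal(1, 0.01, 4)  # conc = 2

# Pool all internal standards: least-squares estimate of each run's drift,
# instead of picking one "best" standard.
g_hat = (std_obs * std_ref).sum(axis=1) / (std_ref ** 2).sum()

# Drift-corrected concentration estimate for each run.
conc_hat = analyte_obs / (g_hat * analyte_ref)
```

    With the drift divided out, the concentration estimates stay close to the true value even for runs measured under strong signal decay.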

  15. Integrative Data Analysis: The Simultaneous Analysis of Multiple Data Sets

    ERIC Educational Resources Information Center

    Curran, Patrick J.; Hussong, Andrea M.

    2009-01-01

    There are both quantitative and methodological techniques that foster the development and maintenance of a cumulative knowledge base within the psychological sciences. Most noteworthy of these techniques is meta-analysis, which allows for the synthesis of summary statistics drawn from multiple studies when the original data are not available.…

  16. Parallel methods for dynamic simulation of multiple manipulator systems

    NASA Technical Reports Server (NTRS)

    Mcmillan, Scott; Sadayappan, P.; Orin, David E.

    1993-01-01

    In this paper, efficient dynamic simulation algorithms for a system of m manipulators, cooperating to manipulate a large load, are developed; their performance, using two possible forms of parallelism on a general-purpose parallel computer, is investigated. One form, temporal parallelism, is obtained with the use of parallel numerical integration methods. A speedup of 3.78 on four processors of a CRAY Y-MP8 was achieved with a parallel four-point block predictor-corrector method for the simulation of a four-manipulator system. These multi-point methods suffer from reduced accuracy, and when comparing these runs with a serial integration method, the speedup can be as low as 1.83 for simulations with the same accuracy. To regain the performance lost due to accuracy problems, a second form of parallelism is employed. Spatial parallelism allows most of the dynamics of each manipulator chain to be computed simultaneously. Used exclusively in the four-processor case, this form of parallelism in conjunction with a serial integration method results in a speedup of 3.1 on four processors over the best serial method. In cases where there are either more processors available or fewer chains in the system, the multi-point parallel integration methods are still advantageous despite the reduced accuracy because both forms of parallelism can then combine to generate more parallel tasks and achieve greater effective speedups. This paper also includes results for these cases.
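    The predictor-corrector family behind the temporal parallelism can be sketched with a minimal serial member of it: a two-step Adams-Bashforth predictor with an Adams-Moulton (trapezoidal) corrector. The paper's parallel scheme is a four-point block variant that evaluates several such steps concurrently; this sketch only shows the PECE structure on a scalar test problem.

```python
import numpy as np

def pece(f, t0, y0, h, n):
    """Two-step Adams-Bashforth predict / Adams-Moulton correct for y'=f(t,y).
    Block multi-point methods evaluate several of these steps at once, which
    is the source of the temporal parallelism described above."""
    f_prev = f(t0, y0)
    y1 = y0 + h * f_prev          # bootstrap the second point with Euler
    ts, ys = [t0, t0 + h], [y0, y1]
    for _ in range(n - 1):
        t, y = ts[-1], ys[-1]
        f_curr = f(t, y)
        y_pred = y + h * (1.5 * f_curr - 0.5 * f_prev)      # AB2 predict
        y_corr = y + h * 0.5 * (f(t + h, y_pred) + f_curr)  # AM2 correct
        ts.append(t + h)
        ys.append(y_corr)
        f_prev = f_curr
    return np.array(ts), np.array(ys)

# Test problem y' = -y, y(0) = 1, integrated to t = 1 (exact answer e^{-1}).
ts, ys = pece(lambda t, y: -y, 0.0, 1.0, 0.01, 100)
```

    In the manipulator simulation, `f` would be the full multibody dynamics, which is where spatial parallelism across chains contributes.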

  17. Multiple Solution Methods and Multiple Outcomes--Is It a Task for Kindergarten Children?

    ERIC Educational Resources Information Center

    Tsamir, Pessia; Tirosh, Dina; Tabach, Michal; Levenson, Esther

    2010-01-01

    Engaging students with multiple solution problems is considered good practice. Solutions to problems consist of the outcomes of the problem as well as the methods employed to reach these outcomes. In this study we analyze the results obtained from two groups of kindergarten children who engaged in one task, the Create an Equal Number Task. This…

  18. Multiple Contexts, Multiple Methods: A Study of Academic and Cultural Identity among Children of Immigrant Parents

    ERIC Educational Resources Information Center

    Urdan, Tim; Munoz, Chantico

    2012-01-01

    Multiple methods were used to examine the academic motivation and cultural identity of a sample of college undergraduates. The children of immigrant parents (CIPs, n = 52) and the children of non-immigrant parents (non-CIPs, n = 42) completed surveys assessing core cultural identity, valuing of cultural accomplishments, academic self-concept,…

  19. 360 degree viewable floating autostereoscopic display using integral photography and multiple semitransparent mirrors.

    PubMed

    Zhao, Dong; Su, Baiquan; Chen, Guowen; Liao, Hongen

    2015-04-20

    In this paper, we present a polyhedron-shaped floating autostereoscopic display viewable from 360 degrees, using integral photography (IP) and multiple semitransparent mirrors. IP combined with polyhedron-shaped multiple semitransparent mirrors achieves a 360-degree viewable floating three-dimensional (3D) autostereoscopic display, with the advantage that several observers can view it from various viewpoints simultaneously. IP is adopted to generate a 3D autostereoscopic image with full parallax. The multiple semitransparent mirrors reflect the corresponding IP images, and the reflected images are situated around the center of the polyhedron-shaped display device to produce the floating display. The spatially reflected IP images reconstruct a floating autostereoscopic image viewable from 360 degrees. We manufactured two prototypes and performed two sets of experiments to evaluate the feasibility of the method described above. The results showed that our approach achieves a floating autostereoscopic display viewable from the surrounding area, and that the proposed method provides a continuous viewpoint over the whole 360-degree range without flipping. PMID:25969022

  20. Satellite attitude prediction by multiple time scales method

    NASA Technical Reports Server (NTRS)

    Tao, Y. C.; Ramnath, R.

    1975-01-01

    An investigation is made of the problem of predicting the attitude of satellites under the influence of external disturbing torques. The attitude dynamics are first expressed in a perturbation formulation, which is then solved by the multiple scales approach. The independent variable, time, is extended into new scales (fast, slow, etc.), and the integration is carried out separately in the new variables. The theory is applied to two different satellite configurations, rigid body and dual spin, each of which may have an asymmetric mass distribution. The disturbing torques considered are gravity gradient and geomagnetic. Finally, because the multiple time scales approach separates the slow and fast behaviors of the satellite attitude motion, this property is used for the design of an attitude control device. A nutation damping control loop, using the geomagnetic torque for an earth-pointing dual spin satellite, is designed in terms of the slow equation.
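    The separation of fast and slow behavior that the method exploits can be illustrated on a textbook example (a weakly damped oscillator, not the satellite equations themselves):

```latex
% Two-timing sketch for  y'' + \epsilon y' + y = 0,  0 < \epsilon \ll 1.
% Introduce a fast time T_0 = t and a slow time T_1 = \epsilon t, so that
% d/dt = \partial_{T_0} + \epsilon\,\partial_{T_1}, and expand
%   y = y_0(T_0, T_1) + \epsilon\, y_1(T_0, T_1) + \cdots
\begin{align*}
O(1):\quad & \partial_{T_0}^2 y_0 + y_0 = 0
  \;\Rightarrow\; y_0 = A(T_1)\cos\bigl(T_0 + \phi(T_1)\bigr),\\
O(\epsilon):\quad & \partial_{T_0}^2 y_1 + y_1
  = -2\,\partial_{T_0}\partial_{T_1} y_0 - \partial_{T_0} y_0.
\end{align*}
% Suppressing secular (resonant) terms on the right-hand side forces
%   A'(T_1) = -\tfrac{1}{2}A, \qquad \phi'(T_1) = 0,
% so  y \approx A_0 e^{-\epsilon t/2}\cos(t + \phi_0):
% the slow scale carries the amplitude decay, the fast scale the oscillation.
```

    In the satellite problem the same separation is what lets the nutation damping loop be designed from the slow equation alone.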

  1. Restructuring for Integrative Education: Multiple Perspectives, Multiple Contexts. Critical Studies in Education and Culture Series.

    ERIC Educational Resources Information Center

    Jennings, Todd, Ed.

    Integrative education is defined as education that promotes learning and teaching in nonfragmented ways that embrace notions of holism, complexity, and interconnection. Furthermore, integrative education embraces the links, rather than the divisions, between the academic disciplines (e.g., arts and sciences) and between various subjective and…

  2. Identifying multiple submissions in Internet research: preserving data integrity.

    PubMed

    Bowen, Anne M; Daniel, Candice M; Williams, Mark L; Baird, Grayson L

    2008-11-01

    Internet-based sexuality research with hidden populations has become increasingly popular. Respondent anonymity may encourage participation and lower social desirability, but associated disinhibition may promote multiple submissions, especially when incentives are offered. The goal of this study was to identify the usefulness of different variables for detecting multiple submissions from repeat responders and to explore incentive effects. The data included 1,900 submissions from a three-session Internet intervention with a pretest and three post-test questionnaires. Participants were men who have sex with men, and incentives were offered to rural participants for completing each questionnaire. The final number of submissions included 1,273 "unique", 132 first submissions by "repeat responders" and 495 additional submissions by the "repeat responders" (N = 1,900). Four categories of repeat responders were identified: "infrequent" (2-5 submissions), "persistent" (6-10 submissions), "very persistent" (11-30 submissions), and "hackers" (more than 30 submissions). Internet Protocol (IP) addresses, user names, and passwords were the most useful for identifying "infrequent" repeat responders. "Hackers" often varied their IP address and identifying information to prevent easy identification, but investigating the data for small variations in IP, using reverse telephone lookup, and examining patterns across usernames and passwords were helpful. Incentives appeared to play a role in stimulating multiple submissions, especially from the more sophisticated "hackers". Finally, the web is ever evolving, and it will be necessary to have good programmers and staff who evolve as fast as the "hackers". PMID:18240015
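    The screening logic for "infrequent" repeat responders amounts to grouping submissions by identifying fields at different strictness levels. A minimal sketch, with entirely made-up records standing in for the study's identifiers (IP address, username, password):

```python
from collections import defaultdict

# Toy submission log; every value here is invented for illustration.
submissions = [
    {"ip": "10.0.0.1", "user": "alpha", "pw": "h1"},
    {"ip": "10.0.0.1", "user": "alpha", "pw": "h1"},  # exact repeat
    {"ip": "10.0.0.2", "user": "beta",  "pw": "h2"},
    {"ip": "10.0.0.3", "user": "alpha", "pw": "h1"},  # same person, new IP
]

def flag_repeats(records, keys):
    """Flag each record whose key tuple has been seen before."""
    seen, flags = defaultdict(int), []
    for r in records:
        k = tuple(r[f] for f in keys)
        flags.append(seen[k] > 0)
        seen[k] += 1
    return flags

# Strict matching catches exact duplicates; dropping the IP from the key
# also catches responders who vary their IP address between submissions.
strict = flag_repeats(submissions, ("ip", "user", "pw"))
loose = flag_repeats(submissions, ("user", "pw"))
```

    The study's harder cases ("hackers" varying all fields) need fuzzier signals, such as near-matching IPs or timing patterns, which this sketch does not attempt.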

  3. Path Integral Monte Carlo Methods for Fermions

    NASA Astrophysics Data System (ADS)

    Ethan, Ethan; Dubois, Jonathan; Ceperley, David

    2014-03-01

    In general, Quantum Monte Carlo methods suffer from a sign problem when simulating fermionic systems. This causes the efficiency of a simulation to decrease exponentially with the number of particles and inverse temperature. To circumvent this issue, a nodal constraint is often implemented, restricting the Monte Carlo procedure from sampling paths that cause the many-body density matrix to change sign. Unfortunately, this high-dimensional nodal surface is not a priori known unless the system is exactly solvable, resulting in uncontrolled errors. We will discuss two possible routes to extend the applicability of finite-temperature path integral Monte Carlo. First we extend the regime where signful simulations are possible through a novel permutation sampling scheme. Afterwards, we discuss a method to variationally improve the nodal surface by minimizing a free energy during simulation. Applications of these methods will include both free and interacting electron gases, concluding with a discussion concerning extension to inhomogeneous systems. Support from DOE DE-FG52-09NA29456, DE-AC52-07NA27344, LLNL LDRD 10- ERD-058, and the Lawrence Scholar program.

  4. Multiple integral representation for the trigonometric SOS model with domain wall boundaries

    NASA Astrophysics Data System (ADS)

    Galleas, W.

    2012-05-01

    Using the dynamical Yang-Baxter algebra we derive a functional equation for the partition function of the trigonometric SOS model with domain wall boundary conditions. The solution of the equation is given in terms of a multiple contour integral.

  5. Integrating multiple scales of hydraulic conductivity measurements in training image-based stochastic models

    NASA Astrophysics Data System (ADS)

    Mahmud, K.; Mariethoz, G.; Baker, A.; Sharma, A.

    2015-01-01

    Hydraulic conductivity is one of the most critical and at the same time one of the most uncertain parameters in many groundwater models. One problem commonly faced is that the data are usually not collected at the same scale as the discretized elements used in a numerical model. Moreover, it is common that different types of hydraulic conductivity measurements, corresponding to different spatial scales, coexist in the studied domain and have to be integrated simultaneously. Here we address this issue in the context of Image Quilting, one of the recently developed multiple-point geostatistics methods. Based on a training image that represents fine-scale spatial variability, we use the simplified renormalization upscaling method to obtain a series of upscaled training images that correspond to the different scales at which measurements are available. We then apply Image Quilting with such a multiscale training image to be able to incorporate simultaneously conditioning data at several spatial scales of heterogeneity. The realizations obtained satisfy the conditioning data exactly across all scales, but this can come at the expense of a small approximation in the representation of the physical scale relationships. In order to mitigate this approximation, we iteratively apply a kriging-based correction to the finest scale that ensures local conditioning at the coarsest scales. The method is tested on a series of synthetic examples where it gives good results and shows potential for the integration of different measurement methods in real-world hydrogeological models.
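
The multiscale training-image idea can be sketched as follows. As an assumption for illustration only, the coarsening rule here is a blockwise geometric mean of the conductivity field, a common simple proxy for renormalization-style upscaling in 2-D isotropic media; the actual simplified renormalization rule used by the authors is more elaborate.

```python
import numpy as np

def upscale_once(k):
    """One coarsening step: replace each 2x2 block of the fine-scale
    conductivity field by its geometric mean (an illustrative stand-in
    for the simplified renormalization rule)."""
    ny, nx = k.shape
    assert ny % 2 == 0 and nx % 2 == 0
    blocks = k.reshape(ny // 2, 2, nx // 2, 2)
    return np.exp(np.log(blocks).mean(axis=(1, 3)))

def training_image_pyramid(k, n_levels):
    """Series of upscaled training images, one per measurement scale,
    as used to condition Image Quilting at several scales at once."""
    pyramid = [k]
    for _ in range(n_levels):
        pyramid.append(upscale_once(pyramid[-1]))
    return pyramid

# Lognormal conductivity field as a stand-in fine-scale training image.
k_fine = np.exp(np.random.default_rng(1).normal(size=(64, 64)))
levels = training_image_pyramid(k_fine, 3)
print([lvl.shape for lvl in levels])  # (64,64) -> (32,32) -> (16,16) -> (8,8)
```

With equal-sized blocks, each coarsening preserves the mean of log-conductivity exactly, so the pyramid stays statistically consistent across scales.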

  6. Integration of Multiple Organic Light Emitting Diodes and a Lens for Emission Angle Control

    NASA Astrophysics Data System (ADS)

    Rahadian, Fanny; Masada, Tatsuya; Fujieda, Ichiro

    We propose to integrate a single lens on top of multiple OLEDs. Angular distribution of the light emitted from the lens surface is altered by turning on the OLEDs selectively. We can use such a light source as a backlight for a liquid crystal display to switch its viewing angle range and/or to display multiple images in different directions. Pixel-level integration would allow one to construct an OLED display with a similar emission angle control.

  7. Characterizing lentic freshwater fish assemblages using multiple sampling methods

    USGS Publications Warehouse

    Fischer, Jesse R.; Quist, Michael

    2014-01-01

    Characterizing fish assemblages in lentic ecosystems is difficult, and multiple sampling methods are almost always necessary to gain reliable estimates of indices such as species richness. However, most research focused on lentic fish sampling methodology has targeted recreationally important species, and little to no information is available regarding the influence of multiple methods and timing (i.e., temporal variation) on characterizing entire fish assemblages. Therefore, six lakes and impoundments (48–1,557 ha surface area) were sampled seasonally with seven gear types to evaluate the combined influence of sampling methods and timing on the number of species and individuals sampled. Probabilities of detection for species indicated strong selectivities and seasonal trends that provide guidance on optimal seasons to use gears when targeting multiple species. The evaluation of species richness and number of individuals sampled using multiple gear combinations demonstrated no appreciable benefit over relatively few gears (e.g., up to four) used in optimal seasons. Specifically, over 90% of the species encountered with all gear types and season combinations (N = 19) from six lakes and reservoirs were sampled with nighttime boat electrofishing in the fall and benthic trawling, modified-fyke, and mini-fyke netting during the summer. Our results indicated that the characterization of lentic fish assemblages was highly influenced by the selection of sampling gears and seasons, but did not appear to be influenced by waterbody type (i.e., natural lake, impoundment). The standardization of data collected with multiple methods and seasons to account for bias is imperative to the monitoring of lentic ecosystems and will provide researchers with increased reliability in their interpretations and decisions made using information on lentic fish assemblages.

  8. Two-Dimensional Integral Combustion for Multiple Phase Flow

    Energy Science and Technology Software Center (ESTSC)

    1997-05-05

    This ANL multiphase two-dimensional combustion computer code solves conservation equations for gaseous species and solid particles (or droplets) of various sizes. General conservation laws, expressed by elliptic-type partial differential equations, are used in conjunction with rate equations governing the mass, momentum, enthalpy, species, turbulent kinetic energy, and turbulent dissipation for a two-phase reacting flow. Associated submodels include an integral combustion, a two-parameter turbulence, a particle evaporation, and interfacial submodels. A newly developed integral combustion submodel replacing an Arrhenius-type differential reaction submodel is implemented to improve numerical convergence and enhance numerical stability. The two-parameter turbulence submodel is modified for both gas and solid phases. The evaporation submodel treats size dispersion as well as particle evaporation. Interfacial submodels use correlations to model interfacial momentum and energy transfer.

  9. Accelerating Ab Initio Path Integral Simulations via Imaginary Multiple-Timestepping.

    PubMed

    Cheng, Xiaolu; Herr, Jonathan D; Steele, Ryan P

    2016-04-12

    This work investigates the use of multiple-timestep schemes in imaginary time for computationally efficient ab initio equilibrium path integral simulations of quantum molecular motion. In the simplest formulation, only every nth path integral replica is computed at the target level of electronic structure theory, whereas the remaining low-level replicas still account for nuclear motion quantum effects with a more computationally economical theory. Motivated by recent developments for multiple-timestep techniques in real-time classical molecular dynamics, both 1-electron (atomic-orbital basis set) and 2-electron (electron correlation) truncations are shown to be effective. Structural distributions and thermodynamic averages are tested for representative analytic potentials and ab initio molecular examples. Target quantum chemistry methods include density functional theory and second-order Møller-Plesset perturbation theory, although any level of theory is formally amenable to this framework. For a standard two-level splitting, computational speedups of 1.6-4.0x are observed when using a 4-fold reduction in time slices; an 8-fold reduction is feasible in some cases. Multitiered options further reduce computational requirements and suggest that quantum mechanical motion could potentially be obtained at a cost not significantly different from the cost of classical simulations. PMID:26966920

  10. Method and apparatus for fiber optic multiple scattering suppression

    NASA Technical Reports Server (NTRS)

    Ackerson, Bruce J. (Inventor)

    2000-01-01

    The instant invention provides a method and apparatus for use in laser induced dynamic light scattering which attenuates the multiple scattering component in favor of the single scattering component. The preferred apparatus utilizes two light detectors that are spatially and/or angularly separated and which simultaneously record the speckle pattern from a single sample. The recorded patterns from the two detectors are then cross correlated in time to produce one point on a composite single/multiple scattering function curve. By collecting and analyzing cross correlation measurements that have been taken at a plurality of different spatial/angular positions, the signal representative of single scattering may be differentiated from the signal representative of multiple scattering, and a near optimum detector separation angle for use in taking future measurements may be determined.
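
The key statistical idea, that cross-correlating two detectors keeps the shared single-scattering signal while the independent multiple-scattering contributions average away, can be shown with a toy model. This is a hypothetical illustration, not the patented apparatus: it assumes each detector records a common single-scattering fluctuation plus its own independent multiple-scattering noise of equal variance.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000
single = rng.normal(size=n)     # single-scattering fluctuation, seen by both
multi_a = rng.normal(size=n)    # multiple-scattering noise at detector A only
multi_b = rng.normal(size=n)    # multiple-scattering noise at detector B only

det_a = single + multi_a
det_b = single + multi_b

def corr(x, y):
    """Normalized zero-lag correlation of two intensity records."""
    x = x - x.mean()
    y = y - y.mean()
    return (x * y).mean() / (x.std() * y.std())

auto = corr(det_a, det_a)   # autocorrelation: contaminated, exactly 1 here
cross = corr(det_a, det_b)  # cross-correlation: only the shared part survives
```

With equal single- and multiple-scattering variances, the cross term converges to 0.5 while the uncorrelated noise contributes nothing, which is why the cross-correlated speckle measurement attenuates multiple scattering.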

  11. Identifying Multiple Submissions in Internet Research: Preserving Data Integrity

    PubMed Central

    Bowen, Anne M.; Daniel, Candice M.; Williams, Mark L.; Baird, Grayson L.

    2008-01-01

    Internet-based sexuality research with hidden populations has become increasingly popular. Respondent anonymity may encourage participation and lower social desirability, but associated disinhibition may promote multiple submissions, especially when incentives are offered. The goal of this study was to identify the usefulness of different variables for detecting multiple submissions from repeat responders and to explore incentive effects. The data included 1,900 submissions from a three-session Internet intervention with a pretest and three post-test questionnaires. Participants were men who have sex with men and incentives were offered to rural participants for completing each questionnaire. The final number of submissions included 1,273 “unique”, 132 first submissions by “repeat responders” and 495 additional submissions by the “repeat responders” (N = 1,900). Four categories of repeat responders were identified: “infrequent” (2–5 submissions), “persistent” (6–10 submissions), “very persistent” (11–30 submissions), and “hackers” (more than 30 submissions). Internet Provider (IP) addresses, user names, and passwords were the most useful for identifying “infrequent” repeat responders. “Hackers” often varied their IP address and identifying information to prevent easy identification, but investigating the data for small variations in IP, using reverse telephone look up, and patterns across usernames and passwords were helpful. Incentives appeared to play a role in stimulating multiple submissions, especially from the more sophisticated “hackers”. Finally, the web is ever evolving and it will be necessary to have good programmers and staff who evolve as fast as “hackers”. PMID:18240015
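
The cross-checks the study describes (small IP variations, reuse of usernames and passwords) can be sketched as a simple grouping pass. The field names and the /24-prefix heuristic below are hypothetical choices for illustration, not the authors' actual screening pipeline.

```python
from collections import defaultdict

def flag_repeat_responders(submissions):
    """Group submissions that share a /24 IP prefix or reuse a
    username/password pair; any group of more than one submission is
    flagged for manual review. Fields ("id", "ip", "username",
    "password") are assumed for illustration."""
    groups = defaultdict(list)
    for sub in submissions:
        prefix = ".".join(sub["ip"].split(".")[:3])  # catch small IP variations
        groups[("ip", prefix)].append(sub["id"])
        groups[("cred", sub["username"], sub["password"])].append(sub["id"])
    return {key: ids for key, ids in groups.items() if len(ids) > 1}

subs = [
    {"id": 1, "ip": "10.0.0.5", "username": "alice", "password": "pw1"},
    {"id": 2, "ip": "10.0.0.9", "username": "bob", "password": "pw2"},
    {"id": 3, "ip": "192.168.1.1", "username": "alice", "password": "pw1"},
]
flags = flag_repeat_responders(subs)
```

Here submissions 1 and 2 are flagged by IP proximity and 1 and 3 by credential reuse, mirroring how "infrequent" repeat responders are caught even when one identifier varies.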

  12. Multiple grid method for the calculation of potential flow around three dimensional bodies

    NASA Astrophysics Data System (ADS)

    Wolff, H.

    1982-01-01

    The classical approach of representing the solution by a doublet distribution on the boundary of the domain is considered. The boundary condition yields a Fredholm integral equation for the doublet distribution, mu, which is approximated by a piecewise constant function. This numerical method results in a nonsparse system that is solved by a multiple grid iterative process. The convergence rate of this process is discussed and its performance is compared with that of the Jacobi iterative process. For flow around an ellipsoid, the multiple grid process turns out to be much more efficient than the Jacobi process.

  13. Empathetic, Critical Integrations of Multiple Perspectives: A Core Practice for Language Teacher Education?

    ERIC Educational Resources Information Center

    Daniel, Shannon M.

    2015-01-01

    In this self-study, the author reflects on her implementation of empathetic, critical integrations of multiple perspectives (ECI), which she designed to afford preservice teachers the opportunity to discuss and collectively reflect upon the oft-diverging multiple perspectives, values, and practices they experience during their practicum (Daniel,…

  14. Energy Simulation of Integrated Multiple-Zone Variable Refrigerant Flow System

    SciTech Connect

    Shen, Bo; Rice, C Keith; Baxter, Van D

    2013-01-01

    We developed a detailed steady-state system model to simulate the performance of an integrated five-zone variable refrigerant flow (VRF) heat pump system. The system is multi-functional, capable of space cooling, space heating, combined space cooling and water heating, and dedicated water heating. Methods were developed to map the VRF performance in each mode, based on the abundant data produced by the equipment system model. The performance maps were used in TRNSYS annual energy simulations. Using TRNSYS, we successfully set up and ran cases for a multiple-split VRF heat pump and dehumidifier combination in five-zone houses in five climates that control indoor dry-bulb temperature and relative humidity. We compared the calculated energy consumption of the VRF heat pump against that of a baseline central air-source heat pump coupled with electric water heating and standalone dehumidifiers. In addition, we investigated multiple control scenarios for the VRF heat pump, i.e., on/off control, variable indoor air flow rate, and different zone temperature setting schedules. The energy savings for the multiple scenarios were assessed.

  15. A method for interactive specification of multiple-block topologies

    NASA Technical Reports Server (NTRS)

    Sorenson, Reese L.; Mccann, Karen M.

    1991-01-01

    A method is presented for dealing with the vast amount of topological and other data which must be specified to generate a multiple-block computational grid. Specific uses of the graphical capabilities of a powerful scientific workstation are described which reduce the burden on the user of collecting and formatting such large amounts of data. A program to implement this method, 3DPREP, is described. A plotting transformation algorithm, some useful software tools, notes on programming, and a database organization are also presented. Example grids developed using the method are shown.

  16. Improved parallel solution techniques for the integral transport matrix method

    SciTech Connect

    Zerr, Robert J; Azmy, Yousry Y

    2010-11-23

    Alternative solution strategies to the parallel block Jacobi (PBJ) method for the solution of the global problem with the integral transport matrix method (ITMM) operators have been designed and tested. The most straightforward improvement to the Jacobi iterative method is the Gauss-Seidel alternative. The parallel red-black Gauss-Seidel (PGS) algorithm can improve on the number of iterations and reduce work per iteration by applying an alternating red-black color-set to the subdomains and assigning multiple subdomains per processor. A parallel GMRES(m) method was implemented as an alternative to stationary iterations. Computational results show that the PGS method can improve on the PBJ method execution time by up to ~50% when eight subdomains per processor are used. However, compared to traditional source iterations with diffusion synthetic acceleration, it is still approximately an order of magnitude slower. The best-performing cases are optically thick because subdomains decouple, yielding faster convergence. Further tests revealed that 64 subdomains per processor was the best-performing level of subdomain division. An acceleration technique that improves the convergence rate would greatly improve the ITMM. The GMRES(m) method with a diagonal block preconditioner consumes approximately the same time as the PBJ solver but could be improved by an as yet undeveloped, more efficient preconditioner.
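
The Jacobi-versus-red-black-Gauss-Seidel comparison can be reproduced on a minimal 1-D analogue. This sketch uses a tridiagonal Poisson system rather than the ITMM operators; the red-black coloring decouples each color set so all unknowns of one color can be updated simultaneously, mirroring how PGS assigns color sets across processors.

```python
import numpy as np

def jacobi(A, b, x0, n_iter):
    """Jacobi sweeps: every unknown is updated from the previous iterate."""
    D = np.diag(A)
    R = A - np.diagflat(D)
    x = np.array(x0, dtype=float)
    for _ in range(n_iter):
        x = (b - R @ x) / D
    return x

def red_black_gauss_seidel(A, b, x0, n_iter):
    """Red-black Gauss-Seidel: one color set at a time, so updates within
    a color are independent and parallelizable; the black sweep already
    sees the freshly updated red values."""
    D = np.diag(A)
    x = np.array(x0, dtype=float)
    red, black = np.arange(0, len(b), 2), np.arange(1, len(b), 2)
    for _ in range(n_iter):
        for idx in (red, black):
            x[idx] = (b[idx] - A[idx] @ x + D[idx] * x[idx]) / D[idx]
    return x

# 1-D Poisson system: tridiagonal (-1, 2, -1).
n = 32
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.ones(n)
x_true = np.linalg.solve(A, b)
err_jacobi = np.linalg.norm(jacobi(A, b, np.zeros(n), 100) - x_true)
err_pgs = np.linalg.norm(red_black_gauss_seidel(A, b, np.zeros(n), 100) - x_true)
```

For this model problem the Gauss-Seidel spectral radius is the square of Jacobi's, so PGS reaches a given error in roughly half the iterations, consistent with the speedup reported above.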

  17. Improved Multiple-Coarsening Methods for Sn Discretizations of the Boltzmann Equation

    SciTech Connect

    Lee, Barry

    2010-06-01

    In a recent series of articles, the author presented a multiple-coarsening multigrid method for solving $S_n$ discretizations of the Boltzmann transport equation. This algorithm is applied to an integral equation for the scalar flux or moments. Although this algorithm is very efficient over parameter regimes that describe realistic neutron/photon transport applications, improved methods that can reduce the computational cost are presented in this paper. These improved methods are derived through a careful examination of the frequencies, particularly the near-nullspace, of the integral equation. In the earlier articles, the near-nullspace components were shown to be smooth in angle in the sense that the angular fluxes generated by these components are smooth in angle. In this paper, we present a spatial description of these near-nullspace components. Using the angular description of the earlier papers together with the spatial description reveals the intrinsic space-angle dependence of the integral equation's frequencies. This space-angle dependence is used to determine the appropriate space-angle grids to represent and efficiently attenuate the near-nullspace error components on. It will be shown that these components can have multiple spatial scales. By using only the appropriate space-angle grids that can represent these spatial scales in the original multiple-coarsening algorithm, an improved algorithm is obtained. Moreover, particularly for anisotropic scattering, recognizing the strong angle dependence of the angular fluxes generated by the high frequencies of the integral equation, another improved multiple-coarsening scheme is derived. Restricting this scheme to the appropriate space-angle grids produces a very efficient method.

  18. Improved Multiple-Coarsening Methods for Sn Discretizations of the Boltzmann Equation

    SciTech Connect

    Lee, B

    2008-12-01

    In a recent series of articles, the author presented a multiple-coarsening multigrid method for solving Sn discretizations of the Boltzmann transport equation. This algorithm is applied to an integral equation for the scalar flux or moments. Although this algorithm is very efficient over parameter regimes that describe realistic neutron/photon transport applications, improved methods that can reduce the computational cost are presented in this paper. These improved methods are derived through a careful examination of the frequencies, particularly the near-nullspace, of the integral equation. In the earlier articles, the near-nullspace components were shown to be smooth in angle in the sense that the angular fluxes generated by these components are smooth in angle. In this paper, we present a spatial description of these near-nullspace components. Using the angular description of the earlier papers together with the spatial description reveals the intrinsic space-angle dependence of the integral equation's frequencies. This space-angle dependence is used to determine the appropriate space-angle grids to represent and efficiently attenuate the near-nullspace error components on. It will be shown that these components can have multiple spatial scales. By using only the appropriate space-angle grids that can represent these spatial scales in the original multiple-coarsening algorithm, an improved algorithm is obtained. Moreover, particularly for anisotropic scattering, recognizing the strong angle dependence of the angular fluxes generated by the high frequencies of the integral equation, another improved multiple-coarsening scheme is derived. Restricting this scheme to the appropriate space-angle grids produces a very efficient method.

  19. Integrated Dataset of Screening Hits against Multiple Neglected Disease Pathogens

    PubMed Central

    Nwaka, Solomon; Besson, Dominique; Ramirez, Bernadette; Maes, Louis; Matheeussen, An; Bickle, Quentin; Mansour, Nuha R.; Yousif, Fouad; Townson, Simon; Gokool, Suzanne; Cho-Ngwa, Fidelis; Samje, Moses; Misra-Bhattacharya, Shailja; Murthy, P. K.; Fakorede, Foluke; Paris, Jean-Marc; Yeates, Clive; Ridley, Robert; Van Voorhis, Wesley C.; Geary, Timothy

    2011-01-01

    New chemical entities are desperately needed that overcome the limitations of existing drugs for neglected diseases. Screening a diverse library of 10,000 drug-like compounds against 7 neglected disease pathogens resulted in an integrated dataset of 744 hits. We discuss the prioritization of these hits for each pathogen and the strong correlation observed between compounds active against more than two pathogens and mammalian cell toxicity. Our work suggests that the efficiency of early drug discovery for neglected diseases can be enhanced through a collaborative, multi-pathogen approach. PMID:22247786

  20. Galerkin projection methods for solving multiple related linear systems

    SciTech Connect

    Chan, T.F.; Ng, M.; Wan, W.L.

    1996-12-31

    We consider using Galerkin projection methods for solving multiple related linear systems A^(i) x^(i) = b^(i) for 1 ≤ i ≤ s, where the A^(i) and b^(i) are different in general. We start with the special case where A^(i) = A and A is symmetric positive definite. The method generates a Krylov subspace from a set of direction vectors obtained by solving one of the systems, called the seed system, by the CG method, and then projects the residuals of the other systems orthogonally onto the generated Krylov subspace to get the approximate solutions. The whole process is repeated with another unsolved system as a seed until all the systems are solved. We observe in practice a super-convergence behaviour of the CG process of the seed system when compared with the usual CG process. We also observe that only a small number of restarts is required to solve all the systems if the right-hand sides are close to each other. These two features together make the method particularly effective. In this talk, we give theoretical proof to justify these observations. Furthermore, we combine the advantages of this method and the block CG method and propose a block extension of this single-seed method. The above procedure can also be modified for solving multiple linear systems A^(i) x^(i) = b^(i), where the A^(i) are now different. We can also extend the previous analytical results to this more general case. Applications of this method to multiple related linear systems arising from image restoration and recursive least squares computations are considered as examples.
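
The core of the seed method for the A^(i) = A case can be sketched directly: run CG on the seed system while saving its search directions, then solve each related system by a Galerkin projection onto the subspace those directions span. This is a minimal illustration assuming a symmetric positive definite A; the restart step (re-seeding with a still-unsolved system) is omitted.

```python
import numpy as np

def cg_with_directions(A, b, tol=1e-10):
    """Conjugate gradients on the seed system A x = b; the A-conjugate
    search directions are saved to span the Krylov subspace."""
    x = np.zeros_like(b)
    r = b - A @ x
    p = r.copy()
    directions = []
    for _ in range(len(b)):        # at most n steps in exact arithmetic
        if np.linalg.norm(r) <= tol:
            break
        Ap = A @ p
        alpha = (r @ r) / (p @ Ap)
        x = x + alpha * p
        r_new = r - alpha * Ap
        directions.append(p.copy())
        p = r_new + ((r_new @ r_new) / (r @ r)) * p
        r = r_new
    return x, np.column_stack(directions)

def project_related_system(A, b_other, P):
    """Galerkin projection: restrict A x = b_other to span(P) and solve
    the small projected system (P^T A P) y = P^T b_other."""
    y = np.linalg.solve(P.T @ A @ P, P.T @ b_other)
    return P @ y

rng = np.random.default_rng(4)
n = 10
M = rng.normal(size=(n, n))
A = M @ M.T + n * np.eye(n)                    # symmetric positive definite
b_seed = rng.normal(size=n)
b_other = b_seed + 0.1 * rng.normal(size=n)    # a related right-hand side

x_seed, P = cg_with_directions(A, b_seed)
x_other = project_related_system(A, b_other, P)
```

When the right-hand sides are close, the projected solution is already nearly exact, which is why only a few restarts are needed in practice.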

  1. Promoting return of function in multiple sclerosis: An integrated approach.

    PubMed

    Gacias, Mar; Casaccia, Patrizia

    2013-10-01

    Multiple sclerosis is a disease characterized by inflammatory demyelination, axonal degeneration and progressive brain atrophy. Most of the currently available disease-modifying agents have proved very effective in managing the relapse rate; however, progressive neuronal damage continues to occur and leads to progressive accumulation of irreversible disability. For this reason, any therapeutic strategy aimed at restoration of function must take into account not only immunomodulation, but also axonal protection and new myelin formation. We further highlight the importance of a holistic approach, which considers the variability of therapeutic responsiveness as the result of the interplay between genetic differences and the epigenome, which is in turn affected by gender, age and differences in lifestyle including diet, exercise, smoking and social interaction. PMID:24363985

  2. Promoting return of function in multiple sclerosis: An integrated approach

    PubMed Central

    Gacias, Mar; Casaccia, Patrizia

    2013-01-01

    Multiple sclerosis is a disease characterized by inflammatory demyelination, axonal degeneration and progressive brain atrophy. Most of the currently available disease-modifying agents have proved very effective in managing the relapse rate; however, progressive neuronal damage continues to occur and leads to progressive accumulation of irreversible disability. For this reason, any therapeutic strategy aimed at restoration of function must take into account not only immunomodulation, but also axonal protection and new myelin formation. We further highlight the importance of a holistic approach, which considers the variability of therapeutic responsiveness as the result of the interplay between genetic differences and the epigenome, which is in turn affected by gender, age and differences in lifestyle including diet, exercise, smoking and social interaction. PMID:24363985

  3. Exercise in multiple sclerosis -- an integral component of disease management

    PubMed Central

    2012-01-01

    Multiple sclerosis (MS) is the most common chronic inflammatory disorder of the central nervous system (CNS) in young adults. The disease causes a wide range of symptoms depending on the localization and characteristics of the CNS pathology. In addition to drug-based immunomodulatory treatment, both drug-based and non-drug approaches are established as complementary strategies to alleviate existing symptoms and to prevent secondary diseases. In particular, physical therapy like exercise and physiotherapy can be customized to the individual patient's needs and has the potential to improve the individual outcome. However, high quality systematic data on physical therapy in MS are rare. This article summarizes the current knowledge on the influence of physical activity and exercise on disease-related symptoms and physical restrictions in MS patients. Other treatment strategies such as drug treatments or cognitive training were deliberately excluded for the purposes of this article. PMID:22738091

  4. Students' integration of multiple representations in a titration experiment

    NASA Astrophysics Data System (ADS)

    Kunze, Nicole M.

    A complete understanding of a chemical concept is dependent upon a student's ability to understand the microscopic or particulate nature of the phenomenon and integrate the microscopic, symbolic, and macroscopic representations of the phenomenon. Acid-base chemistry is a general chemistry topic requiring students to understand the topics of chemical reactions, solutions, and equilibrium presented earlier in the course. In this study, twenty-five student volunteers from a second semester general chemistry course completed two interviews. The first interview was completed prior to any classroom instruction on acids and bases. The second interview took place after classroom instruction, a prelab activity consisting of a titration calculation worksheet, a titration computer simulation, or a microscopic level animation of a titration, and two microcomputer-based laboratory (MBL) titration experiments. During the interviews, participants were asked to define and describe acid-base concepts and in the second interview they also drew the microscopic representations of four stages in an acid-base titration. An analysis of the data showed that participants had integrated the three representations of an acid-base titration to varying degrees. While some participants showed complete understanding of acids, bases, titrations, and solution chemistry, other participants showed several alternative conceptions concerning strong acid and base dissociation, the formation of titration products, and the dissociation of soluble salts. Before instruction, participants' definitions of acid, base, and pH were brief and consisted of descriptive terms. After instruction, the definitions were more scientific and reflected the definitions presented during classroom instruction.

  5. Multiple light scattering methods for multiphase flow diagnostics

    NASA Astrophysics Data System (ADS)

    Estevadeordal, Jordi

    2015-11-01

    Multiphase flows of gases and liquids containing droplets, bubbles, or particulates present light scattering imaging challenges due to the interference from each phase, such as secondary reflections, extinctions, absorptions, and refractions. These factors often prevent the unambiguous detection of each phase and also produce undesired beam steering. The effects can be especially complex in the presence of dense phases, multispecies flows, and high pressure environments. This investigation reports new methods for overcoming these effects for quantitative measurements of velocity, density, and temperature fields. The methods are based on light scattering techniques combining Mie and filtered Rayleigh scattering and light extinction analyses and measurements. The optical layout is designed to perform multiple property measurements with improved signal from each phase via laser spectral and polarization characterization, etalon decontamination, and use of multiple wavelengths and imaging detectors.

  6. Method for high-accuracy multiplicity-correlation measurements

    NASA Astrophysics Data System (ADS)

    Gulbrandsen, K.; Søgaard, C.

    2016-04-01

    Multiplicity-correlation measurements provide insight into the dynamics of high-energy collisions. Models describing these collisions need these correlation measurements to tune the strengths of the underlying QCD processes which influence all observables. Detectors, however, often possess limited coverage or reduced efficiency that influence correlation measurements in obscure ways. In this paper, the effects of nonuniform detection acceptance and efficiency on the measurement of multiplicity correlations between two distinct detector regions (termed forward-backward correlations) are derived. An analysis method with such effects built in is developed and subsequently verified using different event generators. The resulting method accounts for acceptance and efficiency in a model-independent manner with high accuracy, thereby shedding light on the relative contributions of the underlying processes to particle production.
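
A forward-backward multiplicity correlation of the kind measured here can be generated and quantified with a toy independent-source model. The model below is an illustrative assumption, not the paper's method: a fluctuating number of sources per event, each emitting independently into the forward and backward regions, induces a positive correlation, and the standard correlation strength b = Cov(NF, NB) / Var(NF) is computed from the event sample.

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate_event_multiplicities(n_events, mu_sources=10, mu_per_side=2.0):
    """Toy model: Poisson-fluctuating source count per event; each source
    emits Poisson numbers of particles forward and backward."""
    sources = rng.poisson(mu_sources, n_events)
    n_fwd = rng.poisson(mu_per_side * sources)
    n_bwd = rng.poisson(mu_per_side * sources)
    return n_fwd, n_bwd

def b_corr(n_fwd, n_bwd):
    """Forward-backward correlation strength b = Cov(NF, NB) / Var(NF)."""
    return np.cov(n_fwd, n_bwd)[0, 1] / np.var(n_fwd, ddof=1)

n_fwd, n_bwd = simulate_event_multiplicities(200_000)
b = b_corr(n_fwd, n_bwd)  # analytically 2/3 for these toy parameters
```

For this toy, Cov(NF, NB) = a^2 Var(S) = 40 and Var(NF) = a^2 Var(S) + a E[S] = 60 (a = 2, S ~ Poisson(10)), giving b = 2/3; detector acceptance and efficiency would bias this estimate, which is what the paper's method corrects for.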

  7. Measuring multiple residual-stress components using the contour method and multiple cuts

    SciTech Connect

    Prime, Michael B; Swenson, Hunter; Pagliaro, Pierluigi; Zuccarello, Bernardo

    2009-01-01

    The conventional contour method determines one component of stress over the cross section of a part. The part is cut into two, the contour of the exposed surface is measured, and Bueckner's superposition principle is analytically applied to calculate stresses. In this paper, the contour method is extended to the measurement of multiple stress components by making multiple cuts with subsequent applications of superposition. The theory and limitations are described. The theory is experimentally tested on a 316L stainless steel disk with residual stresses induced by plastically indenting the central portion of the disk. The stress results are validated against independent measurements using neutron diffraction. The theory has implications beyond just multiple cuts. The contour method measurements and calculations for the first cut reveal how the residual stresses have changed throughout the part. Subsequent measurements of partially relaxed stresses by other techniques, such as laboratory x-rays, hole drilling, or neutron or synchrotron diffraction, can be superimposed back to the original state of the body.

  8. Pursuing the method of multiple working hypotheses for hydrological modeling

    NASA Astrophysics Data System (ADS)

    Clark, M. P.; Kavetski, D.; Fenicia, F.

    2012-12-01

    Ambiguities in the representation of environmental processes have manifested themselves in a plethora of hydrological models, differing in almost every aspect of their conceptualization and implementation. The current overabundance of models is symptomatic of an insufficient scientific understanding of environmental dynamics at the catchment scale, which can be attributed to difficulties in measuring and representing the heterogeneity encountered in natural systems. This presentation advocates using the method of multiple working hypotheses for systematic and stringent testing of model alternatives in hydrology. We discuss how the multiple hypothesis approach provides the flexibility to formulate alternative representations (hypotheses) describing both individual processes and the overall system. When combined with incisive diagnostics to scrutinize multiple model representations against observed data, this provides hydrologists with a powerful and systematic approach for model development and improvement. Multiple hypothesis frameworks also support a broader coverage of the model hypothesis space and hence improve the quantification of predictive uncertainty arising from system and component non-identifiabilities. As part of discussing the advantages and limitations of multiple hypothesis frameworks, we critically review major contemporary challenges in hydrological hypothesis-testing, including exploiting different types of data to investigate the fidelity of alternative process representations, accounting for model structure ambiguities arising from major uncertainties in environmental data, quantifying regional differences in dominant hydrological processes, and the grander challenge of understanding the self-organization and optimality principles that may functionally explain and describe the heterogeneities evident in most environmental systems. 
We assess recent progress in these research directions, and how new advances are possible using multiple hypothesis frameworks.

  9. Pursuing the method of multiple working hypotheses for hydrological modeling

    NASA Astrophysics Data System (ADS)

    Clark, Martyn P.; Kavetski, Dmitri; Fenicia, Fabrizio

    2011-09-01

Ambiguities in the representation of environmental processes have manifested themselves in a plethora of hydrological models, differing in almost every aspect of their conceptualization and implementation. The current overabundance of models is symptomatic of an insufficient scientific understanding of environmental dynamics at the catchment scale, which can be attributed to difficulties in measuring and representing the heterogeneity encountered in natural systems. This commentary advocates using the method of multiple working hypotheses for systematic and stringent testing of model alternatives in hydrology. We discuss how the multiple-hypothesis approach provides the flexibility to formulate alternative representations (hypotheses) describing both individual processes and the overall system. When combined with incisive diagnostics to scrutinize multiple model representations against observed data, this provides hydrologists with a powerful and systematic approach for model development and improvement. Multiple-hypothesis frameworks also support a broader coverage of the model hypothesis space and hence improve the quantification of predictive uncertainty arising from system and component nonidentifiabilities. As part of discussing the advantages and limitations of multiple-hypothesis frameworks, we critically review major contemporary challenges in hydrological hypothesis-testing, including exploiting different types of data to investigate the fidelity of alternative process representations, accounting for model structure ambiguities arising from major uncertainties in environmental data, quantifying regional differences in dominant hydrological processes, and the grander challenge of understanding the self-organization and optimality principles that may functionally explain and describe the heterogeneities evident in most environmental systems. We assess recent progress in these research directions, and how new advances are possible using multiple-hypothesis frameworks.

  10. Integrating syndromic surveillance data across multiple locations: effects on outbreak detection performance.

    PubMed

    Reis, Ben Y; Mandl, Kenneth D

    2003-01-01

    Syndromic surveillance systems are being deployed widely to monitor for signals of covert bioterrorist attacks. Regional systems are being established through the integration of local surveillance data across multiple facilities. We studied how different methods of data integration affect outbreak detection performance. We used a simulation relying on a semi-synthetic dataset, introducing simulated outbreaks of different sizes into historical visit data from two hospitals. In one simulation, we introduced the synthetic outbreak evenly into both hospital datasets (aggregate model). In the second, the outbreak was introduced into only one or the other of the hospital datasets (local model). We found that the aggregate model had a higher sensitivity for detecting outbreaks that were evenly distributed between the hospitals. However, for outbreaks that were localized to one facility, maintaining individual models for each location proved to be better. Given the complementary benefits offered by both approaches, the results suggest building a hybrid system that includes both individual models for each location, and an aggregate model that combines all the data. We also discuss options for multi-level signal integration hierarchies. PMID:14728233
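The aggregate-versus-local comparison can be made concrete with a toy detector. This is an illustrative sketch, not the authors' semi-synthetic simulation: the baseline counts, outbreak size, and the mean-plus-3σ alarm rule are all assumptions.

```python
import random
import statistics

def detects(history, today, z=3.0):
    """Alarm if today's count exceeds mean + z * stdev of the baseline."""
    return today > statistics.mean(history) + z * statistics.stdev(history)

random.seed(0)
# Hypothetical 90-day baselines of daily syndromic visit counts at two hospitals.
h1 = [random.gauss(50, 5) for _ in range(90)]
h2 = [random.gauss(50, 5) for _ in range(90)]

excess = 30                      # simulated outbreak, localized at hospital 1
today1, today2 = 50 + excess, 50

# Local model: an independent detector per hospital.
local_alarm = detects(h1, today1) or detects(h2, today2)

# Aggregate model: a single detector on the pooled series; a localized
# signal competes against the pooled variance of both hospitals.
pooled = [a + b for a, b in zip(h1, h2)]
aggregate_alarm = detects(pooled, today1 + today2)

print(local_alarm, aggregate_alarm)
```

For smaller localized outbreaks, the aggregate detector's threshold (driven by the pooled variance) dilutes the signal while the local detector still fires; for outbreaks spread evenly across facilities, the pooled series accumulates the full excess. That trade-off is what motivates the hybrid design suggested above.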

  11. Lidar Tracking of Multiple Fluorescent Tracers: Method and Field Test

    NASA Technical Reports Server (NTRS)

    Eberhard, Wynn L.; Willis, Ron J.

    1992-01-01

    Past research and applications have demonstrated the advantages and usefulness of lidar detection of a single fluorescent tracer to track air motions. Earlier researchers performed an analytical study that showed good potential for lidar discrimination and tracking of two or three different fluorescent tracers at the same time. The present paper summarizes the multiple fluorescent tracer method, discusses its expected advantages and problems, and describes our field test of this new technique.

  12. Plant aquaporins: membrane channels with multiple integrated functions.

    PubMed

    Maurel, Christophe; Verdoucq, Lionel; Luu, Doan-Trung; Santoni, Véronique

    2008-01-01

    Aquaporins are channel proteins present in the plasma and intracellular membranes of plant cells, where they facilitate the transport of water and/or small neutral solutes (urea, boric acid, silicic acid) or gases (ammonia, carbon dioxide). Recent progress was made in understanding the molecular bases of aquaporin transport selectivity and gating. The present review examines how a wide range of selectivity profiles and regulation properties allows aquaporins to be integrated in numerous functions, throughout plant development, and during adaptations to variable living conditions. Although they play a central role in water relations of roots, leaves, seeds, and flowers, aquaporins have also been linked to plant mineral nutrition and carbon and nitrogen fixation. PMID:18444909

  13. Path integral method for DNA denaturation

    NASA Astrophysics Data System (ADS)

    Zoli, Marco

    2009-04-01

The statistical physics of homogeneous DNA is investigated by the imaginary time path integral formalism. The base pair stretchings are described by an ensemble of paths selected through a macroscopic constraint, the fulfillment of the second law of thermodynamics. The number of paths contributing to the partition function strongly increases around and above a specific temperature Tc∗, whereas the fraction of unbound base pairs grows continuously around and above Tc∗. The latter is identified with the denaturation temperature. Thus, the separation of the two complementary strands appears as a highly cooperative phenomenon displaying a smooth crossover versus T. The thermodynamical properties have been computed in a large temperature range by varying the size of the path ensemble at the lower bound of the range. No significant physical dependence on the system size has been observed. The entropy grows continuously versus T while the specific heat displays a remarkable peak at Tc∗. The location of the peak versus T varies with the stiffness of the anharmonic stacking interaction along the strand. The presented results suggest that denaturation in homogeneous DNA has the features of a second-order phase transition. The method accounts for the cooperative behavior of a very large number of degrees of freedom while the computation time is kept within a reasonable limit.
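Schematically, the setup described above takes the generic imaginary-time form (the reduced mass μ, the potential V, and the threshold x_th below are placeholders; the paper's specific anharmonic stacking potential is not reproduced here):

```latex
Z(T) \;=\; \oint \mathcal{D}x \,
  \exp\!\left\{-\frac{1}{\hbar}\int_{0}^{\beta\hbar}\! d\tau
  \left[\frac{\mu}{2}\,\dot{x}^{2}(\tau) \;+\; V\!\big(x(\tau)\big)\right]\right\},
\qquad
f(T) \;=\; \Big\langle\, \theta\big(x(\tau)-x_{\mathrm{th}}\big) \,\Big\rangle_{Z},
```

where the paths x(τ) are closed base-pair stretchings, x(0) = x(βℏ), with β = 1/k_BT, and f(T) is the ensemble-averaged fraction of base pairs counted as open beyond a threshold x_th.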

  14. Integrating regional conservation priorities for multiple objectives into national policy

    PubMed Central

    Beger, Maria; McGowan, Jennifer; Treml, Eric A.; Green, Alison L.; White, Alan T.; Wolff, Nicholas H.; Klein, Carissa J.; Mumby, Peter J.; Possingham, Hugh P.

    2015-01-01

    Multinational conservation initiatives that prioritize investment across a region invariably navigate trade-offs among multiple objectives. It seems logical to focus where several objectives can be achieved efficiently, but such multi-objective hotspots may be ecologically inappropriate, or politically inequitable. Here we devise a framework to facilitate a regionally cohesive set of marine-protected areas driven by national preferences and supported by quantitative conservation prioritization analyses, and illustrate it using the Coral Triangle Initiative. We identify areas important for achieving six objectives to address ecosystem representation, threatened fauna, connectivity and climate change. We expose trade-offs between areas that contribute substantially to several objectives and those meeting one or two objectives extremely well. Hence there are two strategies to guide countries choosing to implement regional goals nationally: multi-objective hotspots and complementary sets of single-objective priorities. This novel framework is applicable to any multilateral or global initiative seeking to apply quantitative information in decision making. PMID:26364769

  16. A method for shipboard treatment of multiple heat casualties.

    PubMed

    Sweeney, W B; Krafte-Jacobs, B; Hansen, W; Saldana, M

    1992-03-01

    A method is presented for the treatment aboard ship of multiple patients afflicted with life-threatening heat illness, using an inflatable life raft cooling system. The potential benefits of this method include: (1) the utilization of readily available materials aboard U.S. Naval vessels; (2) the provision for rapid patient cooling by evaporation while maintaining patient safety and comfort; (3) the ability to treat many patients simultaneously with minimal attendant personnel; and (4) the maintenance of patient access allowing for monitoring and the administration of additional supportive measures. PMID:1603408

  17. Comparison of four methods for aggregating judgments from multiple experts

    SciTech Connect

    Booker, J.M.; Picard, R.R.

    1991-01-01

This report describes a study that compares four different methods for aggregating expert judgment data given by multiple experts. These experts need not be a random sample of available experts. The experts estimate the same unknown parameter value. Their estimates need not be a representative set of sample values from an underlying distribution whose mean is an unknown parameter, θ. However, it is desired to combine the experts' estimates into a single aggregation estimate to reflect their amount of available knowledge about the unknown parameter. Many different aggregation estimators and methods have been proposed in the literature. However, few have been used, tested, or compared. Four different methods are chosen for this study which have been used or proposed for use in NRC studies. The set represents a cross section of the various types of methods. The results of this study do not indicate the use of any one method over another. Methods requiring minimal decision maker input are sensitive to the biases in the experts' responses. For these methods, there is no mechanism to adjust the experts' estimates to account for any known biases in the expert population such as optimism or pessimism. The results of this study indicate that these methods tend to perform poorly in all but the most ideal cases. Conversely, methods requiring extensive decision maker inputs are sensitive to misspecification. These methods perform poorly unless complete information is known about all the experts. That is, the decision maker's input parameters must nearly equal the actual values.
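The report's central point can be illustrated numerically (hypothetical numbers, not the NRC methods themselves): aggregation rules that take no decision-maker input simply inherit any bias shared by the experts, while weighted rules shift the burden to specifying good weights.

```python
import statistics

def weighted_mean(estimates, weights):
    """Decision-maker-weighted aggregation; misspecified weights degrade it."""
    return sum(e * w for e, w in zip(estimates, weights)) / sum(weights)

truth = 10.0
# Four hypothetical experts sharing a common optimistic bias of about +2.
estimates = [12.1, 11.8, 12.4, 12.0]

simple = statistics.mean(estimates)    # needs no input, but inherits the bias
robust = statistics.median(estimates)  # likewise: no mechanism to remove bias
weighted = weighted_mean(estimates, [1.0, 2.0, 1.0, 1.0])  # assumed weights

print(simple, robust, weighted)
```

All three aggregates land near 12, not 10: no reweighting of biased inputs can recover the truth, which is why the report finds minimal-input methods perform poorly outside ideal cases.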

  18. Integrating multiple scientific computing needs via a Private Cloud infrastructure

    NASA Astrophysics Data System (ADS)

    Bagnasco, S.; Berzano, D.; Brunetti, R.; Lusso, S.; Vallero, S.

    2014-06-01

In a typical scientific computing centre, diverse applications coexist and share a single physical infrastructure. An underlying Private Cloud facility eases the management and maintenance of heterogeneous use cases such as multipurpose or application-specific batch farms, Grid sites catering to different communities, parallel interactive data analysis facilities and others. It makes it possible to allocate resources to any application dynamically and efficiently, and to tailor the virtual machines to the applications' requirements. Furthermore, the maintenance of large deployments of complex and rapidly evolving middleware and application software is eased by the use of virtual images and contextualization techniques; for example, rolling updates can be performed easily and with minimal downtime. In this contribution we describe the Private Cloud infrastructure at the INFN-Torino Computer Centre, which hosts a full-fledged WLCG Tier-2 site, a dynamically expandable PROOF-based Interactive Analysis Facility for the ALICE experiment at the CERN LHC, and several smaller scientific computing applications. The Private Cloud building blocks include the OpenNebula software stack, the GlusterFS filesystem (used in two different configurations for worker- and service-class hypervisors) and the OpenWRT Linux distribution (used for network virtualization). A future integration into a federated higher-level infrastructure is made possible by exposing commonly used APIs like EC2 and by using mainstream contextualization tools like CloudInit.

  19. Integration over Multiple Timescales in Primary Auditory Cortex

    PubMed Central

    Shamma, Shihab A.

    2013-01-01

    Speech and other natural vocalizations are characterized by large modulations in their sound envelope. The timing of these modulations contains critical information for discrimination of important features, such as phonemes. We studied how depression of synaptic inputs, a mechanism frequently reported in cortex, can contribute to the encoding of envelope dynamics. Using a nonlinear stimulus-response model that accounted for synaptic depression, we predicted responses of neurons in ferret primary auditory cortex (A1) to stimuli with natural temporal modulations. The depression model consistently performed better than linear and second-order models previously used to characterize A1 neurons, and it produced more biologically plausible fits. To test how synaptic depression can contribute to temporal stimulus integration, we used nonparametric maximum a posteriori decoding to compare the ability of neurons showing and not showing depression to reconstruct the stimulus envelope. Neurons showing evidence for depression reconstructed stimuli over a longer range of latencies. These findings suggest that variation in depression across the cortical population supports a rich code for representing the temporal dynamics of natural sounds. PMID:24305812

  20. Integrating multiple irrigation technologies for overall improvement in irrigation.

    Technology Transfer Automated Retrieval System (TEKTRAN)

    There are many tools, techniques, and/or schemes to assist producers in irrigation water management and specifically in irrigation scheduling. This paper will highlight several of those but emphasize that several methods should be used simultaneously as an improved or advanced procedure to avoid bia...

  1. Assessing District Energy Systems Performance Integrated with Multiple Thermal Energy Storages

    NASA Astrophysics Data System (ADS)

    Rezaie, Behnaz

The goal of this study is to examine various energy resources in district energy (DE) systems and then to improve DE system performance through the application of multiple thermal energy storages (TES). This study sheds light on areas not yet investigated precisely in detail. Throughout the research, major components of the heat plant, energy suppliers of the DE systems, and TES characteristics are separately examined; integration of various configurations of the multiple TESs in the DE system is then analysed. In the first part of the study, various sources of energy are compared, in a consistent manner, financially and environmentally. The TES performance is then assessed from various aspects. Then, TES(s) and DE systems with several sources of energy are integrated, and are investigated as a heat process centre. The most efficient configurations of the multiple TESs integrated with the DE system are investigated. Some of the findings of this study are applied to an actual DE system. The outcomes of this study provide insight for researchers and engineers who work in this field, as well as policy makers and project managers who are decision-makers. The accomplishments of the study are original developments of TESs and DE systems. As an original development, the Enviro-Economic Function, which balances the economic and environmental aspects of energy resource technologies in DE systems, is developed; various configurations of multiple TESs, including series, parallel, and general grid, are also developed. The related functions developed are the discharge temperature and energy of the TES, and the energy and exergy efficiencies of the TES. The instantaneous charging and discharging behavior of the TES is also investigated to obtain the charging temperature, the maximum charging temperature, the charging energy flow, maximum heat flow capacity, the discharging temperature, the minimum charging temperature, the discharging energy flow, the maximum heat flow capacity, and performance

  2. Integrating the tools for an individualized prognosis in multiple sclerosis.

    PubMed

    Fernández, O

    2013-08-15

Clinicians treating multiple sclerosis (MS) patients need biomarkers in order to predict an individualized prognosis for every patient, that is, characteristics that can be measured in an objective manner and that give information about normal or pathological processes, or about the response to a given therapeutic intervention. Pharmacogenetics/genomics in the field of MS can, for now, be considered a promise. In the meantime, clinicians should use the information provided by the many clinical epidemiological studies performed to date, which tell us that there are some clinical markers of good prognosis (female sex, young age at onset, optic neuritis or isolated sensory symptoms at onset, long interval between initial and second relapse, no accumulation of disability after five years of disease evolution, normal or near-normal magnetic resonance imaging (MRI) at onset). Some markers in biological samples are considered potential prognostic markers, like IgM and neurofilaments in CSF or antimyelin and chitinase 3-like 1 in blood (plasma/sera). Baseline MRI lesion number, lesion load and location have been closely associated with a worse evolution, as have MRI measures related to axonal damage (black holes in T1, brain atrophy, grey matter atrophy (GMA) and white matter atrophy (WMA), magnetization transfer measures and intracortical lesions). Functional measures (OCT, evoked potentials) have a potential role in measuring neurodegeneration in MS and could be very useful tools for prognosis. Several mathematical approaches that estimate short-term risk use early clinical and paraclinical biomarkers to predict the evolution of the disease. PMID:23692966

  3. Differential operator multiplication method for fractional differential equations

    NASA Astrophysics Data System (ADS)

    Tang, Shaoqiang; Ying, Yuping; Lian, Yanping; Lin, Stephen; Yang, Yibo; Wagner, Gregory J.; Liu, Wing Kam

    2016-08-01

Fractional derivatives play a very important role in modeling physical phenomena involving long-range correlation effects. However, they raise challenges of computational cost and memory storage when solved using currently available, well-developed numerical methods. In this paper, the differential operator multiplication method is proposed to address these issues, considering a reaction-advection-diffusion equation with a fractional derivative in time. The linear fractional differential equation is transformed into an integer-order differential equation by the proposed method, which can fundamentally fix the aforementioned issues for select fractional differential equations. In such a transform, special attention should be paid to the initial conditions for the resulting differential equation of higher integer order. Through numerical experiments, we verify the proposed method for both fractional ordinary differential equations and partial differential equations.
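The memory burden that motivates the method can be made concrete with the classical Grünwald-Letnikov discretization, which must retain the entire solution history. This sketch is background illustration only, not the operator-multiplication method itself; the step size and test function are assumptions.

```python
import math

def gl_weights(alpha, n):
    """Grunwald-Letnikov weights w_k = (-1)^k * binom(alpha, k), via recurrence."""
    w = [1.0]
    for k in range(1, n):
        w.append(w[-1] * (k - 1 - alpha) / k)
    return w

def gl_derivative(f_hist, alpha, h):
    """Order-alpha derivative at the latest grid point. Note that it consumes
    the ENTIRE history f_hist: O(N) memory and O(N) work per evaluation, the
    long-range-memory cost that an integer-order reformulation avoids."""
    w = gl_weights(alpha, len(f_hist))
    return sum(wk * f_hist[-1 - k] for k, wk in enumerate(w)) / h ** alpha

# Known case: the half-derivative of f(t) = t is 2 * sqrt(t / pi).
h, alpha = 1e-3, 0.5
f_hist = [i * h for i in range(2001)]   # f(t) = t sampled on [0, 2]
approx = gl_derivative(f_hist, alpha, h)
exact = 2 * math.sqrt(2.0 / math.pi)
print(approx, exact)
```

Stepping such a scheme over N grid points costs O(N²) work overall, which is exactly the scaling an equivalent integer-order equation sidesteps.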

  4. Integration of multiple research disciplines on the International Space Station

    NASA Technical Reports Server (NTRS)

    Penley, N. J.; Uri, J.; Sivils, T.; Bartoe, J. D.

    2000-01-01

The International Space Station will provide an extremely high-quality, long-duration microgravity environment for the conduct of research. In addition, the ISS offers a platform for performing observations of Earth and Space from a high-inclination orbit, outside of the Earth's atmosphere. This unique environment and observational capability offers the opportunity for advancement in a diverse set of research fields. Many of these disciplines do not relate to one another, and present widely differing approaches to study, as well as different resource and operational requirements. Significant challenges exist to ensure the highest quality research return for each investigation. Requirements from different investigations must be identified, clarified, integrated and communicated to ISS personnel in a consistent manner. Resources such as power, crew time, etc. must be apportioned to allow the conduct of each investigation. Decisions affecting research must be made at the strategic level as well as at a very detailed execution level. The timing of the decisions can range from years before an investigation to real-time operations. The international nature of the Space Station program adds to the complexity. Each participating country must be assured that their interests are represented during the entire planning and operations process. A process for making decisions regarding research planning, operations, and real-time replanning is discussed. This process ensures adequate representation of all research investigators. It provides a means for timely decisions, and it includes a means to ensure that all ISS International Partners have their programmatic interests represented. © 2000 Published by Elsevier Science Ltd. All rights reserved.

  5. Field Evaluation of Personal Sampling Methods for Multiple Bioaerosols

    PubMed Central

    Wang, Chi-Hsun; Chen, Bean T.; Han, Bor-Cheng; Liu, Andrew Chi-Yeu; Hung, Po-Chen; Chen, Chih-Yong; Chao, Hsing Jasmine

    2015-01-01

    Ambient bioaerosols are ubiquitous in the daily environment and can affect health in various ways. However, few studies have been conducted to comprehensively evaluate personal bioaerosol exposure in occupational and indoor environments because of the complex composition of bioaerosols and the lack of standardized sampling/analysis methods. We conducted a study to determine the most efficient collection/analysis method for the personal exposure assessment of multiple bioaerosols. The sampling efficiencies of three filters and four samplers were compared. According to our results, polycarbonate (PC) filters had the highest relative efficiency, particularly for bacteria. Side-by-side sampling was conducted to evaluate the three filter samplers (with PC filters) and the NIOSH Personal Bioaerosol Cyclone Sampler. According to the results, the Button Aerosol Sampler and the IOM Inhalable Dust Sampler had the highest relative efficiencies for fungi and bacteria, followed by the NIOSH sampler. Personal sampling was performed in a pig farm to assess occupational bioaerosol exposure and to evaluate the sampling/analysis methods. The Button and IOM samplers yielded a similar performance for personal bioaerosol sampling at the pig farm. However, the Button sampler is more likely to be clogged at high airborne dust concentrations because of its higher flow rate (4 L/min). Therefore, the IOM sampler is a more appropriate choice for performing personal sampling in environments with high dust levels. In summary, the Button and IOM samplers with PC filters are efficient sampling/analysis methods for the personal exposure assessment of multiple bioaerosols. PMID:25799419

  6. Comparison of Multiple Gene Assembly Methods for Metabolic Engineering

    NASA Astrophysics Data System (ADS)

    Lu, Chenfeng; Mansoorabadi, Karen; Jeffries, Thomas

A universal, rapid DNA assembly method for efficient multigene plasmid construction is important for biological research and for optimizing gene expression in industrial microbes. Three different approaches to achieve this goal were evaluated. These included creating long complementary extensions using a uracil-DNA glycosylase technique, overlap extension polymerase chain reaction, and a SfiI-based ligation method. SfiI ligation was the only successful approach for assembling large DNA fragments that contained repeated homologous regions. In addition, the SfiI method has been improved over a similar, previously published technique so that it is more flexible and does not require polymerase chain reaction to incorporate adaptors. In the present study, Saccharomyces cerevisiae genes TAL1, TKL1, and PYK1 under control of the 6-phosphogluconate dehydrogenase promoter were successfully ligated together using multiple unique SfiI restriction sites. The desired construct was obtained 65% of the time during vector construction using four-piece ligations. The SfiI method consists of three steps: first, a SfiI linker vector is constructed, whose multiple cloning site is flanked by two three-base linkers matching the neighboring SfiI linkers on SfiI digestion; second, the linkers are attached to the desired genes by cloning them into SfiI linker vectors; third, the genes, flanked by the three-base linkers, are released by SfiI digestion. In the final step, genes of interest are joined together in a simple one-step ligation.

  8. New methods for the numerical integration of ordinary differential equations and their application to the equations of motion of spacecraft

    NASA Technical Reports Server (NTRS)

    Banyukevich, A.; Ziolkovski, K.

    1975-01-01

    A number of hybrid methods for solving Cauchy problems are described on the basis of an evaluation of advantages of single and multiple-point numerical integration methods. The selection criterion is the principle of minimizing computer time. The methods discussed include the Nordsieck method, the Bulirsch-Stoer extrapolation method, and the method of recursive Taylor-Steffensen power series.
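Of the methods named above, the Bulirsch-Stoer idea is the easiest to sketch: evaluate the modified midpoint rule at several substep counts and extrapolate the results toward zero step size. This is a minimal single-step sketch with an assumed substep sequence, not the authors' spacecraft implementation.

```python
import math

def modified_midpoint(f, y0, t0, t1, n):
    """Advance y' = f(t, y) from t0 to t1 using n modified-midpoint substeps."""
    h = (t1 - t0) / n
    z0, z1 = y0, y0 + h * f(t0, y0)
    for i in range(1, n):
        z0, z1 = z1, z0 + 2 * h * f(t0 + i * h, z1)
    # Gragg's smoothing step; the error expansion is even in h.
    return 0.5 * (z0 + z1 + h * f(t1, z1))

def bulirsch_stoer_step(f, y0, t0, t1, seq=(2, 4, 6, 8)):
    """One extrapolated step: Neville (Richardson) extrapolation in h^2
    of modified-midpoint results toward h -> 0."""
    T = []
    for i, n in enumerate(seq):
        T.append(modified_midpoint(f, y0, t0, t1, n))
        for j in range(i - 1, -1, -1):
            r = (seq[i] / seq[j]) ** 2
            T[j] = T[j + 1] + (T[j + 1] - T[j]) / (r - 1)
    return T[0]

# Test problem: y' = -y, y(0) = 1, so y(1) = exp(-1).
approx = bulirsch_stoer_step(lambda t, y: -y, 1.0, 0.0, 1.0)
print(approx, math.exp(-1))
```

Even with a single large step of size 1, the four-level extrapolation recovers exp(-1) to roughly six decimal places, which is why extrapolation methods compete well on the computer-time criterion the paper uses.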

  9. Calculation of transonic flows using an extended integral equation method

    NASA Technical Reports Server (NTRS)

    Nixon, D.

    1976-01-01

    An extended integral equation method for transonic flows is developed. In the extended integral equation method velocities in the flow field are calculated in addition to values on the aerofoil surface, in contrast with the less accurate 'standard' integral equation method in which only surface velocities are calculated. The results obtained for aerofoils in subcritical flow and in supercritical flow when shock waves are present compare satisfactorily with the results of recent finite difference methods.

  10. Enhancing subsurface information from the fusion of multiple geophysical methods

    NASA Astrophysics Data System (ADS)

    Jafargandomi, A.; Binley, A.

    2011-12-01

Characterization of hydrologic systems is a key element in understanding and predicting their behaviour. Geophysical methods, especially electrical methods (e.g., electrical resistivity tomography (ERT), induced polarization (IP) and electromagnetic (EM) methods), are becoming popular for this purpose due to their non-invasive nature, high sensitivity to hydrological parameters and the speed of measurements. However, interrogation of each geophysical method provides only limited information about some of the subsurface parameters. Therefore, in order to achieve a comprehensive picture of the hydrologic system, fusion of multiple geophysical data sets can be beneficial. Although a number of fusion approaches have been proposed in the literature, an aspect that has been generally overlooked is the assessment of the information content of each measurement approach. Such an assessment provides useful insight for the design of future surveys. We develop a fusion strategy based on the capability of multiple geophysical methods to provide enough resolution to identify subsurface material parameters and structure. We apply a Bayesian framework to analyse the information in multiple geophysical data sets. In this approach, multiple geophysical data sets are fed into a Markov chain Monte Carlo (McMC) inversion algorithm and the information content of the post-inversion result (posterior probability distribution) is quantified. We use Shannon's information measure to quantify the information obtained from the inversion of different combinations of geophysical data sets. In this strategy, information from multiple methods is brought together by introducing a joint likelihood function and/or constraining the prior information. We apply the fusion tool to one of the target sites of the EU FP7 project ModelProbe, which aims to develop technologies and tools for soil contamination assessment and site characterization. The target site is located close to Trecate (Novara - NW Italy). At this
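Shannon's measure applied to posterior samples can be sketched as an entropy reduction relative to the prior. The numbers are illustrative only: the parameter, its normalization to [0, 1], and the Gaussian "posteriors" standing in for McMC output are all assumptions, not the authors' inversion results.

```python
import math
import random

def hist_entropy(samples, bins=20, lo=0.0, hi=1.0):
    """Shannon entropy (bits) of a histogram of samples on [lo, hi]."""
    counts = [0] * bins
    for s in samples:
        counts[min(int((s - lo) / (hi - lo) * bins), bins - 1)] += 1
    n = len(samples)
    return -sum(c / n * math.log2(c / n) for c in counts if c > 0)

random.seed(1)
# Hypothetical samples of one normalized subsurface parameter:
prior = [random.uniform(0, 1) for _ in range(10000)]  # uninformed prior
post_single = [min(max(random.gauss(0.5, 0.15), 0), 1)
               for _ in range(10000)]                 # e.g. ERT alone
post_joint = [min(max(random.gauss(0.5, 0.05), 0), 1)
              for _ in range(10000)]                  # e.g. ERT + IP + EM

h_prior, h1, h2 = (hist_entropy(x) for x in (prior, post_single, post_joint))
# Information gained = entropy reduction relative to the prior (bits).
print(h_prior - h1, h_prior - h2)
```

Comparing such entropy reductions across combinations of data sets is one simple way to rank which survey additions are worth acquiring.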

  11. Decoding intracranial EEG data with multiple kernel learning method

    PubMed Central

    Schrouff, Jessica; Mourão-Miranda, Janaina; Phillips, Christophe; Parvizi, Josef

    2016-01-01

    Background: Machine learning models have been successfully applied to neuroimaging data to make predictions about behavioral and cognitive states of interest. While these multivariate methods have greatly advanced the field of neuroimaging, their application to electrophysiological data has been less common, especially in the analysis of human intracranial electroencephalography (iEEG, also known as electrocorticography or ECoG) data, which contains a rich spectrum of signals recorded from a relatively high number of recording sites. New method: In the present work, we introduce a novel approach to determine the contribution of different bandwidths of the EEG signal in different recording sites across different experimental conditions using the Multiple Kernel Learning (MKL) method. Comparison with existing method: To validate and compare the usefulness of our approach, we applied this method to an ECoG dataset that was previously analysed and published with univariate methods. Results: Our findings demonstrate the usefulness of the MKL method in detecting changes in the power of various frequency bands during a given task and in automatically selecting the most contributory signal in the most contributory site(s) of recording. Conclusions: With a single computation, the contribution of each frequency band in each recording site to the estimated multivariate model can be highlighted, which then allows formulation of hypotheses that can be tested a posteriori with univariate methods if needed. PMID:26692030

  12. An Integrated Approach for Accessing Multiple Datasets through LANCE

    NASA Astrophysics Data System (ADS)

    Murphy, K. J.; Teague, M.; Conover, H.; Regner, K.; Beaumont, B.; Masuoka, E.; Vollmer, B.; Theobald, M.; Durbin, P.; Michael, K.; Boller, R. A.; Schmaltz, J. E.; Davies, D.; Horricks, K.; Ilavajhala, S.; Thompson, C. K.; Bingham, A.

    2011-12-01

    The NASA/GSFC Land, Atmosphere Near real-time Capability for EOS (LANCE) provides imagery for approximately 40 data products from MODIS, AIRS, AMSR-E and OMI to support the applications community in the study of a variety of phenomena. Thirty-six of these products are available within 2.5 hours of observation at the spacecraft. The data set includes the population density data provided by the EOSDIS Socio-Economic Data and Applications Center (SEDAC). The purpose of this paper is to describe the variety of tools that have been developed by LANCE to support user access to the imagery. The long-standing Rapid Response system has been integrated into LANCE and is a major vehicle for the distribution of the imagery to end users; approximately 10,000 anonymous users per month presently access this imagery. The products are grouped into 14 application categories such as Smoke Plumes, Pollution, Fires and Agriculture, and selecting any category makes relevant subsets of the 40 products available as overlays in an interactive Web Client utilizing the Web Mapping Service (WMS) to support user investigations (http://lance2.modaps.eosdis.nasa.gov/wms/). For example, selecting Severe Storms will include 6 products for MODIS, OMI, AIRS, and AMSR-E plus the SEDAC population density data. The client and WMS were developed using open-source technologies such as OpenLayers and MapServer and provide uniform, browser-based access to data products. All overlays are downloadable in PNG, JPEG, or GeoTIFF form up to 200 MB per request. The WMS was beta-tested with the user community and substantial performance improvements were made through techniques such as tile caching. LANCE established a partnership with the Physical Oceanography Distributed Active Archive Center (PO DAAC) to develop an alternative presentation for the 40 data products known as the State of the Earth (SOTE). This provides a Google Earth-based interface to the products grouped in

  13. A New Method for Multiple Sperm Cells Tracking

    PubMed Central

    Imani, Yoones; Teyfouri, Niloufar; Ahmadzadeh, Mohammad Reza; Golabbakhsh, Marzieh

    2014-01-01

    Motion analysis and quality assessment of human sperm cells are of great importance for clinical applications in male infertility. Sperm tracking is quite complex due to cell collision, occlusion and missed detections. The aim of this study is the simultaneous tracking of multiple human sperm cells. In the first step of this research, the frame-difference algorithm is used for background subtraction. There are limitations in selecting an appropriate threshold value, since the output accuracy is strongly dependent on the selected threshold. To eliminate this dependency, we propose an improved non-linear diffusion filtering in the time domain. Non-linear diffusion filtering is a smoothing and noise-removal approach that can preserve edges in images. Many sperm cells moving with different speeds in different directions eventually coincide. For tracking multiple cells over time, an optimal matching strategy is introduced that is based on the optimization of a new cost function. A Hungarian search method is utilized to obtain the best matching over all possible candidates. The results show a frame-based error of nearly 3.24% on a dataset of videos containing between 1 and 10 sperm cells; hence the accuracy rate was 96.76%. These results indicate the validity of the proposed algorithm for multiple sperm tracking. PMID:24696807
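The frame-to-frame matching step is a linear assignment problem. The brute-force enumeration below is exact only for small cell counts like those reported here (fewer than 10 per video); a Hungarian implementation returns the same optimum in polynomial time. The cost matrix (squared distances between tracks and detections) is a hypothetical example, not data from the paper.

```python
from itertools import permutations

def optimal_assignment(cost):
    """Exhaustively find the row-to-column assignment minimizing total cost.

    Exact for small track counts; the Hungarian algorithm gives the same
    optimum in O(n^3) and is what one would use in practice.
    """
    n = len(cost)
    best_perm, best_cost = None, float("inf")
    for perm in permutations(range(n)):
        total = sum(cost[i][perm[i]] for i in range(n))
        if total < best_cost:
            best_perm, best_cost = perm, total
    return best_perm, best_cost

# Hypothetical costs: squared distances between 3 tracked cells (rows)
# and 3 new detections (columns) in the next frame.
cost = [
    [1.0, 9.0, 16.0],
    [9.0, 1.0, 4.0],
    [16.0, 4.0, 1.0],
]
print(optimal_assignment(cost))  # ((0, 1, 2), 3.0): each track keeps its nearest detection
```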
  15. Pooling data from multiple longitudinal studies: the role of item response theory in integrative data analysis.

    PubMed

    Curran, Patrick J; Hussong, Andrea M; Cai, Li; Huang, Wenjing; Chassin, Laurie; Sher, Kenneth J; Zucker, Robert A

    2008-03-01

    There are a number of significant challenges researchers encounter when studying development over an extended period of time, including subject attrition, changes in measurement structure across groups and developmental periods, and the need to invest substantial time and money. Integrative data analysis is an emerging set of methodologies that allows researchers to overcome many of the challenges of single-sample designs through the pooling of data drawn from multiple existing developmental studies. This approach is characterized by a host of advantages, but it also introduces several new complexities that must be addressed before broad adoption by developmental researchers. In this article, the authors focus on methods for fitting measurement models and creating scale scores using data drawn from multiple longitudinal studies. The authors present findings from the analysis of repeated measures of internalizing symptomatology that were pooled from three existing developmental studies. The authors describe and demonstrate each step in the analysis and conclude with a discussion of potential limitations and directions for future research. PMID:18331129
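A core ingredient of the measurement models used in such pooling is the item response function, which places items scored differently across studies on a common latent scale. A minimal two-parameter logistic (2PL) sketch follows; the parameter values are illustrative assumptions, not estimates from the pooled studies.

```python
import math

def p_endorse(theta, a, b):
    """2PL IRT: probability of endorsing an item, given latent trait level
    theta, item discrimination a, and item difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# A respondent whose latent internalizing level equals the item difficulty
# endorses the item with probability 0.5:
print(p_endorse(0.0, a=1.5, b=0.0))   # 0.5
# Higher trait levels raise the endorsement probability monotonically:
print(p_endorse(2.0, a=1.5, b=0.0) > p_endorse(0.0, a=1.5, b=0.0))  # True
```

Fitting a and b per item while constraining shared items to equal parameters across studies is what links the separate samples onto one scale.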

  16. Power-efficient method for IM-DD optical transmission of multiple OFDM signals.

    PubMed

    Effenberger, Frank; Liu, Xiang

    2015-05-18

    We propose a power-efficient method for transmitting multiple frequency-division multiplexed (FDM) orthogonal frequency-division multiplexing (OFDM) signals in intensity-modulation direct-detection (IM-DD) optical systems. This method is based on quadratic soft clipping in combination with odd-only channel mapping. We show, both analytically and experimentally, that the proposed approach is capable of improving the power efficiency by about 3 dB as compared to conventional FDM OFDM signals under practical bias conditions, making it a viable solution in applications such as optical fiber-wireless integrated systems where both IM-DD optical transmission and OFDM signaling are important. PMID:26074605
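The abstract does not give the exact form of the clipper, so the sketch below uses a generic odd-symmetric quadratic soft clipper from DSP practice to convey the idea: a linear region, a quadratic knee, and hard saturation. It is an illustrative stand-in, not the authors' design.

```python
def soft_clip(x):
    """Odd-symmetric quadratic soft clipper: linear for small inputs,
    quadratic knee, saturating at |output| = 1. A generic DSP form used
    here for illustration only."""
    s = 1.0 if x >= 0 else -1.0
    x = abs(x)
    if x < 1/3:
        y = 2 * x                      # linear region
    elif x < 2/3:
        y = (3 - (2 - 3 * x) ** 2) / 3  # quadratic knee
    else:
        y = 1.0                        # saturation
    return s * y

# Small inputs pass through (scaled) linearly; large peaks are limited,
# which is what reduces the peak-to-average power of the summed OFDM bands.
print(soft_clip(0.1), soft_clip(2.0), soft_clip(-2.0))
```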

  17. Towards Robust Designs Via Multiple-Objective Optimization Methods

    NASA Technical Reports Server (NTRS)

    Rai, Man Mohan

    2006-01-01

    Fabricating and operating complex systems involves dealing with uncertainty in the relevant variables. In the case of aircraft, flow conditions are subject to change during operation. Efficiency and engine noise may be different from the expected values because of manufacturing tolerances and normal wear and tear. Engine components may have a shorter life than expected because of manufacturing tolerances. In spite of the important effect of operating and manufacturing uncertainty on the performance and expected life of the component or system, traditional aerodynamic shape optimization has focused on obtaining the best design given a set of deterministic flow conditions. Clearly it is important both to maintain near-optimal performance levels at off-design operating conditions and to ensure that performance does not degrade appreciably when the component shape differs from the optimal shape due to manufacturing tolerances and normal wear and tear. These requirements naturally lead to the idea of robust optimal design, wherein the concept of robustness to various perturbations is built into the design optimization procedure. The basic ideas involved in robust optimal design will be included in this lecture. The imposition of the additional requirement of robustness results in a multiple-objective optimization problem requiring appropriate solution procedures. Typically the costs associated with multiple-objective optimization are substantial. Therefore efficient multiple-objective optimization procedures are crucial to the rapid deployment of the principles of robust design in industry. Hence the companion set of lecture notes (Single- and Multiple-Objective Optimization with Differential Evolution and Neural Networks) deals with methodology for solving multiple-objective optimization problems efficiently, reliably and with little user intervention. Applications of the methodologies presented in the companion lecture to robust design will be included here. The

  18. Methods for radiation detection and characterization using a multiple detector probe

    DOEpatents

    Akers, Douglas William; Roybal, Lyle Gene

    2014-11-04

    Apparatuses, methods, and systems relating to radiological characterization of environments are disclosed. Multi-detector probes with a plurality of detectors in a common housing may be used to substantially concurrently detect a plurality of different radiation activities and types. Multiple multi-detector probes may be used in a down-hole environment to substantially concurrently detect radioactive activity and contents of a buried waste container. Software may process, analyze, and integrate the data from the different multi-detector probes and the different detector types therein to provide source location and integrated analysis as to the source types and activity in the measured environment. Further, the integrated data may be used to compensate for differential density effects and the effects of radiation shielding materials within the volume being measured.

  19. Solution of elastoplastic torsion problem by boundary integral method

    NASA Technical Reports Server (NTRS)

    Mendelson, A.

    1975-01-01

    The boundary integral method was applied to the elastoplastic analysis of the torsion of prismatic bars, and the results are compared with those obtained by the finite difference method. Although fewer unknowns were used, very good accuracy was obtained with the boundary integral method. Both simply and multiply connected bodies can be handled with equal ease.

  20. Integrative methods for studying cardiac energetics.

    PubMed

    Diolez, Philippe; Deschodt-Arsac, Véronique; Calmettes, Guillaume; Gouspillou, Gilles; Arsac, Laurent; Dos Santos, Pierre; Jais, Pierre; Haissaguerre, Michel

    2015-01-01

    Recent studies of human pathologies have revealed the complexity of the interactions involved at the different levels of integration in organ physiology. The integrated organ thus exhibits functional properties that are not predictable from the underlying molecular events. It is therefore obvious that current fine molecular analyses of pathologies should be fruitfully combined with integrative approaches to whole-organ function. It follows that an important issue in understanding the link between molecular events in pathologies and whole-organ function/dysfunction is the development of new experimental strategies aimed at the study of integrated organ physiology. Cardiovascular diseases are a good example, as a heart subjected to ischemic conditions has to cope both with a decreased supply of nutrients and oxygen and with the increased activity required to sustain whole-body (including the heart itself) oxygenation. By combining the principles of control analysis with noninvasive (31)P NMR measurement of the energetic intermediates and simultaneous measurement of heart contractile activity, we developed MoCA (Modular Control and Regulation Analysis), an integrative approach designed to study in situ control and regulation of cardiac energetics during contraction in the intact beating perfused isolated heart (Diolez et al., Am J Physiol Regul Integr Comp Physiol 293(1):R13-R19, 2007). Because it gives real access to integrated organ function, MoCA brings out a new type of information - the "elasticities," referring to internal responses to metabolic changes - that may be a key to understanding the processes involved in pathologies. MoCA can potentially be used not only to detect the origin of the defects associated with a pathology, but also to provide a quantitative description of the routes by which these defects - or drugs - modulate global heart function, thereby opening therapeutic perspectives. This review presents selected examples of the

  1. Integrating Semantic Information into Multiple Kernels for Protein-Protein Interaction Extraction from Biomedical Literatures

    PubMed Central

    Li, Lishuang; Zhang, Panpan; Zheng, Tianfu; Zhang, Hongying; Jiang, Zhenchao; Huang, Degen

    2014-01-01

    Protein-Protein Interaction (PPI) extraction is an important task in biomedical information extraction. Presently, many machine learning methods for PPI extraction have achieved promising results, but performance is still not satisfactory. One reason is that semantic resources have largely been ignored. In this paper, we propose a multiple-kernel learning-based approach to extract PPIs, combining a feature-based kernel, a tree kernel and a semantic kernel. In particular, we extend the shortest path-enclosed tree kernel (SPT) with a dynamic extension strategy to retrieve richer syntactic information. Our semantic kernel calculates the protein-protein pair similarity and the context similarity based on two semantic resources: WordNet and Medical Subject Headings (MeSH). We evaluate our method with a Support Vector Machine (SVM) and achieve an F-score of 69.40% and an AUC of 92.00%, which show that our method outperforms most state-of-the-art systems by integrating semantic information. PMID:24622773
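The combination step common to such multiple-kernel approaches can be sketched as a non-negative weighted sum of precomputed Gram matrices, which is itself a valid kernel. The toy matrices and weights below are illustrative, not the paper's learned values.

```python
def combine_kernels(kernel_matrices, weights):
    """Weighted sum of precomputed kernel (Gram) matrices. A non-negative
    combination of positive semidefinite kernels is again a valid kernel,
    so the result can be fed directly to an SVM."""
    n = len(kernel_matrices[0])
    return [[sum(w * K[i][j] for w, K in zip(weights, kernel_matrices))
             for j in range(n)] for i in range(n)]

# Toy 2x2 Gram matrices standing in for the feature-based, tree and
# semantic kernels over two candidate protein pairs:
K_feat = [[1.0, 0.2], [0.2, 1.0]]
K_tree = [[1.0, 0.5], [0.5, 1.0]]
K_sem  = [[1.0, 0.8], [0.8, 1.0]]
K = combine_kernels([K_feat, K_tree, K_sem], weights=[0.5, 0.3, 0.2])
print(K)  # off-diagonal entries blend the three similarity views (~0.41 here)
```

In MKL proper, the weights are learned jointly with the classifier rather than fixed by hand.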

  2. Efficient implicit integration for finite-strain viscoplasticity with a nested multiplicative split

    NASA Astrophysics Data System (ADS)

    Shutov, A. V.

    2016-07-01

    An efficient and reliable stress computation algorithm is presented, which is based on implicit integration of the local evolution equations of multiplicative finite-strain plasticity/viscoplasticity. The algorithm is illustrated by an example involving a combined nonlinear isotropic/kinematic hardening; numerous backstress tensors are employed for a better description of the material behavior. The considered material model exhibits the so-called weak invariance under arbitrary isochoric changes of the reference configuration, and the presented algorithm retains this useful property. Even more: the weak invariance serves as a guide in constructing this algorithm. The constraint of inelastic incompressibility is exactly preserved as well. The proposed method is first-order accurate. Concerning the accuracy of the stress computation, the new algorithm is comparable to the Euler Backward method with a subsequent correction of incompressibility (EBMSC) and the classical exponential method (EM). Regarding the computational efficiency, the new algorithm is superior to the EBMSC and EM. Some accuracy tests are presented using parameters of the aluminum alloy 5754-O and the 42CrMo4 steel. FEM solutions of two boundary value problems using MSC.MARC are presented to show the correctness of the numerical implementation.
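The implicit update at the heart of such algorithms can be illustrated on a scalar toy model: one backward Euler step for a Norton-type viscoplastic relaxation law, solved by Newton iteration. The 1-D law and its parameters are assumptions for illustration only, not the paper's tensorial finite-strain model.

```python
def backward_euler_step(e_old, eps_total, dt, E=100.0, eta=50.0, n=3):
    """One implicit (backward Euler) step for the scalar Norton-type law
    d(e_vp)/dt = (E*(eps_total - e_vp)/eta)**n, solved by Newton's method.
    e_vp is the viscoplastic strain; stress is sigma = E*(eps_total - e_vp)."""
    e = e_old
    for _ in range(50):                 # Newton iterations on the residual g(e)
        sigma = E * (eps_total - e)
        g = e - e_old - dt * (sigma / eta) ** n
        dg = 1.0 + dt * n * (sigma / eta) ** (n - 1) * (E / eta)
        e_new = e - g / dg
        if abs(e_new - e) < 1e-12:
            return e_new
        e = e_new
    return e

# Stress relaxation at fixed total strain: viscoplastic strain grows
# monotonically while the stress decays.
e_vp, eps = 0.0, 0.01
for _ in range(10):
    e_vp = backward_euler_step(e_vp, eps, dt=0.1)
sigma = 100.0 * (eps - e_vp)
```

Implicit integration keeps this update stable even for stiff rate laws, which is why it is the standard choice for local stress computation.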

  4. Multiple ant colony algorithm method for selecting tag SNPs.

    PubMed

    Liao, Bo; Li, Xiong; Zhu, Wen; Li, Renfa; Wang, Shulin

    2012-10-01

    The search for associations between complex disease and single nucleotide polymorphisms (SNPs) or haplotypes has recently received great attention. Finding a set of tag SNPs for haplotyping in a great number of samples is an important step in reducing the cost of association studies. Therefore, it is essential to select tag SNPs with more efficient algorithms. In this paper, we model the problem of selecting tag SNPs as MINIMUM TEST SET and use a multiple ant colony algorithm (MACA) to search for a smaller set of tag SNPs for haplotyping. Experimental results on various datasets show that the running time of our method is less than that of GTagger and MLR, and that MACA can find the most representative SNPs for haplotyping, so MACA is more stable and the number of tag SNPs is smaller than for other evolutionary methods (such as GTagger and NSGA-II). Our software is available upon request to the corresponding author. PMID:22480582
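The MINIMUM TEST SET formulation can be sketched with the simple greedy baseline that metaheuristics like MACA aim to beat: repeatedly pick the SNP (column) that distinguishes the most still-confused haplotype pairs until every pair is separated. The haplotype strings below are hypothetical.

```python
from itertools import combinations

def greedy_tag_snps(haplotypes):
    """Greedy MINIMUM TEST SET: choose SNP columns until every pair of
    haplotype rows differs at some chosen column. A baseline sketch; the
    paper's MACA searches the same space with ant colonies."""
    n_snps = len(haplotypes[0])
    pairs = set(combinations(range(len(haplotypes)), 2))
    chosen = []
    while pairs:
        # SNP separating the most still-confused haplotype pairs
        best = max(range(n_snps),
                   key=lambda s: sum(haplotypes[i][s] != haplotypes[j][s]
                                     for i, j in pairs))
        chosen.append(best)
        pairs = {(i, j) for i, j in pairs
                 if haplotypes[i][best] == haplotypes[j][best]}
    return chosen

haps = ["0011", "0101", "1100", "1010"]
print(greedy_tag_snps(haps))  # [0, 1]: two SNPs distinguish all four haplotypes
```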

  5. Multiple-time-stepping generalized hybrid Monte Carlo methods

    SciTech Connect

    Escribano, Bruno; Akhmatskaya, Elena; Reich, Sebastian; Azpiroz, Jon M.

    2015-01-01

    Performance of the generalized shadow hybrid Monte Carlo (GSHMC) method [1], which proved superior in sampling efficiency to its predecessors [2-4], molecular dynamics and hybrid Monte Carlo, can be further improved by combining it with multiple-time-stepping (MTS) and mollification of slow forces. We demonstrate that these comparatively simple modifications not only lead to better performance of GSHMC itself but also allow it to outperform the best-performing methods that use similar force-splitting schemes. In addition, we show that the same ideas can be successfully applied to the conventional generalized hybrid Monte Carlo method (GHMC). The resulting methods, MTS-GHMC and MTS-GSHMC, provide accurate reproduction of thermodynamic and dynamical properties, exact temperature control during simulation, and computational robustness and efficiency. MTS-GHMC uses a generalized momentum update to achieve weak stochastic stabilization of the molecular dynamics (MD) integrator. MTS-GSHMC adds the use of a shadow (modified) Hamiltonian to filter the MD trajectories in the HMC scheme. We introduce a new shadow Hamiltonian formulation adapted to force-splitting methods. The use of such Hamiltonians improves the acceptance rate of trajectories and has a strong impact on the sampling efficiency of the method. Both methods were implemented in the open-source MD package ProtoMol and were tested on a water system and a protein system. Results were compared to those obtained using the Langevin Molly (LM) method [5] on the same systems. The test results demonstrate the superiority of the new methods over LM in terms of stability, accuracy and sampling efficiency. This suggests that placing the MTS approach in the framework of hybrid Monte Carlo, and using the natural stochasticity offered by the generalized hybrid Monte Carlo method, improves the stability of MTS and allows larger step sizes in the simulation of complex systems.
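The multiple-time-stepping ingredient can be sketched with an r-RESPA-style splitting: kick with the slow force at the outer step and integrate the stiff fast force with a smaller inner leapfrog step. The forces and step sizes below are illustrative; this shows the MTS integrator only, not the GSHMC momentum update or shadow-Hamiltonian filtering.

```python
def mts_leapfrog(q, p, dt, n_steps, n_inner, f_fast, f_slow, m=1.0):
    """r-RESPA-style multiple-time-stepping leapfrog: the stiff fast force
    is integrated with a small inner step, the slowly varying force is
    applied as half-kicks at the full outer step."""
    for _ in range(n_steps):
        p += 0.5 * dt * f_slow(q)        # opening half-kick from the slow force
        h = dt / n_inner
        for _ in range(n_inner):          # inner velocity-Verlet on the fast force
            p += 0.5 * h * f_fast(q)
            q += h * p / m
            p += 0.5 * h * f_fast(q)
        p += 0.5 * dt * f_slow(q)        # closing half-kick
    return q, p

# A stiff harmonic 'bond' force plus a weak slow restoring force:
f_fast = lambda q: -100.0 * q
f_slow = lambda q: -1.0 * q
q, p = mts_leapfrog(1.0, 0.0, dt=0.05, n_steps=200, n_inner=10,
                    f_fast=f_fast, f_slow=f_slow)
# Total energy p^2/2 + (100+1)/2 * q^2 stays near its initial value of 50.5.
```

The outer step must stay clear of resonances with the fast period; mollifying the slow force, as in the paper, relaxes that restriction further.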

  7. Integrated navigation method based on inertial navigation system and Lidar

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaoyue; Shi, Haitao; Pan, Jianye; Zhang, Chunxi

    2016-04-01

    An integrated navigation method based on the inertial navigation system (INS) and Lidar was proposed for land navigation. Compared with the traditional integrated navigation method and the dead reckoning (DR) method, the influence of the inertial measurement unit (IMU) scale factor and misalignment is considered in the new method. First, the influence of the IMU scale factor and misalignment on navigation accuracy was analyzed. Based on this analysis, the integrated system error model of INS and Lidar was established, in which the IMU scale factor and misalignment error states were included. Then the observability of the IMU error states was analyzed. According to the results of the observability analysis, the integrated system was optimized. Finally, numerical simulation and a vehicle test were carried out to validate the availability and utility of the proposed INS/Lidar integrated navigation method. Compared with the test results of the traditional integrated navigation method and the DR method, the proposed method resulted in higher navigation precision. Consequently, the IMU scale factor and misalignment error were effectively compensated by the proposed method, and the new integrated navigation method is valid.
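The INS/Lidar fusion underlying such methods can be illustrated with a 1-D Kalman filter: dead-reckon with the INS-derived displacement, then correct with a Lidar position fix. The state, noise values and function names are illustrative; the paper's filter additionally carries IMU scale-factor and misalignment error states.

```python
def kalman_predict(x, P, u, Q):
    """Time update: dead-reckon with INS-derived displacement u (noise variance Q)."""
    return x + u, P + Q

def kalman_update(x, P, z, R):
    """Measurement update: fuse the prediction with a Lidar fix z (variance R)."""
    K = P / (P + R)                      # Kalman gain
    return x + K * (z - x), (1 - K) * P

# INS alone drifts (P grows every predict); periodic Lidar fixes bound the error:
x, P = 0.0, 1.0
x, P = kalman_predict(x, P, u=1.0, Q=0.5)   # INS reports a 1 m displacement
x, P = kalman_update(x, P, z=1.2, R=0.3)    # Lidar observes 1.2 m
# The fused estimate lands between the two sources, with reduced variance.
```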

  8. Flexible dynamic measurement method of three-dimensional surface profilometry based on multiple vision sensors.

    PubMed

    Liu, Zhen; Li, Xiaojing; Li, Fengjiao; Zhang, Guangjun

    2015-01-12

    A single vision sensor cannot measure an entire object because of its limited field of view. Meanwhile, multiple rigidly fixed vision sensors for the dynamic vision measurement of three-dimensional (3D) surface profilometry are complex and sensitive to strong environmental vibrations. To overcome these problems, a novel flexible dynamic measurement method for 3D surface profilometry based on multiple vision sensors is presented in this paper. A raster binocular stereo vision sensor is combined with a wide-field camera to produce a 3D optical probe. Multiple 3D optical probes are arranged around the object being measured, and several planar targets are set up. These planar targets function as mediators to integrate the local 3D data measured by the raster binocular stereo vision sensors into a common coordinate system. The proposed method is not sensitive to strong environmental vibrations, and the positions of the 3D optical probes need not be rigidly fixed during the measurement. The validity of the proposed method is verified in a physical experiment with two 3D optical probes. When the measuring range of the raster binocular stereo vision sensor is about 0.5 m × 0.38 m × 0.4 m and the size of the measured object is about 0.7 m, the accuracy of the proposed method reaches 0.12 mm. Meanwhile, the effectiveness of the proposed method in dynamic measurement is confirmed by measuring rotating fan blades. PMID:25835684
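The role of the planar targets, registering each probe's local measurements into a shared frame, can be sketched in 2-D with the closed-form least-squares rigid transform between corresponding points. The point sets below are hypothetical; the real method solves the analogous 3-D problem.

```python
import math

def rigid_transform_2d(src, dst):
    """Least-squares rotation angle and translation mapping src points onto
    dst (2-D analogue of aligning a probe's local frame via shared targets)."""
    n = len(src)
    csx = sum(p[0] for p in src) / n; csy = sum(p[1] for p in src) / n
    cdx = sum(p[0] for p in dst) / n; cdy = sum(p[1] for p in dst) / n
    # Cross-covariance terms of the centered point sets
    sxx = sum((s[0] - csx) * (d[0] - cdx) + (s[1] - csy) * (d[1] - cdy)
              for s, d in zip(src, dst))
    sxy = sum((s[0] - csx) * (d[1] - cdy) - (s[1] - csy) * (d[0] - cdx)
              for s, d in zip(src, dst))
    theta = math.atan2(sxy, sxx)
    tx = cdx - (csx * math.cos(theta) - csy * math.sin(theta))
    ty = cdy - (csx * math.sin(theta) + csy * math.cos(theta))
    return theta, (tx, ty)

# Target points seen in a probe's local frame, and the same points in the
# common frame (rotated 90 degrees and shifted by (2, 3)):
src = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
dst = [(2.0, 3.0), (2.0, 4.0), (1.0, 3.0)]
theta, (tx, ty) = rigid_transform_2d(src, dst)
```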

  9. The Effect of Sensory Integration Treatment on Children with Multiple Disabilities.

    ERIC Educational Resources Information Center

    Din, Feng S.; Lodato, Donna M.

    Six children with multiple disabilities (ages 5 to 8) participated in this evaluation of the effect of sensory integration treatment on sensorimotor function and academic learning. The children had cognitive abilities ranging from sub-average to significantly sub-average, three were non-ambulatory, one had severe behavioral problems, and each…

  10. Multiplicity and Self-Identity: Trauma and Integration in Shirley Mason's Art

    ERIC Educational Resources Information Center

    Thompson, Geoffrey

    2011-01-01

    This viewpoint appeared in its original form as the catalogue essay that accompanied the exhibition "Multiplicity and Self-Identity: Trauma and Integration in Shirley Mason's Art," curated by the author for Gallery 2110, Sacramento, CA, and the 2010 Annual Conference of the American Art Therapy Association. The exhibition featured 17 artworks by…

  11. Multiple proviral integration events after virological synapse-mediated HIV-1 spread

    SciTech Connect

    Russell, Rebecca A.; Martin, Nicola; Mitar, Ivonne; Jones, Emma; Sattentau, Quentin J.

    2013-08-15

    HIV-1 can move directly between T cells via virological synapses (VS). Although aspects of the molecular and cellular mechanisms underlying this mode of spread have been elucidated, the outcomes for infection of the target cell remain incompletely understood. We set out to determine whether HIV-1 transfer via VS results in productive, high-multiplicity HIV-1 infection. We found that HIV-1 cell-to-cell spread resulted in nuclear import of multiple proviruses into target cells as seen by fluorescence in-situ hybridization. Proviral integration into the target cell genome was significantly higher than that seen in a cell-free infection system, and consequent de novo viral DNA and RNA production in the target cell detected by quantitative PCR increased over time. Our data show efficient proviral integration across VS, implying the probability of multiple integration events in target cells that drive productive T cell infection. - Highlights: • Cell-to-cell HIV-1 infection delivers multiple vRNA copies to the target cell. • Cell-to-cell infection results in productive infection of the target cell. • Cell-to-cell transmission is more efficient than cell-free HIV-1 infection. • Suggests a mechanism for recombination in cells infected with multiple viral genomes.

  12. Integrating different data types by regularized unsupervised multiple kernel learning with application to cancer subtype discovery

    PubMed Central

    Speicher, Nora K.; Pfeifer, Nico

    2015-01-01

    Motivation: Despite ongoing cancer research, available therapies are still limited in quantity and effectiveness, and making treatment decisions for individual patients remains a hard problem. Established subtypes, which help guide these decisions, are mainly based on individual data types. However, the analysis of multidimensional patient data involving the measurements of various molecular features could reveal intrinsic characteristics of the tumor. Large-scale projects accumulate this kind of data for various cancer types, but we still lack the computational methods to reliably integrate this information in a meaningful manner. Therefore, we apply and extend current multiple kernel learning for dimensionality reduction approaches. On the one hand, we add a regularization term to avoid overfitting during the optimization procedure, and on the other hand, we show that one can even use several kernels per data type and thereby alleviate the user from having to choose the best kernel functions and kernel parameters for each data type beforehand. Results: We have identified biologically meaningful subgroups for five different cancer types. Survival analysis has revealed significant differences between the survival times of the identified subtypes, with P values comparable or even better than state-of-the-art methods. Moreover, our resulting subtypes reflect combined patterns from the different data sources, and we demonstrate that input kernel matrices with only little information have less impact on the integrated kernel matrix. Our subtypes show different responses to specific therapies, which could eventually assist in treatment decision making. Availability and implementation: An executable is available upon request. Contact: nora@mpi-inf.mpg.de or npfeifer@mpi-inf.mpg.de PMID:26072491

  13. Integration of Multiple Genomic and Phenotype Data to Infer Novel miRNA-Disease Associations.

    PubMed

    Shi, Hongbo; Zhang, Guangde; Zhou, Meng; Cheng, Liang; Yang, Haixiu; Wang, Jing; Sun, Jie; Wang, Zhenzhen

    2016-01-01

    MicroRNAs (miRNAs) play an important role in the development and progression of human diseases. The identification of disease-associated miRNAs will be helpful for understanding the molecular mechanisms of diseases at the post-transcriptional level. Based on different types of genomic data sources, computational methods for miRNA-disease association prediction have been proposed. However, individual sources of genomic data tend to be incomplete and noisy; therefore, the integration of various types of genomic data for inferring reliable miRNA-disease associations is urgently needed. In this study, we present a computational framework, CHNmiRD, for identifying miRNA-disease associations by integrating multiple genomic and phenotype data, including protein-protein interaction data, gene ontology data, experimentally verified miRNA-target relationships, disease phenotype information and known miRNA-disease connections. The performance of CHNmiRD was evaluated by experimentally verified miRNA-disease associations, which achieved an area under the ROC curve (AUC) of 0.834 for 5-fold cross-validation. In particular, CHNmiRD displayed excellent performance for diseases without any known related miRNAs. The results of case studies for three human diseases (glioblastoma, myocardial infarction and type 1 diabetes) showed that all of the top 10 ranked miRNAs having no known associations with these three diseases in existing miRNA-disease databases were directly or indirectly confirmed by our latest literature mining. All these results demonstrated the reliability and efficiency of CHNmiRD, and it is anticipated that CHNmiRD will serve as a powerful bioinformatics method for mining novel disease-related miRNAs and providing a new perspective into molecular mechanisms underlying human diseases at the post-transcriptional level. CHNmiRD is freely available at http://www.bio-bigdata.com/CHNmiRD. PMID:26849207
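
    The paper's exact algorithm is not reproduced here, but a common template for this kind of heterogeneous-network inference is a random walk with restart over a combined similarity/association graph. The toy network, node names, and restart parameter below are all made up for the sketch:

```python
import numpy as np

# Toy heterogeneous network over 4 miRNAs (m0..m3) and 2 diseases (d0, d1).
# Edges: miRNA-miRNA functional similarity, disease-disease phenotype
# similarity, and known miRNA-disease associations.
nodes = ["m0", "m1", "m2", "m3", "d0", "d1"]
A = np.zeros((6, 6))
edges = [(0, 1), (1, 2), (2, 3),      # miRNA similarity chain
         (4, 5),                      # disease similarity
         (0, 4), (2, 5)]              # known associations
for i, j in edges:
    A[i, j] = A[j, i] = 1.0

# Column-normalized transition matrix.
W = A / A.sum(axis=0, keepdims=True)

def rwr(seed, restart=0.3, tol=1e-10):
    # Random walk with restart from the seed node.
    p0 = np.zeros(len(nodes)); p0[seed] = 1.0
    p = p0.copy()
    for _ in range(1000):
        p_new = (1 - restart) * W @ p + restart * p0
        if np.abs(p_new - p).sum() < tol:
            break
        p = p_new
    return p

# Rank candidate miRNAs for disease d0; m0 is already known,
# so m1 (one similarity hop away) should outrank the distant m3.
p = rwr(seed=4)
scores = {nodes[i]: p[i] for i in range(4)}
```

Ranking the unlabelled miRNAs by their stationary probability is the usual prioritization step in such frameworks.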

  14. Modified principal component analysis: an integration of multiple similarity subspace models.

    PubMed

    Fan, Zizhu; Xu, Yong; Zuo, Wangmeng; Yang, Jian; Tang, Jinhui; Lai, Zhihui; Zhang, David

    2014-08-01

    We modify conventional principal component analysis (PCA) and propose a novel subspace learning framework, modified PCA (MPCA), using multiple similarity measurements. MPCA computes three similarity matrices exploiting the similarity measurements: 1) mutual information; 2) angle information; and 3) Gaussian kernel similarity. We employ the eigenvectors of the similarity matrices to produce new subspaces, referred to as similarity subspaces. A new integrated similarity subspace is then generated using a novel feature selection approach. This approach constructs a vector set, termed a weak machine cell (WMC), which contains an appropriate number of the eigenvectors spanning the similarity subspaces. Combining the wrapper method with a forward selection scheme, MPCA selects one WMC at a time, each with a powerful discriminative capability for classifying samples. MPCA is well suited to application scenarios in which the number of training samples is smaller than the data dimensionality. MPCA outperforms other state-of-the-art PCA-based methods in terms of both classification accuracy and clustering results. In addition, MPCA can be applied to face image reconstruction and can use other types of similarity measurements. Extensive experiments on many popular real-world data sets, such as face databases, show that MPCA achieves desirable classification results and has a powerful capability to represent data. PMID:25050950
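
    Two of the three similarity measurements named above (angle information and Gaussian kernel similarity; mutual information is omitted for brevity) and the resulting similarity subspaces can be sketched in a few lines. The pooling step at the end is a crude stand-in for the WMC wrapper/forward selection, which is considerably more involved:

```python
import numpy as np

rng = np.random.default_rng(1)
# 30 samples, 100 dimensions: fewer samples than dimensions,
# the regime the paper targets.
X = rng.normal(size=(30, 100))
Xc = X - X.mean(axis=0)

# 1) Angle (cosine) similarity between samples.
norms = np.linalg.norm(Xc, axis=1, keepdims=True)
S_angle = (Xc / norms) @ (Xc / norms).T

# 2) Gaussian kernel similarity.
sq = ((Xc[:, None] - Xc[None, :]) ** 2).sum(-1)
S_gauss = np.exp(-sq / sq.mean())

# Each similarity matrix yields its own "similarity subspace":
# the span of its leading eigenvectors.
def subspace(S, k=5):
    evals, evecs = np.linalg.eigh(S)
    return evecs[:, -k:]          # top-k eigenvectors

U_angle = subspace(S_angle)
U_gauss = subspace(S_gauss)

# Pool eigenvectors from both subspaces; MPCA would instead select
# among them with a wrapper-based forward selection criterion.
pool = np.hstack([U_angle, U_gauss])
```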

  15. Online Guidance Law of Missile Using Multiple Design Point Method

    NASA Astrophysics Data System (ADS)

    Yamaoka, Seiji; Ueno, Seiya

    This paper deals with the design procedure of an online guidance law for future missiles that are required to have agile maneuverability. For this purpose, the authors propose mounting high-power side-thrusters on the missile. The guidance law for such missiles is discussed from the viewpoint of optimal control theory. The minimum-time problem is solved for the approximated system, and the necessary conditions of the optimal solution show that bang-bang control is the optimal input. Feedback guidance without iterative calculation is useful for actual systems. The multiple design point method is applied to design the feedback gains and feedforward inputs of the guidance law. The numerical results show the good performance of the proposed guidance law.
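
    The bang-bang structure of the minimum-time solution can be illustrated on the classic double-integrator model x' = v, v' = u with |u| <= u_max, whose time-optimal feedback switches sign on the curve x = -v|v|/(2 u_max). This is a generic textbook example, not the paper's missile model or its multiple design point gains:

```python
import numpy as np

# Double-integrator toy: x' = v, v' = u, |u| <= u_max.
u_max, dt, tol = 1.0, 1e-3, 5e-3

def bang_bang(x, v):
    s = x + v * abs(v) / (2 * u_max)   # switching function
    if abs(s) < 1e-9:                  # on the switching curve: brake
        return -u_max * np.sign(v)
    return -u_max * np.sign(s)

# Drive (x, v) = (1, 0) to the origin; the theoretical optimum
# for these values is t = 2.
x, v, t = 1.0, 0.0, 0.0
while (abs(x) > tol or abs(v) > tol) and t < 5.0:
    u = bang_bang(x, v)
    x += v * dt                        # forward Euler state update
    v += u * dt
    t += dt
```

The simulated arrival time lands close to the analytic minimum of 2 s; the residual gap is the discretization error of the Euler update.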

  16. System and method for inventorying multiple remote objects

    DOEpatents

    Carrender, Curtis L.; Gilbert, Ronald W.

    2007-10-23

    A system and method of inventorying multiple objects utilizing a multi-level or a chained radio frequency identification system. The system includes a master tag and a plurality of upper level tags and lower level tags associated with respective objects. The upper and lower level tags communicate with each other and the master tag so that reading of the master tag reveals the presence and absence of upper and lower level tags. In the chained RF system, the upper and lower level tags communicate locally with each other in a manner so that more remote tags that are out of range of some of the upper and lower level tags have their information relayed through adjacent tags to the master tag and thence to a controller.

  17. System and method for inventorying multiple remote objects

    DOEpatents

    Carrender, Curtis L.; Gilbert, Ronald W.

    2009-12-29

    A system and method of inventorying multiple objects utilizing a multi-level or a chained radio frequency identification system. The system includes a master tag and a plurality of upper level tags and lower level tags associated with respective objects. The upper and lower level tags communicate with each other and the master tag so that reading of the master tag reveals the presence and absence of upper and lower level tags. In the chained RF system, the upper and lower level tags communicate locally with each other in a manner so that more remote tags that are out of range of some of the upper and lower level tags have their information relayed through adjacent tags to the master tag and thence to a controller.
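
    The chained relay described in the abstract amounts to a reachability computation: tags report what they hear, hop by hop, until the master tag holds the full inventory. A breadth-first sketch with invented tag IDs and link ranges:

```python
from collections import deque

# Toy model of the chained RFID scheme: each tag only hears its
# neighbors; reports are relayed hop by hop to the master tag,
# which hands the full inventory to the controller.
links = {
    "master": ["upper1", "upper2"],
    "upper1": ["master", "lower1a"],
    "upper2": ["master", "lower2a"],
    "lower1a": ["upper1", "lower1b"],   # lower1b is out of master's range
    "lower1b": ["lower1a"],
    "lower2a": ["upper2"],
}

def inventory(master="master"):
    # Breadth-first relay: everything reachable through the chain is
    # "present"; anything expected but unreachable is "absent".
    seen, queue = {master}, deque([master])
    while queue:
        tag = queue.popleft()
        for neighbor in links.get(tag, []):
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append(neighbor)
    return seen

present = inventory()
expected = set(links) | {"lower2b"}      # lower2b's object was removed
missing = expected - present
```

Reading the master thus reveals both presence (lower1b, despite being out of direct range) and absence (lower2b).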

  18. A multiple hypotheses uncertainty analysis in hydrological modelling: about model structure, landscape parameterization, and numerical integration

    NASA Astrophysics Data System (ADS)

    Pilz, Tobias; Francke, Till; Bronstert, Axel

    2016-04-01

    To date, a large number of competing computer models have been developed to understand hydrological processes and to simulate and predict the streamflow dynamics of rivers. This is primarily the result of the lack of a unified theory in catchment hydrology, due to insufficient process understanding and uncertainties related to model development and application. Therefore, the goal of this study is to analyze the uncertainty structure of a process-based hydrological catchment model employing a multiple hypotheses approach. The study focuses on three major problems that have received little attention in previous investigations: first, estimating the impact of model structural uncertainty by employing several alternative representations for each simulated process; second, exploring the influence of landscape discretization and of parameterization derived from multiple datasets and user decisions; and third, employing several numerical solvers for the integration of the governing ordinary differential equations to study their effect on simulation results. The generated ensemble of model hypotheses is then analyzed and the three sources of uncertainty compared against each other. To ensure consistency and comparability, all model structures and numerical solvers are implemented within a single simulation environment. First results suggest that the selection of a sophisticated numerical solver for the differential equations positively affects simulation outcomes. However, some simple and easy-to-implement explicit methods already perform surprisingly well and need less computational effort than more advanced but time-consuming implicit techniques. There is general evidence that ambiguous and subjective user decisions form a major source of uncertainty and can greatly influence model development and application at all stages.
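
    The solver comparison the abstract reports can be illustrated on a linear-reservoir store (dS/dt = -kS), a common building block of conceptual hydrological models. Both schemes below are first-order accurate, but only the implicit one remains stable at large steps; parameter values are arbitrary:

```python
import numpy as np

def simulate(dt, T=6.0, k=2.0, S0=100.0):
    # Linear reservoir dS/dt = -k*S: forward vs backward Euler.
    n = int(round(T / dt))
    S_exp = S_imp = S0
    for _ in range(n):
        S_exp = S_exp * (1.0 - k * dt)   # explicit (forward) Euler
        S_imp = S_imp / (1.0 + k * dt)   # implicit (backward) Euler
    return S_exp, S_imp, S0 * np.exp(-k * T)

# Small step: both first-order schemes track the exact decay.
e_small, i_small, exact = simulate(dt=0.01)

# Large step (k*dt = 3 > 2): forward Euler blows up with
# alternating sign, while backward Euler stays bounded and positive.
e_big, i_big, _ = simulate(dt=1.5)
```

This mirrors the abstract's finding: a cheap explicit scheme is perfectly adequate as long as the step respects its stability limit, while the implicit scheme buys robustness at extra per-step cost.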

  19. Integrating Multiple Autonomous Underwater Vessels, Surface Vessels and Aircraft into Oceanographic Research Vessel Operations

    NASA Astrophysics Data System (ADS)

    McGillivary, P. A.; Borges de Sousa, J.; Martins, R.; Rajan, K.

    2012-12-01

    Autonomous platforms are increasingly used as components of Integrated Ocean Observing Systems and oceanographic research cruises. Systems deployed can include gliders or propeller-driven autonomous underwater vessels (AUVs), autonomous surface vessels (ASVs), and unmanned aircraft systems (UAS). Prior field campaigns have demonstrated successful communication, sensor data fusion and visualization for studies using gliders and AUVs. However, additional requirements exist for incorporating ASVs and UASs into ship operations. Optimally integrating these systems into research vessel data management and operational planning involves addressing three key issues: real-time field data availability, platform coordination, and data archiving for later analysis. A fleet of AUVs, ASVs and UAS deployed from a research vessel is best operated as a system integrated with the ship, provided communications among them can be sustained. For this purpose, Disruption-Tolerant Networking (DTN) software protocols for operation in communication-challenged environments help ensure reliable high-bandwidth communications. Additionally, system components need considerable onboard autonomy, namely adaptive sampling capabilities driven by analysis of their own onboard sensor data streams. We discuss the Oceanographic Decision Support System (ODSS) software currently used for situational awareness and planning onshore; in the near future, event detection and response will be coordinated among multiple vehicles. Results from recent field studies from oceanographic research vessels using AUVs, ASVs and UAS, including the Rapid Environmental Picture (REP-12) cruise, are presented, describing methods and results for the use of multi-vehicle communication and deliberative control networks, adaptive sampling with single and multiple platforms, issues relating to data management and archiving, and the challenges that remain in addressing these technological issues.

  20. Multiple Moment Tensor Inversions For the December 26, 2004 Sumatra Earthquake Based Upon Adjoint Methods

    NASA Astrophysics Data System (ADS)

    Ren, L.; Liu, Q.; Hjörleifsdóttir, V.

    2010-12-01

    We present multiple moment-tensor solutions of the Dec 26, 2004 Sumatra earthquake based upon adjoint methods. An objective function Φ(m), where m is the multiple source model, measures the goodness of waveform fit between data and synthetics. The Fréchet derivatives of Φ, of the form δΦ = ∫∫ ɛ†ij(x, T-t) δṁij(x, t) dV dt, where δmij is the source model perturbation, δṁij its time derivative, and ɛ†ij(x, T-t) the time-integrated adjoint strain tensor, are calculated based upon adjoint methods and spectral-element simulations (SPECFEM3D_GLOBE) in the 3D global earth model S362ANI. Our initial source model is obtained independently by monitoring the time-integrated adjoint strain tensors around the presumed source region. We then utilize the Φ and δΦ calculations in a conjugate-gradient method to iteratively invert for the source model. Our final inversion results show both similarities to and differences from previous source inversion results based on 1D earth models.

  1. Monitoring gray wolf populations using multiple survey methods

    USGS Publications Warehouse

    Ausband, David E.; Rich, Lindsey N.; Glenn, Elizabeth M.; Mitchell, Michael S.; Zager, Pete; Miller, David A.W.; Waits, Lisette P.; Ackerman, Bruce B.; Mack, Curt M.

    2013-01-01

    The behavioral patterns and large territories of large carnivores make them challenging to monitor. Occupancy modeling provides a framework for monitoring population dynamics and distribution of territorial carnivores. We combined data from hunter surveys, howling and sign surveys conducted at predicted wolf rendezvous sites, and locations of radiocollared wolves to model occupancy and estimate the number of gray wolf (Canis lupus) packs and individuals in Idaho during 2009 and 2010. We explicitly accounted for potential misidentification of occupied cells (i.e., false positives) using an extension of the multi-state occupancy framework. We found agreement between model predictions and distribution and estimates of number of wolf packs and individual wolves reported by Idaho Department of Fish and Game and Nez Perce Tribe from intensive radiotelemetry-based monitoring. Estimates of individual wolves from occupancy models that excluded data from radiocollared wolves were within an average of 12.0% (SD = 6.0) of existing statewide minimum counts. Models using only hunter survey data generally estimated the lowest abundance, whereas models using all data generally provided the highest estimates of abundance, although only marginally higher. Precision across approaches ranged from 14% to 28% of mean estimates and models that used all data streams generally provided the most precise estimates. We demonstrated that an occupancy model based on different survey methods can yield estimates of the number and distribution of wolf packs and individual wolf abundance with reasonable measures of precision. Assumptions of the approach including that average territory size is known, average pack size is known, and territories do not overlap, must be evaluated periodically using independent field data to ensure occupancy estimates remain reliable. Use of multiple survey methods helps to ensure that occupancy estimates are robust to weaknesses or changes in any 1 survey method

  2. A survey of payload integration methods

    NASA Technical Reports Server (NTRS)

    Engels, R. C.; Craig, R. R., Jr.; Harcrow, H. W.

    1984-01-01

    Several full-scale and short-cut methods for analyzing a booster/payload system are presented. Two full-scale techniques are considered: (1) a technique that uses a restrained payload together with a free-booster model, the latter being augmented with residual mass and stiffness correction and (2) a technique that uses a restrained payload and booster model. Both techniques determine the 'modal modes', which require the solution of a system eigenvalue problem; the loads usually are then determined via an acceleration approach. A brief description is given of a number of short-cut methods which are of special interest to Shuttle payload design: structural modification, base drive, and interface impedance methods. Directions for further research and development are suggested.

  3. Method for distinguishing multiple targets using time-reversal acoustics

    DOEpatents

    Berryman, James G.

    2004-06-29

    A method for distinguishing multiple targets using time-reversal acoustics. Time-reversal acoustics uses an iterative process to determine the optimum signal for locating a strongly reflecting target in a cluttered environment. An acoustic array sends a signal into a medium, and then receives the returned/reflected signal. This returned/reflected signal is time-reversed and sent back into the medium again and again, until the signal being sent and received is no longer changing. At that point, the array has isolated the largest eigenvalue/eigenvector combination and has effectively determined the location of a single target in the medium (the one that is most strongly reflecting). To determine the locations of other targets once the largest eigenvalue/eigenvector combination has been found, the method sends back the time-reversed signals as before, but with half of them also reversed in sign. There are various possibilities for choosing which half to sign-reverse; the most obvious choices are to reverse every other element in a linear array, or to use a checkerboard pattern in 2D. Then a new send/receive, time-reverse/resend iteration can proceed. Often, the first iteration in this sequence will be close to the desired signal from a second target. In some cases, orthogonalization procedures must be implemented to ensure the returned signals are in fact orthogonal to the first eigenvector found.
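
    Mathematically, the iterative send/time-reverse/resend loop behaves like power iteration on the array's round-trip operator, and the sign-flip trick supplies a starting guess with a usable component orthogonal to the first eigenvector. A synthetic sketch (the matrix M stands in for the physical medium; sizes, reflectivities, and the tiny perturbation are invented):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 16
# Two dominant "targets" with distinct reflectivities plus weak clutter.
t1, t2 = np.zeros(n), np.zeros(n)
t1[3], t2[11] = 1.0, 1.0
M = 5.0 * np.outer(t1, t1) + 3.0 * np.outer(t2, t2)
clutter = rng.normal(scale=0.01, size=(n, n))
M = M + (clutter + clutter.T) / 2          # symmetric round-trip operator

def iterate(signal, ortho_to=None, steps=200):
    # Send/receive plus time reversal = repeated application of M.
    for _ in range(steps):
        signal = M @ signal
        if ortho_to is not None:           # orthogonalization step
            signal -= (ortho_to @ signal) * ortho_to
        signal /= np.linalg.norm(signal)
    return signal

v1 = iterate(np.ones(n))                   # converges to strongest target

# Sign-reversing half the converged signal gives a starting guess with
# a large component orthogonal to v1; orthogonalization then isolates
# the second target.
guess = v1 * np.where(np.arange(n) % 2 == 0, 1, -1)
guess = guess + 1e-6 * rng.normal(size=n)  # numerical safeguard
v2 = iterate(guess, ortho_to=v1)

loc1, loc2 = np.argmax(np.abs(v1)), np.argmax(np.abs(v2))
```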

  4. Treatment of domain integrals in boundary element methods

    SciTech Connect

    Nintcheu Fata, Sylvain

    2012-01-01

    A systematic and rigorous technique to calculate domain integrals without a volume-fitted mesh has been developed and validated in the context of a boundary element approximation. In the proposed approach, a domain integral involving a continuous or weakly-singular integrand is first converted into a surface integral by means of straight-path integrals that intersect the underlying domain. Then, the resulting surface integral is carried out either via analytic integration over boundary elements or by use of standard quadrature rules. This domain-to-boundary integral transformation is derived from an extension of the fundamental theorem of calculus to higher dimensions, and the divergence theorem. In establishing the method, it is shown that the higher-dimensional version of the first fundamental theorem of calculus corresponds to the well-known Poincaré lemma. The proposed technique can be employed to evaluate integrals defined over simply- or multiply-connected domains with Lipschitz boundaries which are embedded in a Euclidean space of arbitrary but finite dimension. Combined with the singular treatment of surface integrals that is widely available in the literature, this approach can also be utilized to effectively deal with boundary-value problems involving non-homogeneous source terms by way of a collocation or a Galerkin boundary integral equation method using only the prescribed surface discretization. Sample problems associated with the three-dimensional Poisson equation and featuring the Newton potential are successfully solved by a constant element collocation method to validate this study.
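
    The idea of trading a volume-fitted quadrature for a boundary-only one can be checked numerically with the divergence theorem in its simplest form: for F = (x, y), div F = 2, so the domain integral of 2 over the unit disk must equal the flux of F through the unit circle, namely 2π. This toy check uses plain quadrature rather than the paper's straight-path construction:

```python
import numpy as np

# Left side: naive volume quadrature needs a 2-D (area) mesh.
N = 2000
xs = np.linspace(-1, 1, N)
dx = xs[1] - xs[0]
X, Y = np.meshgrid(xs, xs)
inside = X**2 + Y**2 <= 1.0
domain_integral = 2.0 * inside.sum() * dx * dx      # approx. 2*pi

# Right side: only the boundary is discretized.
theta = np.linspace(0, 2 * np.pi, 4000, endpoint=False)
dtheta = theta[1] - theta[0]
F_dot_n = np.cos(theta)**2 + np.sin(theta)**2       # F.n on the unit circle
boundary_integral = (F_dot_n * dtheta).sum()        # exactly 2*pi
```

The boundary route needs far fewer points and no area mesh, which is the practical payoff of domain-to-boundary transformations in boundary element methods.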

  5. Hollow fiber integrated microfluidic platforms for in vitro Co-culture of multiple cell types.

    PubMed

    Huang, Jen-Huang; Harris, Jennifer F; Nath, Pulak; Iyer, Rashi

    2016-10-01

    This study demonstrates a rapid prototyping approach for fabricating and integrating porous hollow fibers (HFs) into microfluidic device. Integration of HF can enhance mass transfer and recapitulate tubular shapes for tissue-engineered environments. We demonstrate the integration of single or multiple HFs, which can give the users the flexibility to control the total surface area for tissue development. We also present three microfluidic designs to enable different co-culture conditions such as the ability to co-culture multiple cell types simultaneously on a flat and tubular surface, or inside the lumen of multiple HFs. Additionally, we introduce a pressurized cell seeding process that can allow the cells to uniformly adhere on the inner surface of HFs without losing their viabilities. Co-cultures of lung epithelial cells and microvascular endothelial cells were demonstrated on the different platforms for at least five days. Overall, these platforms provide new opportunities for co-culturing of multiple cell types in a single device to reconstruct native tissue micro-environment for biomedical and tissue engineering research. PMID:27613401

  6. Computational methods for inlet airframe integration

    NASA Technical Reports Server (NTRS)

    Towne, Charles E.

    1988-01-01

    Fundamental equations encountered in computational fluid dynamics (CFD), and analyses used for internal flow are introduced. Irrotational flow; Euler equations; boundary layers; parabolized Navier-Stokes equations; and time averaged Navier-Stokes equations are treated. Assumptions made and solution methods are outlined, with examples. The overall status of CFD in propulsion is indicated.

  7. Integrated method for chaotic time series analysis

    DOEpatents

    Hively, L.M.; Ng, E.G.

    1998-09-29

    Methods and apparatus are disclosed for automatically detecting differences between similar but different states in a nonlinear process by monitoring nonlinear data. Steps include: acquiring the data; digitizing the data; obtaining nonlinear measures of the data via chaotic time series analysis; obtaining time serial trends in the nonlinear measures; and determining by comparison whether differences between similar but different states are indicated. 8 figs.

  8. Integrated method for chaotic time series analysis

    DOEpatents

    Hively, Lee M.; Ng, Esmond G.

    1998-01-01

    Methods and apparatus for automatically detecting differences between similar but different states in a nonlinear process by monitoring nonlinear data. Steps include: acquiring the data; digitizing the data; obtaining nonlinear measures of the data via chaotic time series analysis; obtaining time serial trends in the nonlinear measures; and determining by comparison whether differences between similar but different states are indicated.
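
    A minimal example of the "nonlinear measures via chaotic time series analysis" step is time-delay embedding followed by a correlation sum; the logistic map supplies chaotic test data. This is generic textbook machinery, not the patented method itself:

```python
import numpy as np

# Time-delay embedding: a scalar series s(t) is unfolded into
# state-space vectors [s(t), s(t+tau), ..., s(t+(dim-1)*tau)].
def embed(series, dim=3, tau=5):
    n = len(series) - (dim - 1) * tau
    return np.column_stack([series[i * tau : i * tau + n] for i in range(dim)])

# A simple nonlinear measure on the embedded data: the correlation
# sum C(r), the fraction of vector pairs closer than r.
def correlation_sum(X, r):
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    iu = np.triu_indices(len(X), k=1)
    return (d[iu] < r).mean()

# Logistic map in its chaotic regime as test data.
s = np.empty(2000)
s[0] = 0.4
for i in range(1999):
    s[i + 1] = 3.99 * s[i] * (1 - s[i])

X = embed(s, dim=3, tau=1)
C = correlation_sum(X[:500], r=0.25)
```

Tracking a measure like C(r) over time, and comparing its trend between process states, is the kind of "time serial trend" comparison the patent describes.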

  9. Integrating Formal Methods and Testing 2002

    NASA Technical Reports Server (NTRS)

    Cukic, Bojan

    2002-01-01

    Traditionally, qualitative program verification methodologies and program testing are studied in separate research communities. Neither alone is powerful and practical enough to provide sufficient confidence in ultra-high reliability assessment when used exclusively. Significant advances can be made by accounting not only for formal verification and program testing, but also for the impact of many other standard V&V techniques, in a unified software reliability assessment framework. The first year of this research resulted in a statistical framework that, given assumptions on the success of the qualitative V&V and QA procedures, significantly reduces the amount of testing needed to confidently assess reliability at so-called high and ultra-high levels (failure rates of 10^-4 or better). The coming years will address methodologies to realistically estimate the impact of various V&V techniques on system reliability and include the impact of operational risk in reliability assessment. The objectives are to: A) combine formal correctness verification, process and product metrics, and other standard qualitative software assurance methods with statistical testing, with the aim of gaining higher confidence in software reliability assessment for high-assurance applications; B) quantify the impact of these methods on software reliability; C) demonstrate that accounting for the effectiveness of these methods reduces the number of tests needed to attain a given confidence level; and D) quantify and justify the reliability estimate for systems developed using various methods.

  10. Investigation of the Multiple Model Adaptive Control (MMAC) method for flight control systems

    NASA Technical Reports Server (NTRS)

    Athans, M.; Baram, Y.; Castanon, D.; Dunn, K. P.; Green, C. S.; Lee, W. H.; Sandell, N. R., Jr.; Willsky, A. S.

    1979-01-01

    The stochastic adaptive control of the NASA F-8C digital-fly-by-wire aircraft using the multiple model adaptive control (MMAC) method is presented. The selection of the performance criteria for the lateral and the longitudinal dynamics, the design of the Kalman filters for different operating conditions, the identification algorithm associated with the MMAC method, the control system design, and simulation results obtained using the real time simulator of the F-8 aircraft at the NASA Langley Research Center are discussed.
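
    The MMAC recursion can be sketched on a scalar toy plant: candidate models run in parallel, each one-step prediction residual feeds a Gaussian likelihood, posterior model probabilities are updated recursively, and the applied control is the probability-weighted blend of per-model gains. All numbers below are invented; this is not the F-8C design:

```python
import numpy as np

rng = np.random.default_rng(7)

a_true, sigma = 0.85, 0.05           # true plant: x' = a*x + u + noise
models = np.array([0.5, 0.85, 1.2])  # candidate a-values ("flight conditions")
gains = -models                      # per-model deadbeat gains u = -a_hat * x
prob = np.ones(3) / 3                # uniform prior over models

x = 1.0
for _ in range(60):
    u = float(prob @ gains) * x      # probability-weighted control
    x_next = a_true * x + u + rng.normal(scale=sigma)
    # Residual likelihood of each candidate model for this transition.
    resid = x_next - (models * x + u)
    like = np.exp(-0.5 * (resid / sigma) ** 2)
    prob = prob * like               # recursive Bayesian update
    prob /= prob.sum()
    x = x_next

best = models[np.argmax(prob)]       # identified operating condition
```

In the real MMAC design each candidate model carries its own Kalman filter, and the likelihoods come from the filters' innovation statistics rather than raw one-step residuals.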