Science.gov

Sample records for multiple method integration

  1. Multiple methods integration for structural mechanics analysis and design

    NASA Technical Reports Server (NTRS)

    Housner, J. M.; Aminpour, M. A.

    1991-01-01

    A new research area of multiple methods integration is proposed for joining diverse methods of structural mechanics analysis which interact with one another. Three categories of multiple methods are defined: those in which a physical interface is well defined; those in which a physical interface is not well defined but selected; and those in which the interface is a mathematical transformation. Two fundamental integration procedures are presented that can be extended to integrate various methods (e.g., finite elements, Rayleigh-Ritz, Galerkin, and integral methods) with one another. Since the finite element method will likely be the major method to be integrated, its enhanced robustness under element distortion is also examined and a new robust shell element is demonstrated.

  2. Method and system of integrating information from multiple sources

    DOEpatents

    Alford, Francine A.; Brinkerhoff, David L.

    2006-08-15

    A system and method of integrating information from multiple sources in a document centric application system. A plurality of application systems are connected through an object request broker to a central repository. The information may then be posted on a webpage. An example of an implementation of the method and system is an online procurement system.

  3. Integrating Multiple Teaching Methods into a General Chemistry Classroom.

    ERIC Educational Resources Information Center

    Francisco, Joseph S.; Nicoll, Gayle; Trautmann, Marcella

    1998-01-01

    Four different methods of teaching--cooperative learning, class discussions, concept maps, and lectures--were integrated into a freshman-level general chemistry course to compare students' levels of participation. Findings support the idea that multiple modes of learning foster the metacognitive skills necessary for mastering general chemistry.…

  4. Integrating Multiple Teaching Methods into a General Chemistry Classroom

    NASA Astrophysics Data System (ADS)

    Francisco, Joseph S.; Nicoll, Gayle; Trautmann, Marcella

    1998-02-01

    In addition to the traditional lecture format, three other teaching strategies (class discussions, concept maps, and cooperative learning) were incorporated into a freshman-level general chemistry course. Student perceptions of their involvement in each of the teaching methods, as well as their perceptions of the utility of each method, were used to assess the effectiveness of the integration of the teaching strategies as received by the students. Results suggest that each strategy serves a unique purpose for the students and increased student involvement in the course. These results indicate that the multiple teaching strategies were well received by the students and that all teaching strategies are necessary for students to get the most out of the course.

  5. On a New Simple Method for Evaluation of Certain Multiple Definite Integrals

    ERIC Educational Resources Information Center

    Sen Gupta, I.; Debnath, L.

    2006-01-01

    This paper deals with a simple method of evaluation of certain multiple definite integrals. This is followed by two main theorems concerning multiple definite integrals. Some examples of applications are given.
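The abstract does not reproduce the paper's closed-form technique, but as a baseline, a multiple definite integral over a rectangle can always be evaluated by iterating a one-dimensional rule (Fubini's theorem). The sketch below uses composite Simpson's rule; the integrand and limits are invented for illustration.

```python
# Iterated 1-D Simpson's rule for a double integral over a rectangle.
# A generic numerical baseline, not the paper's own method.

def simpson(g, a, b, n=100):
    """Composite Simpson's rule for g on [a, b] (n must be even)."""
    h = (b - a) / n
    s = g(a) + g(b)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * g(a + i * h)
    return s * h / 3

def double_integral(f, ax, bx, ay, by, n=100):
    """Evaluate the iterated integral of f(x, y) over [ax, bx] x [ay, by]."""
    return simpson(lambda x: simpson(lambda y: f(x, y), ay, by, n), ax, bx, n)

# Example: the double integral of x*y over the unit square is 1/4.
val = double_integral(lambda x, y: x * y, 0.0, 1.0, 0.0, 1.0)
```

Simpson's rule is exact for polynomials up to degree three, so this example reproduces the analytic value up to rounding error.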

  6. On a New Simple Method for Evaluation of Certain Multiple Definite Integrals

    ERIC Educational Resources Information Center

    Sen Gupta, I.; Debnath, L.

    2006-01-01

    This paper deals with a simple method of evaluation of certain multiple definite integrals. This is followed by two main theorems concerning multiple definite integrals. Some examples of applications are given.

  7. Numerical solution of optimal control problems using multiple-interval integral Gegenbauer pseudospectral methods

    NASA Astrophysics Data System (ADS)

    Tang, Xiaojun

    2016-04-01

    The main purpose of this work is to provide multiple-interval integral Gegenbauer pseudospectral methods for solving optimal control problems. The recently developed single-interval integral Gauss/(flipped Radau) pseudospectral methods can be viewed as special cases of the proposed methods. We present an exact and efficient approach to compute the mesh pseudospectral integration matrices for the Gegenbauer-Gauss and flipped Gegenbauer-Gauss-Radau points. Numerical results on benchmark optimal control problems confirm the ability of the proposed methods to obtain highly accurate solutions.

  8. A method for integrating multiple components in a decision support system

    Treesearch

    Donald Nute; Walter D. Potter; Zhiyuan Cheng; Mayukh Dass; Astrid Glende; Frederick Maierv; Cy Routh; Hajime Uchiyama; Jin Wang; Sarah Witzig; Mark Twery; Peter Knopp; Scott Thomasma; H. Michael Rauscher

    2005-01-01

    We present a flexible, extensible method for integrating multiple tools into a single large decision support system (DSS) using a forest ecosystem management DSS (NED-2) as an example. In our approach, a rich ontology for the target domain is developed and implemented in the internal data model for the DSS. Semi-autonomous agents control external components and...

  9. Multiple program/multiple data molecular dynamics method with multiple time step integrator for large biological systems.

    PubMed

    Jung, Jaewoon; Sugita, Yuji

    2017-06-15

    Parallelization of molecular dynamics (MD) simulation is essential for investigating conformational dynamics of large biological systems, such as ribosomes, viruses, and multiple proteins in cellular environments. To improve efficiency in the parallel computation, we have to reduce the amount of data transfer between processors by introducing domain decomposition schemes. It is also important to optimize the computational balance between real-space non-bonded interactions and reciprocal-space interactions for long-range electrostatics. Here, we introduce a novel parallelization scheme for large-scale MD simulations on massively parallel supercomputers consisting of only CPUs. We make use of a multiple program/multiple data (MPMD) approach for separating the real-space and reciprocal-space computations on different processors. We also utilize the r-RESPA multiple time step integrator within the MPMD framework in an efficient way: when the reciprocal-space computations are skipped in r-RESPA, the processors assigned to them are used for half of the real-space computations. The new scheme allows us to use twice as many processors as are available in the conventional single-program approach. The best performances of all-atom MD simulations for 1 million (STMV), 8.5 million (8_STMV), and 28.8 million (27_STMV) atom systems on the K computer are 65, 36, and 24 ns/day, respectively. The MPMD scheme gains 23.4, 10.2, and 9.2 ns/day over the maximum performance of the single-program approach for the STMV, 8_STMV, and 27_STMV systems, respectively, corresponding to speedups of 57%, 39%, and 60%. This suggests significant speedups can be obtained by increasing the number of processors without losing parallel computational efficiency. © 2016 Wiley Periodicals, Inc.
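The r-RESPA multiple time step idea referenced above can be sketched for a single degree of freedom: the slow force is applied as an impulse around an inner loop that integrates the fast force at a smaller step. This is a generic illustration of the splitting, not the authors' parallel MPMD implementation; the harmonic force split and all parameters are invented for the example.

```python
def respa_step(x, v, m, f_fast, f_slow, dt_outer, n_inner):
    """One r-RESPA step: slow-force half-kick, n_inner velocity-Verlet
    substeps with the fast force, then the closing slow-force half-kick."""
    dt_inner = dt_outer / n_inner
    v += 0.5 * dt_outer * f_slow(x) / m          # outer half-kick (slow force)
    for _ in range(n_inner):                     # inner loop (fast force)
        v += 0.5 * dt_inner * f_fast(x) / m
        x += dt_inner * v
        v += 0.5 * dt_inner * f_fast(x) / m
    v += 0.5 * dt_outer * f_slow(x) / m          # closing outer half-kick
    return x, v

# Demo: split a harmonic force into a stiff "fast" and weak "slow" part
# and check that total energy stays bounded (the method is symplectic).
m, k_fast, k_slow = 1.0, 1.0, 0.1
x, v = 1.0, 0.0
energy0 = 0.5 * m * v**2 + 0.5 * (k_fast + k_slow) * x**2
for _ in range(1000):
    x, v = respa_step(x, v, m,
                      lambda q: -k_fast * q, lambda q: -k_slow * q,
                      dt_outer=0.1, n_inner=5)
energy1 = 0.5 * m * v**2 + 0.5 * (k_fast + k_slow) * x**2
```

Because the outer step here is far below the resonance barrier discussed in record 11, the energy error remains small over the whole run.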

  10. Jacobian integration method increases the statistical power to measure gray matter atrophy in multiple sclerosis.

    PubMed

    Nakamura, Kunio; Guizard, Nicolas; Fonov, Vladimir S; Narayanan, Sridar; Collins, D Louis; Arnold, Douglas L

    2014-01-01

    Gray matter atrophy provides important insights into neurodegeneration in multiple sclerosis (MS) and can be used as a marker of neuroprotection in clinical trials. Jacobian integration is a method for measuring volume change that uses integration of the local Jacobian determinants of the nonlinear deformation field registering two images, and is a promising tool for measuring gray matter atrophy. Our main objective was to compare the statistical power of the Jacobian integration method to commonly used methods in terms of the sample size required to detect a treatment effect on gray matter atrophy. We used multi-center longitudinal data from relapsing-remitting MS patients and evaluated combinations of cross-sectional and longitudinal pre-processing with SIENAX/FSL, SPM, and FreeSurfer, as well as the Jacobian integration method. The Jacobian integration method outperformed these other commonly used methods, reducing the required sample size by a factor of 4-5. The results demonstrate the advantage of using the Jacobian integration method to assess neuroprotection in MS clinical trials.
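The core of Jacobian integration, summing the local Jacobian determinants of the deformation field over a region of interest to measure its volume change, can be sketched in a few lines of NumPy. The deformation field and grid below are synthetic stand-ins for illustration, not the nonlinear registration output the authors use.

```python
import numpy as np

def volume_after_deformation(phi_x, phi_y, mask):
    """Integrate the Jacobian determinant of a 2-D deformation field
    (phi_x, phi_y), sampled on a unit-spacing grid, over a region mask.
    The sum approximates the region's area after deformation."""
    dxx, dxy = np.gradient(phi_x)   # d(phi_x)/d(row), d(phi_x)/d(col)
    dyx, dyy = np.gradient(phi_y)
    det_j = dxx * dyy - dxy * dyx   # local Jacobian determinant
    return det_j[mask].sum()        # sum ~ integral on a unit-spacing grid

# Example: a uniform 10% linear expansion has det J = 1.1**2 = 1.21
rows, cols = np.mgrid[0:32, 0:32].astype(float)
phi_x, phi_y = 1.1 * rows, 1.1 * cols
mask = np.ones((32, 32), dtype=bool)
ratio = volume_after_deformation(phi_x, phi_y, mask) / mask.sum()  # mean det J
```

For this linear field the finite differences are exact, so the recovered volume ratio matches the analytic determinant.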

  11. Error and timing analysis of multiple time-step integration methods for molecular dynamics

    NASA Astrophysics Data System (ADS)

    Han, Guowen; Deng, Yuefan; Glimm, James; Martyna, Glenn

    2007-02-01

    Molecular dynamics simulations of biomolecules performed using multiple time-step integration methods are hampered by resonance instabilities. We analyze the properties of a simple 1D linear system integrated with the symplectic reference system propagator MTS (r-RESPA) technique, following earlier work by others. A closed-form expression for the time-step-dependent Hamiltonian which corresponds to r-RESPA integration of the model is derived. This permits us to present an analytic formula for the dependence of the integration accuracy on the short-range force cutoff range. A detailed analysis of the force decomposition for the standard Ewald summation method is then given, as the Ewald method is a good candidate to achieve high scaling on modern massively parallel machines. We test the new analysis on a realistic system, a protein in water. Under Langevin dynamics with a weak friction coefficient (ζ = 1 ps⁻¹) to maintain temperature control and using the SHAKE algorithm to freeze out high-frequency vibrations, we show that the 5 fs resonance barrier present when all degrees of freedom are unconstrained is postponed to ≈12 fs. An iso-error boundary with respect to the short-range cutoff range and multiple time step size agrees well with the analytical results, which are valid due to the dominance of the high-frequency modes in determining integrator accuracy. Using r-RESPA to treat the long-range interactions results in a 6× increase in efficiency for the decomposition described in the text.

  12. A Nonparametric, Multiple Imputation-Based Method for the Retrospective Integration of Data Sets

    PubMed Central

    Carrig, Madeline M.; Manrique-Vallier, Daniel; Ranby, Krista W.; Reiter, Jerome P.; Hoyle, Rick H.

    2015-01-01

    Complex research questions often cannot be addressed adequately with a single data set. One sensible alternative to the high cost and effort associated with the creation of large new data sets is to combine existing data sets containing variables related to the constructs of interest. The goal of the present research was to develop a flexible, broadly applicable approach to the integration of disparate data sets that is based on nonparametric multiple imputation and the collection of data from a convenient, de novo calibration sample. We demonstrate proof of concept for the approach by integrating three existing data sets containing items related to the extent of problematic alcohol use and associations with deviant peers. We discuss both necessary conditions for the approach to work well and potential strengths and weaknesses of the method compared to other data set integration approaches. PMID:26257437

  13. System and method for integrating and accessing multiple data sources within a data warehouse architecture

    DOEpatents

    Musick, Charles R.; Critchlow, Terence; Ganesh, Madhaven; Slezak, Tom; Fidelis, Krzysztof

    2006-12-19

    A system and method is disclosed for integrating and accessing multiple data sources within a data warehouse architecture. The metadata formed by the present method provide a way to declaratively present domain specific knowledge, obtained by analyzing data sources, in a consistent and useable way. Four types of information are represented by the metadata: abstract concepts, databases, transformations and mappings. A mediator generator automatically generates data management computer code based on the metadata. The resulting code defines a translation library and a mediator class. The translation library provides a data representation for domain specific knowledge represented in a data warehouse, including "get" and "set" methods for attributes that call transformation methods and derive a value of an attribute if it is missing. The mediator class defines methods that take "distinguished" high-level objects as input and traverse their data structures and enter information into the data warehouse.

  14. Multiple integral computations

    NASA Astrophysics Data System (ADS)

    Chaloupka, Jan; Kocina, Filip; Veigend, Petr; Nečasová, Gabriela; Kunovský, Jiří; Šátek, Václav

    2017-07-01

    Extending standard numeric integration methods to multi-integrals is possible; however, the numeric effort grows significantly for a given accuracy. In this paper the Modern Taylor Series Method (MTSM) is extended to multi-integrals with the benefit of (theoretically) arbitrary accuracy while being highly parallelizable.
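The cost growth mentioned above is the key motivation: a tensor-product quadrature rule needs n**d points in d dimensions. As a point of comparison (this is a plain Monte Carlo baseline, not the MTSM itself, and the integrand is invented), a sampling estimator's point count is independent of dimension:

```python
import random

def mc_integrate(f, bounds, n=100_000, seed=42):
    """Plain Monte Carlo estimate of a multiple integral of f over a box.
    bounds is a list of (lo, hi) pairs, one per dimension.  Unlike a
    tensor-product rule, whose cost grows as n**d, the sample count
    here does not depend on the number of dimensions."""
    rng = random.Random(seed)
    vol = 1.0
    for lo, hi in bounds:
        vol *= hi - lo
    total = 0.0
    for _ in range(n):
        pt = [rng.uniform(lo, hi) for lo, hi in bounds]
        total += f(pt)
    return vol * total / n

# Example: the integral of (x + y + z) over the unit 3-cube is 1.5.
est = mc_integrate(lambda p: sum(p), [(0, 1)] * 3)
```

The trade-off is accuracy: the Monte Carlo error shrinks only as n**-0.5, whereas the MTSM targets (theoretically) arbitrary accuracy.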

  15. Method for detection and identification of multiple chromosomal integration sites in transgenic animals created with lentivirus.

    PubMed

    Bryda, Elizabeth C; Pearson, Michael; Agca, Yuksel; Bauer, Beth A

    2006-12-01

    Transgene delivery systems, particularly those involving retroviruses, often result in the integration of multiple copies of the transgene throughout the host genome. Since site-specific silencing of transgenes can occur, it becomes important to identify the number and chromosomal location of the multiple copies of the transgenes in order to correlate inheritance of the transgene at a particular chromosomal site with a specific and robust phenotype. Using a technique that combines restriction endonuclease digest and several rounds of PCR amplification followed by nucleotide sequencing, it is possible to identify multiple chromosomal integration sites in transgenic founder animals. By designing genotyping assays to detect each individual integration site in the offspring of these founders, the inheritance of transgenes integrated at specific chromosomal locations can be followed efficiently as the transgenes randomly segregate in subsequent generations. Phenotypic characteristics can then be correlated with inheritance of a transgene integrated at a particular chromosomal location to allow rational selection of breeding animals in order to establish the transgenic line.

  16. An Alzheimer's disease related genes identification method based on multiple classifier integration.

    PubMed

    Miao, Yu; Jiang, Huiyan; Liu, Huiling; Yao, Yu-Dong

    2017-10-01

    Alzheimer's disease (AD) is a fatal neurodegenerative disease with an insidious onset. Full understanding of the AD-related genes (ADGs) has not been achieved. The National Center for Biotechnology Information (NCBI) provides an AD dataset of 22,283 genes. Among these genes, 71 have been identified as ADGs, but there may still be underlying ADGs that have not yet been identified among the remaining 22,212 genes. This paper aims to identify additional ADGs using machine learning techniques. To improve the accuracy of ADG identification, we propose a gene identification method through multiple classifier integration. First, a feature selection algorithm is applied to select the most relevant attributes. Second, a two-stage cascading classifier is developed to identify ADGs. The first-stage classification task is based on the relevance vector machine and, in the second stage, the results of three classifiers (support vector machine, random forest, and extreme learning machine) are combined through voting. According to our results, feature selection improves accuracy and reduces training time, and the voting-based classifier reduces classification errors. The proposed ADG identification system provides accuracy, sensitivity, and specificity of 78.77%, 83.10%, and 74.67%, respectively. Based on the proposed method, potentially additional ADGs are identified and the top 13 predicted ADGs are presented. The combination of feature selection, a cascading classifier, and majority voting leads to higher specificity and significantly increases the accuracy and sensitivity of ADG identification. Copyright © 2017 Elsevier B.V. All rights reserved.
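The second-stage combination described above reduces to a majority rule over per-classifier predictions. The sketch below uses invented toy labels standing in for the paper's SVM, random forest, and extreme learning machine outputs:

```python
from collections import Counter

def majority_vote(predictions):
    """Combine per-classifier label lists by majority vote.
    predictions: list of equal-length label sequences, one per classifier.
    With an odd number of classifiers and binary labels there are no ties."""
    combined = []
    for labels in zip(*predictions):            # one tuple per sample
        combined.append(Counter(labels).most_common(1)[0][0])
    return combined

# Hypothetical binary predictions (1 = predicted ADG) from three classifiers:
svm = [1, 0, 1]
rf  = [1, 0, 0]
elm = [0, 1, 0]
votes = majority_vote([svm, rf, elm])
```

An odd number of voters is the usual choice precisely because it avoids ties on binary labels.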

  17. Statistical methods for integrating multiple types of high-throughput data.

    PubMed

    Xie, Yang; Ahn, Chul

    2010-01-01

    Large-scale sequencing, copy number, mRNA, and protein data hold great promise for biomedical research, while posing great challenges to data management and analysis. Integrating different types of high-throughput data from diverse sources can increase the statistical power of data analysis and provide deeper biological understanding. This chapter uses two biomedical research examples to illustrate the urgent need for reliable and robust methods for integrating heterogeneous data. We then introduce and review some recently developed statistical methods for integrative analysis, for both statistical inference and classification purposes. Finally, we present some useful public-access databases and program code to facilitate integrative analysis in practice.

  18. Statistical Methods for Integrating Multiple Types of High-Throughput Data

    PubMed Central

    Xie, Yang; Ahn, Chul

    2011-01-01

    Large-scale sequencing, copy number, mRNA, and protein data hold great promise for biomedical research, while posing great challenges to data management and analysis. Integrating different types of high-throughput data from diverse sources can increase the statistical power of data analysis and provide deeper biological understanding. This chapter uses two biomedical research examples to illustrate the urgent need for reliable and robust methods for integrating heterogeneous data. We then introduce and review some recently developed statistical methods for integrative analysis, for both statistical inference and classification purposes. Finally, we present some useful public-access databases and program code to facilitate integrative analysis in practice. PMID:20652519

  19. Method for Visually Integrating Multiple Data Acquisition Technologies for Real Time and Retrospective Analysis

    NASA Technical Reports Server (NTRS)

    Bogart, Edward H. (Inventor); Pope, Alan T. (Inventor)

    2000-01-01

    A system for the display of multiple physiological measurements on a single video display terminal is provided. A subject is monitored by a plurality of instruments which feed data to a computer programmed to receive data, calculate data products such as an index of engagement and heart rate, and display the data in a graphical format simultaneously on a single video display terminal. In addition, live video representing the view of the subject and the experimental setup may be integrated into the single data display. The display may be recorded on a standard video tape recorder for retrospective analysis.

  20. Determination of cloud effective particle size from the multiple-scattering effect on lidar integration-method temperature measurements.

    PubMed

    Reichardt, Jens; Reichardt, Susanne

    2006-04-20

    A method is presented that permits determination of cloud effective particle size from Raman- or Rayleigh-integration temperature measurements; it exploits the dependence of the multiple-scattering contributions to the lidar signals from heights above the cloud on the particle size of the cloud. Independent temperature information is needed for the size determination. Using Raman-integration temperatures, the technique is applied to cirrus measurements. The magnitude of the multiple-scattering effect and the above-cloud lidar signal strength limit the method's range of applicability to cirrus optical depths from 0.1 to 0.5. Our work implies that records of stratospheric temperature obtained with lidar may be affected by multiple scattering in clouds up to heights of 30 km and beyond.

  1. Accurate and efficient Nyström volume integral equation method for the Maxwell equations for multiple 3-D scatterers

    NASA Astrophysics Data System (ADS)

    Chen, Duan; Cai, Wei; Zinser, Brian; Cho, Min Hyung

    2016-09-01

    In this paper, we develop an accurate and efficient Nyström volume integral equation (VIE) method for the Maxwell equations for a large number of 3-D scatterers. The Cauchy principal values that arise from the VIE are computed accurately using a finite-size exclusion volume together with explicit correction integrals consisting of removable singularities. The hyper-singular integrals are computed using interpolated quadrature formulae with tensor-product quadrature nodes for cubes, spheres, and cylinders, shapes that are frequently encountered in the design of meta-materials. The resulting Nyström VIE method is shown to have high accuracy with a small number of collocation points and demonstrates p-convergence for computing the electromagnetic scattering of these objects. Numerical calculations of multiple scatterers of cubic, spherical, and cylindrical shapes validate the efficiency and accuracy of the proposed method.

  2. Hybrid MCDA Methods to Integrate Multiple Ecosystem Services in Forest Management Planning: A Critical Review

    NASA Astrophysics Data System (ADS)

    Uhde, Britta; Hahn, W. Andreas; Griess, Verena C.; Knoke, Thomas

    2015-08-01

    Multi-criteria decision analysis (MCDA) is a decision aid frequently used in forest management planning. It includes the evaluation of multiple criteria such as the production of timber and non-timber forest products and the tangible as well as intangible values of ecosystem services (ES). Hence, it is beneficial compared with methods that take a purely financial perspective, and MCDA methods are increasingly popular in the wide field of sustainability assessment. Hybrid approaches combine MCDA with, potentially, other decision-making techniques to make use of their individual benefits, leading to a more holistic view of the actual consequences that come with certain decisions. This review provides a comprehensive overview of hybrid approaches used in forest management planning. Today, the scientific world faces increasing challenges regarding the evaluation of ES and the trade-offs between them, for example between provisioning and regulating services. As the preferences of multiple stakeholders are essential to improve the decision process in multi-purpose forestry, participatory and hybrid approaches turn out to be of particular importance. Accordingly, hybrid methods show great potential for becoming most relevant in future decision making. Based on this review, the development of models for use in planning processes should focus on participatory modeling and the consideration of uncertainty regarding available information.

  3. Hybrid MCDA Methods to Integrate Multiple Ecosystem Services in Forest Management Planning: A Critical Review.

    PubMed

    Uhde, Britta; Hahn, W Andreas; Griess, Verena C; Knoke, Thomas

    2015-08-01

    Multi-criteria decision analysis (MCDA) is a decision aid frequently used in forest management planning. It includes the evaluation of multiple criteria such as the production of timber and non-timber forest products and the tangible as well as intangible values of ecosystem services (ES). Hence, it is beneficial compared with methods that take a purely financial perspective, and MCDA methods are increasingly popular in the wide field of sustainability assessment. Hybrid approaches combine MCDA with, potentially, other decision-making techniques to make use of their individual benefits, leading to a more holistic view of the actual consequences that come with certain decisions. This review provides a comprehensive overview of hybrid approaches used in forest management planning. Today, the scientific world faces increasing challenges regarding the evaluation of ES and the trade-offs between them, for example between provisioning and regulating services. As the preferences of multiple stakeholders are essential to improve the decision process in multi-purpose forestry, participatory and hybrid approaches turn out to be of particular importance. Accordingly, hybrid methods show great potential for becoming most relevant in future decision making. Based on this review, the development of models for use in planning processes should focus on participatory modeling and the consideration of uncertainty regarding available information.

  4. A novel finite element method for the modeling of multiple reflections in photonic integrated circuits

    NASA Astrophysics Data System (ADS)

    Ging, John A.; O'Dowd, Ronan

    2006-04-01

    The complex transverse waveguide geometries of integrated photonic devices warrant the application of intricate numerical methods when modelling these types of planar lightwave circuits (PLCs). To aggravate the problem, difficulties also arise when dealing with back-reflections at interfaces, counter-propagating signals, and other associated losses. Routines such as the finite element method (FEM) and finite difference method (FDM) are utilised in simulating the propagation of light through the core waveguide structures of these PLCs. In this paper a novel FEM reliant upon device cross-sectional symmetry is proposed, developed, and discussed with regard to its advantages in precision over other procedures. Upon completion of this analysis, the propagation constant and effective refractive indices are known, and extensions may be employed to accurately model propagation through the device and outline any reflections or losses that may ensue. A clear and concise review of some of the foremost available schemes is also presented here. These techniques, such as the bidirectional eigenmode propagation (BEP) method and the beam propagation method (BPM), are discussed, and an effective and precise 3-dimensional model is presented. Given the myriad of available techniques and algorithms, a comparative study is drawn, listing the advantages and shortcomings of the major methods while suggesting improvements to their application. Necessary considerations such as simulation time and the trade-off between computer memory requirements and solution accuracy are also acknowledged.

  5. CytoSolve: A Scalable Computational Method for Dynamic Integration of Multiple Molecular Pathway Models.

    PubMed

    Ayyadurai, V A Shiva; Dewey, C Forbes

    2011-03-01

    A grand challenge of computational systems biology is to create a molecular pathway model of the whole cell. Current approaches involve merging smaller molecular pathway models' source codes to create a large monolithic model (computer program) that runs on a single computer. Such a larger model is difficult, if not impossible, to maintain given ongoing updates to the source codes of the smaller models. This paper describes a new system called CytoSolve that dynamically integrates computations of smaller models that can run in parallel across different machines without the need to merge the source codes of the individual models. This approach is demonstrated on the classic Epidermal Growth Factor Receptor (EGFR) model of Kholodenko. The EGFR model is split into four smaller models and each smaller model is distributed on a different machine. Results from four smaller models are dynamically integrated to generate identical results to the monolithic EGFR model running on a single machine. The overhead for parallel and dynamic computation is approximately twice that of a monolithic model running on a single machine. The CytoSolve approach provides a scalable method since smaller models may reside on any computer worldwide, where the source code of each model can be independently maintained and updated.
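The central idea above, advancing a shared state by dynamically combining contributions from independently maintained submodels rather than merging their source code, can be sketched with a toy controller. The two submodels, species names, and rates below are hypothetical illustrations, not components of the EGFR model or the CytoSolve system itself:

```python
def integrate_models(models, state, dt, steps):
    """Advance a shared species-concentration dict by summing, at each
    time step, the rate contributions returned independently by each
    submodel -- a toy stand-in for coupling models without merging code.
    Each submodel maps the current state to {species: d/dt contribution}."""
    for _ in range(steps):
        deriv = {s: 0.0 for s in state}
        for model in models:                  # these calls could run in parallel
            for species, rate in model(state).items():
                deriv[species] += rate
        for s in state:
            state[s] += dt * deriv[s]         # explicit Euler update
    return state

# Two hypothetical submodels sharing species "A" and "B":
production = lambda st: {"A": 1.0}                            # constant source of A
conversion = lambda st: {"A": -0.5 * st["A"], "B": 0.5 * st["A"]}
final = integrate_models([production, conversion], {"A": 0.0, "B": 0.0},
                         dt=0.01, steps=1000)
```

Because each submodel only reads the shared state and reports its own rates, either one can be replaced or updated without touching the other, which is the maintainability argument the abstract makes.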

  6. Integrating Multiple Geophysical Methods to Quantify Alpine Groundwater-Surface Water Interactions: Cordillera Blanca, Peru

    NASA Astrophysics Data System (ADS)

    Glas, R. L.; Lautz, L.; McKenzie, J. M.; Baker, E. A.; Somers, L. D.; Aubry-Wake, C.; Wigmore, O.; Mark, B. G.; Moucha, R.

    2016-12-01

    Groundwater-surface water interactions in alpine catchments are often poorly understood as groundwater and hydrologic data are difficult to acquire in these remote areas. The Cordillera Blanca of Peru is a region where dry-season water supply is increasingly stressed due to the accelerated melting of glaciers throughout the range, affecting millions of people country-wide. The alpine valleys of the Cordillera Blanca have shown potential for significant groundwater storage and discharge to valley streams, which could buffer the dry-season variability of streamflow throughout the watershed as glaciers continue to recede. Known as pampas, the clay-rich, low-relief valley bottoms are interfingered with talus deposits, providing a likely pathway for groundwater recharged at the valley edges to be stored and slowly released to the stream throughout the year by springs. Multiple geophysical methods were used to determine areas of groundwater recharge and discharge as well as aquifer geometry of the pampa system. Seismic refraction tomography, vertical electrical sounding (VES), electrical resistivity tomography (ERT), and horizontal-to-vertical spectral ratio (HVSR) seismic methods were used to determine the physical properties of the unconsolidated valley sediments, the depth to saturation, and the depth to bedrock for a representative section of the Quilcayhuanca Valley in the Cordillera Blanca. Depth to saturation and lithological boundaries were constrained by comparing geophysical results to continuous records of water levels and sediment core logs from a network of seven piezometers installed to depths of up to 6 m. Preliminary results show an average depth to bedrock for the study area of 25 m, which varies spatially along with water table depths across the valley. The conceptual model of groundwater flow and storage derived from these geophysical data will be used to inform future groundwater flow models of the area, allowing for the prediction of groundwater

  7. Multiple methods for multiple futures: Integrating qualitative scenario planning and quantitative simulation modeling for natural resource decision making

    USGS Publications Warehouse

    Symstad, Amy; Fisichelli, Nicholas A.; Miller, Brian; Rowland, Erika; Schuurman, Gregor W.

    2017-01-01

    Scenario planning helps managers incorporate climate change into their natural resource decision making through a structured “what-if” process of identifying key uncertainties and potential impacts and responses. Although qualitative scenarios, in which ecosystem responses to climate change are derived via expert opinion, often suffice for managers to begin addressing climate change in their planning, this approach may face limits in resolving the responses of complex systems to altered climate conditions. In addition, this approach may fall short of the scientific credibility managers often require to take actions that differ from current practice. Quantitative simulation modeling of ecosystem response to climate conditions and management actions can provide this credibility, but its utility is limited unless the modeling addresses the most impactful and management-relevant uncertainties and incorporates realistic management actions. We use a case study to compare and contrast management implications derived from qualitative scenario narratives and from scenarios supported by quantitative simulations. We then describe an analytical framework that refines the case study’s integrated approach in order to improve applicability of results to management decisions. The case study illustrates the value of an integrated approach for identifying counterintuitive system dynamics, refining understanding of complex relationships, clarifying the magnitude and timing of changes, identifying and checking the validity of assumptions about resource responses to climate, and refining management directions. Our proposed analytical framework retains qualitative scenario planning as a core element because its participatory approach builds understanding for both managers and scientists, lays the groundwork to focus quantitative simulations on key system dynamics, and clarifies the challenges that subsequent decision making must address.

  8. Statistical Methods in Integrative Genomics.

    PubMed

    Richardson, Sylvia; Tseng, George C; Sun, Wei

    2016-06-01

    Statistical methods in integrative genomics aim to answer important biological questions by jointly analyzing multiple types of genomic data (vertical integration) or aggregating the same type of data across multiple studies (horizontal integration). In this article, we introduce different types of genomic data and data resources, and then review statistical methods of integrative genomics, with emphasis on the motivation and rationale of these methods. We conclude with some summary points and future research directions.
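
As a concrete, hypothetical example of horizontal integration (not taken from the article), Fisher's method aggregates per-study p-values for one gene; for an even number of degrees of freedom the chi-square tail has a closed form, so no statistics library is needed:

```python
import math

def fisher_combine(p_values):
    """Combine per-study p-values for one gene (horizontal integration).

    Fisher's statistic X = -2 * sum(ln p_i) follows a chi-square
    distribution with 2k degrees of freedom under the global null.
    """
    k = len(p_values)
    stat = -2.0 * sum(math.log(p) for p in p_values)
    # Survival function of chi-square with 2k df; for even df it has
    # the closed form exp(-x/2) * sum_{j<k} (x/2)^j / j!
    half = stat / 2.0
    term, tail = 1.0, 0.0
    for j in range(k):
        tail += term
        term *= half / (j + 1)
    return math.exp(-half) * tail

# Three hypothetical studies reporting the same gene:
combined = fisher_combine([0.04, 0.10, 0.03])
```

Each study is individually borderline, but the combined p-value (about 0.006) is stronger than any single one, which is the point of pooling evidence across studies.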

  9. Statistical Methods in Integrative Genomics

    PubMed Central

    Richardson, Sylvia; Tseng, George C.; Sun, Wei

    2016-01-01

    Statistical methods in integrative genomics aim to answer important biological questions by jointly analyzing multiple types of genomic data (vertical integration) or aggregating the same type of data across multiple studies (horizontal integration). In this article, we introduce different types of genomic data and data resources, and then review statistical methods of integrative genomics, with emphasis on the motivation and rationale of these methods. We conclude with some summary points and future research directions. PMID:27482531

  10. Integrated QSAR study for inhibitors of hedgehog signal pathway against multiple cell lines: a collaborative filtering method

    PubMed Central

    2012-01-01

    Background The Hedgehog Signaling Pathway is one of the signaling pathways essential to embryonic development. Inhibitors of the Hedgehog Signaling Pathway can control cell growth and death, and the search for novel inhibitors of the pathway is in great demand. In fact, effective inhibitors could provide efficient therapies for a wide range of malignancies, and targeting such a pathway in cells represents a promising new paradigm for cell growth and death control. Current research mainly focuses on the syntheses of inhibitors that are cyclopamine derivatives, which bind specifically to the Smo protein and can be used for cancer therapy. While quantitative structure-activity relationship (QSAR) studies have been performed for these compounds among different cell lines, none of them have achieved acceptable results in the prediction of activity values of new compounds. In this study, we proposed a novel collaborative QSAR model for inhibitors of the Hedgehog Signaling Pathway by integrating information from multiple cell lines. Such a model is expected to substantially improve on QSAR models built from single cell lines, and to provide useful clues for developing clinically effective inhibitors and modifications of parent lead compounds targeting the Hedgehog Signaling Pathway. Results In this study, we have presented: (1) a collaborative QSAR model, which integrates information among multiple cell lines to boost the QSAR results, rather than modeling only a single cell line. Our experiments have shown that the performance of our model is significantly better than single cell line QSAR methods; and (2) an efficient feature selection strategy under such a collaborative environment, which can derive the commonly important features related to all the given cell lines, while simultaneously showing their specific contributions to a specific cell line. Based on feature selection results, we have
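
The collaborative idea, in which cell lines share a common structure-activity signal while keeping cell-line-specific components, can be sketched with a toy shared-slope regression. The cell-line names and data below are invented, and the paper's actual model is far more elaborate; this only illustrates the "share what is common, keep what is specific" structure:

```python
def fit_shared_slope(tasks, n_iter=200):
    """Tiny 'collaborative' regression: all cell lines share one slope
    (common structure-activity trend) while each keeps its own
    intercept (cell-line-specific offset).  Fit by alternating
    least-squares updates of the intercepts and the shared slope."""
    w = 0.0
    b = {t: 0.0 for t in tasks}
    for _ in range(n_iter):
        # Update each intercept given the current shared slope.
        for t, (xs, ys) in tasks.items():
            b[t] = sum(y - w * x for x, y in zip(xs, ys)) / len(xs)
        # Update the shared slope given the intercepts (least squares
        # pooled over all cell lines).
        num = sum((y - b[t]) * x for t, (xs, ys) in tasks.items()
                  for x, y in zip(xs, ys))
        den = sum(x * x for xs, _ in tasks.values() for x in xs)
        w = num / den
    return w, b

# Two hypothetical cell lines with the same slope, shifted potency.
tasks = {
    "lineA": ([0.0, 1.0, 2.0], [1.0, 3.0, 5.0]),    # y = 2x + 1
    "lineB": ([0.0, 1.0, 2.0], [0.5, 2.5, 4.5]),    # y = 2x + 0.5
}
w, b = fit_shared_slope(tasks)
```

On this noiseless toy data the alternation converges to the shared slope 2 and the two offsets, showing how pooling the slope estimate borrows strength across cell lines.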

  11. Effects of complex internal structures on rheology of multiple emulsions particles in 2D from a boundary integral method.

    PubMed

    Wang, Jingtao; Liu, Jinxia; Han, Junjie; Guan, Jing

    2013-02-08

    A boundary integral method is developed to investigate the effects of inner droplets and asymmetry of internal structures on the rheology of two-dimensional multiple emulsion particles with arbitrary numbers of layers and droplets within each layer. Under a modest extensional flow, increases in the number of layers and inner droplets, together with collisions among inner droplets, subject the particle to stronger shears. In addition, the coalescence or release of inner droplets changes the internal structure of the multiple emulsion particles. Since the rheology of such particles is sensitive to internal structures and their changes, modeling them as core-shell particles to obtain the viscosity equation of a single particle should be modified by introducing a time-dependent volume fraction Φ(t) of the core instead of a fixed Φ. An asymmetric internal structure induces an oriented contact and merging of the outer and inner interfaces. The start time of the interface merging is controlled by adjusting the viscosity ratio and enhancing the asymmetry, which is promising for the controlled release of inner droplets through hydrodynamics for targeted drug delivery.

  12. Acceleration of canonical molecular dynamics simulations using macroscopic expansion of the fast multipole method combined with the multiple timestep integrator algorithm

    NASA Astrophysics Data System (ADS)

    Kawata, Masaaki; Mikami, Masuhiro

    A canonical molecular dynamics (MD) simulation was accelerated by using an efficient implementation of the multiple timestep integrator algorithm combined with the macroscopic expansion of the periodic fast multipole method (MEFMM) for both Coulombic and van der Waals interactions. Although a significant reduction in computational cost had been obtained previously by using the integrated method in which the MEFMM was used only to calculate Coulombic interactions (Kawata, M., and Mikami, M., 2000, J. Comput. Chem., in press), the extension of this method to include van der Waals interactions yielded further acceleration of the overall MD calculation by a factor of about two. Compared with conventional methods, such as the velocity-Verlet algorithm combined with the Ewald method (timestep of 0.25 fs), the speedup from the extended integrated method amounted to a factor of 500 for a 100 ps simulation. Therefore, the extended method substantially reduces the computational effort of large-scale MD simulations.
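
The multiple-timestep idea can be sketched as a reversible RESPA-style integrator (a generic textbook scheme, not the authors' implementation): slow, expensive forces are applied as impulses at the outer step, while fast forces take many inner velocity-Verlet substeps. The toy forces below are invented for illustration:

```python
def respa_step(x, v, f_fast, f_slow, dt_outer, n_inner, m=1.0):
    """One reversible multiple-timestep (r-RESPA-style) step.

    The slow force is applied as half-kicks at the outer timestep,
    while the fast force is integrated with n_inner velocity-Verlet
    substeps: cheap short-range forces are evaluated often,
    expensive long-range ones rarely."""
    dt_inner = dt_outer / n_inner
    v += 0.5 * dt_outer * f_slow(x) / m          # slow half-kick
    for _ in range(n_inner):                     # fast inner loop
        v += 0.5 * dt_inner * f_fast(x) / m
        x += dt_inner * v
        v += 0.5 * dt_inner * f_fast(x) / m
    v += 0.5 * dt_outer * f_slow(x) / m          # slow half-kick
    return x, v

# Toy 1D system: stiff spring (fast) plus a weak constant pull (slow).
fast = lambda x: -100.0 * x
slow = lambda x: -0.1
x, v = 1.0, 0.0
for _ in range(1000):
    x, v = respa_step(x, v, fast, slow, dt_outer=0.02, n_inner=10)
# Total energy: kinetic + spring potential (50 x^2) + linear potential.
energy = 0.5 * v * v + 50.0 * x * x + 0.1 * x
```

Despite the outer step being 10 times the inner step, the total energy stays close to its initial value of 50.1, which is why such splittings pay off when the slow force (here a stand-in for long-range interactions) is the expensive one.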

  13. Formulation of an explicit-multiple-time-step time integration method for use in a global primitive equation grid model

    NASA Technical Reports Server (NTRS)

    Chao, W. C.

    1982-01-01

    With appropriate modifications, a recently proposed explicit-multiple-time-step scheme (EMTSS) is incorporated into the UCLA model. In this scheme, the linearized terms in the governing equations that generate the gravity waves are split into different vertical modes. Each mode is integrated with an optimal time step, and at periodic intervals these modes are recombined. The other terms are integrated with a time step dictated by the CFL condition for low-frequency waves. This large time step requires a special modification of the advective terms in the polar region to maintain stability. Test runs for 72 h show that EMTSS is a stable, efficient and accurate scheme.

  14. Multiple detectors "Influence Method".

    PubMed

    Rios, I J; Mayer, R E

    2016-05-01

    The "Influence Method" is conceived for the absolute determination of a nuclear particle flux in the absence of known detector efficiency and without the need to register coincidences of any kind. The method exploits the influence of the presence of one detector on the count rate of another detector when they are placed one behind the other, and defines statistical estimators for the absolute number of incident particles and for the efficiency (Rios and Mayer, 2015a). Its detailed mathematical description was recently published (Rios and Mayer, 2015b), and its practical implementation in the measurement of a moderated neutron flux arising from an isotopic neutron source was exemplified in Rios and Mayer (2016). With the objective of further reducing the measurement uncertainties, in this article we extend the method to the case of multiple detectors placed one behind the other. The new estimators for the number of particles and the detection efficiency are herein derived. Copyright © 2016 Elsevier Ltd. All rights reserved.
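
Under a deliberately simplified absorption model (a particle detected in the front detector never reaches the rear one; this is an illustration, not the estimators derived in the paper), the two count rates of a two-detector stack already determine both the efficiency and the incident number:

```python
import random

random.seed(42)

def simulate_counts(n_particles, eff):
    """Idealized two-detector stack: each particle is detected in the
    front detector with probability eff and absorbed there; otherwise
    it reaches the rear detector, where it is detected with the same
    probability.  (A simplifying assumption for illustration only.)"""
    c1 = c2 = 0
    for _ in range(n_particles):
        if random.random() < eff:
            c1 += 1
        elif random.random() < eff:
            c2 += 1
    return c1, c2

c1, c2 = simulate_counts(100000, 0.30)
# Under this model E[c1] = N*eff and E[c2] = N*eff*(1-eff), so:
eff_hat = 1.0 - c2 / c1            # estimated efficiency
n_hat = c1 * c1 / (c1 - c2)        # estimated incident particles
```

Neither the efficiency nor the incident number was given to the estimators, yet both are recovered from the "influence" of the front detector on the rear one, which is the spirit of the method.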

  15. Bayesian correlated clustering to integrate multiple datasets

    PubMed Central

    Kirk, Paul; Griffin, Jim E.; Savage, Richard S.; Ghahramani, Zoubin; Wild, David L.

    2012-01-01

    Motivation: The integration of multiple datasets remains a key challenge in systems biology and genomic medicine. Modern high-throughput technologies generate a broad array of different data types, providing distinct—but often complementary—information. We present a Bayesian method for the unsupervised integrative modelling of multiple datasets, which we refer to as MDI (Multiple Dataset Integration). MDI can integrate information from a wide range of different datasets and data types simultaneously (including the ability to model time series data explicitly using Gaussian processes). Each dataset is modelled using a Dirichlet-multinomial allocation (DMA) mixture model, with dependencies between these models captured through parameters that describe the agreement among the datasets. Results: Using a set of six artificially constructed time series datasets, we show that MDI is able to integrate a significant number of datasets simultaneously, and that it successfully captures the underlying structural similarity between the datasets. We also analyse a variety of real Saccharomyces cerevisiae datasets. In the two-dataset case, we show that MDI’s performance is comparable with the present state-of-the-art. We then move beyond the capabilities of current approaches and integrate gene expression, chromatin immunoprecipitation–chip and protein–protein interaction data, to identify a set of protein complexes for which genes are co-regulated during the cell cycle. Comparisons to other unsupervised data integration techniques—as well as to non-integrative approaches—demonstrate that MDI is competitive, while also providing information that would be difficult or impossible to extract using other methods. Availability: A Matlab implementation of MDI is available from http://www2.warwick.ac.uk/fac/sci/systemsbiology/research/software/. Contact: D.L.Wild@warwick.ac.uk Supplementary information: Supplementary data are available at Bioinformatics online. PMID

  16. Fast and Broadband Signal Integrity Analysis of Multiple Vias in Heterogeneous 3D IC and Die-Level Packaging by Using Generalized Foldy-Lax Scattering Method

    NASA Astrophysics Data System (ADS)

    Chang, Xin

    This dissertation proposal concerns the use of fast, broadband full-wave electromagnetic methods for modeling high-speed interconnects (e.g., vertical vias and horizontal traces) and passive components (e.g., decoupling capacitors) in PCB and package structures, 3D ICs, die-level packaging, and SIW-based devices, to effectively model the signal integrity (SI) and power integrity (PI) aspects of designs. The main contribution of this thesis is a novel methodology that hybridizes a fast full-wave method based on the Foldy-Lax multiple scattering equations, a method of moments (MoM) based 1D technology, geometry decomposition based on mode decoupling, and cavity mode expansions, to model and simulate the electromagnetic scattering effects of irregular power/ground planes, multiple vias, and traces, for fast and accurate link-level simulation of multilayer electronic structures. In detail, the interior massively coupled multiple-via problem is modeled mostly analytically by using the Foldy-Lax multiple scattering equations. The dyadic Green's functions of the magnetic field are expressed in terms of waveguide modes in the vertical direction and vector cylindrical wave expansions or cavity mode expansions in the horizontal direction, combined with 2D MoM realized by the 1D technology. For the incident field in the case of vias in an arbitrarily shaped antipad in a finite large cavity/waveguide, the exciting and scattering field coefficients are calculated based on a transformation that converts surface integration of the magnetic surface currents in the antipad into 1D line integration of the surface charges on the vias and on the ground plane. The geometry decomposition method is applied to model and integrate both the vertical and horizontal interconnects/traces in arbitrarily shaped power/ground planes. Moreover, a new form of the multiple scattering equations is derived for solving coupling effects among mixed metallic

  17. PHYLOViZ 2.0: providing scalable data integration and visualization for multiple phylogenetic inference methods.

    PubMed

    Nascimento, Marta; Sousa, Adriano; Ramirez, Mário; Francisco, Alexandre P; Carriço, João A; Vaz, Cátia

    2017-01-01

    High-throughput sequencing provides a cost-effective means of generating high-resolution data for hundreds or even thousands of strains, and is rapidly superseding methodologies based on a few genomic loci. The wealth of genomic data deposited in public databases such as the Sequence Read Archive/European Nucleotide Archive provides a powerful resource for evolutionary analysis and epidemiological surveillance. However, many of the analysis tools currently available do not scale well to these large datasets, nor provide the means to fully integrate ancillary data. Here we present PHYLOViZ 2.0, an extension of the PHYLOViZ tool, a platform-independent Java tool that allows phylogenetic inference and data visualization for large datasets of sequence-based typing methods, including Single Nucleotide Polymorphism (SNP) and whole genome/core genome Multilocus Sequence Typing (wg/cgMLST) analysis. PHYLOViZ 2.0 incorporates new data analysis algorithms and new visualization modules, as well as the capability of saving projects for subsequent work or for dissemination of results. Availability: http://www.phyloviz.net/ (licensed under GPLv3). Contact: cvaz@inesc-id.pt Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
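
To illustrate the kind of sequence-based typing analysis such tools visualize, here is a minimal sketch: allelic profiles compared by Hamming distance and linked into a minimum spanning tree. The profiles are invented, and goeBURST's tie-break rules (which PHYLOViZ implements) are omitted:

```python
def hamming(p, q):
    """Number of differing loci between two allelic profiles."""
    return sum(a != b for a, b in zip(p, q))

def mst_edges(profiles):
    """Prim's algorithm on the complete graph of profiles, with edge
    weight = allelic Hamming distance -- the basic idea behind
    minimum-spanning-tree views of MLST data."""
    names = list(profiles)
    in_tree = {names[0]}
    edges = []
    while len(in_tree) < len(names):
        d, u, v = min(
            (hamming(profiles[u], profiles[v]), u, v)
            for u in in_tree for v in names if v not in in_tree
        )
        edges.append((u, v, d))
        in_tree.add(v)
    return edges

# Hypothetical 7-locus sequence types:
profiles = {
    "ST1": (1, 1, 1, 1, 1, 1, 1),
    "ST2": (1, 1, 1, 1, 1, 1, 2),   # single-locus variant of ST1
    "ST3": (1, 1, 1, 1, 1, 2, 2),   # single-locus variant of ST2
    "ST4": (3, 2, 1, 4, 1, 2, 2),   # more distant type
}
tree = mst_edges(profiles)
```

The tree links ST1-ST2-ST3 through single-locus variants and attaches the distant ST4 to its closest relative, which is how clonal complexes emerge in such visualizations.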

  18. Integration of microchip electrophoresis with electrochemical detection using an epoxy-based molding method to embed multiple electrode materials.

    PubMed

    Johnson, Alicia S; Selimovic, Asmira; Martin, R Scott

    2011-11-01

    This paper describes the use of epoxy-encapsulated electrodes to integrate microchip-based electrophoresis with electrochemical detection. Devices with various electrode combinations can easily be developed. This includes a palladium decoupler with a downstream working electrode material of either gold, mercury/gold, platinum, glassy carbon, or a carbon fiber bundle. Additional device components such as the platinum wires for the electrophoresis separation and the counter electrode for detection can also be integrated into the epoxy base. The effect of the decoupler configuration was studied in terms of the separation performance, detector noise, and the ability to analyze samples of a high ionic strength. The ability of both glassy carbon and carbon fiber bundle electrodes to analyze a complex mixture was demonstrated. It was also shown that a PDMS-based valving microchip can be used along with the epoxy-embedded electrodes to integrate microdialysis sampling with microchip electrophoresis and electrochemical detection, with the microdialysis tubing also being embedded in the epoxy substrate. This approach enables one to vary the detection electrode material as desired in a manner where the electrodes can be polished and modified as is done with electrochemical flow cells used in liquid chromatography.

  19. A fuzzy integral method based on the ensemble of neural networks to analyze fMRI data for cognitive state classification across multiple subjects.

    PubMed

    Cacha, L A; Parida, S; Dehuri, S; Cho, S-B; Poznanski, R R

    2016-12-01

    The huge number of voxels in fMRI over time poses a major challenge for effective analysis. Fast, accurate, and reliable classifiers are required for estimating the decoding accuracy of brain activities. Although machine-learning classifiers seem promising, individual classifiers have their own limitations. To address these limitations, the present paper proposes a method based on an ensemble of neural networks to analyze fMRI data for cognitive state classification across multiple subjects. In addition, the fuzzy integral (FI) approach has been employed as an efficient tool for combining different classifiers. The FI approach led to the development of a classifier ensemble technique that performs better than any single classifier by reducing misclassification, bias, and variance. The proposed method successfully classified the different cognitive states for multiple subjects with high classification accuracy. Comparison of the performance improvement of the ensemble neural network method with that of the individual neural networks strongly points toward the usefulness of the proposed method.
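
One concrete form of classifier fusion by fuzzy integral is the Sugeno integral, which combines ranked classifier confidences with the worth of the coalition that produced them. The network names, scores, and fuzzy measure below are invented for illustration; the paper's measure is learned from data:

```python
def sugeno_integral(scores, measure):
    """Sugeno fuzzy integral of classifier confidence scores.

    scores:  {classifier_name: confidence in [0, 1]} for one class.
    measure: fuzzy measure g, a dict mapping frozensets of classifier
             names to worths in [0, 1], with g(all classifiers) = 1.
    Result: max over the ranking of min(score, worth of the coalition
    of classifiers at least that confident)."""
    ranked = sorted(scores, key=scores.get, reverse=True)
    best = 0.0
    subset = frozenset()
    for name in ranked:
        subset = subset | {name}
        best = max(best, min(scores[name], measure[subset]))
    return best

# Three hypothetical networks scoring one cognitive state for a scan;
# the measure encodes how much each coalition of networks is trusted.
g = {
    frozenset({"net1"}): 0.4,
    frozenset({"net2"}): 0.3,
    frozenset({"net3"}): 0.5,
    frozenset({"net1", "net2"}): 0.6,
    frozenset({"net1", "net3"}): 0.8,
    frozenset({"net2", "net3"}): 0.7,
    frozenset({"net1", "net2", "net3"}): 1.0,
}
fused = sugeno_integral({"net1": 0.9, "net2": 0.4, "net3": 0.7}, g)
```

Here the fused score of 0.7 comes from the {net1, net3} coalition: net1 alone is confident but not trusted enough, while the pair is both confident and trusted, which is the kind of interaction a plain weighted average cannot express.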

  20. Integrated College Methods Courses.

    ERIC Educational Resources Information Center

    Freeland, Kent; Willis, Melinda

    This study compared the performance of two groups of preservice teachers at Kentucky's Morehead State University. One group had taken four of their methods courses (reading, language arts, social studies, and mathematics) in an integrated fashion from four faculty members. This group was termed the block group. The other group (the nonblock group)…

  1. Multiple-stage integrating accelerometer

    DOEpatents

    Devaney, H.F.

    1984-06-27

    An accelerometer assembly is provided for use in activating a switch in response to multiple acceleration pulses in series. The accelerometer includes a housing forming a chamber. An inertial mass or piston is slidably disposed in the chamber and spring biased toward a first or reset position. A damping system is also provided to damp piston movement in response to first and subsequent acceleration pulses. Additionally, a cam, including a Z-shaped slot, and cooperating follower pin slidably received therein are mounted to the piston and the housing. The middle or cross-over leg of the Z-shaped slot cooperates with the follower pin to block or limit piston movement and prevent switch activation in response to a lone acceleration pulse. The switch of the assembly is only activated after two or more separate acceleration pulses are sensed and the piston reaches the end of the chamber opposite the reset position.

  2. Multiple-stage integrating accelerometer

    DOEpatents

    Devaney, Howard F.

    1986-01-01

    An accelerometer assembly is provided for use in activating a switch in response to multiple acceleration pulses in series. The accelerometer includes a housing forming a chamber. An inertial mass or piston is slidably disposed in the chamber and spring biased toward a first or reset position. A damping system is also provided to damp piston movement in response to first and subsequent acceleration pulses. Additionally, a cam, including a Z-shaped slot, and cooperating follower pin slidably received therein are mounted to the piston and the housing. The middle or cross-over leg of the Z-shaped slot cooperates with the follower pin to block or limit piston movement and prevent switch activation in response to a lone acceleration pulse. The switch of the assembly is only activated after two or more separate acceleration pulses are sensed and the piston reaches the end of the chamber opposite the reset position.

  3. Improving Inferences from Multiple Methods.

    ERIC Educational Resources Information Center

    Shotland, R. Lance; Mark, Melvin M.

    1987-01-01

    Multiple evaluation methods (MEMs) can cause an inferential challenge, although there are strategies to strengthen inferences. Practical and theoretical issues involved in the use by social scientists of MEMs, three potential problems in drawing inferences from MEMs, and short- and long-term strategies for alleviating these problems are outlined.…

  5. Method for deploying multiple spacecraft

    NASA Technical Reports Server (NTRS)

    Sharer, Peter J. (Inventor)

    2007-01-01

    A method for deploying multiple spacecraft is disclosed. The method can be used in a situation where a first celestial body is being orbited by a second celestial body. The spacecraft are loaded onto a single spaceship, which is launched from the second celestial body towards a third celestial body. The spacecraft are separated from each other while en route to the third celestial body. Each of the spacecraft is then subjected to the gravitational field of the third celestial body, and each assumes a different, independent orbit about the first celestial body. In those situations where the spacecraft are launched from Earth, the Sun can act as the first celestial body, the Earth can act as the second celestial body, and the Moon can act as the third celestial body.

  6. Accelerated Adaptive Integration Method

    PubMed Central

    2015-01-01

    Conformational changes that occur upon ligand binding may be too slow to observe on the time scales routinely accessible using molecular dynamics simulations. The adaptive integration method (AIM) leverages the notion that when a ligand is either fully coupled or decoupled, according to λ, barrier heights may change, making some conformational transitions more accessible at certain λ values. AIM adaptively changes the value of λ in a single simulation so that conformations sampled at one value of λ seed the conformational space sampled at another λ value. Adapting the value of λ throughout a simulation, however, does not resolve issues in sampling when barriers remain high regardless of the λ value. In this work, we introduce a new method, called Accelerated AIM (AcclAIM), in which the potential energy function is flattened at intermediate values of λ, promoting the exploration of conformational space as the ligand is decoupled from its receptor. We show, with both a simple model system (Bromocyclohexane) and the more complex biomolecule Thrombin, that AcclAIM is a promising approach to overcome high barriers in the calculation of free energies, without the need for any statistical reweighting or additional processors. PMID:24780083
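
The λ-flattening idea can be made concrete with a toy Metropolis sampler (not the paper's implementation): a double-well "binding" potential has its barrier scaled down at intermediate λ, so the walker crosses between wells while λ sits mid-ladder. The potential, schedule, and parameters below are all invented:

```python
import math
import random

random.seed(1)

def u(x, lam, barrier=5.0, flatten=0.9):
    """Double-well energy whose barrier is scaled down at intermediate
    lambda values -- the flattening idea of AcclAIM in toy form.  The
    scale is 1 at the endpoints (lam = 0 or 1) and 0.1 at lam = 0.5."""
    scale = 1.0 - flatten * 4.0 * lam * (1.0 - lam)
    return scale * barrier * (x * x - 1.0) ** 2

x, lam = -1.0, 0.0
visited_left = visited_right = False
for step in range(20000):
    # Metropolis move in configuration space (unit temperature).
    x_new = x + random.gauss(0.0, 0.3)
    if random.random() < math.exp(min(0.0, u(x, lam) - u(x_new, lam))):
        x = x_new
    # Occasional move along the 0..1 lambda ladder.
    if step % 5 == 0:
        lam_new = min(1.0, max(0.0, lam + random.choice([-0.1, 0.1])))
        if random.random() < math.exp(min(0.0, u(x, lam) - u(x, lam_new))):
            lam = lam_new
    visited_left |= x < -0.8
    visited_right |= x > 0.8
```

With the barrier at its full height (5 kT at the endpoints) the walker would rarely cross; letting λ wander through the flattened mid-region lets it reach both wells in a modest number of steps.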

  7. Integrated Spectrophotometric Properties of Multiple Stellar Populations

    NASA Astrophysics Data System (ADS)

    Lee, Hyun-chul; Cartwright, Charles

    2016-01-01

    There is mounting evidence that almost all of the Milky Way globular clusters (MWGCs) harbor multiple stellar populations. Several earlier works have revealed that the color-magnitude diagrams of MWGCs are best reproduced by combinations of stellar populations with different ages and metallicities. However, their integrated spectrophotometric properties have not yet been validated. In this work, we employ the most up-to-date stellar evolutionary tracks and isochrones from several different groups, calculate the integrated broadband colors and spectral indices for the Milky Way globular clusters, and compare the theoretical predictions to the observations.
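
The core arithmetic of integrated photometry for a composite population is a flux-weighted sum: magnitudes are converted to fluxes, combined with the population weights, and converted back. All magnitudes and weights below are invented for illustration; the paper uses full evolutionary tracks and isochrones rather than two fixed numbers per band:

```python
import math

def combine_magnitudes(mags, weights):
    """Integrated magnitude of a composite population: convert each
    component magnitude to flux, sum with its weight, convert back."""
    flux = sum(w * 10 ** (-0.4 * m) for m, w in zip(mags, weights))
    return -2.5 * math.log10(flux)

# Hypothetical two-population cluster: a dominant older component and
# a secondary one that is redder in B-V (numbers are made up).
b = combine_magnitudes([6.50, 7.20], [0.6, 0.4])
v = combine_magnitudes([5.80, 6.30], [0.6, 0.4])
integrated_b_minus_v = b - v
```

The integrated B-V lands between the component colors (0.70 and 0.90) but not at their simple average, because the combination happens in flux, not in magnitude, which is exactly why multiple populations can shift integrated colors in non-obvious ways.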

  8. Vertically Integrated Multiple Nanowire Field Effect Transistor.

    PubMed

    Lee, Byung-Hyun; Kang, Min-Ho; Ahn, Dae-Chul; Park, Jun-Young; Bang, Tewook; Jeon, Seung-Bae; Hur, Jae; Lee, Dongil; Choi, Yang-Kyu

    2015-12-09

    A vertically integrated multiple-channel-based field-effect transistor (FET) with the highest number of nanowires ever reported is demonstrated on a bulk silicon substrate without the use of wet etching. The driving current is increased 5-fold due to the inherent vertically stacked five-level nanowires, showing good feasibility for high-performance transistors based on three-dimensional integration. The developed fabrication process, which is simple and reproducible, is used to create multiple stiction-free and uniformly sized nanowires with the aid of the one-route all-dry etching process (ORADEP). Furthermore, the proposed FET is revamped to create nonvolatile memory with the adoption of a charge-trapping layer for enhanced practicality. Thus, this research suggests an ultimate design for end-of-the-roadmap devices to overcome the limits of scaling.

  9. Is the Evaluation of the Students' Values Possible? An Integrated Approach to Determining the Weights of Students' Personal Goals Using Multiple-Criteria Methods

    ERIC Educational Resources Information Center

    Dadelo, Stanislav; Turskis, Zenonas; Zavadskas, Edmundas Kazimieras; Kacerauskas, Tomas; Dadeliene, Ruta

    2016-01-01

    To maximize the effectiveness of a decision, it is necessary to support decision-making with integrated methods. It can be assumed that subjective evaluation (considering only absolute values) is only remotely connected with the evaluation of real processes. Therefore, relying solely on these values in process management decision-making would be a…

  11. Optical multiple sample vacuum integrating sphere

    NASA Technical Reports Server (NTRS)

    Butner, C. L. (Inventor)

    1986-01-01

    An integrating sphere comprising a uniform, diffusely reflecting spherical cavity with mutually transverse input and output ports, and a linear sample transport mechanism, is described. The sample transport mechanism is secured so that multiple samples can be brought into registration with the input port, one at a time, without having to open or disassemble the apparatus when a change of sample is desired. A vacuum-tight seal is provided between the cavity and the transport mechanism. This maintains the integrity of the vacuum generated within the sphere when it is attached to the source of optical energy. The device is utilized to test emissive characteristics, such as the relative fluorescence quantum efficiency, of a dye sample placed in the path of a monochromatic optical energy source coupled to the input port, with a light detector coupled to the output port.

  12. Functional integral approach for multiplicative stochastic processes.

    PubMed

    Arenas, Zochil González; Barci, Daniel G

    2010-05-01

    We present a functional formalism to derive a generating functional for correlation functions of a multiplicative stochastic process represented by a Langevin equation. We deduce a path integral over a set of fermionic and bosonic variables without performing any time discretization. The usual prescriptions to define the Wiener integral appear in our formalism in the definition of Green's functions in the Grassmann sector of the theory. We also study nonperturbative constraints imposed by Becchi-Rouet-Stora (BRS) symmetry and supersymmetry on correlation functions. We show that the specific prescription to define the stochastic process is wholly contained in tadpole diagrams. Therefore, in a supersymmetric theory, the stochastic process is uniquely defined, since tadpole contributions cancel at all orders of perturbation theory.
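
The prescription dependence discussed in the abstract can be seen numerically: integrating the same multiplicative-noise Langevin equation with an Ito (pre-point) versus a Stratonovich (mid-point) rule on identical noise realizations gives different means. This is a toy simulation, not the field-theoretic construction of the paper:

```python
import random

random.seed(7)

def simulate(paths=2000, steps=500, b=0.5, t_final=1.0):
    """Multiplicative-noise Langevin equation dx = b x dW integrated
    with two discretization prescriptions on the *same* increments:
    Euler-Maruyama (Ito) and Heun (Stratonovich).  For this equation
    the Ito mean stays at 1, while the Stratonovich mean grows like
    exp(b^2 t / 2) -- the prescription dependence in concrete form."""
    dt = t_final / steps
    ito_sum = strat_sum = 0.0
    for _ in range(paths):
        x_i = x_s = 1.0
        for _ in range(steps):
            dw = random.gauss(0.0, dt ** 0.5)
            x_i += b * x_i * dw                 # Ito (pre-point)
            pred = x_s + b * x_s * dw           # Heun predictor
            x_s += 0.5 * b * (x_s + pred) * dw  # Stratonovich (mid-point)
        ito_sum += x_i
        strat_sum += x_s
    return ito_sum / paths, strat_sum / paths

ito_mean, strat_mean = simulate()
```

Because both schemes consume the same noise increments, the gap between the two means (about exp(0.125) - 1, i.e. 13%) is visible well above the Monte Carlo noise: the "same" stochastic process is genuinely a different process under the two prescriptions.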

  13. Integrated management of multiple reservoir field developments

    SciTech Connect

    Lyons, S.L.; Chan, H.M.; Harper, J.L.; Boyett, B.A.; Dowson, P.R.; Bette, S.

    1995-10-01

    This paper consists of two sections. The authors first describe the coupling of a pipeline network model to a reservoir simulator and then the application of this new simulator to optimize the production strategy of two Mobil field developments. Mobil's PEGASUS simulator is an integrated all-purpose reservoir simulator that handles black-oil, compositional, faulted, and naturally fractured reservoirs. The authors have extended the simulator to simultaneously model multiple reservoirs coupled with surface pipeline networks and processes. This allows them to account for the effects of geology, well placement, and surface production facilities on well deliverability in a fully integrated fashion. They have also developed a gas contract allocation system that takes the user-specified constraints, target rates, and swing factors and automatically assigns rates to the individual wells of each reservoir. This algorithm calculates the overall deliverability and automatically reduces the user-specified target rates to meet the deliverability constraints. The algorithm and solution technique are described. This enhanced simulator has been applied to model a Mobil field development in the Southern Gas Basin, offshore United Kingdom, which consists of three separate gas reservoirs connected via a pipeline network. The simulator allowed the authors to accurately determine the impact on individual reservoir and total field performance of varying the development timing of these reservoirs. Several development scenarios are shown to illustrate the capabilities of PEGASUS. Another application of this technology is in the field developments in North Sumatra, Indonesia. Here the objective is to economically optimize the development of multiple fields to feed the PT Arun LNG facility. Consideration of a range of gas compositions, well productivities, and facilities constraints in an integrated fashion results in improved management of these assets. Model specifics are discussed.
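
A highly simplified sketch of the target-reduction step in such a contract allocation system follows. The well names, rates, and the proportional sharing rule are hypothetical; the actual system also handles swing factors and other constraints:

```python
def allocate_rates(target, deliverability):
    """Assign per-well rates against a field target: rates are shared
    in proportion to each well's deliverability, and the target itself
    is cut back when total deliverability cannot meet it (the
    'automatic target reduction' idea from the abstract, simplified)."""
    total = sum(deliverability.values())
    met_target = min(target, total)       # automatic target reduction
    return {well: met_target * cap / total
            for well, cap in deliverability.items()}

wells = {"A-1": 40.0, "A-2": 25.0, "B-1": 15.0}   # MMscf/d, made up
rates = allocate_rates(100.0, wells)   # demand exceeds total deliverability
rates2 = allocate_rates(60.0, wells)   # demand below total deliverability
```

In the first call the 100 MMscf/d target is infeasible, so every well simply produces at its deliverability; in the second, the 60 MMscf/d target is shared pro rata (30 / 18.75 / 11.25).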

  14. Integrative Analysis of Prognosis Data on Multiple Cancer Subtypes

    PubMed Central

    Liu, Jin; Huang, Jian; Zhang, Yawei; Lan, Qing; Rothman, Nathaniel; Zheng, Tongzhang; Ma, Shuangge

    2014-01-01

    Summary In cancer research, profiling studies have been extensively conducted, searching for genes/SNPs associated with prognosis. Cancer is diverse. Examining the similarity and difference in the genetic basis of multiple subtypes of the same cancer can lead to a better understanding of their connections and distinctions. Classic meta-analysis methods analyze each subtype separately and then compare analysis results across subtypes. Integrative analysis methods, in contrast, analyze the raw data on multiple subtypes simultaneously and can outperform meta-analysis methods. In this study, prognosis data on multiple subtypes of the same cancer are analyzed. An AFT (accelerated failure time) model is adopted to describe survival. The genetic basis of multiple subtypes is described using the heterogeneity model, which allows a gene/SNP to be associated with prognosis of some subtypes but not others. A compound penalization method is developed to identify genes that contain important SNPs associated with prognosis. The proposed method has an intuitive formulation and is realized using an iterative algorithm. Asymptotic properties are rigorously established. Simulation shows that the proposed method has satisfactory performance and outperforms a penalization-based meta-analysis method and a regularized thresholding method. An NHL (non-Hodgkin lymphoma) prognosis study with SNP measurements is analyzed. Genes associated with the three major subtypes, namely DLBCL, FL, and CLL/SLL, are identified. The proposed method identifies genes that are different from alternatives and have important implications and satisfactory prediction performance. PMID:24766212

  15. [Ethnography as an Integrative Method].

    PubMed

    Gómez, Ángela Viviana Pérez

    2012-06-01

    Ethnography is understood from three perspectives: approach, methodology and text. In the health field, ethnography can be used not only from the standpoint of the research process, but also from the very instances of medical consultation, diagnosis and treatment. The patient appreciates being heard and understood as a subject who has her/his own story and is involved in a particular culture related to her/his own status and to the effects caused by life experiences. The literature related to ethnography, participant observation and the relationship between health and qualitative research was analyzed. There is a diversity of opinions and attitudes about ethnography, its validity and usefulness, as well as in considerations related to its method and the techniques that nourish it. Ethnography is an integrative approach that may resort to multiple tools for collecting, analyzing and interpreting data. Therefore, ethnography constitutes an option for the physician when performing individual assessment. Ethnography provides an opportunity to approach the reality of an individual or group of individuals in order to obtain information about the matter under investigation, its understanding and interpretation. Copyright © 2012 Asociación Colombiana de Psiquiatría. Publicado por Elsevier España. All rights reserved.

  16. Multiple network interface core apparatus and method

    SciTech Connect

    Underwood, Keith D; Hemmert, Karl Scott

    2011-04-26

    A network interface controller and network interface control method comprising providing a single integrated circuit as a network interface controller and employing a plurality of network interface cores on the single integrated circuit.

  17. Predicting Protein Function via Semantic Integration of Multiple Networks.

    PubMed

    Yu, Guoxian; Fu, Guangyuan; Wang, Jun; Zhu, Hailong

    2016-01-01

    Determining the biological functions of proteins is one of the key challenges in the post-genomic era. The rapidly accumulating volumes of proteomic and genomic data drive the development of computational models for automatically predicting protein function at large scale. Recent approaches focus on integrating multiple heterogeneous data sources and often obtain better results than methods that use a single data source alone. In this paper, we investigate how to integrate multiple biological data sources with the biological knowledge, i.e., Gene Ontology (GO), for protein function prediction. We propose a method, called SimNet, to Semantically integrate multiple functional association Networks derived from heterogeneous data sources. SimNet first utilizes GO annotations of proteins to capture the semantic similarity between proteins and introduces a semantic kernel based on the similarity. Next, SimNet constructs a composite network, obtained as a weighted summation of individual networks, and aligns the network with the kernel to get the weights assigned to individual networks. Then, it applies a network-based classifier on the composite network to predict protein function. Experimental results on heterogeneous proteomic data sources of Yeast, Human, Mouse, and Fly show that SimNet not only achieves better (or comparable) results than other related competitive approaches, but also takes much less time. The Matlab code of SimNet is available at https://sites.google.com/site/guoxian85/simnet.
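    A minimal sketch of the weighting idea: score each association network by its alignment with the semantic kernel and combine them by those weights. This is an illustrative simplification of SimNet's weighting step, not its actual optimization; function names are assumptions.

```python
import numpy as np

def alignment(A, K):
    # Normalized Frobenius inner product between a network and the kernel.
    return np.sum(A * K) / (np.linalg.norm(A) * np.linalg.norm(K))

def composite_network(networks, K):
    """Weight each functional association network by its alignment with
    the GO-derived semantic kernel K, then return the weighted sum."""
    w = np.array([max(alignment(A, K), 0.0) for A in networks])
    w = w / w.sum()
    composite = sum(wi * A for wi, A in zip(w, networks))
    return composite, w
```

    A network that closely matches the kernel receives a larger weight than a noisy, uninformative one, so the composite is dominated by the more reliable data sources.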

  18. A comparative study of Conroy and Monte Carlo methods applied to multiple quadratures and multiple scattering

    NASA Technical Reports Server (NTRS)

    Deepak, A.; Fluellen, A.

    1978-01-01

    An efficient numerical method of multiple quadratures, the Conroy method, is applied to the problem of computing multiple scattering contributions in the radiative transfer through realistic planetary atmospheres. A brief error analysis of the method is given and comparisons are drawn with the more familiar Monte Carlo method. Both methods are stochastic problem-solving models of a physical or mathematical process and utilize the sampling scheme for points distributed over a definite region. In the Monte Carlo scheme the sample points are distributed randomly over the integration region. In the Conroy method, the sample points are distributed systematically, such that the point distribution forms a unique, closed, symmetrical pattern which effectively fills the region of the multidimensional integration. The methods are illustrated by two simple examples: one, of multidimensional integration involving two independent variables, and the other, of computing the second order scattering contribution to the sky radiance.
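    The two sampling schemes can be contrasted on a toy two-variable integral. The rank-1 (Fibonacci) lattice below stands in for Conroy's systematically distributed, symmetrical point pattern, which differs in detail; both estimators are sketches, not the paper's implementations.

```python
import random

def f(x, y):
    return x * y  # exact integral over the unit square is 0.25

def monte_carlo(n, seed=0):
    # Sample points distributed randomly over the integration region.
    rng = random.Random(seed)
    return sum(f(rng.random(), rng.random()) for _ in range(n)) / n

def lattice(n, a):
    # Sample points distributed systematically: a rank-1 lattice whose
    # points fill the unit square in a regular pattern.
    return sum(f(i / n, (i * a / n) % 1.0) for i in range(n)) / n
```

    With comparable point counts, the systematic lattice (e.g. `lattice(377, 233)`, using consecutive Fibonacci numbers) typically attains a smaller error than the random estimate, which is the practical motivation for Conroy-style quadrature.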

  19. Multiple protocol fluorometer and method

    DOEpatents

    Kolber, Zbigniew S.; Falkowski, Paul G.

    2000-09-19

    A multiple protocol fluorometer measures photosynthetic parameters of phytoplankton and higher plants using actively stimulated fluorescence protocols. The measured parameters include spectrally-resolved functional and optical absorption cross sections of PSII, extent of energy transfer between reaction centers of PSII, F0 (minimal), Fm (maximal) and Fv (variable) components of PSII fluorescence, photochemical and non-photochemical quenching, size of the plastoquinone (PQ) pool, and the kinetics of electron transport between Qa and the PQ pool and between the PQ pool and PSI. The multiple protocol fluorometer, in one embodiment, is equipped with an excitation source having a controlled spectral output range between 420 nm and 555 nm and capable of generating flashlets having a duration of 0.125-32 µs, an interval between 0.5 µs and 2 seconds, and peak optical power of up to 2 W/cm². The excitation source is also capable of generating, simultaneous with the flashlets, a controlled continuous background illumination.

  20. SPARSE INTEGRATIVE CLUSTERING OF MULTIPLE OMICS DATA SETS

    PubMed Central

    Wang, Sijian; Mo, Qianxing

    2012-01-01

    High resolution microarrays and second-generation sequencing platforms are powerful tools to investigate genome-wide alterations in DNA copy number, methylation, and gene expression associated with a disease. An integrated genomic profiling approach measuring multiple omics data types simultaneously in the same set of biological samples would render an integrated data resolution that would not be available with any single data type. In this study, we use penalized latent variable regression methods for joint modeling of multiple omics data types to identify common latent variables that can be used to cluster patient samples into biologically and clinically relevant disease subtypes. We consider lasso (Tibshirani, 1996), elastic net (Zou and Hastie, 2005), and fused lasso (Tibshirani et al., 2005) methods to induce sparsity in the coefficient vectors, revealing important genomic features that make significant contributions to the latent variables. An iterative ridge regression is used to compute the sparse coefficient vectors. In model selection, a uniform design (Fang and Wang, 1994) is used to seek “experimental” points that are scattered uniformly across the search domain for efficient sampling of tuning parameter combinations. We compared our method to sparse singular value decomposition (SVD) and penalized Gaussian mixture model (GMM) using both real and simulated data sets. The proposed method is applied to integrate genomic, epigenomic, and transcriptomic data for subtype analysis in breast and lung cancer data sets. PMID:24587839
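    The lasso-style sparsity mechanism can be sketched as a soft-thresholded rank-1 factorization. This is an illustrative simplification of how shrinkage zeroes out unimportant features in a latent-variable loading vector; it is not the paper's iterative ridge regression or its uniform-design model selection.

```python
import numpy as np

def soft_threshold(z, lam):
    # Lasso proximal operator: shrink toward zero, zeroing small entries.
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

def sparse_rank1(X, lam, iters=100):
    """Alternating updates for a sparse rank-1 factorization X ~ u w^T;
    the soft-threshold on w induces lasso-style sparsity."""
    n, p = X.shape
    w = np.ones(p) / np.sqrt(p)
    u = np.zeros(n)
    for _ in range(iters):
        u = X @ w
        u = u / (np.linalg.norm(u) + 1e-12)
        w = soft_threshold(X.T @ u, lam)
    return u, w
```

    On data generated from a sparse loading vector plus noise, the thresholded updates recover the signal coordinates and set the pure-noise coordinates exactly to zero, which is the feature-revealing behaviour the abstract describes.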

  1. An efficient method for multiple sequence alignment

    SciTech Connect

    Kim, J.; Pramanik, S.

    1994-12-31

    Multiple sequence alignment has been a useful method in the study of molecular evolution and sequence-structure relationships. This paper presents a new method for multiple sequence alignment based on the simulated annealing technique. Dynamic programming has been widely used to find an optimal alignment, but it has several limitations: it requires long computation time and cannot accommodate certain types of cost functions. We describe the detailed mechanisms of simulated annealing for the multiple sequence alignment problem. It is shown that simulated annealing can be an effective approach to overcoming the limitations of dynamic programming in the multiple sequence alignment problem.
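    A toy sketch of the annealing idea on a pairwise alignment, assuming a simple match-count score and gap-shuffling moves; real MSA scoring matrices, gap penalties, and move sets are considerably richer than this.

```python
import math
import random

def score(a, b):
    # Toy sum-of-pairs score: +1 for each matching non-gap column.
    return sum(x == y and x != '-' for x, y in zip(a, b))

def random_move(s, rng):
    # Relocate one gap to a random new position (length preserved).
    s = list(s)
    gaps = [i for i, c in enumerate(s) if c == '-']
    s.pop(rng.choice(gaps))
    s.insert(rng.randrange(len(s) + 1), '-')
    return ''.join(s)

def anneal(a, b, steps=2000, t0=2.0, seed=1):
    rng = random.Random(seed)
    cur = best = (a, b)
    for k in range(steps):
        t = t0 * (1 - k / steps) + 1e-3   # linear cooling schedule
        cand = list(cur)
        which = rng.randrange(2)
        cand[which] = random_move(cur[which], rng)
        cand = tuple(cand)
        delta = score(*cand) - score(*cur)
        # Metropolis acceptance: always take improvements, sometimes
        # accept worse alignments to escape local optima.
        if delta >= 0 or rng.random() < math.exp(delta / t):
            cur = cand
        if score(*cur) > score(*best):
            best = cur
    return best
```

    Unlike dynamic programming, nothing in this loop constrains the form of `score`, which is the flexibility the abstract attributes to simulated annealing.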

  2. Integrating Multiple Intelligences in EFL/ESL Classrooms

    ERIC Educational Resources Information Center

    Bas, Gokhan

    2008-01-01

    This article deals with the integration of the theory of Multiple Intelligences in EFL/ESL classrooms. In this study, after the theory of multiple intelligences is briefly presented, its integration into English classrooms is examined. Intelligence types in MI Theory are discussed, along with some possible ways of applying these intelligence types…

  3. Perturbative Methods in Path Integration

    NASA Astrophysics Data System (ADS)

    Johnson-Freyd, Theodore Paul

    This dissertation addresses a number of related questions concerning perturbative "path" integrals. Perturbative methods are one of the few successful ways physicists have worked with (or even defined) these infinite-dimensional integrals, and it is important for mathematicians to check that they are correct. Chapter 0 provides a detailed introduction. We take a classical approach to path integrals in Chapter 1. Following standard arguments, we posit a Feynman-diagrammatic description of the asymptotics of the time-evolution operator for the quantum mechanics of a charged particle moving nonrelativistically through a curved manifold under the influence of an external electromagnetic field. We check that our sum of Feynman diagrams has all desired properties: it is coordinate-independent and well-defined without ultraviolet divergences, it satisfies the correct composition law, and it satisfies Schrodinger's equation thought of as a boundary-value problem in PDE. Path integrals in quantum mechanics and elsewhere in quantum field theory are almost always of the shape ∫ f e^s for some functions f (the "observable") and s (the "action"). In Chapter 2 we step back to analyze integrals of this type more generally. Integration by parts provides algebraic relations between the values of ∫ (-) e^s for different inputs, which can be packaged into a Batalin-Vilkovisky-type chain complex. Using some simple homological perturbation theory, we study the version of this complex that arises when f and s are taken to be polynomial functions, and power series are banished. We find that in such cases, the entire scheme-theoretic critical locus (complex points included) of s plays an important role, and that one can uniformly (but noncanonically) integrate out in a purely algebraic way the contributions to the integral from all "higher modes," reducing ∫ f e^s to an integral over the critical locus. This may help explain the presence of analytic continuation in questions like the

  4. Geometric integrators for multiple time-scale simulation

    NASA Astrophysics Data System (ADS)

    Jia, Zhidong; Leimkuhler, Ben

    2006-05-01

    In this paper, we review and extend recent research on averaging integrators for multiple time-scale simulation such as are needed for physical N-body problems including molecular dynamics, materials modelling and celestial mechanics. A number of methods have been proposed for direct numerical integration of multiscale problems with special structure, such as the mollified impulse method (Garcia-Archilla, Sanz-Serna and Skeel 1999 SIAM J. Sci. Comput. 20 930-63) and the reversible averaging method (Leimkuhler and Reich 2001 J. Comput. Phys. 171 95-114). Features of problems of interest, such as thermostatted coarse-grained molecular dynamics, require extension of the standard framework. At the same time, in some applications the computation of averages plays a crucial role, but the available methods have deficiencies in this regard. We demonstrate that a new approach based on the introduction of shadow variables, which mirror physical variables, holds promise for broadening the usefulness of multiscale methods and enhancing the accuracy of, or simplifying the computation of, averages. The shadow variables must be computed from an auxiliary equation. While a geometric integrator in the extended space is possible, in practice we observe enhanced long-term energy behaviour only through use of a variant of the method which controls drift of the shadow variables using dissipation and sacrifices the formal geometric properties such as time-reversibility and volume preservation in the enlarged phase space, stabilizing the corresponding properties in the physical variables. The method is applied to a gravitational three-body problem as well as a partially thermostatted model problem for a dilute gas of diatomic molecules.
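    The impulse-method idea cited above can be sketched as follows: slow forces are applied as half-kick impulses around an inner loop of small velocity-Verlet steps that resolve the fast forces. This is a generic multiple time-stepping sketch (r-RESPA/impulse style), not the shadow-variable method the paper proposes.

```python
def impulse_step(q, p, dt, n_inner, f_fast, f_slow):
    """One outer step: slow force applied as half-kick impulses around
    an inner velocity-Verlet loop that resolves the fast force."""
    p += 0.5 * dt * f_slow(q)      # outer half-kick (slow impulse)
    h = dt / n_inner
    for _ in range(n_inner):       # inner steps at the fast time scale
        p += 0.5 * h * f_fast(q)
        q += h * p
        p += 0.5 * h * f_fast(q)
    p += 0.5 * dt * f_slow(q)      # outer half-kick (slow impulse)
    return q, p
```

    For a stiff spring paired with a weak one, the scheme keeps the energy bounded over many periods even though the slow force is only evaluated once per outer step, provided the outer step avoids the known resonance windows.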

  5. Integrating Learning Styles and Multiple Intelligences.

    ERIC Educational Resources Information Center

    Silver, Harvey; Strong, Richard; Perini, Matthew

    1997-01-01

    Multiple-intelligences theory (MI) explores how cultures and disciplines shape human potential. Both MI and learning-style theories reject dominant ideologies of intelligence. Whereas learning styles are concerned with differences in the learning process, MI centers on learning content and products. Blending learning styles and MI theories via…

  6. Building a cognitive map by assembling multiple path integration systems.

    PubMed

    Wang, Ranxiao Frances

    2016-06-01

    Path integration and cognitive mapping are two of the most important mechanisms for navigation. Path integration is a primitive navigation system which computes a homing vector based on an animal's self-motion estimation, while a cognitive map is an advanced spatial representation containing richer spatial information about the environment that is persistent and can be used to guide flexible navigation to multiple locations. Most theories of navigation conceptualize them as two distinct, independent mechanisms, although the path integration system may provide useful information for the integration of cognitive maps. This paper demonstrates a fundamentally different scenario, where a cognitive map is constructed in three simple steps by assembling multiple path integrators and extending their basic features. The fact that a collection of path integration systems can be turned into a cognitive map suggests the possibility that cognitive maps may have evolved directly from the path integration system.
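    The homing-vector computation at the core of path integration reduces to accumulating self-motion estimates, as in this minimal sketch (a didactic illustration, not the paper's model):

```python
def homing_vector(steps):
    """Accumulate self-motion estimates (dx, dy); the homing vector is
    the negated accumulated displacement from the start."""
    x = y = 0.0
    for dx, dy in steps:
        x += dx
        y += dy
    return (-x, -y)
```

    Adding the homing vector to the current position returns the animal to its starting point, regardless of the route taken; a cognitive map, by contrast, must also retain where things are along the way.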

  7. Longitudinal comparative evaluation of the equivalence of an integrated peer-support and clinical staffing model for residential mental health rehabilitation: a mixed methods protocol incorporating multiple stakeholder perspectives.

    PubMed

    Parker, Stephen; Dark, Frances; Newman, Ellie; Korman, Nicole; Meurk, Carla; Siskind, Dan; Harris, Meredith

    2016-06-02

    A novel staffing model integrating peer support workers and clinical staff within a unified team is being trialled at community-based residential rehabilitation units in Australia. This mixed-methods protocol describes the longitudinal evaluation of the outcomes, expectations and experiences of care of consumers and staff at two units operating this staffing model, compared with one unit operating a traditional clinical staffing model. The study is unique with regard to its context, its longitudinal approach and its consideration of multiple stakeholder perspectives. The longitudinal mixed-methods design integrates a quantitative evaluation of the outcomes of care for consumers at three residential rehabilitation units with an applied qualitative research methodology. The quantitative component utilizes a prospective cohort design to explore whether equivalent outcomes are achieved through engagement at residential rehabilitation units operating integrated and clinical staffing models. Comparative data will be available from the time of admission, discharge and the 12-month period post-discharge from the units. Additionally, retrospective data for the 12-month period prior to admission will be utilized to consider changes in functioning pre and post engagement with residential rehabilitation care. The primary outcome will be change in psychosocial functioning, assessed using the total score on the Health of the Nation Outcome Scales (HoNOS). Planned secondary outcomes will include changes in symptomatology, disability, recovery orientation, carer quality of life, emergency department presentations, psychiatric inpatient bed days, and psychological distress and wellbeing. Planned analyses will include: cohort description; hierarchical linear regression modelling of the predictors of change in HoNOS following CCU care; and descriptive comparisons of the costs associated with the two staffing models. The qualitative component utilizes a pragmatic approach to grounded theory, with

  8. Temporal Characterization of Hydrates System Dynamics beneath Seafloor Mounds. Integrating Time-Lapse Electrical Resistivity Methods and In Situ Observations of Multiple Oceanographic Parameters

    SciTech Connect

    Lutken, Carol; Macelloni, Leonardo; D'Emidio, Marco; Dunbar, John; Higley, Paul

    2015-01-31

    detect short-term changes within the hydrates system, identify relationships/impacts of local oceanographic parameters on the hydrates system, and improve our understanding of how seafloor instability is affected by hydrates-driven changes. A 2009 DCR survey of MC118 demonstrated that we could image resistivity anomalies to a depth of 75m below the seafloor in water depths of 1km. We reconfigured this system to operate autonomously on the seafloor in a pre-programmed mode, for periods of months. We designed and built a novel seafloor lander and deployment capability that would allow us to investigate the seafloor at potential deployment sites and deploy instruments only when conditions met our criteria. This lander held the DCR system, controlling computers, and battery power supply, as well as instruments to record oceanographic parameters. During the first of two cruises to the study site, we conducted resistivity surveying, selected a monitoring site, and deployed the instrumented lander and DCR, centered on what appeared to be the most active locations within the site, programmed to collect a DCR profile, weekly. After a 4.5-month residence on the seafloor, the team recovered all equipment. Unfortunately, several equipment failures occurred prior to recovery of the instrument packages. Prior to the failures, however, two resistivity profiles were collected together with oceanographic data. Results show, unequivocally, that significant changes can occur in both hydrate volume and distribution during time periods as brief as one week. Occurrences appear to be controlled by both deep and near-surface structure. Results have been integrated with seismic data from the area and show correspondence in space of hydrate and structures, including faults and gas chimneys.

  9. Integral Methodological Pluralism in Science Education Research: Valuing Multiple Perspectives

    ERIC Educational Resources Information Center

    Davis, Nancy T.; Callihan, Laurie P.

    2013-01-01

    This article examines the multiple methodologies used in educational research and proposes a model that includes all of them as contributing to understanding educational contexts and research from multiple perspectives. The model, based on integral theory (Wilber in a theory of everything. Shambhala, Boston, 2000) values all forms of research as…

  10. Integral methodological pluralism in science education research: valuing multiple perspectives

    NASA Astrophysics Data System (ADS)

    Davis, Nancy T.; Callihan, Laurie P.

    2013-09-01

    This article examines the multiple methodologies used in educational research and proposes a model that includes all of them as contributing to understanding educational contexts and research from multiple perspectives. The model, based on integral theory (Wilber in a theory of everything. Shambhala, Boston, 2000) values all forms of research as true, but partial. Consideration of objective (exterior) forms of research and data and subjective (interior) forms of research and data are further divided into individual and collective domains. Taking this categorization system one step further reveals eight indigenous perspectives that form a framework for considering research methodologies. Each perspective has unique questions, data sources, methods and quality criteria designed to reveal what is "true" from that view. As science educators who guide our students' research, this framework offers a useful guide to explain differences in types of research, the purpose and validity of each. It allows professional science educators to appreciate multiple forms of research while maintaining rigorous quality criteria. Use of this framework can also help avoid problems of imposing quality criteria of one methodology on research data and questions gathered using another methodology. This model is explored using the second author's dissertation research. Finally a decision chart is provided to use with those who are starting inquiries to guide their thinking and choice of appropriate methodologies to use when conducting research.

  11. Multiple crossbar network: Integrated supercomputing framework

    SciTech Connect

    Hoebelheinrich, R. )

    1989-01-01

    At Los Alamos National Laboratory, site of one of the world's most powerful scientific supercomputing facilities, a prototype network for an environment that links supercomputers and workstations is being developed. Driven by a need to provide graphics data at movie rates across a network from a Cray supercomputer to a Sun scientific workstation, the network is called the Multiple Crossbar Network (MCN). It is intended to be a coarsely grained, loosely coupled, general-purpose interconnection network that will vastly increase the speed at which supercomputers communicate with each other in large networks. The components of the network are described, as well as work done in collaboration with vendors who are interested in providing commercial products. 9 refs.

  12. A Fuzzy Logic Framework for Integrating Multiple Learned Models

    SciTech Connect

    Hartog, Bobi Kai Den

    1999-03-01

    The Artificial Intelligence field of Integrating Multiple Learned Models (IMLM) explores ways to combine results from sets of trained programs. Aroclor Interpretation is an ill-conditioned problem in which trained programs must operate in scenarios outside their training ranges because it is intractable to train them completely. Consequently, they fail in ways related to the scenarios. We developed a general-purpose IMLM solution, the Combiner, and applied it to Aroclor Interpretation. The Combiner's first step, Scenario Identification (SI), learns rules from very sparse, synthetic training data consisting of results from a suite of trained programs called Methods. SI produces fuzzy belief weights for each scenario by approximately matching the rules. The Combiner's second step, Aroclor Presence Detection (AP), classifies each of three Aroclors as present or absent in a sample. The third step, Aroclor Quantification (AQ), produces quantitative values for the concentration of each Aroclor in a sample. AP and AQ use automatically learned empirical biases for each of the Methods in each scenario. Through fuzzy logic, AP and AQ combine scenario weights, the learned biases, and the Methods' results to determine results for a sample.

  13. Integrating Multiple Criteria Evaluation and GIS in Ecotourism: a Review

    NASA Astrophysics Data System (ADS)

    Mohd, Z. H.; Ujang, U.

    2016-09-01

    The concept of 'eco-tourism' has become increasingly familiar in recent decades. Ecotourism is a form of environmentally responsible travel intended to foster appreciation of natural experiences and cultures. Ecotourism should have low impact on the environment and must contribute to the prosperity of local residents. This article reviews the use of Multiple Criteria Evaluation (MCE) and Geographic Information Systems (GIS) in ecotourism. Multiple criteria evaluation is mostly used for land suitability analysis or to fulfill specific objectives based on the various attributes that exist in a selected area. To support environmental decision making, GIS is used to display and analyze the data through the Analytic Hierarchy Process (AHP). Integration between MCE and GIS tools is important for objectively determining the relative weights of the criteria used. The MCE method can resolve the conflict between recreation and conservation, minimizing environmental and human impact. Most studies show that GIS-based AHP, as a multi-criteria evaluation, is strong and effective in tourism planning and can aid the development of the ecotourism industry.
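    The AHP weighting step mentioned above is conventionally computed as the principal eigenvector of a pairwise comparison matrix (Saaty's method). The sketch below is a generic illustration, not tied to any particular study reviewed.

```python
import numpy as np

def ahp_weights(M):
    """Criteria weights: the normalized principal eigenvector of the
    pairwise comparison matrix."""
    vals, vecs = np.linalg.eig(M)
    k = np.argmax(vals.real)
    w = np.abs(vecs[:, k].real)
    return w / w.sum()

def consistency_index(M):
    """CI = (lambda_max - n) / (n - 1); zero for a perfectly
    consistent comparison matrix."""
    n = M.shape[0]
    lam_max = np.linalg.eigvals(M).real.max()
    return (lam_max - n) / (n - 1)
```

    For a 3x3 matrix in which criterion 1 is judged moderately more important than criterion 2 and strongly more important than criterion 3, the resulting weights preserve that ordering and sum to one; the consistency index flags judgment matrices that are too contradictory to use.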

  14. Handling missing rows in multi-omics data integration: multiple imputation in multiple factor analysis framework.

    PubMed

    Voillet, Valentin; Besse, Philippe; Liaubet, Laurence; San Cristobal, Magali; González, Ignacio

    2016-10-03

    In omics data integration studies, it is common, for a variety of reasons, for some individuals to not be present in all data tables. Missing row values are challenging to deal with because most statistical methods cannot be directly applied to incomplete datasets. To overcome this issue, we propose a multiple imputation (MI) approach in a multivariate framework. In this study, we focus on multiple factor analysis (MFA) as a tool to compare and integrate multiple layers of information. MI involves filling the missing rows with plausible values, resulting in M completed datasets. MFA is then applied to each completed dataset to produce M different configurations (the matrices of coordinates of individuals). Finally, the M configurations are combined to yield a single consensus solution. We assessed the performance of our method, named MI-MFA, on two real omics datasets. Incomplete artificial datasets with different patterns of missingness were created from these data. The MI-MFA results were compared with two other approaches, i.e., regularized iterative MFA (RI-MFA) and mean variable imputation (MVI-MFA). For each configuration resulting from these three strategies, the suitability of the solution was determined against the true MFA configuration obtained from the original data, and a comprehensive graphical comparison showing how the MI-, RI- or MVI-MFA configurations diverge from the true configuration was produced. Two approaches, i.e., confidence ellipses and convex hulls, to visualize and assess the uncertainty due to missing values were also described. We showed how the areas of ellipses and convex hulls increased with the number of missing individuals. A free and easy-to-use code was proposed to implement the MI-MFA method in the R statistical environment. We believe that MI-MFA provides a useful and attractive method for estimating the coordinates of individuals on the first MFA components despite missing rows. MI-MFA configurations were close to the true
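    A heavily simplified sketch of the MI idea, using PCA in place of MFA: impute the missing rows M times, project each completed dataset, and average the sign-aligned configurations into a consensus. The function name and the resample-plus-noise imputation rule are illustrative assumptions, not the MI-MFA algorithm.

```python
import numpy as np

def mi_pca(X, missing_rows, M=20, d=2, seed=0):
    """Fill missing rows M times (resampled observed row + noise),
    project each completed dataset onto d components, and average
    the sign-aligned configurations into a consensus."""
    rng = np.random.default_rng(seed)
    obs = np.delete(np.arange(X.shape[0]), missing_rows)
    configs, ref = [], None
    for _ in range(M):
        Xc = X.copy()
        for i in missing_rows:
            Xc[i] = X[rng.choice(obs)] + rng.normal(0.0, 0.1, X.shape[1])
        Xc = Xc - Xc.mean(axis=0)
        U, S, _ = np.linalg.svd(Xc, full_matrices=False)
        C = U[:, :d] * S[:d]
        if ref is None:
            ref = C
        else:
            for k in range(d):   # resolve PCA sign indeterminacy
                if np.dot(C[:, k], ref[:, k]) < 0:
                    C[:, k] = -C[:, k]
        configs.append(C)
    return np.mean(configs, axis=0)   # consensus configuration
```

    The spread of the M configurations around the consensus is exactly the uncertainty that the paper's confidence ellipses and convex hulls visualize.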

  15. Lamp method and apparatus using multiple reflections

    DOEpatents

    MacLennan, Donald A.; Turner, Brian; Kipling, Kent

    1999-01-01

    A method wherein the light in a sulfur or selenium lamp is reflected through the fill a multiplicity of times to convert ultraviolet radiation to visible. A light emitting device comprised of an electrodeless envelope which bears a light reflecting covering around a first portion which does not crack due to differential thermal expansion and which has a second portion which comprises a light transmissive aperture.

  16. Lamp method and apparatus using multiple reflections

    DOEpatents

    MacLennan, D.A.; Turner, B.; Kipling, K.

    1999-05-11

    A method wherein the light in a sulfur or selenium lamp is reflected through the fill a multiplicity of times to convert ultraviolet radiation to visible is disclosed. A light emitting device comprised of an electrodeless envelope which bears a light reflecting covering around a first portion which does not crack due to differential thermal expansion and which has a second portion which comprises a light transmissive aperture. 20 figs.

  17. Shared mental models of integrated care: aligning multiple stakeholder perspectives.

    PubMed

    Evans, Jenna M; Baker, G Ross

    2012-01-01

    Health service organizations and professionals are under increasing pressure to work together to deliver integrated patient care. A common understanding of integration strategies may facilitate the delivery of integrated care across inter-organizational and inter-professional boundaries. This paper aims to build a framework for exploring and potentially aligning multiple stakeholder perspectives of systems integration. The authors draw from the literature on shared mental models, strategic management and change, framing, stakeholder management, and systems theory to develop a new construct, Mental Models of Integrated Care (MMIC), which consists of three types of mental models, i.e. integration-task, system-role, and integration-belief. The MMIC construct encompasses many of the known barriers and enablers to integrating care while also providing a comprehensive, theory-based framework of psychological factors that may influence inter-organizational and inter-professional relations. While the existing literature on integration focuses on optimizing structures and processes, the MMIC construct emphasizes the convergence and divergence of stakeholders' knowledge and beliefs, and how these underlying cognitions influence interactions (or lack thereof) across the continuum of care. MMIC may help to: explain what differentiates effective from ineffective integration initiatives; determine system readiness to integrate; diagnose integration problems; and develop interventions for enhancing integrative processes and ultimately the delivery of integrated care. Global interest and ongoing challenges in integrating care underline the need for research on the mental models that characterize the behaviors of actors within health systems; the proposed framework offers a starting point for applying a cognitive perspective to health systems integration.

  18. Axisymmetric heat conduction analysis under steady state by improved multiple-reciprocity boundary element method

    SciTech Connect

    Ochiai, Yoshihiro

    1995-09-01

    Heat-conduction analysis under steady state without heat generation can easily be treated by the boundary element method. However, the case of heat conduction with heat generation can be solved approximately, without a domain integral, by an improved multiple-reciprocity boundary element method. The conventional multiple-reciprocity boundary element method is not suitable for complicated heat generation. In the improved multiple-reciprocity boundary element method, on the other hand, the domain integral in each step is divided into point, line, and area integrals. In order to solve the problem, contour lines of heat generation, which approximate the actual heat generation, are used.

  19. So Each May Learn: Integrating Learning Styles and Multiple Intelligences.

    ERIC Educational Resources Information Center

    Silver, Harvey F.; Strong, Richard W.; Perini, Matthew J.

    This book shows educators at all grade levels and in all content areas how to implement a holistic learning program that seamlessly integrates learning styles and multiple intelligences into instruction, curriculum, and assessment. It is designed to assist teachers in helping students become more reflective, self-aware learners. The book includes:…

  20. Content Integration across Multiple Documents Reduces Memory for Sources

    ERIC Educational Resources Information Center

    Braasch, Jason L. G.; McCabe, Rebecca M.; Daniel, Frances

    2016-01-01

    The current experiments systematically examined semantic content integration as a mechanism for explaining source inattention and forgetting when reading-to-remember multiple texts. For all 3 experiments, degree of semantic overlap was manipulated amongst messages provided by various information sources. In Experiment 1, readers' source…

  2. An Alternative Method for Multiplication of Rhotrices. Classroom Notes

    ERIC Educational Resources Information Center

    Sani, B.

    2004-01-01

    In this article, an alternative multiplication method for rhotrices is proposed. The method establishes some relationships between rhotrices and matrices. This article has discussed a modified multiplication method for rhotrices. The method has a direct relationship with matrix multiplication, and so rhotrices under this multiplication procedure…
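
    The abstract stops short of the construction itself. Below is a minimal sketch of one coupled-matrix reading of row-column rhotrix multiplication for the smallest (3-dimensional, 5-entry) case; the entry layout and the pairing of the four "major" entries with a 2x2 matrix are assumptions for illustration and should be checked against Sani's article.

    ```python
    # Sketch of row-column rhotrix multiplication. Assumed layout for a
    # 3-dimensional rhotrix (this may differ from the paper's convention):
    #
    #         a
    #       b c d      heart = c; majors (a, b, d, e) read as [[a, b], [d, e]]
    #         e

    def rhotrix_mul(r, q):
        """Multiply two 3-dimensional rhotrices given as tuples (a, b, c, d, e)."""
        a1, b1, c1, d1, e1 = r
        a2, b2, c2, d2, e2 = q
        # Majors multiply like the 2x2 matrices [[a, b], [d, e]];
        # the hearts multiply directly.
        return (a1 * a2 + b1 * d2,      # row 1 . col 1
                a1 * b2 + b1 * e2,      # row 1 . col 2
                c1 * c2,                # heart * heart
                d1 * a2 + e1 * d2,      # row 2 . col 1
                d1 * b2 + e1 * e2)      # row 2 . col 2

    print(rhotrix_mul((1, 2, 3, 4, 5), (6, 7, 8, 9, 10)))  # (24, 27, 24, 69, 78)
    ```

    Under this layout the rhotrix with majors forming the identity matrix and heart 1 acts as a multiplicative identity, mirroring the matrix relationship the abstract mentions.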

  3. Multiple frequency method for operating electrochemical sensors

    DOEpatents

    Martin, Louis P [San Ramon, CA

    2012-05-15

    A multiple frequency method for the operation of a sensor to measure a parameter of interest using calibration information including the steps of exciting the sensor at a first frequency providing a first sensor response, exciting the sensor at a second frequency providing a second sensor response, using the second sensor response at the second frequency and the calibration information to produce a calculated concentration of the interfering parameters, using the first sensor response at the first frequency, the calculated concentration of the interfering parameters, and the calibration information to measure the parameter of interest.
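
    The two-step logic of the claim (interferent from the second frequency, then a corrected measurement at the first) can be sketched as follows. The patent does not specify the calibration form; the linear model and the coefficient names k1, x1, k2 are invented for illustration.

    ```python
    # Hedged sketch of the two-frequency correction, assuming linear
    # calibration at both frequencies (a real sensor calibration is
    # unlikely to be this simple).

    def measure(resp_f1, resp_f2, cal):
        """Recover the parameter of interest from two sensor responses.

        Hypothetical calibration model:
          resp_f2 = k2 * interferent                     (f2 sees only the interferent)
          resp_f1 = k1 * parameter + x1 * interferent    (f1 sees both)
        """
        interferent = resp_f2 / cal["k2"]              # step 1: interferent from f2
        corrected = resp_f1 - cal["x1"] * interferent  # step 2: remove its effect at f1
        return corrected / cal["k1"]

    cal = {"k1": 2.0, "x1": 0.5, "k2": 4.0}
    # A true parameter of 3.0 with interferent 8.0 gives
    # resp_f1 = 2*3 + 0.5*8 = 10.0 and resp_f2 = 4*8 = 32.0:
    print(measure(10.0, 32.0, cal))  # 3.0
    ```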

  4. Multiple Model Methods for Cost Function Based Multiple Hypothesis Trackers

    DTIC Science & Technology

    2006-03-01

    MHT's Gaussian mixture with Multiple Model Adaptive Estimators (MMAEs) or Interacting Multiple Model (IMM) estimators, and replacing the elemental… [the remainder of the indexed snippet is table-of-contents residue; the recoverable headings are: Kalman Filtering; Dynamics Design Models; Propagation; Track Life of Various Merging and Pruning Algorithms; Constant Velocity Truth Model Driven by White Gaussian Noise]

  5. Automatic numerical integration methods for Feynman integrals through 3-loop

    NASA Astrophysics Data System (ADS)

    de Doncker, E.; Yuasa, F.; Kato, K.; Ishikawa, T.; Olagbemi, O.

    2015-05-01

    We give numerical integration results for Feynman loop diagrams through 3-loop such as those covered by Laporta [1]. The methods are based on automatic adaptive integration, using iterated integration and extrapolation with programs from the QUADPACK package, or multivariate techniques from the ParInt package. The DQAGS algorithm from QUADPACK accommodates boundary singularities of fairly general types. ParInt is a package for multivariate integration layered over MPI (Message Passing Interface), which runs on clusters and incorporates advanced parallel/distributed techniques such as load balancing among processes that may be distributed over a network of nodes. Results are included for 3-loop self-energy diagrams without IR (infrared) or UV (ultraviolet) singularities. A procedure based on iterated integration and extrapolation yields a novel method of numerical regularization for integrals with UV terms, and is applied to a set of 2-loop self-energy diagrams with UV singularities.
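
    The iterated-integration idea is straightforward to picture: a multi-dimensional integral is evaluated as nested one-dimensional adaptive quadratures. In the sketch below a homegrown adaptive Simpson rule stands in for QUADPACK's adaptive routines; this is a toy illustration of the nesting, not the paper's regularization machinery.

    ```python
    # Minimal illustration of iterated integration: a 2-D integral computed
    # as an outer 1-D adaptive quadrature of an inner 1-D adaptive quadrature.

    def adaptive_simpson(f, a, b, tol=1e-10):
        def simpson(fa, fm, fb, a, b):
            return (b - a) / 6.0 * (fa + 4.0 * fm + fb)

        def recurse(a, b, fa, fm, fb, whole, tol):
            m = (a + b) / 2.0
            lm, rm = (a + m) / 2.0, (m + b) / 2.0
            flm, frm = f(lm), f(rm)
            left = simpson(fa, flm, fm, a, m)
            right = simpson(fm, frm, fb, m, b)
            if abs(left + right - whole) <= 15.0 * tol:
                return left + right + (left + right - whole) / 15.0
            return (recurse(a, m, fa, flm, fm, left, tol / 2.0) +
                    recurse(m, b, fm, frm, fb, right, tol / 2.0))

        fa, fm, fb = f(a), f((a + b) / 2.0), f(b)
        return recurse(a, b, fa, fm, fb, simpson(fa, fm, fb, a, b), tol)

    def double_integral(f, ax, bx, ay, by):
        # Outer quadrature over x of the inner quadrature over y.
        return adaptive_simpson(
            lambda x: adaptive_simpson(lambda y: f(x, y), ay, by), ax, bx)

    # Integral of x*y over the unit square is 1/4.
    print(double_integral(lambda x, y: x * y, 0.0, 1.0, 0.0, 1.0))  # ≈ 0.25
    ```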

  6. Multiple predictor smoothing methods for sensitivity analysis.

    SciTech Connect

    Helton, Jon Craig; Storlie, Curtis B.

    2006-08-01

    The use of multiple predictor smoothing methods in sampling-based sensitivity analyses of complex models is investigated. Specifically, sensitivity analysis procedures based on smoothing methods employing the stepwise application of the following nonparametric regression techniques are described: (1) locally weighted regression (LOESS), (2) additive models, (3) projection pursuit regression, and (4) recursive partitioning regression. The indicated procedures are illustrated with both simple test problems and results from a performance assessment for a radioactive waste disposal facility (i.e., the Waste Isolation Pilot Plant). As shown by the example illustrations, the use of smoothing procedures based on nonparametric regression techniques can yield more informative sensitivity analysis results than can be obtained with more traditional sensitivity analysis procedures based on linear regression, rank regression or quadratic regression when nonlinear relationships between model inputs and model predictions are present.

  7. Integration methods for molecular dynamics

    SciTech Connect

    Leimkuhler, B.J.; Reich, S.; Skeel, R.D.

    1996-12-31

    Classical molecular dynamics simulation of a macromolecule requires the use of an efficient time-stepping scheme that can faithfully approximate the dynamics over many thousands of timesteps. Because these problems are highly nonlinear, accurate approximation of a particular solution trajectory on meaningful time intervals is neither obtainable nor desired, but some restrictions, such as symplecticness, can be imposed on the discretization which tend to imply good long term behavior. The presence of a variety of types and strengths of interatom potentials in standard molecular models places severe restrictions on the timestep for numerical integration used in explicit integration schemes, so much recent research has concentrated on the search for alternatives that possess (1) proper dynamical properties, and (2) a relative insensitivity to the fastest components of the dynamics. We survey several recent approaches. 48 refs., 2 figs.

  8. A multiple index integrating different levels of organization.

    PubMed

    Cortes, Rui; Hughes, Samantha; Coimbra, Ana; Monteiro, Sandra; Pereira, Vítor; Lopes, Marisa; Pereira, Sandra; Pinto, Ana; Sampaio, Ana; Santos, Cátia; Carrola, João; de Jesus, Joaquim; Varandas, Simone

    2016-10-01

    Many methods in freshwater biomonitoring tend to be restricted to a few levels of biological organization, limiting the potential spectrum of measurable cause-effect responses to different anthropogenic impacts. We combined distinct organisational levels, covering biological biomarkers (histopathological and biochemical reactions in liver and fish gills), community-based bioindicators (fish guilds, invertebrate metrics/traits and chironomid pupal exuviae) and ecosystem functional indicators (decomposition rates) to assess ecological status at designated Water Framework Directive monitoring sites, covering a gradient of human impact across several rivers in northern Portugal. We used Random Forest to rank the variables that contributed most significantly to successful prediction of the different classes of ecological status and also to provide specific cut levels to discriminate each WFD class based on reference condition. A total of 59 Biological Quality Elements and functional indicators were determined using this procedure and subsequently applied to develop the integrated Multiple Ecological Level Index (MELI Index), a potentially powerful bioassessment tool. Copyright © 2016 Elsevier Inc. All rights reserved.

  9. Comparison of photopeak integration methods

    NASA Astrophysics Data System (ADS)

    Kennedy, G.

    1990-12-01

    Several methods for the calculation of gamma-ray photopeak areas have been compared for the case of a small peak on a high Compton background. 980 similar spectra were accumulated with a germanium detector using a weak 137Cs source to produce a peak at 662 keV on a Compton background generated by a 60Co source. A computer program was written to calculate the area of the 662 keV peak using the total- and partial-peak-area methods, a modification of Sterlinski's method, Loska's method and least-squares fitting of Gaussian peak shapes with linear and quadratic background. The precision attained was highly dependent on the number of channels used to estimate the background, and the best precision, about 9.5%, was obtained with the partial-peak-area method, the modified Sterlinski method and least-squares fitting with variable peak position, fixed peak width and linear background. The methods were also evaluated for their sensitivity to uncertainty in the peak centroid position. Considering precision, ease of use, reliability and universal applicability, the total-peak-area method using several channels for background estimation and the least-squares-fitting method are recommended.
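
    The recommended total-peak-area method is simple enough to sketch: sum the counts across the peak window and subtract a linear background estimated from several channels on each side. The sketch below assumes a flat spectrum for the demonstration; channel ranges and counts are invented.

    ```python
    # Total-peak-area method with multi-channel background estimation.

    def total_peak_area(counts, lo, hi, nbg=5):
        """Net area of the peak spanning channels lo..hi (inclusive).

        The background under the peak is taken as linear between the means of
        nbg channels on each side; averaging several channels reduces the
        variance of the background estimate, as the comparison above notes.
        """
        left = counts[lo - nbg:lo]
        right = counts[hi + 1:hi + 1 + nbg]
        bg_per_channel = (sum(left) / len(left) + sum(right) / len(right)) / 2.0
        gross = sum(counts[lo:hi + 1])
        return gross - bg_per_channel * (hi - lo + 1)

    # Synthetic spectrum: flat background of 100 counts per channel plus a
    # peak of 500 net counts spread over channels 20-24.
    spectrum = [100] * 50
    for ch, n in zip(range(20, 25), [50, 150, 100, 150, 50]):
        spectrum[ch] += n
    print(total_peak_area(spectrum, 20, 24))  # 500.0
    ```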

  10. HMC algorithm with multiple time scale integration and mass preconditioning

    NASA Astrophysics Data System (ADS)

    Urbach, C.; Jansen, K.; Shindler, A.; Wenger, U.

    2006-01-01

    We present a variant of the HMC algorithm with mass preconditioning (Hasenbusch acceleration) and multiple time scale integration. We have tested this variant for standard Wilson fermions at β=5.6 and at pion masses ranging from 380 to 680 MeV. We show that in this situation its performance is comparable to the recently proposed HMC variant with domain decomposition as preconditioner. We give an update of the "Berlin Wall" figure, comparing the performance of our variant of the HMC algorithm to other published performance data. Advantages of the HMC algorithm with mass preconditioning and multiple time scale integration are that it is straightforward to implement and can be used in combination with a wide variety of lattice Dirac operators.
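
    The multiple-time-scale idea can be sketched outside the lattice context: the expensive "slow" force is applied once per outer step while the cheap "fast" force is integrated with several smaller inner leapfrog steps. In the HMC application the slow force would come from the preconditioned fermion determinant; in this toy sketch two spring constants stand in for the force split, and all parameters are invented.

    ```python
    # Nested leapfrog (multiple time scale) integration sketch.

    def nested_leapfrog(q, p, dt, n_inner, f_slow, f_fast, n_steps):
        for _ in range(n_steps):
            p += 0.5 * dt * f_slow(q)              # opening half kick, slow force
            h = dt / n_inner
            for _ in range(n_inner):               # inner leapfrog on the fast force
                p += 0.5 * h * f_fast(q)
                q += h * p
                p += 0.5 * h * f_fast(q)
            p += 0.5 * dt * f_slow(q)              # closing half kick, slow force
        return q, p

    # Harmonic oscillator split into a stiff "fast" and a soft "slow" spring.
    k_fast, k_slow = 100.0, 1.0
    q, p = nested_leapfrog(1.0, 0.0, dt=0.05, n_inner=10,
                           f_slow=lambda q: -k_slow * q,
                           f_fast=lambda q: -k_fast * q,
                           n_steps=1000)
    energy = 0.5 * p * p + 0.5 * (k_fast + k_slow) * q * q
    print(energy)  # stays close to the initial value of 50.5
    ```

    Because the scheme is symplectic, the energy error stays bounded even over many outer steps, which is the property that makes such integrators attractive inside HMC.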

  11. Integrated control system and method

    DOEpatents

    Wang, Paul Sai Keat; Baldwin, Darryl; Kim, Myoungjin

    2013-10-29

    An integrated control system for use with an engine connected to a generator providing electrical power to a switchgear is disclosed. The engine receives gas produced by a gasifier. The control system includes an electronic controller associated with the gasifier, engine, generator, and switchgear. A gas flow sensor monitors a gas flow from the gasifier to the engine through an engine gas control valve and provides a gas flow signal to the electronic controller. A gas oversupply sensor monitors a gas oversupply from the gasifier and provides an oversupply signal indicative of gas not provided to the engine. A power output sensor monitors a power output of the switchgear and provide a power output signal. The electronic controller changes gas production of the gasifier and the power output rating of the switchgear based on the gas flow signal, the oversupply signal, and the power output signal.

  12. Case studies: Soil mapping using multiple methods

    NASA Astrophysics Data System (ADS)

    Petersen, Hauke; Wunderlich, Tina; Hagrey, Said A. Al; Rabbel, Wolfgang; Stümpel, Harald

    2010-05-01

    Soil is a non-renewable resource with fundamental functions like filtering (e.g. water), storing (e.g. carbon), transforming (e.g. nutrients) and buffering (e.g. contamination). Degradation of soils is by now a well-known fact not only to scientists; decision makers in politics have also accepted it as a serious problem for several environmental aspects. National and international authorities have already worked out preservation and restoration strategies for soil degradation, though how to put these strategies into real practice is still a matter of active research. Common to all strategies, however, is that a description of soil state and dynamics is required as a base step. This includes collecting information from soils with methods ranging from direct soil sampling to remote applications. At an intermediate scale, mobile geophysical methods are applied, with the advantage of fast working progress but the disadvantage of site-specific calibration and interpretation issues. In the framework of the iSOIL project we present here some case studies of soil mapping performed using multiple geophysical methods. We will present examples of combined field measurements with EMI, GPR, magnetic and gamma-spectrometric techniques carried out with the mobile multi-sensor system of Kiel University (GER). Depending on soil type and actual environmental conditions, different methods yield information of different quality. By applying diverse methods we want to figure out which methods, or combination of methods, will give the most reliable information concerning soil state and properties. To investigate the influence of varying material we performed mapping campaigns on field sites with sandy, loamy and loessy soils. Classification of measured or derived attributes shows not only the lateral variability but also gives hints of a variation in the vertical distribution of soil material. For all soils, of course, soil water content can be a critical factor concerning a successful

  13. Robust rotational-velocity-Verlet integration methods

    NASA Astrophysics Data System (ADS)

    Rozmanov, Dmitri; Kusalik, Peter G.

    2010-05-01

    Two rotational integration algorithms for rigid-body dynamics are proposed in velocity-Verlet formulation. The first method uses quaternion dynamics and was derived from the original rotational leap-frog method by Svanberg [Mol. Phys. 92, 1085 (1997)]; it produces time consistent positions and momenta. The second method is also formulated in terms of quaternions but it is not quaternion specific and can be easily adapted for any other orientational representation. Both the methods are tested extensively and compared to existing rotational integrators. The proposed integrators demonstrated performance at least at the level of previously reported rotational algorithms. The choice of simulation parameters is also discussed.

  14. Fast integral methods for integrated optical systems simulations: a review

    NASA Astrophysics Data System (ADS)

    Kleemann, Bernd H.

    2015-09-01

    Boundary integral equation methods (BIM), or simply integral methods (IM), in the context of optical design and simulation are rigorous electromagnetic methods solving the Helmholtz or Maxwell equations on the boundary (the surface or interface between two materials) for scattering and/or diffraction purposes. This work is mainly restricted to integral methods for diffracting structures such as gratings, kinoforms, diffractive optical elements (DOEs), micro Fresnel lenses, computer-generated holograms (CGHs), holographic or digital phase holograms, periodic lithographic structures, and the like. In most cases the mentioned structures have dimensions of thousands of wavelengths in diameter. Therefore, the basic methods necessary for the numerical treatment are locally applied electromagnetic grating diffraction algorithms. Interestingly, integral methods were among the first electromagnetic methods investigated for grating diffraction. The development started in the mid 1960s for gratings with infinite conductivity, mainly owing to the good convergence of the integral methods, especially for TM polarization. The first integral equation methods (IEM) for finite conductivity were those of D. Maystre at the Fresnel Institute in Marseille: in 1972/74 for dielectric and metallic gratings, and later for multiprofile and other types of gratings and for photonic crystals. Other methods, such as differential and modal methods, suffered from unstable behaviour and slow convergence compared to BIMs for metallic gratings in TM polarization until the mid 1990s. The first BIM for gratings using a parametrization of the profile was developed at the Karl Weierstrass Institute in Berlin under a contract with the Carl Zeiss Jena works in 1984-1986 by A. Pomp, J. Creutziger, and the author. Due to the parametrization, this method was able to deal with any kind of surface grating from the beginning: whether profiles with edges, overhanging non

  15. Integrating Multiple Evidence Sources to Predict Adverse Drug Reactions Based on a Systems Pharmacology Model

    PubMed Central

    Cao, D-S; Xiao, N; Li, Y-J; Zeng, W-B; Liang, Y-Z; Lu, A-P; Xu, Q-S; Chen, AF

    2015-01-01

    Identifying potential adverse drug reactions (ADRs) is critically important for drug discovery and public health. Here we developed a multiple evidence fusion (MEF) method for the large-scale prediction of drug ADRs that can handle both approved drugs and novel molecules. MEF is based on the similarity reference by collaborative filtering, and integrates multiple similarity measures from various data types, taking advantage of the complementarity in the data. We used MEF to integrate drug-related and ADR-related data from multiple levels, including the network structural data formed by known drug–ADR relationships for predicting likely unknown ADRs. On cross-validation, it obtains high sensitivity and specificity, substantially outperforming existing methods that utilize single or a few data types. We validated our prediction by their overlap with drug–ADR associations that are known in databases. The proposed computational method could be used for complementary hypothesis generation and rapid analysis of potential drug–ADR interactions. PMID:26451329
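
    The fusion idea can be illustrated with a toy neighborhood scorer: a candidate drug-ADR score is a weighted combination, over evidence types, of the known associations of similar drugs. This is a hedged sketch of similarity-based evidence fusion, not the authors' exact MEF algorithm; all drugs, similarities, and weights below are invented.

    ```python
    # Toy similarity-fusion scorer for drug-ADR prediction.

    def fused_score(drug, adr, known, sims, weights):
        """known[d] is the set of ADRs reported for drug d; sims maps each
        evidence type to pairwise drug similarities; weights sum to 1."""
        score = 0.0
        for ev, w in weights.items():
            num = den = 0.0
            for other, adrs in known.items():
                if other == drug:
                    continue
                s = sims[ev].get((drug, other), sims[ev].get((other, drug), 0.0))
                num += s * (1.0 if adr in adrs else 0.0)
                den += s
            if den > 0:
                score += w * num / den   # similarity-weighted vote per evidence type
        return score

    known = {"d1": {"nausea"}, "d2": {"nausea", "rash"}, "d3": set()}
    sims = {"chemical": {("dX", "d1"): 0.9, ("dX", "d2"): 0.8, ("dX", "d3"): 0.1},
            "target":   {("dX", "d1"): 0.7, ("dX", "d2"): 0.2, ("dX", "d3"): 0.6}}
    weights = {"chemical": 0.5, "target": 0.5}
    print(fused_score("dX", "nausea", known, sims, weights))  # high: similar drugs share the ADR
    ```

    The complementarity claim in the abstract corresponds to each evidence type contributing an independent similarity-weighted vote.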

  16. NEXT Propellant Management System Integration With Multiple Ion Thrusters

    NASA Technical Reports Server (NTRS)

    Sovey, James S.; Soulas, George C.; Herman, Daniel A.

    2011-01-01

    As a critical part of the NEXT test validation process, a multiple-string integration test was performed on the NEXT propellant management system (PMS) and ion thrusters. The objectives of this test were to verify that the PMS is capable of providing stable flow control to multiple thrusters operating over the NEXT system throttling range and to demonstrate to potential users that the NEXT PMS is ready for transition to flight. A test plan was developed for the sub-system integration test for verification of PMS and thruster system performance and functionality requirements. Propellant management system calibrations were checked during the single- and multi-thruster testing. The low pressure assembly total flow rates to the thruster(s) were within 1.4 percent of the calibrated support equipment flow rates. The inlet pressures to the main, cathode, and neutralizer ports of Thruster PM1R were measured as the PMS operated in 1-thruster, 2-thruster, and 3-thruster configurations. The inlet pressures to Thruster PM1R for 2-thruster and 3-thruster operation compared very favorably with those for single-thruster operation, indicating that flow rates to Thruster PM1R were similar in all cases. Characterizations of discharge losses, accelerator grid current, and neutralizer performance were performed as more operating thrusters were added to the PMS. There were no variations in these parameters as thrusters were throttled and as single and multiple thruster operations were conducted. The propellant management system power consumption was measured at a fixed voltage to the DCIU and a fixed thermal throttle temperature of 75 C. The total power consumed by the PMS was 10.0, 17.9, and 25.2 W, respectively, for single-, 2-, and 3-thruster operation. These sub-system integration tests of the PMS, the DCIU Simulator, and multiple thrusters addressed, in part, the NEXT PMS and propulsion system performance and functionality requirements.

  17. Decreasing Multicollinearity: A Method for Models with Multiplicative Functions.

    ERIC Educational Resources Information Center

    Smith, Kent W.; Sasaki, M. S.

    1979-01-01

    A method is proposed for overcoming the problem of multicollinearity in multiple regression equations where multiplicative independent terms are entered. The method is not a ridge regression solution. (JKS)
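
    The abstract does not spell the method out; the classic remedy in this family is to mean-center the component variables before forming the product term, which typically removes most of the correlation between the product and its components. Whether this is exactly Smith and Sasaki's proposal should be checked against the article; the sketch below just demonstrates the centering effect on simulated data.

    ```python
    # Demonstration: centering x and z before forming x*z reduces the
    # collinearity between the product term and its components.
    import random

    def corr(u, v):
        n = len(u)
        mu, mv = sum(u) / n, sum(v) / n
        cov = sum((a - mu) * (b - mv) for a, b in zip(u, v))
        su = sum((a - mu) ** 2 for a in u) ** 0.5
        sv = sum((b - mv) ** 2 for b in v) ** 0.5
        return cov / (su * sv)

    random.seed(1)
    x = [random.gauss(5, 1) for _ in range(2000)]   # nonzero means make the
    z = [random.gauss(5, 1) for _ in range(2000)]   # raw product collinear with x
    raw = [a * b for a, b in zip(x, z)]
    mx, mz = sum(x) / len(x), sum(z) / len(z)
    centered = [(a - mx) * (b - mz) for a, b in zip(x, z)]

    print(abs(corr(x, raw)), abs(corr(x, centered)))  # first large, second near 0
    ```

    Centering leaves the interaction coefficient in the regression unchanged; only the lower-order coefficients are reparameterized.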

  18. Early Gnathostome Phylogeny Revisited: Multiple Method Consensus

    PubMed Central

    Qiao, Tuo; King, Benedict; Long, John A.; Ahlberg, Per E.; Zhu, Min

    2016-01-01

    A series of recent studies recovered consistent phylogenetic scenarios of jawed vertebrates, such as the paraphyly of placoderms with respect to crown gnathostomes, and antiarchs as the sister group of all other jawed vertebrates. However, some of the phylogenetic relationships within the group have remained controversial, such as the positions of Entelognathus, ptyctodontids, and the Guiyu-lineage that comprises Guiyu, Psarolepis and Achoania. The revision of the dataset in a recent study reveals a modified phylogenetic hypothesis, which shows that some of these phylogenetic conflicts stemmed from a few inadvertent miscodings. The interrelationships of early gnathostomes are addressed based on a combined new dataset with 103 taxa and 335 characters, which is the most comprehensive morphological dataset constructed to date. This dataset is investigated in a phylogenetic context using maximum parsimony (MP), Bayesian inference (BI) and maximum likelihood (ML) approaches in an attempt to explore the consensus and incongruence between the hypotheses of early gnathostome interrelationships recovered from different methods. Our findings consistently corroborate the paraphyly of placoderms, all ‘acanthodians’ as a paraphyletic stem group of chondrichthyans, Entelognathus as a stem gnathostome, and the Guiyu-lineage as stem sarcopterygians. The incongruence using different methods is less significant than the consensus, and mainly relates to the positions of the placoderm Wuttagoonaspis, the stem chondrichthyan Ramirosuarezia, and the stem osteichthyan Lophosteus, taxa that are either poorly known or highly specialized in character complement. Given the different performance of each phylogenetic approach, our study provides an empirical case that multiple phylogenetic analyses of morphological data are mutually complementary rather than redundant. PMID:27649538

  19. Multiple Testing with Modified Bonferroni Methods.

    ERIC Educational Resources Information Center

    Li, Jianmin; And Others

    This paper discusses the issue of multiple testing and overall Type I error rates in contexts other than multiple comparisons of means. It demonstrates, using a 5 x 5 correlation matrix, the application of five recently developed modified Bonferroni procedures by the following authors: (1) Y. Hochberg (1988); (2) B. S. Holland and M. D.…
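
    The specific procedures compared in the paper are not reproduced in the snippet, but the flavor of a "modified Bonferroni" correction can be shown with one widely used member of the family, Holm's step-down method: ordered p-values are tested against increasingly lenient thresholds alpha/m, alpha/(m-1), ..., alpha.

    ```python
    # Holm's step-down procedure, a modified Bonferroni correction that
    # controls the familywise Type I error rate while rejecting more often
    # than the plain Bonferroni rule.

    def holm(pvalues, alpha=0.05):
        """Return a list of booleans: True where H0 is rejected."""
        m = len(pvalues)
        order = sorted(range(m), key=lambda i: pvalues[i])
        reject = [False] * m
        for rank, i in enumerate(order):
            if pvalues[i] <= alpha / (m - rank):
                reject[i] = True
            else:
                break            # step-down: stop at the first failure
        return reject

    print(holm([0.001, 0.015, 0.04, 0.20]))  # [True, True, False, False]
    ```

    Plain Bonferroni would test every p-value against 0.05/4 = 0.0125 and reject only the first hypothesis here; Holm's modification additionally rejects the second.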

  20. Can the meaning of multiple words be integrated unconsciously?

    PubMed Central

    van Gaal, Simon; Naccache, Lionel; Meuwese, Julia D. I.; van Loon, Anouk M.; Leighton, Alexandra H.; Cohen, Laurent; Dehaene, Stanislas

    2014-01-01

    What are the limits of unconscious language processing? Can language circuits process simple grammatical constructions unconsciously and integrate the meaning of several unseen words? Using behavioural priming and electroencephalography (EEG), we studied a specific rule-based linguistic operation traditionally thought to require conscious cognitive control: the negation of valence. In a masked priming paradigm, two masked words were successively (Experiment 1) or simultaneously presented (Experiment 2), a modifier (‘not’/‘very’) and an adjective (e.g. ‘good’/‘bad’), followed by a visible target noun (e.g. ‘peace’/‘murder’). Subjects indicated whether the target noun had a positive or negative valence. The combination of these three words could either be contextually consistent (e.g. ‘very bad - murder’) or inconsistent (e.g. ‘not bad - murder’). EEG recordings revealed that grammatical negations could unfold partly unconsciously, as reflected in similar occipito-parietal N400 effects for conscious and unconscious three-word sequences forming inconsistent combinations. However, only conscious word sequences elicited P600 effects, later in time. Overall, these results suggest that multiple unconscious words can be rapidly integrated and that an unconscious negation can automatically ‘flip the sign’ of an unconscious adjective. These findings not only extend the limits of subliminal combinatorial language processes, but also highlight how consciousness modulates the grammatical integration of multiple words. PMID:24639583

  1. EMERGY METHODS: VALUABLE INTEGRATED ASSESSMENT TOOLS

    EPA Science Inventory

    NHEERL's Atlantic Ecology Division is investigating emergy methods as tools for integrated assessment in several projects evaluating environmental impacts, policies, and alternatives for remediation and intervention. Emergy accounting is a methodology that provides a quantitative...

  3. Research in Mathematics Education: Multiple Methods for Multiple Uses

    ERIC Educational Resources Information Center

    Battista, Michael; Smith, Margaret S.; Boerst, Timothy; Sutton, John; Confrey, Jere; White, Dorothy; Knuth, Eric; Quander, Judith

    2009-01-01

    Recent federal education policies and reports have generated considerable debate about the meaning, methods, and goals of "scientific research" in mathematics education. Concentrating on the critical problem of determining which educational programs and practices reliably improve students' mathematics achievement, these policies and reports focus…

  5. [Integrated risk evaluation of multiple disasters affecting longyan yield in Fujian Province, East China].

    PubMed

    Chen, Jia-Jin; Wang, Jia-Yi; Li, Li-Chun; Lin, Jing; Yang, Kai; Ma, Zhi-Guo; Xu, Zong-Huan

    2012-03-01

    In this study, an index system for the integrated risk evaluation of multiple disasters on the Longyan production in Fujian Province was constructed, based on the analysis of the major environmental factors affecting the Longyan growth and yield, and from the viewpoints of potential hazard of disaster-causing factors, vulnerability of hazard-affected body, and disaster prevention and mitigation capability of Longyan growth regions in the Province. In addition, an integrated evaluation model of multiple disasters was established to evaluate the risks of the major agro-meteorological disasters affecting the Longyan yield, based on the yearly meteorological data, Longyan planting area and yield, and other socio-economic data in Longyan growth region in Fujian, and by using the integral weight of risk indices determined by AHP and entropy weight coefficient methods. In the Province, the Longyan growth regions with light integrated risk of multiple disasters were distributed in the coastal counties (except Dongshan County) with low elevation south of Changle, the regions with severe and more severe integrated risk were mainly in Zhangping of Longyan, Dongshan, Pinghe, Nanjin, and Hua'an of Zhangzhou, Yongchun and Anxi of Quanzhou, north mountainous areas of Putian and Xianyou, Minqing, Minhou, Luoyuan, and mountainous areas of Fuzhou, and Fuan, Xiapu, and mountainous areas of Ninde, among which, the regions with severe integrated risk were in Dongshan, Zhangping, and other mountainous areas with high altitudes, and the regions with moderate integrated risk were distributed in the other areas of the Province.
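
    The entropy weight step mentioned above can be sketched in a few lines: indices whose values vary more across regions carry more information and receive larger weights. The AHP half of the integral weighting, which requires expert pairwise judgments, is omitted; the matrix below is invented for illustration.

    ```python
    # Entropy weight coefficient method for a region-by-index matrix.
    import math

    def entropy_weights(matrix):
        """matrix[i][j] = value of risk index j for region i (all values > 0)."""
        n, m = len(matrix), len(matrix[0])
        weights = []
        for j in range(m):
            col = [matrix[i][j] for i in range(n)]
            total = sum(col)
            p = [v / total for v in col]
            # Normalized Shannon entropy of the column (1 = no variation).
            e = -sum(v * math.log(v) for v in p if v > 0) / math.log(n)
            weights.append(1.0 - e)      # low entropy -> high information
        s = sum(weights)
        return [w / s for w in weights]

    # Three regions, two indices: the second index varies across regions,
    # so it should receive (nearly) all of the weight.
    m = [[0.5, 0.1],
         [0.5, 0.5],
         [0.5, 0.9]]
    w = entropy_weights(m)
    print(w)
    ```

    In the paper these entropy weights are blended with AHP weights to form the integral weight of each risk index.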

  6. Dissociating conflict adaptation from feature integration: a multiple regression approach.

    PubMed

    Notebaert, Wim; Verguts, Tom

    2007-10-01

    Congruency effects are typically smaller after incongruent than after congruent trials. One explanation is in terms of higher levels of cognitive control after detection of conflict (conflict adaptation; e.g., M. M. Botvinick, T. S. Braver, D. M. Barch, C. S. Carter, & J. D. Cohen, 2001). An alternative explanation for these results is based on feature repetition and/or integration effects (e.g., B. Hommel, R. W. Proctor, & K.-P. Vu, 2004; U. Mayr, E. Awh, & P. Laurey, 2003). Previous attempts to dissociate feature integration from conflict adaptation focused on a particular subset of the data in which feature transitions were held constant (J. G. Kerns et al., 2004) or in which congruency transitions were held constant (C. Akcay & E. Hazeltine, in press), but this has a number of disadvantages. In this article, the authors present a multiple regression solution for this problem and discuss its possibilities and pitfalls.

  7. 77 FR 74027 - Certain Integrated Circuit Packages Provided with Multiple Heat-Conducting Paths and Products...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-12

    ... From the Federal Register Online via the Government Publishing Office INTERNATIONAL TRADE COMMISSION Certain Integrated Circuit Packages Provided with Multiple Heat- Conducting Paths and Products... integrated circuit packages provided with multiple heat-conducting paths and products containing same...

  8. Method and systems for collecting data from multiple fields of view

    NASA Technical Reports Server (NTRS)

    Schwemmer, Geary K. (Inventor)

    2002-01-01

    Systems and methods for processing light from multiple fields (48, 54, 55) of view without excessive machinery for scanning optical elements. In an exemplary embodiment of the invention, multiple holographic optical elements (41, 42, 43, 44, 45), integrated on a common film (4), diffract and project light from respective fields of view.

  9. Achieving integration in mixed methods designs-principles and practices.

    PubMed

    Fetters, Michael D; Curry, Leslie A; Creswell, John W

    2013-12-01

    Mixed methods research offers powerful tools for investigating complex processes and systems in health and health care. This article describes integration principles and practices at three levels in mixed methods research and provides illustrative examples. Integration at the study design level occurs through three basic mixed method designs-exploratory sequential, explanatory sequential, and convergent-and through four advanced frameworks-multistage, intervention, case study, and participatory. Integration at the methods level occurs through four approaches. In connecting, one database links to the other through sampling. With building, one database informs the data collection approach of the other. When merging, the two databases are brought together for analysis. With embedding, data collection and analysis link at multiple points. Integration at the interpretation and reporting level occurs through narrative, data transformation, and joint display. The fit of integration describes the extent the qualitative and quantitative findings cohere. Understanding these principles and practices of integration can help health services researchers leverage the strengths of mixed methods.

  10. Achieving Integration in Mixed Methods Designs—Principles and Practices

    PubMed Central

    Fetters, Michael D; Curry, Leslie A; Creswell, John W

    2013-01-01

    Mixed methods research offers powerful tools for investigating complex processes and systems in health and health care. This article describes integration principles and practices at three levels in mixed methods research and provides illustrative examples. Integration at the study design level occurs through three basic mixed method designs—exploratory sequential, explanatory sequential, and convergent—and through four advanced frameworks—multistage, intervention, case study, and participatory. Integration at the methods level occurs through four approaches. In connecting, one database links to the other through sampling. With building, one database informs the data collection approach of the other. When merging, the two databases are brought together for analysis. With embedding, data collection and analysis link at multiple points. Integration at the interpretation and reporting level occurs through narrative, data transformation, and joint display. The fit of integration describes the extent to which the qualitative and quantitative findings cohere. Understanding these principles and practices of integration can help health services researchers leverage the strengths of mixed methods. PMID:24279835

  11. Integrated presentation of ecological risk from multiple stressors

    NASA Astrophysics Data System (ADS)

    Goussen, Benoit; Price, Oliver R.; Rendal, Cecilie; Ashauer, Roman

    2016-10-01

    Current environmental risk assessments (ERA) do not account explicitly for ecological factors (e.g. species composition, temperature or food availability) and multiple stressors. Assessing mixtures of chemical and ecological stressors is needed as well as accounting for variability in environmental conditions and uncertainty of data and models. Here we propose a novel probabilistic ERA framework to overcome these limitations, which focusses on visualising assessment outcomes by constructing and interpreting prevalence plots as a quantitative prediction of risk. Key components include environmental scenarios that integrate exposure and ecology, and ecological modelling of relevant endpoints to assess the effect of a combination of stressors. Our illustrative results demonstrate the importance of regional differences in environmental conditions and the confounding interactions of stressors. Using this framework and prevalence plots provides a risk-based approach that combines risk assessment and risk management in a meaningful way and presents a truly mechanistic alternative to the threshold approach. Even whilst research continues to improve the underlying models and data, regulators and decision makers can already use the framework and prevalence plots. The integration of multiple stressors, environmental conditions and variability makes ERA more relevant and realistic.

  12. Integrated presentation of ecological risk from multiple stressors.

    PubMed

    Goussen, Benoit; Price, Oliver R; Rendal, Cecilie; Ashauer, Roman

    2016-10-26

    Current environmental risk assessments (ERA) do not account explicitly for ecological factors (e.g. species composition, temperature or food availability) and multiple stressors. Assessing mixtures of chemical and ecological stressors is needed as well as accounting for variability in environmental conditions and uncertainty of data and models. Here we propose a novel probabilistic ERA framework to overcome these limitations, which focusses on visualising assessment outcomes by constructing and interpreting prevalence plots as a quantitative prediction of risk. Key components include environmental scenarios that integrate exposure and ecology, and ecological modelling of relevant endpoints to assess the effect of a combination of stressors. Our illustrative results demonstrate the importance of regional differences in environmental conditions and the confounding interactions of stressors. Using this framework and prevalence plots provides a risk-based approach that combines risk assessment and risk management in a meaningful way and presents a truly mechanistic alternative to the threshold approach. Even whilst research continues to improve the underlying models and data, regulators and decision makers can already use the framework and prevalence plots. The integration of multiple stressors, environmental conditions and variability makes ERA more relevant and realistic.

  13. Integrated presentation of ecological risk from multiple stressors

    PubMed Central

    Goussen, Benoit; Price, Oliver R.; Rendal, Cecilie; Ashauer, Roman

    2016-01-01

    Current environmental risk assessments (ERA) do not account explicitly for ecological factors (e.g. species composition, temperature or food availability) and multiple stressors. Assessing mixtures of chemical and ecological stressors is needed as well as accounting for variability in environmental conditions and uncertainty of data and models. Here we propose a novel probabilistic ERA framework to overcome these limitations, which focusses on visualising assessment outcomes by constructing and interpreting prevalence plots as a quantitative prediction of risk. Key components include environmental scenarios that integrate exposure and ecology, and ecological modelling of relevant endpoints to assess the effect of a combination of stressors. Our illustrative results demonstrate the importance of regional differences in environmental conditions and the confounding interactions of stressors. Using this framework and prevalence plots provides a risk-based approach that combines risk assessment and risk management in a meaningful way and presents a truly mechanistic alternative to the threshold approach. Even whilst research continues to improve the underlying models and data, regulators and decision makers can already use the framework and prevalence plots. The integration of multiple stressors, environmental conditions and variability makes ERA more relevant and realistic. PMID:27782171

  14. Methods for biological data integration: perspectives and challenges

    PubMed Central

    Gligorijević, Vladimir; Pržulj, Nataša

    2015-01-01

    Rapid technological advances have led to the production of different types of biological data and enabled construction of complex networks with various types of interactions between diverse biological entities. Standard network data analysis methods were shown to be limited in dealing with such heterogeneous networked data and consequently, new methods for integrative data analyses have been proposed. The integrative methods can collectively mine multiple types of biological data and produce more holistic, systems-level biological insights. We survey recent methods for collective mining (integration) of various types of networked biological data. We compare different state-of-the-art methods for data integration and highlight their advantages and disadvantages in addressing important biological problems. We identify the important computational challenges of these methods and provide a general guideline for which methods are suited for specific biological problems, or specific data types. Moreover, we propose that recent non-negative matrix factorization-based approaches may become the integration methodology of choice, as they are well suited and accurate in dealing with heterogeneous data and have many opportunities for further development. PMID:26490630
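
    The non-negative matrix factorization (NMF) idea highlighted in this survey can be sketched in a few lines. The toy example below is a hypothetical illustration, not the tri-factorization methods the survey covers: a minimal multiplicative-update NMF co-factorizes two non-negative data matrices over the same entities by concatenating them so that one shared factor W reflects both sources.

```python
import numpy as np

def nmf(X, k, iters=500, eps=1e-9):
    """Minimal NMF via multiplicative updates: X ~ W @ H with W, H >= 0."""
    rng = np.random.default_rng(0)
    n, m = X.shape
    W = rng.random((n, k)) + eps
    H = rng.random((k, m)) + eps
    for _ in range(iters):
        H *= (W.T @ X) / (W.T @ W @ H + eps)
        W *= (X @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Toy "integration": two non-negative data matrices over the same 20 entities
# (e.g. two measurement types for the same genes) are concatenated and
# co-factorized, so the shared factor W encodes both sources at once.
A = np.random.default_rng(1).random((20, 8))
B = np.random.default_rng(2).random((20, 5))
X = np.hstack([A, B])
W, H = nmf(X, k=4)
err = np.linalg.norm(X - W @ H)
```

    The rows of W then give one low-dimensional representation per entity that reflects both data sources, which is the basic sense in which such factorizations "integrate" heterogeneous data.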

  15. Integral Deferred Correction methods for scientific computing

    NASA Astrophysics Data System (ADS)

    Morton, Maureen Marilla

    Since high order numerical methods frequently can attain accurate solutions more efficiently than low order methods, we develop and analyze new high order numerical integrators for the time discretization of ordinary and partial differential equations. Our novel methods address some of the issues surrounding high order numerical time integration, such as the difficulty of many popular methods' construction and handling the effects of disparate behaviors produced by different terms in the equations to be solved. We are motivated by the simplicity of how Deferred Correction (DC) methods achieve high order accuracy [72, 27]. DC methods are numerical time integrators that, rather than calculating tedious coefficients for order conditions, instead construct high order accurate solutions by iteratively improving a low order preliminary numerical solution. With each iteration, an error equation is solved, the error decreases, and the order of accuracy increases. Later, DC methods were adjusted to include an integral formulation of the residual, which stabilizes the method. These Spectral Deferred Correction (SDC) methods [25] motivated Integral Deferred Corrections (IDC) methods. Typically, SDC methods are limited to increasing the order of accuracy by one with each iteration due to smoothness properties imposed by the gridspacing. However, under mild assumptions, explicit IDC methods allow for any explicit rth order Runge-Kutta (RK) method to be used within each iteration, and then an order of accuracy increase of r is attained after each iteration [18]. We extend these results to the construction of implicit IDC methods that use implicit RK methods, and we prove analogous results for order of convergence. One means of solving equations with disparate parts is by semi-implicit integrators, handling a "fast" part implicitly and a "slow" part explicitly. We incorporate additive RK (ARK) integrators into the iterations of IDC methods in order to construct new arbitrary order
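
    The correction-sweep structure described here can be illustrated with a minimal spectral deferred correction step (a sketch under stated assumptions, not the thesis's IDC methods: it uses a forward-Euler predictor and forward-Euler sweeps on equispaced nodes, so each sweep raises the formal order by one):

```python
import numpy as np

def quad_weights(nodes, a, b):
    """Weights w with sum_j w[j]*f(nodes[j]) equal to the integral over
    [a, b] of the polynomial interpolating f at all the nodes."""
    w = np.zeros(len(nodes))
    for j in range(len(nodes)):
        basis = np.poly1d([1.0])
        denom = 1.0
        for i, ni in enumerate(nodes):
            if i != j:
                basis = basis * np.poly1d([1.0, -ni])
                denom *= nodes[j] - ni
        P = basis.integ()
        w[j] = (P(b) - P(a)) / denom
    return w

def sdc_step(f, t0, y0, dt, M=4, sweeps=3):
    """One SDC step: forward-Euler predictor plus `sweeps` correction
    sweeps, each solving the error equation via the residual's integral."""
    nodes = t0 + dt * np.linspace(0.0, 1.0, M)
    y = np.empty(M)
    y[0] = y0
    for m in range(M - 1):          # low-order provisional solution
        y[m + 1] = y[m] + (nodes[m + 1] - nodes[m]) * f(nodes[m], y[m])
    for _ in range(sweeps):         # each sweep raises the order by one
        fk = np.array([f(t, v) for t, v in zip(nodes, y)])
        z = np.empty(M)
        z[0] = y0
        for m in range(M - 1):
            h = nodes[m + 1] - nodes[m]
            I = quad_weights(nodes, nodes[m], nodes[m + 1]) @ fk
            z[m + 1] = z[m] + h * (f(nodes[m], z[m]) - fk[m]) + I
        y = z
    return y[-1]

# y' = -y, y(0) = 1, integrated to t = 1 in four SDC steps
f = lambda t, y: -y
y, t, dt = 1.0, 0.0, 0.25
for _ in range(4):
    y = sdc_step(f, t, y, dt)
    t += dt
```

    With four nodes and three sweeps, the corrected solution for y' = -y approximates e^{-1} far more closely than the first-order predictor alone would.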

  16. Nonlinear multiplicative dendritic integration in neuron and network models

    PubMed Central

    Zhang, Danke; Li, Yuanqing; Rasch, Malte J.; Wu, Si

    2013-01-01

    Neurons receive inputs from thousands of synapses distributed across dendritic trees of complex morphology. It is known that dendritic integration of excitatory and inhibitory synapses can be highly non-linear in reality and can heavily depend on the exact location and spatial arrangement of inhibitory and excitatory synapses on the dendrite. Despite this known fact, most neuron models used in artificial neural networks today still only describe the voltage potential of a single somatic compartment and assume a simple linear summation of all individual synaptic inputs. We here suggest a new biophysically motivated derivation of a single compartment model that integrates the non-linear effects of shunting inhibition, where an inhibitory input on the route of an excitatory input to the soma cancels or “shunts” the excitatory potential. In particular, our integration of non-linear dendritic processing into the neuron model follows a simple multiplicative rule, suggested recently by experiments, and allows for strict mathematical treatment of network effects. Using our new formulation, we further devised a spiking network model where inhibitory neurons act as global shunting gates, and show that the network exhibits persistent activity in a low firing regime. PMID:23658543
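
    A minimal sketch of such a multiplicative rule in a leaky single-compartment model (the specific gating form `exc * (1 - k * inh)` and all parameter values are hypothetical stand-ins for the paper's derived rule):

```python
import numpy as np

def simulate(exc, inh, k=0.5, tau=20.0, dt=0.1, T=200.0):
    """Leaky single-compartment model in which inhibition acts as a
    multiplicative (shunting) gate on the excitatory drive:
        tau * dV/dt = -V + exc * (1 - k * inh)
    The gating form (1 - k*inh) is a hypothetical illustration."""
    n = int(T / dt)
    V = np.zeros(n)
    drive = exc * max(0.0, 1.0 - k * inh)   # inhibition scales, not subtracts
    for i in range(1, n):
        V[i] = V[i - 1] + (dt / tau) * (-V[i - 1] + drive)
    return V

v_plain = simulate(exc=1.0, inh=0.0)   # no inhibition: V settles near 1.0
v_shunt = simulate(exc=1.0, inh=1.0)   # shunted: same input, V settles near 0.5
```

    The key contrast with additive inhibition is that the gate rescales the excitatory drive, so with zero excitation the inhibitory input has no effect at all.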

  17. An advanced Gibbs-Duhem integration method: theory and applications.

    PubMed

    van 't Hof, A; Peters, C J; de Leeuw, S W

    2006-02-07

    The conventional Gibbs-Duhem integration method is very convenient for the prediction of phase equilibria of both pure components and mixtures. However, it turns out to be inefficient. The method requires a number of lengthy simulations to predict the state conditions at which phase coexistence occurs. This number is not known from the outset of the numerical integration process. Furthermore, the molecular configurations generated during the simulations are merely used to predict the coexistence condition and not the liquid- and vapor-phase densities and mole fractions at coexistence. In this publication, an advanced Gibbs-Duhem integration method is presented that overcomes the above-mentioned disadvantage and inefficiency. The advanced method is a combination of Gibbs-Duhem integration and multiple-histogram reweighting. Application of multiple-histogram reweighting enables the substitution of the unknown number of simulations by a fixed and predetermined number. The advanced method has a retroactive nature; a current simulation improves the predictions of previously computed coexistence points as well. The advanced Gibbs-Duhem integration method has been applied for the prediction of vapor-liquid equilibria of a number of binary mixtures. The method turned out to be very convenient, much faster than the conventional method, and provided smooth simulation results. As the employed force fields perfectly predict pure-component vapor-liquid equilibria, the binary simulations were very well suitable for testing the performance of different sets of combining rules. Employing Lorentz-Hudson-McCoubrey combining rules for interactions between unlike molecules, as opposed to Lorentz-Berthelot combining rules for all interactions, considerably improved the agreement between experimental and simulated data.

  18. 77 FR 33486 - Certain Integrated Circuit Packages Provided With Multiple Heat-Conducting Paths and Products...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-06

    ... COMMISSION Certain Integrated Circuit Packages Provided With Multiple Heat- Conducting Paths and Products.... International Trade Commission has received a complaint entitled Certain Integrated Circuit Packages Provided... sale within the United States after importation of certain integrated circuit packages provided...

  19. Principles and methods of integrative genomic analyses in cancer.

    PubMed

    Kristensen, Vessela N; Lingjærde, Ole Christian; Russnes, Hege G; Vollan, Hans Kristian M; Frigessi, Arnoldo; Børresen-Dale, Anne-Lise

    2014-05-01

    Combined analyses of molecular data, such as DNA copy-number alteration, mRNA and protein expression, point to biological functions and molecular pathways being deregulated in multiple cancers. Genomic, metabolomic and clinical data from various solid cancers and model systems are emerging and can be used to identify novel patient subgroups for tailored therapy and monitoring. The integrative genomics methodologies that are used to interpret these data require expertise in different disciplines, such as biology, medicine, mathematics, statistics and bioinformatics, and they can seem daunting. The objectives, methods and computational tools of integrative genomics that are available to date are reviewed here, as is their implementation in cancer research.

  20. Methods of geometrical integration in accelerator physics

    NASA Astrophysics Data System (ADS)

    Andrianov, S. N.

    2016-12-01

    The paper considers a method of geometric integration for the long-term evolution of particle beams in cyclic accelerators, based on a matrix representation of the particle-evolution operator. This method allows the corresponding beam evolution to be calculated in terms of two-dimensional matrices, including nonlinear effects. The ideology of geometric integration introduces into the computational algorithms the corrections necessary to preserve the qualitative properties of maps presented as truncated series generated by the evolution operator. The formalism extends to both polarized and intense beams. Examples of practical applications are described.
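
    The core idea of geometric integration, preserving qualitative properties of the flow rather than only local accuracy, shows up already in a textbook comparison (this illustrates the general principle, not the authors' matrix formalism for beams): for a harmonic oscillator, explicit Euler spuriously pumps energy into the system, while the symplectic Euler variant keeps the energy bounded indefinitely.

```python
def explicit_euler(q, p, dt, n):
    """Standard Euler for q' = p, p' = -q; ignores the symplectic structure."""
    for _ in range(n):
        q, p = q + dt * p, p - dt * q
    return q, p

def symplectic_euler(q, p, dt, n):
    """Kick-then-drift update; preserves a symplectic form, so the energy
    error stays bounded for arbitrarily long runs."""
    for _ in range(n):
        p = p - dt * q
        q = q + dt * p
    return q, p

energy = lambda q, p: 0.5 * (q * q + p * p)
q0, p0, dt, n = 1.0, 0.0, 0.01, 100_000   # ~159 oscillation periods
e_exp = energy(*explicit_euler(q0, p0, dt, n))
e_sym = energy(*symplectic_euler(q0, p0, dt, n))
```

    Both schemes are first-order accurate; the difference is purely qualitative: the symplectic map conserves a nearby "shadow" Hamiltonian, which is the property the paper's matrix maps are designed to retain for beam dynamics.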

  1. Differential temperature integrating diagnostic method and apparatus

    DOEpatents

    Doss, James D.; McCabe, Charles W.

    1976-01-01

    A method and device for detecting the presence of breast cancer in women by integrating the difference between the temperature of a normal breast and that of a breast having a malignant tumor. The breast-receiving cups of a brassiere are each provided with thermally conductive material next to the skin, with a thermistor attached to the thermally conductive material in each cup. The thermistors are connected to adjacent arms of a Wheatstone bridge. Unbalance currents in the bridge are integrated with respect to time by means of an electrochemical integrator. In the absence of a tumor, both breasts maintain substantially the same temperature, and the bridge remains balanced. If a tumor is present in one breast, a higher temperature in that breast unbalances the bridge and the electrochemical cells integrate the temperature difference with respect to time.

  2. Multiple attenuation to reflection seismic data using Radon filter and Wave Equation Multiple Rejection (WEMR) method

    SciTech Connect

    Erlangga, Mokhammad Puput

    2015-04-16

    Separation between signal and noise, incoherent or coherent, is important in seismic data processing. Even after processing, coherent noise can remain mixed with the primary signal; multiple reflections are one kind of coherent noise. In this research, we processed seismic data to attenuate multiple reflections in both synthetic and real seismic data from Mentawai. There are several methods to attenuate multiple reflections; one of them is the Radon filter method, which discriminates between primary and multiple reflections in the τ-p domain based on the moveout difference between them. However, where the moveout difference is too small, the Radon filter is not enough to attenuate the multiple reflections, and it also produces artifacts on the gathers. In addition to the Radon filter method, we use the Wave Equation Multiple Elimination (WEMR) method to attenuate long-period multiple reflections based on wave-equation inversion. From the inversion of the wave equation and the magnitude of the seismic-wave amplitude observed at the free surface, we obtain the water-bottom reflectivity, which is used to eliminate the multiple reflections. Because the WEMR method does not depend on the moveout difference, it can be applied to seismic data with a small moveout difference such as the Mentawai data, where the small moveout difference is caused by the limited far offset of only 705 meters. We compared the multiple-free stacked real data after processing with the Radon filter and with the WEMR process. The conclusion is that the WEMR method attenuates long-period multiple reflections on the real (Mentawai) seismic data more effectively than the Radon filter method.

  3. Integrated force method versus displacement method for finite element analysis

    NASA Technical Reports Server (NTRS)

    Patnaik, S. N.; Berke, L.; Gallagher, R. H.

    1991-01-01

    A novel formulation termed the integrated force method (IFM) has been developed in recent years for analyzing structures. In this method all the internal forces are taken as independent variables, and the system equilibrium equations (EEs) are integrated with the global compatibility conditions (CCs) to form the governing set of equations. In IFM the CCs are obtained from the strain formulation of St. Venant, and no choices of redundant load systems have to be made, in contrast to the standard force method (SFM). This property of IFM allows the generation of the governing equation to be automated straightforwardly, as it is in the popular stiffness method (SM). In this report IFM and SM are compared relative to the structure of their respective equations, their conditioning, required solution methods, overall computational requirements, and convergence properties as these factors influence the accuracy of the results. Overall, this new version of the force method produces more accurate results than the stiffness method for comparable computational cost.

  4. Integrated force method versus displacement method for finite element analysis

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Berke, Laszlo; Gallagher, Richard H.

    1990-01-01

    A novel formulation termed the integrated force method (IFM) has been developed in recent years for analyzing structures. In this method all the internal forces are taken as independent variables, and the system equilibrium equations (EE's) are integrated with the global compatibility conditions (CC's) to form the governing set of equations. In IFM the CC's are obtained from the strain formulation of St. Venant, and no choices of redundant load systems have to be made, in contrast to the standard force method (SFM). This property of IFM allows the generation of the governing equation to be automated straightforwardly, as it is in the popular stiffness method (SM). In this report IFM and SM are compared relative to the structure of their respective equations, their conditioning, required solution methods, overall computational requirements, and convergence properties as these factors influence the accuracy of the results. Overall, this new version of the force method produces more accurate results than the stiffness method for comparable computational cost.
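
    Schematically, the IFM governing set stacks the equilibrium equations and compatibility conditions into one square system in the n internal forces {F} (a sketch of the structure described in the abstract, not the authors' exact notation):

```latex
\underbrace{\begin{bmatrix} [B] \\ [C] \end{bmatrix}}_{[S]} \{F\}
  = \begin{Bmatrix} \{P\} \\ \{\delta R\} \end{Bmatrix}
```

    where [B]{F} = {P} are the m equilibrium equations and [C]{F} = {δR} the r = n − m compatibility conditions obtained from St. Venant's strain formulation, so no redundant load system has to be chosen.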

  5. An integrated modelling framework for neural circuits with multiple neuromodulators

    PubMed Central

    Vemana, Vinith

    2017-01-01

    Neuromodulators are endogenous neurochemicals that regulate biophysical and biochemical processes, which control brain function and behaviour, and are often the targets of neuropharmacological drugs. Neuromodulator effects are generally complex partly owing to the involvement of broad innervation, co-release of neuromodulators, complex intra- and extrasynaptic mechanisms, existence of multiple receptor subtypes and high interconnectivity within the brain. In this work, we propose an efficient yet sufficiently realistic computational neural modelling framework to study some of these complex behaviours. Specifically, we propose a novel dynamical neural circuit model that integrates the effective neuromodulator-induced currents based on various experimental data (e.g. electrophysiology, neuropharmacology and voltammetry). The model can incorporate multiple interacting brain regions, including neuromodulator sources, simulate efficiently, and be easily extended to large-scale brain models, e.g. for neuroimaging purposes. As an example, we model a network of mutually interacting neural populations in the lateral hypothalamus, dorsal raphe nucleus and locus coeruleus, which are major sources of neuromodulator orexin/hypocretin, serotonin and norepinephrine/noradrenaline, respectively, and which play significant roles in regulating many physiological functions. We demonstrate that such a model can provide predictions of systemic drug effects of the popular antidepressants (e.g. reuptake inhibitors), neuromodulator antagonists or their combinations. Finally, we developed user-friendly graphical user interface software for model simulation and visualization for both fundamental sciences and pharmacological studies. PMID:28100828

  6. An integrated modelling framework for neural circuits with multiple neuromodulators.

    PubMed

    Joshi, Alok; Youssofzadeh, Vahab; Vemana, Vinith; McGinnity, T M; Prasad, Girijesh; Wong-Lin, KongFatt

    2017-01-01

    Neuromodulators are endogenous neurochemicals that regulate biophysical and biochemical processes, which control brain function and behaviour, and are often the targets of neuropharmacological drugs. Neuromodulator effects are generally complex partly owing to the involvement of broad innervation, co-release of neuromodulators, complex intra- and extrasynaptic mechanisms, existence of multiple receptor subtypes and high interconnectivity within the brain. In this work, we propose an efficient yet sufficiently realistic computational neural modelling framework to study some of these complex behaviours. Specifically, we propose a novel dynamical neural circuit model that integrates the effective neuromodulator-induced currents based on various experimental data (e.g. electrophysiology, neuropharmacology and voltammetry). The model can incorporate multiple interacting brain regions, including neuromodulator sources, simulate efficiently, and be easily extended to large-scale brain models, e.g. for neuroimaging purposes. As an example, we model a network of mutually interacting neural populations in the lateral hypothalamus, dorsal raphe nucleus and locus coeruleus, which are major sources of neuromodulator orexin/hypocretin, serotonin and norepinephrine/noradrenaline, respectively, and which play significant roles in regulating many physiological functions. We demonstrate that such a model can provide predictions of systemic drug effects of the popular antidepressants (e.g. reuptake inhibitors), neuromodulator antagonists or their combinations. Finally, we developed user-friendly graphical user interface software for model simulation and visualization for both fundamental sciences and pharmacological studies. © 2017 The Authors.

  7. New method for finding multiple meaningful trajectories

    NASA Astrophysics Data System (ADS)

    Bao, Zhonghao; Flachs, Gerald M.; Jordan, Jay B.

    1995-07-01

    Mathematical foundations and algorithms for efficiently finding multiple meaningful trajectories (FMMT) in a sequence of digital images are presented. A meaningful trajectory is motion created by a sentient being or by a device under the control of a sentient being. It is smooth and predictable over short time intervals. A meaningful trajectory can suddenly appear or disappear in sequence images. The development of the FMMT is based on these assumptions. A finite state machine in the FMMT is used to model the trajectories under the conditions of occlusions and false targets. Each possible trajectory is associated with an initial state of a finite state machine. When two frames of data are available, a linear predictor is used to predict the locations of all possible trajectories. All trajectories within a certain error bound are moved to a monitoring trajectory state. When trajectories attain three consecutive good predictions, they are moved to a valid trajectory state and considered to be locked into a tracking mode. If an object is occluded while in the valid trajectory state, the predicted position is used to continue to track; however, the confidence in the trajectory is lowered. If the trajectory confidence falls below a lower limit, the trajectory is terminated. Results are presented that illustrate the FMMT applied to track multiple munitions fired from a missile in a sequence of images. Accurate trajectories are determined even in poor images where the probabilities of miss and false alarm are very high.
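
    The state machine and linear predictor described above might look roughly like this (a hypothetical sketch: the state names, gate radius, and confidence increments are illustrative, not taken from the paper):

```python
import math

class Track:
    """Sketch of an FMMT-style track: promoted to "valid" after three
    consecutive good linear predictions, coasts on predictions while
    occluded, and is dropped when confidence falls too low."""
    def __init__(self, p0, p1):
        self.prev, self.cur = p0, p1
        self.state, self.hits, self.conf = "monitoring", 0, 1.0

    def predict(self):
        # constant-velocity linear predictor from the last two positions
        return (2 * self.cur[0] - self.prev[0], 2 * self.cur[1] - self.prev[1])

    def update(self, meas, gate=2.0):
        pred = self.predict()
        if meas is not None and math.dist(meas, pred) <= gate:
            self.prev, self.cur = self.cur, meas
            self.hits += 1
            self.conf = min(1.0, self.conf + 0.2)
            if self.state == "monitoring" and self.hits >= 3:
                self.state = "valid"          # locked into tracking mode
        else:
            # occlusion or false target: coast on the prediction, lower confidence
            self.prev, self.cur = self.cur, pred
            self.conf -= 0.3
            if self.conf < 0.2:
                self.state = "terminated"
        return self.state

t = Track((0, 0), (1, 1))
for k in range(2, 6):
    t.update((k, k))          # consecutive good predictions promote the track
state_after_hits = t.state
for _ in range(3):
    t.update(None)            # repeated misses collapse the confidence
state_after_misses = t.state
```

    The gate test stands in for the paper's "certain error bound", and coasting on the predicted position is what lets a valid track survive a brief occlusion.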

  8. Bioluminescent bioreporter integrated circuit detection methods

    DOEpatents

    Simpson, Michael L.; Paulus, Michael J.; Sayler, Gary S.; Applegate, Bruce M.; Ripp, Steven A.

    2005-06-14

    Disclosed are monolithic bioelectronic devices comprising a bioreporter and an OASIC. These bioluminescent bioreporter integrated circuits are useful in detecting substances such as pollutants, explosives, and heavy metals residing in inhospitable areas such as groundwater, industrial process vessels, and battlefields. Also disclosed are methods and apparatus for detection of particular analytes, including ammonia and estrogen compounds.

  9. Collaborative Teaching of an Integrated Methods Course

    ERIC Educational Resources Information Center

    Zhou, George; Kim, Jinyoung; Kerekes, Judit

    2011-01-01

    With an increasing diversity in American schools, teachers need to be able to collaborate in teaching. University courses are widely considered as a stage to demonstrate or model the ways of collaboration. To respond to this call, three authors team taught an integrated methods course at an urban public university in the city of New York.…

  10. Implicit integration methods for dislocation dynamics

    DOE PAGES

    Gardner, D. J.; Woodward, C. S.; Reynolds, D. R.; ...

    2015-01-20

    In dislocation dynamics simulations, strain hardening simulations require integrating stiff systems of ordinary differential equations in time with expensive force calculations, discontinuous topological events, and rapidly changing problem size. Current solvers in use often result in small time steps and long simulation times. Faster solvers may help dislocation dynamics simulations accumulate plastic strains at strain rates comparable to experimental observations. Here, this paper investigates the viability of high order implicit time integrators and robust nonlinear solvers to reduce simulation run times while maintaining the accuracy of the computed solution. In particular, implicit Runge-Kutta time integrators are explored as a way of providing greater accuracy over a larger time step than is typically done with the standard second-order trapezoidal method. In addition, both accelerated fixed point and Newton's method are investigated to provide fast and effective solves for the nonlinear systems that must be resolved within each time step. Results show that integrators of third order are the most effective, while accelerated fixed point and Newton's method both improve solver performance over the standard fixed point method used for the solution of the nonlinear systems.

  11. Implicit integration methods for dislocation dynamics

    SciTech Connect

    Gardner, D. J.; Woodward, C. S.; Reynolds, D. R.; Hommes, G.; Aubry, S.; Arsenlis, A.

    2015-01-20

    In dislocation dynamics simulations, strain hardening simulations require integrating stiff systems of ordinary differential equations in time with expensive force calculations, discontinuous topological events, and rapidly changing problem size. Current solvers in use often result in small time steps and long simulation times. Faster solvers may help dislocation dynamics simulations accumulate plastic strains at strain rates comparable to experimental observations. Here, this paper investigates the viability of high order implicit time integrators and robust nonlinear solvers to reduce simulation run times while maintaining the accuracy of the computed solution. In particular, implicit Runge-Kutta time integrators are explored as a way of providing greater accuracy over a larger time step than is typically done with the standard second-order trapezoidal method. In addition, both accelerated fixed point and Newton's method are investigated to provide fast and effective solves for the nonlinear systems that must be resolved within each time step. Results show that integrators of third order are the most effective, while accelerated fixed point and Newton's method both improve solver performance over the standard fixed point method used for the solution of the nonlinear systems.

  12. Implicit integration methods for dislocation dynamics

    NASA Astrophysics Data System (ADS)

    Gardner, D. J.; Woodward, C. S.; Reynolds, D. R.; Hommes, G.; Aubry, S.; Arsenlis, A.

    2015-03-01

    In dislocation dynamics simulations, strain hardening simulations require integrating stiff systems of ordinary differential equations in time with expensive force calculations, discontinuous topological events and rapidly changing problem size. Current solvers in use often result in small time steps and long simulation times. Faster solvers may help dislocation dynamics simulations accumulate plastic strains at strain rates comparable to experimental observations. This paper investigates the viability of high-order implicit time integrators and robust nonlinear solvers to reduce simulation run times while maintaining the accuracy of the computed solution. In particular, implicit Runge-Kutta time integrators are explored as a way of providing greater accuracy over a larger time step than is typically done with the standard second-order trapezoidal method. In addition, both accelerated fixed point and Newton's method are investigated to provide fast and effective solves for the nonlinear systems that must be resolved within each time step. Results show that integrators of third order are the most effective, while accelerated fixed point and Newton's method both improve solver performance over the standard fixed point method used for the solution of the nonlinear systems.
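
    The role of the nonlinear solver inside each implicit step can be illustrated with the standard second-order trapezoidal method mentioned above, solved by Newton's method (a generic sketch, not the dislocation-dynamics implementation; the scalar stiff test problem is an assumption for illustration). For y' = -50y with dt = 0.1, plain fixed-point iteration on the implicit equation diverges (its contraction factor is |dt·λ/2| = 2.5 > 1), while Newton converges immediately on this linear problem.

```python
def trapezoidal_newton(f, dfdy, t, y, dt, tol=1e-12, maxit=50):
    """One step of the implicit trapezoidal rule; the nonlinear equation
        g(z) = z - y - dt/2 * (f(t, y) + f(t + dt, z)) = 0
    is solved with Newton's method."""
    z = y + dt * f(t, y)                        # explicit-Euler initial guess
    for _ in range(maxit):
        g = z - y - 0.5 * dt * (f(t, y) + f(t + dt, z))
        dg = 1.0 - 0.5 * dt * dfdy(t + dt, z)   # g'(z)
        step = g / dg
        z -= step
        if abs(step) < tol:
            break
    return z

# Stiff scalar test problem y' = -50*y, y(0) = 1. Standard fixed-point
# iteration z <- y + dt/2 * (f(y) + f(z)) would diverge here, which is
# exactly the situation where Newton-type solvers pay off.
f = lambda t, y: -50.0 * y
dfdy = lambda t, y: -50.0
y, t, dt = 1.0, 0.0, 0.1
for _ in range(10):
    y = trapezoidal_newton(f, dfdy, t, y, dt)
    t += dt
```

    Each step applies the stable amplification factor (1 + dtλ/2)/(1 − dtλ/2) = −3/7, so the numerical solution decays toward zero instead of blowing up.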

  13. Multiple Integrated Complementary Healing Approaches: Energetics & Light for bone.

    PubMed

    Gray, Michael G; Lackey, Brett R; Patrick, Evelyn F; Gray, Sandra L; Hurley, Susan G

    2016-01-01

A synergistic-healing strategy that combines molecular targeting within a system-wide perspective is presented as the Multiple Integrated Complementary Healing Approaches: Energetics And Light (MICHAEL). The basis of the MICHAEL approach is the realization that environmental, nutritional and electromagnetic factors form a regulatory framework involved in bone and nerve healing. The interactions of light, energy, and nutrition with neural, hormonal and cellular pathways will be presented. Energetic therapies, including electrical, low-intensity pulsed ultrasound and light-based treatments, affect the growth, differentiation and proliferation of bone and nerve and can be utilized for their healing benefits. However, the benefits of these therapies can be impaired by the absence of nutritional, hormonal and organismal factors. For example, lack of sleep, disrupted circadian rhythms and vitamin-D deficiency can impair healing. Molecular targets, such as the Wnt pathway, protein kinase B and glucocorticoid signaling systems, can be modulated by nutritional components, including quercetin, curcumin and Mg(2+), to enhance the healing process. The importance of water and water-regulation will be presented as an integral component. The effects of exercise and acupuncture on bone healing will also be discussed within the context of the MICHAEL approach.

  14. Using Multiple Ontologies to Integrate Complex Biological Data

    PubMed Central

    Petri, Victoria; Pasko, Dean; Bromberg, Susan; Wu, Wenhua; Chen, Jiali; Nenasheva, Nataliya; Kwitek, Anne; Twigger, Simon; Jacob, Howard

    2005-01-01

The strength of the rat as a model organism lies in its utility in pharmacology, biochemistry and physiology research. Data resulting from such studies are difficult to represent in databases, and the creation of user-friendly data mining tools has proved difficult. The Rat Genome Database (RGD) has developed a comprehensive ontology-based data structure and annotation system to integrate physiological data along with environmental and experimental factors, as well as genetic and genomic information. RGD uses multiple ontologies to integrate complex biological information from the molecular level to the whole organism, and to develop data mining and presentation tools. This approach allows RGD to indicate not only the phenotypes seen in a strain but also the specific values under each diet and atmospheric condition, as well as gender differences. Harnessing the power of ontologies in this way allows the user to gather and filter data in a customized fashion, so that a researcher can retrieve, for example, all phenotype readings for which hypoxia is a factor. Utilizing the same data structure for expression data, pathways and biological processes, RGD will provide a comprehensive research platform which allows users to investigate the conditions under which biological processes are altered and to elucidate the mechanisms of disease. PMID:18629202

  15. Integrated molecular profiling of SOD2 expression in multiple myeloma.

    PubMed

    Hurt, Elaine M; Thomas, Suneetha B; Peng, Benjamin; Farrar, William L

    2007-05-01

    Reactive oxygen species are known to be involved in several cellular processes, including cell signaling. SOD2 is a key enzyme in the conversion of reactive oxygen species and has been implicated in a host of disease states, including cancer. Using an integrated, whole-cell approach encompassing epigenetics, genomics, and proteomics, we have defined the role of SOD2 in multiple myeloma. We show that the SOD2 promoter is methylated in several cell lines and there is a correlative decrease in expression. Furthermore, myeloma patient samples have decreased SOD2 expression compared with healthy donors. Overexpression of SOD2 results in decreased proliferation and altered sensitivity to 2-methoxyestradiol-induced DNA damage and apoptosis. Genomic profiling revealed regulation of 65 genes, including genes involved in tumorigenesis, and proteomic analysis identified activation of the JAK/STAT pathway. Analysis of nearly 400 activated transcription factors identified 31 transcription factors with altered DNA binding activity, including XBP1, NFAT, forkhead, and GAS binding sites. Integration of data from our gestalt molecular analysis has defined a role for SOD2 in cellular proliferation, JAK/STAT signaling, and regulation of several transcription factors.

  16. Tools and Models for Integrating Multiple Cellular Networks

    SciTech Connect

    Gerstein, Mark

    2015-11-06

In this grant, we have systematically investigated the integrated networks which are responsible for the coordination of activity between metabolic pathways in prokaryotes. We have developed several computational tools to analyze the topology of the integrated networks consisting of metabolic, regulatory, and physical interaction networks. The tools are all open source, available for download from GitHub, and can be incorporated into the Knowledgebase. Here, we summarize our work as follows. Understanding the topology of the integrated networks is the first step toward understanding their dynamics and evolution. For Aim 1 of this grant, we have developed a novel algorithm to determine and measure the hierarchical structure of transcriptional regulatory networks [1]. The hierarchy captures the direction of information flow in the network. The algorithm is generally applicable to regulatory networks in prokaryotes, yeast and higher organisms. Integrated datasets are extremely beneficial in understanding the biology of a system in a compact manner due to the conflation of multiple layers of information. Therefore, for Aim 2 of this grant, we have developed several tools and carried out analyses for integrating system-wide genomic information. To make use of the structural data, we have developed DynaSIN for protein-protein interaction networks with various dynamical interfaces [2]. We then examined the association between network topology and phenotypic effects such as gene essentiality. In particular, we have organized the E. coli and S. cerevisiae transcriptional regulatory networks into hierarchies. We then correlated gene phenotypic effects by tinkering with different layers to elucidate which layers were more tolerant to perturbations [3]. In the context of evolution, we also developed a workflow to guide the comparison between different types of biological networks across various species using the concept of rewiring [4]. Furthermore, we have developed
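As a rough sketch of the hierarchy idea in Aim 1, one simple way to expose the direction of information flow in an acyclic regulatory network is to assign each node a level by topological order. This is an illustrative stand-in, not the cited algorithm [1], and the toy edge list is hypothetical:

```python
import networkx as nx

def hierarchy_levels(edges):
    """Assign each node of an acyclic regulatory network a hierarchy level:
    level 0 for top regulators with no incoming edges, and each other node
    one more than the deepest of its regulators."""
    G = nx.DiGraph(edges)
    levels = {}
    for n in nx.topological_sort(G):
        preds = list(G.predecessors(n))
        levels[n] = 0 if not preds else 1 + max(levels[p] for p in preds)
    return levels

# Hypothetical two-level regulatory cascade: TF1 -> TF2 -> target genes.
levels = hierarchy_levels([("TF1", "TF2"), ("TF1", "g1"),
                           ("TF2", "g1"), ("TF2", "g2")])
```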

  17. Fidelity of the Integrated Force Method Solution

    NASA Technical Reports Server (NTRS)

    Hopkins, Dale; Halford, Gary; Coroneos, Rula; Patnaik, Surya

    2002-01-01

The theory of strain compatibility in the solid mechanics discipline had remained incomplete since St. Venant's 'strain formulation' in 1876. We have addressed the compatibility condition both in the continuum and in the discrete system. This led to the formulation of the Integrated Force Method (IFM). A dual Integrated Force Method (IFMD) with displacement as the primal variable has also been formulated. A modest finite element code (IFM/Analyzers) based on the IFM theory has been developed. For a set of standard test problems, the IFM results were compared with the stiffness method solutions and the MSC/Nastran code. For these problems, IFM outperformed the existing methods. The superior IFM performance is attributed to simultaneous compliance of the equilibrium equations and the compatibility conditions. The MSC/Nastran organization expressed reluctance to accept the high-fidelity IFM solutions. This report discusses the solutions to the examples. No inaccuracy was detected in the IFM solutions. With a small programming effort, a stiffness method code can be improved to reap the many IFM benefits when implemented with the IFMD elements. Dr. Halford conducted a peer review of the Integrated Force Method; the reviewers' responses are included.

  18. Multiple Shooting-Local Linearization method for the identification of dynamical systems

    NASA Astrophysics Data System (ADS)

    Carbonell, F.; Iturria-Medina, Y.; Jimenez, J. C.

    2016-08-01

    The combination of the multiple shooting strategy with the generalized Gauss-Newton algorithm turns out in a recognized method for estimating parameters in ordinary differential equations (ODEs) from noisy discrete observations. A key issue for an efficient implementation of this method is the accurate integration of the ODE and the evaluation of the derivatives involved in the optimization algorithm. In this paper, we study the feasibility of the Local Linearization (LL) approach for the simultaneous numerical integration of the ODE and the evaluation of such derivatives. This integration approach results in a stable method for the accurate approximation of the derivatives with no more computational cost than that involved in the integration of the ODE. The numerical simulations show that the proposed Multiple Shooting-Local Linearization method recovers the true parameters value under different scenarios of noisy data.
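The multiple shooting setup the abstract describes can be sketched minimally as follows. A generic scipy integrator stands in for the paper's Local Linearization scheme, and the toy model y' = -theta*y, the noise level, and all names are assumptions for illustration:

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

# Toy problem: estimate theta in y' = -theta * y from noisy observations.
theta_true = 1.5
t_obs = np.linspace(0.0, 2.0, 9)
rng = np.random.default_rng(0)
y_obs = np.exp(-theta_true * t_obs) + 0.01 * rng.standard_normal(t_obs.size)

def residuals(p):
    """Multiple-shooting residuals for p = [theta, s_1, ..., s_8], where the
    s_i are the shooting states at the interior observation times."""
    theta = p[0]
    states = np.concatenate(([y_obs[0]], p[1:]))
    res = []
    for i in range(len(t_obs) - 1):
        sol = solve_ivp(lambda t, y: -theta * y, (t_obs[i], t_obs[i + 1]),
                        [states[i]], t_eval=[t_obs[i + 1]])
        y_end = sol.y[0, -1]
        res.append(states[i + 1] - y_end)   # continuity defect between segments
        res.append(y_obs[i + 1] - y_end)    # misfit against the data
    return np.array(res)

p0 = np.concatenate(([1.0], y_obs[1:]))     # initial guess: theta = 1, states = data
theta_hat = least_squares(residuals, p0).x[0]
```

In the generalized Gauss-Newton setting of the paper, the derivatives of these residuals with respect to theta and the segment states are exactly what the LL approach evaluates alongside the integration itself.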

  19. Green Function Calculation for Full-potential Multiple Scattering Methods

    NASA Astrophysics Data System (ADS)

    Wang, Yang; Stocks, G. Malcolm; Nicholson, Don

    2001-03-01

The Green function in the multiple scattering theory of Korringa [J. Korringa, Physica 13, 392 (1947)], Kohn and Rostoker [W. Kohn and N. Rostoker, Phys. Rev. 94, 1111 (1954)] provides a very convenient approach to electronic structure calculations for solids. The Green function was originally developed for muffin-tin potentials [J.S. Faulkner and G.M. Stocks, Phys. Rev. B 21, 3222 (1980)], but can be generalized to the full-potential case, in which the one-electron potential associated with each atom is of arbitrary geometric shape. In this talk, we present our numerical techniques for the Green function calculation in our newly developed full-potential multiple scattering method code. We test the calculated Green function against the analytical expression for the case of three-dimensional space-filling simple analytic potentials. We show how the surface integral technique is used for the calculation of the single-site scattering matrices and irregular solutions. We also discuss the L-convergence properties of the Green function.

  20. Numerical methods for engine-airframe integration

    SciTech Connect

    Murthy, S.N.B.; Paynter, G.C.

    1986-01-01

    Various papers on numerical methods for engine-airframe integration are presented. The individual topics considered include: scientific computing environment for the 1980s, overview of prediction of complex turbulent flows, numerical solutions of the compressible Navier-Stokes equations, elements of computational engine/airframe integrations, computational requirements for efficient engine installation, application of CAE and CFD techniques to complete tactical missile design, CFD applications to engine/airframe integration, and application of a second-generation low-order panel methods to powerplant installation studies. Also addressed are: three-dimensional flow analysis of turboprop inlet and nacelle configurations, application of computational methods to the design of large turbofan engine nacelles, comparison of full potential and Euler solution algorithms for aeropropulsive flow field computations, subsonic/transonic, supersonic nozzle flows and nozzle integration, subsonic/transonic prediction capabilities for nozzle/afterbody configurations, three-dimensional viscous design methodology of supersonic inlet systems for advanced technology aircraft, and a user's technology assessment.

  1. Multiple tag labeling method for DNA sequencing

    DOEpatents

    Mathies, Richard A.; Huang, Xiaohua C.; Quesada, Mark A.

    1995-01-01

A DNA sequencing method is described which uses single lane or channel electrophoresis. Sequencing fragments are separated in said lane and detected using a laser-excited, confocal fluorescence scanner. Each set of DNA sequencing fragments is separated in the same lane and then distinguished using a binary coding scheme employing only two different fluorescent labels. Also described is a method of using radioisotope labels.
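The binary coding idea, distinguishing four base-specific fragment sets with only two labels, can be illustrated with a toy encoder/decoder. The particular base-to-dye assignment below is hypothetical; the patent's actual scheme may differ, for example by using dye intensity ratios rather than simple presence/absence:

```python
# Hypothetical two-label binary code: each base is identified by which of the
# two dyes are detected on its fragments, so one lane distinguishes all four.
CODE = {"A": (1, 0), "C": (0, 1), "G": (1, 1), "T": (0, 0)}
DECODE = {v: k for k, v in CODE.items()}

def decode(signals):
    """Recover a base sequence from per-fragment (dye1, dye2) detection flags."""
    return "".join(DECODE[s] for s in signals)

seq = "GATTACA"
signals = [CODE[b] for b in seq]   # what the scanner would report per fragment
assert decode(signals) == seq
```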

  2. Multiple tag labeling method for DNA sequencing

    DOEpatents

    Mathies, R.A.; Huang, X.C.; Quesada, M.A.

    1995-07-25

    A DNA sequencing method is described which uses single lane or channel electrophoresis. Sequencing fragments are separated in the lane and detected using a laser-excited, confocal fluorescence scanner. Each set of DNA sequencing fragments is separated in the same lane and then distinguished using a binary coding scheme employing only two different fluorescent labels. Also described is a method of using radioisotope labels. 5 figs.

  3. Package for integrated optic circuit and method

    DOEpatents

    Kravitz, S.H.; Hadley, G.R.; Warren, M.E.; Carson, R.F.; Armendariz, M.G.

    1998-08-04

    A structure and method are disclosed for packaging an integrated optic circuit. The package comprises a first wall having a plurality of microlenses formed therein to establish channels of optical communication with an integrated optic circuit within the package. A first registration pattern is provided on an inside surface of one of the walls of the package for alignment and attachment of the integrated optic circuit. The package in one embodiment may further comprise a fiber holder for aligning and attaching a plurality of optical fibers to the package and extending the channels of optical communication to the fibers outside the package. In another embodiment, a fiber holder may be used to hold the fibers and align the fibers to the package. The fiber holder may be detachably connected to the package. 6 figs.

  4. Package for integrated optic circuit and method

    DOEpatents

    Kravitz, Stanley H.; Hadley, G. Ronald; Warren, Mial E.; Carson, Richard F.; Armendariz, Marcelino G.

    1998-01-01

    A structure and method for packaging an integrated optic circuit. The package comprises a first wall having a plurality of microlenses formed therein to establish channels of optical communication with an integrated optic circuit within the package. A first registration pattern is provided on an inside surface of one of the walls of the package for alignment and attachment of the integrated optic circuit. The package in one embodiment may further comprise a fiber holder for aligning and attaching a plurality of optical fibers to the package and extending the channels of optical communication to the fibers outside the package. In another embodiment, a fiber holder may be used to hold the fibers and align the fibers to the package. The fiber holder may be detachably connected to the package.

  5. A Method for Obtaining Integrable Couplings

    NASA Astrophysics Data System (ADS)

    Zhang, Yu-Sen; Chen, Wei; Liao, Bo; Gong, Xin-Bo

    2006-06-01

By making use of the vector product in R^3, a commuting operation is introduced so that R^3 becomes a Lie algebra. The resulting loop algebra R̃^3 is presented, from which the well-known AKNS hierarchy is produced. Again, via the superposition of the commuting operations of the Lie algebra, a commuting operation in R^6 is constructed so that R^6 becomes a Lie algebra. Thanks to the corresponding loop algebra R̃^6 of the Lie algebra R^6, the integrable coupling of the AKNS system is obtained. The method presented in this paper is rather simple and can be used to work out integrable coupling systems of the other known integrable hierarchies of soliton equations.
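The commuting operation the abstract builds on can be written out explicitly: R^3 equipped with the vector (cross) product satisfies the Lie algebra axioms,

```latex
% R^3 with the cross product as the commutator is a Lie algebra:
[a, b] = a \times b, \qquad a, b \in \mathbb{R}^3,
% with antisymmetry and the Jacobi identity following from
a \times b = -\, b \times a, \qquad
a \times (b \times c) + b \times (c \times a) + c \times (a \times b) = 0.
```

The Jacobi identity here is just the vector triple-product identity, which is what makes the construction "rather simple" in the authors' words.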

  6. Evaluation of Scheduling Methods for Multiple Runways

    NASA Technical Reports Server (NTRS)

    Bolender, Michael A.; Slater, G. L.

    1996-01-01

    Several scheduling strategies are analyzed in order to determine the most efficient means of scheduling aircraft when multiple runways are operational and the airport is operating at different utilization rates. The study compares simulation data for two and three runway scenarios to results from queuing theory for an M/D/n queue. The direction taken, however, is not to do a steady-state, or equilibrium, analysis since this is not the case during a rush period at a typical airport. Instead, a transient analysis of the delay per aircraft is performed. It is shown that the scheduling strategy that reduces the delay depends upon the density of the arrival traffic. For light traffic, scheduling aircraft to their preferred runways is sufficient; however, as the arrival rate increases, it becomes more important to separate traffic by weight class. Significant delay reduction is realized when aircraft that belong to the heavy and small weight classes are sent to separate runways with large aircraft put into the 'best' landing slot.
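The transient (non-equilibrium) analysis described above can be reproduced in miniature with a discrete-event simulation of an M/D/n queue. The arrival rate, service time, and fleet size below are illustrative values, not the study's:

```python
import heapq
import random

def mdn_delays(lam, service, n_servers, n_arrivals, seed=0):
    """Simulate an M/D/n queue (Poisson arrivals, deterministic service,
    FIFO discipline) and return each arrival's waiting time in order,
    so transient behavior during a rush can be inspected directly."""
    rng = random.Random(seed)
    t = 0.0
    free_at = [0.0] * n_servers          # next time each server (runway) is free
    heapq.heapify(free_at)
    waits = []
    for _ in range(n_arrivals):
        t += rng.expovariate(lam)        # next Poisson arrival
        earliest = heapq.heappop(free_at)
        start = max(t, earliest)
        waits.append(start - t)          # delay before service begins
        heapq.heappush(free_at, start + service)
    return waits

# Two "runways", 30 arrivals per hour, 90-second deterministic runway occupancy.
waits = mdn_delays(lam=30 / 3600.0, service=90.0, n_servers=2, n_arrivals=500)
```

Plotting `waits` against arrival index (rather than averaging) is the transient, per-aircraft view of delay the paper takes, as opposed to a steady-state queueing result.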

  7. Recursive integral method for transmission eigenvalues

    NASA Astrophysics Data System (ADS)

    Huang, Ruihao; Struthers, Allan A.; Sun, Jiguang; Zhang, Ruming

    2016-12-01

    Transmission eigenvalue problems arise from inverse scattering theory for inhomogeneous media. These non-selfadjoint problems are numerically challenging because of a complicated spectrum. In this paper, we propose a novel recursive contour integral method for matrix eigenvalue problems from finite element discretizations of transmission eigenvalue problems. The technique tests (using an approximate spectral projection) if a region contains eigenvalues. Regions that contain eigenvalues are subdivided and tested recursively until eigenvalues are isolated with a specified precision. The method is fully parallel and requires no a priori spectral information. Numerical examples show the method is effective and robust.
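A one-dimensional analogue of the recursive idea, using matrix inertia instead of a contour integral as the "does this region contain eigenvalues?" test, might look like the sketch below. This is illustrative only: the paper works with contours in the complex plane and approximate spectral projectors, not real intervals:

```python
import numpy as np
from scipy.linalg import ldl

def count_below(A, s):
    """Number of eigenvalues of symmetric A below s, from the inertia of
    A - s*I (Sylvester's law of inertia via an LDL^T factorization)."""
    _, D, _ = ldl(A - s * np.eye(A.shape[0]))
    return int(np.sum(np.linalg.eigvalsh(D) < 0.0))

def isolate(A, lo, hi, tol=1e-8, found=None):
    """Recursively bisect [lo, hi): any subinterval that contains eigenvalues
    is subdivided until narrower than tol, then reported as (midpoint, count)."""
    if found is None:
        found = []
    k = count_below(A, hi) - count_below(A, lo)
    if k == 0:
        return found                      # region is empty: prune it
    if hi - lo < tol:
        found.append((0.5 * (lo + hi), k))
        return found
    mid = 0.5 * (lo + hi)
    isolate(A, lo, mid, tol, found)
    isolate(A, mid, hi, tol, found)
    return found

# Toy symmetric matrix with eigenvalues 1, 2 (double), and 5.
A = np.diag([1.0, 2.0, 2.0, 5.0])
roots = isolate(A, 0.0, 10.0)
```

As in the paper, each region test is independent, so the subdivision is naturally parallel and needs no a priori spectral information beyond the initial search region.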

  8. Students' Use of "Look Back" Strategies in Multiple Solution Methods

    ERIC Educational Resources Information Center

    Lee, Shin-Yi

    2016-01-01

    The purpose of this study was to investigate the relationship between both 9th-grade and 1st-year undergraduate students' use of "look back" strategies and problem solving performance in multiple solution methods, the difference in their use of look back strategies and problem solving performance in multiple solution methods, and the…

  9. Comparability of Multiple Rank Order and Paired Comparison Methods.

    ERIC Educational Resources Information Center

    Rounds, James B., Jr.; And Others

    1978-01-01

    Two studies compared multiple rank order and paired comparison methods in terms of psychometric characteristics and user reactions. Individual and group item responses, preference counts, and Thurstone normal transform scale values obtained by the multiple rank order method were found to be similar to those obtained by paired comparisons.…

  10. Students' Use of "Look Back" Strategies in Multiple Solution Methods

    ERIC Educational Resources Information Center

    Lee, Shin-Yi

    2016-01-01

    The purpose of this study was to investigate the relationship between both 9th-grade and 1st-year undergraduate students' use of "look back" strategies and problem solving performance in multiple solution methods, the difference in their use of look back strategies and problem solving performance in multiple solution methods, and the…

  11. A Method for Comparing Completely Standardized Solutions in Multiple Groups.

    ERIC Educational Resources Information Center

    Raykov, Tenko; Marcoulides, George A.

    2000-01-01

    Outlines a method for comparing completely standardized solutions in multiple groups. The method is based on a correlation structure analysis of equal-size samples and uses the correlation distribution theory implemented in the structural equation modeling program RAMONA. (SLD)

  12. A rapid and reliable strategy for chromosomal integration of gene(s) with multiple copies

    PubMed Central

    Gu, Pengfei; Yang, Fan; Su, Tianyuan; Wang, Qian; Liang, Quanfeng; Qi, Qingsheng

    2015-01-01

Direct optimization of metabolic pathways on the chromosome requires tools that can fine-tune the overexpression of a desired gene or optimize the combination of multiple genes. Although plasmid-dependent overexpression has been used for this task, fundamental issues concerning its genetic stability and operational repeatability have not been addressed. Here, we describe a rapid and reliable strategy for chromosomal integration of gene(s) with multiple copies (CIGMC), which uses the flippase from the yeast 2-μm plasmid. Using green fluorescent protein as a model, we verified that the fluorescence intensity was in accordance with the integration copy number of the target gene. When a narrow-host-range replicon, R6K, was used in the integrative plasmid, the maximum integrated copy number in Escherichia coli reached 15. Applying the CIGMC method to optimize the overexpression of single or multiple genes in amino acid biosynthesis, we successfully improved the product yield and the stability of production. As a flexible strategy, CIGMC can be used in various microorganisms other than E. coli. PMID:25851494

  13. MSblender: a probabilistic approach for integrating peptide identifications from multiple database search engines

    PubMed Central

    Kwon, Taejoon; Choi, Hyungwon; Vogel, Christine; Nesvizhskii, Alexey I.; Marcotte, Edward M.

    2011-01-01

    Shotgun proteomics using mass spectrometry is a powerful method for protein identification but suffers limited sensitivity in complex samples. Integrating peptide identifications from multiple database search engines is a promising strategy to increase the number of peptide identifications and reduce the volume of unassigned tandem mass spectra. Existing methods pool statistical significance scores such as p-values or posterior probabilities of peptide-spectrum matches (PSMs) from multiple search engines after high scoring peptides have been assigned to spectra, but these methods lack reliable control of identification error rates as data are integrated from different search engines. We developed a statistically coherent method for integrative analysis, termed MSblender. MSblender converts raw search scores from search engines into a probability score for all possible PSMs and properly accounts for the correlation between search scores. The method reliably estimates false discovery rates and identifies more PSMs than any single search engine at the same false discovery rate. Increased identifications increment spectral counts for all detected proteins and allow quantification of proteins that would not have been quantified by individual search engines. We also demonstrate that enhanced quantification contributes to improve sensitivity in differential expression analyses. PMID:21488652

  14. MSblender: A probabilistic approach for integrating peptide identifications from multiple database search engines.

    PubMed

    Kwon, Taejoon; Choi, Hyungwon; Vogel, Christine; Nesvizhskii, Alexey I; Marcotte, Edward M

    2011-07-01

    Shotgun proteomics using mass spectrometry is a powerful method for protein identification but suffers limited sensitivity in complex samples. Integrating peptide identifications from multiple database search engines is a promising strategy to increase the number of peptide identifications and reduce the volume of unassigned tandem mass spectra. Existing methods pool statistical significance scores such as p-values or posterior probabilities of peptide-spectrum matches (PSMs) from multiple search engines after high scoring peptides have been assigned to spectra, but these methods lack reliable control of identification error rates as data are integrated from different search engines. We developed a statistically coherent method for integrative analysis, termed MSblender. MSblender converts raw search scores from search engines into a probability score for every possible PSM and properly accounts for the correlation between search scores. The method reliably estimates false discovery rates and identifies more PSMs than any single search engine at the same false discovery rate. Increased identifications increment spectral counts for most proteins and allow quantification of proteins that would not have been quantified by individual search engines. We also demonstrate that enhanced quantification contributes to improve sensitivity in differential expression analyses.
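The flavor of score integration with decoy-based error control can be sketched crudely as follows. This z-score sum with target-decoy FDR counting is a naive stand-in, not MSblender's actual probabilistic model, and the synthetic data are purely illustrative:

```python
import numpy as np

def combined_fdr(scores_a, scores_b, is_decoy):
    """Standardize two engines' raw PSM scores, sum them, and estimate the
    false discovery rate at each rank by target-decoy counting. Returns the
    combined scores and running FDR, both sorted best-first."""
    z = ((scores_a - scores_a.mean()) / scores_a.std()
         + (scores_b - scores_b.mean()) / scores_b.std())
    order = np.argsort(-z)                     # best combined score first
    decoys = np.cumsum(is_decoy[order])
    targets = np.cumsum(~is_decoy[order])
    fdr = decoys / np.maximum(targets, 1)      # decoy-based FDR estimate
    return z[order], fdr

# Synthetic PSMs: targets score higher than decoys on both engines.
rng = np.random.default_rng(1)
n = 1000
is_decoy = rng.random(n) < 0.5
a = rng.normal(np.where(is_decoy, 0.0, 2.0), 1.0)
b = rng.normal(np.where(is_decoy, 0.0, 2.0), 1.0)
z_sorted, fdr = combined_fdr(a, b, is_decoy)
```

Note this simple sum ignores the correlation between engines' scores; accounting for that correlation properly is precisely what the abstract credits MSblender with.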

  15. 77 FR 39735 - Certain Integrated Circuit Packages Provided With Multiple Heat-Conducting Paths and Products...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-05

... Integrated Circuit Packages Provided With Multiple Heat-Conducting Paths and Products Containing Same... within the United States after importation of certain integrated circuit packages provided with multiple... importation, or the sale within the United States after importation of certain integrated circuit...

  16. An evaluation of sampling effects on multiple DNA barcoding methods leads to an integrative approach for delimiting species: a case study of the North American tarantula genus Aphonopelma (Araneae, Mygalomorphae, Theraphosidae).

    PubMed

    Hamilton, Chris A; Hendrixson, Brent E; Brewer, Michael S; Bond, Jason E

    2014-02-01

The North American tarantula genus Aphonopelma provides one of the greatest challenges to species delimitation and downstream identification in spiders because traditional morphological characters appear ineffective for evaluating limits of intra- and interspecific variation in the group. We evaluated the efficacy of numerous molecular-based approaches to species delimitation within Aphonopelma based upon the most extensive sampling of theraphosids to date, while also investigating the sensitivity of randomized taxon sampling on the reproducibility of species boundaries. Mitochondrial DNA (cytochrome c oxidase subunit I) sequences were sampled from 682 specimens spanning the genetic, taxonomic, and geographic breadth of the genus within the United States. To assess the effects of random taxon sampling, we compared traditional Neighbor-Joining with three modern quantitative species delimitation approaches (ABGD, P ID(Liberal), and GMYC). Our findings reveal remarkable consistency and congruence across various approaches and sampling regimes, while highlighting highly divergent outcomes in GMYC. Our investigation allowed us to integrate methodologies into an efficient, consistent, and more effective general methodological workflow for estimating species boundaries within the mygalomorph spider genus Aphonopelma. Taken alone, these approaches are not particularly useful, especially in the absence of prior knowledge of the focal taxa. Only through the incorporation of multiple lines of evidence, employed in a hypothesis-testing framework, can the identification and delimitation of confident species boundaries be determined. 
A key point in studying closely related species, and perhaps one of the most important aspects of DNA barcoding, is to combine a sampling strategy that broadly identifies the extent of genetic diversity across the distributions of the species of interest and incorporates previous knowledge into the "species equation" (morphology, molecules, and natural history).

  17. Methods for monitoring multiple gene expression

    SciTech Connect

    Berka, Randy; Bachkirova, Elena; Rey, Michael

    2012-05-01

    The present invention relates to methods for monitoring differential expression of a plurality of genes in a first filamentous fungal cell relative to expression of the same genes in one or more second filamentous fungal cells using microarrays containing Trichoderma reesei ESTs or SSH clones, or a combination thereof. The present invention also relates to computer readable media and substrates containing such array features for monitoring expression of a plurality of genes in filamentous fungal cells.

  18. Methods for monitoring multiple gene expression

    SciTech Connect

    Berka, Randy; Bachkirova, Elena; Rey, Michael

    2013-10-01

    The present invention relates to methods for monitoring differential expression of a plurality of genes in a first filamentous fungal cell relative to expression of the same genes in one or more second filamentous fungal cells using microarrays containing Trichoderma reesei ESTs or SSH clones, or a combination thereof. The present invention also relates to computer readable media and substrates containing such array features for monitoring expression of a plurality of genes in filamentous fungal cells.

  19. Methods for monitoring multiple gene expression

    DOEpatents

    Berka, Randy; Bachkirova, Elena; Rey, Michael

    2008-06-01

    The present invention relates to methods for monitoring differential expression of a plurality of genes in a first filamentous fungal cell relative to expression of the same genes in one or more second filamentous fungal cells using microarrays containing Trichoderma reesei ESTs or SSH clones, or a combination thereof. The present invention also relates to computer readable media and substrates containing such array features for monitoring expression of a plurality of genes in filamentous fungal cells.

  20. Harmonizing multiple methods for reconstructing historical potential and reference evapotranspiration

    USGS Publications Warehouse

    Belaineh, Getachew; Sumner, David; Carter, Edward; Clapp, David

    2013-01-01

Potential evapotranspiration (PET) and reference evapotranspiration (RET) data are usually critical components of hydrologic analysis. Many different equations are available to estimate PET and RET. Most of these equations, such as the Priestley-Taylor and Penman-Monteith methods, rely on detailed meteorological data collected at ground-based weather stations. Few weather stations collect enough data to estimate PET or RET using one of the more complex evapotranspiration equations. Currently, satellite data integrated with ground meteorological data are used with one of these evapotranspiration equations to accurately estimate PET and RET. Before the last few decades, however, the historical reconstructions of PET and RET needed for many hydrologic analyses are limited by the paucity of satellite data and of some types of ground data. Air temperature stands out as the most generally available meteorological ground data type over the last century. Temperature-based approaches used with readily available historical temperature data offer the potential for long period-of-record historical reconstructions of PET and RET. A challenge is the inconsistency between the more accurate, but more data-intensive, methods appropriate for more recent periods and the less accurate, but less data-intensive, methods appropriate to the more distant past. In this study, multiple methods are harmonized in a seamless reconstruction of historical PET and RET by quantifying and eliminating the biases of the simple Hargreaves-Samani method relative to the more complex and accurate Priestley-Taylor and Penman-Monteith methods. This harmonization process is used to generate long-term, internally consistent, spatiotemporal databases of PET and RET.
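The harmonization step, quantifying the bias of the simple method against the complex one over their overlap period and removing it from the long temperature-only record, might be sketched as below. Monthly ratio factors are one plausible choice of bias model; the study's actual correction is not specified here, and all data are synthetic:

```python
import numpy as np

def harmonize(simple_pet, complex_pet, simple_hist, months, months_hist):
    """Bias-correct a simple (Hargreaves-Samani-style) PET series against a
    more complex reference (Priestley-Taylor / Penman-Monteith-style) over
    the overlap period, then apply the per-month correction factors to the
    longer temperature-only historical series."""
    factors = {}
    for m in range(1, 13):
        sel = months == m
        factors[m] = complex_pet[sel].mean() / simple_pet[sel].mean()
    return np.array([factors[m] for m in months_hist]) * simple_hist

# Synthetic overlap period (3 years) where the simple method runs 20% low.
months = np.tile(np.arange(1, 13), 3)
complex_pet = 100 + 10 * np.sin(2 * np.pi * (months - 1) / 12)
simple_pet = complex_pet / 1.2

# Longer historical record available only from the simple method (10 years).
months_hist = np.tile(np.arange(1, 13), 10)
simple_hist = (100 + 10 * np.sin(2 * np.pi * (months_hist - 1) / 12)) / 1.2

adj = harmonize(simple_pet, complex_pet, simple_hist, months, months_hist)
```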

  1. Multistep Methods for Integrating the Solar System

    DTIC Science & Technology

    1988-07-01

Technical Report 1055: Multistep Methods for Integrating the Solar System. Panayotis A. Skordos, MIT Artificial Intelligence Laboratory, 545 Technology Square, Cambridge, MA 02139. This report describes research done at the Artificial Intelligence Laboratory of the Massachusetts Institute of Technology, supported by the Advanced Research Projects Agency.

  2. Enhanced performance for the interacting multiple model estimator with integrated multiple filters

    NASA Astrophysics Data System (ADS)

    Sabordo, Madeleine G.; Aboutanios, Elias

    2015-05-01

    In this paper, we propose a new approach to target visibility for the Interacting Multiple Model (IMM) algorithm. We introduce the IMM Integrated Multiple Filters (IMF) to selectively engage a filter appropriate for the gated clutter density at each time step, and investigate five model sets that model the dynamic motion of a manoeuvring target. The model sets are incorporated into the IMM-IMF tracker to estimate the behaviour of the target. We employ the Dynamic Error Spectrum (DES) to assess the effectiveness of the tracker with the target visibility concept incorporated and to compare the performance of the model sets in enhancing tracking performance. Results show that the new version of target visibility significantly improves the performance of the tracker. Simulation results also demonstrate that the 2CV-CA-2CT model set proves to be the most robust, at the cost of computational resources. The CV-CA model is the fastest tracker; however, it is the least robust in terms of performance. These results assist decision makers and researchers in choosing appropriate models for IMM trackers. Augmenting the capability of the tracker improves the ability of the platform to identify possible threats and consequently enhances situational awareness.

  3. Impaired functional integration in multiple sclerosis: a graph theory study.

    PubMed

    Rocca, Maria A; Valsasina, Paola; Meani, Alessandro; Falini, Andrea; Comi, Giancarlo; Filippi, Massimo

    2016-01-01

    The aim of this study was to explore the topological organization of functional brain network connectivity in a large cohort of multiple sclerosis (MS) patients and to assess whether its disruption contributes to disease clinical manifestations. Graph theoretical analysis was applied to resting state fMRI data from 246 MS patients and 55 matched healthy controls (HC). Functional connectivity between 116 cortical and subcortical brain regions was estimated using a bivariate correlation analysis. Global network properties (network degree, global efficiency, hierarchy, path length and assortativity) were abnormal in MS patients vs HC, and contributed to distinguishing cognitively impaired MS patients (34%) from HC, but not the main MS clinical phenotypes. Compared to HC, MS patients also showed: (1) a loss of hubs in the superior frontal gyrus, precuneus and anterior cingulum in the left hemisphere; (2) a different lateralization of basal ganglia hubs (mostly located in the left hemisphere in HC, and in the right hemisphere in MS patients); and (3) a formation of hubs, not seen in HC, in the left temporal pole and cerebellum. MS patients also experienced a decreased nodal degree in the bilateral caudate nucleus and right cerebellum. Such a modification of regional network properties contributed to the cognitive impairment and phenotypic variability of MS. An impairment of global integration (likely to reflect a reduced competence in information exchange between distant brain areas) occurs in MS and is associated with cognitive deficits. A regional redistribution of network properties contributes to the cognitive status and phenotypic variability of these patients.
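
    The global metrics named above (degree, characteristic path length, global efficiency, hubs) are standard graph-theory quantities. As a small illustration of how they are computed on a toy network (using networkx, which is an assumption; the study used its own analysis pipeline):

```python
import networkx as nx

# Toy functional network: a ring of 6 regions plus one long-range shortcut
g = nx.cycle_graph(6)
g.add_edge(0, 3)

degree = dict(g.degree())                        # network degree per node
path_len = nx.average_shortest_path_length(g)    # characteristic path length
efficiency = nx.global_efficiency(g)             # global efficiency
hubs = [n for n, d in degree.items() if d > 2]   # a simple hub criterion
```

    The shortcut makes nodes 0 and 3 the hubs and raises global efficiency relative to the plain ring, which is the intuition behind "global integration" in the abstract.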

  4. Method and apparatus for controlling multiple motors

    DOEpatents

    Jones, Rollin G.; Kortegaard, Bert L.; Jones, David F.

    1987-01-01

    A method and apparatus are provided for simultaneously controlling a plurality of stepper motors. Addressing circuitry generates address data for each motor in a periodic address sequence. Memory circuits respond to the address data for each motor by accessing a corresponding memory location containing a first operational data set functionally related to a direction for moving the motor, speed data, and rate of speed change. First logic circuits respond to the first data set to generate a motor step command. Second logic circuits respond to the command from the first logic circuits to generate a third data set for replacing the first data set in memory with a current operational motor status, which becomes the first data set when the motor is next addressed.
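
    The read-decide-step-writeback cycle described here can be paraphrased in software. This is a loose, hypothetical sketch (the field names, the acceleration rule, and the round-robin loop are assumptions; the patent describes hardware circuits, not code):

```python
from dataclasses import dataclass

@dataclass
class MotorState:
    direction: int       # +1 or -1: direction for moving the motor
    period_ticks: int    # ticks between steps (inverse of speed)
    accel: int           # reduction of the period per step (rate of speed change)
    countdown: int = 0   # ticks remaining until the next step

def service_motor(state: MotorState, step_fn) -> MotorState:
    """One visit in the periodic address sequence: read the motor's
    operational data, issue a step command if one is due, and write the
    updated status back as the new operational data set."""
    if state.countdown <= 0:
        step_fn(state.direction)                                       # step command
        state.period_ticks = max(1, state.period_ticks - state.accel)  # ramp speed
        state.countdown = state.period_ticks
    else:
        state.countdown -= 1
    return state

# Round-robin over two motors, as the addressing circuitry would do
steps = {0: [], 1: []}
motors = [MotorState(+1, 4, 1), MotorState(-1, 2, 0)]
for _ in range(12):
    for i, m in enumerate(motors):
        service_motor(m, lambda d, i=i: steps[i].append(d))
```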

  5. Monte Carlo methods for multidimensional integration for European option pricing

    NASA Astrophysics Data System (ADS)

    Todorov, V.; Dimov, I. T.

    2016-10-01

    In this paper, we illustrate examples of highly accurate Monte Carlo and quasi-Monte Carlo methods for multiple integrals related to the evaluation of European style options. The idea is that the value of the option is formulated in terms of the expectation of some random variable; then the average of independent samples of this random variable is used to estimate the value of the option. First we obtain an integral representation for the value of the option using the risk-neutral valuation formula. Then, with an appropriate change of variables, we obtain a multidimensional integral over the unit hypercube of the corresponding dimensionality. We then compare a specific type of lattice rule with the Sobol sequence, one of the best low-discrepancy sequences, for numerical integration. Quasi-Monte Carlo methods are compared with adaptive and crude Monte Carlo techniques for solving the problem. The four approaches are completely different, so it is of interest to know which of them outperforms the others for evaluating multidimensional integrals in finance. Some of the advantages and disadvantages of the developed algorithms are discussed.
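
    The pipeline in the abstract (risk-neutral expectation, mapped to an integral over the unit hypercube, then estimated by crude Monte Carlo versus a quasi-Monte Carlo Sobol rule) can be sketched in one dimension for a European call. This assumes SciPy's qmc module and standard Black-Scholes dynamics; it is an illustration, not the authors' code.

```python
import numpy as np
from scipy.stats import norm, qmc

def discounted_payoff(u, s0=100.0, k=100.0, r=0.05, sigma=0.2, t=1.0):
    """Integrand over the unit interval: map a uniform point to a
    risk-neutral lognormal terminal price and discount the call payoff."""
    z = norm.ppf(u)
    st = s0 * np.exp((r - 0.5 * sigma**2) * t + sigma * np.sqrt(t) * z)
    return np.exp(-r * t) * np.maximum(st - k, 0.0)

n = 2**14
crude = discounted_payoff(np.random.default_rng(0).random(n)).mean()
sobol = qmc.Sobol(d=1, scramble=True, seed=0).random(n).ravel()
quasi = discounted_payoff(sobol).mean()
# Both estimates approach the Black-Scholes value (about 10.45 for
# these parameters), with the Sobol estimate converging faster.
```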

  6. Orthogonal matrix factorization enables integrative analysis of multiple RNA binding proteins

    PubMed Central

    Stražar, Martin; Žitnik, Marinka; Zupan, Blaž; Ule, Jernej; Curk, Tomaž

    2016-01-01

    Motivation: RNA binding proteins (RBPs) play important roles in post-transcriptional control of gene expression, including splicing, transport, polyadenylation and RNA stability. To model protein–RNA interactions by considering all available sources of information, it is necessary to integrate the rapidly growing RBP experimental data with the latest genome annotation, gene function, RNA sequence and structure. Such integration is possible by matrix factorization, where current approaches have an undesired tendency to identify only a small number of the strongest patterns with overlapping features. Because protein–RNA interactions are orchestrated by multiple factors, methods that identify discriminative patterns of varying strengths are needed. Results: We have developed an integrative orthogonality-regularized nonnegative matrix factorization (iONMF) to integrate multiple data sources and discover non-overlapping, class-specific RNA binding patterns of varying strengths. The orthogonality constraint halves the effective size of the factor model and outperforms other NMF models in predicting RBP interaction sites on RNA. We have integrated the largest data compendium to date, which includes 31 CLIP experiments on 19 RBPs involved in splicing (such as hnRNPs, U2AF2, ELAVL1, TDP-43 and FUS) and processing of 3’UTR (Ago, IGF2BP). We show that the integration of multiple data sources improves the predictive accuracy of retrieval of RNA binding sites. In our study the key predictive factors of protein–RNA interactions were the position of RNA structure and sequence motifs, RBP co-binding and gene region type. We report on a number of protein-specific patterns, many of which are consistent with experimentally determined properties of RBPs. Availability and implementation: The iONMF implementation and example datasets are available at https://github.com/mstrazar/ionmf. Contact: tomaz.curk@fri.uni-lj.si Supplementary information: Supplementary data are available

  7. Integration of multiple view plus depth data for free viewpoint 3D display

    NASA Astrophysics Data System (ADS)

    Suzuki, Kazuyoshi; Yoshida, Yuko; Kawamoto, Tetsuya; Fujii, Toshiaki; Mase, Kenji

    2014-03-01

    This paper proposes a method for constructing a reasonable-scale, end-to-end free-viewpoint video system that captures multiple view and depth data, reconstructs three-dimensional polygon models of objects, and displays them in virtual 3D CG spaces. This system consists of a desktop PC and four Kinect sensors. First, multiple view plus depth data at four viewpoints are captured by the Kinect sensors simultaneously. Then, the captured data are integrated into point cloud data by using camera parameters. The obtained point cloud data are sampled into volume data consisting of voxels. Since volume data generated from point cloud data are sparse, those data are made dense by using a global optimization algorithm. The final step is to reconstruct surfaces on the dense volume data by the discrete marching cubes method. Since the accuracy of the depth maps affects the quality of the 3D polygon model, a simple inpainting method for improving depth maps is also presented.

  8. Multiple time step integrators in ab initio molecular dynamics

    SciTech Connect

    Luehr, Nathan; Martínez, Todd J.; Markland, Thomas E.

    2014-02-28

    Multiple time-scale algorithms exploit the natural separation of time-scales in chemical systems to greatly accelerate the efficiency of molecular dynamics simulations. Although the utility of these methods in systems where the interactions are described by empirical potentials is now well established, their application to ab initio molecular dynamics calculations has been limited by difficulties associated with splitting the ab initio potential into fast and slowly varying components. Here we present two schemes that enable efficient time-scale separation in ab initio calculations: one based on fragment decomposition and the other on range separation of the Coulomb operator in the electronic Hamiltonian. We demonstrate for both water clusters and a solvated hydroxide ion that multiple time-scale molecular dynamics allows for outer time steps of 2.5 fs, which are as large as those obtained when such schemes are applied to empirical potentials, while still allowing for bonds to be broken and reformed throughout the dynamics. This permits computational speedups of up to 4.4x, compared to standard Born-Oppenheimer ab initio molecular dynamics with a 0.5 fs time step, while maintaining the same energy conservation and accuracy.
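
    The essence of the time-scale splitting can be seen on a toy problem with empirical forces. The paper's contribution is precisely how to obtain such a fast/slow split for ab initio potentials; the harmonic forces below are stand-ins, and the scheme is a generic reversible multiple-time-step (RESPA-style) integrator, not the authors' fragment or range-separation method.

```python
def respa_step(x, v, m, f_fast, f_slow, dt_outer, n_inner):
    """One reversible multiple-time-step step: the slow force is applied
    as half-kicks at the outer step, while the fast force is integrated
    with velocity Verlet at the smaller inner step."""
    dt_inner = dt_outer / n_inner
    v += 0.5 * dt_outer * f_slow(x) / m          # slow half-kick
    for _ in range(n_inner):                     # fast inner loop
        v += 0.5 * dt_inner * f_fast(x) / m
        x += dt_inner * v
        v += 0.5 * dt_inner * f_fast(x) / m
    v += 0.5 * dt_outer * f_slow(x) / m          # slow half-kick
    return x, v

# Stiff "fast" spring plus a weak "slow" spring; energy should be well
# conserved even though the slow force is evaluated 10x less often
m, k_fast, k_slow = 1.0, 100.0, 1.0
x, v = 1.0, 0.0
for _ in range(200):
    x, v = respa_step(x, v, m, lambda y: -k_fast * y, lambda y: -k_slow * y,
                      dt_outer=0.05, n_inner=10)
energy = 0.5 * m * v**2 + 0.5 * (k_fast + k_slow) * x**2
```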

  9. A parallel multiple path tracing method based on OptiX for infrared image generation

    NASA Astrophysics Data System (ADS)

    Wang, Hao; Wang, Xia; Liu, Li; Long, Teng; Wu, Zimu

    2015-12-01

    Infrared image generation technology is being widely used in infrared imaging system performance evaluation, battlefield environment simulation and military personnel training, which require a more physically accurate and efficient method for infrared scene simulation. A parallel multiple path tracing method based on OptiX was proposed to solve the problem, which can not only increase computational efficiency compared to serial ray tracing using CPU, but also produce relatively accurate results. First, the flaws of current ray tracing methods in infrared simulation were analyzed and thus a multiple path tracing method based on OptiX was developed. Furthermore, the Monte Carlo integration was employed to solve the radiation transfer equation, in which the importance sampling method was applied to accelerate the integral convergent rate. After that, the framework of the simulation platform and its sensor effects simulation diagram were given. Finally, the results showed that the method could generate relatively accurate radiation images if a precise importance sampling method was available.
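
    The role of importance sampling in accelerating the convergence of a Monte Carlo integral can be shown on a one-dimensional stand-in integrand (hypothetical, chosen so that a density proportional to the integrand can be sampled in closed form by inverse transform):

```python
import numpy as np

rng = np.random.default_rng(1)
f = lambda x: np.exp(3.0 * x)     # target: I = (e^3 - 1)/3, about 6.36
n = 50_000

# Plain Monte Carlo with uniform samples on [0, 1)
est_uniform = f(rng.random(n)).mean()

# Importance sampling: draw from p(x) = 3 e^{3x} / (e^3 - 1), which is
# proportional to the integrand, and average the weighted values f/p
u = rng.random(n)
x = np.log(1.0 + u * (np.exp(3.0) - 1.0)) / 3.0   # inverse-CDF sampling
p = 3.0 * np.exp(3.0 * x) / (np.exp(3.0) - 1.0)
est_is = (f(x) / p).mean()
# Here p is exactly proportional to f, so the weighted samples are
# constant and the estimator has zero variance; in practice p only
# approximates f and the variance is reduced rather than eliminated.
```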

  10. Multiple Methods: Research Methods in Education Projects at NSF

    ERIC Educational Resources Information Center

    Suter, Larry E.

    2005-01-01

    Projects on science and mathematics education research supported by the National Science Foundation (US government) rarely employ a single method of study. Studies of educational practices that use experimental design are very rare. The most common research method is the case study method and the second most common is some form of experimental…

  11. Relationship between Multiple Regression and Selected Multivariable Methods.

    ERIC Educational Resources Information Center

    Schumacker, Randall E.

    The relationship of multiple linear regression to various multivariate statistical techniques is discussed. The importance of the standardized partial regression coefficient (beta weight) in multiple linear regression as it is applied in path, factor, LISREL, and discriminant analyses is emphasized. The multivariate methods discussed in this paper…

  12. In Silico Gene Prioritization by Integrating Multiple Data Sources

    PubMed Central

    Zhou, Yingyao; Shields, Robert; Chanda, Sumit K.; Elston, Robert C.; Li, Jing

    2011-01-01

    Identifying disease genes is crucial to the understanding of disease pathogenesis, and to the improvement of disease diagnosis and treatment. In recent years, many researchers have proposed approaches to prioritize candidate genes by considering the relationship of candidate genes and existing known disease genes, reflected in other data sources. In this paper, we propose an expandable framework for gene prioritization that can integrate multiple heterogeneous data sources by taking advantage of a unified graphic representation. Gene-gene relationships and gene-disease relationships are then defined based on the overall topology of each network using a diffusion kernel measure. These relationship measures are in turn normalized to derive an overall measure across all networks, which is utilized to rank all candidate genes. Based on the informativeness of available data sources with respect to each specific disease, we also propose an adaptive threshold score to select a small subset of candidate genes for further validation studies. We performed large-scale cross-validation analysis on 110 disease families using three data sources. Results have shown that our approach consistently outperforms two other state-of-the-art programs. A case study using Parkinson disease (PD) has identified four candidate genes (UBB, SEPT5, GPR37 and TH) that ranked higher than our adaptive threshold, all of which are involved in the PD pathway. In particular, a very recent study has observed a deletion of TH in a patient with PD, which supports the importance of the TH gene in PD pathogenesis. A web tool has been implemented to assist scientists in their genetic studies. PMID:21731658
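
    The diffusion-kernel relationship measure at the heart of this framework can be sketched on a toy gene network (using SciPy's matrix exponential; the network, the gene indices, and the beta value are hypothetical):

```python
import numpy as np
from scipy.linalg import expm

def diffusion_kernel(adj, beta=1.0):
    """Diffusion kernel K = exp(-beta * L), with L the graph Laplacian."""
    lap = np.diag(adj.sum(axis=1)) - adj
    return expm(-beta * lap)

def rank_candidates(adj, known, candidates, beta=1.0):
    """Score each candidate by its summed kernel affinity to the known
    disease genes, then rank candidates by decreasing score."""
    k = diffusion_kernel(adj, beta)
    scores = k[np.ix_(candidates, known)].sum(axis=1)
    order = np.argsort(-scores)
    return [candidates[i] for i in order], scores[order]

# Toy network: candidate gene 1 is tightly linked to known genes 0 and 2
adj = np.array([[0, 1, 1, 0, 0],
                [1, 0, 1, 0, 0],
                [1, 1, 0, 1, 0],
                [0, 0, 1, 0, 1],
                [0, 0, 0, 1, 0]], dtype=float)
ranked, scores = rank_candidates(adj, known=[0, 2], candidates=[1, 3, 4])
```

    Candidate 1, adjacent to both known genes, receives the highest score. The paper's framework additionally normalizes such scores across several networks before ranking.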

  13. Integrated Force Method for Indeterminate Structures

    NASA Technical Reports Server (NTRS)

    Hopkins, Dale A.; Halford, Gary R.; Patnaik, Surya N.

    2008-01-01

    Two methods of solving indeterminate structural-mechanics problems have been developed as products of research on the theory of strain compatibility. In these methods, stresses are considered to be the primary unknowns (in contrast to strains and displacements being considered as the primary unknowns in some prior methods). One of these methods, denoted the integrated force method (IFM), makes it possible to compute stresses, strains, and displacements with high fidelity by use of modest finite-element models that entail relatively small amounts of computation. The other method, denoted the completed Beltrami-Michell formulation (CBMF), enables direct determination of stresses in an elastic continuum with general boundary conditions, without the need to first calculate displacements as in traditional methods. The equilibrium equation, the compatibility condition, and the material law are the three fundamental concepts of the theory of structures. For almost 150 years, it has been commonly supposed that the theory is complete. However, until now, the understanding of the compatibility condition remained incomplete, and the compatibility condition was confused with the continuity condition. Furthermore, the compatibility condition as applied to structures in its previous incomplete form was inconsistent with the strain formulation in elasticity.

  14. Cardiac power integral: a new method for monitoring cardiovascular performance

    PubMed Central

    Rimehaug, Audun E; Lyng, Oddveig; Nordhaug, Dag O; Løvstakken, Lasse; Aadahl, Petter; Kirkeby-Garstad, Idar

    2013-01-01

    Cardiac power (PWR) is the continuous product of flow and pressure in the proximal aorta. Our aim was to validate the PWR integral as a marker of left ventricular energy transfer to the aorta, by comparing it to stroke work (SW) under multiple different loading and contractility conditions in subjects without obstructions in the left ventricular outflow tract. Six pigs under general anesthesia were equipped with transit time flow probes on their proximal aortas and Millar micromanometer catheters in their descending aortas to measure PWR, and Leycom conductance catheters in their left ventricles to measure SW. The PWR integral was calculated as the time integral of PWR per cardiac cycle. SW was calculated as the area encompassed by the pressure–volume loop (PV loop). The relationship between the PWR integral and SW was tested during extensive mechanical and pharmacological interventions that affected the loading conditions and myocardial contractility. The PWR integral displayed a strong correlation with SW in all pigs (R2 > 0.95, P < 0.05) under all conditions, using a linear model. Regression analysis and Bland-Altman plots also demonstrated a stable relationship. A mixed linear analysis indicated that the slope of the SW-to-PWR-integral relationship was similar among all six animals, whereas loading and contractility conditions tended to affect the slope. The PWR integral followed SW and appeared to be a promising parameter for monitoring the energy transferred from the left ventricle to the aorta. This conclusion motivates further studies to determine whether the PWR integral can be evaluated using less invasive methods, such as echocardiography combined with a radial artery catheter. PMID:24400160
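
    Numerically, the PWR integral is just the time integral of the pressure-flow product over one beat. A minimal sketch with a synthetic beat (the waveforms and units below are hypothetical, not the study's measured data):

```python
import numpy as np

def pwr_integral(pressure, flow, t):
    """Time integral of instantaneous power (pressure x flow) over one
    cardiac cycle, by the trapezoidal rule."""
    power = pressure * flow
    return float(np.sum(0.5 * (power[1:] + power[:-1]) * np.diff(t)))

# Synthetic beat: half-sine ejection flow against a rising pressure wave
t = np.linspace(0.0, 0.8, 801)  # one cardiac cycle, s
ejecting = t < 0.3
flow = np.where(ejecting, 400 * np.sin(np.pi * t / 0.3), 0.0)           # mL/s
pressure = np.where(ejecting, 80 + 40 * np.sin(np.pi * t / 0.3), 80.0)  # mmHg
beat_integral = pwr_integral(pressure, flow, t)  # mmHg*mL per beat
```

    Converting mmHg*mL to joules (1 mmHg*mL is about 1.333e-4 J) gives the work-like quantity that the study compares against stroke work from the PV loop.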

  16. Methods of Genomic Competency Integration in Practice

    PubMed Central

    Jenkins, Jean; Calzone, Kathleen A.; Caskey, Sarah; Culp, Stacey; Weiner, Marsha; Badzek, Laurie

    2015-01-01

    Purpose Genomics is increasingly relevant to health care, necessitating support for nurses to incorporate genomic competencies into practice. The primary aim of this project was to develop, implement, and evaluate a year-long genomic education intervention that trained, supported, and supervised institutional administrator and educator champion dyads to increase nursing capacity to integrate genomics through assessments of program satisfaction and institutional achieved outcomes. Design Longitudinal study of 23 Magnet Recognition Program® Hospitals (21 intervention, 2 controls) participating in a 1-year new competency integration effort aimed at increasing genomic nursing competency and overcoming barriers to genomics integration in practice. Methods Champion dyads underwent genomic training consisting of one in-person kick-off training meeting followed by monthly education webinars. Champion dyads designed institution-specific action plans detailing objectives, methods or strategies used to engage and educate nursing staff, timeline for implementation, and outcomes achieved. Action plans focused on a minimum of seven genomic priority areas: champion dyad personal development; practice assessment; policy content assessment; staff knowledge needs assessment; staff development; plans for integration; and anticipated obstacles and challenges. Action plans were updated quarterly, outlining progress made as well as inclusion of new methods or strategies. Progress was validated through virtual site visits with the champion dyads and chief nursing officers. Descriptive data were collected on all strategies or methods utilized, and timeline for achievement. Descriptive data were analyzed using content analysis. Findings The complexity of the competency content and the uniqueness of social systems and infrastructure resulted in a significant variation of champion dyad interventions. Conclusions Nursing champions can facilitate change in genomic nursing capacity through

  17. Generating nonlinear FM chirp radar signals by multiple integrations

    DOEpatents

    Doerry, Armin W [Albuquerque, NM

    2011-02-01

    A phase component of a nonlinear frequency modulated (NLFM) chirp radar pulse can be produced by performing digital integration operations over a time interval defined by the pulse width. Each digital integration operation includes applying to a respectively corresponding input parameter value a respectively corresponding number of instances of digital integration.
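
    The idea, building the phase of the pulse by repeated digital integration of a frequency-derivative profile over the pulse width, can be sketched with cumulative sums (the sample rate and the quadratic sweep below are hypothetical, not the patented parameter sets):

```python
import numpy as np

fs = 1e6                                   # sample rate, Hz (assumed)
n = 1000                                   # 1 ms pulse width at fs
T = n / fs                                 # pulse width, s

# Hypothetical quadratic sweep 0 -> 100 kHz over the pulse, specified
# only by its constant second derivative; each cumulative sum below is
# one digital integration over the pulse width.
f_ddot = np.full(n, 2 * 1e5 / T**2)        # d^2f/dt^2, Hz/s^2
f_dot = np.cumsum(f_ddot) / fs             # 1st integration: chirp rate, Hz/s
freq = np.cumsum(f_dot) / fs               # 2nd integration: frequency, Hz
phase = 2 * np.pi * np.cumsum(freq) / fs   # 3rd integration: phase, rad
pulse = np.cos(phase)                      # the NLFM chirp pulse
```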

  18. Multiple Frequency Contrast Source Inversion Method for Vertical Electromagnetic Profiling: 2D Simulation Results and Analyses

    NASA Astrophysics Data System (ADS)

    Li, Jinghe; Song, Linping; Liu, Qing Huo

    2016-02-01

    A simultaneous multiple frequency contrast source inversion (CSI) method is applied to reconstructing hydrocarbon reservoir targets in a complex multilayered medium in two dimensions. It simulates the effects of a salt dome sedimentary formation in the context of reservoir monitoring. In this method, the stabilized biconjugate-gradient fast Fourier transform (BCGS-FFT) algorithm is applied as a fast solver for the 2D volume integral equation for the forward computation. The inversion technique with CSI combines the efficient FFT algorithm to speed up the matrix-vector multiplication and the stable convergence of the simultaneous multiple frequency CSI in the iteration process. As a result, this method is capable of making quantitative conductivity image reconstruction effectively for large-scale electromagnetic oil exploration problems, including the vertical electromagnetic profiling (VEP) survey investigated here. A number of numerical examples have been demonstrated to validate the effectiveness and capacity of the simultaneous multiple frequency CSI method for a limited array view in VEP.

  19. Thermoelastic analysis of multiple defects with the extended finite element method

    NASA Astrophysics Data System (ADS)

    Jia, Honggang; Nie, Yufeng

    2016-12-01

    In this paper, the extended finite element method (XFEM) is adopted to analyze the interaction between a single macroscopic inclusion and a single macroscopic crack, as well as that between multiple macroscopic or microscopic defects, under thermal/mechanical load. The effects of different shapes of multiple inclusions on the material thermomechanical response are investigated, and the level set method is coupled with XFEM to analyze the interaction of multiple defects. Further, the discretized extended finite element approximations for thermoelastic problems of multiple defects under displacement or temperature fields are given. The interfaces of cracks or materials are represented by level set functions, which allows the mesh to be generated without conforming to crack or material interfaces. Moreover, stress intensity factors of cracks are obtained by the interaction integral method or the M-integral method, and the stress/strain/stiffness fields are simulated in the case of multiple cracks or multiple inclusions. Finally, some numerical examples are provided to demonstrate the accuracy of the proposed method.

  20. The integration of occupational therapy into primary care: a multiple case study design

    PubMed Central

    2013-01-01

    Background For over two decades occupational therapists have been encouraged to enhance their roles within primary care and focus on health promotion and prevention activities. While there is a clear fit between occupational therapy and primary care, there have been few practice examples, despite a growing body of evidence to support the role. In 2010, the province of Ontario, Canada provided funding to include occupational therapists as members of Family Health Teams, an interprofessional model of primary care. The integration of occupational therapists into this model of primary care is one of the first large scale initiatives of its kind in North America. The objective of the study was to examine how occupational therapy services are being integrated into primary care teams and understand the structures supporting the integration. Methods A multiple case study design was used to provide an in-depth description of the integration of occupational therapy. Four Family Health Teams with occupational therapists as part of the team were identified. Data collection included in-depth interviews, document analyses, and questionnaires. Results Each Family Health Team had a unique organizational structure that contributed to the integration of occupational therapy. Communication, trust and understanding of occupational therapy were key elements in the integration of occupational therapy into Family Health Teams, and were supported by a number of strategies including co-location, electronic medical records and team meetings. An understanding of occupational therapy was critical for integration into the team and physicians were less likely to understand the occupational therapy role than other health providers. Conclusion With an increased emphasis on interprofessional primary care, new professions will be integrated into primary healthcare teams. The study found that explicit strategies and structures are required to facilitate the integration of a new professional group

  1. Solution methods for very highly integrated circuits.

    SciTech Connect

    Nong, Ryan; Thornquist, Heidi K.; Chen, Yao; Mei, Ting; Santarelli, Keith R.; Tuminaro, Raymond Stephen

    2010-12-01

    While advances in manufacturing enable the fabrication of integrated circuits containing tens-to-hundreds of millions of devices, the time-sensitive modeling and simulation necessary to design these circuits poses a significant computational challenge. This is especially true for mixed-signal integrated circuits, where detailed performance analyses are necessary for the individual analog/digital circuit components as well as the full system. When the integrated circuit has millions of devices, performing a full system simulation is practically infeasible using currently available Electrical Design Automation (EDA) tools. The principal reason for this is the time required for the nonlinear solver to compute the solutions of large linearized systems during the simulation of these circuits. The research presented in this report aims to address the computational difficulties introduced by these large linearized systems by using Model Order Reduction (MOR) to (i) generate specialized preconditioners that accelerate the computation of the linear system solution and (ii) reduce the overall dynamical system size. MOR techniques attempt to produce macromodels that capture the desired input-output behavior of larger dynamical systems and enable substantial speedups in simulation time. Several MOR techniques that have been developed under the LDRD on 'Solution Methods for Very Highly Integrated Circuits' will be presented in this report. Among those presented are techniques for linear time-invariant dynamical systems that either extend current approaches or improve the time-domain performance of the reduced model using novel error bounds, and a new approach for linear time-varying dynamical systems that guarantees dimension reduction, which has not been proven before. Progress on preconditioning power grid systems using multi-grid techniques will be presented, as well as a framework for delivering MOR techniques to the user community using Trilinos and the Xyce circuit simulator.

  2. MBsums -- a Mathematica Package for the Representation of Mellin-Barnes Integrals by Multiple Sums

    NASA Astrophysics Data System (ADS)

    Ochman, M.; Riemann, T.

    Feynman integrals may be represented by the Mathematica packages AMBRE and MB as multiple Mellin-Barnes integrals. With the Mathematica package MBsums these Mellin-Barnes integrals are transformed into multiple sums.

  3. Integrability: mathematical methods for studying solitary waves theory

    NASA Astrophysics Data System (ADS)

    Wazwaz, Abdul-Majid

    2014-03-01

    In recent decades, substantial experimental research efforts have been devoted to linear and nonlinear physical phenomena. In particular, studies of integrable nonlinear equations in solitary waves theory have attracted intensive interest from mathematicians, with the principal goal of fostering the development of new methods, and physicists, who are seeking solutions that represent physical phenomena and to form a bridge between mathematical results and scientific structures. The aim for both groups is to build up our current understanding and facilitate future developments, develop more creative results and create new trends in the rapidly developing field of solitary waves. The notion of the integrability of certain partial differential equations occupies an important role in current and future trends, but a unified rigorous definition of the integrability of differential equations still does not exist. For example, an integrable model in the Painlevé sense may not be integrable in the Lax sense. The Painlevé sense indicates that the solution can be represented as a Laurent series in powers of some function that vanishes on an arbitrary surface with the possibility of truncating the Laurent series at finite powers of this function. The concept of Lax pairs introduces another meaning of the notion of integrability. The Lax pair formulates the integrability of nonlinear equation as the compatibility condition of two linear equations. However, it was shown by many researchers that the necessary integrability conditions are the existence of an infinite series of generalized symmetries or conservation laws for the given equation. The existence of multiple soliton solutions often indicates the integrability of the equation but other tests, such as the Painlevé test or the Lax pair, are necessary to confirm the integrability for any equation. In the context of completely integrable equations, studies are flourishing because these equations are able to describe the

  4. A fast and high performance multiple data integration algorithm for identifying human disease genes

    PubMed Central

    2015-01-01

    Background Integrating multiple data sources is indispensable for improving disease gene identification, not only because disease genes associated with similar genetic diseases tend to lie close to each other in various biological networks, but also because gene-disease associations are complex. Although various algorithms have been proposed to identify disease genes, their prediction performance and computational time still need improvement. Results In this study, we propose a fast, high-performance multiple data integration algorithm for identifying human disease genes. A posterior probability of each candidate gene being associated with individual diseases is calculated using a Bayesian analysis method and a binary logistic regression model. Two prior probability estimation strategies and two feature vector construction methods are developed to test the performance of the proposed algorithm. Conclusions The proposed algorithm not only generates predictions with high AUC scores, but also runs very fast. When only a single PPI network is employed, the AUC score is 0.769 using F2 as feature vectors, and the average running time for each leave-one-out experiment is only around 1.5 seconds. When three biological networks are integrated, the AUC score using F3 as feature vectors increases to 0.830, and the average running time for each leave-one-out experiment is only about 12.54 seconds. This performance is better than that of many existing algorithms. PMID:26399620
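    The scoring idea above — fit a binary logistic regression, read off a posterior probability for each held-out candidate gene, and evaluate with leave-one-out AUC — can be sketched as follows. This is a minimal illustration on synthetic two-dimensional features (standing in for feature vectors such as F2/F3), not the authors' implementation; the Bayesian prior estimation and network-integration steps are omitted.

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, n_iter=500):
    """Fit a binary logistic regression by plain gradient descent."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
        w -= lr * (X.T @ (p - y)) / len(y)
        b -= lr * np.mean(p - y)
    return w, b

def posterior(X, w, b):
    """Posterior probability of the positive (disease-associated) class."""
    return 1.0 / (1.0 + np.exp(-(X @ w + b)))

def auc(scores, labels):
    """AUC as the probability a positive outranks a negative (rank statistic)."""
    order = np.argsort(scores)
    ranks = np.empty(len(scores))
    ranks[order] = np.arange(1, len(scores) + 1)
    n_pos = labels.sum()
    n_neg = len(labels) - n_pos
    return (ranks[labels == 1].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

# toy data: two hypothetical features per candidate gene
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(1.0, 0.5, (30, 2)), rng.normal(-1.0, 0.5, (30, 2))])
y = np.r_[np.ones(30), np.zeros(30)]

# leave-one-out: score each gene with a model fit on all the others
scores = np.empty(len(y))
for i in range(len(y)):
    mask = np.arange(len(y)) != i
    w, b = fit_logistic(X[mask], y[mask])
    scores[i] = posterior(X[i:i + 1], w, b)[0]
print(f"LOO AUC: {auc(scores, y):.2f}")
```

    On well-separated synthetic classes like these, the leave-one-out AUC approaches 1.0; the values reported in the record (0.769–0.830) come from real, much noisier biological networks.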

  5. Integrating Multiple Data Views for Improved Malware Analysis

    SciTech Connect

    Anderson, Blake H.

    2014-01-31

    Exploiting multiple views of a program makes obfuscating the intended behavior of that program more difficult, allowing for better performance in classification, clustering, and phylogenetic reconstruction.

  6. Information Integration in Multiple Cue Judgment: A Division of Labor Hypothesis

    ERIC Educational Resources Information Center

    Juslin, Peter; Karlsson, Linnea; Olsson, Henrik

    2008-01-01

    There is considerable evidence that judgment is constrained to additive integration of information. The authors propose an explanation of why serial and additive cognitive integration can produce accurate multiple cue judgment both in additive and non-additive environments in terms of an adaptive division of labor between multiple representations.…

  7. Validating Measurement of Knowledge Integration in Science Using Multiple-Choice and Explanation Items

    ERIC Educational Resources Information Center

    Lee, Hee-Sun; Liu, Ou Lydia; Linn, Marcia C.

    2011-01-01

    This study explores measurement of a construct called knowledge integration in science using multiple-choice and explanation items. We use construct and instructional validity evidence to examine the role multiple-choice and explanation items play in measuring students' knowledge integration ability. For construct validity, we analyze item…

  8. Multiple Integrated Laser Engagement Simulation (MILES) Training and Evaluation Test (TET) Evaluator Guidebook

    DTIC Science & Technology

    1979-09-01

    Research Product 79-11: Multiple Integrated Laser Engagement Simulation (MILES) Training and Evaluation Test (TET) Evaluator Guidebook. Recoverable keywords from the scanned record: unit evaluation, engagement simulation, unit training diagnosis.

  9. Integration of Multiple Unmanned Systems in an Urban Search and Rescue Environment

    DTIC Science & Technology

    2013-03-01

    Naval Postgraduate School, Monterey, California. Master's thesis: Integration of Multiple Unmanned Systems in an Urban Search and Rescue Environment, by Boon Heng Chua, March 2013; thesis advisor: Oleg Yakimenko. Approved for public release; distribution is unlimited.

  10. Integrating stakeholder values with multiple attributes to quantify watershed performance

    NASA Astrophysics Data System (ADS)

    Shriver, Deborah M.; Randhir, Timothy O.

    2006-08-01

    Integrating stakeholder values into the process of quantifying impairment of ecosystem functions is an important aspect of watershed assessment and planning. This study develops a classification and prioritization model to assess potential impairment in watersheds. A systematic evaluation of a broad set of abiotic, biotic, and human indicators of watershed structure and function was used to identify the level of degradation at a subbasin scale. Agencies and communities can use the method to effectively target and allocate resources to areas of greatest restoration need. The watershed performance measure (WPM) developed in this study is composed of three major components: (1) hydrologic processes (water quantity and quality), (2) biodiversity at a species scale (core and priority habitat for rare and endangered species, and species richness) and landscape scale (impacts of fragmentation), and (3) urban impacts as assessed in the built environment (effective impervious area) and population effects (population densities and the density of toxic waste sites). Simulation modeling using the Soil and Water Assessment Tool (SWAT), monitoring information, and spatial analysis with GIS were used to assess each criterion in developing this model. Weights for attributes of potential impairment were determined through the use of the attribute prioritization procedure with a panel of expert stakeholders. This procedure uses preselected attributes and corresponding stakeholder values and is data intensive. The model was applied to all subbasins of the Chicopee River Watershed of western Massachusetts, an area with a mixture of rural, heavily forested lands and suburban and urbanized areas. Highly impaired subbasins in one community were identified using this methodology and evaluated for principal forms of degradation and potential restoration policies and best management practices (BMPs). This attribute-based prioritization method could be used in identifying baselines, prioritization policies, and adaptive community
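    The weighted-attribute aggregation at the heart of a composite measure like the WPM can be sketched in a few lines. The indicator names, values, and weights below are hypothetical; the study derives its weights from an expert stakeholder panel, not from these numbers.

```python
# hypothetical component names and weights (the paper's components are
# hydrology, biodiversity, and urban impacts; these numbers are illustrative)
weights = {"hydrology": 0.4, "biodiversity": 0.35, "urban": 0.25}

def wpm(indicators, weights):
    """Weighted sum of indicators normalized to [0, 1] (1 = most impaired)."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(weights[k] * indicators[k] for k in weights)

# two hypothetical subbasins with normalized impairment scores per component
subbasins = {
    "A": {"hydrology": 0.8, "biodiversity": 0.6, "urban": 0.9},
    "B": {"hydrology": 0.2, "biodiversity": 0.3, "urban": 0.1},
}

# rank subbasins by composite impairment to target restoration resources
ranked = sorted(subbasins, key=lambda s: wpm(subbasins[s], weights), reverse=True)
print(ranked[0])  # prints A, the subbasin with greatest restoration need
```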

  11. Integration Strategies for Learners with Severe Multiple Disabilities.

    ERIC Educational Resources Information Center

    Eichinger, Joanne; Woltman, Sheila

    1993-01-01

    This article reports the experiences of one school district as it moved from serving students with severe disabilities in segregated programs to a full inclusion model. Year one focused on getting started, planning, and beginning integration efforts and year two on implementation of a structured peer integration program. Applicability of the full…

  12. Multiple kernel learning with random effects for predicting longitudinal outcomes and data integration.

    PubMed

    Chen, Tianle; Zeng, Donglin; Wang, Yuanjia

    2015-12-01

    Predicting disease risk and progression is one of the main goals in many clinical research studies. Cohort studies on the natural history and etiology of chronic diseases span years, and data are collected at multiple visits. Although kernel-based statistical learning methods have proven powerful for a wide range of disease prediction problems, they are well studied only for independent data, not for longitudinal data. It is thus important to develop time-sensitive prediction rules that make use of the longitudinal nature of the data. In this paper, we develop a novel statistical learning method for longitudinal data by introducing subject-specific short-term and long-term latent effects through a designed kernel to account for within-subject correlation of longitudinal measurements. Since the presence of multiple sources of data is increasingly common, we embed our method in a multiple kernel learning framework and propose regularized multiple kernel statistical learning with random effects to construct effective nonparametric prediction rules. Our method allows easy integration of various heterogeneous data sources and takes advantage of correlation among longitudinal measures to increase prediction power. We use a different kernel for each data source, taking advantage of the distinctive features of each data modality, and then optimally combine data across modalities. We apply the developed methods to two large epidemiological studies, one on Huntington's disease and the other on Alzheimer's disease (the Alzheimer's Disease Neuroimaging Initiative, ADNI), where we explore a unique opportunity to combine imaging and genetic data to study prediction of mild cognitive impairment, and show a substantial gain in performance while accounting for the longitudinal aspect of the data. © 2015, The International Biometric Society.
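    The multiple kernel idea — one Gram matrix per data source, combined into a single kernel — can be sketched as below. The data, kernel choices, and fixed weights are illustrative assumptions; the paper's method learns the combination and adds subject-specific random-effect kernels for the longitudinal structure, which this sketch omits.

```python
import numpy as np

def rbf_kernel(X, gamma):
    """Gaussian (RBF) Gram matrix for one data source."""
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def linear_kernel(X):
    """Linear Gram matrix for another data source."""
    return X @ X.T

# two hypothetical modalities measured on the same 5 subjects
rng = np.random.default_rng(1)
X_imaging = rng.normal(size=(5, 4))   # e.g. imaging-derived features
X_genetic = rng.normal(size=(5, 8))   # e.g. genetic features

K1 = rbf_kernel(X_imaging, gamma=0.5)
K2 = linear_kernel(X_genetic)
K2 /= np.trace(K2) / len(K2)          # trace-normalize so sources are comparable

weights = np.array([0.6, 0.4])        # in practice learned by the MKL optimizer
K = weights[0] * K1 + weights[1] * K2

# a convex combination of positive semidefinite kernels is itself PSD,
# so K is a valid kernel for any downstream kernel method
print(np.all(np.linalg.eigvalsh(K) > -1e-9))  # prints True
```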

  13. Path Integral Monte Carlo Methods for Fermions

    NASA Astrophysics Data System (ADS)

    Ethan, Ethan; Dubois, Jonathan; Ceperley, David

    2014-03-01

    In general, Quantum Monte Carlo methods suffer from a sign problem when simulating fermionic systems. This causes the efficiency of a simulation to decrease exponentially with the number of particles and inverse temperature. To circumvent this issue, a nodal constraint is often implemented, restricting the Monte Carlo procedure from sampling paths that cause the many-body density matrix to change sign. Unfortunately, this high-dimensional nodal surface is not known a priori unless the system is exactly solvable, resulting in uncontrolled errors. We will discuss two possible routes to extend the applicability of finite-temperature path integral Monte Carlo. First, we extend the regime where signful simulations are possible through a novel permutation sampling scheme. Afterwards, we discuss a method to variationally improve the nodal surface by minimizing a free energy during simulation. Applications of these methods will include both free and interacting electron gases, concluding with a discussion of extensions to inhomogeneous systems. Support from DOE DE-FG52-09NA29456, DE-AC52-07NA27344, LLNL LDRD 10-ERD-058, and the Lawrence Scholar program.

  14. Integrated argument-based inquiry with multiple representation approach to promote scientific argumentation skill

    NASA Astrophysics Data System (ADS)

    Suminar, Iin; Muslim, Liliawati, Winny

    2017-05-01

    The purpose of this research was to identify students' written arguments embedded in scientific inquiry investigations and their argumentation skill using integrated argument-based inquiry with a multiple representation approach. This research used a quasi-experimental method with the nonequivalent pretest-posttest control group design. The sample was 10th grade students at one high school in Bandung, in two classes: 26 students in the experiment class and 26 students in the control class. The experiment class used integrated argument-based inquiry with a multiple representation approach, while the control class used argument-based inquiry. This study used an argumentation worksheet and an argumentation test. The argumentation worksheet encouraged students to formulate research questions, design experiments, observe experiments and explain the data as evidence, construct claims and warrants, embed multiple modes of representation, and reflect. The argumentation test included problems asking students to explain the evidence, warrants, and backings supporting each claim. The results show that the argumentation skill of the experiment class students was better than that of the control class students: the experiment class scored 0.47 and the control class 0.31. The results of an unequal-variance t-test for independent means show that the argumentation skill of the experiment class was significantly better than that of the control class.

  15. Pooling Data from Multiple Longitudinal Studies: The Role of Item Response Theory in Integrative Data Analysis

    PubMed Central

    Curran, Patrick J.; Hussong, Andrea M.; Cai, Li; Huang, Wenjing; Chassin, Laurie; Sher, Kenneth J.; Zucker, Robert A.

    2010-01-01

    There are a number of significant challenges encountered when studying development over an extended period of time, including subject attrition, changing measurement structures across group and developmental period, and the need to invest substantial time and money. Integrative data analysis is an emerging set of methodologies that overcomes many of the challenges of single-sample designs through the pooling of data drawn from multiple existing developmental studies. This approach is characterized by a host of advantages, but it also introduces several new complexities that must be addressed prior to broad adoption by developmental researchers. In this paper, we focus on methods for fitting measurement models and creating scale scores using data drawn from multiple longitudinal studies. We present findings from the analysis of repeated measures of internalizing symptomatology that were pooled from three existing developmental studies. We describe and demonstrate each step in the analysis, and we conclude with a discussion of potential limitations and directions for future research. PMID:18331129
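    Integrative data analysis of this kind typically rests on an item response theory (IRT) measurement model, with items shared across studies anchoring scores to a common scale. A minimal sketch of a two-parameter logistic (2PL) item model follows; the parameter values are illustrative, not taken from the paper.

```python
import math

def p_2pl(theta, a, b):
    """2PL IRT model: probability of endorsing an item given latent trait
    level theta, item discrimination a, and item difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# hypothetical common item (a = 1.5, b = 0.0) administered in every study:
# because the item parameters are shared, a respondent from any study with
# the same theta has the same endorsement probability, putting all pooled
# respondents on one scale
for theta in (-1.0, 0.0, 1.0):
    print(f"theta={theta:+.1f}  P(endorse)={p_2pl(theta, 1.5, 0.0):.3f}")
```

    At theta equal to the item difficulty, the endorsement probability is exactly 0.5, and it rises monotonically with theta; calibrating the shared items once and fixing their parameters is what lets scale scores be compared across the pooled studies.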

  16. Curriculum Integration in Arts Education: Connecting Multiple Art Forms through the Idea of "Space"

    ERIC Educational Resources Information Center

    Bautista, Alfredo; Tan, Liang See; Ponnusamy, Letchmi Devi; Yau, Xenia

    2016-01-01

    Arts integration research has focused on documenting how the teaching of specific art forms can be integrated with "core" academic subject matters (e.g. science, mathematics and literacy). However, the question of how the teaching of multiple art forms themselves can be integrated in schools remains to be explored by educational…

  18. Identification of Functional Modules by Integration of Multiple Data Sources Using a Bayesian Network Classifier

    PubMed Central

    Wang, Jinlian; Zuo, Yiming; Liu, Lun; Man, Yangao; Tadesse, Mahlet G.; Ressom, Habtom W

    2014-01-01

    Background Prediction of functional modules is indispensable for detecting protein deregulation in human complex diseases such as cancer. A Bayesian network (BN) is one of the most commonly used models to integrate heterogeneous data from multiple sources such as protein domain, interactome, functional annotation, genome-wide gene expression, and the literature. Methods and Results In this paper, we present a BN classifier that is customized to: 1) increase the ability to integrate diverse information from different sources, 2) effectively predict protein-protein interactions, 3) infer aberrant networks with scale-free and small-world properties, and 4) group molecules into functional modules or pathways based on the primary function and biological features. Application of this model to discovering protein biomarkers of hepatocellular carcinoma (HCC) leads to the identification of functional modules that provide insights into the mechanism of the development and progression of HCC. These functional modules include cell cycle deregulation, increased angiogenesis (e.g., vascular endothelial growth factor, blood vessel morphogenesis), oxidative metabolic alterations, and aberrant activation of signaling pathways involved in cellular proliferation, survival, and differentiation. Conclusion The discoveries and conclusions derived from our customized BN classifier are consistent with previously published results. The proposed approach for determining BN structure facilitates the integration of heterogeneous data from multiple sources to elucidate the mechanisms of complex diseases. PMID:24736851

  19. Evaluating real-time immunohistochemistry on multiple tissue samples, multiple targets and multiple antibody labeling methods

    PubMed Central

    2013-01-01

    Background Immunohistochemistry (IHC) is a well-established method for the analysis of protein expression in tissue specimens and constitutes one of the most common methods performed in pathology laboratories worldwide. However, IHC is a multi-layered method based on subjective estimations, and differences in staining and interpretation have been observed between facilities, suggesting that the analysis of proteins on tissue would benefit from protocol optimization and standardization. Here we describe how the emerging, operator-independent tool of real-time immunohistochemistry (RT-IHC) reveals a time-resolved description of the antibody interacting with its target protein in formalin-fixed, paraffin-embedded tissue. The aim was to understand the technical aspects of RT-IHC, regarding generalization of the concept and to what extent it can be considered a quantitative method. Results Three different antibodies labeled with fluorescent or radioactive labels were applied to nine different tissue samples from either human or mouse, and the results for all RT-IHC analyses distinctly show that the method is generally applicable. The collected binding curves showed that the majority of the antibody-antigen interactions did not reach equilibrium within 3 hours, suggesting that standardized protocols for immunohistochemistry are sometimes inadequately optimized. The impact of tissue size and thickness, as well as the position of the section on the glass petri dish, was assessed to further elucidate practical details of this emerging technique. Size and location were found to affect signal magnitude to a larger extent than thickness, but the signal from all measurements was still sufficient to trace the curvature. The curvature, representing the kinetics of the interaction, was independent of thickness, size and position and may be a promising parameter for the evaluation of e.g. biopsy sections of different sizes. 
Conclusions It was found that RT-IHC can be used
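    The one-phase association model commonly used for such binding curves, B(t) = Bmax·(1 − exp(−k·t)), illustrates why curvature can be compared across sections of different sizes: the rate constant is independent of signal magnitude. A sketch with made-up amplitudes and rate, not data from the study:

```python
import math

def binding(t, b_max, k_obs):
    """One-phase association: B(t) = Bmax * (1 - exp(-k_obs * t))."""
    return b_max * (1.0 - math.exp(-k_obs * t))

def estimate_k(times, values, b_max):
    """Recover k_obs by linearizing: -ln(1 - B/Bmax) = k_obs * t,
    then fitting the slope through the origin by least squares."""
    xs = [-math.log(1.0 - v / b_max) for v in values]
    return sum(x * t for x, t in zip(xs, times)) / sum(t * t for t in times)

times = [10, 30, 60, 90, 120, 180]  # minutes (hypothetical sampling grid)
large = [binding(t, 100.0, 0.02) for t in times]  # large section, strong signal
small = [binding(t, 10.0, 0.02) for t in times]   # small section, weak signal

k_large = estimate_k(times, large, 100.0)
k_small = estimate_k(times, small, 10.0)
print(abs(k_large - k_small) < 1e-9)  # prints True: same curvature at 10x signal
```

    This is the sense in which curvature, unlike raw signal magnitude, could serve as a size-independent readout for biopsy sections.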

  20. Evaluating real-time immunohistochemistry on multiple tissue samples, multiple targets and multiple antibody labeling methods.

    PubMed

    Dubois, Louise; Andersson, Karl; Asplund, Anna; Björkelund, Hanna

    2013-12-18

    Immunohistochemistry (IHC) is a well-established method for the analysis of protein expression in tissue specimens and constitutes one of the most common methods performed in pathology laboratories worldwide. However, IHC is a multi-layered method based on subjective estimations, and differences in staining and interpretation have been observed between facilities, suggesting that the analysis of proteins on tissue would benefit from protocol optimization and standardization. Here we describe how the emerging, operator-independent tool of real-time immunohistochemistry (RT-IHC) reveals a time-resolved description of the antibody interacting with its target protein in formalin-fixed, paraffin-embedded tissue. The aim was to understand the technical aspects of RT-IHC, regarding generalization of the concept and to what extent it can be considered a quantitative method. Three different antibodies labeled with fluorescent or radioactive labels were applied to nine different tissue samples from either human or mouse, and the results for all RT-IHC analyses distinctly show that the method is generally applicable. The collected binding curves showed that the majority of the antibody-antigen interactions did not reach equilibrium within 3 hours, suggesting that standardized protocols for immunohistochemistry are sometimes inadequately optimized. The impact of tissue size and thickness, as well as the position of the section on the glass petri dish, was assessed to further elucidate practical details of this emerging technique. Size and location were found to affect signal magnitude to a larger extent than thickness, but the signal from all measurements was still sufficient to trace the curvature. The curvature, representing the kinetics of the interaction, was independent of thickness, size and position and may be a promising parameter for the evaluation of e.g. biopsy sections of different sizes. 
It was found that RT-IHC can be used for the evaluation of a number

  1. An Integrated Experimental Design for the Assessment of Multiple Toxicological End Points in Rat Bioassays.

    PubMed

    Manservisi, Fabiana; Marquillas, Clara Babot; Buscaroli, Annalisa; Huff, James; Lauriola, Michelina; Mandrioli, Daniele; Manservigi, Marco; Panzacchi, Simona; Silbergeld, Ellen K; Belpoggi, Fiorella

    2017-03-01

    For nearly five decades, long-term studies in rodents have been the accepted benchmark for assessing chronic long-term toxic effects, particularly carcinogenicity, of chemicals. The European Food Safety Authority (EFSA) and the World Health Organization (WHO) have pointed out that the current set of internationally utilized test methods capture only some of the potential adverse effects associated with exposures to these agents over the lifetime. In this paper, we propose the adaptation of the carcinogenicity bioassay to integrate additional protocols for comprehensive long-term toxicity assessment that includes developmental exposures and long-term outcomes, capable of generating information on a broad spectrum of different end points. An integrated study design based on a stepwise process is described that includes the priority end points of the Organisation for Economic Co-operation and Development (OECD) and the National Toxicology Program guidelines on carcinogenicity and chronic toxicity and developmental and reproductive toxicity. Integrating a comprehensive set of relevant toxicological end points in a single protocol represents an opportunity to optimize animal use in accordance with the 3Rs (replacement, reduction and refinement). This strategy has the potential to provide sufficient data on multiple windows of susceptibility of specific interest for risk assessments and public health decision-making by including prenatal, lactational, and neonatal exposures and evaluating outcomes over the lifespan. This integrated study design is efficient in that the same generational cohort of rats used for evaluating long-term outcomes can be monitored in satellite parallel experiments to measure biomarkers and other parameters related to system-specific responses including metabolic alterations and endocrine disturbances. Citation: Manservisi F, Babot Marquillas C, Buscaroli A, Huff J, Lauriola M, Mandrioli D, Manservigi M, Panzacchi S, Silbergeld EK, Belpoggi F. 2017. 
An integrated experimental

  2. An Integrated Experimental Design for the Assessment of Multiple Toxicological End Points in Rat Bioassays

    PubMed Central

    Manservisi, Fabiana; Marquillas, Clara Babot; Buscaroli, Annalisa; Huff, James; Lauriola, Michelina; Mandrioli, Daniele; Manservigi, Marco; Panzacchi, Simona; Silbergeld, Ellen K.; Belpoggi, Fiorella

    2016-01-01

    Background: For nearly five decades, long-term studies in rodents have been the accepted benchmark for assessing chronic long-term toxic effects, particularly carcinogenicity, of chemicals. The European Food Safety Authority (EFSA) and the World Health Organization (WHO) have pointed out that the current set of internationally utilized test methods capture only some of the potential adverse effects associated with exposures to these agents over the lifetime. Objectives: In this paper, we propose the adaptation of the carcinogenicity bioassay to integrate additional protocols for comprehensive long-term toxicity assessment that includes developmental exposures and long-term outcomes, capable of generating information on a broad spectrum of different end points. Discussion: An integrated study design based on a stepwise process is described that includes the priority end points of the Organisation for Economic Co-operation and Development (OECD) and the National Toxicology Program guidelines on carcinogenicity and chronic toxicity and developmental and reproductive toxicity. Integrating a comprehensive set of relevant toxicological end points in a single protocol represents an opportunity to optimize animal use in accordance with the 3Rs (replacement, reduction and refinement). This strategy has the potential to provide sufficient data on multiple windows of susceptibility of specific interest for risk assessments and public health decision-making by including prenatal, lactational, and neonatal exposures and evaluating outcomes over the lifespan. Conclusion: This integrated study design is efficient in that the same generational cohort of rats used for evaluating long-term outcomes can be monitored in satellite parallel experiments to measure biomarkers and other parameters related to system-specific responses including metabolic alterations and endocrine disturbances. 
Citation: Manservisi F, Babot Marquillas C, Buscaroli A, Huff J, Lauriola M, Mandrioli D, Manservigi M, Panzacchi S, Silbergeld

  3. A Method for Weight Multiplicity Computation Based on Berezin Quantization

    NASA Astrophysics Data System (ADS)

    Bar-Moshe, David

    2009-09-01

    Let G be a compact semisimple Lie group and T a maximal torus of G. We describe a method for weight multiplicity computation in unitary irreducible representations of G, based on the theory of Berezin quantization on G/T. Let Γhol(Lλ) be the reproducing kernel Hilbert space of holomorphic sections of the homogeneous line bundle Lλ over G/T associated with the highest weight λ of the irreducible representation πλ of G. The multiplicity of a weight m in πλ is computed from the functional analytical structure of the Berezin symbol of the projector in Γhol(Lλ) onto the subspace of weight m. We describe the construction of this symbol and the evaluation of the weight multiplicity as the rank of a Hermitian form, and illustrate the application of the method in a number of examples.

  4. Retrieving and Integrating Data from Multiple Information Sources

    DTIC Science & Technology

    1993-04-30

    Recoverable fragment from the scanned record: in other systems the domain model is used to build axioms, but after the axioms are built the domain model is no longer used or needed. In contrast, the domain model in SIMS is an integral part of the system, as in intelligent information systems where, like SIMS, an explicit knowledge model is an integral part of an intelligent information agent.

  5. Using Images, Metaphor, and Hypnosis in Integrating Multiple Personality and Dissociative States: A Review of the Literature.

    ERIC Educational Resources Information Center

    Crawford, Carrie L.

    1990-01-01

    Reviews literature on hypnosis, imagery, and metaphor as applied to the treatment and integration of those with multiple personality disorder (MPD) and dissociative states. Considers diagnostic criteria of MPD; explores current theories of etiology and treatment; and suggests specific examples of various clinical methods of treatment using…

  7. Multiple imputation methods for bivariate outcomes in cluster randomised trials.

    PubMed

    DiazOrdaz, K; Kenward, M G; Gomes, M; Grieve, R

    2016-09-10

    Missing observations are common in cluster randomised trials. The problem is exacerbated when modelling bivariate outcomes jointly, as the proportion of complete cases is often considerably smaller than the proportion having either of the outcomes fully observed. Approaches taken to handling such missing data include the following: complete case analysis, single-level multiple imputation that ignores the clustering, multiple imputation with a fixed effect for each cluster and multilevel multiple imputation. We contrasted the alternative approaches to handling missing data in a cost-effectiveness analysis that uses data from a cluster randomised trial to evaluate an exercise intervention for care home residents. We then conducted a simulation study to assess the performance of these approaches on bivariate continuous outcomes, in terms of confidence interval coverage and empirical bias in the estimated treatment effects. Missing-at-random clustered data scenarios were simulated following a full-factorial design. Across all the missing data mechanisms considered, the multiple imputation methods provided estimators with negligible bias, while complete case analysis resulted in biased treatment effect estimates in scenarios where the randomised treatment arm was associated with missingness. Confidence interval coverage was generally in excess of nominal levels (up to 99.8%) following fixed-effects multiple imputation and too low following single-level multiple imputation. Multilevel multiple imputation led to coverage levels of approximately 95% throughout. © 2016 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.
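    Whichever imputation model is chosen, the m completed-data analyses are combined with Rubin's rules: the pooled estimate is the mean of the per-imputation estimates, and the total variance adds the within-imputation variance to an inflated between-imputation component. A minimal sketch with made-up numbers (not data from the trial):

```python
import numpy as np

def pool_rubin(estimates, variances):
    """Combine m completed-data analyses via Rubin's rules."""
    m = len(estimates)
    q_bar = np.mean(estimates)       # pooled point estimate
    w = np.mean(variances)           # within-imputation variance
    b = np.var(estimates, ddof=1)    # between-imputation variance
    t = w + (1 + 1 / m) * b          # total variance of the pooled estimate
    return q_bar, t

# hypothetical treatment-effect estimates and variances from m = 5 imputations
est = [1.8, 2.1, 2.0, 1.9, 2.2]
var = [0.25, 0.30, 0.28, 0.26, 0.31]
q, t = pool_rubin(est, var)
print(round(q, 2), round(t, 3))  # → 2.0 0.31
```

    The (1 + 1/m) factor is what penalises using too few imputations: the more the estimates disagree across imputed datasets, the wider the pooled confidence interval becomes.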

  8. Method for measuring multiple scattering corrections between liquid scintillators

    SciTech Connect

    Verbeke, J. M.; Glenn, A. M.; Keefer, G. J.; Wurtz, R. E.

    2016-04-11

    In this study, a time-of-flight method is proposed to experimentally quantify the fractions of neutrons scattering between scintillators. An array of scintillators is characterized in terms of crosstalk with this method by measuring a californium source, for different neutron energy thresholds. The spectral information recorded by the scintillators can be used to estimate the fractions of neutrons multiple scattering. With the help of a correction to Feynman's point model theory to account for multiple scattering, these fractions can in turn improve the mass reconstruction of fissile materials under investigation.

  9. Method for measuring multiple scattering corrections between liquid scintillators

    NASA Astrophysics Data System (ADS)

    Verbeke, J. M.; Glenn, A. M.; Keefer, G. J.; Wurtz, R. E.

    2016-07-01

    A time-of-flight method is proposed to experimentally quantify the fractions of neutrons scattering between scintillators. An array of scintillators is characterized in terms of crosstalk with this method by measuring a californium source, for different neutron energy thresholds. The spectral information recorded by the scintillators can be used to estimate the fractions of neutrons multiple scattering. With the help of a correction to Feynman's point model theory to account for multiple scattering, these fractions can in turn improve the mass reconstruction of fissile materials under investigation.

  10. Method for measuring multiple scattering corrections between liquid scintillators

    DOE PAGES

    Verbeke, J. M.; Glenn, A. M.; Keefer, G. J.; ...

    2016-04-11

    In this study, a time-of-flight method is proposed to experimentally quantify the fractions of neutrons scattering between scintillators. An array of scintillators is characterized in terms of crosstalk with this method by measuring a californium source, for different neutron energy thresholds. The spectral information recorded by the scintillators can be used to estimate the fractions of neutrons multiple scattering. With the help of a correction to Feynman's point model theory to account for multiple scattering, these fractions can in turn improve the mass reconstruction of fissile materials under investigation.

  11. Parallel methods for dynamic simulation of multiple manipulator systems

    NASA Technical Reports Server (NTRS)

    Mcmillan, Scott; Sadayappan, P.; Orin, David E.

    1993-01-01

    In this paper, efficient dynamic simulation algorithms for a system of m manipulators, cooperating to manipulate a large load, are developed; their performance, using two possible forms of parallelism on a general-purpose parallel computer, is investigated. One form, temporal parallelism, is obtained with the use of parallel numerical integration methods. A speedup of 3.78 on four processors of a CRAY Y-MP8 was achieved with a parallel four-point block predictor-corrector method for the simulation of a four manipulator system. These multi-point methods suffer from reduced accuracy, and when comparing these runs with a serial integration method, the speedup can be as low as 1.83 for simulations with the same accuracy. To regain the performance lost due to accuracy problems, a second form of parallelism is employed. Spatial parallelism allows most of the dynamics of each manipulator chain to be computed simultaneously. Used exclusively in the four processor case, this form of parallelism in conjunction with a serial integration method results in a speedup of 3.1 on four processors over the best serial method. In cases where there are either more processors available or fewer chains in the system, the multi-point parallel integration methods are still advantageous despite the reduced accuracy because both forms of parallelism can then combine to generate more parallel tasks and achieve greater effective speedups. This paper also includes results for these cases.
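As a serial point of reference for the parallel block predictor-corrector methods described above, a minimal two-step predictor-corrector (PECE) integrator can be sketched as follows (the test problem dy/dt = -y is our own illustration, not the manipulator dynamics):

```python
# Minimal serial PECE integrator: two-step Adams-Bashforth predictor followed
# by one Adams-Moulton (trapezoidal) correction. Parallel block methods
# distribute several such steps across processors at once.
import math

def pece(f, y0, t0, t1, n):
    h = (t1 - t0) / n
    t, y = t0, y0
    f_prev = f(t, y)
    # bootstrap the multistep method with one Heun (predictor-corrector) step
    yp = y + h * f_prev
    y = y + 0.5 * h * (f_prev + f(t + h, yp))
    t += h
    for _ in range(n - 1):
        f_curr = f(t, y)
        y_pred = y + h * (1.5 * f_curr - 0.5 * f_prev)   # AB2 predict
        y = y + 0.5 * h * (f_curr + f(t + h, y_pred))    # trapezoidal correct
        f_prev = f_curr
        t += h
    return y

approx = pece(lambda t, y: -y, 1.0, 0.0, 1.0, 1000)
print(abs(approx - math.exp(-1)))  # small second-order discretization error
```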

  12. Multi-Scale Segmentation of High Resolution Remote Sensing Images by Integrating Multiple Features

    NASA Astrophysics Data System (ADS)

    Di, Y.; Jiang, G.; Yan, L.; Liu, H.; Zheng, S.

    2017-05-01

    Most multi-scale segmentation algorithms are not designed for high resolution remote sensing images and have difficulty communicating and using information across layers. In view of this, we propose a method for multi-scale segmentation of high resolution remote sensing images that integrates multiple features. First, the Canny operator is used to extract edge information, and a band-weighted distance function is built to obtain the edge weights. According to this criterion, the initial segmentation objects of color images are obtained by Kruskal's minimum spanning tree algorithm. Finally, segmented images are produced by an adaptive Mumford-Shah region-merging rule that combines spectral and texture information. The proposed method is evaluated using simulated images and ZY-3 satellite images through quantitative and qualitative analysis. The experimental results show that the proposed method outperforms the fractal net evolution approach (FNEA) of the eCognition software in accuracy, while being slightly inferior to FNEA in efficiency.
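The Kruskal-style merging step can be illustrated on a toy one-dimensional "image" (a sketch of the general graph-merging idea, not the paper's band-weighted criterion; the pixel values and threshold are hypothetical):

```python
# Toy graph-based segmentation: pixels are nodes, neighbouring pixels are
# joined by edges weighted by intensity difference, and edges are processed
# in increasing weight order (Kruskal order), merging components while the
# edge weight stays under a threshold.
def segment(values, threshold):
    parent = list(range(len(values)))
    def find(i):                         # union-find with path halving
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i
    edges = sorted(
        (abs(values[i + 1] - values[i]), i, i + 1) for i in range(len(values) - 1)
    )
    for w, a, b in edges:                # cheapest edges first
        if w < threshold:
            parent[find(a)] = find(b)
    return [find(i) for i in range(len(values))]

labels = segment([1, 1, 2, 10, 10, 11], threshold=3)
print(len(set(labels)))  # two segments: the low and high intensity runs
```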

  13. Biomarker detection in the integration of multiple multi-class genomic studies

    PubMed Central

    Lu, Shuya; Li, Jia; Song, Chi; Shen, Kui; Tseng, George C.

    2010-01-01

    Motivation: Systematic information integration of multiple-related microarray studies has become an important issue as the technology becomes mature and prevalent in the past decade. The aggregated information provides more robust and accurate biomarker detection. So far, published meta-analysis methods for this purpose mostly consider two-class comparison. Methods for combining multi-class studies and considering expression pattern concordance are rarely explored. Results: In this article, we develop three integration methods for biomarker detection in multiple multi-class microarray studies: ANOVA-maxP, min-MCC and OW-min-MCC. We first consider a natural extension of combining P-values from the traditional ANOVA model. Since P-values from ANOVA are not guaranteed to reflect the concordant expression pattern information across studies, we propose a multi-class correlation (MCC) measure to specifically seek biomarkers of concordant inter-class patterns across a pair of studies. For both ANOVA and MCC approaches, we use extreme order statistics to identify biomarkers differentially expressed (DE) in all studies (i.e. ANOVA-maxP and min-MCC). The min-MCC method is further extended to identify biomarkers DE in partial studies by incorporating a recently developed optimally weighted (OW) technique (OW-min-MCC). All methods are evaluated by simulation studies and by three meta-analysis applications to multi-tissue mouse metabolism datasets, multi-condition mouse trauma datasets and multi-malignant-condition human prostate cancer datasets. The results show complementary strength of the three methods for different biological purposes. Availability: http://www.biostat.pitt.edu/bioinfo/ Contact: ctseng@pitt.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:19965884
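The extreme-order-statistic idea behind ANOVA-maxP can be sketched in a few lines (an illustration of the general maxP principle for "DE in all studies", not the authors' implementation; the p-values are hypothetical):

```python
# maxP combination: a gene is declared differentially expressed in *all* K
# studies only if even its worst (largest) per-study p-value is small. Under
# the null, K independent uniform p-values have max with CDF x**K, so the
# combined p-value is simply max(p)**K.
def maxP_combine(pvalues):
    k = len(pvalues)
    return max(pvalues) ** k

print(maxP_combine([0.01, 0.03, 0.02]))  # (0.03)**3, strong evidence in all three
```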

  14. Integrative Data Analysis: The Simultaneous Analysis of Multiple Data Sets

    ERIC Educational Resources Information Center

    Curran, Patrick J.; Hussong, Andrea M.

    2009-01-01

    There are both quantitative and methodological techniques that foster the development and maintenance of a cumulative knowledge base within the psychological sciences. Most noteworthy of these techniques is meta-analysis, which allows for the synthesis of summary statistics drawn from multiple studies when the original data are not available.…

  15. On Riemann zeroes, lognormal multiplicative chaos, and Selberg integral

    NASA Astrophysics Data System (ADS)

    Ostrovsky, Dmitry

    2016-02-01

    Rescaled Mellin-type transforms of the exponential functional of the Bourgade-Kuan-Rodgers statistic of Riemann zeroes are conjecturally related to the distribution of the total mass of the limit lognormal stochastic measure of Mandelbrot-Bacry-Muzy. The conjecture implies that a non-trivial, log-infinitely divisible probability distribution is associated with Riemann zeroes. For application, integral moments, covariance structure, multiscaling spectrum, and asymptotics associated with the exponential functional are computed in closed form using the known meromorphic extension of the Selberg integral.

  16. New compensation method for bulk optical sensors with multiple birefringences.

    PubMed

    Lee, K S

    1989-06-01

    The dielectric tensor of an anisotropic crystal is presented in a form that includes the effects of multiple perturbations. To study electromagnetic wave propagation in anisotropic crystals subject to various influences, the perturbed dielectric tensor is substituted into Maxwell's equations. Then, a 2 x 2 transmission matrix formalism, based on a normal-mode approach, is extended to anisotropic crystals possessing multiple birefringences in order to develop compensation schemes for ac optical sensors employing such crystals. It is shown that a new compensation method utilizing two analyzers can eliminate the effects of both unwanted linear birefringences and unwanted circular birefringences on the stability of the ac bulk polarimetric optical sensor. The conditions (here referred to as the quenching condition) under which the compensation method becomes important are also derived for both the voltage (or electric field) and current (or magnetic field) sensors.

  17. A retrospective likelihood approach for efficient integration of multiple omics factors in case-control association studies.

    PubMed

    Balliu, Brunilda; Tsonaka, Roula; Boehringer, Stefan; Houwing-Duistermaat, Jeanine

    2015-03-01

    Integrative omics, the joint analysis of outcome and multiple types of omics data, such as genomics, epigenomics, and transcriptomics data, constitute a promising approach for powerful and biologically relevant association studies. These studies often employ a case-control design, and often include nonomics covariates, such as age and gender, that may modify the underlying omics risk factors. An open question is how to best integrate multiple omics and nonomics information to maximize statistical power in case-control studies that ascertain individuals based on the phenotype. Recent work on integrative omics have used prospective approaches, modeling case-control status conditional on omics, and nonomics risk factors. Compared to univariate approaches, jointly analyzing multiple risk factors with a prospective approach increases power in nonascertained cohorts. However, these prospective approaches often lose power in case-control studies. In this article, we propose a novel statistical method for integrating multiple omics and nonomics factors in case-control association studies. Our method is based on a retrospective likelihood function that models the joint distribution of omics and nonomics factors conditional on case-control status. The new method provides accurate control of Type I error rate and has increased efficiency over prospective approaches in both simulated and real data.

  18. On time integration methods and errors for ASCI applications

    SciTech Connect

    Knoll, D. A.; Mousseau, V. A.

    2004-01-01

    This talk is one of four to be given in the Multiphysics Solution Methods section of the workshop, Methods for Computational Physics. Background and motivation are given for the various multiphysics time integration approaches. Various splitting methods as well as more modern coupled methods are discussed. Methods for assessing solution accuracy and time integration error are then examined. Finally, important open issues are highlighted.
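A minimal illustration of the splitting error discussed above (our own example, not from the talk): for dy/dt = -y - y**2 each sub-flow is solvable in closed form, so first-order Lie splitting simply alternates the two exact sub-flows, and halving the step size roughly halves the global error:

```python
# Lie (first-order) operator splitting for dy/dt = -y - y**2, y(0) = 1.
# Each substep applies the exact flow of one term; the splitting itself is
# the only source of error, and it converges at first order in h.
import math

def lie_split(y, t_end, n):
    h = t_end / n
    for _ in range(n):
        y = y * math.exp(-h)      # exact flow of dy/dt = -y over one step
        y = y / (1 + h * y)       # exact flow of dy/dt = -y**2 over one step
    return y

exact = 1 / (2 * math.e - 1)      # closed-form solution at t = 1
e1 = abs(lie_split(1.0, 1.0, 100) - exact)
e2 = abs(lie_split(1.0, 1.0, 200) - exact)
print(e1 / e2)  # ratio near 2, i.e. first-order convergence
```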

  19. Satellite attitude prediction by multiple time scales method

    NASA Technical Reports Server (NTRS)

    Tao, Y. C.; Ramnath, R.

    1975-01-01

    An investigation is made of the problem of predicting the attitude of satellites under the influence of external disturbing torques. The attitude dynamics are first expressed in a perturbation formulation, which is then solved by the multiple scales approach. The independent variable, time, is extended into new scales, fast, slow, etc., and the integration is carried out separately in the new variables. The theory is applied to two different satellite configurations, rigid body and dual spin, each of which may have an asymmetric mass distribution. The disturbing torques considered are gravity gradient and geomagnetic. Finally, as the multiple time scales approach separates the slow and fast behaviors of satellite attitude motion, this property is used for the design of an attitude control device. A nutation damping control loop, using the geomagnetic torque for an earth-pointing dual spin satellite, is designed in terms of the slow equation.
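The flavor of the multiple scales expansion can be seen on the classic weakly damped oscillator (a textbook example, not the paper's satellite attitude equations):

```latex
% Two-timing for $\ddot{x} + \epsilon\dot{x} + x = 0$, $0 < \epsilon \ll 1$:
% introduce a fast time $t_0 = t$ and a slow time $t_1 = \epsilon t$, and expand
% $x = x_0(t_0, t_1) + \epsilon\, x_1(t_0, t_1) + \dots$
\begin{align}
  O(1):\quad & \partial_{t_0}^2 x_0 + x_0 = 0
      \;\Rightarrow\; x_0 = A(t_1)\cos\bigl(t_0 + \varphi(t_1)\bigr), \\
  O(\epsilon):\quad & \partial_{t_0}^2 x_1 + x_1
      = -2\,\partial_{t_0}\partial_{t_1} x_0 - \partial_{t_0} x_0 .
\end{align}
% Suppressing the secular (resonant) terms on the right-hand side forces the
% slow equations $A'(t_1) = -A/2$ and $\varphi'(t_1) = 0$, yielding the
% uniformly valid approximation
% $x(t) \approx a\, e^{-\epsilon t/2}\cos(t + \varphi_0)$.
```

The slow equation for the amplitude A plays the same role here as the slow equation used above to design the nutation damping control loop.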

  20. Generalized multiple internal standard method for quantitative liquid chromatography mass spectrometry.

    PubMed

    Hu, Yuan-Liang; Chen, Zeng-Ping; Chen, Yao; Shi, Cai-Xia; Yu, Ru-Qin

    2016-05-06

    In this contribution, a multiplicative effects model for the generalized multiple-internal-standard method (MEMGMIS) was proposed to solve the signal instability problem of LC-MS over time. The MEMGMIS model seamlessly integrates the multiple-internal-standard strategy with multivariate calibration and makes full use of all the information carried by multiple internal standards during the quantification of target analytes. Unlike existing methods based on multiple internal standards, MEMGMIS does not require selecting an optimal internal standard for the quantification of a specific analyte from the multiple internal standards used. MEMGMIS was applied to a proof-of-concept model system: the simultaneous quantitative analysis of five edible artificial colorants in two kinds of cocktail drinks. Experimental results demonstrated that MEMGMIS models established on LC-MS data of calibration samples prepared with ultrapure water could provide quite satisfactory concentration predictions for colorants in cocktail samples from their LC-MS data measured 10 days after the LC-MS analysis of the calibration samples. The average relative prediction errors of MEMGMIS models did not exceed 6.0%, considerably better than the corresponding values of commonly used univariate calibration models combined with multiple internal standards. The advantages of good performance and simple implementation render the MEMGMIS model a promising alternative tool in quantitative LC-MS assays. Copyright © 2016 Elsevier B.V. All rights reserved.
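For contrast, the classical single-internal-standard calibration that MEMGMIS generalizes can be sketched as follows (all concentrations and signal intensities are hypothetical): the analyte/IS intensity ratio is assumed linear in analyte concentration, so drift that scales both channels equally cancels out.

```python
# Single-internal-standard calibration: fit a line to (concentration, ratio)
# pairs, where ratio = analyte signal / internal-standard signal, then invert
# it for an unknown sample. Instrument drift multiplying both signals leaves
# the ratio, and hence the predicted concentration, unchanged.
def fit_ratio_line(concs, analyte_sig, is_sig):
    ratios = [a / b for a, b in zip(analyte_sig, is_sig)]
    n = len(concs)
    cbar = sum(concs) / n
    rbar = sum(ratios) / n
    slope = sum((c - cbar) * (r - rbar) for c, r in zip(concs, ratios)) / \
            sum((c - cbar) ** 2 for c in concs)
    return slope, rbar - slope * cbar        # least-squares slope and intercept

slope, intercept = fit_ratio_line([1, 2, 4], [10.0, 20.0, 40.0], [5.0, 5.0, 5.0])
# unknown sample measured after 2x signal drift: analyte 60, IS 10
unknown = (60.0 / 10.0 - intercept) / slope
print(unknown)  # recovers concentration 3.0 despite the drift
```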

  1. 360 degree viewable floating autostereoscopic display using integral photography and multiple semitransparent mirrors.

    PubMed

    Zhao, Dong; Su, Baiquan; Chen, Guowen; Liao, Hongen

    2015-04-20

    In this paper, we present a polyhedron-shaped floating autostereoscopic display viewable from 360 degrees using integral photography (IP) and multiple semitransparent mirrors. IP combined with polyhedron-shaped multiple semitransparent mirrors is used to achieve a 360 degree viewable floating three-dimensional (3D) autostereoscopic display, having the advantage of being able to be viewed by several observers from various viewpoints simultaneously. IP is adopted to generate a 3D autostereoscopic image with full parallax property. Multiple semitransparent mirrors reflect corresponding IP images, and the reflected IP images are situated around the center of the polyhedron-shaped display device for producing the floating display. The spatial reflected IP images reconstruct a floating autostereoscopic image viewable from 360 degrees. We manufactured two prototypes for producing such displays and performed two sets of experiments to evaluate the feasibility of the method described above. The results of our experiments showed that our approach can achieve a floating autostereoscopic display viewable from surrounding area. Moreover, it is shown the proposed method is feasible to facilitate the continuous viewpoint of a whole 360 degree display without flipping.

  2. Multiple Quantum Well (MQW) Devices For Monolithic Integrated Optoelectronics

    NASA Astrophysics Data System (ADS)

    Wood, Thomas H.

    1988-05-01

    Semiconductor MQWs represent a new technology for opto-electronics. These MQWs have an electroabsorption effect approximately 50 times larger than conventional semiconductors. They are compatible with existing source and detector material systems and produce devices that are compact and high speed, which makes them useful for monolithic integrated optoelectronic devices.

  3. Multiple Quantum Well(MQW) Devices For Monolithic Integrated Optoelectronics

    NASA Astrophysics Data System (ADS)

    Wood, Thomas H.

    1987-02-01

    A new technology for opto-electronics has been developed, semiconductor MQWs. These MQWs have an electroabsorption effect 30-60 times larger than conventional semiconductors. They are compatible with existing source and detector material systems and produce devices that are compact and high speed, which makes them useful for monolithic integrated optoelectronic devices.

  4. Partial-Credit Scoring Methods for Multiple-Choice Tests.

    ERIC Educational Resources Information Center

    Frary, Robert B.

    1989-01-01

    Multiple-choice response and scoring methods that attempt to determine an examinee's degree of knowledge about each item in order to produce a total test score are reviewed. There is apparently little advantage to such schemes; however, they may have secondary benefits such as providing feedback to enhance learning. (SLD)

  5. Methods for the Joint Meta-Analysis of Multiple Tests

    ERIC Educational Resources Information Center

    Trikalinos, Thomas A.; Hoaglin, David C.; Small, Kevin M.; Terrin, Norma; Schmid, Christopher H.

    2014-01-01

    Existing methods for meta-analysis of diagnostic test accuracy focus primarily on a single index test. We propose models for the joint meta-analysis of studies comparing multiple index tests on the same participants in paired designs. These models respect the grouping of data by studies, account for the within-study correlation between the tests'…

  6. Evaluating Multiple Prevention Programs: Methods, Results, and Lessons Learned

    ERIC Educational Resources Information Center

    Adler-Baeder, Francesca; Kerpelman, Jennifer; Griffin, Melody M.; Schramm, David G.

    2010-01-01

    Extension faculty and agents/educators are increasingly collaborating with local and state agencies to provide and evaluate multiple, distinct programs, yet there is limited information about measuring outcomes and combining results across similar program types. This article explicates the methods and outcomes of a state-level evaluation of…

  7. Methods for the Joint Meta-Analysis of Multiple Tests

    ERIC Educational Resources Information Center

    Trikalinos, Thomas A.; Hoaglin, David C.; Small, Kevin M.; Terrin, Norma; Schmid, Christopher H.

    2014-01-01

    Existing methods for meta-analysis of diagnostic test accuracy focus primarily on a single index test. We propose models for the joint meta-analysis of studies comparing multiple index tests on the same participants in paired designs. These models respect the grouping of data by studies, account for the within-study correlation between the tests'…

  8. Restructuring for Integrative Education: Multiple Perspectives, Multiple Contexts. Critical Studies in Education and Culture Series.

    ERIC Educational Resources Information Center

    Jennings, Todd, Ed.

    Integrative education is defined as education that promotes learning and teaching in nonfragmented ways that embrace notions of holism, complexity, and interconnection. Furthermore, integrative education embraces the links, rather than the divisions, between the academic disciplines (e.g., arts and sciences) and between various subjective and…

  9. Identifying multiple submissions in Internet research: preserving data integrity.

    PubMed

    Bowen, Anne M; Daniel, Candice M; Williams, Mark L; Baird, Grayson L

    2008-11-01

    Internet-based sexuality research with hidden populations has become increasingly popular. Respondent anonymity may encourage participation and lower social desirability, but associated disinhibition may promote multiple submissions, especially when incentives are offered. The goal of this study was to identify the usefulness of different variables for detecting multiple submissions from repeat responders and to explore incentive effects. The data included 1,900 submissions from a three-session Internet intervention with a pretest and three post-test questionnaires. Participants were men who have sex with men and incentives were offered to rural participants for completing each questionnaire. The final number of submissions included 1,273 "unique", 132 first submissions by "repeat responders" and 495 additional submissions by the "repeat responders" (N = 1,900). Four categories of repeat responders were identified: "infrequent" (2-5 submissions), "persistent" (6-10 submissions), "very persistent" (11-30 submissions), and "hackers" (more than 30 submissions). Internet Provider (IP) addresses, user names, and passwords were the most useful for identifying "infrequent" repeat responders. "Hackers" often varied their IP address and identifying information to prevent easy identification, but investigating the data for small variations in IP, using reverse telephone look up, and patterns across usernames and passwords were helpful. Incentives appeared to play a role in stimulating multiple submissions, especially from the more sophisticated "hackers". Finally, the web is ever evolving and it will be necessary to have good programmers and staff who evolve as fast as "hackers".
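The screening logic can be sketched as a toy grouping step (field names, records, and the exact bucketing are hypothetical illustrations of the study's categories; real screening also inspects near-matches in IPs and usernames):

```python
# Group submissions by (IP, username) and bucket respondents by submission
# count using the repeat-responder categories described above.
from collections import defaultdict

def flag_repeats(submissions):
    by_key = defaultdict(list)
    for s in submissions:
        by_key[(s["ip"], s["user"])].append(s["id"])
    def category(n):
        if n == 1:
            return "unique"
        if n <= 5:
            return "infrequent"
        if n <= 10:
            return "persistent"
        if n <= 30:
            return "very persistent"
        return "hacker"
    return {key: category(len(ids)) for key, ids in by_key.items()}

subs = [{"id": i, "ip": "10.0.0.1", "user": "anon"} for i in range(3)] + \
       [{"id": 99, "ip": "10.0.0.2", "user": "solo"}]
result = flag_repeats(subs)
print(result)  # the repeated (ip, user) pair is flagged as "infrequent"
```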

  10. Multiple integral representation for the trigonometric SOS model with domain wall boundaries

    NASA Astrophysics Data System (ADS)

    Galleas, W.

    2012-05-01

    Using the dynamical Yang-Baxter algebra we derive a functional equation for the partition function of the trigonometric SOS model with domain wall boundary conditions. The solution of the equation is given in terms of a multiple contour integral.

  11. Multiple Distinct Targeting Signals in Integral Peroxisomal Membrane Proteins

    PubMed Central

    Jones, Jacob M.; Morrell, James C.; Gould, Stephen J.

    2001-01-01

    Peroxisomal proteins are synthesized on free polysomes and then transported from the cytoplasm to peroxisomes. This process is mediated by two short well-defined targeting signals in peroxisomal matrix proteins, but a well-defined targeting signal has not yet been described for peroxisomal membrane proteins (PMPs). One assumption in virtually all prior studies of PMP targeting is that a given protein contains one, and only one, distinct targeting signal. Here, we show that the metabolite transporter PMP34, an integral PMP, contains at least two nonoverlapping sets of targeting information, either of which is sufficient for insertion into the peroxisome membrane. We also show that another integral PMP, the peroxin PEX13, also contains two independent sets of peroxisomal targeting information. These results challenge a major assumption of most PMP targeting studies. In addition, we demonstrate that PEX19, a factor required for peroxisomal membrane biogenesis, interacts with the two minimal targeting regions of PMP34. Together, these results raise the interesting possibility that PMP import may require novel mechanisms to ensure the solubility of integral PMPs before their insertion in the peroxisome membrane, and that PEX19 may play a central role in this process. PMID:11402059

  12. A computational framework for gene regulatory network inference that combines multiple methods and datasets

    PubMed Central

    2011-01-01

    Background Reverse engineering in systems biology entails inference of gene regulatory networks from observational data. This data typically include gene expression measurements of wild type and mutant cells in response to a given stimulus. It has been shown that when more than one type of experiment is used in the network inference process the accuracy is higher. Therefore the development of generally applicable and effective methodologies that embed multiple sources of information in a single computational framework is a worthwhile objective. Results This paper presents a new method for network inference, which uses multi-objective optimisation (MOO) to integrate multiple inference methods and experiments. We illustrate the potential of the methodology by combining ODE and correlation-based network inference procedures as well as time course and gene inactivation experiments. Here we show that our methodology is effective for a wide spectrum of data sets and method integration strategies. Conclusions The approach we present in this paper is flexible and can be used in any scenario that benefits from integration of multiple sources of information and modelling procedures in the inference process. Moreover, the application of this method to two case studies representative of bacteria and vertebrate systems has shown potential in identifying key regulators of important biological processes. PMID:21489290

  13. Accelerating Ab Initio Path Integral Simulations via Imaginary Multiple-Timestepping.

    PubMed

    Cheng, Xiaolu; Herr, Jonathan D; Steele, Ryan P

    2016-04-12

    This work investigates the use of multiple-timestep schemes in imaginary time for computationally efficient ab initio equilibrium path integral simulations of quantum molecular motion. In the simplest formulation, only every nth path integral replica is computed at the target level of electronic structure theory, whereas the remaining low-level replicas still account for nuclear motion quantum effects with a more computationally economical theory. Motivated by recent developments for multiple-timestep techniques in real-time classical molecular dynamics, both 1-electron (atomic-orbital basis set) and 2-electron (electron correlation) truncations are shown to be effective. Structural distributions and thermodynamic averages are tested for representative analytic potentials and ab initio molecular examples. Target quantum chemistry methods include density functional theory and second-order Møller-Plesset perturbation theory, although any level of theory is formally amenable to this framework. For a standard two-level splitting, computational speedups of 1.6-4.0x are observed when using a 4-fold reduction in time slices; an 8-fold reduction is feasible in some cases. Multitiered options further reduce computational requirements and suggest that quantum mechanical motion could potentially be obtained at a cost not significantly different from the cost of classical simulations.

  14. Characterizing lentic freshwater fish assemblages using multiple sampling methods.

    PubMed

    Fischer, Jesse R; Quist, Michael C

    2014-07-01

    Characterizing fish assemblages in lentic ecosystems is difficult, and multiple sampling methods are almost always necessary to gain reliable estimates of indices such as species richness. However, most research focused on lentic fish sampling methodology has targeted recreationally important species, and little to no information is available regarding the influence of multiple methods and timing (i.e., temporal variation) on characterizing entire fish assemblages. Therefore, six lakes and impoundments (48-1,557 ha surface area) were sampled seasonally with seven gear types to evaluate the combined influence of sampling methods and timing on the number of species and individuals sampled. Probabilities of detection for species indicated strong selectivities and seasonal trends that provide guidance on optimal seasons to use gears when targeting multiple species. The evaluation of species richness and number of individuals sampled using multiple gear combinations demonstrated that appreciable benefits over relatively few gears (e.g., to four) used in optimal seasons were not present. Specifically, over 90 % of the species encountered with all gear types and season combinations (N = 19) from six lakes and reservoirs were sampled with nighttime boat electrofishing in the fall and benthic trawling, modified-fyke, and mini-fyke netting during the summer. Our results indicated that the characterization of lentic fish assemblages was highly influenced by the selection of sampling gears and seasons, but did not appear to be influenced by waterbody type (i.e., natural lake, impoundment). The standardization of data collected with multiple methods and seasons to account for bias is imperative to monitoring of lentic ecosystems and will provide researchers with increased reliability in their interpretations and decisions made using information on lentic fish assemblages.

  15. Characterizing lentic freshwater fish assemblages using multiple sampling methods

    USGS Publications Warehouse

    Fischer, Jesse R.; Quist, Michael

    2014-01-01

    Characterizing fish assemblages in lentic ecosystems is difficult, and multiple sampling methods are almost always necessary to gain reliable estimates of indices such as species richness. However, most research focused on lentic fish sampling methodology has targeted recreationally important species, and little to no information is available regarding the influence of multiple methods and timing (i.e., temporal variation) on characterizing entire fish assemblages. Therefore, six lakes and impoundments (48–1,557 ha surface area) were sampled seasonally with seven gear types to evaluate the combined influence of sampling methods and timing on the number of species and individuals sampled. Probabilities of detection for species indicated strong selectivities and seasonal trends that provide guidance on optimal seasons to use gears when targeting multiple species. The evaluation of species richness and number of individuals sampled using multiple gear combinations demonstrated that appreciable benefits over relatively few gears (e.g., to four) used in optimal seasons were not present. Specifically, over 90 % of the species encountered with all gear types and season combinations (N = 19) from six lakes and reservoirs were sampled with nighttime boat electrofishing in the fall and benthic trawling, modified-fyke, and mini-fyke netting during the summer. Our results indicated that the characterization of lentic fish assemblages was highly influenced by the selection of sampling gears and seasons, but did not appear to be influenced by waterbody type (i.e., natural lake, impoundment). The standardization of data collected with multiple methods and seasons to account for bias is imperative to monitoring of lentic ecosystems and will provide researchers with increased reliability in their interpretations and decisions made using information on lentic fish assemblages.

  16. Identifying Multiple Submissions in Internet Research: Preserving Data Integrity

    PubMed Central

    Bowen, Anne M.; Daniel, Candice M.; Williams, Mark L.; Baird, Grayson L.

    2008-01-01

    Internet-based sexuality research with hidden populations has become increasingly popular. Respondent anonymity may encourage participation and lower social desirability, but associated disinhibition may promote multiple submissions, especially when incentives are offered. The goal of this study was to identify the usefulness of different variables for detecting multiple submissions from repeat responders and to explore incentive effects. The data included 1,900 submissions from a three-session Internet intervention with a pretest and three post-test questionnaires. Participants were men who have sex with men and incentives were offered to rural participants for completing each questionnaire. The final number of submissions included 1,273 “unique”, 132 first submissions by “repeat responders” and 495 additional submissions by the “repeat responders” (N = 1,900). Four categories of repeat responders were identified: “infrequent” (2–5 submissions), “persistent” (6–10 submissions), “very persistent” (11–30 submissions), and “hackers” (more than 30 submissions). Internet Provider (IP) addresses, user names, and passwords were the most useful for identifying “infrequent” repeat responders. “Hackers” often varied their IP address and identifying information to prevent easy identification, but investigating the data for small variations in IP, using reverse telephone look up, and patterns across usernames and passwords were helpful. Incentives appeared to play a role in stimulating multiple submissions, especially from the more sophisticated “hackers”. Finally, the web is ever evolving and it will be necessary to have good programmers and staff who evolve as fast as “hackers”. PMID:18240015

  17. Method and apparatus for fiber optic multiple scattering suppression

    NASA Technical Reports Server (NTRS)

    Ackerson, Bruce J. (Inventor)

    2000-01-01

    The instant invention provides a method and apparatus for use in laser induced dynamic light scattering which attenuates the multiple scattering component in favor of the single scattering component. The preferred apparatus utilizes two light detectors that are spatially and/or angularly separated and which simultaneously record the speckle pattern from a single sample. The recorded patterns from the two detectors are then cross correlated in time to produce one point on a composite single/multiple scattering function curve. By collecting and analyzing cross correlation measurements that have been taken at a plurality of different spatial/angular positions, the signal representative of single scattering may be differentiated from the signal representative of multiple scattering, and a near optimum detector separation angle for use in taking future measurements may be determined.
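The central idea, that cross-correlating two detector signals retains their common (single-scattering) component while uncorrelated noise averages away, can be sketched numerically (synthetic signals of our own, not the patented apparatus):

```python
# Zero-lag cross-correlation of two noisy detector records that share one
# common signal. The independent noise contributions average toward zero,
# leaving an estimate of the shared-signal power.
import math
import random

def cross_corr_zero_lag(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / n

random.seed(0)
t = [0.01 * i for i in range(5000)]
common = [math.sin(2 * math.pi * ti) for ti in t]   # component seen by both detectors
x = [c + random.gauss(0, 1) for c in common]        # detector 1: common + own noise
y = [c + random.gauss(0, 1) for c in common]        # detector 2: common + independent noise
corr = cross_corr_zero_lag(x, y)
print(corr)  # close to the common-signal power of 0.5
```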

  18. Sequential multiple methods as a contemporary method in learning disability nursing practice research.

    PubMed

    Mafuba, Kay; Gates, Bob

    2012-12-01

    This paper explores and advocates the use of sequential multiple methods as a contemporary strategy for undertaking research. Sequential multiple methods involve the use of results obtained through one data collection method to determine the direction and implementation of subsequent stages of a research project (Morse, 1991; Morgan, 1998). The paper also explores how triangulating research at the epistemological, theoretical and methodological levels could enhance research. Finally, the paper evaluates the significance of sequential multiple methods in learning disability nursing practice research.

  19. Multigrid method for integral equations and automatic programs

    NASA Technical Reports Server (NTRS)

    Lee, Hosae

    1993-01-01

    Several iterative algorithms based on multigrid methods are introduced for solving linear Fredholm integral equations of the second kind. Automatic programs based on these algorithms are introduced using Simpson's rule and the piecewise Gaussian rule for numerical integration.
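For context, a minimal Nyström discretization of a second-kind Fredholm equation using Simpson's rule is sketched below; it solves the resulting dense system directly, which is the linear system a multigrid iteration such as the one described would instead solve approximately on a hierarchy of grids. All names are illustrative.

```python
import numpy as np

def nystrom_simpson(kernel, f, a, b, n):
    """Discretize u(x) = f(x) + ∫_a^b K(x,t) u(t) dt with Simpson's
    rule (n must be even) and solve the resulting dense system.
    Returns the quadrature nodes and the Nyström solution at them.
    """
    assert n % 2 == 0
    x = np.linspace(a, b, n + 1)
    h = (b - a) / n
    w = np.full(n + 1, 2.0)          # Simpson weight pattern 1,4,2,...,4,1
    w[1::2] = 4.0
    w[0] = w[-1] = 1.0
    w *= h / 3.0
    K = kernel(x[:, None], x[None, :])
    A = np.eye(n + 1) - K * w        # (I - K W) u = f
    return x, np.linalg.solve(A, f(x))
```

With the separable kernel K(x,t) = xt on [0,1] and f(x) = 2x/3, the exact solution is u(x) = x, which Simpson's rule reproduces to machine precision.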

  20. Integration by differentiation: new proofs, methods and examples

    NASA Astrophysics Data System (ADS)

    Jia, Ding; Tang, Eugene; Kempf, Achim

    2017-06-01

    Recently, new methods were introduced which allow one to solve ordinary integrals by performing only derivatives. These studies were originally motivated by the difficulties of the quantum field theoretic path integral, and correspondingly, the results were derived by heuristic methods. Here, we give rigorous proofs for the methods to hold on fully specified function spaces. We then illustrate the efficacy of the new methods by applying them to the study of the surprising behavior of so-called Borwein integrals.
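A representative identity from this line of work (stated here from the earlier heuristic papers, as an assumption about which formulas the rigorous proofs cover) expresses a definite integral purely through derivatives with respect to an auxiliary variable $y$:

```latex
\int_a^b f(x)\,dx \;=\; \lim_{y\to 0}\, f\!\left(\frac{\partial}{\partial y}\right)\frac{e^{by}-e^{ay}}{y},
\qquad
\int_{-\infty}^{\infty} f(x)\,dx \;=\; \lim_{y\to 0}\, 2\pi\, f\!\left(-i\frac{\partial}{\partial y}\right)\delta(y).
```

The function of the integration variable is promoted to a function of a derivative operator acting on a simple generating expression, so only differentiations remain to be carried out.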

  1. Calculation of transonic flows using an extended integral equation method

    NASA Technical Reports Server (NTRS)

    Nixon, D.

    1976-01-01

    An extended integral equation method for transonic flows is developed. In the extended integral equation method velocities in the flow field are calculated in addition to values on the aerofoil surface, in contrast with the less accurate 'standard' integral equation method in which only surface velocities are calculated. The results obtained for aerofoils in subcritical flow and in supercritical flow when shock waves are present compare satisfactorily with the results of recent finite difference methods.

  2. A method for interactive specification of multiple-block topologies

    NASA Technical Reports Server (NTRS)

    Sorenson, Reese L.; Mccann, Karen M.

    1991-01-01

    A method is presented for dealing with the vast amount of topological and other data which must be specified to generate a multiple-block computational grid. Specific uses of the graphical capabilities of a powerful scientific workstation are described which reduce the burden on the user of collecting and formatting such large amounts of data. A program to implement this method, 3DPREP, is described. A plotting transformation algorithm, some useful software tools, notes on programming, and a database organization are also presented. Example grids developed using the method are shown.

  4. Assessing and predicting protein interactions by combining manifold embedding with multiple information integration

    PubMed Central

    2012-01-01

    Background Protein-protein interactions (PPIs) play crucial roles in virtually every aspect of cellular function within an organism. Over the last decade, the development of novel high-throughput techniques has resulted in enormous amounts of data and provided valuable resources for studying protein interactions. However, these high-throughput protein interaction data are often associated with high false positive and false negative rates. It is therefore highly desirable to develop scalable methods to identify these errors from the computational perspective. Results We have developed a robust computational technique for assessing the reliability of interactions and predicting new interactions by combining manifold embedding with multiple information integration. Validation of the proposed method was performed with extensive experiments on densely-connected and sparse PPI networks of yeast, respectively. Results demonstrate that the interactions ranked top by our method have high functional homogeneity and localization coherence. Conclusions Our proposed method achieves better performance than existing methods in both assessing and predicting protein interactions. Furthermore, our method is general enough to work over a variety of PPI networks, whether densely connected or sparse. Therefore, the proposed algorithm is a promising method for detecting both false positive and false negative interactions in PPI networks. PMID:22595000

  5. Empathetic, Critical Integrations of Multiple Perspectives: A Core Practice for Language Teacher Education?

    ERIC Educational Resources Information Center

    Daniel, Shannon M.

    2015-01-01

    In this self-study, the author reflects on her implementation of empathetic, critical integrations of multiple perspectives (ECI), which she designed to afford preservice teachers the opportunity to discuss and collectively reflect upon the oft-diverging multiple perspectives, values, and practices they experience during their practicum (Daniel,…

  7. Energy Simulation of Integrated Multiple-Zone Variable Refrigerant Flow System

    SciTech Connect

    Shen, Bo; Rice, C Keith; Baxter, Van D

    2013-01-01

    We developed a detailed steady-state system model to simulate the performance of an integrated five-zone variable refrigerant flow (VRF) heat pump system. The system is multi-functional, capable of space cooling, space heating, combined space cooling and water heating, and dedicated water heating. Methods were developed to map the VRF performance in each mode, based on the abundant data produced by the equipment system model. The performance maps were used in TRNSYS annual energy simulations. Using TRNSYS, we successfully set up and ran cases for a multiple-split VRF heat pump and dehumidifier combination in five-zone houses in five climates, controlling indoor dry-bulb temperature and relative humidity. We compared the calculated energy consumption of the VRF heat pump against that of a baseline central air-source heat pump coupled with electric water heating and standalone dehumidifiers. In addition, we investigated multiple control scenarios for the VRF heat pump, e.g., on/off control, variable indoor air flow rate, and different zone temperature setting schedules. The energy savings for the multiple scenarios were assessed.

  8. Students' integration of multiple representations in a titration experiment

    NASA Astrophysics Data System (ADS)

    Kunze, Nicole M.

    A complete understanding of a chemical concept is dependent upon a student's ability to understand the microscopic or particulate nature of the phenomenon and integrate the microscopic, symbolic, and macroscopic representations of the phenomenon. Acid-base chemistry is a general chemistry topic requiring students to understand the topics of chemical reactions, solutions, and equilibrium presented earlier in the course. In this study, twenty-five student volunteers from a second semester general chemistry course completed two interviews. The first interview was completed prior to any classroom instruction on acids and bases. The second interview took place after classroom instruction, a prelab activity consisting of a titration calculation worksheet, a titration computer simulation, or a microscopic level animation of a titration, and two microcomputer-based laboratory (MBL) titration experiments. During the interviews, participants were asked to define and describe acid-base concepts and in the second interview they also drew the microscopic representations of four stages in an acid-base titration. An analysis of the data showed that participants had integrated the three representations of an acid-base titration to varying degrees. While some participants showed complete understanding of acids, bases, titrations, and solution chemistry, other participants showed several alternative conceptions concerning strong acid and base dissociation, the formation of titration products, and the dissociation of soluble salts. Before instruction, participants' definitions of acid, base, and pH were brief and consisted of descriptive terms. After instruction, the definitions were more scientific and reflected the definitions presented during classroom instruction.

  9. Galerkin projection methods for solving multiple related linear systems

    SciTech Connect

    Chan, T.F.; Ng, M.; Wan, W.L.

    1996-12-31

    We consider using Galerkin projection methods for solving multiple related linear systems A^(i)x^(i) = b^(i) for 1 ≤ i ≤ s, where A^(i) and b^(i) are different in general. We start with the special case where A^(i) = A and A is symmetric positive definite. The method generates a Krylov subspace from a set of direction vectors obtained by solving one of the systems, called the seed system, by the CG method and then projects the residuals of the other systems orthogonally onto the generated Krylov subspace to get the approximate solutions. The whole process is repeated with another unsolved system as a seed until all the systems are solved. We observe in practice a super-convergence behaviour of the CG process of the seed system when compared with the usual CG process. We also observe that only a small number of restarts is required to solve all the systems if the right-hand sides are close to each other. These two features together make the method particularly effective. In this talk, we give theoretical proofs to justify these observations. Furthermore, we combine the advantages of this method and the block CG method and propose a block extension of this single-seed method. The above procedure can also be modified for solving multiple linear systems A^(i)x^(i) = b^(i), where the A^(i) are now different. We can also extend the previous analytical results to this more general case. Applications of this method to multiple related linear systems arising from image restoration and recursive least squares computations are considered as examples.
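The single-seed procedure can be sketched as follows, under the simplifying assumption of a shared symmetric positive definite matrix A: run CG on the seed system while storing the A-conjugate search directions, then Galerkin-project the remaining right-hand sides onto the generated Krylov subspace.

```python
import numpy as np

def cg_with_directions(A, b, tol=1e-12, maxit=200):
    """Plain CG on the seed system, keeping the A-conjugate
    search directions (and their A-inner products) for reuse."""
    x = np.zeros_like(b)
    r = b - A @ x
    p = r.copy()
    dirs = []
    for _ in range(maxit):
        Ap = A @ p
        alpha = (r @ r) / (p @ Ap)
        dirs.append((p.copy(), p @ Ap))
        x += alpha * p
        r_new = r - alpha * Ap
        if np.linalg.norm(r_new) < tol:
            break
        p = r_new + ((r_new @ r_new) / (r @ r)) * p
        r = r_new
    return x, dirs

def galerkin_project(A, b, dirs):
    """Galerkin projection of another right-hand side onto the Krylov
    subspace spanned by the seed's A-conjugate directions. Because the
    directions are A-conjugate, the projected system is diagonal."""
    x = np.zeros_like(b)
    r = b.copy()
    for p, pAp in dirs:
        alpha = (p @ r) / pAp
        x += alpha * p
        r -= alpha * (A @ p)
    return x
```

If the second right-hand side is close to the seed's, the projected residual is already small and only a short CG restart (not shown) is needed to finish.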

  10. Exercise in multiple sclerosis -- an integral component of disease management

    PubMed Central

    2012-01-01

    Multiple sclerosis (MS) is the most common chronic inflammatory disorder of the central nervous system (CNS) in young adults. The disease causes a wide range of symptoms depending on the localization and characteristics of the CNS pathology. In addition to drug-based immunomodulatory treatment, both drug-based and non-drug approaches are established as complementary strategies to alleviate existing symptoms and to prevent secondary diseases. In particular, physical therapy like exercise and physiotherapy can be customized to the individual patient's needs and has the potential to improve the individual outcome. However, high quality systematic data on physical therapy in MS are rare. This article summarizes the current knowledge on the influence of physical activity and exercise on disease-related symptoms and physical restrictions in MS patients. Other treatment strategies such as drug treatments or cognitive training were deliberately excluded for the purposes of this article. PMID:22738091

  11. Integrating regional conservation priorities for multiple objectives into national policy.

    PubMed

    Beger, Maria; McGowan, Jennifer; Treml, Eric A; Green, Alison L; White, Alan T; Wolff, Nicholas H; Klein, Carissa J; Mumby, Peter J; Possingham, Hugh P

    2015-09-14

    Multinational conservation initiatives that prioritize investment across a region invariably navigate trade-offs among multiple objectives. It seems logical to focus where several objectives can be achieved efficiently, but such multi-objective hotspots may be ecologically inappropriate, or politically inequitable. Here we devise a framework to facilitate a regionally cohesive set of marine-protected areas driven by national preferences and supported by quantitative conservation prioritization analyses, and illustrate it using the Coral Triangle Initiative. We identify areas important for achieving six objectives to address ecosystem representation, threatened fauna, connectivity and climate change. We expose trade-offs between areas that contribute substantially to several objectives and those meeting one or two objectives extremely well. Hence there are two strategies to guide countries choosing to implement regional goals nationally: multi-objective hotspots and complementary sets of single-objective priorities. This novel framework is applicable to any multilateral or global initiative seeking to apply quantitative information in decision making.

  12. Multiple light scattering methods for multiphase flow diagnostics

    NASA Astrophysics Data System (ADS)

    Estevadeordal, Jordi

    2015-11-01

    Multiphase flows of gases and liquids containing droplets, bubbles, or particulates present light scattering imaging challenges due to the interference from each phase, such as secondary reflections, extinctions, absorptions, and refractions. These factors often prevent the unambiguous detection of each phase and also produce undesired beam steering. The effects can be especially complex in the presence of dense phases, multispecies flows, and high pressure environments. This investigation reports new methods for overcoming these effects in quantitative measurements of velocity, density, and temperature fields. The methods are based on light scattering techniques combining Mie and filtered Rayleigh scattering and light extinction analyses and measurements. The optical layout is designed to perform multiple property measurements with improved signal from each phase via laser spectral and polarization characterization, etalon decontamination, and use of multiple wavelengths and imaging detectors.

  13. An integrated modeling method for wind turbines

    NASA Astrophysics Data System (ADS)

    Fadaeinedjad, Roohollah

    To study the interaction of the electrical, mechanical, and aerodynamic aspects of a wind turbine, a detailed model that considers all these aspects must be used. A drawback of many studies in the area of wind turbine simulation is that either a very simple mechanical model is used with a detailed electrical model, or vice versa. Hence the interactions between electrical and mechanical aspects of wind turbine operation are not accurately taken into account. In this research, it will be shown that a combination of different simulation packages, namely TurbSim, FAST, and Simulink can be used to model the aerodynamic, mechanical, and electrical aspects of a wind turbine in detail. In this thesis, after a review of some wind turbine concepts and software tools, a simulation structure is proposed for studying wind turbines that integrates the mechanical and electrical components of a wind energy conversion device. Based on the simulation structure, a comprehensive model for a three-bladed variable speed wind turbine with doubly-fed induction generator is developed. Using the model, the impact of a voltage sag on the wind turbine tower vibration is investigated under various operating conditions such as power system short circuit level, mechanical parameters, and wind turbine operating conditions. It is shown how an electrical disturbance can cause more sustainable tower vibrations under high speed and turbulent wind conditions, which may disrupt the operation of pitch control system. A similar simulation structure is used to model a two-bladed fixed speed wind turbine with an induction generator. An extension of the concept is introduced by adding a diesel generator system. The model is utilized to study the impact of the aeroelastic aspects of wind turbine (i.e. tower shadow, wind shears, yaw error, turbulence, and mechanical vibrations) on the power quality of a stand-alone wind-diesel system. Furthermore, an IEEE standard flickermeter model is implemented in a

  14. Modular multiple sensors information management for computer-integrated surgery.

    PubMed

    Vaccarella, Alberto; Enquobahrie, Andinet; Ferrigno, Giancarlo; Momi, Elena De

    2012-09-01

    In the past 20 years, technological advancements have modified the concept of modern operating rooms (ORs) with the introduction of computer-integrated surgery (CIS) systems, which promise to enhance the outcomes, safety and standardization of surgical procedures. With CIS, different types of sensor (mainly position-sensing devices, force sensors and intra-operative imaging devices) are widely used. Recently, the need for a combined use of different sensors raised issues related to synchronization and spatial consistency of data from different sources of information. In this study, we propose a centralized, multi-sensor management software architecture for a distributed CIS system, which addresses sensor information consistency in both space and time. The software was developed as a data server module in a client-server architecture, using two open-source software libraries: Image-Guided Surgery Toolkit (IGSTK) and OpenCV. The ROBOCAST project (FP7 ICT 215190), which aims at integrating robotic and navigation devices and technologies in order to improve the outcome of the surgical intervention, was used as the benchmark. An experimental protocol was designed in order to prove the feasibility of a centralized module for data acquisition and to test the application latency when dealing with optical and electromagnetic tracking systems and ultrasound (US) imaging devices. Our results show that a centralized approach is suitable for minimizing synchronization errors; latency in the client-server communication was estimated to be 2 ms (median value) for tracking systems and 40 ms (median value) for US images. The proposed centralized approach proved to be adequate for neurosurgery requirements. Latency introduced by the proposed architecture does not affect tracking system performance in terms of frame rate and limits US images frame rate at 25 fps, which is acceptable for providing visual feedback to the surgeon in the OR. Copyright © 2012 John Wiley & Sons, Ltd.

  15. Lidar Tracking of Multiple Fluorescent Tracers: Method and Field Test

    NASA Technical Reports Server (NTRS)

    Eberhard, Wynn L.; Willis, Ron J.

    1992-01-01

    Past research and applications have demonstrated the advantages and usefulness of lidar detection of a single fluorescent tracer to track air motions. Earlier researchers performed an analytical study that showed good potential for lidar discrimination and tracking of two or three different fluorescent tracers at the same time. The present paper summarizes the multiple fluorescent tracer method, discusses its expected advantages and problems, and describes our field test of this new technique.

  16. Measuring multiple residual-stress components using the contour method and multiple cuts

    SciTech Connect

    Prime, Michael B; Swenson, Hunter; Pagliaro, Pierluigi; Zuccarello, Bernardo

    2009-01-01

    The conventional contour method determines one component of stress over the cross section of a part. The part is cut into two, the contour of the exposed surface is measured, and Bueckner's superposition principle is analytically applied to calculate stresses. In this paper, the contour method is extended to the measurement of multiple stress components by making multiple cuts with subsequent applications of superposition. The theory and limitations are described. The theory is experimentally tested on a 316L stainless steel disk with residual stresses induced by plastically indenting the central portion of the disk. The stress results are validated against independent measurements using neutron diffraction. The theory has implications beyond just multiple cuts. The contour method measurements and calculations for the first cut reveal how the residual stresses have changed throughout the part. Subsequent measurements of partially relaxed stresses by other techniques, such as laboratory x-rays, hole drilling, or neutron or synchrotron diffraction, can be superimposed back to the original state of the body.

  17. New methods for the numerical integration of ordinary differential equations and their application to the equations of motion of spacecraft

    NASA Technical Reports Server (NTRS)

    Banyukevich, A.; Ziolkovski, K.

    1975-01-01

    A number of hybrid methods for solving Cauchy problems are described on the basis of an evaluation of advantages of single and multiple-point numerical integration methods. The selection criterion is the principle of minimizing computer time. The methods discussed include the Nordsieck method, the Bulirsch-Stoer extrapolation method, and the method of recursive Taylor-Steffensen power series.

  18. Solution of elastoplastic torsion problem by boundary integral method

    NASA Technical Reports Server (NTRS)

    Mendelson, A.

    1975-01-01

    The boundary integral method was applied to the elastoplastic analysis of the torsion of prismatic bars, and the results are compared with those obtained by the finite difference method. Although fewer unknowns were used, very good accuracy was obtained with the boundary integral method. Both simply and multiply connected bodies can be handled with equal ease.

  20. An integrated decision making approach for assessing healthcare waste treatment technologies from a multiple stakeholder perspective.

    PubMed

    Shi, Hua; Liu, Hu-Chen; Li, Ping; Xu, Xue-Guo

    2017-01-01

    With increased worldwide awareness of environmental issues, healthcare waste (HCW) management has received much attention from both researchers and practitioners over the past decade. The task of selecting the optimum treatment technology for HCWs is a challenging decision making problem involving conflicting evaluation criteria and multiple stakeholders. In this paper, we develop an integrated decision making framework based on the cloud model and the MABAC method for evaluating and selecting the best HCW treatment technology from a multiple stakeholder perspective. The introduced framework deals with uncertain linguistic assessments of alternatives by using interval 2-tuple linguistic variables, determines decision makers' relative weights based on the uncertainty and divergence degrees of each decision maker, and obtains the ranking of all HCW disposal alternatives with the aid of an extended MABAC method. Finally, an empirical example from Shanghai, China, is provided to illustrate the feasibility and effectiveness of the proposed approach. Results indicate that the proposed methodology is more suitable and effective for handling the HCW treatment technology selection problem in a vague and uncertain information environment.
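The abstract's extended MABAC operates on interval 2-tuple linguistic variables; as a reference point, a sketch of the classical crisp MABAC ranking it extends is shown below (linear normalization, weighted shift, border approximation area via geometric mean, distance-based scores). The weights and benefit/cost flags are illustrative inputs.

```python
import numpy as np

def mabac(X, weights, benefit):
    """Classical (crisp) MABAC ranking; the cited paper extends this
    scheme to interval 2-tuple linguistic assessments.

    X: alternatives-by-criteria matrix; weights: criteria weights;
    benefit: True for benefit criteria, False for cost criteria.
    """
    X = np.asarray(X, float)
    weights = np.asarray(weights, float)
    lo, hi = X.min(axis=0), X.max(axis=0)
    # Linear normalization (direction flipped for cost criteria).
    R = np.where(benefit, (X - lo) / (hi - lo), (hi - X) / (hi - lo))
    V = weights * (R + 1.0)                    # weighted normalized matrix
    G = np.prod(V, axis=0) ** (1.0 / len(X))   # border approximation area
    return (V - G).sum(axis=1)                 # higher score = better
```

An alternative scoring above the border approximation area on every criterion dominates the ranking; one below it on every criterion ranks last.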

  1. Multiple-Time Step Ab Initio Molecular Dynamics Based on Two-Electron Integral Screening.

    PubMed

    Fatehi, Shervin; Steele, Ryan P

    2015-03-10

    A multiple-timestep ab initio molecular dynamics scheme based on varying the two-electron integral screening method used in Hartree-Fock or density functional theory calculations is presented. Although screening is motivated by numerical considerations, it is also related to separations in the length- and timescales characterizing forces in a molecular system: Loose thresholds are sufficient to describe fast motions over short distances, while tight thresholds may be employed for larger length scales and longer times, leading to a practical acceleration of ab initio molecular dynamics simulations. Standard screening approaches can lead, however, to significant discontinuities in (and inconsistencies between) the energy and gradient when the screening threshold is loose, making them inappropriate for use in dynamics. To remedy this problem, a consistent window-screening method that smooths these discontinuities is devised. Further algorithmic improvements reuse electronic-structure information within the dynamics step and enhance efficiency relative to a naïve multiple-timestepping protocol. The resulting scheme is shown to realize meaningful reductions in the cost of Hartree-Fock and B3LYP simulations of a moderately large system, the protonated sarcosine/glycine dipeptide embedded in a 19-water cluster.
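The multiple-timestep idea itself (independent of the screening-threshold details) follows the r-RESPA pattern: apply the expensive, slowly varying force at the outer step and the cheap, rapidly varying force at every inner substep. A generic sketch with stand-in force callables:

```python
def respa_step(x, v, m, fast_force, slow_force, dt, n_inner):
    """One r-RESPA multiple-timestep step (velocity Verlet inside an
    impulse splitting): the expensive 'slow' force -- here standing in
    for the tightly screened contribution -- is applied only at the
    outer step; the cheap 'fast' force acts at every inner substep."""
    v = v + 0.5 * dt * slow_force(x) / m      # outer half-kick
    h = dt / n_inner
    for _ in range(n_inner):                  # inner velocity Verlet
        v = v + 0.5 * h * fast_force(x) / m
        x = x + h * v
        v = v + 0.5 * h * fast_force(x) / m
    v = v + 0.5 * dt * slow_force(x) / m      # outer half-kick
    return x, v
```

For a harmonic test force split as -0.9x (fast) and -0.1x (slow), the scheme conserves energy to the accuracy expected of velocity Verlet while evaluating the slow force only once per outer step.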

  2. Integrating multiple scientific computing needs via a Private Cloud infrastructure

    NASA Astrophysics Data System (ADS)

    Bagnasco, S.; Berzano, D.; Brunetti, R.; Lusso, S.; Vallero, S.

    2014-06-01

    In a typical scientific computing centre, diverse applications coexist and share a single physical infrastructure. An underlying Private Cloud facility eases the management and maintenance of heterogeneous use cases such as multipurpose or application-specific batch farms, Grid sites catering to different communities, parallel interactive data analysis facilities and others. It allows resources to be dynamically and efficiently allocated to any application, and virtual machines to be tailored to the applications' requirements. Furthermore, the maintenance of large deployments of complex and rapidly evolving middleware and application software is eased by the use of virtual images and contextualization techniques; for example, rolling updates can be performed easily while minimizing downtime. In this contribution we describe the Private Cloud infrastructure at the INFN-Torino Computer Centre, which hosts a full-fledged WLCG Tier-2 site, a dynamically expandable PROOF-based Interactive Analysis Facility for the ALICE experiment at the CERN LHC, and several smaller scientific computing applications. The Private Cloud building blocks include the OpenNebula software stack, the GlusterFS filesystem (used in two different configurations for worker- and service-class hypervisors) and the OpenWRT Linux distribution (used for network virtualization). A future integration into a federated higher-level infrastructure is made possible by exposing commonly used APIs like EC2 and by using mainstream contextualization tools like CloudInit.

  3. Propagation error minimization method for multiple structural displacement monitoring system

    NASA Astrophysics Data System (ADS)

    Jeon, Haemin; Shin, Jae-Uk; Myung, Hyun

    2013-04-01

    In a previous study, a visually servoed paired structured light system (ViSP), which is composed of two sides facing each other, each with one or two lasers, a 2-DOF manipulator, a camera, and a screen, was proposed. The lasers project their parallel beams onto the screen on the opposite side, and the 6-DOF relative displacement between the two sides is estimated by calculating the positions of the projected laser beams and the rotation angles of the manipulators. To apply the system to massive civil structures such as long-span bridges or high-rise buildings, the whole area should be divided into multiple partitions, with a ViSP module placed in each partition in a cascaded manner. In other words, the movement of the entire structure can be monitored by multiplying the estimated displacements from multiple ViSP modules. In this multiplication, however, there is a major problem: the displacement estimation error is propagated throughout the multiple modules. To solve this problem, a propagation error minimization method (PEMM), which uses a Newton-Raphson formulation inspired by the error back-propagation algorithm, is proposed. In this method, the propagation error at the last module is calculated and then the estimated displacement from the ViSP at each partition is updated in reverse order using the proposed PEMM, which minimizes the propagation error. To verify the performance of the proposed method, various simulations and experimental tests have been performed. The results show that the propagation error is significantly reduced after applying PEMM.

  4. Integrating Stratification and Information Approaches for Multiple Constrained CAT.

    ERIC Educational Resources Information Center

    Leung, Chi-Keung; Chang, Hua-Hua; Hau, Kit-Tai

    It is widely believed that item selection methods using the maximum information approach (MI) can maintain high efficiency in trait estimation by repeatedly choosing high discriminating (alpha) items. However, the consequence is that they lead to extremely skewed item exposure distribution in which items with high alpha values becoming overly…

  5. Numerical Simulation of Antennas with Improved Integral Equation Method

    NASA Astrophysics Data System (ADS)

    Ma, Ji; Fang, Guang-You; Lu, Wei

    2015-08-01

    Simulating antennas around a conducting object is a challenging task in computational electromagnetics, which is concerned with the behaviour of electromagnetic fields. To analyze this model efficiently, an improved integral equation-fast Fourier transform (IE-FFT) algorithm is presented in this paper. The proposed scheme employs two Cartesian grids with different sizes and locations to enclose the antenna and the other object, respectively. On the one hand, the IE-FFT technique is used to store the matrix in a sparse form and accelerate the matrix-vector multiplication for each sub-domain independently. On the other hand, the mutual interaction between sub-domains is taken as an additional exciting voltage in each matrix equation. By updating the integral equations several times, the whole electromagnetic system reaches a stable status. Finally, the validity of the presented method is verified through the analysis of typical antennas in the presence of a conducting object. Supported in part by the China Postdoctoral Science Foundation under Grant No. 2014M550839 and in part by the Key Research Program of the Chinese Academy of Sciences under Grant No. KGZD-EW-603.

  6. Assessing District Energy Systems Performance Integrated with Multiple Thermal Energy Storages

    NASA Astrophysics Data System (ADS)

    Rezaie, Behnaz

    The goal of this study is to examine various energy resources in district energy (DE) systems and then to improve DE system performance through the application of multiple thermal energy storages (TES). This study sheds light on areas not yet investigated in detail. Throughout the research, the major components of the heat plant, the energy suppliers of the DE systems, and the TES characteristics are examined separately; the integration of various configurations of multiple TESs in the DE system is then analysed. In the first part of the study, various sources of energy are compared, in a consistent manner, financially and environmentally. The TES performance is then assessed from various aspects. Then, TES(s) and DE systems with several sources of energy are integrated and investigated as a heat process centre. The most efficient configurations of the multiple TESs integrated with the DE system are investigated. Some of the findings of this study are applied to an actual DE system. The outcomes of this study provide insight for researchers and engineers who work in this field, as well as for policy makers and project managers who are decision-makers. The accomplishments of the study are original developments in TESs and DE systems. As an original development, the Enviro-Economic Function is introduced to balance the economic and environmental aspects of energy-resource technologies in DE systems, and various configurations of multiple TESs, including series, parallel, and general grid, are developed. The related functions developed are the discharge temperature and energy of the TES, and the energy and exergy efficiencies of the TES. The instantaneous charging and discharging behavior of the TES is also investigated to obtain the charging temperature, the maximum charging temperature, the charging energy flow, the maximum heat flow capacity, the discharging temperature, the minimum charging temperature, the discharging energy flow, the maximum heat flow capacity, and performance

  7. Integration of multiple research disciplines on the International Space Station

    NASA Technical Reports Server (NTRS)

    Penley, N. J.; Uri, J.; Sivils, T.; Bartoe, J. D.

    2000-01-01

    The International Space Station will provide an extremely high-quality, long-duration microgravity environment for the conduct of research. In addition, the ISS offers a platform for performing observations of Earth and Space from a high-inclination orbit, outside of the Earth's atmosphere. This unique environment and observational capability offers the opportunity for advancement in a diverse set of research fields. Many of these disciplines do not relate to one another, and present widely differing approaches to study, as well as different resource and operational requirements. Significant challenges exist to ensure the highest quality research return for each investigation. Requirements from different investigations must be identified, clarified, integrated and communicated to ISS personnel in a consistent manner. Resources such as power, crew time, etc. must be apportioned to allow the conduct of each investigation. Decisions affecting research must be made at the strategic level as well as at a very detailed execution level. The timing of the decisions can range from years before an investigation to real-time operations. The international nature of the Space Station program adds to the complexity. Each participating country must be assured that their interests are represented during the entire planning and operations process. A process for making decisions regarding research planning, operations, and real-time replanning is discussed. This process ensures adequate representation of all research investigators. It provides a means for timely decisions, and it includes a means to ensure that all ISS International Partners have their programmatic interests represented. © 2000 Published by Elsevier Science Ltd. All rights reserved.

  9. Computational methods in metallic alloys within multiple scattering theory

    NASA Astrophysics Data System (ADS)

    Rusanu, Aurelian

    Designing materials, particularly at the nano-scale, is an important scientific research area. It includes a large spectrum of basic science and technological developments. In order to provide results that are relevant to real materials, quantum mechanical simulations involving thousands to millions of atoms must be carried out. The locally self-consistent multiple scattering (LSMS) method is the method of choice for such calculations because it has a technical feature called order-N scaling. We describe an implementation of the LSMS for massively parallel supercomputers using k-space and real-space methods. For magnetic materials, the constrained local moment approach and the exchange interaction method are used. We demonstrate our approach by calculating the electronic and magnetic structure of an iron nano-particle embedded in an iron aluminide crystal matrix.

  10. Differential operator multiplication method for fractional differential equations

    NASA Astrophysics Data System (ADS)

    Tang, Shaoqiang; Ying, Yuping; Lian, Yanping; Lin, Stephen; Yang, Yibo; Wagner, Gregory J.; Liu, Wing Kam

    2016-11-01

    Fractional derivatives play a very important role in modeling physical phenomena involving long-range correlation effects. However, they raise challenges of computational cost and memory storage when solved using currently well-developed numerical methods. In this paper, the differential operator multiplication method is proposed to address these issues, considering a reaction-advection-diffusion equation with a fractional derivative in time. The linear fractional differential equation is transformed into an integer-order differential equation by the proposed method, which can fundamentally resolve the aforementioned issues for select fractional differential equations. In such a transform, special attention should be paid to the initial conditions for the resulting differential equation of higher integer order. Through numerical experiments, we verify the proposed method for both fractional ordinary differential equations and partial differential equations.
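    As a worked instance of the operator-multiplication idea (our illustration, not an example taken from the paper), consider the Caputo fractional relaxation equation. Applying the half-order derivative to both sides raises the equation to integer order, at the price of an initial-condition term, which is exactly the caveat the abstract notes:

```latex
% Applying D_t^{1/2} to the fractional relaxation equation converts it to
% integer order; the composition D_t^{1/2} D_t^{1/2} = D_t holds only up
% to an initial-condition term:
\[
  D_t^{1/2} u = -k\,u, \qquad u(0) = u_0
  \;\Longrightarrow\;
  u'(t) = k^{2}\,u(t) - \frac{k\,u_0}{\sqrt{\pi t}} .
\]
% One can check directly that
% u(t) = u_0\, e^{k^{2} t}\,\operatorname{erfc}\!\bigl(k\sqrt{t}\bigr),
% the Mittag-Leffler solution of the fractional equation, satisfies both.
```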

  11. Integrated navigation method based on inertial navigation system and Lidar

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaoyue; Shi, Haitao; Pan, Jianye; Zhang, Chunxi

    2016-04-01

    An integrated navigation method based on the inertial navigation system (INS) and Lidar was proposed for land navigation. Compared with the traditional integrated navigation method and the dead reckoning (DR) method, the influence of the inertial measurement unit (IMU) scale factor and misalignment is considered in the new method. First, the influence of the IMU scale factor and misalignment on navigation accuracy was analyzed. Based on this analysis, the integrated system error model of INS and Lidar was established, in which the IMU scale factor and misalignment error states were included. Then the observability of the IMU error states was analyzed. According to the results of the observability analysis, the integrated system was optimized. Finally, numerical simulation and a vehicle test were carried out to validate the availability and utility of the proposed INS/Lidar integrated navigation method. Compared with the test results of the traditional integrated navigation method and the DR method, the proposed integrated navigation method yields higher navigation precision. Consequently, the IMU scale factor and misalignment error are effectively compensated by the proposed method, and the new integrated navigation method is valid.
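    The role of the scale-factor error state can be sketched in a minimal 1-D analogue, assuming a two-state error model and simulated Lidar position fixes (all parameters hypothetical); this is a much-reduced stand-in for the paper's full INS/Lidar error model:

```python
import numpy as np

# Minimal 1-D analogue: dead-reckoned position drifts because the velocity
# sensor has an unknown scale-factor error s; Lidar position fixes let a
# two-state Kalman filter [position error, scale factor] estimate s.
rng = np.random.default_rng(1)
dt, steps = 0.1, 600
s_true, v_true = 0.05, 2.0

x = np.zeros(2)                        # [e_p, s] error-state estimate
P = np.diag([1.0, 0.1])
R = 0.05 ** 2                          # Lidar position-fix noise variance
H = np.array([1.0, 0.0])

p_true, p_dr = 0.0, 0.0
for _ in range(steps):
    v_meas = (1 + s_true) * v_true     # scale-factor-corrupted velocity
    p_true += v_true * dt
    p_dr += v_meas * dt                # dead reckoning integrates the bias

    F = np.array([[1.0, v_meas * dt],  # e_p grows in proportion to s
                  [0.0, 1.0]])
    x = F @ x
    P = F @ P @ F.T

    z = p_dr - (p_true + rng.normal(0, 0.05))   # innovation vs. Lidar fix
    y = z - H @ x
    S = H @ P @ H + R
    K = P @ H / S
    x = x + K * y
    P = P - np.outer(K, H @ P)

print(x[1])   # estimated scale factor, close to s_true = 0.05
```

    Because the position error grows in proportion to the scale-factor error, the Lidar fixes make that state observable, mirroring the observability analysis in the abstract.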

  12. Field evaluation of personal sampling methods for multiple bioaerosols.

    PubMed

    Wang, Chi-Hsun; Chen, Bean T; Han, Bor-Cheng; Liu, Andrew Chi-Yeu; Hung, Po-Chen; Chen, Chih-Yong; Chao, Hsing Jasmine

    2015-01-01

    Ambient bioaerosols are ubiquitous in the daily environment and can affect health in various ways. However, few studies have been conducted to comprehensively evaluate personal bioaerosol exposure in occupational and indoor environments because of the complex composition of bioaerosols and the lack of standardized sampling/analysis methods. We conducted a study to determine the most efficient collection/analysis method for the personal exposure assessment of multiple bioaerosols. The sampling efficiencies of three filters and four samplers were compared. According to our results, polycarbonate (PC) filters had the highest relative efficiency, particularly for bacteria. Side-by-side sampling was conducted to evaluate the three filter samplers (with PC filters) and the NIOSH Personal Bioaerosol Cyclone Sampler. According to the results, the Button Aerosol Sampler and the IOM Inhalable Dust Sampler had the highest relative efficiencies for fungi and bacteria, followed by the NIOSH sampler. Personal sampling was performed in a pig farm to assess occupational bioaerosol exposure and to evaluate the sampling/analysis methods. The Button and IOM samplers yielded a similar performance for personal bioaerosol sampling at the pig farm. However, the Button sampler is more likely to be clogged at high airborne dust concentrations because of its higher flow rate (4 L/min). Therefore, the IOM sampler is a more appropriate choice for performing personal sampling in environments with high dust levels. In summary, the Button and IOM samplers with PC filters are efficient sampling/analysis methods for the personal exposure assessment of multiple bioaerosols.

  14. Lattice Boltzmann equation method for multiple immiscible continuum fluids.

    PubMed

    Spencer, T J; Halliday, I; Care, C M

    2010-12-01

    This paper generalizes our two-component lattice Boltzmann algorithm, extending it to describe N>2 mutually immiscible fluids in the isothermal continuum regime. Each fluid has an independent interfacial tension. While retaining all its computational advantages, we remove entirely the empiricism associated with contact behavior in our previous multiple immiscible fluid models [M. M. Dupin, Phys. Rev. E 73, 055701(R) (2006); Med. Eng. Phys. 28, 13 (2006)] while solidifying the physical foundations. Moreover, the model relies upon a fluid-fluid segregation which is simpler, computationally faster, and more free of artifacts (i.e., the interfacial microcurrent), and upon an interface-inducing force distribution which is analytic. The method is completely symmetric between any number of immiscible fluids and stable over a wide range of directly input interfacial tensions. We present data on the steady-state properties of the multiple-interface model, which are in good agreement with theory [R. E. Johnson and S. S. Sadhal, Annu. Rev. Fluid Mech. 17, 289 (1985)], specifically on the shapes of multidrop systems. We also analyze the kinetic and continuum-scale descriptions of the underlying two-component lattice Boltzmann model for immiscible fluids, extendable to more than two immiscible fluids. This extension requires (i) the use of a more local kinetic equation perturbation which is (ii) free from a reliance on measured interfacial curvature. It should be noted that, viewed simply as a two-component method, the continuum algorithm is inferior to our previous methods, reported by Lishchuk [Phys. Rev. E 67, 036701 (2003)] and Halliday [Phys. Rev. E 76, 026708 (2007)]. Greater stability and parameter range is achieved in multiple-drop simulations by using the forced multi-relaxation-time lattice Boltzmann method developed, along with (for completeness) a forced exactly incompressible Bhatnagar-Gross-Krook lattice Boltzmann model, in the Appendix. These appended schemes

  15. Enhancing subsurface information from the fusion of multiple geophysical methods

    NASA Astrophysics Data System (ADS)

    Jafargandomi, A.; Binley, A.

    2011-12-01

    Characterization of hydrologic systems is a key element in understanding and predicting their behaviour. Geophysical methods, especially electrical methods (e.g., electrical resistivity tomography (ERT), induced polarization (IP), and electromagnetics (EM)), are becoming popular for this purpose due to their non-invasive nature, high sensitivity to hydrological parameters, and speed of measurement. However, each geophysical method on its own provides only limited information about some of the subsurface parameters. Therefore, in order to achieve a comprehensive picture of the hydrologic system, fusion of multiple geophysical data sets can be beneficial. Although a number of fusion approaches have been proposed in the literature, an aspect that has generally been overlooked is the assessment of the information content of each measurement approach. Such an assessment provides useful insight for the design of future surveys. We develop a fusion strategy based on the capability of multiple geophysical methods to provide enough resolution to identify subsurface material parameters and structure. We apply a Bayesian framework to analyse the information in multiple geophysical data sets. In this approach, multiple geophysical data sets are fed into a Markov chain Monte Carlo (McMC) inversion algorithm and the information content of the post-inversion result (the posterior probability distribution) is quantified. We use Shannon's information measure to quantify the information obtained from the inversion of different combinations of geophysical data sets. In this strategy, information from multiple methods is brought together by introducing a joint likelihood function and/or constraining the prior information. We apply the fusion tool to one of the target sites of the EU FP7 project ModelProbe, which aims to develop technologies and tools for soil contamination assessment and site characterization. The target site is located close to Trecate (Novara - NW Italy). At this
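    The Shannon-information step can be sketched as follows, assuming a uniform prior and a synthetic stand-in for the McMC posterior (the parameter range, distributions, and bin grid are all hypothetical); the entropy drop from prior to posterior quantifies what the data contributed:

```python
import numpy as np

# Sketch of the information measure: compare the Shannon entropy of a
# broad prior against a narrowed posterior (a stand-in for McMC output),
# binned on the same grid so the entropies are comparable.
rng = np.random.default_rng(0)

def shannon_entropy(samples, bins):
    p, _ = np.histogram(samples, bins=bins)
    p = p / p.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))    # entropy in bits

bins = np.linspace(0, 100, 51)        # common grid over a parameter range
prior = rng.uniform(0, 100, 100_000)              # uninformative prior
posterior = rng.normal(42.0, 3.0, 100_000)        # McMC-style posterior

gain = shannon_entropy(prior, bins) - shannon_entropy(posterior, bins)
print(gain)   # positive: the data reduced uncertainty (in bits)
```

    Comparing this gain across different combinations of data sets is what ranks the survey designs in the strategy described above.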

  16. An Integrated Approach for Accessing Multiple Datasets through LANCE

    NASA Astrophysics Data System (ADS)

    Murphy, K. J.; Teague, M.; Conover, H.; Regner, K.; Beaumont, B.; Masuoka, E.; Vollmer, B.; Theobald, M.; Durbin, P.; Michael, K.; Boller, R. A.; Schmaltz, J. E.; Davies, D.; Horricks, K.; Ilavajhala, S.; Thompson, C. K.; Bingham, A.

    2011-12-01

    The NASA/GSFC Land Atmospheres Near-real time Capability for EOS (LANCE) provides imagery for approximately 40 data products from MODIS, AIRS, AMSR-E and OMI to support the applications community in the study of a variety of phenomena. Thirty-six of these products are available within 2.5 hours of observation at the spacecraft. The data set includes the population density data provided by the EOSDIS Socio-Economic Data and Applications Center (SEDAC). The purpose of this paper is to describe the variety of tools that have been developed by LANCE to support user access to the imagery. The long-standing Rapid Response system has been integrated into LANCE and is a major vehicle for the distribution of the imagery to end users; presently approximately 10,000 anonymous users per month access this imagery. The products are grouped into 14 application categories such as Smoke Plumes, Pollution, Fires, and Agriculture, and selecting any category will make relevant subsets of the 40 products available as possible overlays in an interactive Web Client utilizing the Web Mapping Service (WMS) to support user investigations (http://lance2.modaps.eosdis.nasa.gov/wms/). For example, selecting Severe Storms will include 6 products from MODIS, OMI, AIRS, and AMSR-E plus the SEDAC population density data. The client and WMS were developed using open-source technologies such as OpenLayers and MapServer and provide uniform, browser-based access to data products. All overlays are downloadable in PNG, JPEG, or GeoTIFF form, up to 200 MB per request. The WMS was beta-tested with the user community, and substantial performance improvements were made through techniques such as tile caching. LANCE established a partnership with the Physical Oceanography Distributed Active Archive Center (PO DAAC) to develop an alternative presentation of the 40 data products known as the State of the Earth (SOTE). This provides a Google Earth-based interface to the products grouped in

  17. Planarian shows decision-making behavior in response to multiple stimuli by integrative brain function.

    PubMed

    Inoue, Takeshi; Hoshino, Hajime; Yamashita, Taiga; Shimoyama, Seira; Agata, Kiyokazu

    2015-01-01

    Planarians belong to an evolutionarily early group of organisms that possess a central nervous system including a well-organized brain with a simple architecture but many types of neurons. Planarians display a number of behaviors, such as phototaxis and thermotaxis, in response to external stimuli, and it has been shown that various molecules and neural pathways in the brain are involved in controlling these behaviors. However, due to the lack of combinatorial assay methods, it remains obscure whether planarians possess higher brain functions, including integration in the brain, in which multiple signals coming from outside are coordinated and used in determining behavioral strategies. In the present study, we designed chemotaxis and thigmotaxis/kinesis tracking assays to measure several planarian behaviors in addition to those measured by phototaxis and thermotaxis assays previously established by our group, and used these tests to analyze planarian chemotactic and thigmotactic/kinetic behaviors. We found that headless planarian body fragments and planarians that had specifically lost neural activity following regeneration-dependent conditional gene knockdown (Readyknock) of synaptotagmin in the brain lost both chemotactic and thigmotactic behaviors, suggesting that neural activity in the brain is required for the planarian's chemotactic and thigmotactic behaviors. Furthermore, we compared the strength of phototaxis, chemotaxis, thigmotaxis/kinesis, and thermotaxis by presenting simultaneous binary stimuli to planarians. We found that planarians showed a clear order of predominance of these behaviors. For example, when planarians were simultaneously exposed to 400 lux of light and a chemoattractant, they showed chemoattractive behavior irrespective of the direction of the light source, although exposure to light of this intensity alone induces evasive behavior away from the light source. In contrast, when the light intensity was increased to 800 or 1600 lux and

  18. A Comparison of Multiple-Event Location Methods

    NASA Astrophysics Data System (ADS)

    Engdahl, E. R.; Rodi, W.; Bergman, E. A.; Waldhauser, F.; Pavlis, G. L.; Israelsson, H.; Dewey, J. W.

    2003-12-01

    Multiple-event location methods solve jointly for the location parameters (hypocenters and origin times) of seismic events in a cluster and for travel-time corrections at the stations recording the events. This paper reports some preliminary comparisons of five such methods that have been developed over the years: hypocentral decomposition (HDC), double differencing (DD), progressive multiple-event location (PMEL), joint hypocenter determination (JHD), and a recently developed algorithm based on grid search (GMEL). We have applied each method to two adjacent earthquake clusters in Turkey: 33 events from the 17 Aug 1999 Izmit earthquake sequence and 41 events from the 12 Nov 1999 Duzce sequence. Previously, Engdahl and Bergman (2001) had applied HDC to these clusters using Pn and teleseismic P arrival times from NEIC and ground-truth (local network) locations for a few of the events. Their data set comprised approximately 3500 arrivals at 640 stations for the Izmit cluster and 3200 arrivals at 600 stations for Duzce. We applied the other multiple-event location methods to the same set of phase picks, using the same phase identifications and fixed event depths that were used in the HDC analysis. While the five algorithms are quite different in their computational approach, our initial results indicate that the methods yield quite similar relative event locations when they are applied with the same data and assumptions. However, they resolve the trade-off between the centroid location of a cluster and station corrections differently, and they also differ in how they use ground-truth information to constrain this trade-off and obtain absolute event locations. The locations relative to the cluster centroids generally agreed within 5 km, but differed by on the order of 10 km in some instances. This may have to do with the different schemes for weighting data used by the different methods, which cannot always be equalized between methods. To test this hypothesis, we applied GMEL with

  19. Power-efficient method for IM-DD optical transmission of multiple OFDM signals.

    PubMed

    Effenberger, Frank; Liu, Xiang

    2015-05-18

    We propose a power-efficient method for transmitting multiple frequency-division multiplexed (FDM) orthogonal frequency-division multiplexing (OFDM) signals in intensity-modulation direct-detection (IM-DD) optical systems. This method is based on quadratic soft clipping in combination with odd-only channel mapping. We show, both analytically and experimentally, that the proposed approach is capable of improving the power efficiency by about 3 dB as compared to conventional FDM OFDM signals under practical bias conditions, making it a viable solution in applications such as optical fiber-wireless integrated systems where both IM-DD optical transmission and OFDM signaling are important.
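    The peak-limiting effect of soft clipping on a multicarrier signal can be illustrated with a generic quadratic soft clipper. This is a sketch under our own assumptions (knee/ceiling parameters and transfer curve are hypothetical, not the paper's exact scheme or its odd-only mapping), showing the PAPR reduction that makes such clipping attractive for IM-DD systems:

```python
import numpy as np

# Generic quadratic soft clipper: linear below a knee, quadratic blend up
# to a hard ceiling. An illustration only, not the paper's transfer curve.
def soft_clip(x, knee=0.5, limit=1.0):
    y = x.copy()
    a = np.abs(x)
    span = 2 * (limit - knee)
    mid = (a > knee) & (a < knee + span)
    hi = a >= knee + span
    # quadratic blend: slope 1 at the knee, slope 0 at the ceiling
    q = knee + (a[mid] - knee) - (a[mid] - knee) ** 2 / (2 * span)
    y[mid] = np.sign(x[mid]) * q
    y[hi] = np.sign(x[hi]) * limit
    return y

def papr_db(x):
    return 10 * np.log10(np.max(x ** 2) / np.mean(x ** 2))

rng = np.random.default_rng(0)
n, subcarriers = 1024, 64
# Real multicarrier baseband: sum of random-phase tones
t = np.arange(n) / n
x = sum(np.cos(2 * np.pi * (k + 1) * t + rng.uniform(0, 2 * np.pi))
        for k in range(subcarriers))
x /= np.sqrt(np.mean(x ** 2))          # normalize to unit power

before = papr_db(x)
after = papr_db(soft_clip(x, knee=1.5, limit=2.5))
print(before, after)                   # PAPR drops after soft clipping
```

    Trimming the rare high peaks lowers the PAPR, which is what lets the transmitter bias point be set more power-efficiently.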

  20. Determination of elementary first integrals of a generalized Raychaudhuri equation by the Darboux integrability method

    NASA Astrophysics Data System (ADS)

    Choudhury, A. Ghose; Guha, Partha; Khanra, Barun

    2009-10-01

    The Darboux integrability method is particularly useful to determine first integrals of nonplanar autonomous systems of ordinary differential equations, whose associated vector fields are polynomials. In particular, we obtain first integrals for a variant of the generalized Raychaudhuri equation, which has appeared in string inspired modern cosmology.
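    The mechanism behind the method is the standard Darboux construction: first integrals are assembled from polynomial invariant factors of the vector field.

```latex
% If the polynomials f_i satisfy X(f_i) = K_i f_i for the vector field X
% (Darboux polynomials with polynomial cofactors K_i), and constants
% \lambda_i can be chosen with \sum_i \lambda_i K_i = 0, then
\[
  X\Bigl(\prod_i f_i^{\lambda_i}\Bigr)
  = \Bigl(\prod_i f_i^{\lambda_i}\Bigr)\sum_i \lambda_i K_i = 0,
\]
% so \prod_i f_i^{\lambda_i} is a first integral of the system.
```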

  1. Integrated Multiple “-omics” Data Reveal Subtypes of Hepatocellular Carcinoma

    PubMed Central

    Liu, Gang; Dong, Chuanpeng; Liu, Lei

    2016-01-01

    Hepatocellular carcinoma is one of the most heterogeneous cancers, as reflected by its multiple grades and difficulty to subtype. In this study, we integrated copy number variation, DNA methylation, mRNA, and miRNA data with the developed “cluster of cluster” method and classified 256 HCC samples from TCGA (The Cancer Genome Atlas) into five major subgroups (S1-S5). We observed that this classification was associated with specific mutations and protein expression, and we detected that each subgroup had distinct molecular signatures. The subclasses were associated not only with survival but also with clinical observations. S1 was characterized by bulk amplification on 8q24, TP53 mutation, low lipid metabolism, highly expressed onco-proteins, attenuated tumor suppressor proteins and a worse survival rate. S2 and S3 were characterized by telomere hypomethylation and a low expression of TERT and DNMT1/3B. Compared to S2, S3 was associated with less copy number variation and some good prognosis biomarkers, including CRP and CYP2E1. In contrast, the mutation rate of CTNNB1 was higher in S3. S4 was associated with bulk amplification and various molecular characteristics at different biological levels. In summary, we classified the HCC samples into five subgroups using multiple “-omics” data. Each subgroup had a distinct survival rate and molecular signature, which may provide information about the pathogenesis of subtypes in HCC. PMID:27806083
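    The "cluster of cluster" idea can be sketched on toy data: cluster each omics layer separately, record how often pairs of samples co-cluster, then cluster that consensus matrix. This is a minimal sketch with hypothetical synthetic data and a hand-rolled k-means, not the authors' pipeline:

```python
import numpy as np

# Toy "cluster of clusters" sketch: cluster each layer, accumulate a
# co-clustering consensus matrix, then cluster the consensus rows.
rng = np.random.default_rng(0)

def kmeans2(X, iters=30):
    """Two-cluster k-means with deterministic farthest-point init."""
    c0 = X[0]
    c1 = X[np.argmax(((X - c0) ** 2).sum(1))]
    centers = np.stack([c0, c1])
    for _ in range(iters):
        d = ((X[:, None, :] - centers[None]) ** 2).sum(-1)
        labels = d.argmin(1)
        centers = np.stack([X[labels == j].mean(0) for j in range(2)])
    return labels

# Two layers (stand-ins for e.g. methylation and mRNA) sharing the same
# hidden two-group structure across 60 samples
truth = np.repeat([0, 1], 30)
layers = [rng.normal(truth[:, None] * 4.0, 1.0, size=(60, 5))
          for _ in range(2)]

# Consensus: fraction of layers in which two samples co-cluster
C = np.zeros((60, 60))
for X in layers:
    lab = kmeans2(X)
    C += (lab[:, None] == lab[None, :]).astype(float)
C /= len(layers)

final = kmeans2(C)                   # cluster samples by consensus profile
agree = max((final == truth).mean(), (final != truth).mean())
print(agree)   # label agreement with ground truth, up to permutation
```

    In the real analysis, each layer's clustering (copy number, methylation, mRNA, miRNA) contributes one vote to the consensus, and the final partition defines the integrated subgroups.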

  2. Methods for radiation detection and characterization using a multiple detector probe

    DOEpatents

    Akers, Douglas William; Roybal, Lyle Gene

    2014-11-04

    Apparatuses, methods, and systems relating to radiological characterization of environments are disclosed. Multi-detector probes with a plurality of detectors in a common housing may be used to substantially concurrently detect a plurality of different radiation activities and types. Multiple multi-detector probes may be used in a down-hole environment to substantially concurrently detect radioactive activity and contents of a buried waste container. Software may process, analyze, and integrate the data from the different multi-detector probes and the different detector types therein to provide source location and integrated analysis as to the source types and activity in the measured environment. Further, the integrated data may be used to compensate for differential density effects and the effects of radiation shielding materials within the volume being measured.

  3. Towards Robust Designs Via Multiple-Objective Optimization Methods

    NASA Technical Reports Server (NTRS)

    Man Mohan, Rai

    2006-01-01

    Fabricating and operating complex systems involves dealing with uncertainty in the relevant variables. In the case of aircraft, flow conditions are subject to change during operation. Efficiency and engine noise may be different from the expected values because of manufacturing tolerances and normal wear and tear. Engine components may have a shorter life than expected because of manufacturing tolerances. In spite of the important effect of operating and manufacturing uncertainty on the performance and expected life of the component or system, traditional aerodynamic shape optimization has focused on obtaining the best design given a set of deterministic flow conditions. Clearly it is important both to maintain near-optimal performance levels at off-design operating conditions and to ensure that performance does not degrade appreciably when the component shape differs from the optimal shape due to manufacturing tolerances and normal wear and tear. These requirements naturally lead to the idea of robust optimal design, wherein the concept of robustness to various perturbations is built into the design optimization procedure. The basic ideas involved in robust optimal design will be included in this lecture. The imposition of the additional requirement of robustness results in a multiple-objective optimization problem requiring appropriate solution procedures. Typically the costs associated with multiple-objective optimization are substantial. Therefore efficient multiple-objective optimization procedures are crucial to the rapid deployment of the principles of robust design in industry. Hence the companion set of lecture notes (Single- and Multiple-Objective Optimization with Differential Evolution and Neural Networks) deals with methodology for solving multiple-objective optimization problems efficiently, reliably, and with little user intervention. Applications of the methodologies presented in the companion lecture to robust design will be included here. The

  4. Robust control of multiple integrators subject to input saturation and disturbance

    NASA Astrophysics Data System (ADS)

    Ding, Shihong; Zheng, Wei Xing

    2015-04-01

    This paper is concerned with the problem of robust stabilisation of multiple-integrator systems subject to input saturation and disturbance, from the viewpoints of state feedback and output feedback. First of all, without considering the disturbance, a backstepping-like method in conjunction with a series of saturation functions with different saturation levels is employed to design a nested-saturation based state-feedback controller with pre-chosen parameters. On this basis, taking the disturbance into account, a sliding mode disturbance observer (DOB) is adopted to estimate the states and the disturbance. Then, by combining the above state-feedback controller and the estimated states together, a composite controller with disturbance compensation is developed. With the removal of the non-increasing restriction on the saturation levels, the controller design becomes very flexible and the convergence performance of the closed-loop system is much improved. Meanwhile, with the aid of the estimated values from the DOB, we obtain not only the output-feedback control scheme but also a better disturbance rejection property for the closed-loop system. A simulation example of a triple-integrator system is presented to substantiate the usefulness of the proposed technique.
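    The nested-saturation idea can be sketched on the simplest case, a double integrator with the classical Teel-style construction (a simplified stand-in for the paper's multiple-integrator scheme; the gains and saturation levels are our own choices):

```python
# Nested-saturation stabilization of a double integrator:
# u = -sat_e2(x2 + sat_e1(x1 + x2)), with levels chosen so e1 <= e2/2.
def sat(v, level):
    return max(-level, min(level, v))

def control(x1, x2, e1=0.2, e2=0.5):
    return -sat(x2 + sat(x1 + x2, e1), e2)

# Forward-Euler simulation from a large initial condition
x1, x2, dt = 2.0, -1.0, 0.005
for _ in range(24_000):                # 120 s of simulated time
    u = control(x1, x2)
    x1 += x2 * dt
    x2 += u * dt

print(x1, x2)   # both states driven near zero despite |u| <= 0.5
```

    Near the origin the saturations become inactive and the loop reduces to the linear law u = -x1 - 2x2, which gives the local exponential convergence seen in the simulation.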

  5. Thermally integrated staged methanol reformer and method

    DOEpatents

    Skala, Glenn William; Hart-Predmore, David James; Pettit, William Henry; Borup, Rodney Lynn

    2001-01-01

    A thermally integrated two-stage methanol reformer including a heat exchanger and first and second reactors colocated in a common housing in which a gaseous heat transfer medium circulates to carry heat from the heat exchanger into the reactors. The heat transfer medium comprises principally hydrogen, carbon dioxide, methanol vapor and water vapor formed in a first stage reforming reaction. A small portion of the circulating heat transfer medium is drawn off and reacted in a second stage reforming reaction which substantially completes the reaction of the methanol and water remaining in the drawn-off portion. Preferably, a PrOx reactor will be included in the housing upstream of the heat exchanger to supplement the heat provided by the heat exchanger.

  6. Multiple-time-stepping generalized hybrid Monte Carlo methods

    SciTech Connect

    Escribano, Bruno; Akhmatskaya, Elena; Reich, Sebastian; Azpiroz, Jon M.

    2015-01-01

    Performance of the generalized shadow hybrid Monte Carlo (GSHMC) method [1], which proved superior in sampling efficiency to its predecessors [2–4], molecular dynamics and hybrid Monte Carlo, can be further improved by combining it with multiple-time-stepping (MTS) and mollification of slow forces. We demonstrate that these comparatively simple modifications not only improve the performance of GSHMC itself but also allow it to beat the best-performing methods that use similar force-splitting schemes. In addition, we show that the same ideas can be successfully applied to the conventional generalized hybrid Monte Carlo (GHMC) method. The resulting methods, MTS-GHMC and MTS-GSHMC, provide accurate reproduction of thermodynamic and dynamical properties, exact temperature control during simulation, and computational robustness and efficiency. MTS-GHMC uses a generalized momentum update to achieve weak stochastic stabilization of the molecular dynamics (MD) integrator. MTS-GSHMC additionally uses a shadow (modified) Hamiltonian to filter the MD trajectories in the HMC scheme. We introduce a new shadow-Hamiltonian formulation adapted to force-splitting methods; the use of such Hamiltonians improves the acceptance rate of trajectories and has a strong impact on the sampling efficiency of the method. Both methods were implemented in the open-source MD package ProtoMol and tested on water and protein systems. Results were compared with those obtained using the Langevin Molly (LM) method [5] on the same systems. The tests demonstrate the superiority of the new methods over LM in terms of stability, accuracy, and sampling efficiency. This suggests that placing the MTS approach in the framework of hybrid Monte Carlo, and using the natural stochasticity offered by generalized hybrid Monte Carlo, improves the stability of MTS and allows larger step sizes in the simulation of complex systems.
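
    The baseline that GSHMC/GHMC extend is plain hybrid (Hamiltonian) Monte Carlo: leapfrog-integrate Hamiltonian dynamics, then Metropolis-correct the discretization error. A minimal one-dimensional sketch (no shadow Hamiltonian, no MTS, all step-size choices invented for the demo):

```python
# Minimal plain hybrid (Hamiltonian) Monte Carlo targeting a standard normal,
# U(q) = q^2 / 2. Leapfrog integrates the Hamiltonian dynamics; a Metropolis
# test corrects the time-discretization error.
import math, random

random.seed(1)

def grad_U(q):
    return q                                # d/dq of U(q) = q^2 / 2

def leapfrog(q, p, eps, L):
    p -= 0.5 * eps * grad_U(q)              # half kick
    for _ in range(L - 1):
        q += eps * p                        # drift
        p -= eps * grad_U(q)                # full kick
    q += eps * p
    p -= 0.5 * eps * grad_U(q)              # final half kick
    return q, p

def hmc(n_samples, eps=0.2, L=10):
    q, samples = 0.0, []
    for _ in range(n_samples):
        p = random.gauss(0.0, 1.0)          # fresh momentum each trajectory
        H0 = 0.5 * p * p + 0.5 * q * q
        q_new, p_new = leapfrog(q, p, eps, L)
        H1 = 0.5 * p_new * p_new + 0.5 * q_new * q_new
        if random.random() < math.exp(H0 - H1):   # Metropolis accept/reject
            q = q_new
        samples.append(q)
    return samples

s = hmc(5000)
mean = sum(s) / len(s)
var = sum((x - mean) ** 2 for x in s) / len(s)
print(mean, var)   # should be near 0 and 1
```

    GSHMC replaces H in the accept test with a shadow Hamiltonian that leapfrog conserves more accurately, raising acceptance; the MTS variants split the force evaluation across step sizes.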

  7. Future Directions in Vulnerability to Depression among Youth: Integrating Risk Factors and Processes across Multiple Levels of Analysis

    PubMed Central

    Hankin, Benjamin L.

    2014-01-01

    Depression is a developmental phenomenon. Considerable progress has been made in describing the syndrome, establishing its prevalence and features, providing clues as to its etiology, and developing evidence-based treatment and prevention options. Despite considerable headway in distinct lines of vulnerability research, there is an explanatory gap in the field ability to more comprehensively explain and predict who is likely to become depressed, when, and why. Still, despite clear success in predicting moderate variance for future depression, especially with empirically rigorous methods and designs, the heterogeneous and multi-determined nature of depression suggests that additional etiologies need to be included to advance knowledge on developmental pathways to depression. This paper advocates for a multiple levels of analysis approach to investigating vulnerability to depression across the lifespan and providing a more comprehensive understanding of its etiology. One example of a multiple levels of analysis model of vulnerabilities to depression is provided that integrates the most accessible, observable factors (e.g., cognitive and temperament risks), intermediate processes and endophenotypes (e.g., information processing biases, biological stress physiology, and neural activation and connectivity), and genetic influences (e.g., candidate genes and epigenetics). Evidence for each of these factors as well as their cross-level integration is provided. Methodological and conceptual considerations important for conducting integrative, multiple levels of depression vulnerability research are discussed. Finally, translational implications for how a multiple levels of analysis perspective may confer additional leverage to reduce the global burden of depression and improve care are considered. PMID:22900513

  8. A multiple-phenotype imputation method for genetic studies.

    PubMed

    Dahl, Andrew; Iotchkova, Valentina; Baud, Amelie; Johansson, Åsa; Gyllensten, Ulf; Soranzo, Nicole; Mott, Richard; Kranis, Andreas; Marchini, Jonathan

    2016-04-01

    Genetic association studies have yielded a wealth of biological discoveries. However, these studies have mostly analyzed one trait and one SNP at a time, thus failing to capture the underlying complexity of the data sets. Joint genotype-phenotype analyses of complex, high-dimensional data sets represent a promising way to move beyond simple genome-wide association studies (GWAS). The move to high-dimensional phenotypes will raise many new statistical problems. Here we address the central issue of missing phenotypes in studies with any level of relatedness between samples. We propose a multiple-phenotype mixed model and use a computationally efficient variational Bayesian algorithm to fit the model. On a variety of simulated and real data sets from a range of organisms and trait types, we show that our method outperforms existing state-of-the-art methods from the statistics and machine learning literature and can boost signals of association.
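
    The core idea — that a correlated phenotype carries information about a missing one — can be shown with a much simpler stand-in than the paper's variational Bayesian mixed model. The toy below (all numbers invented) imputes a missing phenotype by its conditional mean given an observed, correlated phenotype, and compares against naive mean imputation.

```python
# Toy analogue of model-based phenotype imputation (NOT the paper's variational
# Bayesian mixed model): two correlated phenotypes, some y2 values missing at
# random; impute each missing y2 by its conditional mean given y1, estimated
# from complete cases.
import random, math

random.seed(2)
n, rho = 2000, 0.8
y1 = [random.gauss(0, 1) for _ in range(n)]
y2 = [rho * a + math.sqrt(1 - rho * rho) * random.gauss(0, 1) for a in y1]
missing = set(random.sample(range(n), 400))

obs = [i for i in range(n) if i not in missing]
m1 = sum(y1[i] for i in obs) / len(obs)
m2 = sum(y2[i] for i in obs) / len(obs)
c11 = sum((y1[i] - m1) ** 2 for i in obs) / len(obs)
c12 = sum((y1[i] - m1) * (y2[i] - m2) for i in obs) / len(obs)
beta = c12 / c11                       # regression slope of y2 on y1

rmse_cond = math.sqrt(sum((m2 + beta * (y1[i] - m1) - y2[i]) ** 2
                          for i in missing) / len(missing))
rmse_mean = math.sqrt(sum((m2 - y2[i]) ** 2 for i in missing) / len(missing))
print(rmse_cond, rmse_mean)   # conditional imputation should have lower error
```

    With correlation 0.8 the conditional imputation cuts the error to roughly sqrt(1 - 0.8^2) of the naive value; the published method generalizes this to many phenotypes and related samples.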

  9. A multiple phenotype imputation method for genetic studies

    PubMed Central

    Dahl, Andrew; Iotchkova, Valentina; Baud, Amelie; Johansson, Åsa; Gyllensten, Ulf; Soranzo, Nicole; Mott, Richard; Kranis, Andreas; Marchini, Jonathan

    2016-01-01

    Genetic association studies have yielded a wealth of biological discoveries. However, these have mostly analyzed one trait and one SNP at a time, thus failing to capture the underlying complexity of these datasets. Joint genotype-phenotype analyses of complex, high-dimensional datasets represent a promising way to move beyond simple GWAS. The move to high-dimensional phenotypes will raise many new statistical problems. In this paper we address the central issue of missing phenotypes in studies with any level of relatedness between samples. We propose a multiple phenotype mixed model and use a computationally efficient variational Bayesian algorithm to fit the model. On a variety of simulated and real datasets from a range of organisms and trait types, we show that our method outperforms existing state-of-the-art methods from the statistics and machine learning literature and can boost signals of association. PMID:26901065

  10. Higher order time integration methods for two-phase flow

    NASA Astrophysics Data System (ADS)

    Kees, Christopher E.; Miller, Cass T.

    Time integration methods that adapt in both the order of approximation and time step have been shown to provide efficient solutions to Richards' equation. In this work, we extend the same method of lines approach to solve a set of two-phase flow formulations and address some mass conservation issues from the previous work. We analyze these formulations and the nonlinear systems that result from applying the integration methods, placing particular emphasis on their index, range of applicability, and mass conservation characteristics. We conduct numerical experiments to study the behavior of the numerical models for three test problems. We demonstrate that higher order integration in time is more efficient than standard low-order methods for a variety of practical grids and integration tolerances, that the adaptive scheme successfully varies the step size in response to changing conditions, and that mass balance can be maintained efficiently using variable-order integration and an appropriately chosen numerical model formulation.
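
    The mechanism behind such integrators — estimating the local error and adapting the step — can be sketched with an embedded first/second-order pair on a stiff scalar test problem (the test equation and tolerances below are invented for the demo; the paper's solvers additionally vary the *order*).

```python
# Sketch of adaptive time stepping with an embedded Euler/Heun pair on the
# stiff linear test problem y' = -50 (y - cos t): the step size shrinks in
# the fast transient and grows once the solution relaxes.
import math

def f(t, y):
    return -50.0 * (y - math.cos(t))

tol, t, y, dt = 1e-5, 0.0, 0.0, 1e-5
dts = []
while t < 2.0:
    dt = min(dt, 2.0 - t)
    k1 = f(t, y)
    k2 = f(t + dt, y + dt * k1)
    y_euler = y + dt * k1                  # 1st-order solution
    y_heun = y + 0.5 * dt * (k1 + k2)      # 2nd-order solution
    err = abs(y_heun - y_euler)            # local error estimate
    if err <= tol:                         # accept the step
        t, y = t + dt, y_heun
        dts.append(dt)
    # standard step-size controller, with growth/shrink limits
    dt *= min(2.0, max(0.2, 0.9 * math.sqrt(tol / max(err, 1e-16))))

print(y, min(dts), max(dts))   # y(2) is near -0.3978; step sizes vary widely
```

    The wide spread between the smallest and largest accepted steps is the efficiency gain the abstract refers to: a fixed-step method would be forced to use the smallest step everywhere.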

  11. Treatment of domain integrals in boundary element methods

    SciTech Connect

    Nintcheu Fata, Sylvain

    2012-01-01

    A systematic and rigorous technique to calculate domain integrals without a volume-fitted mesh has been developed and validated in the context of a boundary element approximation. In the proposed approach, a domain integral involving a continuous or weakly-singular integrand is first converted into a surface integral by means of straight-path integrals that intersect the underlying domain. Then, the resulting surface integral is carried out either via analytic integration over boundary elements or by use of standard quadrature rules. This domain-to-boundary integral transformation is derived from an extension of the fundamental theorem of calculus to higher dimension, and the divergence theorem. In establishing the method, it is shown that the higher-dimensional version of the first fundamental theorem of calculus corresponds to the well-known Poincaré lemma. The proposed technique can be employed to evaluate integrals defined over simply- or multiply-connected domains with Lipschitz boundaries which are embedded in a Euclidean space of arbitrary but finite dimension. Combined with the singular treatment of surface integrals that is widely available in the literature, this approach can also be utilized to effectively deal with boundary-value problems involving non-homogeneous source terms by way of a collocation or a Galerkin boundary integral equation method using only the prescribed surface discretization. Sample problems associated with the three-dimensional Poisson equation and featuring the Newton potential are successfully solved by a constant element collocation method to validate this study.
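
    The domain-to-boundary idea rests on the divergence theorem: if div F = f, the domain integral of f equals the boundary flux of F, so no volume mesh is needed. A two-dimensional numerical check (example integrand chosen for the demo, not taken from the paper):

```python
# Numerical check of the domain-to-boundary transformation: with div F = f,
# the domain integral of f equals the boundary flux of F. Here f(x, y) = x^2
# over the unit disk, F = (x^3/3, 0); both routes should give pi/4.
import math

# Route 1: brute-force midpoint rule over the disk (volume-fitted "mesh").
N = 800
h = 2.0 / N
domain = 0.0
for i in range(N):
    x = -1.0 + (i + 0.5) * h
    for j in range(N):
        y = -1.0 + (j + 0.5) * h
        if x * x + y * y <= 1.0:
            domain += x * x * h * h

# Route 2: boundary-only integral of F . n over the unit circle,
# n = (cos t, sin t), ds = dt  =>  integrand = cos(t)^4 / 3.
M = 2000
boundary = sum(math.cos(2 * math.pi * k / M) ** 4 / 3.0
               for k in range(M)) * (2 * math.pi / M)

print(domain, boundary, math.pi / 4)
```

    Note that the boundary route needs only a surface (here: curve) discretization, which is exactly the advantage the abstract claims for boundary element methods with source terms.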

  12. An experiment to compare multiple methods for streamflow uncertainty estimation

    NASA Astrophysics Data System (ADS)

    Kiang, Julie; McMillan, Hilary; Gazoorian, Chris; Mason, Robert; Le Coz, Jerome; Renard, Benjamin; Mansanarez, Valentin; Westerberg, Ida; Petersen-Øverleir, Asgeir; Reitan, Trond; Sikorska, Anna; Seibert, Jan; Coxon, Gemma; Freer, Jim; Belleville, Arnaud; Hauet, Alexandre

    2017-04-01

    Stage-discharge rating curves are used to relate streamflow discharge to continuously measured river stage readings to create a continuous record of streamflow discharge. The stage-discharge relationship is estimated and refined using discrete streamflow measurements over time, during which both the discharge and stage are measured. There is uncertainty in the resulting rating curve due to multiple factors including the curve-fitting process, assumptions on the form of the model used, fluvial geomorphology of natural channels, and the approaches used to extrapolate the rating equation beyond available observations. This rating curve uncertainty leads to uncertainty in the streamflow time series, and therefore to uncertainty in predictive models that use the streamflow data. Many different methods have been proposed in the literature for estimating rating curve uncertainty, differing in mathematical rigor, in the assumptions made about the component errors, and in the information required to implement the method at any given site. This study describes the results of an international experiment to test and compare streamflow uncertainty estimation methods from 7 research groups across 9 institutions. The methods range from simple LOWESS fits to more complicated Bayesian methods that consider hydraulic principles directly. We evaluate these different methods when applied to three diverse gauging stations using standardized information (channel characteristics, hydrographs, and streamflow measurements). Our results quantify the resultant spread of the stage-discharge curves and compare the level of uncertainty attributed to the streamflow records by each different method. We provide insight into the sensitivity of streamflow uncertainty bounds to the choice of uncertainty estimation method, and discuss the implications for model uncertainty assessment.
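
    A minimal version of the problem: fit the standard power-law rating curve Q = a (h - h0)^b to discrete gaugings and attach an uncertainty band. The sketch below is one of the simplest approaches (least squares in log space plus a residual bootstrap), not any specific method from the experiment; the parameter values, noise level, and the assumption that h0 is known are all invented for the demo.

```python
# Sketch of rating-curve uncertainty: fit Q = a (h - h0)^b to synthetic
# gaugings (h0 assumed known, for simplicity) by least squares in log space,
# then bootstrap the residuals for an uncertainty interval on the exponent b.
import math, random

random.seed(3)
a_true, h0, b_true = 10.0, 0.2, 1.8
stages = [0.5 + 2.5 * i / 29 for i in range(30)]
flows = [a_true * (h - h0) ** b_true * math.exp(random.gauss(0, 0.05))
         for h in stages]

def fit(xs, ys):
    """Ordinary least-squares slope and intercept of ys on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys)) /
             sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

X = [math.log(h - h0) for h in stages]
Y = [math.log(q) for q in flows]
b_hat, loga_hat = fit(X, Y)
resid = [y - (loga_hat + b_hat * x) for x, y in zip(X, Y)]

boots = []
for _ in range(500):                      # residual bootstrap
    Yb = [loga_hat + b_hat * x + random.choice(resid) for x in X]
    boots.append(fit(X, Yb)[0])
boots.sort()
b_lo, b_hi = boots[12], boots[487]        # approximate 95% interval
print(b_hat, (b_lo, b_hi))
```

    The methods compared in the study differ mainly in how much more structure they add: hydraulically informed priors, heteroscedastic errors, multi-segment curves, and extrapolation uncertainty.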

  13. Lattice Boltzmann equation method for multiple immiscible continuum fluids

    NASA Astrophysics Data System (ADS)

    Spencer, T. J.; Halliday, I.; Care, C. M.

    2010-12-01

    This paper generalizes our earlier two-component lattice Boltzmann algorithm, extending it to describe N>2 mutually immiscible fluids in the isothermal continuum regime. Each fluid has an independent interfacial tension. While retaining all its computational advantages, we remove entirely the empiricism associated with contact behavior in our previous multiple immiscible fluid models [M. M. Dupin, Phys. Rev. E 73, 055701(R) (2006), 10.1103/PhysRevE.73.055701; Med. Eng. Phys. 28, 13 (2006), 10.1016/j.medengphy.2005.04.015] while solidifying the physical foundations. Moreover, the model relies upon a fluid-fluid segregation which is simpler, computationally faster, and more free of artifacts (i.e., the interfacial microcurrent), and upon an interface-inducing force distribution which is analytic. The method is completely symmetric between any number of immiscible fluids and stable over a wide range of directly input interfacial tensions. We present data on the steady-state properties of the multiple-interface model, which are in good agreement with theory [R. E. Johnson and S. S. Sadhal, Annu. Rev. Fluid Mech. 17, 289 (1985), 10.1146/annurev.fl.17.010185.001445], specifically on the shapes of multidrop systems. We also analyze the kinetic and continuum-scale descriptions of the underlying two-component lattice Boltzmann model for immiscible fluids, extendable to more than two immiscible fluids. This extension requires (i) the use of a more local kinetic-equation perturbation which is (ii) free from reliance on measured interfacial curvature. It should be noted that, viewed simply as a two-component method, the continuum algorithm is inferior to our previous methods, reported by Lishchuk [Phys. Rev. E 67, 036701 (2003), 10.1103/PhysRevE.76.036701] and Halliday [Phys. Rev. E 76, 026708 (2007), 10.1103/PhysRevE.76.026708]. Greater stability and parameter range are achieved in multiple-drop simulations by using a forced multi-relaxation-time lattice Boltzmann method.

  14. Integrated method for chaotic time series analysis

    DOEpatents

    Hively, L.M.; Ng, E.G.

    1998-09-29

    Methods and apparatus are disclosed for automatically detecting differences between similar but different states in a nonlinear process by monitoring nonlinear data. Steps include: acquiring the data; digitizing the data; obtaining nonlinear measures of the data via chaotic time series analysis; obtaining time-serial trends in the nonlinear measures; and determining by comparison whether differences between similar but different states are indicated. 8 figs.

  15. Integrated method for chaotic time series analysis

    DOEpatents

    Hively, Lee M.; Ng, Esmond G.

    1998-01-01

    Methods and apparatus for automatically detecting differences between similar but different states in a nonlinear process by monitoring nonlinear data. Steps include: acquiring the data; digitizing the data; obtaining nonlinear measures of the data via chaotic time series analysis; obtaining time-serial trends in the nonlinear measures; and determining by comparison whether differences between similar but different states are indicated.
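
    One standard "nonlinear measure" of the kind such chaotic time-series analysis computes is the correlation sum of a delay embedding. A minimal sketch on the logistic map (the patents do not specify their measures; this is a generic textbook example):

```python
# Correlation sum C(r) of a delay embedding of the logistic map,
# x_{n+1} = 4 x_n (1 - x_n). C(r) is the fraction of point pairs closer
# than r; its scaling with r underlies the correlation dimension.
import math

x = 0.4
series = []
for _ in range(600):
    x = 4.0 * x * (1.0 - x)
    series.append(x)
series = series[100:]                     # discard the transient

pts = list(zip(series, series[1:]))       # delay embedding, dimension 2

def corr_sum(pts, r):
    n, count = len(pts), 0
    for i in range(n):
        for j in range(i + 1, n):
            d = math.hypot(pts[i][0] - pts[j][0], pts[i][1] - pts[j][1])
            if d < r:
                count += 1
    return 2.0 * count / (n * (n - 1))

c1, c2 = corr_sum(pts, 0.05), corr_sum(pts, 0.2)
slope = math.log(c2 / c1) / math.log(0.2 / 0.05)   # crude dimension estimate
print(c1, c2, slope)
```

    Tracking time-serial trends in measures like this, as the patent claims describe, is what allows two "similar but different" process states to be distinguished.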

  16. Integrating Formal Methods and Testing 2002

    NASA Technical Reports Server (NTRS)

    Cukic, Bojan

    2002-01-01

    Traditionally, qualitative program verification methodologies and program testing are studied in separate research communities. Neither alone is powerful and practical enough to provide sufficient confidence in ultra-high reliability assessment when used exclusively. Significant advances can be made by accounting for not only formal verification and program testing, but also the impact of many other standard V&V techniques, in a unified software reliability assessment framework. The first year of this research resulted in a statistical framework that, given assumptions on the success of the qualitative V&V and QA procedures, significantly reduces the amount of testing needed to confidently assess reliability at so-called high and ultra-high levels (10^-4 or better). The coming years shall address methodologies to realistically estimate the impacts of various V&V techniques on system reliability and include the impact of operational risk in reliability assessment. The objectives are to: A) combine formal correctness verification, process and product metrics, and other standard qualitative software assurance methods with statistical testing, with the aim of gaining higher confidence in software reliability assessment for high-assurance applications; B) quantify the impact of these methods on software reliability; C) demonstrate that accounting for the effectiveness of these methods reduces the number of tests needed to attain a certain confidence level; and D) quantify and justify the reliability estimate for systems developed using various methods.

  18. Integration of Research Studies: Meta-Analysis of Research. Methods of Integrative Analysis; Final Report.

    ERIC Educational Resources Information Center

    Glass, Gene V.; And Others

    Integrative analysis, or what is coming to be known as meta-analysis, is the integration of the findings of many empirical research studies of a topic. Meta-analysis differs from traditional narrative forms of research reviewing in that it is more quantitative and statistical. Thus, the methods of meta-analysis are merely statistical methods,…

  19. Shape integral method for magnetospheric shapes. [boundary layer calculations

    NASA Technical Reports Server (NTRS)

    Michel, F. C.

    1979-01-01

    A method is developed for calculating the shape of any magnetopause to arbitrarily high precision. The method uses an integral equation which is evaluated for a trial shape. The resulting values of the integral equation as a function of auxiliary variables indicate how close one is to the desired solution. A variational method can then be used to improve the trial shape. Some potential applications are briefly mentioned.

  20. Criteria for quantitative and qualitative data integration: mixed-methods research methodology.

    PubMed

    Lee, Seonah; Smith, Carrol A M

    2012-05-01

    Many studies have emphasized the need for and importance of a mixed-methods approach in the evaluation of clinical information systems. However, those studies offered no criteria to guide the integration of multiple data sets. Integrating different data sets serves to actualize the paradigm that a mixed-methods approach advocates; thus, we require criteria that provide the right direction for integrating quantitative and qualitative data. The first author used a set of criteria organized from a literature search on the integration of multiple data sets in mixed-methods research. The purpose of this article was to reorganize the identified criteria. Through critical appraisal of the reasons for designing mixed-methods research, three criteria resulted: validation, complementarity, and discrepancy. In applying the criteria to empirical data from a previous mixed-methods study, integration of quantitative and qualitative data was achieved in a systematic manner, helping us obtain a better organized understanding of the results. The criteria offered in this article have the potential to produce insightful analyses of mixed-methods evaluations of health information systems.

  1. Quantitative Integration of Multiple Geophysical Techniques for Reducing Uncertainty in Discrete Anomaly Detection

    NASA Astrophysics Data System (ADS)

    Carr, M. C.; Baker, G. S.; Herrmann, N.; Yerka, S.; Angst, M.

    2008-12-01

    The objectives of this project are to (1) utilize quantitative integration of multiple geophysical techniques, (2) determine geophysical anomalies that may indicate the locations of various archaeological structures, and (3) develop techniques for quantifying causes of uncertainty. Two sites are used to satisfy these objectives. The first, representing a site with unknown target features, is an archaeological site on the Tennessee River floodplain. The area is divided into 437 (20 x 20 m) plots with 0.5 m spacing, where magnetic gradiometry profiles were collected in a zig-zag pattern, resulting in 350 km of line data. Once anomalies are identified in the magnetics data, potential excavation sites for archaeological features are determined and other geophysical techniques are utilized to gain confidence in choosing which anomalies to excavate. Several grids are resurveyed using ground-penetrating radar (GPR) and EM-31 with 0.25 m spacing in a grid pattern. A quantitative method of integrating the data into one comprehensive set is developed, enhancing interpretation because each geophysical technique utilized in this study produced a unique response to noise and to the targets. Spatial visualization software is used to interpolate irregularly spaced XYZ data onto a regularly spaced grid and display the geophysical data in 3D representations. Once all data are exported from each individual instrument, grid files are created for quantitative merging of the data and for grid-based maps including contour, image, shaded-relief, and surface maps. Statistics were calculated comparing anomaly classifications in the data with the excavated features present. To study this methodology in a more controlled setting, a second site is used. This site is analogous to the first in that it is along the Tennessee River floodplain on the same bedrock units; however, it contains known targets (previously buried and accurately located), including their size, shape, and orientation.

  2. Damping identification in frequency domain using integral method

    NASA Astrophysics Data System (ADS)

    Guo, Zhiwei; Sheng, Meiping; Ma, Jiangang; Zhang, Wulin

    2015-03-01

    A new method for damping identification of linear systems in the frequency domain is presented, using the frequency response function (FRF) with an integral method. The FRF curve is first transformed into another type of frequency-related curve by changing the representations of the horizontal and vertical axes. For the newly constructed frequency-related curve, an integral is computed, and the area under the new curve is used to determine the damping. Three different integral-based methods are proposed in this paper, called the FDI-1, FDI-2, and FDI-3 methods, respectively. For a single-degree-of-freedom (SDOF) system, the relation between integrated area and loss factor is derived theoretically for each method. Numerical simulation and experimental results show that the proposed integral methods have high precision and strong noise resistance, and are very stable in repeated measurements. Among the three integral methods, the FDI-3 method is the most recommended because of its higher accuracy and simpler algorithm. The new methods are limited to linear systems in which modes are well separated; for systems with closely spaced modes, a mode decomposition process should be conducted first.
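
    For context, the classical FRF-based damping estimate that such integral methods are typically compared against is the half-power bandwidth method. A sketch on a synthetic SDOF FRF (this is the textbook baseline, not the paper's FDI-1/2/3 methods; the parameter values are invented):

```python
# Half-power bandwidth damping estimate for a viscously damped SDOF system:
# find the two frequencies where |H| drops to peak/sqrt(2), then
# zeta ~ (w2 - w1) / (2 w0).
import math

w0, zeta = 1.0, 0.02
ws = [0.8 + 0.4 * k / 40000 for k in range(40001)]
amps = [1.0 / math.sqrt((w0**2 - w**2) ** 2 + (2 * zeta * w0 * w) ** 2)
        for w in ws]

peak = max(amps)
half = peak / math.sqrt(2.0)
above = [w for w, a in zip(ws, amps) if a >= half]
w1, w2 = above[0], above[-1]              # half-power frequencies
zeta_est = (w2 - w1) / (2.0 * w0)
print(zeta_est)   # should be close to 0.02
```

    The half-power method uses only three points of the FRF, which is why it is noise-sensitive; the paper's integral methods instead use the area under a transformed FRF curve, averaging the noise out.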

  3. Integrated management of thesis using clustering method

    NASA Astrophysics Data System (ADS)

    Astuti, Indah Fitri; Cahyadi, Dedy

    2017-02-01

    The thesis is one of the major requirements for students pursuing their bachelor degree. Finishing the thesis involves a long process, including consultation, writing the manuscript, conducting the chosen method, seminar scheduling, searching for references, and appraisal by the board of mentors and examiners. Unfortunately, most students find it hard to match all the lecturers' free time so that they can sit together in a seminar room to examine the thesis. Therefore, the seminar scheduling process should be the top priority to solve. A manual mechanism for this task no longer fulfills the need. People on campus, including students, staff, and lecturers, demand a system in which all stakeholders can interact with each other and manage the thesis process without timetable conflicts. A branch of computer science named management information systems (MIS) could provide a breakthrough in thesis management. This research applies a clustering method to distinguish categories using mathematical formulas. A system is then developed along with the method to create a well-managed tool providing the main facilities, such as seminar scheduling, consultation and review, thesis approval, assessment, and a reliable database of theses. The database plays an important role for present and future purposes.
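
    The abstract does not say which clustering algorithm is used; a common default for this kind of grouping is k-means. A minimal sketch (data, features, and initialization strategy all invented for illustration):

```python
# Minimal k-means sketch (k = 2): farthest-point initialization, then the
# usual assign/update loop until the centers stop moving.
import math, random

random.seed(4)
data = ([(random.gauss(0, 0.5), random.gauss(0, 0.5)) for _ in range(200)] +
        [(random.gauss(5, 0.5), random.gauss(5, 0.5)) for _ in range(200)])
random.shuffle(data)

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

c1 = data[0]
c2 = max(data, key=lambda p: dist(p, c1))   # farthest-point initialization

for _ in range(50):
    g1 = [p for p in data if dist(p, c1) <= dist(p, c2)]   # assign step
    g2 = [p for p in data if dist(p, c1) > dist(p, c2)]
    c1 = (sum(p[0] for p in g1) / len(g1), sum(p[1] for p in g1) / len(g1))
    c2 = (sum(p[0] for p in g2) / len(g2), sum(p[1] for p in g2) / len(g2))

print(sorted([c1, c2]))   # centers near the two cluster means
```

    In a scheduling setting, the feature vectors would encode attributes such as availability slots or thesis topics rather than synthetic 2D points.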

  4. Missing data methods in Mendelian randomization studies with multiple instruments.

    PubMed

    Burgess, Stephen; Seaman, Shaun; Lawlor, Debbie A; Casas, Juan P; Thompson, Simon G

    2011-11-01

    Mendelian randomization studies typically have low power. Where there are several valid candidate genetic instruments, precision can be gained by using all the instruments available. However, sporadically missing genetic data can offset this gain. The authors describe 4 Bayesian methods for imputing the missing data based on a missing-at-random assumption: multiple imputation, single nucleotide polymorphism (SNP) imputation, latent variables, and haplotype imputation. These methods are demonstrated in a simulation study and then applied to estimate the causal relation between C-reactive protein and each of fibrinogen and coronary heart disease, based on 3 SNPs in British Women's Heart and Health Study participants assessed at baseline between May 1999 and June 2000. A complete-case analysis based on all 3 SNPs was found to be more precise than analyses using any 1 SNP alone. Precision is further improved by using any of the 4 proposed missing data methods; the improvement is equivalent to about a 25% increase in sample size. All methods gave similar results, which were apparently not overly sensitive to violation of the missing-at-random assumption. Programming code for the analyses presented is available online.
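
    The mechanics of multiple imputation — impute several times, analyze each completed data set, then pool with Rubin's rules — can be shown on a generic toy problem (this is NOT the paper's SNP-specific Bayesian model; the data and the simple draw-from-observed imputation model are invented for the demo):

```python
# Generic multiple-imputation sketch with Rubin's combining rules: estimate a
# mean when 30% of values are missing completely at random, drawing each
# imputation from the observed-data distribution, then pool across M
# completed data sets.
import random, statistics

random.seed(5)
full = [random.gauss(2.0, 1.0) for _ in range(1000)]
observed = full[300:]                     # first 300 values "missing" (MCAR)

mu, sd = statistics.mean(observed), statistics.stdev(observed)
M, q, u = 5, [], []
for _ in range(M):
    imputed = [random.gauss(mu, sd) for _ in range(300)] + observed
    q.append(statistics.mean(imputed))                     # per-imputation estimate
    u.append(statistics.variance(imputed) / len(imputed))  # its sampling variance

q_bar = statistics.mean(q)                # pooled point estimate
W = statistics.mean(u)                    # within-imputation variance
B = statistics.variance(q)                # between-imputation variance
T = W + (1 + 1 / M) * B                   # Rubin's total variance
print(q_bar, W, T)
```

    The inflation of T over W is the honest accounting for imputation uncertainty; the paper's methods differ in how the imputation draws are generated (SNP models, latent variables, haplotypes), not in the pooling step.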

  5. String Banana Template Method for tracking in a high-multiplicity environment with significant multiple scattering

    NASA Astrophysics Data System (ADS)

    Kulinich, P.; Krylov, V.

    2006-10-01

    A String Banana Template Method (SBTM) for track reconstruction in the presence of significant multiple scattering (MS) is described. The main idea of the method is based on the features of ensembles of tracks selected by three-fold coincidences. The SBTM provides a narrower search window than other methods by exploiting the features of such ensembles: it deals with particular "branches" in the MS "tree". A two-step track model with additional parameters to account for MS is used. The SBTM uses stored template fields generated by precise Monte Carlo (MC) simulation. SBTM capabilities in terms of track-parameter resolution are demonstrated for a model spectrometer. The method has been implemented in the ROOT C++ framework and tested with MC simulations as well as with data from a heavy-ion collision experiment using a silicon-based spectrometer with a complex geometry, moderate segmentation (~ mm), and a non-uniform magnetic field. Primary tracks in the most central Au-Au collisions at √(s_NN) = 200 GeV with occupancy >20% and momenta down to ~80 MeV/c (for pions) have been reconstructed with high efficiency.

  6. Exponential Methods for the Time Integration of Schroedinger Equation

    SciTech Connect

    Cano, B.; Gonzalez-Pachon, A.

    2010-09-30

    We consider exponential methods of second order in time for integrating the cubic nonlinear Schroedinger equation. We are interested in exploiting the special structure of this equation, and therefore examine the symmetry, symplecticity, and invariant-approximation properties of the proposed methods, which allow integration over long times with reasonable accuracy. Computational efficiency is also our aim; we therefore perform numerical computations to compare the methods considered, and conclude that explicit Lawson schemes projected onto the norm of the solution are an efficient tool for integrating this equation.
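
    The flavor of a norm-projected Lawson scheme can be shown on a scalar toy analogue of the cubic NLS (the PDE's Laplacian is dropped, so this is only an illustration of the scheme, not of the paper's full solver; all step sizes are invented):

```python
# Scalar toy analogue of a projected Lawson scheme: u' = i*w*u + i*|u|^2*u,
# whose exact solution keeps |u| constant. RK2 is applied in the Lawson
# variable v = exp(-i*w*t) * u (which removes the stiff linear rotation),
# and each step is projected back onto the norm of the solution.
import cmath

w, u0, h, T = 3.0, 1.0 + 0.0j, 0.01, 5.0

def N(v):
    # nonlinear term in the Lawson variable; for the cubic nonlinearity the
    # phase factors cancel, leaving i*|v|^2*v
    return 1j * abs(v) ** 2 * v

v, t = u0, 0.0
for _ in range(round(T / h)):
    k1 = N(v)
    k2 = N(v + 0.5 * h * k1)       # explicit midpoint (2nd order)
    v = v + h * k2
    v *= abs(u0) / abs(v)          # projection onto the conserved norm
    t += h

u_num = cmath.exp(1j * w * t) * v
u_exact = u0 * cmath.exp(1j * (w + abs(u0) ** 2) * T)
print(abs(u_num - u_exact), abs(abs(u_num) - abs(u0)))
```

    The projection step enforces the conserved norm exactly, which is the property the abstract credits for the good long-time behavior of projected Lawson schemes; for the PDE, the linear part becomes the Laplacian handled by FFT-based exponentials.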

  7. Integration of Multiple Genomic and Phenotype Data to Infer Novel miRNA-Disease Associations

    PubMed Central

    Zhou, Meng; Cheng, Liang; Yang, Haixiu; Wang, Jing; Sun, Jie; Wang, Zhenzhen

    2016-01-01

    MicroRNAs (miRNAs) play an important role in the development and progression of human diseases. The identification of disease-associated miRNAs will be helpful for understanding the molecular mechanisms of diseases at the post-transcriptional level. Based on different types of genomic data sources, computational methods for miRNA-disease association prediction have been proposed. However, individual source of genomic data tends to be incomplete and noisy; therefore, the integration of various types of genomic data for inferring reliable miRNA-disease associations is urgently needed. In this study, we present a computational framework, CHNmiRD, for identifying miRNA-disease associations by integrating multiple genomic and phenotype data, including protein-protein interaction data, gene ontology data, experimentally verified miRNA-target relationships, disease phenotype information and known miRNA-disease connections. The performance of CHNmiRD was evaluated by experimentally verified miRNA-disease associations, which achieved an area under the ROC curve (AUC) of 0.834 for 5-fold cross-validation. In particular, CHNmiRD displayed excellent performance for diseases without any known related miRNAs. The results of case studies for three human diseases (glioblastoma, myocardial infarction and type 1 diabetes) showed that all of the top 10 ranked miRNAs having no known associations with these three diseases in existing miRNA-disease databases were directly or indirectly confirmed by our latest literature mining. All these results demonstrated the reliability and efficiency of CHNmiRD, and it is anticipated that CHNmiRD will serve as a powerful bioinformatics method for mining novel disease-related miRNAs and providing a new perspective into molecular mechanisms underlying human diseases at the post-transcriptional level. CHNmiRD is freely available at http://www.bio-bigdata.com/CHNmiRD. PMID:26849207

  8. Integration of Multiple Genomic and Phenotype Data to Infer Novel miRNA-Disease Associations.

    PubMed

    Shi, Hongbo; Zhang, Guangde; Zhou, Meng; Cheng, Liang; Yang, Haixiu; Wang, Jing; Sun, Jie; Wang, Zhenzhen

    2016-01-01

    MicroRNAs (miRNAs) play an important role in the development and progression of human diseases. The identification of disease-associated miRNAs will be helpful for understanding the molecular mechanisms of diseases at the post-transcriptional level. Based on different types of genomic data sources, computational methods for miRNA-disease association prediction have been proposed. However, individual sources of genomic data tend to be incomplete and noisy; therefore, the integration of various types of genomic data for inferring reliable miRNA-disease associations is urgently needed. In this study, we present a computational framework, CHNmiRD, for identifying miRNA-disease associations by integrating multiple genomic and phenotype data, including protein-protein interaction data, gene ontology data, experimentally verified miRNA-target relationships, disease phenotype information and known miRNA-disease connections. The performance of CHNmiRD was evaluated by experimentally verified miRNA-disease associations, which achieved an area under the ROC curve (AUC) of 0.834 for 5-fold cross-validation. In particular, CHNmiRD displayed excellent performance for diseases without any known related miRNAs. The results of case studies for three human diseases (glioblastoma, myocardial infarction and type 1 diabetes) showed that all of the top 10 ranked miRNAs having no known associations with these three diseases in existing miRNA-disease databases were directly or indirectly confirmed by our latest literature mining. All these results demonstrated the reliability and efficiency of CHNmiRD, and it is anticipated that CHNmiRD will serve as a powerful bioinformatics method for mining novel disease-related miRNAs and providing a new perspective into molecular mechanisms underlying human diseases at the post-transcriptional level. CHNmiRD is freely available at http://www.bio-bigdata.com/CHNmiRD.

  9. Multiplicity and Self-Identity: Trauma and Integration in Shirley Mason's Art

    ERIC Educational Resources Information Center

    Thompson, Geoffrey

    2011-01-01

    This viewpoint appeared in its original form as the catalogue essay that accompanied the exhibition "Multiplicity and Self-Identity: Trauma and Integration in Shirley Mason's Art," curated by the author for Gallery 2110, Sacramento, CA, and the 2010 Annual Conference of the American Art Therapy Association. The exhibition featured 17 artworks by…

  11. Technology Integration in a One-to-One Laptop Initiative: A Multiple Case Study Analysis

    ERIC Educational Resources Information Center

    Jones, Marsha B.

    2013-01-01

    The purpose of this multiple case study analysis was to examine teachers' experiences and perceptions in order to understand what actions and interactions supported or inhibited technology integration during a one-to-one laptop initiative. This research sought to gain teachers' perspectives on the challenges and successes they faced as classroom…

  13. Fostering Creativity in Advertising Students: Incorporating the Theories of Multiple Intelligences and Integrative Learning.

    ERIC Educational Resources Information Center

    Rega, Bonney

    Noting that linguistic and mathematical/logical are the two kinds of intelligences the educational system encourages and that the educational system, as well as science in general, tends to neglect the nonverbal form of intellect, this paper describes Howard Gardner's multiple intelligences theory and Peter Kline's theory of integrative learning…

  14. Integration of graphene oxide and DNA as a universal platform for multiple arithmetic logic units.

    PubMed

    Wang, Kun; Ren, Jiangtao; Fan, Daoqing; Liu, Yaqing; Wang, Erkang

    2014-11-28

    By a combination of graphene oxide and DNA, a universal platform was developed for integration of multiple logic gates to implement both half adder and half subtractor functions. A constant undefined threshold range between high and low fluorescence output signals was set for all the developed logic gates.
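
    The half adder and half subtractor that the platform implements are the standard one-bit Boolean circuits. As a reminder of the logic involved, a minimal Python sketch of those truth tables (the logic only; this does not model the graphene oxide/DNA chemistry or its fluorescence readout):

```python
# One-bit half adder and half subtractor truth-table logic.
# Illustrates the Boolean functions only, not the DNA-logic implementation.

def half_adder(a: int, b: int) -> tuple:
    """Return (sum, carry) for one-bit inputs a and b."""
    return a ^ b, a & b

def half_subtractor(a: int, b: int) -> tuple:
    """Return (difference, borrow) for a - b on one-bit inputs."""
    return a ^ b, (1 - a) & b

for a in (0, 1):
    for b in (0, 1):
        print(a, b, half_adder(a, b), half_subtractor(a, b))
```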

  15. The Effect of Sensory Integration Treatment on Children with Multiple Disabilities.

    ERIC Educational Resources Information Center

    Din, Feng S.; Lodato, Donna M.

    Six children with multiple disabilities (ages 5 to 8) participated in this evaluation of the effect of sensory integration treatment on sensorimotor function and academic learning. The children had cognitive abilities ranging from sub-average to significantly sub-average, three were non-ambulatory, one had severe behavioral problems, and each…

  16. STRUCTURE OF THE EGF RECEPTOR TRANSACTIVATION CIRCUIT INTEGRATES MULTIPLE SIGNALS WITH CELL CONTEXT

    PubMed Central

    Joslin, Elizabeth J.; Shankaran, Harish; Opresko, Lee K.; Bollinger, Nikki; Lauffenburger, Douglas A.; Wiley, H. Steven

    2012-01-01

    Transactivation of the epidermal growth factor receptor (EGFR) is thought to be a process by which a variety of cellular inputs can be integrated into a single signaling pathway through either stimulated proteolysis (shedding) of membrane-anchored EGFR ligands or by modification of the activity of the EGFR. As a first step towards building a predictive model of the EGFR transactivation circuit, we quantitatively defined how signals from multiple agonists were integrated both upstream and downstream of the EGFR to regulate extracellular signal regulated kinase (ERK) activity in human mammary epithelial cells. By using a “non-binding” reporter of ligand shedding, we found that transactivation triggers a positive feedback loop from ERK back to the EGFR such that ligand shedding drives EGFR-stimulated ERK that in turn drives further ligand shedding. Importantly, activated Ras and ERK levels were nearly linear functions of ligand shedding and the effect of multiple, sub-saturating inputs was additive. Simulations showed that ERK-mediated feedback through ligand shedding resulted in a stable steady-state level of activated ERK, but also showed that the extracellular environment can modulate the level of feedback. Our results suggest that the transactivation circuit acts as a context-dependent integrator and amplifier of multiple extracellular signals and that signal integration can effectively occur at multiple points in the EGFR pathway. PMID:20458382

  17. Multiple proviral integration events after virological synapse-mediated HIV-1 spread

    SciTech Connect

    Russell, Rebecca A.; Martin, Nicola; Mitar, Ivonne; Jones, Emma; Sattentau, Quentin J.

    2013-08-15

    HIV-1 can move directly between T cells via virological synapses (VS). Although aspects of the molecular and cellular mechanisms underlying this mode of spread have been elucidated, the outcomes for infection of the target cell remain incompletely understood. We set out to determine whether HIV-1 transfer via VS results in productive, high-multiplicity HIV-1 infection. We found that HIV-1 cell-to-cell spread resulted in nuclear import of multiple proviruses into target cells as seen by fluorescence in-situ hybridization. Proviral integration into the target cell genome was significantly higher than that seen in a cell-free infection system, and consequent de novo viral DNA and RNA production in the target cell detected by quantitative PCR increased over time. Our data show efficient proviral integration across VS, implying the probability of multiple integration events in target cells that drive productive T cell infection. - Highlights: • Cell-to-cell HIV-1 infection delivers multiple vRNA copies to the target cell. • Cell-to-cell infection results in productive infection of the target cell. • Cell-to-cell transmission is more efficient than cell-free HIV-1 infection. • Suggests a mechanism for recombination in cells infected with multiple viral genomes.

  18. A Rationale for Mixed Methods (Integrative) Research Programmes in Education

    ERIC Educational Resources Information Center

    Niaz, Mansoor

    2008-01-01

    Recent research shows that research programmes (quantitative, qualitative and mixed) in education are not displaced (as suggested by Kuhn) but rather lead to integration. The objective of this study is to present a rationale for mixed methods (integrative) research programs based on contemporary philosophy of science (Lakatos, Giere, Cartwright,…

  20. System and method for inventorying multiple remote objects

    DOEpatents

    Carrender, Curtis L [Morgan Hill, CA; Gilbert, Ronald W [Morgan Hill, CA

    2009-12-29

    A system and method of inventorying multiple objects utilizing a multi-level or a chained radio frequency identification system. The system includes a master tag and a plurality of upper level tags and lower level tags associated with respective objects. The upper and lower level tags communicate with each other and the master tag so that reading of the master tag reveals the presence and absence of upper and lower level tags. In the chained RF system, the upper and lower level tags communicate locally with each other in a manner so that more remote tags that are out of range of some of the upper and lower level tags have their information relayed through adjacent tags to the master tag and thence to a controller.

  1. System and method for inventorying multiple remote objects

    DOEpatents

    Carrender, Curtis L.; Gilbert, Ronald W.

    2007-10-23

    A system and method of inventorying multiple objects utilizing a multi-level or a chained radio frequency identification system. The system includes a master tag and a plurality of upper level tags and lower level tags associated with respective objects. The upper and lower level tags communicate with each other and the master tag so that reading of the master tag reveals the presence and absence of upper and lower level tags. In the chained RF system, the upper and lower level tags communicate locally with each other in a manner so that more remote tags that are out of range of some of the upper and lower level tags have their information relayed through adjacent tags to the master tag and thence to a controller.
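
    The relay scheme described in the patent amounts to propagating tag identifiers hop-by-hop through whichever tags are in radio range until they reach the master tag. A rough sketch of that traversal (the link topology and tag names below are hypothetical; this is not the patent's protocol):

```python
# Hop-by-hop inventory over a chained RFID topology: tags out of the
# master's range are discovered by relaying through adjacent tags.
from collections import deque

# Hypothetical link topology: which tags each tag can reach directly.
links = {
    "master": ["upper1"],
    "upper1": ["lower1", "upper2"],
    "upper2": ["lower2"],
    "lower1": [],
    "lower2": [],
}

def inventory(start="master"):
    """Breadth-first relay: collect every tag reachable from the master."""
    seen, queue = {start}, deque([start])
    while queue:
        tag = queue.popleft()
        for neighbor in links[tag]:
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append(neighbor)
    return seen - {start}
```

    Reading the master then "reveals the presence and absence" of the other tags: any expected tag missing from `inventory()` is flagged as absent.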

  2. Inaccuracy in the treatment of multiple-order diffraction by secondary-edge-source methods.

    PubMed

    Summers, Jason E

    2013-06-01

    Existing secondary-edge-source methods based on the Biot-Tolstoy solution for diffraction from an infinite wedge compute multiple-order diffraction by cascading the integration over secondary sources used to determine first-order diffraction from the edge. It is demonstrated here that this approach errs in some important cases because it neglects slope-diffraction contributions. This error is illustrated by considering the case of an infinite slit in a thin, hard screen. Comparisons with measurements for this case and analytical solutions for the case of a circular aperture in a thin, hard screen are used as a basis to gauge the magnitude of the error.

  3. [Connotation of ecological integrity and its assessment methods: a review].

    PubMed

    Huang, Baorong; Ouyang, Zhiyun; Zheng, Hua; Wang, Xiaoke; Miao, Hong

    2006-11-01

    Ecological integrity is the capability to support and maintain a balanced, integrative and adaptive biologic system, having the full range of elements and processes expected in the natural habitats of a region. Assessment of ecological integrity has great significance for preventing sensitive nature habitats from human disturbance. The theory of dissipative structures suggests that the stressors from human activities, as well as the biological, physical and chemical integrity and ecosystem function that reflect the ability of self-organizing, can well indicate the integrity of an ecosystem. This paper summarized the experiential indicators for assessing the integrity of aquatic and terrestrial ecosystems and the stressors from human disturbance, and discussed the methods for selecting priority indicators and comprehensive assessment in actual assessment programs. The prospects of further study were discussed, according to some issues existed in published researches.

  4. Integrating Multiple Autonomous Underwater Vessels, Surface Vessels and Aircraft into Oceanographic Research Vessel Operations

    NASA Astrophysics Data System (ADS)

    McGillivary, P. A.; Borges de Sousa, J.; Martins, R.; Rajan, K.

    2012-12-01

    Autonomous platforms are increasingly used as components of Integrated Ocean Observing Systems and oceanographic research cruises. Systems deployed can include gliders or propeller-driven autonomous underwater vessels (AUVs), autonomous surface vessels (ASVs), and unmanned aircraft systems (UAS). Prior field campaigns have demonstrated successful communication, sensor data fusion and visualization for studies using gliders and AUVs. However, additional requirements exist for incorporating ASVs and UASs into ship operations. Integrating these systems optimally into research vessel data management and operational planning involves addressing three key issues: real-time field data availability, platform coordination, and data archiving for later analysis. A fleet of AUVs, ASVs and UAS deployed from a research vessel is best operated as a system integrated with the ship, provided communications among them can be sustained. For this purpose, Disruption Tolerant Networking (DTN) software protocols for operation in communication-challenged environments help ensure reliable high-bandwidth communications. Additionally, system components need to have considerable onboard autonomy, namely adaptive sampling capabilities using their own onboard sensor data stream analysis. We discuss Oceanographic Decision Support System (ODSS) software currently used for situational awareness and planning onshore, and in the near future event detection and response will be coordinated among multiple vehicles. Results from recent field studies from oceanographic research vessels using AUVs, ASVs and UAS, including the Rapid Environmental Picture (REP-12) cruise, are presented describing methods and results for use of multi-vehicle communication and deliberative control networks, adaptive sampling with single and multiple platforms, issues relating to data management and archiving, and finally challenges that remain in addressing these technological issues. Significantly, the

  5. Monitoring gray wolf populations using multiple survey methods

    USGS Publications Warehouse

    Ausband, David E.; Rich, Lindsey N.; Glenn, Elizabeth M.; Mitchell, Michael S.; Zager, Pete; Miller, David A.W.; Waits, Lisette P.; Ackerman, Bruce B.; Mack, Curt M.

    2013-01-01

    The behavioral patterns and large territories of large carnivores make them challenging to monitor. Occupancy modeling provides a framework for monitoring population dynamics and distribution of territorial carnivores. We combined data from hunter surveys, howling and sign surveys conducted at predicted wolf rendezvous sites, and locations of radiocollared wolves to model occupancy and estimate the number of gray wolf (Canis lupus) packs and individuals in Idaho during 2009 and 2010. We explicitly accounted for potential misidentification of occupied cells (i.e., false positives) using an extension of the multi-state occupancy framework. We found agreement between model predictions and distribution and estimates of number of wolf packs and individual wolves reported by Idaho Department of Fish and Game and Nez Perce Tribe from intensive radiotelemetry-based monitoring. Estimates of individual wolves from occupancy models that excluded data from radiocollared wolves were within an average of 12.0% (SD = 6.0) of existing statewide minimum counts. Models using only hunter survey data generally estimated the lowest abundance, whereas models using all data generally provided the highest estimates of abundance, although only marginally higher. Precision across approaches ranged from 14% to 28% of mean estimates and models that used all data streams generally provided the most precise estimates. We demonstrated that an occupancy model based on different survey methods can yield estimates of the number and distribution of wolf packs and individual wolf abundance with reasonable measures of precision. Assumptions of the approach including that average territory size is known, average pack size is known, and territories do not overlap, must be evaluated periodically using independent field data to ensure occupancy estimates remain reliable. Use of multiple survey methods helps to ensure that occupancy estimates are robust to weaknesses or changes in any 1 survey method

  6. Method of generating multiple sets of experimental phantom data.

    PubMed

    Sitek, Arkadiusz; Reutter, Bryan W; Huesman, Ronald H; Gullberg, Grant T

    2006-07-01

    Currently, 2 types of phantoms (physical and computer generated) are used for testing and comparing tomographic reconstruction methods. Data from physical phantoms include all physical effects associated with the detection of radiation. However, with physical phantoms it is difficult to control the number of detected counts, simulate the dynamics of uptake and washout, or create multiple noise realizations of an acquisition. Computer-generated phantoms can overcome some of the disadvantages of physical phantoms, but simulation of all factors affecting the detection of radiation is extremely complex and in some cases impossible. To overcome the problems with both types of phantoms, we developed a physical and computer-generated hybrid phantom that allows the creation of multiple noise realizations of tomographic datasets of the dynamic uptake governed by kinetic models. The method is phantom and camera specific. We applied it to an anthropomorphic torso phantom with a cardiac insert, using a SPECT system with attenuation correction. First, real data were acquired. For each compartment (heart, blood pool, liver, and background) of the physical phantom, large numbers of short tomographic projections were acquired separately for each angle. Sinograms were built from a database of projections by summing the projections of each compartment of the phantom. The amount of activity in each phantom compartment was regulated by the number of added projections. Sinograms corresponding to various projection times, configurations and numbers of detector heads, numbers of noise realizations, numbers of phantom compartments, and compartment-specific time-activity curves in MBq/cm3 were assembled from the database. The acquisition produced a database of 120 projection angles ranging over 360 degrees. For each angle, 300 projections of 0.5 s each were stored in 128 x 128 matrices for easy access. The acquired database was successful in the generation of static and dynamic sinograms
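
    The assembly step can be pictured as follows: for each compartment and projection angle, a random subset of the stored short projections is summed, with the subset size setting that compartment's apparent activity and each fresh draw giving an independent noise realization. A toy sketch with scalar "projections" (the array sizes and count values below are invented; the real database stores 128 x 128 matrices):

```python
import random

# Toy stand-in for the acquired database: per compartment and angle,
# a list of short-projection count values (scalars here, not 128x128).
random.seed(1)
n_angles, n_short = 8, 300
db = {
    comp: [[random.randint(0, 5) for _ in range(n_short)]
           for _ in range(n_angles)]
    for comp in ("heart", "blood_pool", "liver", "background")
}

def assemble_sinogram(n_per_comp):
    """Sum a random subset of short projections per compartment.
    The subset size sets that compartment's apparent activity;
    re-drawing the subset yields a new noise realization."""
    sino = [0] * n_angles
    for comp, n in n_per_comp.items():
        for angle in range(n_angles):
            sino[angle] += sum(random.sample(db[comp][angle], n))
    return sino

realization_1 = assemble_sinogram({"heart": 50, "background": 10})
realization_2 = assemble_sinogram({"heart": 50, "background": 10})
```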

  7. A multiple hypotheses uncertainty analysis in hydrological modelling: about model structure, landscape parameterization, and numerical integration

    NASA Astrophysics Data System (ADS)

    Pilz, Tobias; Francke, Till; Bronstert, Axel

    2016-04-01

    To date, a large number of competing computer models have been developed to understand hydrological processes and to simulate and predict streamflow dynamics of rivers. This is primarily the result of a lack of a unified theory in catchment hydrology due to insufficient process understanding and uncertainties related to model development and application. Therefore, the goal of this study is to analyze the uncertainty structure of a process-based hydrological catchment model employing a multiple hypotheses approach. The study focuses on three major problems that have received little attention in previous investigations. First, to estimate the impact of model structural uncertainty by employing several alternative representations for each simulated process. Second, to explore the influence of landscape discretization and parameterization from multiple datasets and user decisions. Third, to employ several numerical solvers for the integration of the governing ordinary differential equations to study the effect on simulation results. The generated ensemble of model hypotheses is then analyzed and the three sources of uncertainty compared against each other. To ensure consistency and comparability, all model structures and numerical solvers are implemented within a single simulation environment. First results suggest that the selection of a sophisticated numerical solver for the differential equations positively affects simulation outcomes. However, some simple and easy-to-implement explicit methods already perform surprisingly well and need less computational effort than more advanced but time-consuming implicit techniques. There is general evidence that ambiguous and subjective user decisions form a major source of uncertainty and can greatly influence model development and application at all stages.
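
    The accuracy-versus-cost trade-off among solvers can be illustrated on a toy problem. A sketch comparing forward Euler against classical fourth-order Runge-Kutta on a linear-reservoir equation dS/dt = -kS (a hypothetical stand-in for one of a catchment model's storage ODEs):

```python
import math

def euler_step(f, y, t, dt):
    """One forward-Euler step: cheap, first-order accurate."""
    return y + dt * f(t, y)

def rk4_step(f, y, t, dt):
    """One classical RK4 step: four function evaluations, fourth order."""
    k1 = f(t, y)
    k2 = f(t + dt / 2, y + dt * k1 / 2)
    k3 = f(t + dt / 2, y + dt * k2 / 2)
    k4 = f(t + dt, y + dt * k3)
    return y + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6

# Hypothetical linear reservoir dS/dt = -k*S, exact solution S0*exp(-k*t).
k, S0, dt, T = 0.5, 100.0, 0.5, 10.0
f = lambda t, S: -k * S

def integrate(step):
    S, t = S0, 0.0
    while t < T - 1e-9:
        S = step(f, S, t, dt)
        t += dt
    return S

exact = S0 * math.exp(-k * T)
err_euler = abs(integrate(euler_step) - exact)
err_rk4 = abs(integrate(rk4_step) - exact)
```

    At this step size RK4 is far more accurate, but Euler costs a quarter of the function evaluations per step, which is the kind of trade-off the ensemble study weighs.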

  8. Method for distinguishing multiple targets using time-reversal acoustics

    DOEpatents

    Berryman, James G.

    2004-06-29

    A method for distinguishing multiple targets using time-reversal acoustics. Time-reversal acoustics uses an iterative process to determine the optimum signal for locating a strongly reflecting target in a cluttered environment. An acoustic array sends a signal into a medium, and then receives the returned/reflected signal. This returned/reflected signal is then time-reversed and sent back into the medium again, and again, until the signal being sent and received is no longer changing. At that point, the array has isolated the largest eigenvalue/eigenvector combination and has effectively determined the location of a single target in the medium (the one that is most strongly reflecting). After the largest eigenvalue/eigenvector combination has been determined, to determine the location of other targets, instead of sending back the same signals, the method sends back these time-reversed signals, but half of them will also be reversed in sign. There are various possibilities for choosing which half to sign-reverse. The most obvious choice is to reverse every other one in a linear array, or in a checkerboard pattern in 2D. Then, a new send/receive, send-time-reversed/receive iteration can proceed. Often, the first iteration in this sequence will be close to the desired signal from a second target. In some cases, orthogonalization procedures must be implemented to assure the returned signals are in fact orthogonal to the first eigenvector found.
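
    Mathematically, the iterate-until-stationary loop is power iteration: repeatedly applying the medium's (symmetric) round-trip response operator drives any starting signal toward the dominant eigenvector, i.e. the strongest reflector. A small numerical sketch of that convergence and of the sign-flip restart (the 3x3 response matrix below is invented for illustration):

```python
import math

# Hypothetical symmetric "medium response" matrix standing in for the
# send/receive/time-reverse round trip seen by the acoustic array.
A = [[4.0, 1.0, 0.0],
     [1.0, 3.0, 1.0],
     [0.0, 1.0, 2.0]]

def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

def normalize(v):
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

def power_iterate(v, steps=200):
    """Apply-and-normalize until stationary: converges to the dominant
    eigenvector, the analogue of focusing on the strongest target."""
    for _ in range(steps):
        v = normalize(matvec(A, v))
    return v

v1 = power_iterate([1.0, 1.0, 1.0])
# Sign-reversing half the components gives a restart vector with a large
# component orthogonal to v1, so subsequent iterations can lock on to a
# second target (explicit orthogonalization may still be needed).
flipped = normalize([s * x for s, x in zip((1.0, -1.0, 1.0), v1)])
```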

  9. Investigation of the Multiple Method Adaptive Control (MMAC) method for flight control systems

    NASA Technical Reports Server (NTRS)

    Athans, M.; Baram, Y.; Castanon, D.; Dunn, K. P.; Green, C. S.; Lee, W. H.; Sandell, N. R., Jr.; Willsky, A. S.

    1979-01-01

    The stochastic adaptive control of the NASA F-8C digital-fly-by-wire aircraft using the multiple model adaptive control (MMAC) method is presented. The selection of the performance criteria for the lateral and the longitudinal dynamics, the design of the Kalman filters for different operating conditions, the identification algorithm associated with the MMAC method, the control system design, and simulation results obtained using the real time simulator of the F-8 aircraft at the NASA Langley Research Center are discussed.
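
    The core of the MMAC identification algorithm is a recursively updated probability for each model in the bank, computed from its Kalman-filter residuals and used to blend the per-model controls. A scalar sketch of one such update (all numbers hypothetical; the actual design uses full lateral and longitudinal filter banks):

```python
import math

def gaussian_likelihood(r, S):
    """Likelihood of a scalar filter residual r with predicted variance S."""
    return math.exp(-r * r / (2 * S)) / math.sqrt(2 * math.pi * S)

def mmac_step(probs, residuals, variances, controls):
    """One MMAC update: reweight model probabilities by residual
    likelihoods, then blend the per-model controls by those weights."""
    likes = [gaussian_likelihood(r, S) for r, S in zip(residuals, variances)]
    posts = [p * L for p, L in zip(probs, likes)]
    total = sum(posts)
    posts = [p / total for p in posts]
    u = sum(p * ui for p, ui in zip(posts, controls))
    return posts, u

# Two hypothetical operating conditions; model 0's filter fits the data
# far better (small residual), so its control dominates the blend.
posts, u = mmac_step(probs=[0.5, 0.5], residuals=[0.1, 2.0],
                     variances=[1.0, 1.0], controls=[1.0, -1.0])
```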

  10. Biclustering as a method for RNA local multiple sequence alignment.

    PubMed

    Wang, Shu; Gutell, Robin R; Miranker, Daniel P

    2007-12-15

    Biclustering is a clustering method that simultaneously clusters both the domain and range of a relation. A challenge in multiple sequence alignment (MSA) is that the alignment of sequences is often intended to reveal groups of conserved functional subsequences. Simultaneously, the grouping of the sequences can impact the alignment; precisely the kind of dual situation biclustering is intended to address. We define a representation of the MSA problem enabling the application of biclustering algorithms. We develop a computer program for local MSA, BlockMSA, that combines biclustering with divide-and-conquer. BlockMSA simultaneously finds groups of similar sequences and locally aligns subsequences within them. Further alignment is accomplished by dividing both the set of sequences and their contents. The net result is both a multiple sequence alignment and a hierarchical clustering of the sequences. BlockMSA was tested on the subsets of the BRAliBase 2.1 benchmark suite that display high variability and on an extension to that suite to larger problem sizes. Also, alignments of two large datasets of current biological interest, T box sequences and Group IC1 Introns, were evaluated. The results were compared with alignments computed by the ClustalW, MAFFT, MUSCLE and PROBCONS alignment programs using Sum of Pairs (SPS) and Consensus Count. Results for the benchmark suite are sensitive to problem size. On problems of 15 or greater sequences, BlockMSA is consistently the best. On none of the problems in the test suite are there appreciable differences in scores among BlockMSA, MAFFT and PROBCONS. On the T box sequences, BlockMSA does the most faithful job of reproducing known annotations. MAFFT and PROBCONS do not. On the Intron sequences, BlockMSA, MAFFT and MUSCLE are comparable at identifying conserved regions. BlockMSA is implemented in Java. Source code and supplementary datasets are available at http://aug.csres.utexas.edu/msa/

  11. Mixed time integration methods for transient thermal analysis of structures

    NASA Technical Reports Server (NTRS)

    Liu, W. K.

    1982-01-01

    The computational methods used to predict and optimize the thermal structural behavior of aerospace vehicle structures are reviewed. In general, two classes of algorithms, implicit and explicit, are used in transient thermal analysis of structures. Each of these two methods has its own merits. Due to the different time scales of the mechanical and thermal responses, the selection of a time integration method can be a difficult yet critical factor in the efficient solution of such problems. Therefore, mixed time integration methods for transient thermal analysis of structures are being developed. The computer implementation aspects and numerical evaluation of these mixed time implicit-explicit algorithms in thermal analysis of structures are presented. A computationally useful method of estimating the critical time step for the linear quadrilateral element is also given. Numerical tests confirm the stability criterion and accuracy characteristics of the methods. The superiority of these mixed time methods to the fully implicit method or the fully explicit method is also demonstrated.
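
    The critical-time-step issue has a one-line scalar illustration: for a thermal mode dT/dt = -λT, forward (explicit) Euler is stable only when Δt < 2/λ, while backward (implicit) Euler damps the mode for any step size. A hedged sketch (the values of λ and Δt are invented):

```python
# Scalar model problem dT/dt = -lam*T for one thermal mode.
lam = 10.0
dt = 0.3          # deliberately exceeds the critical step 2/lam = 0.2

def forward_euler(T0, steps):
    """Explicit update T <- (1 - lam*dt)*T; unstable when |1 - lam*dt| > 1."""
    T = T0
    for _ in range(steps):
        T = (1 - lam * dt) * T
    return T

def backward_euler(T0, steps):
    """Implicit update T <- T / (1 + lam*dt); stable for any dt > 0."""
    T = T0
    for _ in range(steps):
        T = T / (1 + lam * dt)
    return T

blown_up = abs(forward_euler(1.0, 50))   # grows like 2**50
damped = abs(backward_euler(1.0, 50))    # decays toward zero
```

    This is the dilemma a mixed implicit-explicit scheme addresses: treat the stiff part implicitly and the rest explicitly, so the global step is not throttled by the smallest critical step.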

  12. The predictive integration method for dynamics of infrequent events

    NASA Astrophysics Data System (ADS)

    Cubuk, Ekin; Waterland, Amos; Kaxiras, Efthimios

    2012-02-01

    With the increasing prominence and availability of multi-processor computers, recasting problems in a form amenable to parallel solution is becoming a critical step in effective scientific computation. We present a method for parallelizing molecular dynamics simulations in time scale, by using predictive integration. Our method is closely related to Voter's parallel replica method, but goes beyond that approach in that it involves speculatively initializing processors in more than one basin. Our predictive integration method requires predicting possible future configurations while it does not suffer from restrictions due to correlation time after transitions between basins.

  13. An integrated lean-methods approach to hospital facilities redesign.

    PubMed

    Nicholas, John

    2012-01-01

    Lean production methods for eliminating waste and improving processes in manufacturing are now being applied in healthcare. As the author shows, the methods are appropriate for redesigning hospital facilities. When used in an integrated manner and employing teams of mostly clinicians, the methods produce facility designs that are custom-fit to patient needs and caregiver work processes, and reduce operational costs. The author reviews lean methods and an approach for integrating them in the redesign of hospital facilities. A case example of the redesign of an emergency department shows the feasibility and benefits of the approach.

  14. Implementation of sinh method in integration space for boundary integrals with near singularity in potential problems

    NASA Astrophysics Data System (ADS)

    Xie, Guizhong; Zhang, Dehai; Zhang, Jianming; Meng, Fannian; Du, Wenliao; Wen, Xiaoyu

    2016-12-01

    As a widely used numerical method, the boundary element method (BEM) is efficient for computer aided engineering (CAE). However, boundary integrals with near singularity need to be calculated accurately and efficiently to implement BEM for CAE analysis on thin bodies successfully. In this paper, the distance in the denominator of the fundamental solution is first designed as an equivalent form using approximate expansion and the original sinh method can be revised into a new form considering the minimum distance and the approximate expansion. Second, the acquisition of the projection point by the Newton-Raphson method is introduced. We acquire the nearest point between the source point and element edge by solving a cubic equation if the location of the projection point is outside the element, where boundary integrals with near singularity appear. Finally, the subtriangles of the local coordinate space are mapped into the integration space and the sinh method is applied in the integration space. The revised sinh method can be directly performed in the integration element. A verification test of our method is proposed. Results demonstrate that our method is effective for regularizing the boundary integrals with near singularity.
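
    The projection-point step can be sketched in one dimension: the nearest point on a curved edge x(t) to a source point p satisfies g(t) = d/dt |x(t) - p|^2 = 0, which for a quadratic edge is a cubic in t and is conveniently solved by Newton-Raphson. A hedged sketch (the quadratic edge and source point below are invented, not the paper's elements):

```python
# Nearest point on a parametric curve via Newton-Raphson on the
# derivative of the squared distance (a cubic for a quadratic curve).
px, py = 0.5, 2.0              # hypothetical source point

def curve(t):
    """Hypothetical quadratic element edge x(t) = (t, t^2)."""
    return t, t * t

def g(t):
    """g(t) = d/dt |x(t) - p|^2 = 2(t - px) + 4t(t^2 - py): a cubic in t."""
    x, y = curve(t)
    return 2 * (x - px) + 2 * (y - py) * (2 * t)

def g_prime(t):
    return 2 + 4 * (3 * t * t - py)

def newton(t=1.0, tol=1e-12, max_iter=50):
    """Newton-Raphson iteration for a root of g."""
    for _ in range(max_iter):
        step = g(t) / g_prime(t)
        t -= step
        if abs(step) < tol:
            break
    return t

t_star = newton()              # parameter of the nearest point
```

    In the paper, when the projection point lands outside the element, the analogous root-finding is carried out on the element edge by solving a cubic equation.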

  15. Comparison of time integration methods for the evolution of galaxies

    NASA Astrophysics Data System (ADS)

    Degraaf, W.

    In the simulation of the evolution of elliptical galaxies, Leap-Frog is currently the most frequently used time integration method. The question is whether other methods perform better than this classical method. Improvements may also be expected from the use of variable step-lengths. We compare Leap-Frog with several other methods, namely: a fourth-order Nystrom method, a symplectic method, and DOPRI-five and eight. DOPRI uses variable steps of its own accord. For the other methods we construct a variable step procedure ourselves. The comparison of the methods is carried out in three Hamiltonian test problems.
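
    Leap-Frog's popularity in this setting comes from its symplectic kick-drift-kick structure, which keeps the energy error bounded over very long integrations instead of drifting secularly. A minimal sketch on a harmonic oscillator a(x) = -x (a stand-in test potential, not a galaxy model):

```python
# Kick-drift-kick Leap-Frog on a harmonic oscillator; the energy error
# stays bounded at O(dt^2) rather than growing with time.
def leapfrog(x, v, dt, steps, acc=lambda x: -x):
    for _ in range(steps):
        v += 0.5 * dt * acc(x)   # half kick
        x += dt * v              # drift
        v += 0.5 * dt * acc(x)   # half kick
    return x, v

def energy(x, v):
    return 0.5 * v * v + 0.5 * x * x

x0, v0, dt = 1.0, 0.0, 0.05
x, v = leapfrog(x0, v0, dt, steps=20000)   # on the order of 150 periods
drift = abs(energy(x, v) - energy(x0, v0))
```

    A variable-step version, as constructed for the comparison, must be implemented carefully: naive step-size changes break the time symmetry that gives Leap-Frog this bounded-error property.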

  16. A Dynamic Integrated Fault Diagnosis Method for Power Transformers

    PubMed Central

    Gao, Wensheng; Liu, Tong

    2015-01-01

    In order to diagnose transformer faults efficiently and accurately, a dynamic integrated fault diagnosis method based on a Bayesian network is proposed in this paper. First, an integrated fault diagnosis model is established based on the causal relationships among abnormal working conditions, failure modes, and failure symptoms of transformers, aimed at obtaining the most probable failure mode. Then, considering that the evidence input into the diagnosis model is acquired gradually and that the fault diagnosis process in reality is multistep, a dynamic fault diagnosis mechanism is proposed based on the integrated fault diagnosis model. Unlike the existing one-step diagnosis mechanism, it includes a multistep evidence-selection process, which gives the most effective diagnostic test to be performed in the next step. Therefore, it can reduce unnecessary diagnostic tests and improve the accuracy and efficiency of diagnosis. Finally, the dynamic integrated fault diagnosis method is applied to actual cases, and the validity of the method is verified. PMID:25685841

  17. A dynamic integrated fault diagnosis method for power transformers.

    PubMed

    Gao, Wensheng; Bai, Cuifen; Liu, Tong

    2015-01-01

    In order to diagnose transformer faults efficiently and accurately, a dynamic integrated fault diagnosis method based on a Bayesian network is proposed in this paper. First, an integrated fault diagnosis model is established based on the causal relationships among abnormal working conditions, failure modes, and failure symptoms of transformers, aimed at obtaining the most probable failure mode. Then, considering that the evidence input into the diagnosis model is acquired gradually and that the fault diagnosis process in reality is multistep, a dynamic fault diagnosis mechanism is proposed based on the integrated fault diagnosis model. Unlike the existing one-step diagnosis mechanism, it includes a multistep evidence-selection process, which gives the most effective diagnostic test to be performed in the next step. Therefore, it can reduce unnecessary diagnostic tests and improve the accuracy and efficiency of diagnosis. Finally, the dynamic integrated fault diagnosis method is applied to actual cases, and the validity of the method is verified.
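The multistep evidence-selection idea can be sketched with a toy discrete Bayesian model. Everything below is illustrative: the failure modes, test names, and probabilities are invented, and the next test is chosen by expected entropy reduction, one common value-of-information criterion (not necessarily the paper's exact rule):

```python
import numpy as np

modes = ["winding fault", "insulation aging", "partial discharge"]  # invented
prior = np.array([0.5, 0.3, 0.2])
# P(test positive | failure mode); test names and numbers are illustrative only.
tests = {"DGA ratio":    np.array([0.9, 0.3, 0.4]),
         "oil moisture": np.array([0.2, 0.8, 0.3]),
         "PD pulse":     np.array([0.3, 0.2, 0.9])}

def entropy(p):
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

def posterior(prior, lik_pos, outcome):
    """Bayes update of the failure-mode distribution for a binary test result."""
    lik = lik_pos if outcome else 1.0 - lik_pos
    post = prior * lik
    return post / post.sum()

def best_next_test(prior, tests):
    """Multistep evidence selection: choose the test whose expected
    posterior entropy is lowest (largest expected information gain)."""
    def expected_entropy(lik_pos):
        p_pos = float(np.sum(prior * lik_pos))
        return (p_pos * entropy(posterior(prior, lik_pos, True))
                + (1.0 - p_pos) * entropy(posterior(prior, lik_pos, False)))
    return min(tests, key=lambda name: expected_entropy(tests[name]))

step1 = best_next_test(prior, tests)           # which diagnostic test to run first
post = posterior(prior, tests[step1], True)    # suppose it comes back positive
print(step1, modes[int(np.argmax(post))], post.round(3))
```

Iterating this select-observe-update loop is what turns a one-shot classifier into the dynamic, multistep mechanism the abstract describes.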

  18. Coupling equivalent plate and finite element formulations in multiple-method structural analyses

    NASA Technical Reports Server (NTRS)

    Giles, Gary L.; Norwood, Keith

    1994-01-01

    A coupled multiple-method analysis procedure for use late in conceptual design or early in preliminary design of aircraft structures is described. Using this method, aircraft wing structures are represented with equivalent plate models, and structural details such as engine/pylon structure, landing gear, or a 'stick' model of a fuselage are represented with beam finite element models. These two analysis methods are implemented in an integrated multiple-method formulation that involves the assembly and solution of a combined set of linear equations. The corresponding solution vector contains coefficients of the polynomials that describe the deflection of the wing and also the components of translations and rotations at the joints of the beam members. Two alternative approaches for coupling the methods are investigated; one using transition finite elements and the other using Lagrange multipliers. The coupled formulation is applied to the static analysis and vibration analysis of a conceptual design model of a fighter aircraft. The results from the coupled method are compared with corresponding results from an analysis in which the entire model is composed of finite elements.
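The Lagrange-multiplier coupling variant can be sketched on a pair of one-dimensional spring chains, a minimal stand-in for the plate and beam models (the stiffness and load values below are arbitrary). A single constraint ties the interface degrees of freedom together, and the combined KKT system reproduces the monolithic answer:

```python
import numpy as np

k, F = 10.0, 2.0
KA = k * np.array([[2.0, -1.0], [-1.0, 1.0]])      # model A: nodes a1, a2 (a0 fixed)
KB = k * np.array([[1.0, -1.0, 0.0],
                   [-1.0, 2.0, -1.0],
                   [0.0, -1.0, 1.0]])              # model B: nodes b0, b1, b2 (floating)
# Interface constraint a2 - b0 = 0 couples the two separately assembled models.
C = np.array([[0.0, 1.0, -1.0, 0.0, 0.0]])
K = np.block([[KA, np.zeros((2, 3))],
              [np.zeros((3, 2)), KB]])
KKT = np.block([[K, C.T], [C, np.zeros((1, 1))]])  # combined linear system
f = np.array([0.0, 0.0, 0.0, 0.0, F, 0.0])         # load at b2; zero constraint RHS
sol = np.linalg.solve(KKT, f)
u, lam = sol[:5], sol[5]                           # displacements and interface force
print(u)  # matches the monolithic 4-spring chain: i*F/k at node i
```

The last solution component is the Lagrange multiplier, which here has the physical meaning of the interface force transmitted between the two models; the same bookkeeping scales up to polynomial plate coefficients on one side and beam rotations on the other.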

  19. A Promising Approach to Integrally Evaluate the Disease Outcome of Cerebral Ischemic Rats Based on Multiple-Biomarker Crosstalk

    PubMed Central

    Wang, Yixuan; Wei, Chunxiang; Zhu, Tao; Wang, Haidong; He, Hua

    2017-01-01

    Purpose: The study was designed to evaluate the disease outcome based on multiple biomarkers related to cerebral ischemia. Methods: Rats were randomly divided into sham, permanent middle cerebral artery occlusion, and edaravone-treated groups. Cerebral ischemia was induced by permanent middle cerebral artery occlusion surgery in rats. To form a simplified crosstalk network, the related multiple biomarkers were chosen as S100β, HIF-1α, IL-1β, PGI2, TXA2, and GSH-Px. The levels or activities of these biomarkers in plasma were detected before and after ischemia. Concurrently, neurological deficit scores and cerebral infarct volumes were assessed. Based on a mathematical model, network balance maps and three integral disruption parameters (k, φ, and u) of the simplified crosstalk network were obtained. Results: The levels or activities of the related biomarkers and neurological deficit scores were significantly impacted by cerebral ischemia. The balance maps intuitively displayed the network disruption, and the integral disruption parameters quantitatively depicted the disruption state of the simplified network after cerebral ischemia. The integral disruption parameter u values correlated significantly with neurological deficit scores and infarct volumes. Conclusion: Our results indicate that the approach based on a crosstalk network may provide a promising new way to integrally evaluate the outcome of cerebral ischemia. PMID:28630527

  20. Diagnosis of multiple sclerosis from EEG signals using nonlinear methods.

    PubMed

    Torabi, Ali; Daliri, Mohammad Reza; Sabzposhan, Seyyed Hojjat

    2017-09-08

    EEG signals contain essential and important information about the brain and neural diseases. The main purpose of this study is to classify two groups, healthy volunteers and Multiple Sclerosis (MS) patients, using nonlinear features of EEG signals recorded while performing cognitive tasks. EEG signals were recorded while users performed two different attentional tasks: one based on detecting a desired change in color luminance and the other based on detecting a desired change in direction of motion. EEG signals were analyzed in two ways: analysis without rhythm decomposition and EEG sub-band analysis. After recording and preprocessing, the time delay embedding method was used for state space reconstruction; embedding parameters were determined for the original signals and their sub-bands. Afterwards, nonlinear methods were used in the feature extraction phase. To reduce the feature dimension, scalar feature selection was performed using the T-test and Bhattacharyya criteria. The data were then classified using linear support vector machines (SVM) and the k-nearest neighbor (KNN) method. The best combination of criterion and classifier was determined for each task by comparing performances. For both tasks, the best results were achieved using the T-test criterion and the SVM classifier. For the direction-based and the color-luminance-based tasks, maximum classification performances were 93.08% and 79.79%, respectively, reached by using the optimal set of features. Our results show that nonlinear dynamic features of EEG signals appear useful and effective in MS diagnosis.
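The rank-by-t-test-then-classify pipeline can be sketched with NumPy alone. The data below are synthetic stand-ins for EEG features (class shift, feature counts, and the train/test split are all invented), and a simple 1-nearest-neighbour rule replaces the paper's SVM/KNN toolchain:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, informative = 60, 40, [0, 1, 2, 3, 4]    # synthetic stand-in for EEG features
X0 = rng.normal(0.0, 1.0, (n, d))              # "healthy" class
X1 = rng.normal(0.0, 1.0, (n, d))
X1[:, informative] += 2.0                       # class shift on informative features

def welch_t(a, b):
    """Per-feature Welch t-statistic between two groups."""
    va, vb = a.var(0, ddof=1) / len(a), b.var(0, ddof=1) / len(b)
    return (a.mean(0) - b.mean(0)) / np.sqrt(va + vb)

t = np.abs(welch_t(X0[:40], X1[:40]))           # rank features on the training split
top = np.argsort(t)[-5:]                        # keep the 5 most discriminative

def knn1(train, labels, x):
    """1-nearest-neighbour prediction by Euclidean distance."""
    return labels[np.argmin(np.linalg.norm(train - x, axis=1))]

Xtr = np.vstack([X0[:40][:, top], X1[:40][:, top]])
ytr = np.array([0] * 40 + [1] * 40)
Xte = np.vstack([X0[40:][:, top], X1[40:][:, top]])
yte = np.array([0] * 20 + [1] * 20)
acc = np.mean([knn1(Xtr, ytr, x) == y for x, y in zip(Xte, yte)])
print(acc)  # well above chance using only the selected features
```

Ranking on the training split only, as here, avoids the selection bias that inflates accuracy when features are chosen on the full dataset.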

  1. Explicit Integration of Extremely Stiff Reaction Networks: Partial Equilibrium Methods

    SciTech Connect

    Guidry, Mike W; Billings, J. J.; Hix, William Raphael

    2013-01-01

    In two preceding papers [1,2] we have shown that, when reaction networks are well removed from equilibrium, explicit asymptotic and quasi-steady-state approximations can give algebraically stabilized integration schemes that rival standard implicit methods in accuracy and speed for extremely stiff systems. However, we also showed that these explicit methods remain accurate but are no longer competitive in speed as the network approaches equilibrium. In this paper we analyze this failure and show that it is associated with the presence of fast equilibration timescales that neither asymptotic nor quasi-steady-state approximations are able to remove efficiently from the numerical integration. Based on this understanding, we develop a partial equilibrium method that removes these fast timescales algebraically. The explicit asymptotic methods, supplemented by the new partial equilibrium methods, give an integration scheme that plausibly can deal with the stiffest networks, even in the approach to equilibrium, with accuracy and speed competitive with that of implicit methods. Thus we demonstrate that algebraically stabilized explicit methods may offer alternatives to implicit integration of even extremely stiff systems, and that these methods may permit integration of much larger networks than have been feasible previously in a variety of fields.
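The algebraic stabilization underlying such explicit schemes can be illustrated on a single stiff linear rate equation. The sketch below uses the standard asymptotic update form, not the paper's full network integrator, and contrasts it with explicit Euler at a step size far beyond the explicit stability limit:

```python
# Stiff linear kinetic equation y' = F - k*y with fast rate k; equilibrium y = F/k.
F, k = 1.0, 1.0e6
y_eq = F / k
dt, steps = 1.0e-5, 200          # dt is 10x the explicit Euler stability limit 2/k

y_euler, y_asym = 0.0, 0.0
for _ in range(steps):
    y_euler = y_euler + dt * (F - k * y_euler)                # standard explicit update
    y_asym = y_asym + dt * (F - k * y_asym) / (1.0 + k * dt)  # asymptotic update
print(abs(y_euler))               # diverges: amplification |1 - k*dt| = 9 per step
print(abs(y_asym - y_eq) / y_eq)  # relaxes stably to equilibrium at the large step
```

The denominator 1 + k*dt is the algebraic stabilization: it damps the fast timescale instead of resolving it, which is exactly what fails to help once equilibrating reaction pairs dominate and the partial equilibrium treatment takes over.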

  2. Integrity of the Anterior Visual Pathway and Its Association with Ambulatory Performance in Multiple Sclerosis

    PubMed Central

    Sandroff, Brian M.; Pula, John H.; Motl, Robert W.

    2013-01-01

    Background. Retinal nerve fiber layer thickness (RNFLT) and total macular volume (TMV) represent markers of neuroaxonal degeneration within the anterior visual pathway that might correlate with ambulation in persons with multiple sclerosis (MS). Objective. This study examined the associations between RNFLT and TMV with ambulatory parameters in MS. Methods. Fifty-eight MS patients underwent a neurological examination for generation of an expanded disability status scale (EDSS) score and measurement of RNFLT and TMV using optical coherence tomography (OCT). Participants completed the 6-minute walk (6MW) and the timed 25-foot walk (T25FW). The associations were examined using generalized estimating equation models that accounted for within-patient, inter-eye correlations, and controlled for disease duration, EDSS score, and age. Results. RNFLT was not significantly associated with 6MW (P = 0.99) or T25FW (P = 0.57). TMV was significantly associated with 6MW (P = 0.023) and T25FW (P = 0.005). The coefficients indicated that unit differences in 6MW (100 feet) and T25FW (1 second) were associated with 0.040 and −0.048 unit differences in TMV (mm3), respectively. Conclusion. Integrity of the anterior visual pathway, particularly TMV, might represent a noninvasive measure of neuroaxonal degeneration that is correlated with ambulatory function in MS. PMID:23864950

  3. Neural Network Emulation of the Integral Equation Model with Multiple Scattering

    PubMed Central

    Pulvirenti, Luca; Ticconi, Francesca; Pierdicca, Nazzareno

    2009-01-01

    The Integral Equation Model with multiple scattering (IEMM) represents a well-established method that provides a theoretical framework for the scattering of electromagnetic waves from rough surfaces. A critical aspect is the long computational time required to run such a complex model. To deal with this problem, a neural network technique is proposed in this work. In particular, we have adopted neural networks to reproduce the backscattering coefficients predicted by IEMM at L- and C-bands, thus making reference to presently operating satellite radar sensors, i.e., the radar aboard ERS-2 and ASAR aboard ENVISAT (C-band), and PALSAR aboard ALOS (L-band). The neural network-based model has been designed for radar observations of both flat and tilted surfaces, in order to make it applicable to hilly terrains too. The assessment of the proposed approach has been carried out by comparing neural network-derived backscattering coefficients with IEMM-derived ones. Databases different from those employed to train the networks have been used for this purpose. The outcomes seem to prove the feasibility of relying on a neural network approach to efficiently and reliably approximate an electromagnetic model of surface scattering. PMID:22408496

  4. The use of multiple oxygen implants for fabrication of bipolar silicon-on-insulator integrated circuits

    NASA Astrophysics Data System (ADS)

    Platteter, Dale G.; Cheek, Tom F., Jr.

    1988-12-01

    A description is given of the radiation improvements obtained by fabricating bipolar integrated circuits on oxygen-implanted silicon-on-insulator (SIMOX) substrates that were manufactured with multiple low-dose implants. Bipolar 74ALS00 gates fabricated on these substrates showed an improvement in total-dose and dose-rate radiation response over identical circuits fabricated in bulk silicon. Defects in the SIMOX material were reduced by over four orders of magnitude. The results demonstrate that bipolar devices fabricated on multiple-implant SIMOX substrates can compete with conventional dielectric isolation for many radiation-hardened system applications.

  5. High sensitivity detection of NO2 employing off-axis integrated cavity output spectroscopy coupled with multiple line integrated spectroscopy

    NASA Astrophysics Data System (ADS)

    Rao, Gottipaty N.; Karpf, Andreas

    2011-05-01

    We report on the development of a new sensor for NO2 with ultrahigh detection sensitivity. This has been accomplished by combining off-axis integrated cavity output spectroscopy (OA-ICOS), which can provide path lengths of the order of several km in a small-volume cell, with multiple line integrated absorption spectroscopy (MLIAS), in which the absorption spectra are integrated over a large number of rotational-vibrational transitions of the molecular species to further improve the sensitivity. Employing an external-cavity tunable quantum cascade laser operating in the 1601 - 1670 cm-1 range and a high-finesse optical cavity, the absorption spectra of NO2 over 100 transitions in the R-band have been recorded. From the observed linear relationship between integrated absorption and NO2 concentration, we report an effective detection sensitivity of 10 ppt for NO2. To the best of our knowledge, this is among the most sensitive levels of detection of NO2 to date. A sensitive NO2 sensor will be helpful for monitoring ambient air quality and combustion emissions from automobiles, power plants, and aircraft, and for the detection of nitrate-based explosives (commonly used in improvised explosive devices, IEDs). Additionally, such a sensor would be valuable for the study of the complex chemical reactions that occur in the atmosphere, resulting in the formation of photochemical smog, tropospheric ozone, and acid rain.
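The multiple-line integration idea can be sketched numerically: simulate a band containing many lines, integrate the absorbance across the whole band, and check linearity in concentration. All line positions, strengths, widths, and noise levels below are invented placeholders, not NO2 spectroscopic data:

```python
import numpy as np

rng = np.random.default_rng(1)
nu = np.linspace(1601.0, 1670.0, 4000)        # wavenumber grid, cm^-1
centers = rng.uniform(1602.0, 1669.0, 100)    # 100 synthetic line centers
strengths = rng.uniform(0.5, 1.5, 100)        # relative line strengths

def absorbance(conc):
    """Band absorbance: many Lorentzian lines, each scaled by concentration."""
    A = np.zeros_like(nu)
    for c0, s in zip(centers, strengths):
        A += conc * s * 0.01 / ((nu - c0) ** 2 + 0.01)   # HWHM 0.1 cm^-1
    return A + rng.normal(0.0, 1e-6, nu.size)            # detector noise

def band_integral(A):
    """Trapezoidal integral of the absorbance over the whole band."""
    return np.sum(0.5 * (A[1:] + A[:-1]) * np.diff(nu))

concs = np.array([1.0, 2.0, 4.0, 8.0, 16.0]) * 1e-6      # mole fractions
integrated = np.array([band_integral(absorbance(c)) for c in concs])
r = np.corrcoef(concs, integrated)[0, 1]
print(r)  # integrated absorbance over all lines is linear in concentration
```

Summing over ~100 transitions averages down line-by-line noise, which is the statistical leverage behind MLIAS's sensitivity gain over single-line fitting.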

  6. Evaluating marginal likelihood with thermodynamic integration method and comparison with several other numerical methods

    SciTech Connect

    Liu, Peigui; Elshall, Ahmed S.; Ye, Ming; Beerli, Peter; Zeng, Xiankui; Lu, Dan; Tao, Yuezan

    2016-02-05

    Evaluating marginal likelihood is the most critical and computationally expensive task when conducting Bayesian model averaging to quantify parametric and model uncertainties. The evaluation is commonly done by using Laplace approximations to evaluate semianalytical expressions of the marginal likelihood, or by using Monte Carlo (MC) methods to evaluate the arithmetic or harmonic mean of a joint likelihood function. This study introduces a new MC method, thermodynamic integration, which has not previously been attempted in environmental modeling. Instead of using samples only from the prior parameter space (as in arithmetic mean evaluation) or the posterior parameter space (as in harmonic mean evaluation), the thermodynamic integration method uses samples generated gradually from the prior to the posterior parameter space. This is done through path sampling that conducts Markov chain Monte Carlo simulation with different power coefficient values applied to the joint likelihood function. The thermodynamic integration method is evaluated using three analytical functions by comparing it with two variants of the Laplace approximation method and three MC methods, including the nested sampling method recently introduced into environmental modeling. The thermodynamic integration method outperforms the other methods in accuracy, convergence, and consistency. The thermodynamic integration method is also applied to a synthetic case of groundwater modeling with four alternative models. The application shows that model probabilities obtained using the thermodynamic integration method improve the predictive performance of Bayesian model averaging. As a result, the thermodynamic integration method is mathematically rigorous, and its MC implementation is computationally general for a wide range of environmental problems.
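Thermodynamic integration can be sketched on a conjugate Gaussian toy model, where the marginal likelihood is known in closed form. Here each power posterior is Gaussian and is sampled exactly for brevity; in a real model each rung of the temperature ladder would be sampled by MCMC, as in the study:

```python
import numpy as np

# Toy model: theta ~ N(0, tau^2), y | theta ~ N(theta, sigma^2), so the exact
# marginal likelihood is log Z = log N(y; 0, sigma^2 + tau^2).
y, sigma, tau = 1.2, 1.0, 1.0
s2 = sigma**2 + tau**2
log_z_exact = -0.5 * np.log(2 * np.pi * s2) - y**2 / (2 * s2)

rng = np.random.default_rng(7)
betas = np.linspace(0.0, 1.0, 41)            # power coefficients, prior -> posterior
e_loglik = []
for b in betas:
    # Power posterior p_b(theta) ∝ prior(theta) * likelihood(theta)^b is Gaussian here.
    prec = 1.0 / tau**2 + b / sigma**2
    m, v = (b * y / sigma**2) / prec, 1.0 / prec
    th = rng.normal(m, np.sqrt(v), 20000)    # exact sampling (MCMC in general)
    e_loglik.append(np.mean(-0.5 * np.log(2 * np.pi * sigma**2)
                            - (y - th) ** 2 / (2 * sigma**2)))
e_loglik = np.array(e_loglik)
# Thermodynamic integration identity: log Z = integral_0^1 E_beta[log likelihood] d(beta)
log_z_ti = np.sum(0.5 * (e_loglik[1:] + e_loglik[:-1]) * np.diff(betas))
print(log_z_ti, log_z_exact)  # the two agree closely
```

The trapezoidal rule over the beta ladder is the "path" of path sampling; ladder spacing and per-rung sample size are the two accuracy knobs.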

  7. Multiple trim magnets, or "magic fingers," for insertion device field integral correction

    SciTech Connect

    Hoyer, E.; Marks, S.; Pipersky, P.; Schlueter, R.

    1995-02-01

    Multiple trim magnets (MTMs), also known as "magic fingers," are an arrangement of magnets for reducing integrated magnetic-field errors in insertion devices. The idea is to use transverse arrays of permanent magnets, hence the name "multiple trim magnets," above and below the midplane, to correct both normal and skew longitudinal magnetic-field integral errors in a device. MTMs are typically installed at the ends of an ID. Adjustments are made by changing either the size, position, or orientation of each trim magnet. Application of the MTMs to the ALS undulators reduced both the normal and skew longitudinal field integral errors, over the entire 20 mm × 60 mm "good field region" of the beam aperture, by as much as an order of magnitude. The requirements included corrections of field and gradients outside the multipole convergence radius. Additionally, these trim magnet arrays provided correction of the linear component of the integrated field gradients for particles with trajectories not parallel to the nominal beam axis. The MTM concept, design, construction, tests that demonstrated feasibility, and magnetic-field integral reduction of ALS undulators are presented.

  8. Methods for the joint meta-analysis of multiple tests.

    PubMed

    Trikalinos, Thomas A; Hoaglin, David C; Small, Kevin M; Terrin, Norma; Schmid, Christopher H

    2014-12-01

    Existing methods for meta-analysis of diagnostic test accuracy focus primarily on a single index test. We propose models for the joint meta-analysis of studies comparing multiple index tests on the same participants in paired designs. These models respect the grouping of data by studies, account for the within-study correlation between the tests' true-positive rates (TPRs) and between their false-positive rates (FPRs) (induced because tests are applied to the same participants), and allow for between-study correlations between TPRs and FPRs (such as those induced by threshold effects). We estimate models in the Bayesian setting. We demonstrate using a meta-analysis of screening for Down syndrome with two tests: shortened humerus (arm bone), and shortened femur (thigh bone). Separate and joint meta-analyses yielded similar TPR and FPR estimates. For example, the summary TPR for a shortened humerus was 35.3% (95% credible interval (CrI): 26.9, 41.8%) versus 37.9% (27.7, 50.3%) with joint versus separate meta-analysis. Joint meta-analysis is more efficient when calculating comparative accuracy: the difference in the summary TPRs was 0.0% (-8.9, 9.5%; TPR higher for shortened humerus) with joint versus 2.6% (-14.7, 19.8%) with separate meta-analyses. Simulation and empirical analyses are needed to refine the role of the proposed methodology. Copyright © 2014 John Wiley & Sons, Ltd.

  9. Laser housing having integral mounts and method of manufacturing same

    DOEpatents

    Herron, Michael Alan; Brickeen, Brian Keith

    2004-10-19

    A housing adapted to position, support, and facilitate aligning various components, including an optical path assembly, of a laser. In a preferred embodiment, the housing is constructed from a single piece of material and broadly comprises one or more through-holes; one or more cavities; and one or more integral mounts, wherein the through-holes and the cavities cooperate to define the integral mounts. Securement holes machined into the integral mounts facilitate securing components within the integral mounts using set screws, adhesive, or a combination thereof. In a preferred method of making the housing, the through-holes and cavities are first machined into the single piece of material, with at least some of the remaining material forming the integral mounts.

  10. Application of integrated fluid-thermal-structural analysis methods

    NASA Technical Reports Server (NTRS)

    Wieting, Allan R.; Dechaumphai, Pramote; Bey, Kim S.; Thornton, Earl A.; Morgan, Ken

    1988-01-01

    Hypersonic vehicles operate in a hostile aerothermal environment which has a significant impact on their aerothermostructural performance. Significant coupling occurs between the aerodynamic flow field, structural heat transfer, and structural response creating a multidisciplinary interaction. Interfacing state-of-the-art disciplinary analysis methods is not efficient, hence interdisciplinary analysis methods integrated into a single aerothermostructural analyzer are needed. The NASA Langley Research Center is developing such methods in an analyzer called LIFTS (Langley Integrated Fluid-Thermal-Structural) analyzer. The evolution and status of LIFTS is reviewed and illustrated through applications.

  11. Application of integrated fluid-thermal structural analysis methods

    NASA Technical Reports Server (NTRS)

    Wieting, Allan R.; Dechaumphai, Pramote; Bey, Kim S.; Thornton, Earl A.; Morgan, Ken

    1988-01-01

    Hypersonic vehicles operate in a hostile aerothermal environment which has a significant impact on their aerothermostructural performance. Significant coupling occurs between the aerodynamic flow field, structural heat transfer, and structural response, creating a multidisciplinary interaction. Interfacing state-of-the-art disciplinary analysis methods is not efficient, hence interdisciplinary analysis methods integrated into a single aerothermostructural analyzer are needed. The NASA Langley Research Center is developing such methods in an analyzer called LIFTS (Langley Integrated Fluid-Thermal-Structural) analyzer. The evolution and status of LIFTS is reviewed and illustrated through applications.

  12. [Study on plastic film thickness measurement by integral spectrum method].

    PubMed

    Qiu, Chao; Sun, Xiao-Gang

    2013-01-01

    After the concept of a band Lambert law was proposed, band integral transmission was defined and a plastic film thickness measurement model was built by analyzing the intensity variation as light passes through the film. Polypropylene film samples of different thickness were taken as the research object, and their spectral transmission was measured with a spectrometer. The relationship between thickness and band integral transmission was fitted using the model above. The feasibility of developing a new broadband plastic film thickness on-line measurement system based on this method was analyzed, employing an ideal blackbody at a temperature of 500 K. The experimental results indicate that plastic film thickness can be measured accurately by the integral spectrum method. A thickness on-line measurement system based on this method may solve the problems of systems based on the dual monochromatic light contrast method, such as low accuracy and poor universality.
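The band Lambert-law measurement model can be sketched as a forward band-integral transmission plus a monotone inversion. The source spectrum and absorption coefficients below are invented placeholders, not polypropylene data:

```python
import numpy as np

lam = np.linspace(3.2, 3.6, 400)                          # band, micrometres (invented)
alpha = 8.0 + 30.0 * np.exp(-((lam - 3.43) / 0.05) ** 2)  # absorption coeff, 1/mm (toy)
S = np.exp(-((lam - 3.4) / 0.15) ** 2)                    # broadband source spectrum

def band_transmission(d):
    """Band-integral transmission for film thickness d (mm): the band Lambert law
    integrates exp(-alpha*d) over the source spectrum instead of one wavelength."""
    return np.sum(S * np.exp(-alpha * d)) / np.sum(S)

def thickness_from_transmission(T, lo=0.0, hi=1.0, iters=60):
    """Invert the strictly decreasing calibration curve by bisection."""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if band_transmission(mid) > T:
            lo = mid            # mid is too thin: a thicker film transmits less
        else:
            hi = mid
    return 0.5 * (lo + hi)

d_true = 0.123
T_meas = band_transmission(d_true)      # simulated broadband measurement
print(thickness_from_transmission(T_meas))  # recovers ~0.123 mm
```

Because the band-integrated transmission is monotone in thickness, a single broadband intensity reading suffices, which is the practical advantage over the dual monochromatic contrast scheme.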

  13. A Comparison of Treatment Integrity Assessment Methods for Behavioral Intervention

    ERIC Educational Resources Information Center

    Koh, Seong A.

    2010-01-01

    The purpose of this study was to examine the similarity of outcomes from three different treatment integrity (TI) methods, and to identify the method which best corresponded to the assessment of a child's behavior. Six raters were recruited through individual contact via snowball sampling. A modified intervention component list and 19 video clips…

  14. When Curriculum and Technology Meet: Technology Integration in Methods Courses

    ERIC Educational Resources Information Center

    Keeler, Christy G.

    2008-01-01

    Reporting on the results of an action research study, this manuscript provides examples of strategies used to integrate technology into a content methods course. The study used reflective teaching of a social studies methods course at a major Southwestern university in 10 course sections over a four-semester period. In alignment with the research…

  15. A Comparison of Treatment Integrity Assessment Methods for Behavioral Intervention

    ERIC Educational Resources Information Center

    Koh, Seong A.

    2010-01-01

    The purpose of this study was to examine the similarity of outcomes from three different treatment integrity (TI) methods, and to identify the method which best corresponded to the assessment of a child's behavior. Six raters were recruited through individual contact via snowball sampling. A modified intervention component list and 19 video clips…

  16. System and method for integrating hazard-based decision making tools and processes

    DOEpatents

    Hodgin, C Reed [Westminster, CO

    2012-03-20

    A system and method for inputting, analyzing, and disseminating information necessary for identified decision-makers to respond to emergency situations. This system and method provides consistency and integration among multiple groups, and may be used for both initial consequence-based decisions and follow-on consequence-based decisions. The system and method in a preferred embodiment also provides tools for accessing and manipulating information that are appropriate for each decision-maker, in order to achieve more reasoned and timely consequence-based decisions. The invention includes processes for designing and implementing a system or method for responding to emergency situations.

  17. Genetic, Epigenetic, and Environmental Factors Influencing Neurovisceral Integration of Cardiovascular Modulation: Focus on Multiple Sclerosis.

    PubMed

    Sternberg, Zohara

    2016-03-01

    Thought to be an autoimmune inflammatory CNS disease, multiple sclerosis (MS) involves multiple pathologies with heterogeneous clinical presentations. Impaired neurovisceral integration of cardiovascular modulation, indicated by sympathetic and parasympathetic autonomic nervous system (ANS) dysfunction, is among the common MS clinical presentations. ANS dysfunction can not only enhance MS inflammatory and neurodegenerative processes but also lead to clinical symptoms such as depression, fatigue, sleep disorder, migraine, osteoporosis, and cerebral hemodynamic impairments. Therefore, factors influencing ANS functional activities, in one way or another, will have a significant impact on the MS disease course. This review describes the genetic and epigenetic factors, and their interactions with a number of environmental factors, that contribute to the neurovisceral integration of cardiovascular modulation, with a focus on MS. Future studies should investigate improvement of cardiovascular ANS function as a strategy for preventing and minimizing MS-related morbidities and improving patients' quality of life.

  18. Integrity Verification for Multiple Data Copies in Cloud Storage Based on Spatiotemporal Chaos

    NASA Astrophysics Data System (ADS)

    Long, Min; Li, You; Peng, Fei

    Aiming to strike a balance between the security, efficiency, and availability of data verification in cloud storage, a novel integrity verification scheme based on spatiotemporal chaos is proposed for multiple data copies. Spatiotemporal chaos is used for node calculation in the binary tree, and the location of the data in the cloud is verified. Meanwhile, dynamic operations can be applied to the data. Furthermore, blind information is used to prevent a third-party auditor (TPA) from leaking users' data privacy in the public auditing process. Performance analysis and discussion indicate that the scheme is secure and efficient, and that it supports dynamic operations and integrity verification of multiple copies of data. It has great potential to be implemented in cloud storage services.
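The binary-tree verification structure can be sketched with a conventional SHA-256 Merkle tree standing in for the paper's spatiotemporal-chaos node calculation (the chaos map itself is not reproduced here; this is the generic hash-tree pattern such schemes build on):

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    """Binary hash tree over data blocks; each internal node commits to both
    children, so the root commits to every block of the stored copy."""
    level = [h(b) for b in leaves]
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])            # duplicate last node on odd levels
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

blocks = [f"copy-1 block {i}".encode() for i in range(8)]
root = merkle_root(blocks)                     # value retained by the verifier

# Challenge a stored copy later: recomputing the root detects any modification.
tampered = list(blocks)
tampered[3] = b"corrupted block"
print(merkle_root(blocks) == root)      # True: an intact copy passes
print(merkle_root(tampered) == root)    # False: the modification is detected
```

Running one tree per replica gives per-copy roots, and updating only the path from a changed leaf to the root is what makes dynamic operations cheap.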

  19. Tuning of PID controllers for integrating systems using direct synthesis method.

    PubMed

    Anil, Ch; Padma Sree, R

    2015-07-01

    A PID controller is designed for various forms of integrating systems with time delay using the direct synthesis method. The method is based on comparing the characteristic equation of the integrating system and PID controller with a filter with the desired characteristic equation. The desired characteristic equation comprises multiple poles placed at the same desired location. The tuning parameter is adjusted to achieve the desired robustness. Tuning rules in terms of process parameters are given for various forms of integrating systems. The tuning parameter can be selected for the desired robustness by specifying the Ms value. The proposed controller design method is applied to various transfer function models and to the nonlinear model equations of a jacketed CSTR to show its effectiveness and applicability.
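The direct synthesis algebra can be sketched for the simplest case: a delay-free integrating process G(s) = K/s with desired closed loop (2λs+1)/(λs+1)² yields the PI controller Kc = 2/(Kλ), τI = 2λ. This is a textbook special case used for illustration, not the paper's PID rules for delayed systems:

```python
# Direct synthesis for G(s) = K/s: C(s) = Q/(G(1-Q)) with Q = (2*lam*s+1)/(lam*s+1)^2
# reduces to a PI controller with Kc = 2/(K*lam) and tau_i = 2*lam (no dead time).
K, lam = 0.5, 1.0                 # process gain and tuning parameter (invented values)
Kc, tau_i = 2.0 / (K * lam), 2.0 * lam

dt, T = 1e-3, 15.0                # forward-Euler closed-loop simulation
y, integ, r = 0.0, 0.0, 1.0      # output, integral of error, setpoint
for _ in range(int(T / dt)):
    e = r - y
    integ += e * dt
    u = Kc * (e + integ / tau_i)  # PI control law
    y += dt * K * u               # integrating plant y' = K*u
print(y)  # settles at the setpoint with no offset (closed-loop double pole at -1/lam)
```

Increasing λ slows the response but raises robustness, which is the single-knob trade-off the Ms specification formalizes.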

  20. New inversion methods for the Lorentz Integral Transform

    NASA Astrophysics Data System (ADS)

    Andreasi, D.; Leidemann, W.; Reiß, C.; Schwamb, M.

    2005-06-01

    The Lorentz Integral Transform approach allows microscopic calculations of electromagnetic reaction cross-sections without explicit knowledge of final-state wave functions. The necessary inversion of the transform has to be treated with great care, since it constitutes a so-called ill-posed problem. In this work new inversion techniques for the Lorentz Integral Transform are introduced. It is shown that they all contain a regularization scheme, which is necessary to overcome the ill-posed problem. In addition, it is illustrated that the new techniques have a much broader range of application than the present standard inversion method of the Lorentz Integral Transform.
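The ill-posedness and the need for regularization can be illustrated by discretizing a Lorentz-style kernel, transforming a known response, and inverting with Tikhonov regularization (one simple regularization scheme of the general kind the authors discuss; the grids, widths, noise, and regularization parameter below are all invented):

```python
import numpy as np

# Discretized Lorentz-type kernel L(sigma, E) = 1/((E - sigma)^2 + sI^2), sI = 1.
E = np.linspace(0.0, 10.0, 80)
sig = np.linspace(0.0, 10.0, 80)
dE = E[1] - E[0]
A = dE / ((E[None, :] - sig[:, None]) ** 2 + 1.0)

f_true = np.exp(-((E - 4.0) ** 2) / (2 * 0.8 ** 2))     # response to be recovered
b = A @ f_true + np.random.default_rng(3).normal(0.0, 1e-5, sig.size)  # noisy transform

# Tikhonov-regularized inversion: minimize ||A f - b||^2 + alpha * ||f||^2.
alpha = 1e-5
f_rec = np.linalg.solve(A.T @ A + alpha * np.eye(E.size), A.T @ b)
print(E[np.argmax(f_rec)])   # peak of the recovered response sits near E = 4
```

Without the alpha term the normal equations are numerically singular and the tiny noise is amplified without bound; the regularization parameter plays the same stabilizing role as the schemes built into the new inversion techniques.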

  1. Jacobi-Integral Method For Two-Body Problem

    NASA Technical Reports Server (NTRS)

    Bond, Victor R.; Gottlieb, Robert G.; Fraietta, Michael F.

    1991-01-01

    The Jacobi-integral method enables efficient, accurate computation of the trajectory of a natural satellite or spacecraft perturbed by a component of the gravitational potential that depends explicitly on both position and time. Instead of the total energy, the Jacobi integral, which is an energy-like constant of motion in this case, is embedded in the Newtonian differential equations of motion. Trajectories are computed in fewer steps. With modifications, the method is applicable to such terrestrial problems as the motions of rotors and of beams of electrically charged particles in changing electric and magnetic fields.

  2. Promising Perceptions, Divergent Practices and Barriers to Integrated Malaria Prevention in Wakiso District, Uganda: A Mixed Methods Study

    PubMed Central

    Musoke, David; Miiro, George; Karani, George; Morris, Keith; Kasasa, Simon; Ndejjo, Rawlance; Nakiyingi-Miiro, Jessica; Guwatudde, David; Musoke, Miph Boses

    2015-01-01

    Background: The World Health Organization recommends use of multiple approaches to control malaria. The integrated approach to malaria prevention advocates the use of several malaria prevention methods in a holistic manner. This study assessed perceptions and practices on integrated malaria prevention in Wakiso district, Uganda. Methods: A clustered cross-sectional survey was conducted among 727 households from 29 villages using both quantitative and qualitative methods. Assessment was done on awareness of various malaria prevention methods, potential for use of the methods in a holistic manner, and reasons for dislike of certain methods. Households were classified as using integrated malaria prevention if they used at least two methods. Logistic regression was used to test for factors associated with the use of integrated malaria prevention while adjusting for clustering within villages. Results: Participants knew of the various malaria prevention methods in the integrated approach, including use of insecticide treated nets (97.5%), removing mosquito breeding sites (89.1%), clearing overgrown vegetation near houses (97.9%), and closing windows and doors early in the evenings (96.4%). If trained, most participants (68.6%) would use all the suggested malaria prevention methods of the integrated approach. Among those who would not use all methods, the main reasons given were there being too many (70.2%) and cost (32.0%). Only 33.0% of households were using the integrated approach to prevent malaria. Use of integrated malaria prevention by households was associated with reading newspapers (AOR 0.34; 95% CI 0.22-0.53) and ownership of a motorcycle/car (AOR 1.75; 95% CI 1.03-2.98). Conclusion: Although knowledge of malaria prevention methods was high and perceptions of the integrated approach promising, the practice of integrated malaria prevention was relatively low. The use of the integrated approach can be improved by promoting use of multiple malaria prevention methods.

  3. Modeling of multiple-optical-axis pattern-integrated interference lithography systems.

    PubMed

    Sedivy, Donald E; Gaylord, Thomas K

    2014-06-01

    The image quality and collimation in a multiple-optical-axis pattern-integrated interference lithography system are evaluated for an elementary optical system composed of single-element lenses. Image quality and collimation are individually and jointly optimized for these lenses. Example images for a jointly optimized system are simulated using a combination of ray tracing and Fourier analysis. Even with these nonoptimized components, reasonable fidelity is shown to be possible.

  4. It is time to integrate: the temporal dynamics of object motion and texture motion integration in multiple object tracking.

    PubMed

    Huff, Markus; Papenmeier, Frank

    2013-01-14

In multiple-object tracking, participants can track several moving objects among identical distractors. It has recently been shown that the human visual system uses motion information in order to keep track of targets (St. Clair et al., Journal of Vision, 10(4), 1-13). Texture on the surface of an object that moved in the opposite direction to the object itself impaired tracking performance. In this study, we examined the temporal interval at which texture motion and object motion are integrated in dynamic scenes. In two multiple-object tracking experiments, we manipulated the texture motion on the objects: the texture either moved in the same direction as the objects, in the opposite direction, or alternated between the same and opposite direction at varying intervals. In Experiment 1, we show that the integration of object motion and texture motion can take place at intervals as short as 100 ms. In Experiment 2, we show that there is a linear relationship between the proportion of opposite texture motion and tracking performance. We suggest that texture motion might cause shifts in perceived object locations, thus influencing tracking performance.

  5. Scientific concepts and applications of integrated discrete multiple organ co-culture technology

    PubMed Central

    Gayathri, Loganathan; Dhanasekaran, Dharumadurai; Akbarsha, Mohammad A.

    2015-01-01

Over several decades, animals have been used as models to investigate human-specific drug toxicity, but the outcomes are not always reliably extrapolated to humans in vivo. An appropriate in vitro human-based experimental system that includes in vivo parameters is required for the evaluation of multiple organ interaction, multiple organ/organ-specific toxicity, and metabolism of xenobiotic compounds, so as to avoid the use of animals for toxicity testing. One such versatile in vitro technology in which human primary cells can be used is integrated discrete multiple organ co-culture (IdMOC). The IdMOC system adopts a wells-within-a-well concept that facilitates co-culture of cells from different organs in a discrete manner, separately in their respective media in the smaller inner wells, which are then interconnected by an overlay of a universal medium in the large containing well. This novel in vitro approach mimics the in vivo situation to a great extent, and employs cells from multiple organs that are physically separated but interconnected by a medium that mimics the systemic circulation and provides for multiple organ interaction. Applications of IdMOC include assessment of multiple organ toxicity, drug distribution, organ-specific toxicity, screening of anticancer drugs, metabolic cytotoxicity, etc. PMID:25969651

  6. Development of Improved Surface Integral Methods for Jet Aeroacoustic Predictions

    NASA Technical Reports Server (NTRS)

    Pilon, Anthony R.; Lyrintzis, Anastasios S.

    1997-01-01

The accurate prediction of aerodynamically generated noise has become an important goal over the past decade. Aeroacoustics must now be an integral part of the aircraft design process. The direct calculation of aerodynamically generated noise with CFD-like algorithms is plausible. However, large computer time and memory requirements often make these predictions impractical. It is therefore necessary to separate the aeroacoustics problem into two parts, one in which aerodynamic sound sources are determined, and another in which the propagating sound is calculated. This idea is applied in acoustic analogy methods. However, in the acoustic analogy, the determination of far-field sound requires the solution of a volume integral. This volume integration again leads to impractical computer requirements. An alternative to the volume integrations can be found in the Kirchhoff method. In this method, Green's theorem for the linear wave equation is used to determine sound propagation based on quantities on a surface surrounding the source region. The change from volume to surface integrals represents a tremendous savings in the computer resources required for an accurate prediction. This work is concerned with the development of enhancements of the Kirchhoff method for use in a wide variety of aeroacoustics problems. This enhanced method, the modified Kirchhoff method, is shown to be a Green's function solution of Lighthill's equation. It is also shown rigorously to be identical to the methods of Ffowcs Williams and Hawkings. This allows for development of versatile computer codes which can easily alternate between the different Kirchhoff and Ffowcs Williams-Hawkings formulations, using the most appropriate method for the problem at hand. The modified Kirchhoff method is developed primarily for use in jet aeroacoustics predictions. Applications of the method are shown for two-dimensional and three-dimensional jet flows. Additionally, the enhancements are generalized so that

  7. A flexible importance sampling method for integrating subgrid processes

    DOE PAGES

    Raut, E. K.; Larson, V. E.

    2016-01-29

Numerical models of weather and climate need to compute grid-box-averaged rates of physical processes such as microphysics. These averages are computed by integrating subgrid variability over a grid box. For this reason, an important aspect of atmospheric modeling is spatial integration over subgrid scales. The needed integrals can be estimated by Monte Carlo integration. Monte Carlo integration is simple and general but requires many evaluations of the physical process rate. To reduce the number of function evaluations, this paper describes a new, flexible method of importance sampling. It divides the domain of integration into eight categories, such as the portion that contains both precipitation and cloud, or the portion that contains precipitation but no cloud. It then allows the modeler to prescribe the density of sample points within each of the eight categories. The new method is incorporated into the Subgrid Importance Latin Hypercube Sampler (SILHS). The resulting method is tested on drizzling cumulus and stratocumulus cases. In the cumulus case, the sampling error can be considerably reduced by drawing more sample points from the region of rain evaporation.

  8. A flexible importance sampling method for integrating subgrid processes

    NASA Astrophysics Data System (ADS)

    Raut, E. K.; Larson, V. E.

    2016-01-01

    Numerical models of weather and climate need to compute grid-box-averaged rates of physical processes such as microphysics. These averages are computed by integrating subgrid variability over a grid box. For this reason, an important aspect of atmospheric modeling is spatial integration over subgrid scales. The needed integrals can be estimated by Monte Carlo integration. Monte Carlo integration is simple and general but requires many evaluations of the physical process rate. To reduce the number of function evaluations, this paper describes a new, flexible method of importance sampling. It divides the domain of integration into eight categories, such as the portion that contains both precipitation and cloud, or the portion that contains precipitation but no cloud. It then allows the modeler to prescribe the density of sample points within each of the eight categories. The new method is incorporated into the Subgrid Importance Latin Hypercube Sampler (SILHS). The resulting method is tested on drizzling cumulus and stratocumulus cases. In the cumulus case, the sampling error can be considerably reduced by drawing more sample points from the region of rain evaporation.
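A minimal pure-Python sketch of this kind of prescribed-density sampling, assuming a toy 1-D domain with two strata standing in for the paper's eight cloud/precipitation categories (this illustrates the stratified Monte Carlo idea only, not the actual SILHS implementation):

```python
import random

def stratified_estimate(f, strata, n_per_stratum, seed=0):
    """Estimate the integral of f over [0, 1] by stratified sampling.

    strata: list of (lo, hi) sub-intervals partitioning [0, 1].
    n_per_stratum: sample count prescribed for each stratum, playing
    the role of the modeler-chosen sample densities per category.
    """
    rng = random.Random(seed)
    total = 0.0
    for (lo, hi), n in zip(strata, n_per_stratum):
        # Mean of f over uniform draws in this stratum ...
        mean_f = sum(f(rng.uniform(lo, hi)) for _ in range(n)) / n
        # ... weighted by the stratum's probability mass (its width).
        total += (hi - lo) * mean_f
    return total

# Concentrating samples in the stratum where f varies most (analogous
# to drawing more points from the rain-evaporation region) reduces the
# sampling error for a fixed total number of function evaluations.
estimate = stratified_estimate(lambda x: 3 * x * x, [(0.0, 0.5), (0.5, 1.0)], [500, 3500])
```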

  9. Performance Analysis of Fault Detection and Identification for Multiple Faults in GNSS and GNSS/INS Integration

    NASA Astrophysics Data System (ADS)

    Alqurashi, Muwaffaq; Wang, Jinling

    2015-03-01

For positioning, navigation and timing (PNT) purposes, GNSS or GNSS/INS integration is utilised to provide real-time solutions. However, any potential sensor failures or faulty measurements due to malfunctions of sensor components or harsh operating environments may cause unsatisfactory estimation of PNT parameters. The inability to immediately detect faulty measurements or sensor component failures reduces the overall performance of the system. Real-time detection and identification of faulty measurements is therefore required to make the system more accurate and reliable for applications that need real-time solutions, such as real-time mapping for safety or emergency purposes. Consequently, it is necessary to implement an online fault detection and isolation (FDI) algorithm, a statistics-based approach to detect and identify multiple faults. However, further investigation of the performance of FDI under multiple-fault scenarios is still required. In this paper, the performance of the FDI method under multiple-fault scenarios is evaluated, e.g., for two, three and four faults in the GNSS and GNSS/INS measurements under different conditions of visible satellites and satellite geometry. In addition, reliability (e.g., MDB) and separability (correlation coefficients between fault detection statistics) measures are investigated to quantify the capability of the FDI method. A performance analysis of the FDI method is conducted under geometric constraints to show the importance of the FDI method in terms of fault detectability and separability for robust positioning and navigation in real-time applications.

  10. The eye in hand: predicting others' behavior by integrating multiple sources of information.

    PubMed

    Ambrosini, Ettore; Pezzulo, Giovanni; Costantini, Marcello

    2015-04-01

    The ability to predict the outcome of other beings' actions confers significant adaptive advantages. Experiments have assessed that human action observation can use multiple information sources, but it is currently unknown how they are integrated and how conflicts between them are resolved. To address this issue, we designed an action observation paradigm requiring the integration of multiple, potentially conflicting sources of evidence about the action target: the actor's gaze direction, hand preshape, and arm trajectory, and their availability and relative uncertainty in time. In two experiments, we analyzed participants' action prediction ability by using eye tracking and behavioral measures. The results show that the information provided by the actor's gaze affected participants' explicit predictions. However, results also show that gaze information was disregarded as soon as information on the actor's hand preshape was available, and this latter information source had widespread effects on participants' prediction ability. Furthermore, as the action unfolded in time, participants relied increasingly more on the arm movement source, showing sensitivity to its increasing informativeness. Therefore, the results suggest that the brain forms a robust estimate of the actor's motor intention by integrating multiple sources of information. However, when informative motor cues such as a preshaped hand with a given grip are available and might help in selecting action targets, people tend to capitalize on such motor cues, thus turning out to be more accurate and fast in inferring the object to be manipulated by the other's hand. Copyright © 2015 the American Physiological Society.

  11. The eye in hand: predicting others' behavior by integrating multiple sources of information

    PubMed Central

    Pezzulo, Giovanni; Costantini, Marcello

    2015-01-01

    The ability to predict the outcome of other beings' actions confers significant adaptive advantages. Experiments have assessed that human action observation can use multiple information sources, but it is currently unknown how they are integrated and how conflicts between them are resolved. To address this issue, we designed an action observation paradigm requiring the integration of multiple, potentially conflicting sources of evidence about the action target: the actor's gaze direction, hand preshape, and arm trajectory, and their availability and relative uncertainty in time. In two experiments, we analyzed participants' action prediction ability by using eye tracking and behavioral measures. The results show that the information provided by the actor's gaze affected participants' explicit predictions. However, results also show that gaze information was disregarded as soon as information on the actor's hand preshape was available, and this latter information source had widespread effects on participants' prediction ability. Furthermore, as the action unfolded in time, participants relied increasingly more on the arm movement source, showing sensitivity to its increasing informativeness. Therefore, the results suggest that the brain forms a robust estimate of the actor's motor intention by integrating multiple sources of information. However, when informative motor cues such as a preshaped hand with a given grip are available and might help in selecting action targets, people tend to capitalize on such motor cues, thus turning out to be more accurate and fast in inferring the object to be manipulated by the other's hand. PMID:25568158

  12. Explicit Integration of Extremely Stiff Reaction Networks: Asymptotic Methods

    SciTech Connect

    Guidry, Mike W; Budiardja, R.; Feger, E.; Billings, J. J.; Hix, William Raphael; Messer, O.E.B.; Roche, K. J.; McMahon, E.; He, M.

    2013-01-01

    We show that, even for extremely stiff systems, explicit integration may compete in both accuracy and speed with implicit methods if algebraic methods are used to stabilize the numerical integration. The stabilizing algebra differs for systems well removed from equilibrium and those near equilibrium. This paper introduces a quantitative distinction between these two regimes and addresses the former case in depth, presenting explicit asymptotic methods appropriate when the system is extremely stiff but only weakly equilibrated. A second paper [1] examines quasi-steady-state methods as an alternative to asymptotic methods in systems well away from equilibrium and a third paper [2] extends these methods to equilibrium conditions in extremely stiff systems using partial equilibrium methods. All three papers present systematic evidence for timesteps competitive with implicit methods. Because explicit methods can execute a timestep faster than an implicit method, our results imply that algebraically stabilized explicit algorithms may offer a means to integration of larger networks than have been feasible previously in various disciplines.
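The asymptotic stabilization described above can be sketched for a single hypothetical linear rate equation dy/dt = F⁺ − k·y (a generic stand-in, not the paper's reaction networks): the update is explicit, yet it remains stable when k·dt ≫ 1, far beyond the forward-Euler limit dt < 1/k.

```python
def asymptotic_step(y, dt, f_plus, k):
    """One explicit asymptotic update for dy/dt = F+ - k*y.

    Rearranging the backward-difference form y_{n+1} = y_n + dt*(F+ - k*y_{n+1})
    gives a closed-form explicit step that relaxes y toward the
    equilibrium F+/k and is stable even for stiff k*dt >> 1.
    """
    return (y + dt * f_plus) / (1.0 + dt * k)

# Stiff example: k*dt = 1000, yet the solution relaxes smoothly to F+/k = 1.
y = 0.0
for _ in range(5):
    y = asymptotic_step(y, 1e-3, 1e6, 1e6)
```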

  13. A bin integral method for solving the kinetic collection equation

    NASA Astrophysics Data System (ADS)

    Wang, Lian-Ping; Xue, Yan; Grabowski, Wojciech W.

    2007-09-01

    A new numerical method for solving the kinetic collection equation (KCE) is proposed, and its accuracy and convergence are investigated. The method, herein referred to as the bin integral method with Gauss quadrature (BIMGQ), makes use of two binwise moments, namely, the number and mass concentration in each bin. These two degrees of freedom define an extended linear representation of the number density distribution for each bin following Enukashvily (1980). Unlike previous moment-based methods in which the gain and loss integrals are evaluated for a target bin, the concept of source-bin pair interactions is used to transfer bin moments from source bins to target bins. Collection kernels are treated by bilinear interpolations. All binwise interaction integrals are then handled exactly by Gauss quadrature of various orders. In essence the method combines favorable features in previous spectral moment-based and bin-based pair-interaction (or flux) methods to greatly enhance the logic, consistency, and simplicity in the numerical method and its implementation. Quantitative measures are developed to rigorously examine the accuracy and convergence properties of BIMGQ for both the Golovin kernel and hydrodynamic kernels. It is shown that BIMGQ has a superior accuracy for the Golovin kernel and a monotonic convergence behavior for hydrodynamic kernels. Direct comparisons are also made with the method of Berry and Reinhardt (1974), the linear flux method of Bott (1998), and the linear discrete method of Simmel et al. (2002).
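The binwise Gauss quadrature at the heart of the method can be illustrated with a hardcoded 3-point Gauss-Legendre rule (a generic sketch of quadrature over one bin; the bin bounds and integrand are illustrative, not the paper's kernel-specific implementation):

```python
import math

# 3-point Gauss-Legendre rule on [-1, 1]: exact for polynomials up to degree 5.
_NODES = (-math.sqrt(3.0 / 5.0), 0.0, math.sqrt(3.0 / 5.0))
_WEIGHTS = (5.0 / 9.0, 8.0 / 9.0, 5.0 / 9.0)

def gauss_bin_integral(f, a, b):
    """Integrate f over one bin [a, b] by mapping the reference rule
    on [-1, 1] onto the bin; for the low-order polynomial integrands
    arising from linear-in-bin representations, the result is exact."""
    half = 0.5 * (b - a)
    mid = 0.5 * (a + b)
    return half * sum(w * f(mid + half * x) for w, x in zip(_WEIGHTS, _NODES))
```

Because the binwise representation and bilinearly interpolated kernels yield low-order polynomial integrands, such a rule evaluates each source-bin pair interaction integral exactly with only a few function evaluations.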

  14. Integral Method of Boundary Characteristics in Solving the Stefan Problem: Dirichlet Condition

    NASA Astrophysics Data System (ADS)

    Kot, V. A.

    2016-09-01

The integral method of boundary characteristics is considered as applied to the solution of the Stefan problem with a Dirichlet condition. On the basis of multiple integration of the heat-conduction equation, a sequence of identical equalities with boundary characteristics in the form of n-fold integrals of the surface temperature has been obtained. It is shown that, in the case where the temperature profile is defined by an exponential polynomial and the Stefan condition is not fulfilled at the moving interphase boundary, the accuracy of solving the Stefan problem with a Dirichlet condition by the integral method of boundary characteristics is higher by several orders of magnitude than that of other known approximate methods, and that the solutions of the indicated problem obtained with fourth- to sixth-degree polynomials on the basis of the integral method of boundary characteristics are exact in essence. This method surpasses the known numerical methods by many orders of magnitude in the accuracy of calculating the position of the interphase boundary and is approximately equal to them in the accuracy of calculating the temperature profile.

  15. Understanding Physiology in the Continuum: Integration of Information from Multiple -Omics Levels

    PubMed Central

    Kamisoglu, Kubra; Acevedo, Alison; Almon, Richard R.; Coyle, Susette; Corbett, Siobhan; Dubois, Debra C.; Nguyen, Tung T.; Jusko, William J.; Androulakis, Ioannis P.

    2017-01-01

In this paper, we discuss approaches for integrating biological information reflecting diverse physiologic levels. In particular, we explore statistical and model-based methods for integrating transcriptomic, proteomic and metabolomic data. Our case studies reflect responses to a systemic inflammatory stimulus and to an anti-inflammatory treatment. Our paper serves partly as a review of existing methods and partly as a means to demonstrate, using case studies related to human endotoxemia and response to methylprednisolone (MPL) treatment, how specific questions may require specific methods, thus emphasizing the non-uniqueness of the approaches. Finally, we explore novel ways of integrating -omics information with PKPD models, toward the development of more integrated pharmacology models. PMID:28289389

  16. Boundary integral equation method for electromagnetic and elastic waves

    NASA Astrophysics Data System (ADS)

    Chen, Kun

calculating the Brillouin diagram in the eigenvalue problem and for normal incidence in the scattering problem. Thirdly, a high-order Nyström method is developed for elastodynamic scattering that features a simple local correction scheme due to a careful choice of basis functions. A novel, simple and efficient singularity subtraction scheme and a new effective near-singularity subtraction scheme are proposed for performing singular and nearly singular integrals on curvilinear triangular elements. The robustness, high accuracy and high-order convergence of the proposed approach are demonstrated by numerical results. Finally, the multilevel fast multipole algorithm (MLFMA) is applied to accelerate the proposed Nyström method for solving large-scale problems. A formulation that significantly reduces the memory requirements of the MLFMA is proposed. Numerical examples in the frequency domain are first given to show the accuracy and efficiency of the algorithm. By solving at multiple frequencies and performing the inverse Fourier transform, time-domain results are also presented that are of interest for ultrasonic non-destructive evaluation.

  17. Promising perceptions, divergent practices and barriers to integrated malaria prevention in Wakiso district, Uganda: a mixed methods study.

    PubMed

    Musoke, David; Miiro, George; Karani, George; Morris, Keith; Kasasa, Simon; Ndejjo, Rawlance; Nakiyingi-Miiro, Jessica; Guwatudde, David; Musoke, Miph Boses

    2015-01-01

The World Health Organization recommends use of multiple approaches to control malaria. The integrated approach to malaria prevention advocates the use of several malaria prevention methods in a holistic manner. This study assessed perceptions and practices on integrated malaria prevention in Wakiso district, Uganda. A clustered cross-sectional survey was conducted among 727 households from 29 villages using both quantitative and qualitative methods. Assessment was done on awareness of various malaria prevention methods, potential for use of the methods in a holistic manner, and reasons for dislike of certain methods. Households were classified as using integrated malaria prevention if they used at least two methods. Logistic regression was used to test for factors associated with the use of integrated malaria prevention while adjusting for clustering within villages. Participants knew of the various malaria prevention methods in the integrated approach, including use of insecticide-treated nets (97.5%), removing mosquito breeding sites (89.1%), clearing overgrown vegetation near houses (97.9%), and closing windows and doors early in the evenings (96.4%). If trained, most participants (68.6%) would use all the suggested malaria prevention methods of the integrated approach. Among those who would not use all methods, the main reasons given were there being too many (70.2%) and cost (32.0%). Only 33.0% of households were using the integrated approach to prevent malaria. Use of integrated malaria prevention by households was associated with reading newspapers (AOR 0.34; 95% CI 0.22-0.53) and ownership of a motorcycle/car (AOR 1.75; 95% CI 1.03-2.98). Although knowledge of malaria prevention methods was high and perceptions of the integrated approach were promising, practice of integrated malaria prevention was relatively low. The use of the integrated approach can be improved by promoting use of multiple malaria prevention methods through various communication channels

  18. Extended layerwise method for laminated composite plates with multiple delaminations and transverse cracks

    NASA Astrophysics Data System (ADS)

    Li, D. H.; Zhang, X.; Sze, K. Y.; Liu, Y.

    2016-10-01

In this paper, the extended layerwise method (XLWM), which was developed for laminated composite beams with multiple delaminations and transverse cracks (Li et al. in Int J Numer Methods Eng 101:407-434, 2015), is extended to laminated composite plates. Strong and weak discontinuous functions along the thickness direction are adopted to simulate multiple delaminations and interlaminar interfaces, respectively, whilst transverse cracks are modeled by the extended finite element method (XFEM). The interaction integral method and the maximum circumferential tensile criterion are used to calculate the stress intensity factor (SIF) and the crack growth angle, respectively. The XLWM for laminated composite plates accurately predicts the displacement and stress fields near crack tips and delamination fronts. The thickness distribution of the SIF, and thus the crack growth angles in different layers, can be obtained. This information cannot be predicted by other existing shell elements enriched by XFEM. Several numerical examples are studied to demonstrate the capabilities of the XLWM in static response analyses, SIF calculations and crack growth predictions.

  19. Integrative methods for analyzing big data in precision medicine.

    PubMed

    Gligorijević, Vladimir; Malod-Dognin, Noël; Pržulj, Nataša

    2016-03-01

We provide an overview of recent developments in big data analyses in the context of precision medicine and health informatics. With the advance of technologies capturing molecular and medical data, we have entered the era of "Big Data" in biology and medicine. These data offer many opportunities to advance precision medicine. We outline key challenges in precision medicine and present recent advances in data-integration-based methods to uncover personalized information from big data produced by various omics studies. We survey recent integrative methods for disease subtyping, biomarker discovery, and drug repurposing, and list the tools that are available to domain scientists. Given the ever-growing nature of these big data, we highlight key issues that big data integration methods will face. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  20. Evaluation of atomic pressure in the multiple time-step integration algorithm.

    PubMed

    Andoh, Yoshimichi; Yoshii, Noriyuki; Yamada, Atsushi; Okazaki, Susumu

    2017-04-15

In molecular dynamics (MD) calculations, reduction in calculation time per MD loop is essential. A multiple time-step (MTS) integration algorithm, RESPA (Tuckerman and Berne, J. Chem. Phys. 1992, 97, 1990-2001), enables reductions in calculation time by decreasing the frequency of time-consuming long-range interaction calculations. However, the RESPA MTS algorithm involves uncertainties in evaluating the atomic interaction-based pressure (i.e., atomic pressure) of systems with and without holonomic constraints. It is not clear which intermediate forces and constraint forces in the MTS integration procedure should be used to calculate the atomic pressure. In this article, we propose a series of equations to evaluate the atomic pressure in the RESPA MTS integration procedure on the basis of its equivalence to the velocity-Verlet integration procedure with a single time step (STS). The equations guarantee time-reversibility even for systems with holonomic constraints. Furthermore, we generalize the equations to (i) an arbitrary number of inner time steps and (ii) an arbitrary number of force components (RESPA levels). The atomic pressure calculated by our equations with the MTS integration shows excellent agreement with the reference value with the STS, whereas pressures calculated using the conventional ad hoc equations deviated from it. Our equations can be extended straightforwardly to the MTS integration algorithm for the isothermal NVT and isothermal-isobaric NPT ensembles. © 2017 Wiley Periodicals, Inc.
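The RESPA splitting itself can be sketched in a few lines of plain Python (a generic 1-D single-particle sketch with hypothetical fast/slow force callables; it illustrates the inner/outer loop structure only, not the paper's pressure evaluation):

```python
def respa_step(x, v, dt, n_inner, f_fast, f_slow, mass=1.0):
    """One RESPA multiple-time-step integration step.

    The slow (e.g. long-range) force is applied as half-kicks at the
    outer timestep dt, while the fast force is integrated with
    n_inner velocity-Verlet substeps of size dt/n_inner.
    """
    v += 0.5 * dt * f_slow(x) / mass          # outer half-kick (slow force)
    h = dt / n_inner
    for _ in range(n_inner):                  # inner velocity-Verlet loop
        v += 0.5 * h * f_fast(x) / mass
        x += h * v
        v += 0.5 * h * f_fast(x) / mass
    v += 0.5 * dt * f_slow(x) / mass          # outer half-kick (slow force)
    return x, v
```

Because each half-kick and drift is time-reversible and symplectic, the composite step conserves energy over long runs even though the slow force is evaluated only once per outer step.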

  1. Modulation of C. elegans Touch Sensitivity Is Integrated at Multiple Levels

    PubMed Central

    Chen, Xiaoyin

    2014-01-01

    Sensory systems can adapt to different environmental signals. Here we identify four conditions that modulate anterior touch sensitivity in Caenorhabditis elegans after several hours and demonstrate that such sensory modulation is integrated at multiple levels to produce a single output. Prolonged vibration involving integrin signaling directly sensitizes the touch receptor neurons (TRNs). In contrast, hypoxia, the dauer state, and high salt reduce touch sensitivity by preventing the release of long-range neuroregulators, including two insulin-like proteins. Integration of these latter inputs occurs at upstream neurohormonal cells and at the insulin signaling cascade within the TRNs. These signals and those from integrin signaling converge to modulate touch sensitivity by regulating AKT kinases and DAF-16/FOXO. Thus, activation of either the integrin or insulin pathways can compensate for defects in the other pathway. This modulatory system integrates conflicting signals from different modalities, and adapts touch sensitivity to both mechanical and non-mechanical conditions. PMID:24806678

  2. Digital methods of photopeak integration in activation analysis.

    NASA Technical Reports Server (NTRS)

    Baedecker, P. A.

    1971-01-01

    A study of the precision attainable by several methods of gamma-ray photopeak integration has been carried out. The 'total peak area' method, the methods proposed by Covell, Sterlinski, and Quittner, and some modifications of these methods have been considered. A modification by Wasson of the total peak area method is considered to be the most advantageous due to its simplicity and the relatively high precision obtainable with this technique. A computer routine for the analysis of spectral data from nondestructive activation analysis experiments employing a Ge(Li) detector-spectrometer system is described.
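The simplest of the techniques compared above, the total peak area method, can be sketched in a few lines (a generic illustration with hypothetical channel indices, assuming a linear baseline estimated from flanking background channels; it is not the specific routine described in the paper):

```python
def total_peak_area(counts, lo, hi, n_bg=3):
    """Net photopeak area by the total-peak-area approach.

    Sums the channels in the peak region [lo, hi] and subtracts a
    baseline taken as the average of n_bg background channels on
    each side of the peak, scaled to the peak width.
    """
    gross = sum(counts[lo:hi + 1])
    left = sum(counts[lo - n_bg:lo]) / n_bg        # mean background, left side
    right = sum(counts[hi + 1:hi + 1 + n_bg]) / n_bg  # mean background, right side
    width = hi - lo + 1
    baseline = 0.5 * (left + right) * width        # trapezoidal baseline under the peak
    return gross - baseline
```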

  3. Error Analysis and Calibration Method of a Multiple Field-of-View Navigation System

    PubMed Central

    Shi, Shuai; Zhao, Kaichun; You, Zheng; Ouyang, Chenguang; Cao, Yongkui; Wang, Zhenzhou

    2017-01-01

The Multiple Field-of-view Navigation System (MFNS) is a spacecraft subsystem built to realize the autonomous navigation of the Spacecraft Inside Tiangong Space Station. This paper introduces the basics of the MFNS, including its architecture, mathematical model and analysis, and numerical simulation of system errors. According to the performance requirement of the MFNS, the calibration of both intrinsic and extrinsic parameters of the system is assumed to be essential and pivotal. Hence, a novel method based on the geometrical constraints in object space, called checkerboard-fixed post-processing calibration (CPC), is proposed to solve the problem of simultaneously obtaining the intrinsic parameters of the cameras integrated in the MFNS and the transformation between the MFNS coordinate and the cameras’ coordinates. This method utilizes a two-axis turntable, and a prior alignment of the coordinates is needed. Theoretical derivation and practical operation of the CPC method are introduced. The calibration experiment results of the MFNS indicate that the extrinsic parameter accuracy of the CPC reaches 0.1° for each Euler angle and 0.6 mm for each position vector component (1σ). A navigation experiment verifies the calibration result and the performance of the MFNS. The MFNS is found to work properly, and the accuracy of the position vector components and Euler angle reaches 1.82 mm and 0.17° (1σ) respectively. The basic mechanism of the MFNS may be utilized as a reference for the design and analysis of multiple-camera systems. Moreover, the calibration method proposed has practical value for its convenience of use and potential for integration into a toolkit. PMID:28327538

  4. Simultaneous reconstruction of multiple depth images without off-focus points in integral imaging using a graphics processing unit.

    PubMed

    Yi, Faliu; Lee, Jieun; Moon, Inkyu

    2014-05-01

    The reconstruction of multiple depth images with a ray back-propagation algorithm in three-dimensional (3D) computational integral imaging is computationally burdensome. Further, a reconstructed depth image consists of a focus and an off-focus area. Focus areas are 3D points on the surface of an object that are located at the reconstructed depth, while off-focus areas include 3D points in free-space that do not belong to any object surface in 3D space. Generally, without being removed, the presence of an off-focus area would adversely affect the high-level analysis of a 3D object, including its classification, recognition, and tracking. Here, we use a graphics processing unit (GPU) that supports parallel processing with multiple processors to simultaneously reconstruct multiple depth images using a lookup table containing the shifted values along the x and y directions for each elemental image in a given depth range. Moreover, each 3D point on a depth image can be measured by analyzing its statistical variance with its corresponding samples, which are captured by the two-dimensional (2D) elemental images. These statistical variances can be used to classify depth image pixels as either focus or off-focus points. At this stage, the measurement of focus and off-focus points in multiple depth images is also implemented in parallel on a GPU. Our proposed method is conducted based on the assumption that there is no occlusion of the 3D object during the capture stage of the integral imaging process. Experimental results have demonstrated that this method is capable of removing off-focus points in the reconstructed depth image. The results also showed that using a GPU to remove the off-focus points could greatly improve the overall computational speed compared with using a CPU.
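The shift-and-average back-projection and the variance-based focus test can be sketched in 1-D pure Python (a toy CPU version: the per-image pixel shifts play the role of the paper's lookup table, and all names and the no-occlusion setup are illustrative):

```python
def reconstruct_plane(elemental, shift):
    """Back-project 1-D elemental images onto one depth plane.

    Each elemental image k is shifted by k*shift pixels (the role of
    the depth-dependent lookup table) and the overlapping samples are
    averaged. The per-pixel sample variance separates focus points
    (samples agree, low variance) from off-focus points (samples
    disagree, high variance).
    """
    width = len(elemental[0])
    rows = []
    for k, img in enumerate(elemental):
        row = [None] * width
        for i in range(width):
            j = i + k * shift          # aligned source pixel in image k
            if 0 <= j < width:
                row[i] = img[j]
        rows.append(row)
    recon, var = [], []
    for i in range(width):
        samples = [r[i] for r in rows if r[i] is not None]  # image 0 always contributes
        m = sum(samples) / len(samples)
        recon.append(m)
        var.append(sum((s - m) ** 2 for s in samples) / len(samples))
    return recon, var
```

A point source captured at disparity `shift` lands on the same reconstructed pixel in every shifted image, so its variance is zero there; reconstructing with the wrong shift scatters the samples and raises the variance, which is the cue used to discard off-focus points.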

  5. Cognitive Impairment and Community Integration Outcomes in Individuals Living With Multiple Sclerosis.

    PubMed

    Hughes, Abbey J; Hartoonian, Narineh; Parmenter, Brett; Haselkorn, Jodie K; Lovera, Jesus F; Bourdette, Dennis; Turner, Aaron P

    2015-11-01

    To determine the association between unique domains of cognitive impairment and community integration in individuals with multiple sclerosis (MS), and to determine the contributions of cognitive impairment to community integration beyond the influence of demographic and clinical variables. Cross-sectional analysis of objective neuropsychological assessment and self-report data. Data were collected during baseline assessment of a randomized, multisite controlled trial of ginkgo biloba for cognitive impairment in MS. Hierarchical regression analyses examined the association between subjective and objective measures of cognitive impairment and 3 domains of community integration, adjusting for relevant covariates. Two Veterans Affairs medical center MS clinics. Adults (N=121; ages 24-65y) with a confirmed MS diagnosis. Not applicable. Primary outcomes were scores on the Home Integration (CIQ-H), Social Integration (CIQ-S), and Productivity (CIQ-P) domains of the Community Integration Questionnaire (CIQ). Cognitive impairment was associated with lower scores on the CIQ-H and CIQ-S, but not the CIQ-P. Greater levels of subjective cognitive impairment were associated with lower scores on the CIQ-H and CIQ-S. Greater levels of objective cognitive impairment, specifically slower processing speed and poorer inhibitory control, were related to lower CIQ-S scores. Subjective and objective measures of cognitive impairment were significantly and independently associated with CIQ-S. Objective cognitive impairment may interfere with participation in social activities. Subjective cognitive impairment is also important to assess, because individuals who perceive themselves to be cognitively impaired may be less likely to participate in both home and social activities. Clinical interventions to enhance community integration in individuals with MS may benefit from addressing objective and subjective cognitive impairment by integrating cognitive rehabilitation approaches with self

  6. Integral function method for determination of nonlinear harmonic distortion

    NASA Astrophysics Data System (ADS)

    Cerdeira, Antonio; Alemán, Miguel A.; Estrada, Magali; Flandre, Denis

    2004-12-01

The analysis of harmonic distortion is of prime importance for analog and mixed-signal integrated circuits. Recently we presented a new integral function method (IFM), based on a completely new principle, which allows the calculation of harmonic distortion using the DC output characteristic of devices or circuits. In this work we complement the integral function method to provide direct calculation of the following distortion figures: total harmonic distortion (THD), second harmonic distortion (HD2), third harmonic distortion (HD3), voltage intercept points (VIP) and intermodulation distortion (IMD). Comparison with the same distortion figures calculated from Fourier coefficients (FC), from direct AC measurements and from FFTs in simulators indicates that results obtained by the IFM are in excellent agreement over the full range of the analyzed active regions. The IFM combines simplicity and computational efficiency with accuracy, and with the possibility of easily analyzing the distortion when varying any circuit or device parameter.
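
    The paper's IFM works directly from the DC characteristic; as a hedged point of comparison, the Fourier-coefficient (FC) baseline it is validated against can be sketched as follows (the function names and test device are illustrative):

```python
import numpy as np

def harmonic_distortion(transfer, amplitude, n=4096):
    """Fourier-coefficient (FC) estimate of distortion figures.

    `transfer` maps the DC input voltage to the DC output; driving it with a
    pure sinusoid and taking an FFT yields the harmonic amplitudes. (The IFM
    of the paper obtains the same figures directly from the DC characteristic;
    this is only the reference method it is compared against.)
    """
    t = np.arange(n)
    vin = amplitude * np.sin(2 * np.pi * t / n)      # one period, bin 1
    spectrum = np.abs(np.fft.rfft(transfer(vin))) / n
    h = spectrum[1:5]                    # fundamental and harmonics 2..4
    hd2, hd3 = h[1] / h[0], h[2] / h[0]
    thd = np.sqrt(np.sum(h[1:] ** 2)) / h[0]
    return hd2, hd3, thd

# weakly quadratic device y = x + 0.1 x^2: analytically HD2 = 0.05, HD3 = 0
hd2, hd3, thd = harmonic_distortion(lambda x: x + 0.1 * x ** 2, amplitude=1.0)
```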

  7. Accelerometer Method and Apparatus for Integral Display and Control Functions

    NASA Technical Reports Server (NTRS)

    Bozeman, Richard J., Jr. (Inventor)

    1998-01-01

    Method and apparatus for detecting mechanical vibrations and outputting a signal in response thereto is discussed. An accelerometer package having integral display and control functions is suitable for mounting upon the machinery to be monitored. Display circuitry provides signals to a bar graph display which may be used to monitor machine conditions over a period of time. Control switches may be set which correspond to elements in the bar graph to provide an alert if vibration signals increase in amplitude over a selected trip point. The circuitry is shock mounted within the accelerometer housing. The method provides for outputting a broadband analog accelerometer signal, integrating this signal to produce a velocity signal, integrating and calibrating the velocity signal before application to a display driver, and selecting a trip point at which a digitally compatible output signal is generated.
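
    The signal chain (acceleration integrated to velocity, then compared with a selected trip point) can be sketched as follows; the trapezoidal rule and all names are illustrative choices, not the patented circuitry:

```python
def velocity_from_acceleration(accel, dt):
    """Trapezoidal integration of a sampled acceleration signal to velocity,
    mirroring the integrate-then-display chain described above."""
    v, out = 0.0, [0.0]
    for a0, a1 in zip(accel, accel[1:]):
        v += 0.5 * (a0 + a1) * dt        # trapezoid over one sample interval
        out.append(v)
    return out

def trip_alarm(velocity, trip_point):
    """Digitally compatible alarm: True once the velocity amplitude exceeds
    the selected trip point."""
    return any(abs(v) > trip_point for v in velocity)

# constant 2 m/s^2 acceleration for 1 s sampled every 10 ms: v reaches 2 m/s
accel = [2.0] * 101
vel = velocity_from_acceleration(accel, dt=0.01)
alarm = trip_alarm(vel, trip_point=1.5)
```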

  8. Multiple Integration of the Heat-Conduction Equation for a Space Bounded From the Inside

    NASA Astrophysics Data System (ADS)

    Kot, V. A.

    2016-03-01

    An N-fold integration of the heat-conduction equation for a space bounded from the inside has been performed using a system of identical equalities with definition of the temperature function by a power polynomial with an exponential factor. It is shown that, in a number of cases, the approximate solutions obtained can be considered as exact because their errors comprise hundredths and thousandths of a percent. The method proposed for N-fold integration represents an alternative to classical integral transformations.
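
    A hedged sketch of the setup: the linear heat-conduction equation together with the trial temperature profile named in the abstract (a power polynomial with an exponential factor); the symbols a, mu, and c_k are illustrative, and the exact domain, geometry, and boundary conditions are those of the paper.

```latex
\frac{\partial T}{\partial t} = a\,\frac{\partial^2 T}{\partial x^2},
\qquad
T(x,t) \approx e^{-\mu x} \sum_{k=0}^{N} c_k(t)\, x^k ,
```

    with the coefficients c_k(t) fixed by requiring N successive integrals of the equation (the system of integral equalities) to hold exactly.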

  9. Integrating multiple omics to unravel mechanisms of Cyclosporin A induced hepatotoxicity in vitro.

    PubMed

    Van den Hof, Wim F P M; Ruiz-Aracama, Ainhoa; Van Summeren, Anke; Jennen, Danyel G J; Gaj, Stan; Coonen, Maarten L J; Brauers, Karen; Wodzig, Will K W H; van Delft, Joost H M; Kleinjans, Jos C S

    2015-04-01

In order to improve attrition rates of candidate drugs, there is a need for a better understanding of the mechanisms underlying drug-induced hepatotoxicity. We aim to further unravel the toxicological response of hepatocytes to a prototypical cholestatic compound by integrating transcriptomic and metabonomic profiling of HepG2 cells exposed to Cyclosporin A. Cyclosporin A exposure induced intracellular cholesterol accumulation and diminished intracellular bile acid levels. Performing pathway analyses of significant mRNAs and metabolites, separately and integrated, resulted in more relevant pathways for the latter. Integrated analyses showed pathways involved in cell cycle and cellular metabolism to be significantly changed. Moreover, pathways involved in protein processing of the endoplasmic reticulum, bile acid biosynthesis and cholesterol metabolism were significantly affected. Our findings indicate that an integrated approach combining metabonomics and transcriptomics data derived from representative in vitro models with bioinformatics can improve our understanding of the mechanisms of action underlying drug-induced hepatotoxicity. Furthermore, we showed that integrating multiple omics and thereby analyzing genes, microRNAs and metabolites of the opposed model for drug-induced cholestasis can give valuable information about mechanisms of drug-induced cholestasis in vitro and therefore could be used in toxicity screening of new drug candidates at an early stage of drug discovery. Copyright © 2015 Elsevier Ltd. All rights reserved.

  10. An efficient method for generalized linear multiplicative programming problem with multiplicative constraints.

    PubMed

    Zhao, Yingfeng; Liu, Sanyang

    2016-01-01

We present a practical branch and bound algorithm for globally solving the generalized linear multiplicative programming problem with multiplicative constraints. To solve the problem, a relaxation problem equivalent to a linear program is constructed by utilizing a new two-phase relaxation technique. In the algorithm, lower and upper bounds are obtained simultaneously by solving some linear relaxation programming problems. Global convergence has been proved, and the results of sample examples and a small random experiment show that the proposed algorithm is feasible and efficient.

  11. Multistep and Multistage Boundary Integral Methods for the Wave Equation

    NASA Astrophysics Data System (ADS)

    Banjai, Lehel

    2009-09-01

We describe how the time-discretized wave equation in a homogeneous medium can be solved by boundary integral methods. The time discretization can be a multistep, Runge-Kutta, or a more general multistep-multistage method. The resulting convolutional system of boundary integral equations falls into the family of convolution quadratures of Ch. Lubich. In this work our aim is to discuss a new technique for efficiently solving the discrete convolutional system and to present large-scale 3D numerical experiments with a wide range of time discretizations that have not appeared in print up to now. One of the conclusions is that Runge-Kutta methods are often the method of choice even at low accuracy; yet, in connection with hyperbolic problems, BDF (backward difference formulas) have been predominant in the literature on convolution quadrature.

  12. Approximation method to compute domain related integrals in structural studies

    NASA Astrophysics Data System (ADS)

    Oanta, E.; Panait, C.; Raicu, A.; Barhalescu, M.; Axinte, T.

    2015-11-01

Various engineering calculi use integral calculus in theoretical models, i.e. analytical and numerical models. For usual problems, integrals have exact mathematical solutions. If the domain of integration is complicated, several methods may be used to calculate the integral. The first idea is to divide the domain into smaller sub-domains for which there are direct calculus relations, e.g. in strength of materials the bending moment may be computed at some discrete points using the graphical integration of the shear force diagram, which usually has a simple shape. Another example is in mathematics, where the area of a subgraph may be approximated by a set of rectangles or trapezoids used to calculate the definite integral. The goal of this work is to present our studies on the calculus of integrals over transverse section domains, computer-aided solutions and a generalizing method. The aim of our research is to create general computer-based methods to execute the calculi in structural studies. Thus, we define a Boolean algebra which operates with ‘simple’ shape domains. This algebraic standpoint uses addition and subtraction, conditioned by the sign of every ‘simple’ shape (-1 for the shapes to be subtracted). By ‘simple’ shape or ‘basic’ shape we mean either shapes for which there are direct calculus relations, or domains whose frontiers are approximated by known functions, with the corresponding calculus carried out using an algorithm. The ‘basic’ shapes are linked to the calculus of the most significant stresses in the section, a refined aspect which needs special attention. Starting from this idea, the libraries of ‘basic’ shapes include rectangles, ellipses and domains whose frontiers are approximated by spline functions. The domain triangularization methods suggested the triangle as another ‘basic’ shape to be considered. The subsequent phase was to deduce the exact relations for the
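
    The signed Boolean algebra of 'simple' shapes can be sketched as follows; the shape records and the supported kinds are illustrative, not the paper's library:

```python
import math

def area(shapes):
    """Signed-area calculus over 'simple' shapes, in the spirit of the
    Boolean algebra of basic shapes described above: each shape carries a
    sign, +1 if added or -1 if subtracted from the section domain."""
    total = 0.0
    for kind, sign, *dims in shapes:
        if kind == "rect":
            w, h = dims
            total += sign * w * h
        elif kind == "ellipse":
            a, b = dims                      # semi-axes
            total += sign * math.pi * a * b
        elif kind == "triangle":
            base, height = dims
            total += sign * 0.5 * base * height
        else:
            raise ValueError(f"unknown shape kind: {kind}")
    return total

# a 10 x 6 rectangular section with a circular hole of radius 2 subtracted
section = [("rect", +1, 10.0, 6.0), ("ellipse", -1, 2.0, 2.0)]
A = area(section)
```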

  13. An Integrated Inquiry Activity in an Elementary Teaching Methods Classroom

    ERIC Educational Resources Information Center

    Khalid, Tahsin

    2010-01-01

    In this integrated inquiry, students in an elementary teaching methods class investigate a real-world problem outside the classroom. The students use the Cognitive Research Trust (CoRT) thinking strategy to find the causes of, impact of, and possible solutions to the problem. They present their findings and then discuss implementation of this…

  14. Integrating Methods and Materials: Developing Trainees' Reading Skills.

    ERIC Educational Resources Information Center

    Jarvis, Jennifer

    1987-01-01

    Explores issues arising from a research project which studied ways of meeting the reading needs of trainee primary school teachers (from Malawi and Tanzania) of English as a foreign language. Topics discussed include: the classroom teaching situation; teaching "quality"; and integration of materials and methods. (CB)

  15. An Integrated Approach to Research Methods and Capstone

    ERIC Educational Resources Information Center

    Postic, Robert; McCandless, Ray; Stewart, Beth

    2014-01-01

    In 1991, the AACU issued a report on improving undergraduate education suggesting, in part, that a curriculum should be both comprehensive and cohesive. Since 2008, we have systematically integrated our research methods course with our capstone course in an attempt to accomplish the twin goals of comprehensiveness and cohesion. By taking this…

  17. Analysis of integral method for fault detection in transformers

    SciTech Connect

Hijazi, M.E.A.; Basak, A. (Wolfson Centre for Magnetics Technology)

    1993-11-01

Test results obtained from using the integral method in transformer differential protection against internal fault currents are presented. The effects of various factors on the transient waveforms are considered, and conditions to predict the magnetizing inrush current and a faulty system have been digitally simulated.

  18. Multiple sources and multiple measures based traffic flow prediction using the chaos theory and support vector regression method

    NASA Astrophysics Data System (ADS)

    Cheng, Anyu; Jiang, Xiao; Li, Yongfu; Zhang, Chao; Zhu, Hao

    2017-01-01

This study proposes a traffic flow prediction algorithm based on multiple sources and multiple measures, using chaos theory and the support vector regression method. First, the chaotic characteristics of traffic flow associated with speed, occupancy, and flow are identified using the maximum Lyapunov exponent. Then, the phase spaces of the multiple-measure chaotic time series are reconstructed based on phase space reconstruction theory and fused into a single multi-dimensional phase space using Bayesian estimation theory. In addition, a support vector regression (SVR) model is designed to predict the traffic flow. Numerical experiments are performed using data from multiple sources. The results show that, compared with a single measure, the proposed method performs better for short-term traffic flow prediction in terms of accuracy and timeliness.
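
    The phase-space reconstruction step can be sketched with a time-delay embedding; for brevity this sketch fits an ordinary least-squares predictor in the reconstructed space instead of the paper's SVR, and it embeds a single synthetic series rather than fusing speed, occupancy, and flow:

```python
import numpy as np

def delay_embed(series, dim, tau):
    """Phase-space reconstruction by time-delay embedding.
    Returns (X, y): embedded state vectors and the next value of the series."""
    n = len(series) - (dim - 1) * tau - 1
    X = np.array([[series[i + j * tau] for j in range(dim)] for i in range(n)])
    y = np.array([series[i + (dim - 1) * tau + 1] for i in range(n)])
    return X, y

# stand-in "traffic flow" series; a linear predictor in the embedded space
# suffices here, whereas real traffic needs the nonlinear SVR of the paper
t = np.arange(400)
flow = np.sin(0.3 * t)
X, y = delay_embed(flow, dim=3, tau=2)
Xb = np.hstack([X, np.ones((len(X), 1))])    # bias column
w, *_ = np.linalg.lstsq(Xb, y, rcond=None)   # least-squares fit
pred = Xb @ w
rmse = float(np.sqrt(np.mean((pred - y) ** 2)))
```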

  19. Singularity Preserving Numerical Methods for Boundary Integral Equations

    NASA Technical Reports Server (NTRS)

    Kaneko, Hideaki (Principal Investigator)

    1996-01-01

In the past twelve months (May 8, 1995 - May 8, 1996), under the cooperative agreement with the Division of Multidisciplinary Optimization at NASA Langley, we have accomplished the following five projects: a note on the finite element method with singular basis functions; numerical quadrature for weakly singular integrals; superconvergence of the degenerate kernel method; superconvergence of the iterated collocation method for Hammerstein equations; and the singularity preserving Galerkin method for Hammerstein equations with logarithmic kernel. This final report consists of five papers describing these projects. Each project is preceded by a brief abstract.

  20. Adaptive Transmission Control Method for Communication-Broadcasting Integrated Services

    NASA Astrophysics Data System (ADS)

    Koto, Hideyuki; Furuya, Hiroki; Nakamura, Hajime

    This paper proposes an adaptive transmission control method for massive and intensive telecommunication traffic generated by communication-broadcasting integrated services. The proposed method adaptively controls data transmissions from viewers depending on the congestion states, so that severe congestion can be effectively avoided. Furthermore, it utilizes the broadcasting channel which is not only scalable, but also reliable for controlling the responses from vast numbers of viewers. The performance of the proposed method is evaluated through experiments on a test bed where approximately one million viewers are emulated. The obtained results quantitatively demonstrate the performance of the proposed method and its effectiveness under massive and intensive traffic conditions.
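
    A minimal sketch of congestion-dependent spreading of viewer responses, assuming a control law in which the broadcast channel announces a congestion level and each viewer draws its transmission delay from a window that widens with that level (the paper's actual control method is more elaborate):

```python
import random

def response_delay(congestion_level, max_spread, rng):
    """Pick one viewer's transmission delay from a window that widens with
    the congestion state announced on the broadcast channel. The linear
    window and the parameters are illustrative, not the paper's control law.
    congestion_level is assumed to lie in [0, 1]."""
    window = max_spread * congestion_level
    return rng.uniform(0.0, window)

rng = random.Random(7)
# under light congestion responses bunch up early; under heavy congestion
# they are spread over a much wider window, relieving the uplink
calm = [response_delay(0.1, 60.0, rng) for _ in range(1000)]
busy = [response_delay(0.9, 60.0, rng) for _ in range(1000)]
```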

  1. Adaptive Estimation of Multiple Fading Factors for GPS/INS Integrated Navigation Systems

    PubMed Central

    Jiang, Chen; Zhang, Shu-Bi; Zhang, Qiu-Zhao

    2017-01-01

The Kalman filter has been widely applied in the field of dynamic navigation and positioning. However, its performance will be degraded in the presence of significant model errors and uncertain interferences. In the literature, the fading filter was proposed to control the influences of the model errors, and the H-infinity filter can be adopted to address the uncertainties by minimizing the estimation error in the worst case. In this paper, a new multiple fading factor, suitable for the Global Positioning System (GPS) and the Inertial Navigation System (INS) integrated navigation system, is proposed based on the optimization of the filter, and a comprehensive filtering algorithm is constructed by integrating the advantages of the H-infinity filter and the proposed multiple fading filter. Measurement data of the GPS/INS integrated navigation system are collected under actual conditions. Stability and robustness of the proposed filtering algorithm are tested with various experiments, and contrastive analyses are performed with the measurement data. Results demonstrate that both filter divergence and the influence of outliers are restrained effectively with the proposed filtering algorithm, and the precision of the filtering results is improved simultaneously. PMID:28587157

  2. Adaptive Estimation of Multiple Fading Factors for GPS/INS Integrated Navigation Systems.

    PubMed

    Jiang, Chen; Zhang, Shu-Bi; Zhang, Qiu-Zhao

    2017-06-01

The Kalman filter has been widely applied in the field of dynamic navigation and positioning. However, its performance will be degraded in the presence of significant model errors and uncertain interferences. In the literature, the fading filter was proposed to control the influences of the model errors, and the H-infinity filter can be adopted to address the uncertainties by minimizing the estimation error in the worst case. In this paper, a new multiple fading factor, suitable for the Global Positioning System (GPS) and the Inertial Navigation System (INS) integrated navigation system, is proposed based on the optimization of the filter, and a comprehensive filtering algorithm is constructed by integrating the advantages of the H-infinity filter and the proposed multiple fading filter. Measurement data of the GPS/INS integrated navigation system are collected under actual conditions. Stability and robustness of the proposed filtering algorithm are tested with various experiments, and contrastive analyses are performed with the measurement data. Results demonstrate that both filter divergence and the influence of outliers are restrained effectively with the proposed filtering algorithm, and the precision of the filtering results is improved simultaneously.
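
    The role of a fading factor can be sketched with a scalar Kalman filter whose predicted covariance is inflated each step; the paper's contribution, multiple per-channel fading factors combined with an H-infinity filter, is not reproduced here:

```python
import random

def fading_kalman(zs, q, r, fade):
    """Scalar Kalman filter with a fading factor: the predicted covariance
    is inflated by `fade` >= 1 at each step, discounting old data so that
    model errors do not accumulate. A single factor is kept for clarity."""
    x, p = 0.0, 1.0                      # initial state estimate and covariance
    out = []
    for z in zs:
        p = fade * p + q                 # inflated prediction covariance
        k = p / (p + r)                  # Kalman gain
        x = x + k * (z - x)              # measurement update
        p = (1.0 - k) * p
        out.append(x)
    return out

# noisy measurements of a constant truth; r matches the noise variance 0.5^2
rng = random.Random(3)
truth = 5.0
zs = [truth + rng.gauss(0.0, 0.5) for _ in range(200)]
est = fading_kalman(zs, q=1e-4, r=0.25, fade=1.02)
```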

  3. Assessment of School Merit with Multiple Regression: Methods and Critique.

    ERIC Educational Resources Information Center

    Tate, Richard L.

    1986-01-01

    Regression-based adjustment of student outcomes for the assessment of the merit of schools is considered. First, the basics of causal modeling and multiple regression are briefly reviewed. Then, two common regression-based adjustment procedures are described, pointing out that the validity of the final assessments depends on: (1) the degree to…
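
    A minimal sketch of regression-based adjustment, assuming a single intake covariate and scoring each school by the mean residual of its students (all names and the one-covariate model are illustrative):

```python
import numpy as np

def school_merit(x, y, school_ids):
    """Regress the student outcome y on an intake covariate x, then score
    each school by the mean residual (actual minus expected) of its
    students. This is the generic adjustment idea, not either of the two
    specific procedures the paper describes."""
    X = np.column_stack([x, np.ones_like(x)])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)     # pooled regression
    residuals = y - X @ coef
    return {s: float(residuals[school_ids == s].mean())
            for s in np.unique(school_ids)}

# school "B" adds a constant 2 points over the shared intake-outcome line
x = np.array([1.0, 2.0, 3.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 2.0, 3.0, 3.0, 4.0, 5.0])
ids = np.array(["A", "A", "A", "B", "B", "B"])
merit = school_merit(x, y, ids)
```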

  4. Evaluating marginal likelihood with thermodynamic integration method and comparison with several other numerical methods

    DOE PAGES

    Liu, Peigui; Elshall, Ahmed S.; Ye, Ming; ...

    2016-02-05

Evaluating marginal likelihood is the most critical and computationally expensive task when conducting Bayesian model averaging to quantify parametric and model uncertainties. The evaluation is commonly done by using Laplace approximations to evaluate semianalytical expressions of the marginal likelihood or by using Monte Carlo (MC) methods to evaluate the arithmetic or harmonic mean of a joint likelihood function. This study introduces a new MC method, i.e., thermodynamic integration, which has not been attempted in environmental modeling. Instead of using samples only from the prior parameter space (as in arithmetic mean evaluation) or the posterior parameter space (as in harmonic mean evaluation), the thermodynamic integration method uses samples generated gradually from the prior to the posterior parameter space. This is done through a path sampling that conducts Markov chain Monte Carlo simulation with different power coefficient values applied to the joint likelihood function. The thermodynamic integration method is evaluated using three analytical functions by comparing the method with two variants of the Laplace approximation method and three MC methods, including the nested sampling method that was recently introduced into environmental modeling. The thermodynamic integration method outperforms the other methods in terms of accuracy, convergence, and consistency. The thermodynamic integration method is also applied to a synthetic case of groundwater modeling with four alternative models. The application shows that model probabilities obtained using the thermodynamic integration method improve the predictive performance of Bayesian model averaging. As a result, the thermodynamic integration method is mathematically rigorous, and its MC implementation is computationally general for a wide range of environmental problems.
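
    The identity behind thermodynamic integration, log Z equal to the integral of E_beta[log L] over the power coefficient beta from 0 to 1, can be checked on a conjugate toy model where the power posterior is Gaussian and can be sampled exactly, so no MCMC is needed (the model and the beta grid are illustrative):

```python
import math, random

def log_marginal_by_ti(y, betas, n_samples, rng):
    """Thermodynamic integration for a conjugate toy model:
    prior theta ~ N(0,1), likelihood y | theta ~ N(theta,1).
    The power posterior, proportional to prior * likelihood**beta, is
    Gaussian here, so it is sampled exactly instead of by MCMC; the path
    identity log Z = integral over beta of E_beta[log L] is the same."""
    means = []
    for b in betas:
        prec = 1.0 + b                            # power-posterior precision
        m, s = b * y / prec, math.sqrt(1.0 / prec)
        draws = [m + s * rng.gauss(0.0, 1.0) for _ in range(n_samples)]
        loglik = [-0.5 * math.log(2 * math.pi) - 0.5 * (y - th) ** 2
                  for th in draws]
        means.append(sum(loglik) / n_samples)
    # trapezoidal rule along the beta path from prior (0) to posterior (1)
    return sum(0.5 * (means[i] + means[i + 1]) * (betas[i + 1] - betas[i])
               for i in range(len(betas) - 1))

rng = random.Random(1)
y = 1.0
betas = [i / 20 for i in range(21)]
log_z = log_marginal_by_ti(y, betas, n_samples=2000, rng=rng)
# the exact marginal is N(y; 0, 2), available in closed form for this model
exact = -0.5 * math.log(2 * math.pi * 2.0) - y * y / 4.0
```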

  5. An integrated modeling approach to support management decisions of coupled groundwater-agricultural systems under multiple uncertainties

    NASA Astrophysics Data System (ADS)

    Hagos Subagadis, Yohannes; Schütze, Niels; Grundmann, Jens

    2015-04-01

The planning and implementation of effective water resources management strategies need an assessment of multiple (physical, environmental, and socio-economic) issues, and often require new research in which knowledge from diverse disciplines is combined in unified methodological and operational frameworks. Such integrative research to link different knowledge domains faces several practical challenges. The complexities are further compounded by multiple actors frequently with conflicting interests and multiple uncertainties about the consequences of potential management decisions. A fuzzy-stochastic multiple criteria decision analysis tool was developed in this study to systematically quantify both probabilistic and fuzzy uncertainties associated with complex hydrosystems management. It integrates physical process-based models, fuzzy logic, expert involvement and stochastic simulation within a general framework. Subsequently, the proposed new approach is applied to a water-scarce coastal arid region water management problem in northern Oman, where saltwater intrusion into a coastal aquifer due to excessive groundwater extraction for irrigated agriculture has affected the aquifer sustainability, endangering associated socio-economic conditions as well as traditional social structure. Results from the developed method have provided key decision alternatives which can serve as a platform for negotiation and further exploration. In addition, this approach has enabled systematic quantification of both probabilistic and fuzzy uncertainties associated with the decision problem. Sensitivity analysis applied within the developed tool has shown that the decision makers' risk aversion and risk taking attitudes may yield different rankings of decision alternatives.
The developed approach can be applied to address the complexities and uncertainties inherent in water resources systems to support management decisions, while serving as a platform for stakeholder participation.

  6. [The academic education in nursing and multiple-victim incidents: an integrative review].

    PubMed

    Salvador, Pétala Tuani Candido de Oliveira; Dantas, Rodrigo Assis Neves; Dantas, Daniele Vieira; Torres, Gilson de Vasconcelos

    2012-06-01

The objective of this study is to reflect on the knowledge, competencies and skills that must be promoted during the academic education of nurses for an effective professional practice in view of a multiple-victim incident (MVI). This is an integrative literature review regarding academic nursing education. The literature survey was performed on the BDENF, LILACS, SciELO, MEDLINE, Web of Knowledge and HighWire Press databases, using the following descriptors: higher education; nursing education; emergency nursing; and mass casualty incidents. The publications permitted considerations regarding the following themes: particularities; competencies and skills essential in nursing practice in view of multiple-victim incidents; and the professors' strategies to promote those competencies and skills. The literature analysis demonstrated that nursing education should be configured as a space to develop critical thinking skills, which requires professors to have an eclectic educational background.

  7. Characterization of multiple-bit errors from single-ion tracks in integrated circuits

    NASA Technical Reports Server (NTRS)

    Zoutendyk, J. A.; Edmonds, L. D.; Smith, L. S.

    1989-01-01

    The spread of charge induced by an ion track in an integrated circuit and its subsequent collection at sensitive nodal junctions can cause multiple-bit errors. The authors have experimentally and analytically investigated this phenomenon using a 256-kb dynamic random-access memory (DRAM). The effects of different charge-transport mechanisms are illustrated, and two classes of ion-track multiple-bit error clusters are identified. It is demonstrated that ion tracks that hit a junction can affect the lateral spread of charge, depending on the nature of the pull-up load on the junction being hit. Ion tracks that do not hit a junction allow the nearly uninhibited lateral spread of charge.

  8. A Tri-Factor Model for Integrating Ratings Across Multiple Informants

    PubMed Central

    Bauer, Daniel J.; Howard, Andrea L.; Baldasaro, Ruth E.; Curran, Patrick J.; Hussong, Andrea M.; Chassin, Laurie; Zucker, Robert A.

    2014-01-01

    Psychologists often obtain ratings for target individuals from multiple informants such as parents or peers. In this paper we propose a tri-factor model for multiple informant data that separates target-level variability from informant-level variability and item-level variability. By leveraging item-level data, the tri-factor model allows for examination of a single trait rated on a single target. In contrast to many psychometric models developed for multitrait-multimethod data, the tri-factor model is predominantly a measurement model. It is used to evaluate item quality in scale development, test hypotheses about sources of target variability (e.g., sources of trait differences) versus informant variability (e.g., sources of rater bias), and generate integrative scores that are purged of the subjective biases of single informants. PMID:24079932

  9. ePRISM: A case study in multiple proxy and mixed temporal resolution integration

    USGS Publications Warehouse

    Robinson, Marci M.; Dowsett, Harry J.

    2010-01-01

As part of the Pliocene Research, Interpretation and Synoptic Mapping (PRISM) Project, we present the ePRISM experiment designed 1) to provide climate modelers with a reconstruction of an early Pliocene warm period that was warmer than the PRISM interval (~3.3 to 3.0 Ma), yet still similar in many ways to modern conditions, and 2) to provide an example of how best to integrate multiple-proxy sea surface temperature (SST) data from time series with varying degrees of temporal resolution and age control as we begin to build the next generation of PRISM, the PRISM4 reconstruction, spanning a constricted time interval. While it is possible to tie individual SST estimates to a single light (warm) oxygen isotope event, we find that the warm peak average of SST estimates over a narrowed time interval is preferable for paleoclimate reconstruction, as it allows for the inclusion of more records from multiple paleotemperature proxies.

  10. Impaired Neurovisceral Integration of Cardiovascular Modulation Contributes to Multiple Sclerosis Morbidities.

    PubMed

    Sternberg, Zohara

    2017-01-01

    Multiple sclerosis (MS) is an inflammatory demyelinating central nervous system (CNS) disease with an uncertain etiology. MS is heterogeneous, involving multiple clinical pathologies, including neurodegeneration, depression, fatigue and sleep disorders, migraine, osteoporosis and cerebral hemodynamic impairments. The underlying causes of these pathologies remain mostly unknown. Based on the accumulating evidence derived from our studies and those of other investigators, we propose that the dysregulation in the neurovisceral integration of cardiovascular modulation can lead to many MS-related clinical symptoms. We show that MS inflammatory and neurodegenerative processes are intertwined with the aforementioned clinical morbidities and are collectively the manifestations of cardiovascular autonomic nervous system (ANS) dysfunction. The strategies for improving sympathovagal balance would likely prevent and minimize many MS-related clinical symptoms, improving patients' quality of life. Similar strategies could be applied to other autoimmune and neurodegenerative diseases where autonomic imbalance plays a role.

  11. Integration of Symptom Ratings from Multiple Informants in ADHD Diagnosis: A Psychometric Model with Clinical Utility

    PubMed Central

    Martel, Michelle M.; Schimmack, Ulrich; Nikolas, Molly; Nigg, Joel T.

    2015-01-01

The Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition explicitly requires that Attention-Deficit/Hyperactivity Disorder (ADHD) symptoms should be apparent across settings, taking into account reports from multiple informants. Yet, it provides no guidelines on how information from different raters should be combined in ADHD diagnosis. We examined the validity of different approaches using structural equation modeling (SEM) for multiple-informant data. Participants were 725 children, 6 to 17 years old, and their primary caregivers and teachers, recruited from the community and completing a thorough research-based diagnostic assessment, including a clinician-administered diagnostic interview, parent and teacher standardized rating scales and cognitive testing. A best-estimate ADHD diagnosis was generated by a diagnostic team. An SEM model demonstrated convergent validity among raters. We found relatively weak symptom-specific agreement among raters, suggesting that a general average scoring algorithm is preferable to symptom-specific scoring algorithms such as the “or” and “and” algorithms. Finally, to illustrate the validity of this approach, we show that averaging makes it possible to reduce the number of items from 18 items to 8 items without a significant decrease in validity. In conclusion, information from multiple raters increases the validity of ADHD diagnosis, and averaging appears to be the optimal way to integrate information from multiple raters. PMID:25730162

  12. Integration of symptom ratings from multiple informants in ADHD diagnosis: a psychometric model with clinical utility.

    PubMed

    Martel, Michelle M; Schimmack, Ulrich; Nikolas, Molly; Nigg, Joel T

    2015-09-01

    The Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition explicitly requires that attention-deficit/hyperactivity disorder (ADHD) symptoms be apparent across settings, taking into account reports from multiple informants. Yet, it provides no guidelines for how information from different raters should be combined in ADHD diagnosis. We examined the validity of different approaches using structural equation modeling (SEM) for multiple-informant data. Participants were 725 children, 6 to 17 years old, and their primary caregivers and teachers, recruited from the community and completing a thorough research-based diagnostic assessment, including a clinician-administered diagnostic interview, parent and teacher standardized rating scales, and cognitive testing. A best-estimate ADHD diagnosis was generated by a diagnostic team. An SEM model demonstrated convergent validity among raters. We found relatively weak symptom-specific agreement among raters, suggesting that a general average scoring algorithm is preferable to symptom-specific scoring algorithms such as the "or" and "and" algorithms. Finally, to illustrate the validity of this approach, we show that averaging makes it possible to reduce the number of items from 18 to 8 without a significant decrease in validity. In conclusion, information from multiple raters increases the validity of ADHD diagnosis, and averaging appears to be the optimal way to integrate information from multiple raters. (c) 2015 APA, all rights reserved.

  13. Method to integrate full particle orbit in toroidal plasmas

    NASA Astrophysics Data System (ADS)

    Wei, X. S.; Xiao, Y.; Kuley, A.; Lin, Z.

    2015-09-01

    It is important to integrate the full particle orbit accurately when studying charged particle dynamics in electromagnetic waves with frequencies higher than the cyclotron frequency. We have derived a form of the Boris scheme using magnetic coordinates, which can be used effectively to integrate the cyclotron orbit in toroidal geometry over a long period of time. The new method has been verified by a full particle orbit simulation in toroidal geometry without high frequency waves. The full particle orbit calculation recovers the guiding center banana orbit. This method has better numerical properties than the conventional Runge-Kutta method for conserving particle energy and magnetic moment. The toroidal precession frequency is found to match that from guiding center simulation. Many other important phenomena in the presence of an electric field, such as E × B drift, the Ware pinch effect and neoclassical polarization drift, are also verified by the full orbit simulation.
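
    For orientation, a minimal sketch of the standard Cartesian Boris push (half electric kick, magnetic rotation, half kick), not the magnetic-coordinate variant derived in the paper; variable names and parameters are illustrative:

    ```python
    def cross(a, b):
        """Cross product of two 3-vectors."""
        return [a[1] * b[2] - a[2] * b[1],
                a[2] * b[0] - a[0] * b[2],
                a[0] * b[1] - a[1] * b[0]]

    def boris_push(v, E, B, qm, dt):
        """Advance velocity v one step in fields E, B; qm = charge/mass."""
        v_minus = [v[i] + 0.5 * qm * dt * E[i] for i in range(3)]   # half electric kick
        t = [0.5 * qm * dt * B[i] for i in range(3)]                # rotation vector
        t2 = sum(ti * ti for ti in t)
        s = [2.0 * ti / (1.0 + t2) for ti in t]
        c1 = cross(v_minus, t)
        v_prime = [v_minus[i] + c1[i] for i in range(3)]
        c2 = cross(v_prime, s)
        v_plus = [v_minus[i] + c2[i] for i in range(3)]             # exact rotation
        return [v_plus[i] + 0.5 * qm * dt * E[i] for i in range(3)] # half electric kick
    ```

    In a pure magnetic field the magnetic step is an exact rotation, so kinetic energy is conserved to round-off over arbitrarily many steps, the long-time property the abstract contrasts with Runge-Kutta integration.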

  14. Multiple Populations in M31 Globular Clusters: Clues from Infrared High Resolution Integrated Light Spectroscopy

    NASA Astrophysics Data System (ADS)

    Sakari, Charli; APOGEE Team

    2017-01-01

    Abundance variations are a common feature of Milky Way globular clusters. The globular clusters in M31 are too distant for detailed abundance studies of their individual stars; however, cluster abundances can be determined through high resolution, integrated light (IL) spectroscopy. In this talk, I discuss how IL abundances can be interpreted in the context of multiple populations. In particular, I will present new infrared abundances of 25 M31 globular clusters, derived from IL spectra from the Apache Point Observatory Galactic Evolution Experiment (APOGEE). These H band spectra allow determinations of C, N, and O from molecular features, and Fe, Na, Mg, Al, Si, Ca, Ti, and K from atomic features. The integrated abundance ratios are then investigated with cluster [Fe/H] and mass.

  15. Cortical mechanisms for trans-saccadic memory and integration of multiple object features

    PubMed Central

    Prime, Steven L.; Vesia, Michael; Crawford, J. Douglas

    2011-01-01

    Constructing an internal representation of the world from successive visual fixations, i.e. separated by saccadic eye movements, is known as trans-saccadic perception. Research on trans-saccadic perception (TSP) has been traditionally aimed at resolving the problems of memory capacity and visual integration across saccades. In this paper, we review this literature on TSP with a focus on research showing that egocentric measures of the saccadic eye movement can be used to integrate simple object features across saccades, and that the memory capacity for items retained across saccades, like visual working memory, is restricted to about three to four items. We also review recent transcranial magnetic stimulation experiments which suggest that the right parietal eye field and frontal eye fields play a key functional role in spatial updating of objects in TSP. We conclude by speculating on possible cortical mechanisms for governing egocentric spatial updating of multiple objects in TSP. PMID:21242142

  16. Evaluating environmental sustainability: an integration of multiple-criteria decision-making and fuzzy logic.

    PubMed

    Liu, Kevin F R

    2007-05-01

    While pursuing economic development, countries around the world have become aware of the importance of environmental sustainability; therefore, the evaluation of environmental sustainability has become a significant issue. Traditionally, multiple-criteria decision-making (MCDM) was widely used as a way of evaluating environmental sustainability. Recently, several researchers have attempted to implement this evaluation with fuzzy logic, since they recognized the assessment of environmental sustainability as a subjective, intuitive judgment. This paper outlines a new evaluation framework for environmental sustainability, which integrates fuzzy logic into MCDM. This evaluation framework consists of 36 structured and 5 unstructured decision points, wherein MCDM is used to handle the former and fuzzy logic serves for the latter. With the integrated evaluation framework, the evaluations of environmental sustainability in 146 countries are calculated, ranked, and clustered, and the evaluation results are very helpful to these countries, as they identify their obstacles towards environmental sustainability.
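
    A toy sketch of the integration idea (the scores, weights, membership breakpoints, and the 80/20 blend below are all hypothetical): structured decision points are aggregated by a weighted sum (MCDM), while an unstructured judgment enters through a fuzzy membership grade:

    ```python
    def triangular(x, a, b, c):
        """Triangular fuzzy membership: 0 at a and c, peaking at 1 at b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    def weighted_sum(scores, weights):
        """Simple additive MCDM aggregation over structured criteria."""
        return sum(s * w for s, w in zip(scores, weights)) / sum(weights)

    structured = weighted_sum([0.8, 0.6, 0.9], [2.0, 1.0, 1.0])  # structured decision points
    fuzzy_part = triangular(55.0, 0.0, 50.0, 100.0)              # expert judgment on a 0-100 scale
    overall = 0.8 * structured + 0.2 * fuzzy_part                # blended country score
    ```

    Country scores produced this way can then be ranked and clustered as in the paper.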

  17. Data integration and systems biology approaches for biomarker discovery: challenges and opportunities for multiple sclerosis.

    PubMed

    Villoslada, Pablo; Baranzini, Sergio

    2012-07-15

    New "omic" technologies and their application to systems biology approaches offer new opportunities for biomarker discovery in complex disorders, including multiple sclerosis (MS). Recent studies using massive genotyping, DNA arrays, antibody arrays, proteomics, glycomics, and metabolomics from different tissues (blood, cerebrospinal fluid, brain) have identified many molecules associated with MS, defining both susceptibility and functional targets (e.g., biomarkers). Such discoveries involve many different levels in the complex organizational hierarchy of humans (DNA, RNA, protein, etc.), and integrating these datasets into a coherent model with regard to MS pathogenesis would be a significant step forward. Given the dynamic and heterogeneous nature of MS, validating biomarkers is mandatory. To develop accurate markers of disease prognosis or therapeutic response that are clinically useful, combining molecular, clinical, and imaging data is necessary. Such an integrative approach would pave the way towards better patient care and more effective clinical trials that test new therapies, thus bringing the paradigm of personalized medicine in MS one step closer.

  18. Position synchronised control of multiple robotic manipulators based on integral sliding mode

    NASA Astrophysics Data System (ADS)

    Zhao, Dongya; Zhu, Quanmin

    2014-03-01

    In this study, a new position synchronised control algorithm is developed for multiple robotic manipulator systems. Drawing on the merits of system synchronisation and integral sliding mode control, the proposed approach can stabilise position tracking of each robotic manipulator while coordinating its motion with the other manipulators. With the integral sliding mode, the proposed approach is insensitive to the lumped system uncertainty within the entire process of operation. Further, a perturbation estimator is proposed to reduce the chattering effect. The corresponding stability analysis is presented to lay a foundation for theoretical understanding of the underlying issues as well as for safely operating real systems. An illustrative example is bench tested to validate the effectiveness of the proposed approach.
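
    A minimal single-axis sketch of the integral sliding mode idea on a first-order plant (gains, disturbance, and reference are hypothetical; the paper's multi-manipulator synchronisation terms and perturbation estimator are omitted):

    ```python
    import math

    def simulate(T=2.0, dt=1e-3, ki=5.0, K=2.0):
        """Track r(t) = 1 for x_dot = u + d(t) using the integral sliding
        surface s = e + ki * integral(e), with switching gain K > |d|."""
        x, integ, t = 0.0, 0.0, 0.0
        r, r_dot = 1.0, 0.0
        while t < T:
            e = x - r
            integ += e * dt
            s = e + ki * integ                       # integral sliding surface
            d = 0.5 * math.sin(10.0 * t)             # bounded lumped uncertainty
            u = r_dot - ki * e - K * (1.0 if s > 0 else -1.0)  # switching control
            x += (u + d) * dt                        # Euler step of the plant
            t += dt
        return x - r                                 # final tracking error

    err = simulate()
    ```

    Because K exceeds the disturbance bound, s is driven to zero in finite time and the error then decays along the surface regardless of d(t), the insensitivity to lumped uncertainty that the abstract describes; the discontinuous sign term is also the source of the chattering the paper's estimator mitigates.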

  19. Integrated Microfluidic Platform with Multiple Functions To Probe Tumor-Endothelial Cell Interaction.

    PubMed

    Lin, Ling; Lin, Xuexia; Lin, Luyao; Feng, Qiang; Kitamori, Takehiko; Lin, Jin-Ming; Sun, Jiashu

    2017-09-19

    Interaction between tumor and endothelial cells could affect tumor growth and progression and induce drug resistance during cancer therapy. Investigation of tumor-endothelial cell interaction involves cell coculture, protein detection, and analysis of drug metabolites, which are complicated and time-consuming. In this work, we present an integrated microfluidic device with three individual components (cell coculture component, protein detection component, and pretreatment component for drug metabolites) to probe the interaction between tumor and endothelial cells. Cocultured cervical carcinoma cells (CaSki cells) and human umbilical vein endothelial cells (HUVECs) show higher resistance to chemotherapeutic agents than single-cultured cells, indicated by higher cell viability, increased expression of angiogenic proteins, and elevated level of paclitaxel metabolites under coculture conditions. This integrated microfluidic platform with multiple functions facilitates understanding of the interaction between tumor and endothelial cells, and it may become a promising tool for drug screening within an engineered tumor microenvironment.

  20. Current DOT research on the effect of multiple site damage on structural integrity

    NASA Astrophysics Data System (ADS)

    Tong, P.; Arin, Kemal; Jeong, David Y.; Greif, R.; Brewer, John C.; Bobo, Stephan N.; Sampath, Sam N.

    1992-07-01

    Multiple site damage (MSD) is a type of cracking that may be found in aging airplanes and that may adversely affect their continuing airworthiness. The Volpe National Transportation Systems Center has supported the Federal Aviation Administration Technical Center on structural integrity research for the past two and a half years. The work has focused on understanding the behavior of MSD, detection of MSD during airframe inspection, and the avoidance of MSD in future designs. These three elements of the MSD problem are addressed, and a summary of the completed work, the current status, and requirements for future research is provided.

  1. Integrated airborne lidar and multiple endmember spectral mixture analysis (MESMA) for plant species mapping across multiple functional groups

    NASA Astrophysics Data System (ADS)

    Dahlin, K.; Asner, G. P.

    2010-12-01

    The ability to map plant species distributions has long been one of the key goals of terrestrial remote sensing. Achieving this goal has been challenging, however, due to technical constraints and the difficulty in relating remote observations to ground measurements. Advances in both the types of data that can be collected remotely and in available analytical tools like multiple endmember spectral mixture analysis (MESMA) are allowing for rapid improvements in this field. In 2007 the Carnegie Airborne Observatory (CAO) acquired high resolution lidar and hyperspectral imagery of Jasper Ridge Biological Preserve (Woodside, California). The site contains a mosaic of vegetation types, from grassland to chaparral to evergreen forest. To build a spectral library, 415 GPS points were collected in the field, made up of 44 plant species, six plant categories (for nonphotosynthetic vegetation), and four substrate types. Using the lidar data to select the most illuminated pixels as seen from the aircraft (based on canopy shape and viewing angle), we then reduced the spectral library to only the most fully lit pixels. To identify individual plant species in the imagery, first the hyperspectral data was used to calculate the normalized difference vegetation index (NDVI), and then pixels with an NDVI less than 0.15 were removed from further analysis. The remaining image was stratified into five classes based on vegetation height derived from the lidar data. For each class, a suite of possible endmembers was identified and then three endmember selection procedures (endmember average RMS, minimum average spectral angle, and count based endmember selection) were employed to select the most representative endmembers from each species in each class. Two and three endmember models were then applied and each pixel was assigned a species or plant category based on the highest endmember fraction. To validate the approach, an independent set of 200 points was collected throughout the
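
    The per-pixel masking and stratification steps described above can be sketched as follows; the 0.15 NDVI threshold comes from the abstract, while the height class edges are hypothetical placeholders for the lidar-derived strata:

    ```python
    def ndvi(nir, red):
        """Normalized difference vegetation index for one pixel."""
        return (nir - red) / (nir + red)

    def keep_pixel(nir, red, threshold=0.15):
        """Pixels below the NDVI threshold are dropped before unmixing."""
        return ndvi(nir, red) >= threshold

    def height_class(h, edges=(0.5, 2.0, 5.0, 15.0)):
        """Assign one of five strata from lidar canopy height, in meters
        (class edges here are hypothetical)."""
        for k, e in enumerate(edges):
            if h < e:
                return k
        return len(edges)
    ```

    After stratifying, MESMA endmember selection and unmixing would run separately within each height class, as the abstract describes.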

  2. An automated integration-free path-integral method based on Kleinert's variational perturbation theory

    NASA Astrophysics Data System (ADS)

    Wong, Kin-Yiu; Gao, Jiali

    2007-12-01

    Based on Kleinert's variational perturbation (KP) theory [Path Integrals in Quantum Mechanics, Statistics, Polymer Physics, and Financial Markets, 3rd ed. (World Scientific, Singapore, 2004)], we present an analytic path-integral approach for computing the effective centroid potential. The approach enables the KP theory to be applied to any realistic systems beyond the first-order perturbation (i.e., the original Feynman-Kleinert [Phys. Rev. A 34, 5080 (1986)] variational method). Accurate values are obtained for several systems in which exact quantum results are known. Furthermore, the computed kinetic isotope effects for a series of proton transfer reactions, in which the potential energy surfaces are evaluated by density-functional theory, are in good accordance with experiments. We hope that our method could be used by non-path-integral experts or experimentalists as a "black box" for any given system.

  3. Integrating multiple fitting regression and Bayes decision for cancer diagnosis with transcriptomic data from tumor-educated blood platelets.

    PubMed

    Huang, Guangzao; Yuan, Mingshun; Chen, Moliang; Li, Lei; You, Wenjie; Li, Hanjie; Cai, James J; Ji, Guoli

    2017-10-07

    The application of machine learning in cancer diagnostics has shown great promise and is of importance in clinical settings. Here we consider applying machine learning methods to transcriptomic data derived from tumor-educated platelets (TEPs) from individuals with different types of cancer. We aim to define a reliability measure for diagnostic purposes to increase the potential for facilitating personalized treatments. To this end, we present a novel classification method called MFRB (for Multiple Fitting Regression and Bayes decision), which integrates the process of multiple fitting regression (MFR) with Bayes decision theory. MFR is first used to map multidimensional features of the transcriptomic data into a one-dimensional feature. The probability density function of each class in the mapped space is then adjusted using the Gaussian probability density function. Finally, the Bayes decision theory is used to build a probabilistic classifier with the estimated probability density functions. The output of MFRB can be used to determine which class a sample belongs to, as well as to assign a reliability measure for a given class. The classical support vector machine (SVM) and probabilistic SVM (PSVM) are used to evaluate the performance of the proposed method with simulated and real TEP datasets. Our results indicate that the proposed MFRB method achieves the best performance compared to SVM and PSVM, mainly due to its strong generalization ability for limited, imbalanced, and noisy data.
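
    A loose sketch of the MFRB pipeline as the abstract describes it (regression to one dimension, a Gaussian density per class, then a Bayes decision with a posterior as the reliability measure); implementation details such as the least-squares mapping and equal class priors are assumptions:

    ```python
    import numpy as np

    def fit_mfrb(X, y):
        """Fit the sketch: least-squares regression maps features to one
        dimension, then a Gaussian density is fitted per class."""
        A = np.column_stack([X, np.ones(len(X))])            # add intercept
        w, *_ = np.linalg.lstsq(A, y.astype(float), rcond=None)
        z = A @ w                                            # mapped 1-D feature
        params = {c: (z[y == c].mean(), z[y == c].std() + 1e-9)
                  for c in np.unique(y)}
        return w, params

    def predict_mfrb(w, params, x):
        """Bayes decision with equal priors; returns (label, reliability)."""
        z = np.append(x, 1.0) @ w
        dens = {c: np.exp(-0.5 * ((z - m) / s) ** 2) / s
                for c, (m, s) in params.items()}
        label = max(dens, key=dens.get)
        return label, dens[label] / sum(dens.values())       # posterior as reliability
    ```

    The returned posterior is the kind of per-class reliability measure the abstract proposes alongside the hard classification.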

  4. Smeared star spot location estimation using directional integral method.

    PubMed

    Hou, Wang; Liu, Haibo; Lei, Zhihui; Yu, Qifeng; Liu, Xiaochun; Dong, Jing

    2014-04-01

    Image smearing significantly affects the accuracy of attitude determination of most star sensors. To ensure the accuracy and reliability of a star sensor under image smearing conditions, a novel directional integral method is presented for high-precision star spot location estimation to improve the accuracy of attitude determination. Simulations based on the orbit data of the challenging mini-satellite payload satellite were performed. Simulation results demonstrated that the proposed method exhibits high performance and good robustness, which indicates that the method can be applied effectively.

  5. Integrated clinical and specialty pharmacy practice model for management of patients with multiple sclerosis.

    PubMed

    Hanson, Rebekah L; Habibi, Mitra; Khamo, Nehrin; Abdou, Sherif; Stubbings, JoAnn

    2014-03-15

    An integrated clinical and specialty pharmacy practice model for the management of patients with multiple sclerosis (MS) is described. Specialty medications, such as disease-modifying therapies (DMTs) used to treat MS, are costly and typically require special administration, handling, and storage. DMTs are associated with high rates of nonadherence and may have associated safety risks. The University of Illinois Hospital and Health Sciences System developed an MS pharmacy practice model that sought to address the many challenges of coordinating care with multiple entities outside the health system. Several key features of the integrated model include a dedicated clinical pharmacist on the MS specialty team, an integrated specialty pharmacy service, direct access to the electronic medical record, and face-to-face interaction with patients. Through the active involvement of the neurology clinical pharmacist and an onsite specialty pharmacy service, targeted assessments and medication and disease education are provided to the patient before DMT initiation and maintained throughout therapy. In addition, the regular point of contact and refill coordination encourages improved compliance, appropriate medication use, ongoing safety monitoring, and improved communication with the provider for quicker interventions. This fosters increased accessibility, convenience, and patient confidence. Improving patient outcomes--the priority goal of this service model--will be assessed in future planned studies. Through this new practice model, providers are empowered to incorporate specialty medication management into transitions in care, admission and discharge quality indicators, readmissions, and other core measures. An integrated pharmacy practice model that includes an interdisciplinary team of physicians, nurses, and pharmacists improved patient compliance with MS therapies.

  6. Monolithic integration of multiple-emission-wavelength laser diodes using low-energy ion implantation

    NASA Astrophysics Data System (ADS)

    Aimez, Vincent; Paquette, Michel; Beauvais, Jacques; Beerens, Jean; Poole, Philip J.; Charbonneau, N. Sylvain

    1998-09-01

    A monolithic optoelectronic chip containing multiple emission wavelength laser diodes has been developed. The semiconductor quantum well lasers have Fabry-Perot cavities of 500 micrometers in length. Electrical insulation between individual integrated devices has been achieved by wet etching the top contact layer and by a lift-off of the surface metal contact between the different lasers. The electroluminescence peak emission spectra of the integrated laser diodes have been shifted over a 25 nm range, and 74 nm for discrete devices. Blueshifting of the emission wavelength has been achieved by quantum well intermixing, using an industrial low energy ion implanter to generate point defects and a rapid thermal annealer to promote interdiffusion of the barrier and quantum well atoms during the recrystallization anneal. Phosphorus ions were implanted with an energy of 360 keV into precisely defined regions of the heterostructure, with SiO2 serving as a masking material. Thus reference and intermixed regions were integrated on a single component. Integrated and discrete laser diodes have been assessed in terms of threshold currents and emission wavelengths.

  7. Blood viscosity measurement: an integral method using Doppler ultrasonic profiles

    NASA Astrophysics Data System (ADS)

    Flaud, P.; Bensalah, A.

    2005-12-01

    The aim of this work is to present a new indirect and noninvasive method for measuring Newtonian blood viscosity. Based on an integral form of the axial Navier-Stokes equation, this method is particularly suited for in vivo investigations using ultrasonic arterial blood velocity profiles. Its main advantage is that it is applicable to periodic as well as nonperiodic flows. Moreover, it does not require the classical filtering methods used to enhance the signal-to-noise ratio of physiological signals. The method only requires knowledge of the velocimetric data measured inside a spatially and temporally optimized zone of the Doppler velocity profiles. The results obtained using numerical simulation as well as in vitro and in vivo experiments prove the effectiveness of the method. It is thus well adapted to the clinical environment as a systematic, quasi-on-line method for measuring blood viscosity.

  8. Retrieval of Temperature From a Multiple Channel Rayleigh-Scatter Lidar Using an Optimal Estimation Method

    NASA Astrophysics Data System (ADS)

    Sica, R. J.; Haefele, A.

    2014-12-01

    The measurement of temperature in the middle atmosphere with Rayleigh-scatter lidars is an important technique for assessing atmospheric change. Current retrieval schemes for these temperatures have several shortcomings, which can be overcome using an optimal estimation method (OEM). OEMs are applied to the retrieval of temperature from Rayleigh-scatter lidar measurements using both single and multiple channel measurements. Forward models are presented that completely characterize the measurement and allow the simultaneous retrieval of temperature, dead time and background. The method allows a full uncertainty budget to be obtained on a per profile basis that includes, in addition to the statistical uncertainties, the smoothing error and uncertainties due to Rayleigh extinction, ozone absorption, the lidar constant, nonlinearity in the counting system, variation of the Rayleigh-scatter cross section with altitude, pressure, acceleration due to gravity and the variation of mean molecular mass with altitude. The vertical resolution of the temperature profile is found at each height, and a quantitative determination is made of the maximum height to which the retrieval is valid. A single temperature profile can be retrieved from measurements with multiple channels that cover different height ranges, vertical resolutions and even different detection methods. The OEM employed is shown to give robust estimates of temperature consistent with previous methods, while requiring minimal computational time. This demonstrated success of lidar temperature retrievals using an OEM opens new possibilities in atmospheric science for measurement integration between active and passive remote sensing instruments. We are currently working on extending our method to simultaneously retrieve water vapour and temperature using Raman-scatter lidar measurements.
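
    For a linear forward model y = K x + noise, the textbook OEM retrieval (not the paper's nonlinear lidar forward model) has a closed form; a hypothetical numerical sketch, which also returns the posterior covariance and averaging kernel that underpin the per-profile uncertainty budget and vertical resolution mentioned above:

    ```python
    import numpy as np

    def oem_retrieve(y, K, xa, Sa, Se):
        """Maximum a posteriori retrieval for y = K @ x + noise, with prior
        (xa, Sa) and measurement noise covariance Se."""
        Sa_inv = np.linalg.inv(Sa)
        Se_inv = np.linalg.inv(Se)
        S_hat = np.linalg.inv(Sa_inv + K.T @ Se_inv @ K)   # posterior covariance
        x_hat = xa + S_hat @ K.T @ Se_inv @ (y - K @ xa)   # retrieved state
        A = S_hat @ K.T @ Se_inv @ K                       # averaging kernel
        return x_hat, S_hat, A
    ```

    Rows of the averaging kernel A characterize the vertical resolution of the retrieval, and S_hat gives the per-profile uncertainty; with precise measurements A approaches the identity and the retrieval reproduces the true state.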

  9. A Numerical Method for Obtaining Monoenergetic Neutron Flux Distributions and Transmissions in Multiple-Region Slabs

    NASA Technical Reports Server (NTRS)

    Schneider, Harold

    1959-01-01

    This method is investigated for semi-infinite multiple-slab configurations of arbitrary width, composition, and source distribution. Isotropic scattering in the laboratory system is assumed. Isotropic scattering implies that the fraction of neutrons scattered in the i(sup th) volume element or subregion that will make their next collision in the j(sup th) volume element or subregion is the same for all collisions. These so-called "transfer probabilities" between subregions are calculated and used to obtain successive-collision densities from which the flux and transmission probabilities directly follow. For a thick slab with little or no absorption, a successive-collisions technique proves impractical because an unreasonably large number of collisions must be followed in order to obtain the flux. Here the appropriate integral equation is converted into a set of linear simultaneous algebraic equations that are solved for the average total flux in each subregion. When ordinary diffusion theory applies with satisfactory precision in a portion of the multiple-slab configuration, the problem is solved by ordinary diffusion theory, but the flux is plotted only in the region of validity. The angular distribution of neutrons entering the remaining portion is determined from the known diffusion flux and the remaining region is solved by higher order theory. Several procedures for applying the numerical method are presented and discussed. To illustrate the calculational procedure, a symmetrical slab in a vacuum is worked by the numerical, Monte Carlo, and P(sub 3) spherical harmonics methods. In addition, an unsymmetrical double-slab problem is solved by the numerical and Monte Carlo methods. The numerical approach proved faster and more accurate in these examples. Adaptation of the method to anisotropic scattering in slabs is indicated, although no example is included in this paper.
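
    The thick-slab branch of the method, converting the integral equation into linear simultaneous equations for the average flux in each subregion, can be sketched numerically; the 3-subregion transfer matrix, scattering probability, and source below are hypothetical:

    ```python
    import numpy as np

    # T[i][j]: probability that a neutron colliding in subregion i makes its
    # next collision in subregion j (rows may sum to < 1 due to leakage).
    T = np.array([[0.5, 0.3, 0.1],
                  [0.3, 0.4, 0.2],
                  [0.1, 0.3, 0.5]])
    c = 0.9                          # scattering (non-absorption) probability
    s = np.array([1.0, 0.0, 0.0])    # first-collision source density

    # Balance: phi_i = s_i + c * sum_j T[j, i] * phi_j, i.e. (I - c T^T) phi = s,
    # solved directly instead of following collision after collision.
    phi = np.linalg.solve(np.eye(3) - c * T.T, s)
    ```

    Solving the linear system sums the whole collision series at once, which is why this branch stays practical for weakly absorbing slabs where the successive-collisions technique would need an unreasonable number of terms.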

  10. Pragmatic Metadata Management for Integration into Multiple Spatial Data Infrastructure Systems and Platforms

    NASA Astrophysics Data System (ADS)

    Benedict, K. K.; Scott, S.

    2013-12-01

    While there has been a convergence towards a limited number of standards for representing knowledge (metadata) about geospatial (and other) data objects and collections, there exist a variety of community conventions around the specific use of those standards and within specific data discovery and access systems. This combination of limited (but multiple) standards and conventions creates a challenge for system developers that aspire to participate in multiple data infrastructures, each of which may use a different combination of standards and conventions. While Extensible Markup Language (XML) is a shared standard for encoding most metadata, traditional direct XML transformations (XSLT) from one standard to another often result in an imperfect transfer of information due to incomplete mapping from one standard's content model to another. This paper presents the work at the University of New Mexico's Earth Data Analysis Center (EDAC) in which a unified data and metadata management system has been developed in support of the storage, discovery and access of heterogeneous data products. This system, the Geographic Storage, Transformation and Retrieval Engine (GSTORE) platform has adopted a polyglot database model in which a combination of relational and document-based databases are used to store both data and metadata, with some metadata stored in a custom XML schema designed as a superset of the requirements for multiple target metadata standards: ISO 19115-2/19139/19110/19119, FGDC CSDGM (both with and without remote sensing extensions) and Dublin Core. Metadata stored within this schema is complemented by additional service, format and publisher information that is dynamically "injected" into produced metadata documents when they are requested from the system. While mapping from the underlying common metadata schema is relatively straightforward, the generation of valid metadata within each target standard is necessary but not sufficient for integration into
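
    The "superset schema plus per-target mapping with injected service information" idea can be sketched with a dict-based internal record mapped down to Dublin Core; the internal field names, record content, and service URL are all hypothetical, not GSTORE's actual schema:

    ```python
    # Internal superset record: holds everything any target standard needs.
    record = {
        "title": "Land cover, New Mexico, 2012",
        "abstract": "Derived land-cover product.",
        "creator": "EDAC",
        "west": -109.05, "east": -103.0, "south": 31.3, "north": 37.0,
    }

    def to_dublin_core(rec, services=()):
        """Map the internal superset record to Dublin Core elements; service
        links are 'injected' at request time rather than stored with the record."""
        return {
            "dc:title": rec["title"],
            "dc:description": rec["abstract"],
            "dc:creator": rec["creator"],
            "dc:coverage": f"{rec['west']} {rec['south']} {rec['east']} {rec['north']}",
            "dc:relation": list(services),
        }

    dc = to_dublin_core(record, services=["https://example.org/wms"])
    ```

    Because each target standard gets its own mapping from the single superset, no target-to-target transformation (with its lossy content-model mismatches) is ever needed.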

  11. Integration of isothermal amplification methods in microfluidic devices: Recent advances.

    PubMed

    Giuffrida, Maria Chiara; Spoto, Giuseppe

    2017-04-15

    The integration of nucleic acids detection assays in microfluidic devices represents a highly promising approach for the development of convenient, cheap and efficient diagnostic tools for clinical, food safety and environmental monitoring applications. Such tools are expected to operate at the point-of-care and in resource-limited settings. The amplification of the target nucleic acid sequence represents a key step for the development of sensitive detection protocols. The integration in microfluidic devices of the most popular technology for nucleic acid amplification, polymerase chain reaction (PCR), is significantly limited by the thermal cycling needed to obtain the target sequence amplification. This review provides an overview of recent advances in the integration of isothermal amplification methods in microfluidic devices. Isothermal methods, which operate at constant temperature, have emerged as a promising alternative to PCR and greatly simplify the implementation of amplification methods in point-of-care diagnostic devices and in devices to be used in resource-limited settings. Possibilities offered by isothermal methods for digital droplet amplification are discussed. Copyright © 2016 Elsevier B.V. All rights reserved.

  12. Galerkin boundary integral equation method for spontaneous rupture propagation problems

    NASA Astrophysics Data System (ADS)

    Goto, H.; Bielak, J.

    2007-12-01

    We develop a Galerkin finite element boundary integral equation method (GaBIEM) for spontaneous rupture propagation problems for a planar fault embedded in a homogeneous full 2D space. A simple 2D antiplane rupture propagation problem, with a slip-weakening friction law, is simulated by the GaBIEM. This method allows one to separate the kernel explicitly into singular static and time-dependent parts, and a nonsingular dynamic component. The simulated results shed light on the performance of the GaBIEM and highlight differences with respect to that of the traditional, collocation, boundary integral equation method (BIEM). The rate of convergence of the GaBIEM, as measured from a root mean square (RMS) analysis of the differences between approximate solutions corresponding to increasingly finer element sizes, is of a higher order than that of the BIEM. There is no restriction on the CFL stability number, since an implicit, unconditionally stable method is used for the time integration. The error of the approximation increases with the time step, as expected, and it can remain below that of the BIEM.

  13. Comparison of Four Methods for Weighting Multiple Predictors.

    ERIC Educational Resources Information Center

    Aamodt, Michael G.; Kimbrough, Wilson W.

    1985-01-01

    Four methods were used to weight predictors associated with a Resident Assistant job: (1) rank order weights; (2) unit weights; (3) critical incident weights; and (4) regression weights. A cross-validation was also done. Most weighting methods were highly related. No method was superior in terms of protection from validity shrinkage. (GDC)
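
    Two of the four weighting schemes can be sketched directly as composite scores (the predictor values and importance ranks below are made up; critical-incident and regression weights would require the underlying job data):

    ```python
    def unit_weight_composite(predictors):
        """Unit weights: every standardized predictor counts equally."""
        return sum(predictors)

    def rank_order_composite(predictors, ranks):
        """Rank-order weights: rank 1 (most important) gets the largest weight."""
        k = len(ranks)
        weights = [k - r + 1 for r in ranks]   # rank 1 -> weight k, rank k -> weight 1
        return sum(p * w for p, w in zip(predictors, weights))

    scores = [0.5, 1.0, -0.2]   # hypothetical standardized predictor scores
    ranks = [2, 1, 3]           # judged importance of each predictor

    unit = unit_weight_composite(scores)
    ranked = rank_order_composite(scores, ranks)
    ```

    Because composites built from different weight vectors tend to correlate highly, the finding that "most weighting methods were highly related" is a well-known property of linear composites rather than a quirk of this sample.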

  14. Method of optical image coding by time integration

    NASA Astrophysics Data System (ADS)

    Evtikhiev, Nikolay N.; Starikov, Sergey N.; Cheryomkhin, Pavel A.; Krasnov, Vitaly V.; Rodin, Vladislav G.

    2012-06-01

    A method of optical image coding by time integration is proposed. Coding in the proposed method is accomplished by shifting the object image over the photosensor area of a digital camera during registration, which results in an optically calculated convolution of the original image with the shift trajectory. As opposed to optical coding methods based on the use of diffractive optical elements, the described coding method is feasible in totally incoherent light. The method was preliminarily tested by using an LC monitor for image displaying and shifting: shifting of the object image is realized by displaying a video consisting of frames with the image to be encoded at different locations on the screen of the LC monitor while registering it with the camera. Optical encoding and numerical decoding of test images were performed successfully. A more practical experimental implementation of the method, using an LCOS SLM Holoeye PLUTO VIS, was also realized. Object images to be encoded were formed in monochromatic spatially incoherent light, and shifting of the object image over the camera photosensor area was accomplished by displaying a video consisting of frames with blazed gratings on the LCOS SLM; each blazed grating deflects the light reflected from the SLM at a different angle. Results of optical image coding and numerical restoration of the encoded images are presented, and the obtained experimental results are compared with results of numerical modeling. Optical image coding with time integration could be used for accessible quality estimation of optical image coding using diffractive optical elements, or as an independent optical coding method that can be implemented in incoherent light.
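
    The optical operation, accumulating shifted copies of the image on the sensor during exposure, is equivalent to convolving the image with the shift trajectory; a small numerical sketch (the image and trajectory are made up, and boundary pixels are simply clipped):

    ```python
    def encode(image, shifts):
        """Sum shifted copies of a 2-D image (list of rows) over a trajectory
        given as (dy, dx) offsets; models integration on the photosensor."""
        h, w = len(image), len(image[0])
        out = [[0.0] * w for _ in range(h)]
        for dy, dx in shifts:
            for y in range(h):
                for x in range(w):
                    yy, xx = y - dy, x - dx
                    if 0 <= yy < h and 0 <= xx < w:   # clip at the sensor edge
                        out[y][x] += image[yy][xx]
        return out
    ```

    Numerical decoding would then amount to deconvolving the registered frame with the known trajectory kernel.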

  15. A Parallelized Point Successive Over-Relaxation Method on a Multiple Instruction Multiple Data Stream Computer.

    DTIC Science & Technology

    1984-11-01

    f_{i,j}^{(n+1)} = (1/4)(f_{i+1,j}^{(n)} + f_{i-1,j}^{(n)} + f_{i,j+1}^{(n)} + f_{i,j-1}^{(n)}) (3) This is the point Jacobi method. For this particular problem the method simply involves setting the new...advanced (n+1) values at two neighboring points, (i-1,j) and (i,j-1). The point Jacobi method in Equation (3) is a two level equation requiring storage of
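The point Jacobi iteration described above replaces each interior grid value with the average of its four neighbors, using only old (level n) values, so every point can be updated independently. A minimal illustration on a Laplace problem (grid size and boundary values are arbitrary choices):

```python
import numpy as np

def jacobi_step(f):
    """One point Jacobi sweep: average the four neighbors of each interior point."""
    g = f.copy()
    g[1:-1, 1:-1] = 0.25 * (f[2:, 1:-1] + f[:-2, 1:-1] +
                            f[1:-1, 2:] + f[1:-1, :-2])
    return g

# Laplace problem on a square grid: top boundary held at 1, others at 0.
f = np.zeros((20, 20))
f[0, :] = 1.0
for _ in range(500):
    f = jacobi_step(f)

# By the maximum principle the converged field stays within the boundary values.
print(0.0 <= f.min() <= f.max() <= 1.0)
```

Because the update uses only level-n data, the sweep parallelizes trivially, which is the property exploited on the MIMD machine discussed in the report.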

  16. A General Simulation Method for Multiple Bodies in Proximate Flight

    NASA Technical Reports Server (NTRS)

    Meakin, Robert L.

    2003-01-01

    Methods of unsteady aerodynamic simulation for an arbitrary number of independent bodies flying in close proximity are considered. A novel method to efficiently detect collision contact points is described. A method to compute body trajectories in response to aerodynamic loads, applied loads, and inter-body collisions is also given. The physical correctness of the methods is verified by comparison to a set of analytic solutions. The methods, combined with a Navier-Stokes solver, are used to demonstrate the possibility of predicting the unsteady aerodynamics and flight trajectories of moving bodies that involve rigid-body collisions.

  17. Predicted PAR1 inhibitors from multiple computational methods

    NASA Astrophysics Data System (ADS)

    Wang, Ying; Liu, Jinfeng; Zhu, Tong; Zhang, Lujia; He, Xiao; Zhang, John Z. H.

    2016-08-01

    Multiple computational approaches are employed in order to find potentially strong binders of PAR1 from two molecular databases: the Specs database, containing more than 200,000 commercially available molecules, and the traditional Chinese medicine (TCM) database. By combining popular docking scoring functions with detailed molecular dynamics simulation and protein-ligand free energy calculations, a total of fourteen molecules are found to be potentially strong binders of PAR1. The atomic details of the protein-ligand interactions of these molecules with PAR1 are analyzed to help understand the binding mechanism, which should be very useful in the design of new drugs.

  18. A Modeling Method of Multiple Targets Assignment under Multiple UAVs’ Cooperation

    NASA Astrophysics Data System (ADS)

    Wang, Q. H.; Wan, G.; Cao, X. F.; Xie, L. X.

    2017-03-01

    This paper presents a detailed analysis of a target-assignment model for multiple UAVs cooperating in a complex environment. First, three basic situations are discussed according to the quantitative relationship between the number of UAVs and the number of targets. Then, to make the target-assignment model more practical, the probability of UAV damage is also taken into consideration. A basic particle swarm optimization algorithm, which has good efficiency and convergence properties, is adopted to solve the model. Finally, a three-dimensional environment is simulated to verify the model. Simulation results show that the model is practical and close to the actual environment.
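The basic particle swarm optimization (PSO) update the paper relies on can be sketched generically; the cost function below is a stand-in test function, not the paper's target-assignment model, and all hyperparameters are illustrative:

```python
import numpy as np

def pso(cost, dim, n_particles=30, iters=300, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Basic PSO: velocities pulled toward personal and global best positions."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n_particles, dim))      # particle positions
    v = np.zeros((n_particles, dim))                # particle velocities
    pbest = x.copy()                                # personal bests
    pbest_cost = np.apply_along_axis(cost, 1, x)
    gbest = pbest[pbest_cost.argmin()].copy()       # global best
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        c = np.apply_along_axis(cost, 1, x)
        better = c < pbest_cost
        pbest[better], pbest_cost[better] = x[better], c[better]
        gbest = pbest[pbest_cost.argmin()].copy()
    return gbest, pbest_cost.min()

# Minimize a simple quadratic as a placeholder objective.
best, best_cost = pso(lambda z: float(np.sum(z ** 2)), dim=3)
print(best_cost < 0.1)
```

For the assignment problem itself, each particle would encode a candidate UAV-to-target mapping and the cost would combine mission value with the damage probabilities discussed above.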

  19. Overview of integrative tools and methods in assessing ecological integrity in estuarine and coastal systems worldwide.

    PubMed

    Borja, Angel; Bricker, Suzanne B; Dauer, Daniel M; Demetriades, Nicolette T; Ferreira, João G; Forbes, Anthony T; Hutchings, Pat; Jia, Xiaoping; Kenchington, Richard; Carlos Marques, João; Zhu, Changbo

    2008-09-01

    In recent years, several sets of legislation worldwide (Oceans Act in USA, Australia or Canada; Water Framework Directive or Marine Strategy in Europe, National Water Act in South Africa, etc.) have been developed in order to address ecological quality or integrity, within estuarine and coastal systems. Most such legislation seeks to define quality in an integrative way, by using several biological elements, together with physico-chemical and pollution elements. Such an approach allows assessment of ecological status at the ecosystem level ('ecosystem approach' or 'holistic approach' methodologies), rather than at species level (e.g. mussel biomonitoring or Mussel Watch) or just at chemical level (i.e. quality objectives) alone. Increasing attention has been paid to the development of tools for different physico-chemical or biological (phytoplankton, zooplankton, benthos, algae, phanerogams, fishes) elements of the ecosystems. However, few methodologies integrate all the elements into a single evaluation of a water body. The need for such integrative tools to assess ecosystem quality is very important, both from a scientific and stakeholder point of view. Politicians and managers need information from simple and pragmatic, but scientifically sound methodologies, in order to show to society the evolution of a zone (estuary, coastal area, etc.), taking into account human pressures or recovery processes. These approaches include: (i) multidisciplinarity, inherent in the teams involved in their implementation; (ii) integration of biotic and abiotic factors; (iii) accurate and validated methods in determining ecological integrity; and (iv) adequate indicators to follow the evolution of the monitored ecosystems. While some countries increasingly use the establishment of marine parks to conserve marine biodiversity and ecological integrity, there is awareness (e.g. 
in Australia) that conservation and management of marine ecosystems cannot be restricted to Marine Protected

  20. Functional integration between brain regions at rest occurs in multiple-frequency bands.

    PubMed

    Gohel, Suril R; Biswal, Bharat B

    2015-02-01

    Studies of resting-state fMRI have shown that blood oxygen level dependent (BOLD) signals giving rise to temporal correlation across voxels (or regions) are dominated by low-frequency fluctuations in the range of ∼0.01-0.1 Hz. These low-frequency fluctuations have been further divided into multiple distinct frequency bands (slow-5 and -4) based on earlier neurophysiological studies, though the low sampling frequency of fMRI (∼0.5 Hz) has substantially limited the exploration of other known frequency bands of neurophysiological origin (slow-3, -2, and -1). In this study, we used resting-state fMRI data acquired from 21 healthy subjects at a higher sampling frequency of 1.5 Hz to assess the presence of resting-state functional connectivity (RSFC) across multiple frequency bands: slow-5 to slow-1. The effect of different frequency bands on spatial extent and connectivity strength for known resting-state networks (RSNs) was also evaluated. RSNs were derived using independent component analysis and seed-based correlation. Commonly known RSNs, such as the default mode, the fronto-parietal, the dorsal attention, and the visual networks, were consistently observed at multiple frequency bands. Significant inter-hemispheric connectivity was observed between each seed and its contralateral brain region across all frequency bands, though the overall spatial extent of the seed-based correlation maps decreased in the slow-2 and slow-1 frequency bands. These results suggest that functional integration between brain regions at rest occurs over multiple frequency bands and that RSFC is a multiband phenomenon. These results also suggest that further investigation of the BOLD signal in multiple frequency bands for related cognitive processes should be undertaken.
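Splitting a BOLD time series into the slow-5 to slow-1 bands can be sketched with a brick-wall FFT filter. The band edges below follow the commonly cited convention only approximately, and the signal is simulated noise, not fMRI data:

```python
import numpy as np

# Approximate band edges (Hz) for the slow-5..slow-1 convention (illustrative).
bands = {"slow-5": (0.01, 0.027), "slow-4": (0.027, 0.073),
         "slow-3": (0.073, 0.198), "slow-2": (0.198, 0.25),
         "slow-1": (0.25, 0.75)}

fs = 1.5                      # sampling frequency in Hz, as in the study
rng = np.random.default_rng(0)
signal = rng.standard_normal(600)   # simulated stand-in for a BOLD series

def bandpass(x, lo, hi, fs):
    """Ideal (brick-wall) FFT band-pass filter."""
    X = np.fft.rfft(x)
    f = np.fft.rfftfreq(x.size, d=1.0 / fs)
    X[(f < lo) | (f >= hi)] = 0.0
    return np.fft.irfft(X, n=x.size)

parts = {name: bandpass(signal, lo, hi, fs) for name, (lo, hi) in bands.items()}

# Per-band RSFC would correlate these components between regions; here we just
# confirm that disjoint bands yield (numerically) orthogonal components.
overlap = abs(np.dot(parts["slow-4"], parts["slow-2"]))
print(overlap < 1e-8)
```

In an actual analysis, the correlation between two regions' band-limited series would be computed separately per band, which is how band-specific connectivity maps like those above are obtained.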

  1. Fast Electromagnetic Analysis of MRI Transmit RF Coils Based on Accelerated Integral Equation Methods.

    PubMed

    Villena, Jorge Fernandez; Polimeridis, Athanasios G; Eryaman, Yigitcan; Adalsteinsson, Elfar; Wald, Lawrence L; White, Jacob K; Daniel, Luca

    2016-11-01

    A fast frequency-domain full-wave electromagnetic simulation method is introduced for the analysis of MRI coils loaded with realistic human body models. The approach is based on integral equation methods decomposed into two domains: 1) the RF coil array and shield, and 2) the human body region where the load is placed. The analysis of multiple coil designs is accelerated by introducing precomputed magnetic resonance Green functions (MRGFs), which describe how the particular body model used responds to the incident fields from external sources. These MRGFs, which are precomputed once for a given body model, can be combined with any integral equation solver and reused for the analysis of many coil designs. This approach provides a fast, yet comprehensive, analysis of coil designs, including the port S-parameters and the electromagnetic field distribution within the inhomogeneous body. The method solves the full-wave electromagnetic problem for a head array in a few minutes, achieving a speedup of over 150-fold with root mean square errors in the electromagnetic field maps smaller than 0.4% when compared to the unaccelerated integral-equation-based solver. This enables the characterization of a large number of RF coil designs in a reasonable time, which is a first step toward automatic optimization of multiple parameters in the design of transmit arrays, as illustrated in this paper, and also of receive arrays.

  2. Integration of multiple-baseline color stereo vision with focus and defocus analysis for 3D shape measurement

    NASA Astrophysics Data System (ADS)

    Yuan, Ta; Subbarao, Murali

    1998-12-01

    A 3D vision system named SVIS is developed for 3D shape measurement that integrates three methods: (i) multiple-baseline, multiple-resolution Stereo Image Analysis (SIA) that uses color image data, (ii) Image Defocus Analysis (IDA), and (iii) Image Focus Analysis (IFA). IDA and IFA are less accurate than stereo, but they do not suffer from the correspondence problem associated with stereo. A rough 3D shape is first obtained using IDA, and then IFA is used to obtain an improved estimate. The result is then used in SIA to solve the correspondence problem and obtain an accurate measurement of 3D shape. SIA is implemented using color images recorded at multiple baselines. Color images provide more information than monochrome images for stereo matching; therefore, matching errors are reduced and the accuracy of the 3D shape is improved. Further improvements are obtained through multiple-baseline stereo analysis. First, short-baseline images are analyzed to obtain an initial estimate of 3D shape. In this step, stereo matching errors are low and computation is fast, since a shorter baseline results in lower disparities. The initial estimate of 3D shape is then used to match longer-baseline stereo images, which yields a more accurate estimate of 3D shape. The stereo matching step is implemented using a multiple-resolution matching approach to reduce computation: first, lower-resolution images are matched, and the results are used in matching higher-resolution images. This paper presents the algorithms and the experimental results of 3D shape measurements with SVIS for several objects. These results suggest a practical vision system for 3D shape measurement.

  3. Integration of sample analysis method (SAM) for polychlorinated biphenyls

    SciTech Connect

    Monagle, M.; Johnson, R.C.

    1996-05-01

    A completely integrated Sample Analysis Method (SAM) has been tested as part of the Contaminant Analysis Automation program. The SAM system was tested for polychlorinated biphenyl samples using five Standard Laboratory Modules{trademark}: two Soxtec{trademark} modules, a high volume concentrator module, a generic materials handling module, and the gas chromatographic module. With over 300 samples completed within the first phase of the validation, recovery and precision data were comparable to manual methods. Based on experience derived from the first evaluation of the automated system, efforts are underway to improve sample recoveries and integrate a sample cleanup procedure. In addition, initial work in automating the extraction of semivolatile samples using this system will also be discussed.

  4. Computing thermal Wigner densities with the phase integration method

    SciTech Connect

    Beutier, J.; Borgis, D.; Vuilleumier, R.; Bonella, S.

    2014-08-28

    We discuss how the Phase Integration Method (PIM), recently developed to compute symmetrized time correlation functions [M. Monteferrante, S. Bonella, and G. Ciccotti, Mol. Phys. 109, 3015 (2011)], can be adapted to sampling/generating the thermal Wigner density, a key ingredient, for example, in many approximate schemes for simulating quantum time dependent properties. PIM combines a path integral representation of the density with a cumulant expansion to represent the Wigner function in a form calculable via existing Monte Carlo algorithms for sampling noisy probability densities. The method is able to capture highly non-classical effects such as correlation among the momenta and coordinates parts of the density, or correlations among the momenta themselves. By using alternatives to cumulants, it can also indicate the presence of negative parts of the Wigner density. Both properties are demonstrated by comparing PIM results to those of reference quantum calculations on a set of model problems.

  5. Methods for Developing Emissions Scenarios for Integrated Assessment Models

    SciTech Connect

    Prinn, Ronald; Webster, Mort

    2007-08-20

    The overall objective of this research was to contribute data and methods to support the future development of new emissions scenarios for integrated assessment of climate change. Specifically, this research had two main objectives: 1. Use historical data on economic growth and energy efficiency changes, and develop probability density functions (PDFs) for the appropriate parameters for two or three commonly used integrated assessment models. 2. Using the parameter distributions developed through the first task and previous work, we will develop methods of designing multi-gas emission scenarios that usefully span the joint uncertainty space in a small number of scenarios. Results on the autonomous energy efficiency improvement (AEEI) parameter are summarized, an uncertainty analysis of elasticities of substitution is described, and the probabilistic emissions scenario approach is presented.

  6. Integration of multiple PCR amplification and DNA mutation analyses by using oligonucleotide microchip.

    SciTech Connect

    Tillib, S. V.; Strizhkov, B. N.; Mirzabekov, A. D.; Biochip Technology Center; Russian Academy of Sciences

    2001-05-01

    We have developed a method for parallel, independent on-chip amplification and subsequent sequence variation analysis of multiple DNA regions directly on a microchip with an array of nanoliter gel pads containing specific sets of tethered primers. The method has three key features. First, the DNA to be amplified is enriched at the gel pads by hybridization with immobilized primers. Second, different sets of specific primers are immobilized within various gel pads, and the primers are detached within the gel pads just before the polymerase chain reaction to enhance the amplification. A gel pad may contain an additional permanently immobilized dormant primer that is activated to carry out an allele-specific primer extension reaction to detect mutations. Third, the multiple polymerase chain reactions are confined within nanoliter gel pads covered and separated from each other with mineral oil. The method was applied to simultaneously identify several abundant drug-resistance mutations in three genes of Mycobacterium tuberculosis.

  7. Attribution of hydrological change using the Method of Multiple Working Hypotheses

    NASA Astrophysics Data System (ADS)

    Harrigan, Shaun

    2017-04-01

    The methods we have developed for managing our long-term water supply and protection from extreme hydrological events such as droughts and floods have been founded on the assumption that the hydrological cycle operates under natural conditions. However, it is increasingly recognised that humans have the potential to induce significant change in almost every component of the hydrological cycle, for example through climate change, land-use change, and river engineering. Statistical detection of change in streamflow, outside that of natural variability, is an important scientific endeavour, but it does not tell us anything about the drivers of change. Attribution is the process of establishing the most likely cause(s) of a detected change - the why. Attribution is complex due to the integrated nature of streamflow and the proliferation of multiple possible drivers. It is perhaps this complexity, combined with few proven theoretical approaches to this problem in hydrology, that has led others to call for "more efforts and scientific rigour" (Merz et al., 2012). It is easier to limit the cause of a detected change to a single driver, or to use simple correlation analysis alone as evidence of causation. It is convenient when the direction of a change in streamflow is consistent with what is expected from a well-known driver such as climate change. Over a century ago, Thomas Chamberlin argued that these types of issues were common in many disciplines, given how the scientific method is approached in general. His 1890 article introduces the Method of Multiple Working Hypotheses (MMWH) in an attempt to limit our confirmation bias and strive for increased objectivity. This presentation will argue that the MMWH offers an attractive theoretical approach to the attribution of hydrological change in modern hydrology, as demonstrated through a case study of a well-documented change point in streamflow within the Boyne Catchment in Ireland. Further Reading Chamberlin, T. 
C.: The Method of Multiple

  8. Detection method for dissociation of multiple-charged ions

    DOEpatents

    Smith, Richard D.; Udseth, Harold R.; Rockwood, Alan L.

    1991-01-01

    Dissociations of multiple-charged ions are detected and analyzed by charge-separation tandem mass spectrometry. Analyte molecules are ionized to form multiple-charged parent ions. A particular parent ion charge state is selected in a first-stage mass spectrometer and its mass-to-charge ratio (M/Z) is detected to determine its mass and charge. The selected parent ions are then dissociated, each into a plurality of fragments including a set of daughter ions, each having a mass of at least one molecular weight and a charge of at least one. Sets of daughter ions resulting from the dissociation of one parent ion (sibling ions) vary in number but typically include two to four ions, one or more of them multiply charged. A second-stage mass spectrometer detects the mass-to-charge ratio (m/z) of the daughter ions and a temporal or temporo-spatial relationship among them. This relationship is used to correlate the daughter ions to determine which (m/z) ratios belong to a set of sibling ions. Values of mass and charge of each of the sibling ions are determined simultaneously from their respective (m/z) ratios such that the sibling ion charges are integers and sum to the parent ion charge.
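The simultaneous mass/charge assignment described above can be illustrated with a small integer search: given the parent mass and charge and the measured m/z of a set of sibling fragments, find integer charges that sum to the parent charge and reproduce the parent mass. This is a simplified sketch (adduct masses are neglected, and the example values are hypothetical):

```python
from itertools import product

def assign_charges(parent_mass, parent_charge, mz_values, tol=0.01):
    """Search integer sibling charges consistent with the parent ion."""
    n = len(mz_values)
    for charges in product(range(1, parent_charge + 1), repeat=n):
        if sum(charges) != parent_charge:
            continue
        mass = sum(mz * z for mz, z in zip(mz_values, charges))
        if abs(mass - parent_mass) < tol:
            return charges
    return None

# Hypothetical example: a 10 kDa parent at charge 7 splits into two siblings
# of masses 4000 (z = 3) and 6000 (z = 4).
mz = [10000 * 0.4 / 3, 10000 * 0.6 / 4]
print(assign_charges(10000.0, 7, mz))   # -> (3, 4)
```

In practice the temporal correlation step described above is what groups daughters into a sibling set before a constraint like this can be applied.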

  9. Photometric Detection of Multiple Populations in Globular Clusters Using Integrated Light

    NASA Astrophysics Data System (ADS)

    Bowman, William P.; Pilachowski, Catherine A.; van Zee, Liese; Winans, Amanda; Ciardullo, Robin; Gronwall, Caryl

    2017-10-01

    We investigate the multiple stellar populations of the globular clusters (GCs) M3, M5, M13, and M71 using {g}{\\prime } and intermediate-band CN-λ 3883 photometry obtained with the WIYN 0.9 m telescope on Kitt Peak. We find a strong correlation between red giant stars’ CN-{g}{\\prime } colors and their spectroscopic sodium abundances, thus demonstrating the efficacy of the two-filter system for stellar population studies. In all four clusters, the observed spread in red giant branch CN-{g}{\\prime } colors is wider than that expected from photometric uncertainty, confirming the well-known chemical inhomogeneity of these systems. M3 and M13 show clear evidence for a radial dependence in the CN-band strengths of their red giants, while the evidence for such a radial dependence of CN strengths in M5 is ambiguous. Our data suggest that the dynamically old, relatively metal-rich M71 system is well mixed, as it shows no evidence for chemical segregation. Finally, we measure the radial gradients in the integrated CN-{g}{\\prime } color of the clusters and find that such gradients are easily detectable in the integrated light. We suggest that photometric observations of color gradients within GCs throughout the Local Group can be used to characterize their multiple populations, and thereby constrain the formation history of GCs in different galactic environments.

  10. Linear Multistep Methods for Integrating Reversible Differential Equations

    NASA Astrophysics Data System (ADS)

    Evans, N. Wyn; Tremaine, Scott

    1999-10-01

    This paper studies multistep methods for the integration of reversible dynamical systems, with particular emphasis on the planar Kepler problem. It has previously been shown by Cano & Sanz-Serna that reversible linear multisteps for first-order differential equations are generally unstable. Here we report on a subset of these methods-the zero-growth methods-that evade these instabilities. We provide an algorithm for identifying these rare methods. We find and study all zero-growth, reversible multisteps with six or fewer steps. This select group includes two well-known second-order multisteps (the trapezoidal and explicit midpoint methods), as well as three new fourth-order multisteps-one of which is explicit. Variable time steps can be readily implemented without spoiling the reversibility. Tests on Keplerian orbits show that these new reversible multisteps work well on orbits with low or moderate eccentricity, although at least 100 steps per radian are required for stability.
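The explicit midpoint rule named above, y_{n+1} = y_{n-1} + 2h f(y_n), is one of the reversible second-order multisteps. A minimal sketch (not the authors' code) applies it to a circular planar Kepler orbit, taking the second starting value from the exact circular solution:

```python
import numpy as np

def f(y):
    """Kepler right-hand side: state y = (x1, x2, v1, v2)."""
    x, v = y[:2], y[2:]
    return np.concatenate([v, -x / np.linalg.norm(x) ** 3])

h = 0.01                                   # 100 steps per radian, as suggested above
y_prev = np.array([1.0, 0.0, 0.0, 1.0])    # exact circular orbit at t = 0
y_curr = np.array([np.cos(h), np.sin(h), -np.sin(h), np.cos(h)])  # at t = h

# Explicit midpoint multistep: y_{n+1} = y_{n-1} + 2 h f(y_n).
for _ in range(int(2 * np.pi / h)):        # roughly one orbital period
    y_prev, y_curr = y_curr, y_prev + 2 * h * f(y_curr)

radius = np.linalg.norm(y_curr[:2])
print(abs(radius - 1.0) < 1e-2)            # orbit radius stays near 1
```

The method's time-reversibility (swapping y_prev and y_curr and negating h retraces the trajectory) is what suppresses the secular drift that plagues generic multisteps on reversible systems.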

  11. Inclusion of Separation in Integral Boundary Layer Methods

    NASA Astrophysics Data System (ADS)

    Wallace, Brodie; O'Neill, Charles

    2016-11-01

    An integral boundary layer (IBL) method coupled with a potential flow solver allows aerodynamic flows to be simulated quickly, so aircraft geometries can be rapidly designed and optimized. However, most current IBL methods cannot accurately model three-dimensional separated flows. Various IBL equations and closure relations were investigated in an effort to develop an IBL method capable of modeling separation. Solution techniques, including Newton's method and the iterative linear solver GMRES, as well as methods for coupling an IBL with a potential flow solver, were also investigated. Results for two-dimensional attached flow, as well as methods for extending an IBL to model three-dimensional separation, are presented. Funding from NSF REU site Grant EEC 1358991 is greatly appreciated.

  12. Multi-channel detector readout method and integrated circuit

    DOEpatents

    Moses, William W.; Beuville, Eric; Pedrali-Noy, Marzio

    2004-05-18

    An integrated circuit which provides multi-channel detector readout from a detector array. The circuit receives multiple signals from the elements of a detector array and compares the sampled amplitudes of these signals against a noise-floor threshold and against one another. A digital signal is generated which corresponds to the location of the highest of these signal amplitudes which exceeds the noise-floor threshold. The digital signal is received by a multiplexing circuit which outputs an analog signal corresponding to the highest of the input signal amplitudes. In addition, a digital control section provides for programmatic control of the multiplexer circuit, amplifier gain, amplifier reset, masking selection, and test circuit functionality on each input thereof.

  13. Multi-channel detector readout method and integrated circuit

    SciTech Connect

    Moses, William W.; Beuville, Eric; Pedrali-Noy, Marzio

    2006-12-12

    An integrated circuit which provides multi-channel detector readout from a detector array. The circuit receives multiple signals from the elements of a detector array and compares the sampled amplitudes of these signals against a noise-floor threshold and against one another. A digital signal is generated which corresponds to the location of the highest of these signal amplitudes which exceeds the noise-floor threshold. The digital signal is received by a multiplexing circuit which outputs an analog signal corresponding to the highest of the input signal amplitudes. In addition, a digital control section provides for programmatic control of the multiplexer circuit, amplifier gain, amplifier reset, masking selection, and test circuit functionality on each input thereof.

  14. [Application of multiple seasonal autoregressive integrated moving average model in predicting the mumps incidence].

    PubMed

    Hui, Shisheng; Chen, Lizhang; Liu, Fuqiang; Ouyang, Yanhao

    2015-12-01

    To establish a multiple seasonal autoregressive integrated moving average (ARIMA) model of the mumps incidence in Hunan province, and to use the model to predict the mumps incidence in Hunan province from May 2015 to April 2016. The data were downloaded from the "Disease Surveillance Information Reporting Management System" in the China Information System for Disease Control and Prevention. The monthly incidence of mumps in Hunan province from January 2004 to April 2015 was collected according to the onset date, including clinically diagnosed and laboratory-confirmed cases. The ARIMA model in SPSS 18.0 was used for predictive analysis: the model was established on the monthly incidence of mumps from January 2004 to April 2014, the data from May 2014 to April 2015 were used as the testing sample, and the Box-Ljung Q test was used to test the residuals of the selected model. Finally, the monthly incidence of mumps from May 2015 to April 2016 was predicted by the model. During January 2004 to April 2014 in Hunan province, the peak months of mumps incidence were May to July every year, with a secondary peak from November to January of the following year. After smoothing of the data sequence, model identification, establishment and diagnosis, the ARIMA(2,1,1) × (0,1,1)(12) model was established. The Box-Ljung Q test gave Q=8.40, P=0.868, so the residual sequence was white noise, the information in the data was extracted completely, and the model was reasonable. The R(2) value of the model fit was 0.871 and the BIC value was -1.646, while the mean absolute error between the predicted and actual values was 0.025/100 000 and the mean relative error was 13.004%. The relative error of the model in predicting the mumps incidence in Hunan province was small, and the prediction results were reliable. The ARIMA(2,1,1) × (0,1,1)(12) model was used to predict the mumps incidence in Hunan province from May 2015 to April 2016.
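The differencing step in a seasonal ARIMA model with d=1, D=1, s=12 applies the operator (1 - B)(1 - B^12), removing both trend and the 12-month seasonality before the ARMA part is fitted. A minimal sketch on simulated (not surveillance) data:

```python
import numpy as np

def seasonal_difference(x, d=1, D=1, s=12):
    """Apply (1 - B)^d then (1 - B^s)^D to a monthly series."""
    for _ in range(d):
        x = x[1:] - x[:-1]          # ordinary first difference
    for _ in range(D):
        x = x[s:] - x[:-s]          # seasonal difference at lag s
    return x

months = np.arange(120)
trend = 0.05 * months
season = np.sin(2 * np.pi * months / 12)   # annual incidence cycle
series = trend + season

diffed = seasonal_difference(series)
# A deterministic linear trend plus an exact 12-month cycle is annihilated.
print(np.max(np.abs(diffed)) < 1e-9)
```

On real incidence data the differenced series would not vanish; it would be the stationary residual series to which the ARMA(2,1)×(0,1) structure above is fitted.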

  15. A Microfluidic Localized, Multiple Cell Culture Array using Vacuum Actuated Cell Seeding: Integrated Anticancer Drug Testing

    PubMed Central

    Gao, Yan; Li, Peng

    2013-01-01

    In this study, we introduced a novel and convenient approach to culture multiple cells in localized arrays of microfluidic chambers using one-step vacuum actuation. In one device, we integrated 8 individually addressable regions of culture chambers, each requiring only one simple vacuum operation to seed cell lines. Four cell lines were seeded in designated regions in one device via sequential injection with high purity (99.9%-100%) and cultured long-term. The on-chip simultaneous culture of HuT 78, Ramos, PC-3 and C166-GFP cells for 48 h was demonstrated with viabilities of 92%±2%, 94%±4%, 96%±2% and 97%±2%, respectively. The longest culture period for C166-GFP cells in this study was 168 h, with a viability of 96%±10%. Cell proliferation in each individual side channel can be tracked. Mass transport between the main channel and side channels was achieved through diffusion and studied using fluorescein solution. The main advantage of this device is the capability to perform multiple cell-based assays on the same device for better comparative studies. After treating cells with staurosporine or anti-human CD95 for 16 h, the apoptotic cell percentages of HuT 78, CCRF-CEM, PC-3 and Ramos cells were 36%±3%, 24%±4%, 12%±2%, 18%±4% for staurosporine, and 63%±2%, 45%±1%, 3%±3%, 27%±12% for anti-human CD95, respectively. With the advantages of enhanced integration, ease of use and fabrication, and flexibility, this device will be suitable for long-term multiple cell monitoring and cell-based assays. PMID:23813077

  16. A novel generic optimization method for irrigation scheduling under multiple objectives and multiple hierarchical layers in a canal network

    NASA Astrophysics Data System (ADS)

    Delgoda, Dilini; Malano, Hector; Saleem, Syed K.; Halgamuge, Malka N.

    2017-07-01

    This research proposes a novel generic method for irrigation scheduling in a canal network to optimize multiple objectives related to canal scheduling (e.g. maximizing water supply and minimizing imbalance of water distribution) within multiple hierarchical layers (e.g. the layers consisting of the main canal, distributaries) while utilizing traditional canal scheduling methods. It is based on modularizing the optimization process. The method is theoretically capable of optimizing an unlimited number of user-defined objectives within an unlimited number of hierarchical layers and only limited by resource availability (e.g. maximum canal capacity and water limitations) in the network. It allows flexible decision-making through quantification of the mutual effects of optimizing conflicting objectives and is adaptable to available multi-objective evolutionary algorithms. The method's application is demonstrated using a hypothetical canal network example with six objectives and three hierarchical layers, and a real scenario with four objectives and two layers.
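Multi-objective scheduling like the method above rests on Pareto dominance: a candidate schedule survives only if no other schedule is at least as good on every objective and strictly better on one. A minimal sketch with hypothetical objective values (both to be minimized):

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Keep only the non-dominated candidates."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q is not p)]

# Hypothetical (supply shortfall, distribution imbalance) for four schedules.
candidates = [(3.0, 1.0), (2.0, 2.0), (4.0, 4.0), (1.0, 3.0)]
print(pareto_front(candidates))   # the (4.0, 4.0) schedule is dominated
```

A multi-objective evolutionary algorithm, as mentioned above, applies this filter generation after generation, and the resulting front quantifies the trade-off between conflicting objectives that the decision-maker then weighs.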

  17. Cumulative health risk assessment: integrated approaches for multiple contaminants, exposures, and effects

    SciTech Connect

    Rice, Glenn; Teuschler, Linda; MacDonel, Margaret; Butler, Jim; Finster, Molly; Hertzberg, Rick; Harou, Lynne

    2007-07-01

    Available in abstract form only. Full text of publication follows: As information about environmental contamination has increased in recent years, so has public interest in the combined effects of multiple contaminants. This interest has been highlighted by recent tragedies such as the World Trade Center disaster and hurricane Katrina. In fact, assessing multiple contaminants, exposures, and effects has long been an issue for contaminated sites, including U.S. Department of Energy (DOE) legacy waste sites. Local citizens have explicitly asked the federal government to account for cumulative risks, with contaminants moving offsite via groundwater flow, surface runoff, and air dispersal being a common emphasis. Multiple exposures range from ingestion and inhalation to dermal absorption and external gamma irradiation. Three types of concerns can lead to cumulative assessments: (1) specific sources or releases - e.g., industrial facilities or accidental discharges; (2) contaminant levels - in environmental media or human tissues; and (3) elevated rates of disease - e.g., asthma or cancer. The specific initiator frames the assessment strategy, including a determination of appropriate models to be used. Approaches are being developed to better integrate a variety of data, extending from environmental to internal co-location of contaminants and combined effects, to support more practical assessments of cumulative health risks. (authors)

  18. Encrypting three-dimensional information system based on integral imaging and multiple chaotic maps

    NASA Astrophysics Data System (ADS)

    Xing, Yan; Wang, Qiong-Hua; Xiong, Zhao-Long; Deng, Huan

    2016-02-01

    An encrypting three-dimensional (3-D) information system based on integral imaging (II) and multiple chaotic maps is proposed. In the encrypting process, the elemental image array (EIA), which represents the spatial and angular information of the real 3-D scene, is picked up by a microlens array. Subsequently, the R, G, and B color components decomposed from the EIA are encrypted using multiple chaotic maps. Finally, these three encrypted components are interwoven to obtain the cipher information. The decryption process implements the reverse operation of the encryption process for retrieving the high-quality 3-D images. Since the encrypted EIA has data redundancy due to II, and all parameters of the pickup part serve as secret keys of the encryption system, the sensitivity of the system to changes in the plaintext and secret keys can be significantly improved. Moreover, the algorithm based on multiple chaotic maps can effectively enhance the security. A preliminary experiment is carried out, and the experimental results verify the effectiveness, robustness, and security of the proposed system.
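The chaotic-map stage alone can be sketched in software (the integral-imaging pickup is a physical step): a logistic map keyed by its initial value and control parameter generates a keystream that scrambles one color component, and decryption reruns the same map. This is a generic illustration, not the paper's specific combination of maps:

```python
import numpy as np

def logistic_keystream(n, x0=0.3141, r=3.99):
    """Byte keystream from the logistic map x <- r * x * (1 - x)."""
    x, out = x0, np.empty(n, dtype=np.uint8)
    for i in range(n):
        x = r * x * (1.0 - x)          # chaotic iterate in (0, 1)
        out[i] = int(x * 256) % 256    # quantize to a byte
    return out

def xor_cipher(channel, x0, r):
    """XOR one color component with the keyed keystream (self-inverse)."""
    flat = channel.ravel()
    ks = logistic_keystream(flat.size, x0, r)
    return (flat ^ ks).reshape(channel.shape)

rng = np.random.default_rng(7)
red = rng.integers(0, 256, (16, 16), dtype=np.uint8)   # one color component
cipher = xor_cipher(red, 0.3141, 3.99)
plain = xor_cipher(cipher, 0.3141, 3.99)               # same keys invert it
print(np.array_equal(plain, red))
```

The key sensitivity claimed above comes from the chaotic map's sensitivity to (x0, r): a tiny perturbation of either key yields an uncorrelated keystream and a failed decryption.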

  19. Method for integrating microelectromechanical devices with electronic circuitry

    DOEpatents

    Montague, Stephen; Smith, James H.; Sniegowski, Jeffry J.; McWhorter, Paul J.

    1998-01-01

    A method for integrating one or more microelectromechanical (MEM) devices with electronic circuitry. The method comprises the steps of forming each MEM device within a cavity below a device surface of the substrate; encapsulating the MEM device prior to forming electronic circuitry on the substrate; and releasing the MEM device for operation after fabrication of the electronic circuitry. Planarization of the encapsulated MEM device prior to formation of the electronic circuitry allows the use of standard processing steps for fabrication of the electronic circuitry.

  20. Method for integrating microelectromechanical devices with electronic circuitry

    DOEpatents

    Montague, S.; Smith, J.H.; Sniegowski, J.J.; McWhorter, P.J.

    1998-08-25

    A method is disclosed for integrating one or more microelectromechanical (MEM) devices with electronic circuitry. The method comprises the steps of forming each MEM device within a cavity below a device surface of the substrate; encapsulating the MEM device prior to forming electronic circuitry on the substrate; and releasing the MEM device for operation after fabrication of the electronic circuitry. Planarization of the encapsulated MEM device prior to formation of the electronic circuitry allows the use of standard processing steps for fabrication of the electronic circuitry. 13 figs.

  1. [Integrated use of psychotherapeutic treatment methods in therapy of alcoholism].

    PubMed

    Scholz, H; McCutchan, J

    1998-01-01

    The treatment of alcoholism is more promising than commonly assumed. Its success rests on the acceptance of a long-term treatment concept over a period of approximately two years, on the willingness to differentiate individual treatment courses according to the underlying psychopathology, and on adapting treatment measures to the current phase of recovery. Many years of experience with various psychotherapeutic methods have shown that treatment success depends less on any single method than on their integrated application according to the individual situation. Thus, during treatment, a shift between supportive, confrontational, systemic, and family-therapy-oriented elements can occur.

  2. Synthesis of aircraft structures using integrated design and analysis methods

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, J.; Goetz, R. C.

    1978-01-01

    Systematic research to develop and validate methods for structural sizing of an airframe designed with composite materials and active controls is reported. The research program includes procedures for computing aeroelastic loads, static and dynamic aeroelasticity, analysis and synthesis of active controls, and optimization techniques. Development of the methods is concerned with the most effective ways of integrating and sequencing the procedures in order to generate a structural sizing and an associated active control system that are optimal with respect to a given merit function constrained by strength and aeroelasticity requirements.

  3. Material mechanical characterization method for multiple strains and strain rates

    SciTech Connect

    Erdman, III, Donald L.; Kunc, Vlastimil; Simunovic, Srdjan; Wang, Yanli

    2016-01-19

    A specimen for measuring a material under multiple strains and strain rates. The specimen includes a body having first and second ends and a gage region disposed between the first and second ends, wherein the body has a central, longitudinal axis passing through the first and second ends. The gage region includes a first gage section and a second gage section, wherein the first gage section defines a first cross-sectional area that is defined by a first plane that extends through the first gage section and is perpendicular to the central, longitudinal axis. The second gage section defines a second cross-sectional area that is defined by a second plane that extends through the second gage section and is perpendicular to the central, longitudinal axis, and wherein the first cross-sectional area is different in size from the second cross-sectional area.

  4. Multiple cell radiation detector system, and method, and submersible sonde

    DOEpatents

    Johnson, Larry O.; McIsaac, Charles V.; Lawrence, Robert S.; Grafwallner, Ervin G.

    2002-01-01

    A multiple cell radiation detector includes a central cell having a first cylindrical wall providing a stopping power less than an upper threshold; an anode wire suspended along a cylindrical axis of the central cell; a second cell having a second cylindrical wall providing a stopping power greater than a lower threshold, the second cylindrical wall being mounted coaxially outside of the first cylindrical wall; a first end cap forming a gas-tight seal at first ends of the first and second cylindrical walls; a second end cap forming a gas-tight seal at second ends of the first and second cylindrical walls; and a first group of anode wires suspended between the first and second cylindrical walls.

  5. Yoga as a method of symptom management in multiple sclerosis.

    PubMed

    Frank, Rachael; Larimore, Jennifer

    2015-01-01

    Multiple Sclerosis (MS) is an immune-mediated process in which the body's immune system damages myelin in the central nervous system (CNS). The onset of this disorder typically occurs in young adults, and it is more common among women. Currently, there is no cure and the long-term disease progression makes symptomatic management critical for maintaining quality of life. Several pharmacotherapeutic agents are approved for treatment, but many patients seek complementary and alternative interventions. Reviews have been conducted regarding broad topics such as mindfulness-based interventions for people diagnosed with MS and the impact of yoga on a range of neurological disorders. The objective of the present review is to examine the potential benefits of yoga for individuals with MS and address its use in managing symptoms including pain, mental health, fatigue, spasticity, balance, bladder control, and sexual function.

  6. Yoga as a method of symptom management in multiple sclerosis

    PubMed Central

    Frank, Rachael; Larimore, Jennifer

    2015-01-01

    Multiple Sclerosis (MS) is an immune-mediated process in which the body's immune system damages myelin in the central nervous system (CNS). The onset of this disorder typically occurs in young adults, and it is more common among women. Currently, there is no cure and the long-term disease progression makes symptomatic management critical for maintaining quality of life. Several pharmacotherapeutic agents are approved for treatment, but many patients seek complementary and alternative interventions. Reviews have been conducted regarding broad topics such as mindfulness-based interventions for people diagnosed with MS and the impact of yoga on a range of neurological disorders. The objective of the present review is to examine the potential benefits of yoga for individuals with MS and address its use in managing symptoms including pain, mental health, fatigue, spasticity, balance, bladder control, and sexual function. PMID:25983675

  7. Integration of multiple determinants in the neuronal computation of economic values.

    PubMed

    Raghuraman, Anantha P; Padoa-Schioppa, Camillo

    2014-08-27

    Economic goods may vary on multiple dimensions (determinants). A central conjecture in decision neuroscience is that choices between goods are made by comparing subjective values computed through the integration of all relevant determinants. Previous work identified three groups of neurons in the orbitofrontal cortex (OFC) of monkeys engaged in economic choices: (1) offer value cells, which encode the value of individual offers; (2) chosen value cells, which encode the value of the chosen good; and (3) chosen juice cells, which encode the identity of the chosen good. In principle, these populations could be sufficient to generate a decision. Critically, previous work did not assess whether offer value cells (the putative input to the decision) indeed encode subjective values as opposed to physical properties of the goods, and/or whether offer value cells integrate multiple determinants. To address these issues, we recorded from the OFC while monkeys chose between risky outcomes. Confirming previous observations, three populations of neurons encoded the value of individual offers, the value of the chosen option, and the value-independent choice outcome. The activity of both offer value cells and chosen value cells encoded values defined by the integration of juice quantity and probability. Furthermore, both populations reflected the subjective risk attitude of the animals. We also found additional groups of neurons encoding the risk associated with a particular option, the risky nature of the chosen option, and whether the trial outcome was positive or negative. These results provide substantial support for the conjecture described above and for the involvement of OFC in good-based decisions.
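    The core idea of integrating multiple determinants (here, juice quantity and probability) into one subjective value, modulated by risk attitude, can be sketched abstractly. This toy function and its `rho` parameter are illustrative assumptions, not the model fitted to the neuronal data in the study.

    ```python
    def subjective_value(quantity, probability, rho=1.0):
        """Toy integration of two determinants into one subjective value.
        rho = 1 is risk neutral (expected value); rho > 1 underweights risky
        offers (risk averse); rho < 1 overweights them (risk seeking)."""
        return quantity * probability ** rho
    ```

    For example, a risk-averse agent (`rho > 1`) values a 50% gamble on 10 units below its expected value of 5.
    
    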

  8. The Persian version of Community Integration Questionnaire in persons with multiple sclerosis: translation, reliability, validity, and factor analysis.

    PubMed

    Negahban, Hossein; Fattahizadeh, Parastoo; Ghasemzadeh, Roya; Salehi, Reza; Majdinasab, Nastaran; Mazaheri, Masood

    2013-08-01

    To culturally translate and validate the Community Integration Questionnaire (CIQ) in persons with multiple sclerosis (MS). After a forward-backward translation, 105 persons with MS completed the Persian versions of the CIQ and MS Quality of Life (MSQOL) questionnaires in the first visit. The CIQ was re-administered to a sample of 45 persons with MS 7-10 days after the first session. Test-retest reliability and internal consistency were assessed using the intraclass correlation coefficient (ICC) and Cronbach's α coefficient, respectively. Construct validity was assessed by measuring associations between subscales of the Persian CIQ (Home Integration (HI), Social Integration (SI), and Productivity (P)) and the MSQOL (Physical and Mental Components). Dimensionality was assessed through two methods: corrected item-subscale correlation and factor analysis. An acceptable level of test-retest reliability (ICC ≥0.70) was obtained for the Persian CIQ. However, a Cronbach's α coefficient of ≥0.70 was seen only for the HI subscale. The correlations between the Persian CIQ and the Physical MSQOL were higher than those between the Persian CIQ and the Mental MSQOL. Most items of the HI and two items of P exceeded the corrected item-subscale Spearman's correlation coefficient of 0.40. A total of four factors were detected; consistent with the item-subscale correlation results, the greatest variability was seen for the SI items, which loaded on different factors. The Persian CIQ seems to be a reliable and valid instrument for monitoring the level of community integration following rehabilitation in persons with MS. Some modifications need to be made to the SI subscale of the Persian CIQ to improve extraction of information regarding community integration of persons with MS.
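    The internal-consistency statistic reported above is straightforward to compute. A minimal sketch of Cronbach's α for an item-score matrix (the ICC computation is omitted; variable names are assumptions):

    ```python
    import numpy as np

    def cronbach_alpha(items):
        """Cronbach's alpha for an (n_subjects, n_items) score matrix:
        alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_var = items.var(axis=0, ddof=1).sum()   # sum of per-item sample variances
        total_var = items.sum(axis=1).var(ddof=1)    # variance of each subject's total
        return k / (k - 1) * (1.0 - item_var / total_var)
    ```

    Perfectly parallel items yield α = 1; the ≥0.70 threshold used in the study is a common rule of thumb for acceptable internal consistency.
    
    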

  9. A multiple indicator, multiple cause method for representing social capital with an application to psychological distress

    NASA Astrophysics Data System (ADS)

    Congdon, Peter

    2010-03-01

    This paper describes a structural equation methodology for obtaining social capital scores for survey subjects from multiple indicators of social support, neighbourhood and trust perceptions, and memberships of organizations. It adjusts for variation that is likely to occur in levels of social capital according to geographic context (e.g. level of area deprivation, geographic region, level of urbanity) and demographic group. Social capital is used as an explanatory factor for psychological distress using data from the 2006 Health Survey for England. A highly significant effect of social capital in reducing the chance of psychiatric caseness is obtained after controlling for other individual and geographic risk factors. Allowing for social capital has considerable effects on the impacts on psychiatric health of other risk factors. In particular, the impact of area deprivation category is much reduced. There is also evidence of significant differentiation in social capital between population categories and geographic contexts.

  10. An Integrative Framework for the Analysis of Multiple and Multimodal Representations for Meaning-Making in Science Education

    ERIC Educational Resources Information Center

    Tang, Kok-Sing; Delgado, Cesar; Moje, Elizabeth Birr

    2014-01-01

    This paper presents an integrative framework for analyzing science meaning-making with representations. It integrates the research on multiple representations and multimodal representations by identifying and leveraging the differences in their units of analysis in two dimensions: timescale and compositional grain size. Timescale considers the…

  11. An Integrative Framework for the Analysis of Multiple and Multimodal Representations for Meaning-Making in Science Education

    ERIC Educational Resources Information Center

    Tang, Kok-Sing; Delgado, Cesar; Moje, Elizabeth Birr

    2014-01-01

    This paper presents an integrative framework for analyzing science meaning-making with representations. It integrates the research on multiple representations and multimodal representations by identifying and leveraging the differences in their units of analysis in two dimensions: timescale and compositional grain size. Timescale considers the…

  12. Method and apparatus for determining material structural integrity

    DOEpatents

    Pechersky, Martin

    1996-01-01

    A non-destructive method and apparatus for determining the structural integrity of materials by combining laser vibrometry with damping analysis techniques to determine the damping loss factor of a material. The method comprises the steps of vibrating the area being tested over a known frequency range and measuring vibrational force and velocity as a function of time over the known frequency range. Vibrational velocity is preferably measured by a laser vibrometer. Measurement of the vibrational force depends on the vibration method. If an electromagnetic coil is used to vibrate a magnet secured to the area being tested, then the vibrational force is determined by the amount of coil current used in vibrating the magnet. If a reciprocating transducer is used to vibrate a magnet secured to the area being tested, then the vibrational force is determined by a force gauge in the reciprocating transducer. Using known vibrational analysis methods, a plot of the drive point mobility of the material over the preselected frequency range is generated from the vibrational force and velocity measurements. The damping loss factor is derived from a plot of the drive point mobility over the preselected frequency range using the resonance dwell method and compared with a reference damping loss factor for structural integrity evaluation.
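    The link between a drive-point mobility peak and the damping loss factor can be illustrated on a simulated single-degree-of-freedom system. This sketch uses the half-power bandwidth relation η ≈ Δω/ωₙ ≈ 2ζ rather than the patent's resonance dwell method, and all parameter values are assumed:

    ```python
    import numpy as np

    # Single-degree-of-freedom sketch: mobility Y(w) = i*w / (k - m*w^2 + i*c*w).
    m, k, zeta = 1.0, 1.0e4, 0.01            # mass, stiffness, damping ratio (assumed)
    wn = np.sqrt(k / m)                      # natural frequency (rad/s)
    c = 2.0 * zeta * np.sqrt(k * m)          # viscous damping coefficient

    w = np.linspace(0.8 * wn, 1.2 * wn, 20001)
    Y = 1j * w / (k - m * w**2 + 1j * c * w)   # drive-point mobility (velocity/force)

    # Half-power bandwidth estimate of the loss factor: eta ~= (w2 - w1) / w_peak.
    mag = np.abs(Y)
    peak = mag.max()
    band = w[mag >= peak / np.sqrt(2.0)]       # frequencies above the -3 dB level
    eta_est = (band[-1] - band[0]) / w[mag.argmax()]
    ```

    For light viscous damping, `eta_est` recovers 2ζ, which is the quantity compared against a reference loss factor in the structural-integrity evaluation.
    
    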

  13. 'Unite and conquer': enhanced prediction of protein subcellular localization by integrating multiple specialized tools

    PubMed Central

    Shen, Yao Qing; Burger, Gertraud

    2007-01-01

    Background Knowing the subcellular location of proteins provides clues to their function as well as the interconnectivity of biological processes. Dozens of tools are available for predicting protein location in the eukaryotic cell. Each tool performs well on certain data sets, but their predictions often disagree for a given protein. Since the individual tools each have particular strengths, we set out to integrate them in a way that optimally exploits their potential. The method we present here is applicable to various subcellular locations, but tailored for predicting whether or not a protein is localized in mitochondria. Knowledge of the mitochondrial proteome is relevant to understanding the role of this organelle in global cellular processes. Results In order to develop a method for enhanced prediction of subcellular localization, we integrated the outputs of available localization prediction tools by several strategies, and tested the performance of each strategy with known mitochondrial proteins. The accuracy obtained (up to 92%) surpasses by far the individual tools. The method of integration proved crucial to the performance. For the prediction of mitochondrion-located proteins, integration via a two-layer decision tree clearly outperforms simpler methods, as it allows emphasis of biologically relevant features such as the mitochondrial targeting peptide and transmembrane domains. Conclusion We developed an approach that enhances the prediction accuracy of mitochondrial proteins by uniting the strength of specialized tools. The combination of machine-learning based integration with biological expert knowledge leads to improved performance. This approach also alleviates the conundrum of how to choose between conflicting predictions. Our approach is easy to implement, and applicable to predicting subcellular locations other than mitochondria, as well as other biological features. 
For a trial of our approach, we provide a webservice for mitochondrial protein
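    The layered integration strategy can be sketched in miniature. The paper's two-layer decision tree is learned from data; the thresholds and feature logic below are hand-written illustrations of the idea that biologically meaningful features (targeting peptide, transmembrane domains) get a dedicated first layer before falling back to the consensus of the individual tools:

    ```python
    def predict_mitochondrial(tool_votes, has_target_peptide, has_tm_domain):
        """Two-layer decision sketch: biologically informative features first,
        then a majority vote over the remaining tools' binary predictions."""
        # Layer 1: emphasize strong biological signals.
        if has_target_peptide and not has_tm_domain:
            return True    # a mitochondrial targeting peptide is strong evidence
        if has_tm_domain and not has_target_peptide:
            return False   # transmembrane domains without a peptide argue against
        # Layer 2: fall back to the consensus of the specialized tools.
        return sum(tool_votes) * 2 > len(tool_votes)
    ```

    This also illustrates how integration resolves conflicting predictions: when the first-layer features are ambiguous, the majority of tools decides.
    
    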

  14. Integrating multiple lines of evidence into historical biogeography hypothesis testing: a Bison bison case study

    PubMed Central

    Metcalf, Jessica L.; Prost, Stefan; Nogués-Bravo, David; DeChaine, Eric G.; Anderson, Christian; Batra, Persaram; Araújo, Miguel B.; Cooper, Alan; Guralnick, Robert P.

    2014-01-01

    One of the grand goals of historical biogeography is to understand how and why species' population sizes and distributions change over time. Multiple types of data drawn from disparate fields, combined into a single modelling framework, are necessary to document changes in a species's demography and distribution, and to determine the drivers responsible for change. Yet truly integrated approaches are challenging and rarely performed. Here, we discuss a modelling framework that integrates spatio-temporal fossil data, ancient DNA, palaeoclimatological reconstructions, bioclimatic envelope modelling and coalescence models in order to statistically test alternative hypotheses of demographic and potential distributional changes for the iconic American bison (Bison bison). Using different assumptions about the evolution of the bioclimatic niche, we generate hypothetical distributional and demographic histories of the species. We then test these demographic models by comparing the genetic signature predicted by serial coalescence against sequence data derived from subfossils and modern populations. Our results supported demographic models that include both climate and human-associated drivers of population declines. This synthetic approach, integrating palaeoclimatology, bioclimatic envelopes, serial coalescence, spatio-temporal fossil data and heterochronous DNA sequences, improves understanding of species' historical biogeography by allowing consideration of both abiotic and biotic interactions at the population level. PMID:24403338

  15. Integrating multiple lines of evidence into historical biogeography hypothesis testing: a Bison bison case study.

    PubMed

    Metcalf, Jessica L; Prost, Stefan; Nogués-Bravo, David; DeChaine, Eric G; Anderson, Christian; Batra, Persaram; Araújo, Miguel B; Cooper, Alan; Guralnick, Robert P

    2014-02-22

    One of the grand goals of historical biogeography is to understand how and why species' population sizes and distributions change over time. Multiple types of data drawn from disparate fields, combined into a single modelling framework, are necessary to document changes in a species's demography and distribution, and to determine the drivers responsible for change. Yet truly integrated approaches are challenging and rarely performed. Here, we discuss a modelling framework that integrates spatio-temporal fossil data, ancient DNA, palaeoclimatological reconstructions, bioclimatic envelope modelling and coalescence models in order to statistically test alternative hypotheses of demographic and potential distributional changes for the iconic American bison (Bison bison). Using different assumptions about the evolution of the bioclimatic niche, we generate hypothetical distributional and demographic histories of the species. We then test these demographic models by comparing the genetic signature predicted by serial coalescence against sequence data derived from subfossils and modern populations. Our results supported demographic models that include both climate and human-associated drivers of population declines. This synthetic approach, integrating palaeoclimatology, bioclimatic envelopes, serial coalescence, spatio-temporal fossil data and heterochronous DNA sequences, improves understanding of species' historical biogeography by allowing consideration of both abiotic and biotic interactions at the population level.

  16. No double-dissociation between optic ataxia and visual agnosia: multiple sub-streams for multiple visuo-manual integrations.

    PubMed

    Pisella, L; Binkofski, F; Lasek, K; Toni, I; Rossetti, Y

    2006-01-01

    The current dominant view of the visual system is marked by the functional and anatomical dissociation between a ventral stream specialised for perception and a dorsal stream specialised for action. The "double-dissociation" between visual agnosia (VA), a deficit of visual recognition, and optic ataxia (OA), a deficit of visuo-manual guidance, considered as resulting from ventral and dorsal damage, respectively, has provided the main argument for this dichotomic view. In the first part of this paper, we show that the currently available empirical data do not suffice to support a double-dissociation between OA and VA. In the second part, we review evidence coming from human neuropsychology and monkey data, which cast further doubts on the validity of a simple double-dissociation between perception and action because they argue for a far more complex organisation with multiple parallel visual-to-motor connections: 1. A dorso-dorsal pathway (involving the most dorsal part of the parietal and pre-motor cortices): for immediate visuo-motor control--with OA as typical disturbance. The latest research about OA is reviewed, showing how these patients exhibit deficits restricted to the most direct and fast visuo-motor transformations. We also propose that mild mirror ataxia, consisting of misreaching errors when the contralesional hand is guided to a visual goal through a mirror, could correspond to OA with an isolated "hand effect". 2. A ventral stream-prefrontal pathway (connections from the ventral visual stream to pre-frontal areas, by-passing the parietal areas): for "mediate" control (involving spatial or temporal transpositions [Rossetti, Y., & Pisella, L. (2003). Mediate responses as direct evidence for intention: Neuropsychology of Not to-, Not now- and Not there-tasks. In S. Johnson (Ed.), Cognitive Neuroscience perspectives on the problem of intentional action (pp. 67-105). MIT Press.])--with VA as typical disturbance. 
Preserved visuo-manual guidance in patients

  17. Efficient Fully Implicit Time Integration Methods for Modeling Cardiac Dynamics

    PubMed Central

    Rose, Donald J.; Henriquez, Craig S.

    2013-01-01

    Implicit methods are well known to have greater stability than explicit methods for stiff systems, but they often are not used in practice due to perceived computational complexity. This paper applies the Backward Euler method and a second-order one-step two-stage composite backward differentiation formula (C-BDF2) to the monodomain equations arising from mathematically modeling the electrical activity of the heart. The C-BDF2 scheme is an L-stable implicit time integration method and is easily implementable. It uses the simplest Forward Euler and Backward Euler methods as fundamental building blocks. The nonlinear system resulting from application of the Backward Euler method to the monodomain equations is solved for the first time by a nonlinear elimination method, which eliminates local and non-symmetric components by using a Jacobian-free Newton solver (a Newton-Krylov solver). Unlike other fully implicit methods proposed for the monodomain equations in the literature, the Jacobian of the global system after the nonlinear elimination has much smaller size, is symmetric, and is possibly positive definite, so it can be solved efficiently by standard optimal solvers. Numerical results demonstrate that the C-BDF2 scheme can yield accurate results in less CPU time than explicit methods for both a single patch and spatially extended domains. PMID:19126449
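    The Backward Euler building block with a Newton solve can be demonstrated on a scalar stiff test problem. This is a generic sketch under assumed parameters, not the monodomain discretization or the nonlinear-elimination solver described in the paper:

    ```python
    import numpy as np

    def backward_euler(f, dfdy, y0, t0, t1, n):
        """Backward Euler for y' = f(t, y): at each step, solve the implicit
        update y_{k+1} = y_k + h*f(t_{k+1}, y_{k+1}) with Newton's method."""
        h = (t1 - t0) / n
        t, y = t0, y0
        for _ in range(n):
            t_next = t + h
            z = y                                  # initial Newton guess
            for _ in range(50):
                g = z - y - h * f(t_next, z)       # residual of the implicit equation
                dg = 1.0 - h * dfdy(t_next, z)     # its derivative with respect to z
                step = g / dg
                z -= step
                if abs(step) < 1e-12:
                    break
            t, y = t_next, z
        return y

    # Stiff linear test problem y' = -1000*(y - cos(t)), y(0) = 0; the solution
    # relaxes rapidly onto the slow manifold y ~= cos(t).
    lam = 1000.0
    f = lambda t, y: -lam * (y - np.cos(t))
    dfdy = lambda t, y: -lam
    y_end = backward_euler(f, dfdy, 0.0, 0.0, 1.0, 200)
    ```

    With h = 0.005 (so λh = 5), an explicit method of the same step size would blow up, while Backward Euler tracks the slow solution accurately.
    
    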

  18. Whole-Brain Radiotherapy With Simultaneous Integrated Boost to Multiple Brain Metastases Using Volumetric Modulated Arc Therapy

    SciTech Connect

    Lagerwaard, Frank J. Hoorn, Elles A.P. van der; Verbakel, Wilko; Haasbeek, Cornelis J.A.; Slotman, Ben J.; Senan, Suresh

    2009-09-01

    Purpose: Volumetric modulated arc therapy (RapidArc [RA]; Varian Medical Systems, Palo Alto, CA) allows for the generation of intensity-modulated dose distributions by use of a single gantry rotation. We used RA to plan and deliver whole-brain radiotherapy (WBRT) with a simultaneous integrated boost in patients with multiple brain metastases. Methods and Materials: Composite RA plans were generated for 8 patients, consisting of WBRT (20 Gy in 5 fractions) with an integrated boost, also 20 Gy in 5 fractions, to brain metastases, and clinically delivered in 3 patients. Summated gross tumor volumes were 1.0 to 37.5 cm³. RA plans were measured in a solid water phantom by use of Gafchromic films (International Specialty Products, Wayne, NJ). Results: Composite RA plans could be generated within 1 hour. Two arcs were needed to deliver the mean of 1,600 monitor units with a mean 'beam-on' time of 180 seconds. RA plans showed excellent coverage of the planning target volumes for WBRT and for the boost, with mean volumes receiving at least 95% of the prescribed dose of 100% and 99.8%, respectively. The mean conformity index was 1.36. Composite plans showed much steeper dose gradients outside brain metastases than plans with a conventional summation of WBRT and radiosurgery. Comparison of calculated and measured doses showed a mean gamma for double-arc plans of 0.30, and the area with a gamma larger than 1 was 2%. In-room times for clinical RA sessions were approximately 20 minutes for each patient. Conclusions: RA treatment planning and delivery of integrated plans of WBRT and boosts to multiple brain metastases is a rapid and accurate technique that achieves a higher conformity index than conventional summation of WBRT and radiosurgery boost.

  19. Methods and systems for integrating fluid dispensing technology with stereolithography

    DOEpatents

    Medina, Francisco; Wicker, Ryan; Palmer, Jeremy A.; Davis, Don W.; Chavez, Bart D.; Gallegos, Phillip L.

    2010-02-09

    An integrated system and method of integrating fluid dispensing technologies (e.g., direct-write (DW)) with rapid prototyping (RP) technologies (e.g., stereolithography (SL)) without part registration comprising: an SL apparatus and a fluid dispensing apparatus further comprising a translation mechanism adapted to translate the fluid dispensing apparatus along the X-, Y-, and Z-axes. The fluid dispensing apparatus comprises: a pressurized fluid container; a valve mechanism adapted to control the flow of fluid from the pressurized fluid container; and a dispensing nozzle adapted to deposit the fluid in a desired location. To aid in calibration, the integrated system includes a laser sensor and a mechanical switch. The method further comprises building a second part layer on top of the fluid deposits and optionally accommodating multi-layered circuitry by incorporating a connector trace. Thus, the present invention is capable of efficiently building single- and multi-material SL fabricated parts embedded with complex three-dimensional circuitry using DW.

  20. A robust multiple-locus method for quantitative trait locus analysis of non-normally distributed multiple traits.

    PubMed

    Li, Z; Möttönen, J; Sillanpää, M J

    2015-12-01

    Linear regression-based quantitative trait loci/association mapping methods such as least squares commonly assume normality of residuals. In genetic studies of plants or animals, some quantitative traits may not follow a normal distribution because the data include outlying observations or data collected from multiple sources, and in such cases normal regression methods may lose statistical power to detect quantitative trait loci. In this work, we propose a robust multiple-locus regression approach for analyzing multiple quantitative traits without a normality assumption. In our method, the objective function is least absolute deviation (LAD), which corresponds to the assumption of multivariate Laplace distributed residual errors. This distribution has heavier tails than the normal distribution. In addition, we adopt a group LASSO penalty to produce shrinkage estimation of the marker effects and to describe the genetic correlation among phenotypes. Our LAD-LASSO approach is less sensitive to outliers and is more appropriate for the analysis of data with skewed phenotype distributions. Another application of our robust approach is to the missing-phenotype problem in multiple-trait analysis, where missing phenotype items can simply be filled with extreme values and treated as outliers. The efficiency of the LAD-LASSO approach is illustrated on both simulated and real data sets.
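    The structure of the LAD objective with a group-lasso penalty can be sketched with a plain subgradient method. This is a simplified stand-in for the paper's estimation procedure (fixed step size, assumed hyperparameters), meant only to show how the L1 loss and the per-marker group penalty fit together:

    ```python
    import numpy as np

    def lad_group_lasso(X, Y, lam, lr=1e-3, iters=5000):
        """Subgradient sketch of: minimize ||Y - X B||_1 + lam * sum_j ||B_j||_2,
        where row j of B groups marker j's effects across all traits."""
        p, q = X.shape[1], Y.shape[1]
        B = np.zeros((p, q))
        for _ in range(iters):
            R = Y - X @ B
            grad = -X.T @ np.sign(R)                 # subgradient of the L1 loss
            norms = np.linalg.norm(B, axis=1, keepdims=True)
            grad += lam * np.where(norms > 0, B / np.maximum(norms, 1e-12), 0.0)
            B -= lr * grad
        return B
    ```

    The row-wise (group) penalty shrinks all trait effects of an irrelevant marker toward zero together, which is how the method encodes genetic correlation among phenotypes.
    
    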

  1. Method to assemble and integrate biochemical pathways into the chloroplast genome of Chlamydomonas reinhardtii.

    PubMed

    Noor-Mohammadi, Samaneh; Pourmir, Azadeh; Johannes, Tyler W

    2012-11-01

    Recombinant protein expression in the chloroplasts of green algae has recently become more routine; however, the heterologous expression of multiple proteins or complete biosynthetic pathways remains a significant challenge. Here, we show that a modified DNA Assembler approach can be used to rapidly assemble multiple-gene biosynthetic pathways in yeast and then integrate these assembled pathways at a site-specific location in the chloroplast genome of the microalgal species Chlamydomonas reinhardtii. As a proof of concept, this method was used to successfully integrate and functionally express up to three reporter proteins (AphA6, AadA, and GFP) in the chloroplast of C. reinhardtii. An analysis of the relative gene expression of the engineered strains showed significant differences in the mRNA expression levels of the reporter genes and thus highlights the importance of proper promoter/untranslated region selection when constructing a target pathway. This new method represents a useful genetic tool in the construction and integration of complex biochemical pathways into the chloroplast genome of microalgae and should aid current efforts to engineer algae for biofuels production and other desirable natural products.

  2. Evaluating the Accuracy and Efficiency of Multiple Sequence Alignment Methods

    PubMed Central

    Pervez, Muhammad Tariq; Babar, Masroor Ellahi; Nadeem, Asif; Aslam, Muhammad; Awan, Ali Raza; Aslam, Naeem; Hussain, Tanveer; Naveed, Nasir; Qadri, Salman; Waheed, Usman; Shoaib, Muhammad

    2014-01-01

    A comparison of the 10 most popular Multiple Sequence Alignment (MSA) tools, namely, MUSCLE, MAFFT (L-INS-i), MAFFT (FFT-NS-2), T-Coffee, ProbCons, SATe, Clustal Omega, Kalign, Multalin, and Dialign-TX, is presented. We also focused on the significance of some implementations embedded in the algorithm of each tool. Based on 10 simulated trees with different numbers of taxa generated by R, 400 known alignments and sequence files were constructed using indel-Seq-Gen. A total of 4000 test alignments were generated to study the effect of sequence length, indel size, deletion rate, and insertion rate. Results showed that alignment quality was highly dependent on the number of deletions and insertions in the sequences and that sequence length and indel size had a weaker effect. Overall, ProbCons was consistently at the top of the list of the evaluated MSA tools. SATe, although slightly less accurate, was 529.10% faster than ProbCons and 236.72% faster than MAFFT (L-INS-i). Among the other tools, Kalign and MUSCLE achieved the highest sum-of-pairs scores. We also considered BAliBASE benchmark datasets, and the results relative to the BAliBASE- and indel-Seq-Gen-generated alignments were consistent in most cases. PMID:25574120
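    The sum-of-pairs criterion used to rank the tools is simple to compute. A minimal sketch with an assumed unit match score (real evaluations typically use substitution matrices and affine gap costs):

    ```python
    def sum_of_pairs(alignment, match=1, mismatch=0, gap=0):
        """Sum-of-pairs score of an alignment (list of equal-length strings):
        for each column, score every pair of sequences, skipping gap-gap pairs."""
        score = 0
        for col in range(len(alignment[0])):
            chars = [seq[col] for seq in alignment]
            for i in range(len(chars)):
                for j in range(i + 1, len(chars)):
                    a, b = chars[i], chars[j]
                    if a == '-' and b == '-':
                        continue            # a shared gap carries no information
                    if a == '-' or b == '-':
                        score += gap
                    elif a == b:
                        score += match
                    else:
                        score += mismatch
        return score
    ```

    For example, the three-sequence alignment `["AC-T", "ACGT", "A-GT"]` contributes 3 matched pairs in each fully conserved column and 1 in each column containing a gap.
    
    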

  3. The Multiple-Car Method. Exploring Its Use in Driver and Traffic Safety Education. Second Edition.

    ERIC Educational Resources Information Center

    American Driver and Traffic Safety Education Association, Washington, DC.

    Primarily written for school administrators and driver education teachers, this publication presents information on planning and implementing the multiple car method of driver instruction. An introductory section presents a definition of the multiple car method and its history of development. It is defined as an off-street paved area incorporating…

  4. An Illustration to Assist in Comparing and Remembering Several Multiplicity Adjustment Methods

    ERIC Educational Resources Information Center

    Hasler, Mario

    2017-01-01

    There are many well-known or new methods for adjusting statistical tests for multiplicity. This article provides an illustration that helps lecturers or consultants remember the differences among three important multiplicity adjustment methods and explain them to non-statisticians.
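    As a hedged illustration of what such adjustments look like in practice, here is a minimal sketch of two of the best-known methods, Bonferroni and Holm, applied to invented p values (the article covers its own selection of three methods, which need not be these):

```python
def bonferroni(pvals):
    """Bonferroni: multiply every p value by the number of tests m."""
    m = len(pvals)
    return [min(1.0, p * m) for p in pvals]

def holm(pvals):
    """Holm step-down: the k-th smallest p value is multiplied by (m - k + 1),
    with a running maximum to keep the adjusted values monotone."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    adjusted = [0.0] * m
    running_max = 0.0
    for rank, i in enumerate(order):
        running_max = max(running_max, (m - rank) * pvals[i])
        adjusted[i] = min(1.0, running_max)
    return adjusted

p = [0.01, 0.04, 0.03]
print([round(x, 4) for x in bonferroni(p)])  # → [0.03, 0.12, 0.09]
print([round(x, 4) for x in holm(p)])        # → [0.03, 0.06, 0.06]
```

Holm is uniformly at least as powerful as Bonferroni, as the smaller adjusted values show.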

  5. Minimum Fuel Trajectory Design in Multiple Dynamical Environments Utilizing Direct Transcription Methods and Particle Swarm Optimization

    DTIC Science & Technology

    2016-03-01

    Thesis presented to the Education and Training Command in partial fulfillment of the requirements for the degree of Master of Science in Astronautical Engineering. Author: Alfredo G…

  7. Method and apparatus for determining material structural integrity

    DOEpatents

    Pechersky, M.J.

    1994-01-01

    Disclosed are a nondestructive method and apparatus for determining the structural integrity of materials by combining laser vibrometry with damping analysis to determine the damping loss factor. The method comprises the steps of vibrating the area being tested over a known frequency range and measuring vibrational force and velocity versus time over that range. Vibrational velocity is preferably measured by a laser vibrometer. Measurement of the vibrational force depends on the vibration method: if an electromagnetic coil is used to vibrate a magnet secured to the area being tested, the vibrational force is determined from the coil current; if a reciprocating transducer is used, the vibrational force is determined by a force gauge in the transducer. Using vibrational analysis, a plot of the drive-point mobility of the material over the preselected frequency range is generated from the vibrational force and velocity data. The damping loss factor is derived from this plot using the resonance dwell method and compared with a reference damping loss factor to evaluate structural integrity.
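    The patent derives the loss factor via the resonance dwell method; as a rough numerical illustration of extracting a loss factor from a mobility peak, the sketch below instead uses the standard half-power (-3 dB) bandwidth estimate on a synthetic single-mode resonance (all values invented):

```python
import numpy as np

def loss_factor_half_power(freq, mobility_mag):
    """Estimate the loss factor eta ~ (f2 - f1) / fn from the half-power
    bandwidth around the mobility peak."""
    k = int(np.argmax(mobility_mag))
    fn = freq[k]
    half = mobility_mag[k] / np.sqrt(2.0)
    # walk outward from the peak to the half-power crossings
    left = k
    while left > 0 and mobility_mag[left] > half:
        left -= 1
    right = k
    while right < len(freq) - 1 and mobility_mag[right] > half:
        right += 1
    return (freq[right] - freq[left]) / fn

# Synthetic magnitude response of a lightly damped single mode
f = np.linspace(50.0, 150.0, 2001)
fn, eta_true = 100.0, 0.02
mob = 1.0 / np.sqrt((1 - (f / fn) ** 2) ** 2 + (eta_true * f / fn) ** 2)
print(loss_factor_half_power(f, mob))  # close to eta_true = 0.02
```

In practice the mobility would come from the measured velocity/force spectra rather than an analytic model.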

  8. Integral structural-functional method for characterizing microbial populations

    NASA Astrophysics Data System (ADS)

    Yakushev, A. V.

    2015-04-01

    An original integral structural-functional method is proposed for characterizing microbial communities. The novelty of the approach is the in situ study of microorganisms based on the growth kinetics of microbial associations in liquid nutrient broth media under selective conditions, rather than at the level of taxa or large functional groups. The method involves analyzing the integral growth model of a periodic culture. The kinetic parameters of such associations reflect their capacity to grow on different media, i.e., their physiological diversity, and the metabolic capacity of the microorganisms for growth on a nutrient medium. The obtained parameters are therefore determined by the features of the microbial ecological strategies. Inoculating a dense medium from the original inoculate makes it possible to characterize the taxonomic composition of the dominants in the soil community, while inoculating from the associations developed on selective media characterizes the composition of syntrophic groups, which fulfill a specific function in nature. This method is of greater information value than the classical methods of inoculation on selective media.
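    The abstract does not give the growth model's equations; as a hedged illustration of recovering a kinetic parameter from a growth curve, the sketch below fits the rate constant of a logistic curve, assuming the plateau K is known (all values invented):

```python
import numpy as np

# Logistic growth N(t) = K / (1 + A*exp(-r*t)); with K known (the plateau),
# log(K/N - 1) = log(A) - r*t is linear in t, so r falls out of a line fit.
t = np.linspace(0.0, 10.0, 50)
K, A, r_true = 1.0, 50.0, 1.2
N = K / (1.0 + A * np.exp(-r_true * t))   # synthetic growth curve

y = np.log(K / N - 1.0 + 1e-12)           # linearized form (guarded near N = K)
slope, intercept = np.polyfit(t, y, 1)
print(round(-slope, 2))  # → 1.2, the recovered growth rate
```

With noisy optical-density data one would fit the nonlinear model directly instead of linearizing.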

  9. Integrated Force Method Solution to Indeterminate Structural Mechanics Problems

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Hopkins, Dale A.; Halford, Gary R.

    2004-01-01

    Strength of materials problems have traditionally been classified into determinate and indeterminate problems. Determinate analysis, based primarily on the equilibrium concept, is well understood. Solving indeterminate problems requires additional compatibility conditions, which historically were not fully understood. A solution to an indeterminate problem is typically generated by manipulating the equilibrium concept, either by rewriting it in displacement variables or through the cutting-and-closing-gap technique of the redundant force method. Improvising compatibility in this way makes analysis cumbersome. The authors have researched and clarified the compatibility theory, so that solutions can be generated with equal emphasis on the equilibrium and compatibility concepts. This technique is called the Integrated Force Method (IFM). Forces are the primary unknowns of IFM, and displacements are back-calculated from the forces. The IFM equations can be manipulated to obtain the Dual Integrated Force Method (IFMD), in which displacement is the primary variable and force is back-calculated. The subject is introduced through the response variables (force, deformation, and displacement) and the underlying concepts (the equilibrium equation, the force-deformation relation, the deformation-displacement relation, and the compatibility condition). Mechanical load, temperature variation, and support settlement are equally emphasized. The basic theory is discussed, and a set of examples illustrates the new concepts. IFM- and IFMD-based finite element methods are introduced for simple problems.
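    A minimal sketch of the force-method idea, with forces as the primary unknowns and displacement back-calculated afterwards, for a hypothetical two-spring indeterminate system (invented for illustration, not taken from the paper):

```python
import numpy as np

# Two parallel springs (stiffnesses k1, k2) sharing a load P: one
# equilibrium equation plus one compatibility condition (equal
# elongations) determine the member forces directly.
k1, k2, P = 2.0, 3.0, 10.0
A = np.array([[1.0, 1.0],               # equilibrium:   F1 + F2 = P
              [1.0 / k1, -1.0 / k2]])   # compatibility: F1/k1 - F2/k2 = 0
b = np.array([P, 0.0])
F1, F2 = np.linalg.solve(A, b)
u = F1 / k1   # displacement back-calculated from force
print(F1, F2, u)  # load splits in proportion to stiffness
```

The equal weighting of the equilibrium and compatibility rows mirrors the IFM philosophy described above, in miniature.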

  10. Multiple Revolution Solutions for the Perturbed Lambert Problem using the Method of Particular Solutions and Picard Iteration

    NASA Astrophysics Data System (ADS)

    Woollands, Robyn M.; Read, Julie L.; Probe, Austin B.; Junkins, John L.

    2017-07-01

    We present a new method for solving the multiple-revolution perturbed Lambert problem using the method of particular solutions and modified Chebyshev-Picard iteration. The method of particular solutions differs from the well-known Newton shooting method in that integration of the state transition matrix (36 additional differential equations) is not required; instead, it makes use of a reference trajectory and a set of n particular solutions. Any numerical integrator can be used for solving two-point boundary-value problems with the method of particular solutions; however, we show that using modified Chebyshev-Picard iteration affords an avenue for increased efficiency that is not available with other step-by-step integrators. We take advantage of the path-approximation nature of modified Chebyshev-Picard iteration (nodes iteratively converge to fixed points in space) and utilize a variable-fidelity force model for propagating the reference trajectory. Remarkably, we demonstrate that computing the particular solutions with only low-fidelity function evaluations greatly increases the efficiency of the algorithm while maintaining machine-precision accuracy. Our study reveals that solving the perturbed Lambert problem using the method of particular solutions with modified Chebyshev-Picard iteration is about an order of magnitude faster than the classical shooting method with a tenth/twelfth-order Runge-Kutta integrator. It is well known that the solution to Lambert's problem over multiple revolutions is not unique, and to ensure that all possible solutions are considered we make use of a reliable preexisting Keplerian Lambert solver to warm-start our perturbed algorithm.
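    As a simplified illustration of the Picard fixed-point update underlying modified Chebyshev-Picard iteration (here on a plain grid with trapezoidal quadrature rather than a Chebyshev-polynomial representation, and for a toy scalar ODE):

```python
import numpy as np

# Picard iteration: y_{k+1}(t) = y0 + \int_0^t f(s, y_k(s)) ds.
# Chebyshev-Picard methods apply this same fixed-point update to a
# Chebyshev representation of the whole path; here we use a plain grid.
t = np.linspace(0.0, 0.5, 101)
y = np.ones_like(t)              # initial guess: the constant y0
for _ in range(20):
    f = y                        # toy ODE y' = y, y(0) = 1
    # cumulative trapezoidal integral of f from 0 to each node
    integral = np.concatenate(
        ([0.0], np.cumsum((f[1:] + f[:-1]) / 2 * np.diff(t))))
    y = 1.0 + integral

print(abs(y[-1] - np.exp(0.5)) < 1e-4)  # path converged to exp(t)
```

The key property exploited above, and by the paper at much larger scale, is that each iteration updates the entire path at once rather than stepping through time.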

  11. Comparison of Aurantii Fructus Immaturus and Aurantii Fructus based on multiple chromatographic analysis and chemometrics methods.

    PubMed

    Li, Pei; Zeng, Su-Ling; Duan, Li; Ma, Xiao-Dong; Dou, Li-Li; Wang, Lan-Jin; Li, Ping; Bi, Zhi-Ming; Liu, E-Hu

    2016-10-21

    To gain a better understanding of the bioactive constituents in Aurantii Fructus Immaturus (AFI) and Aurantii Fructus (AF), a comprehensive strategy integrating multiple chromatographic analyses and chemometrics methods was proposed in the present study. Based on segmental monitoring, a high-performance liquid chromatography (HPLC) variable-wavelength detection method was established for simultaneous quantification of ten major flavonoids, and the quantitative data were further analyzed by hierarchical cluster analysis (HCA) and principal component analysis (PCA). A strong cation exchange high-performance liquid chromatography (SCX-HPLC) method combined with the t-test and one-way analysis of variance (ANOVA) was developed to determine synephrine, the major alkaloid in AFI and AF. The essential oils were analyzed by gas chromatography-mass spectrometry (GC-MS) and further processed by partial least squares discriminant analysis (PLS-DA). The results indicated that the contents of the ten flavonoids and synephrine in AFI were significantly higher than those in AF, and that significant differences existed among samples from different geographical origins. In addition, nine differential volatile constituents were detected that could be used as chemical markers for discriminating AFI from AF. Collectively, the proposed comprehensive analysis may be a well-accepted strategy for evaluating the quality of traditional citrus herbs. Copyright © 2016 Elsevier B.V. All rights reserved.
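    A hedged sketch of the PCA step in such a chemometrics workflow, run on an invented toy peak table (not the study's data): samples with similar constituent profiles cluster together in the score space.

```python
import numpy as np

def pca_scores(X, n_components=2):
    """Principal-component scores via SVD of the mean-centered data."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

# Toy "peak table": rows = herb samples, columns = flavonoid peak areas.
# The first two rows mimic one origin, the last two another.
X = np.array([[10.0, 5.0, 2.0],
              [11.0, 5.5, 2.1],
              [ 3.0, 1.0, 0.5],
              [ 3.2, 1.2, 0.4]])
scores = pca_scores(X)
# The two sample groups fall on opposite sides of the first component.
print(scores[:, 0])
```

Real workflows would standardize the peak areas first and inspect loadings to see which constituents drive the separation.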

  12. Comparison of multiple gene assembly methods for metabolic engineering

    Treesearch

    Chenfeng Lu; Karen Mansoorabadi; Thomas Jeffries

    2007-01-01

    A universal, rapid DNA assembly method for efficient multigene plasmid construction is important for biological research and for optimizing gene expression in industrial microbes. Three different approaches to achieve this goal were evaluated. These included creating long complementary extensions using a uracil-DNA glycosylase technique, overlap extension polymerase...

  13. Multiple Enactments of Method, Divergent Hinterlands and Production of Multiple Realities in Educational Research

    ERIC Educational Resources Information Center

    Rimpiläinen, Sanna

    2015-01-01

    What do different research methods and approaches "do" in practice? The article seeks to discuss this point by drawing upon socio-material research approaches and empirical examples taken from the early stages of an extensive case study on an interdisciplinary project between two multidisciplinary fields of study, education and computer…

  15. Integrative analysis of multiple diverse omics datasets by sparse group multitask regression

    PubMed Central

    Lin, Dongdong; Zhang, Jigang; Li, Jingyao; He, Hao; Deng, Hong-Wen; Wang, Yu-Ping

    2014-01-01

    A variety of high-throughput genome-wide assays enable the exploration of genetic risk factors underlying complex traits. Although these studies have had a remarkable impact on identifying susceptible biomarkers, they suffer from issues such as limited sample size and low reproducibility. Combining individual studies from different genetic levels/platforms holds promise for improving the power and consistency of biomarker identification. In this paper, we propose a novel integrative method, namely sparse group multitask regression, for integrating diverse omics datasets, platforms, and populations to identify risk genes/factors of complex diseases. This method combines multitask learning with sparse group regularization, which will: (1) treat the biomarker identification in each single study as a task and then combine them by multitask learning; (2) group variables from all studies for identifying significant genes; (3) enforce a sparsity constraint on groups of variables to overcome the “small sample, but large variables” problem. We introduce two sparse group penalties, sparse group lasso and sparse group ridge, in our multitask model and provide an effective algorithm for each. In addition, we propose a significance test for the identification of potential risk genes. Two simulation studies were performed to evaluate the performance of our integrative method by comparing it with a conventional meta-analysis method. The results show that our sparse group multitask method significantly outperforms the meta-analysis method. In an application to our osteoporosis studies, seven genes were identified as significant by our method and were found to have significant effects in three other independent studies used for validation. The most significant gene, SOD2, had been identified in our previous osteoporosis study involving the same expression dataset. Several other genes, such as TREML2, HTR1E, and GLO1, are shown to be novel susceptibility genes for osteoporosis, as confirmed from other…
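    A minimal sketch of the sparse group regularization idea: the proximal operator for one group combines elementwise soft-thresholding (the lasso part) with group-level shrinkage, so a whole group can be zeroed out or kept sparse within. This is the generic textbook operator, not the authors' algorithm:

```python
import numpy as np

def prox_sparse_group_lasso(v, lam1, lam2):
    """Prox of lam1*||.||_1 + lam2*||.||_2 applied to one group of
    coefficients: soft-threshold elementwise, then shrink the group."""
    u = np.sign(v) * np.maximum(np.abs(v) - lam1, 0.0)  # lasso part
    norm = np.linalg.norm(u)
    if norm <= lam2:
        return np.zeros_like(u)       # whole group zeroed out
    return (1.0 - lam2 / norm) * u    # group-level shrinkage

g = np.array([3.0, -0.5, 1.5])        # invented gradient-step result
print(prox_sparse_group_lasso(g, lam1=1.0, lam2=1.0))
```

In a proximal-gradient solver this operator is applied group by group after each gradient step on the multitask loss.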

  16. A Scalable Bayesian Method for Integrating Functional Information in Genome-wide Association Studies.

    PubMed

    Yang, Jingjing; Fritsche, Lars G; Zhou, Xiang; Abecasis, Gonçalo

    2017-09-07

    Genome-wide association studies (GWASs) have identified many loci associated with complex traits. However, most loci reside in noncoding regions and have unknown biological functions. Integrative analysis that incorporates known functional information into GWASs can help elucidate the underlying biological mechanisms and prioritize important functional variants. Hence, we develop a flexible Bayesian variable selection model with efficient computational techniques for such integrative analysis. Unlike previous approaches, our method models the effect-size distribution and probability of causality for variants with different annotations and jointly models genome-wide variants to account for linkage disequilibrium (LD), thus prioritizing associations based on the quantification of the annotations and allowing for multiple associated variants per locus. Our method dramatically improves both computational speed and posterior sampling convergence by taking advantage of the block-wise LD structures in human genomes. In simulations, our method accurately quantifies the functional enrichment and is more powerful for prioritizing the true associations than alternative methods; the power gain is especially apparent when multiple associated variants in LD reside in the same locus. We applied our method to an in-depth GWAS of age-related macular degeneration with 33,976 individuals and 9,857,286 variants. We find the strongest enrichment for causality among non-synonymous variants (54× more likely to be causal, 1.4× larger effect sizes) and variants in transcription, repressed Polycomb, and enhancer regions, and we identify five additional candidate loci beyond the 32 known AMD risk loci. In conclusion, our method efficiently integrates functional information into GWASs, helping to identify functionally associated variants and the underlying biology. Published by Elsevier Inc.
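    A toy sketch of the core idea behind annotation-aware prioritization: the same association evidence yields a higher posterior inclusion probability under the prior of a more enriched annotation. The priors and Bayes factor below are invented, and the paper's actual joint, LD-aware model is far richer:

```python
# Combine an annotation-specific prior inclusion probability with a
# per-variant Bayes factor via posterior odds.
def posterior_inclusion(prior, bayes_factor):
    odds = (prior / (1.0 - prior)) * bayes_factor
    return odds / (1.0 + odds)

# Identical association evidence (BF = 20), different annotations:
weak = posterior_inclusion(prior=0.001, bayes_factor=20.0)   # e.g. noncoding
strong = posterior_inclusion(prior=0.05, bayes_factor=20.0)  # e.g. non-synonymous
print(weak, strong)  # the enriched annotation is prioritized
```

This single-variant calculation ignores LD; the method described above instead samples over genome-wide configurations of variants.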

  17. The use of the Integrated Discrete Multiple Organ Co-culture (IdMOC) system for the evaluation of multiple organ toxicity.

    PubMed

    Li, Albert P

    2009-09-01

    The application of the Integrated Discrete Multiple Organ Co-culture (IdMOC) system in the evaluation of organ-specific toxicity is reviewed. In vitro approaches to predict in vivo toxicity have met with limited success, mainly because of the complexity of in vivo toxic responses. In vivo properties that are not well-represented in vitro include organ-specific responses, multiple organ metabolism, and multiple organ interactions. The IdMOC system has been developed to address these deficiencies. The system uses a 'wells-within-a-well' concept for the co-culturing of cells or tissue slices from different organs as physically separated (discrete) entities in the small inner wells. These inner wells are nevertheless interconnected (integrated) by overlying culture medium in the large outer containing well. The IdMOC system thereby models the in vivo situation, in which multiple organs are physically separated but interconnected by the systemic circulation, permitting multiple organ interactions. The IdMOC system, with either cells or tissue slices from multiple organs, can be used to evaluate cell type-specific or organ-specific toxicity.

  18. Optical caries diagnostics: comparison of laser spectroscopic PNC method with method of laser integral fluorescence

    NASA Astrophysics Data System (ADS)

    Masychev, Victor I.

    2000-11-01

    We present the results of clinical testing of two methods of optical caries diagnostics: PNC spectral diagnostics and caries detection by laser integral fluorescence. The research was conducted in a dental clinic. The PNC method analyzes parameters of the probing laser radiation and the PNC spectra of stimulated secondary radiation: backscattering and endogenous fluorescence of caries-associated bacteria. A He-Ne laser (λ = 632.8 nm, 1-2 mW) was used as the source of probing (stimulating) radiation, and a PDA detector was used to register signals received from intact and pathological teeth. The PNC spectra were processed by special algorithms and displayed on a PC monitor. The method of laser integral fluorescence was used for comparison; in this case the integral fluorescence power of human teeth was measured. Diode lasers (λ = 655 nm, 0.1 mW and 630 nm, 1 mW) and a He-Ne laser were applied as sources of probing (stimulating) radiation, with a Si photodetector registering the signals and the integral power shown on a digital indicator. The advantages and disadvantages of the two methods are described. The laser integral fluorescence method is simple in construction and circuit design; however, PNC spectral diagnostics is considerably more sensitive in diagnosing initial caries and can differentiate pathologies at various stages (for example, calculus versus initial caries). Estimating the spectral characteristics of PNC signals eliminates a number of drawbacks characteristic of detection by laser integral fluorescence, such as flagging fluorescent fillings, plaques, calculus, discolorations in general, amalgam, and gold fillings as if they were caries.

  19. Hydrologic extremes - an intercomparison of multiple gridded statistical downscaling methods

    NASA Astrophysics Data System (ADS)

    Werner, Arelia T.; Cannon, Alex J.

    2016-04-01

    Gridded statistical downscaling methods are the main means of preparing climate model data to drive distributed hydrological models. Past work on the validation of climate downscaling methods has focused on temperature and precipitation, with less attention paid to the ultimate outputs from hydrological models. Also, as attention shifts towards projections of extreme events, downscaling comparisons now commonly assess methods in terms of climate extremes, but hydrologic extremes are less well explored. Here, we test the ability of gridded downscaling models to replicate historical properties of climate and hydrologic extremes, as measured in terms of temporal sequencing (i.e. correlation tests) and distributional properties (i.e. tests for equality of probability distributions). Outputs from seven downscaling methods - bias correction constructed analogues (BCCA), double BCCA (DBCCA), BCCA with quantile mapping reordering (BCCAQ), bias correction spatial disaggregation (BCSD), BCSD using minimum/maximum temperature (BCSDX), the climate imprint delta method (CI), and bias corrected CI (BCCI) - are used to drive the Variable Infiltration Capacity (VIC) model over the snow-dominated Peace River basin, British Columbia. Outputs are tested using split-sample validation on 26 climate extremes indices (ClimDEX) and two hydrologic extremes indices (3-day peak flow and 7-day peak flow). To characterize observational uncertainty, four atmospheric reanalyses are used as climate model surrogates and two gridded observational data sets are used as downscaling target data. The skill of the downscaling methods generally depended on reanalysis and gridded observational data set. However, CI failed to reproduce the distribution and BCSD and BCSDX the timing of winter 7-day low-flow events, regardless of reanalysis or observational data set. Overall, DBCCA passed the greatest number of tests for the ClimDEX indices, while BCCAQ, which is designed to more accurately resolve event…
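    Several of the methods above (e.g. BCSD, BCCAQ) rely on some form of quantile mapping for bias correction; below is a minimal empirical quantile-mapping sketch with synthetic data, not any of the cited implementations:

```python
import numpy as np

def quantile_map(model_hist, obs_hist, model_fut):
    """Empirical quantile mapping: replace each model value with the
    observed value at the same quantile of the historical model CDF."""
    q = np.interp(model_fut,
                  np.sort(model_hist),
                  np.linspace(0.0, 1.0, len(model_hist)))  # empirical CDF
    return np.quantile(obs_hist, q)

rng = np.random.default_rng(0)
obs = rng.gamma(2.0, 2.0, 1000)   # synthetic "observed" precipitation
mod = obs * 1.3 + 1.0             # biased model of the same climate
corrected = quantile_map(mod, obs, mod)
print(abs(corrected.mean() - obs.mean()) < 0.1)  # bias largely removed
```

Variants like BCCAQ add reordering steps so that the corrected fields also preserve realistic spatial and temporal sequencing, which plain quantile mapping does not.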

  20. Hydrologic extremes - an intercomparison of multiple gridded statistical downscaling methods

    NASA Astrophysics Data System (ADS)

    Werner, A. T.; Cannon, A. J.

    2015-06-01

    Gridded statistical downscaling methods are the main means of preparing climate model data to drive distributed hydrological models. Past work on the validation of climate downscaling methods has focused on temperature and precipitation, with less attention paid to the ultimate outputs from hydrological models. Also, as attention shifts towards projections of extreme events, downscaling comparisons now commonly assess methods in terms of climate extremes, but hydrologic extremes are less well explored. Here, we test the ability of gridded downscaling models to replicate historical properties of climate and hydrologic extremes, as measured in terms of temporal sequencing (i.e., correlation tests) and distributional properties (i.e., tests for equality of probability distributions). Outputs from seven downscaling methods - bias correction constructed analogues (BCCA), double BCCA (DBCCA), BCCA with quantile mapping reordering (BCCAQ), bias correction spatial disaggregation (BCSD), BCSD using minimum/maximum temperature (BCSDX), climate imprint delta method (CI), and bias corrected CI (BCCI) - are used to drive the Variable Infiltration Capacity (VIC) model over the snow-dominated Peace River basin, British Columbia. Outputs are tested using split-sample validation on 26 climate extremes indices (ClimDEX) and two hydrologic extremes indices (3 day peak flow and 7 day peak flow). To characterize observational uncertainty, four atmospheric reanalyses are used as climate model surrogates and two gridded observational datasets are used as downscaling target data. The skill of the downscaling methods generally depended on reanalysis and gridded observational dataset. However, CI failed to reproduce the distribution and BCSD and BCSDX the timing of winter 7 day low flow events, regardless of reanalysis or observational dataset. Overall, DBCCA passed the greatest number of tests for the ClimDEX indices, while BCCAQ, which is designed to more accurately resolve event…