Sample records for restricted partition method

  1. Evolving bipartite authentication graph partitions

    DOE PAGES

    Pope, Aaron Scott; Tauritz, Daniel Remy; Kent, Alexander D.

    2017-01-16

    As large scale enterprise computer networks become more ubiquitous, finding the appropriate balance between user convenience and user access control is an increasingly challenging proposition. Suboptimal partitioning of users’ access and available services contributes to the vulnerability of enterprise networks. Previous edge-cut partitioning methods unduly restrict users’ access to network resources. This paper introduces a novel method of network partitioning superior to the current state-of-the-art which minimizes user impact by providing alternate avenues for access that reduce vulnerability. Networks are modeled as bipartite authentication access graphs and a multi-objective evolutionary algorithm is used to simultaneously minimize the size of large connected components while minimizing overall restrictions on network users. Lastly, results are presented on a real world data set that demonstrate the effectiveness of the introduced method compared to previous naive methods.

  2. Evolving bipartite authentication graph partitions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pope, Aaron Scott; Tauritz, Daniel Remy; Kent, Alexander D.

    As large scale enterprise computer networks become more ubiquitous, finding the appropriate balance between user convenience and user access control is an increasingly challenging proposition. Suboptimal partitioning of users’ access and available services contributes to the vulnerability of enterprise networks. Previous edge-cut partitioning methods unduly restrict users’ access to network resources. This paper introduces a novel method of network partitioning superior to the current state-of-the-art which minimizes user impact by providing alternate avenues for access that reduce vulnerability. Networks are modeled as bipartite authentication access graphs and a multi-objective evolutionary algorithm is used to simultaneously minimize the size of large connected components while minimizing overall restrictions on network users. Lastly, results are presented on a real world data set that demonstrate the effectiveness of the introduced method compared to previous naive methods.

  3. A statistical mechanical approach to restricted integer partition functions

    NASA Astrophysics Data System (ADS)

    Zhou, Chi-Chun; Dai, Wu-Sheng

    2018-05-01

    The main aim of this paper is twofold: (1) suggesting a statistical mechanical approach to the calculation of the generating function of restricted integer partition functions which count the number of partitions—a way of writing an integer as a sum of other integers under certain restrictions. In this approach, the generating function of restricted integer partition functions is constructed from the canonical partition functions of various quantum gases. (2) Introducing a new type of restricted integer partition functions corresponding to general statistics which is a generalization of Gentile statistics in statistical mechanics; many kinds of restricted integer partition functions are special cases of this restricted integer partition function. Moreover, with statistical mechanics as a bridge, we reveal a mathematical fact: the generating function of restricted integer partition function is just the symmetric function which is a class of functions being invariant under the action of permutation groups. Using this approach, we provide some expressions of restricted integer partition functions as examples.
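
    The generating-function viewpoint can be made concrete with a small sketch (plain dynamic programming, not the statistical-mechanical construction of the paper): expanding prod_{j=1..k} 1/(1 - q^j) term by term counts the partitions of n whose parts are at most k.

```python
# Count restricted partitions of n with parts of size at most k by expanding
# the generating function prod_{j=1..k} 1/(1 - q^j) up to order q^N.
# Illustrative sketch only; the paper builds these generating functions from
# canonical partition functions of quantum gases instead.

def restricted_partition_counts(N, k):
    """coeffs[n] = number of partitions of n into parts <= k."""
    coeffs = [0] * (N + 1)
    coeffs[0] = 1                      # the empty partition of 0
    for part in range(1, k + 1):       # multiply by 1/(1 - q^part)
        for n in range(part, N + 1):
            coeffs[n] += coeffs[n - part]
    return coeffs

if __name__ == "__main__":
    # Partitions of n = 0..10 into parts <= 3: 1, 1, 2, 3, 4, 5, 7, 8, 10, 12, 14
    print(restricted_partition_counts(10, 3))
    # With k >= n this reduces to the unrestricted partition numbers p(n).
    print(restricted_partition_counts(10, 10))
```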

  4. Cylindric partitions, W_r characters and the Andrews-Gordon-Bressoud identities

    NASA Astrophysics Data System (ADS)

    Foda, O.; Welsh, T. A.

    2016-04-01

    We study the Andrews-Gordon-Bressoud (AGB) generalisations of the Rogers-Ramanujan q-series identities in the context of cylindric partitions. We recall the definition of r-cylindric partitions, and provide a simple proof of Borodin’s product expression for their generating functions, that can be regarded as a limiting case of an unpublished proof by Krattenthaler. We also recall the relationships between the r-cylindric partition generating functions, the principal characters of \hat{sl}_r algebras, the M_r^{r,r+d} minimal model characters of W_r algebras, and the r-string abaci generating functions, providing simple proofs for each. We then set r = 2, and use two-cylindric partitions to re-derive the AGB identities as follows. Firstly, we use Borodin’s product expression for the generating functions of the two-cylindric partitions with infinitely long parts, to obtain the product sides of the AGB identities, times a factor (q;q)_∞^{-1}, which is the generating function of ordinary partitions. Next, we obtain a bijection from the two-cylindric partitions, via two-string abaci, into decorated versions of Bressoud’s restricted lattice paths. Extending Bressoud’s method of transforming between restricted paths that obey different restrictions, we obtain sum expressions with manifestly non-negative coefficients for the generating functions of the two-cylindric partitions, which contain a factor (q;q)_∞^{-1}. Equating the product and sum expressions of the same two-cylindric partitions, and canceling a factor of (q;q)_∞^{-1} on each side, we obtain the AGB identities.
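
    As a minimal numerical companion to the above, the sketch below checks the first Rogers-Ramanujan identity, the simplest member of the family that the AGB identities generalise, order by order in q using plain truncated power-series arithmetic; it is an independent consistency check, not the cylindric-partition proof of the paper.

```python
# Numerically verify the first Rogers-Ramanujan identity up to order q^N:
#   sum_{n>=0} q^{n^2} / ((1-q)(1-q^2)...(1-q^n))
#     = prod_{n>=0} 1 / ((1-q^{5n+1})(1-q^{5n+4}))
# This is the prototype of the Andrews-Gordon-Bressoud family discussed above.

N = 40  # truncation order

def mul(a, b):
    """Multiply two truncated power series (coefficient lists up to q^N)."""
    out = [0] * (N + 1)
    for i, ai in enumerate(a):
        if ai:
            for j in range(0, N + 1 - i):
                out[i + j] += ai * b[j]
    return out

def inv_one_minus_qk(k):
    """Series 1/(1 - q^k) = 1 + q^k + q^{2k} + ... truncated at q^N."""
    s = [0] * (N + 1)
    for m in range(0, N + 1, k):
        s[m] = 1
    return s

# Left-hand side: sum over n of q^{n^2} / (q;q)_n
lhs = [0] * (N + 1)
lhs[0] += 1                               # n = 0 term
term = [1] + [0] * N                      # running value of 1/(q;q)_n
n = 1
while n * n <= N:
    term = mul(term, inv_one_minus_qk(n))     # divide by (1 - q^n)
    for i in range(N + 1 - n * n):
        lhs[i + n * n] += term[i]             # shift by q^{n^2} and accumulate
    n += 1

# Right-hand side: product over parts congruent to 1 or 4 modulo 5.
rhs = [1] + [0] * N
for k in range(1, N + 1):
    if k % 5 in (1, 4):
        rhs = mul(rhs, inv_one_minus_qk(k))

assert lhs == rhs
print("Rogers-Ramanujan identity verified up to q^%d" % N)
print(lhs[:11])
```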

  5. A multiscale restriction-smoothed basis method for high contrast porous media represented on unstructured grids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Møyner, Olav, E-mail: olav.moyner@sintef.no; Lie, Knut-Andreas, E-mail: knut-andreas.lie@sintef.no

    2016-01-01

    A wide variety of multiscale methods have been proposed in the literature to reduce runtime and provide better scaling for the solution of Poisson-type equations modeling flow in porous media. We present a new multiscale restriction-smoothed basis (MsRSB) method that is designed to be applicable to both rectilinear grids and unstructured grids. Like many other multiscale methods, MsRSB relies on a coarse partition of the underlying fine grid and a set of local prolongation operators (multiscale basis functions) that map unknowns associated with the fine grid cells to unknowns associated with blocks in the coarse partition. These mappings are constructed by restricted smoothing: Starting from a constant, a localized iterative scheme is applied directly to the fine-scale discretization to compute prolongation operators that are consistent with the local properties of the differential operators. The resulting method has three main advantages: First of all, both the coarse and the fine grid can have general polyhedral geometry and unstructured topology. This means that partitions and good prolongation operators can easily be constructed for complex models involving high media contrasts and unstructured cell connections introduced by faults, pinch-outs, erosion, local grid refinement, etc. In particular, the coarse partition can be adapted to geological or flow-field properties represented on cells or faces to improve accuracy. Secondly, the method is accurate and robust when compared to existing multiscale methods and does not need expensive recomputation of local basis functions to account for transient behavior: Dynamic mobility changes are incorporated by continuing to iterate a few extra steps on existing basis functions. This way, the cost of updating the prolongation operators becomes proportional to the amount of change in fluid mobility and one reduces the need for expensive, tolerance-based updates. Finally, since the MsRSB method is formulated on top of a cell-centered, conservative, finite-volume method, it is applicable to any flow model in which one can isolate a pressure equation. Herein, we only discuss single and two-phase incompressible models. Compressible flow, e.g., as modeled by the black-oil equations, is discussed in a separate paper.
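
    A one-dimensional toy version of restricted smoothing can convey the idea (this is a hypothetical simplification, not the authors' implementation): basis functions start as indicators of the coarse blocks, are improved by damped Jacobi iterations on the fine-scale operator, and are repeatedly restricted to local supports and rescaled so they remain a partition of unity.

```python
# 1D sketch of restricted-smoothing prolongation in the spirit of MsRSB
# (hypothetical simplification): indicator basis functions are smoothed by
# damped Jacobi on the fine-scale operator, truncated to local supports, and
# renormalized so the basis functions sum to one in every fine cell.
import numpy as np

nf, nc = 60, 6                       # fine cells, coarse blocks
block = np.repeat(np.arange(nc), nf // nc)

# Fine-scale 1D Poisson operator with a high-contrast coefficient field.
rng = np.random.default_rng(0)
kappa = 10.0 ** rng.uniform(-2, 2, nf + 1)   # face transmissibilities
A = np.zeros((nf, nf))
for i in range(nf - 1):
    t = kappa[i + 1]
    A[i, i] += t; A[i + 1, i + 1] += t
    A[i, i + 1] -= t; A[i + 1, i] -= t
A += 1e-8 * np.eye(nf)               # regularize the pure Neumann operator

# Support of each basis function: its own block plus immediate neighbours.
support = np.zeros((nf, nc), dtype=bool)
for j in range(nc):
    support[:, j] = np.isin(block, [j - 1, j, j + 1])

# Initial prolongation: constant 1 inside each coarse block.
P = (block[:, None] == np.arange(nc)[None, :]).astype(float)

D = np.diag(A)
omega = 2.0 / 3.0
for _ in range(100):                           # restricted smoothing iterations
    P = P - omega * (A @ P) / D[:, None]       # damped Jacobi step
    P[~support] = 0.0                          # restrict to local supports
    P = P / P.sum(axis=1, keepdims=True)       # keep partition of unity

print("rows sum to one:", np.allclose(P.sum(axis=1), 1.0))
print("coarse operator A_c = P^T A P has shape", (P.T @ A @ P).shape)
```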

  6. Decision tree modeling using R.

    PubMed

    Zhang, Zhongheng

    2016-08-01

    In the machine learning field, the decision tree learner is powerful and easy to interpret. It employs a recursive binary partitioning algorithm that splits the sample on the partitioning variable with the strongest association with the response variable. The process continues until some stopping criteria are met. In the example I focus on the conditional inference tree, which incorporates tree-structured regression models into conditional inference procedures. Because a single tree is sensitive to small changes in the training data, the random forest procedure is introduced to address this problem. The sources of diversity for random forests are random sampling and the restricted set of input variables available for selection at each split. Finally, I introduce R functions to perform model-based recursive partitioning, which incorporates recursive partitioning into conventional parametric model building.
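
    The abstract refers to R functions (conditional inference trees, random forests and model-based recursive partitioning); as a rough analogue for readers outside R, the same two ideas of recursive binary partitioning and forest ensembling can be sketched in Python with scikit-learn (CART-style trees, not the conditional-inference machinery of the R packages).

```python
# Rough Python analogue of the workflow described above (the paper itself
# uses R's conditional inference trees and random forests; scikit-learn's
# CART-style trees are used here purely for illustration).
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# A single tree: recursive binary partitioning on the strongest split,
# stopped by simple criteria (here a depth limit).
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_tr, y_tr)
print("single tree accuracy:", tree.score(X_te, y_te))

# A random forest: diversity comes from bootstrap sampling of the rows and a
# restricted set of candidate input variables considered at each split.
forest = RandomForestClassifier(
    n_estimators=200, max_features="sqrt", random_state=0
).fit(X_tr, y_tr)
print("random forest accuracy:", forest.score(X_te, y_te))
```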

  7. Automatic partitioning of head CTA for enabling segmentation

    NASA Astrophysics Data System (ADS)

    Suryanarayanan, Srikanth; Mullick, Rakesh; Mallya, Yogish; Kamath, Vidya; Nagaraj, Nithin

    2004-05-01

    Radiologists perform a CT Angiography procedure to examine vascular structures and associated pathologies such as aneurysms. Volume rendering is used to exploit volumetric capabilities of CT that provides complete interactive 3-D visualization. However, bone forms an occluding structure and must be segmented out. The anatomical complexity of the head creates a major challenge in the segmentation of bone and vessel. An analysis of the head volume reveals varying spatial relationships between vessel and bone that can be separated into three sub-volumes: "proximal", "middle", and "distal". The "proximal" and "distal" sub-volumes contain good spatial separation between bone and vessel (carotid referenced here). Bone and vessel appear contiguous in the "middle" partition, which remains the most challenging region for segmentation. The partition algorithm is used to automatically identify these partition locations so that different segmentation methods can be developed for each sub-volume. The partition locations are computed using bone, image entropy, and sinus profiles along with a rule-based method. The algorithm is validated on 21 cases (varying volume sizes, resolution, clinical sites, pathologies) using ground truth identified visually. The algorithm is also computationally efficient, processing a 500+ slice volume in 6 seconds (an impressive 0.01 seconds / slice), which makes it an attractive algorithm for pre-processing large volumes. The partition algorithm is integrated into the segmentation workflow. Fast and simple algorithms are implemented for processing the "proximal" and "distal" partitions. Complex methods are restricted to only the "middle" partition. The partition-enabled segmentation has been successfully tested and results are shown from multiple cases.

  8. Dietary fat and not calcium supplementation or dairy product consumption is associated with changes in anthropometrics during a randomized, placebo-controlled energy-restriction trial

    USDA-ARS?s Scientific Manuscript database

    Insufficient calcium intake has been proposed to cause unbalanced energy partitioning leading to obesity. However, weight loss interventions including dietary calcium or dairy product consumption have not reported changes in lipid metabolism measured by the plasma lipidome. Methods. The objective ...

  9. Many-body formalism for fermions: The partition function

    NASA Astrophysics Data System (ADS)

    Watson, D. K.

    2017-09-01

    The partition function, a fundamental tenet in statistical thermodynamics, contains in principle all thermodynamic information about a system. It encapsulates both microscopic information through the quantum energy levels and statistical information from the partitioning of the particles among the available energy levels. For identical particles, this statistical accounting is complicated by the symmetry requirements of the allowed quantum states. In particular, for Fermi systems, the enforcement of the Pauli principle is typically a numerically demanding task, responsible for much of the cost of the calculations. The interplay of these three elements—the structure of the many-body spectrum, the statistical partitioning of the N particles among the available levels, and the enforcement of the Pauli principle—drives the behavior of mesoscopic and macroscopic Fermi systems. In this paper, we develop an approach for the determination of the partition function, a numerically difficult task, for systems of strongly interacting identical fermions and apply it to a model system of harmonically confined, harmonically interacting fermions. This approach uses a recently introduced many-body method that is an extension of the symmetry-invariant perturbation method (SPT) originally developed for bosons. It uses group theory and graphical techniques to avoid the heavy computational demands of conventional many-body methods which typically scale exponentially with the number of particles. The SPT application of the Pauli principle is trivial to implement since it is done "on paper" by imposing restrictions on the normal-mode quantum numbers at first order in the perturbation. The method is applied through first order and represents an extension of the SPT method to excited states. Our method of determining the partition function and various thermodynamic quantities is accurate and efficient and has the potential to yield interesting insight into the role played by the Pauli principle and the influence of large degeneracies on the emergence of the thermodynamic behavior of large-N systems.
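
    For a toy contrast with the graphical SPT approach described above, the canonical partition function of a few non-interacting fermions in a harmonic trap can be computed by brute force, enforcing the Pauli principle simply by summing over sets of distinct single-particle levels; the code below is such a hypothetical baseline, not the authors' method.

```python
# Brute-force canonical partition function for N non-interacting, spinless
# fermions in a 1D harmonic trap: sum exp(-beta * E) over all Pauli-allowed
# configurations, i.e. sets of N distinct single-particle levels.
# Toy illustration only; the paper treats interacting fermions with a
# symmetry-invariant perturbation (SPT) expansion instead.
from itertools import combinations
import math

def canonical_Z(N, beta, n_levels=60, hbar_omega=1.0):
    # Single-particle energies e_n = (n + 1/2) * hbar*omega, n = 0..n_levels-1
    eps = [(n + 0.5) * hbar_omega for n in range(n_levels)]
    Z = 0.0
    for occ in combinations(range(n_levels), N):   # Pauli principle: distinct levels
        E = sum(eps[n] for n in occ)
        Z += math.exp(-beta * E)
    return Z

beta = 1.0
for N in (1, 2, 3):
    print(f"N={N}: Z={canonical_Z(N, beta):.6f}")

# Sanity check for N = 1: Z = exp(-beta/2) / (1 - exp(-beta)) for a full spectrum.
print("N=1 closed form:", round(math.exp(-beta / 2) / (1 - math.exp(-beta)), 6))
```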

  10. 25 CFR 117.1 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... partition sales). (3) Payments made by insurance companies or others for loss or damage to restricted real... rentals and income from restricted lands owned by the minor children of the Indian, as provided in § 117.3...

  11. 25 CFR 117.1 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... partition sales). (3) Payments made by insurance companies or others for loss or damage to restricted real... rentals and income from restricted lands owned by the minor children of the Indian, as provided in § 117.3...

  12. 25 CFR 117.1 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... partition sales). (3) Payments made by insurance companies or others for loss or damage to restricted real... rentals and income from restricted lands owned by the minor children of the Indian, as provided in § 117.3...

  13. 25 CFR 117.1 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... partition sales). (3) Payments made by insurance companies or others for loss or damage to restricted real... rentals and income from restricted lands owned by the minor children of the Indian, as provided in § 117.3...

  14. 25 CFR 117.1 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... partition sales). (3) Payments made by insurance companies or others for loss or damage to restricted real... rentals and income from restricted lands owned by the minor children of the Indian, as provided in § 117.3...

  15. The Usefulness of Zone Division Using Belt Partition at the Entry Zone of MRI Machine Room: An Analysis of the Restrictive Effect of Dangerous Action Using a Questionnaire.

    PubMed

    Funada, Tatsuro; Shibuya, Tsubasa

    2016-08-01

    The American College of Radiology recommends dividing magnetic resonance imaging (MRI) machine rooms into four zones depending on the education level. However, structural limitations prevent most Japanese facilities from applying this recommendation. This study examines the effectiveness of using a belt partition to create the zonal division, by means of a questionnaire survey covering three critical parameters: the influence of individuals' backgrounds (relevance to MRI, years of experience, post, occupation [i.e., nurse or nursing assistant], outpatient section or ward), the presence or absence of a door or belt partition (open or closed), and four personnel scenarios that may be encountered during a visit to an MRI site (e.g., visiting the MRI site to receive a patient). In this survey, the influence of individuals' backgrounds (maximum odds ratio: 6.3, 95% CI: 1.47-27.31) and of the personnel scenarios (maximum risk ratio: 2.4, 95% CI: 1.16-4.85) on dangerous actions was uncertain. Conversely, the presence of the door and belt partition had a significant influence (maximum risk ratio: 17.4, 95% CI: 7.94-17.38). For that reason, we suggest that visual impression has a strong influence on an individual's actions. Even if structural limitations are present, zonal division by belt partition will provide a visual deterrent, and the partitioned zone will serve as a buffer zone. We conclude that if the belt partition is used properly, it is an inexpensive and effective safety management device for MRI rooms.

  16. Lost in the supermarket: Quantifying the cost of partitioning memory sets in hybrid search.

    PubMed

    Boettcher, Sage E P; Drew, Trafton; Wolfe, Jeremy M

    2018-01-01

    The items on a memorized grocery list are not relevant in every aisle; for example, it is useless to search for the cabbage in the cereal aisle. It might be beneficial if one could mentally partition the list so only the relevant subset was active, so that vegetables would be activated in the produce section. In four experiments, we explored observers' abilities to partition memory searches. For example, if observers held 16 items in memory, but only eight of the items were relevant, would response times resemble a search through eight or 16 items? In Experiments 1a and 1b, observers were not faster for the partition set; however, they suffered relatively small deficits when "lures" (items from the irrelevant subset) were presented, indicating that they were aware of the partition. In Experiment 2 the partitions were based on semantic distinctions, and again, observers were unable to restrict search to the relevant items. In Experiments 3a and 3b, observers attempted to remove items from the list one trial at a time but did not speed up over the course of a block, indicating that they also could not limit their memory searches. Finally, Experiments 4a, 4b, 4c, and 4d showed that observers were able to limit their memory searches when a subset was relevant for a run of trials. Overall, observers appear to be unable or unwilling to partition memory sets from trial to trial, yet they are capable of restricting search to a memory subset that remains relevant for several trials. This pattern is consistent with a cost to switching between currently relevant memory items.

  17. A Bayesian partition modelling approach to resolve spatial variability in climate records from borehole temperature inversion

    NASA Astrophysics Data System (ADS)

    Hopcroft, Peter O.; Gallagher, Kerry; Pain, Christopher C.

    2009-08-01

    Collections of suitably chosen borehole profiles can be used to infer large-scale trends in ground-surface temperature (GST) histories for the past few hundred years. These reconstructions are based on a large database of carefully selected borehole temperature measurements from around the globe. Since non-climatic thermal influences are difficult to identify, representative temperature histories are derived by averaging individual reconstructions to minimize the influence of these perturbing factors. This may lead to three potentially important drawbacks: the net signal of non-climatic factors may not be zero, meaning that the average does not reflect the best estimate of past climate; the averaging over large areas restricts the useful amount of more local climate change information available; and the inversion methods used to reconstruct the past temperatures at each site must be mathematically identical and are therefore not necessarily best suited to all data sets. In this work, we avoid these issues by using a Bayesian partition model (BPM), which is computed using a trans-dimensional form of a Markov chain Monte Carlo algorithm. This then allows the number and spatial distribution of different GST histories to be inferred from a given set of borehole data by partitioning the geographical area into discrete partitions. Profiles that are heavily influenced by non-climatic factors will be partitioned separately. Conversely, profiles with climatic information, which is consistent with neighbouring profiles, will then be inferred to lie in the same partition. The geographical extent of these partitions then leads to information on the regional extent of the climatic signal. In this study, three case studies are described using synthetic and real data. The first demonstrates that the Bayesian partition model method is able to correctly partition a suite of synthetic profiles according to the inferred GST history. In the second, more realistic case, a series of temperature profiles are calculated using surface air temperatures of a global climate model simulation. In the final case, 23 real boreholes from the United Kingdom, previously used for climatic reconstructions, are examined and the results compared with a local instrumental temperature series and the previous estimate derived from the same borehole data. The results indicate that the majority (17) of the 23 boreholes are unsuitable for climatic reconstruction purposes, at least without including other thermal processes in the forward model.

  18. 25 CFR 161.6 - Are there any other restrictions on information given to BIA?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 25 Indians 1 2010-04-01 2010-04-01 false Are there any other restrictions on information given to BIA? 161.6 Section 161.6 Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR LAND AND WATER NAVAJO PARTITIONED LANDS GRAZING PERMITS Definitions, Authority, Purpose, and Scope § 161.6 Are there any...

  19. 25 CFR 161.6 - Are there any other restrictions on information given to BIA?

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 25 Indians 1 2011-04-01 2011-04-01 false Are there any other restrictions on information given to BIA? 161.6 Section 161.6 Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR LAND AND WATER NAVAJO PARTITIONED LANDS GRAZING PERMITS Definitions, Authority, Purpose, and Scope § 161.6 Are there any...

  20. Entanglement, replicas, and Thetas

    NASA Astrophysics Data System (ADS)

    Mukhi, Sunil; Murthy, Sameer; Wu, Jie-Qiang

    2018-01-01

    We compute the single-interval Rényi entropy (replica partition function) for free fermions in 1+1d at finite temperature and finite spatial size by two methods: (i) using the higher-genus partition function on the replica Riemann surface, and (ii) using twist operators on the torus. We compare the two answers for a restricted set of spin structures, leading to a non-trivial proposed equivalence between higher-genus Siegel Θ-functions and Jacobi θ-functions. We exhibit this proposal and provide substantial evidence for it. The resulting expressions can be elegantly written in terms of Jacobi forms. Thereafter we argue that the correct Rényi entropy for modular-invariant free-fermion theories, such as the Ising model and the Dirac CFT, is given by the higher-genus computation summed over all spin structures. The result satisfies the physical checks of modular covariance, the thermal entropy relation, and Bose-Fermi equivalence.

  1. Hydraulic balancing of a control component within a nuclear reactor

    DOEpatents

    Marinos, D.; Ripfel, H.C.F.

    1975-10-14

    A reactor control component includes an inner conduit, for instance containing neutron absorber elements, adapted for longitudinal movement within an outer guide duct. A transverse partition partially encloses one end of the conduit and meets a transverse wall within the guide duct when the conduit is fully inserted into the reactor core. A tube piece extends from the transverse partition and is coaxially aligned to be received within a tubular receptacle which extends from the transverse wall. The tube piece and receptacle cooperate in engagement to restrict the flow and pressure of coolant beneath the transverse partition and thereby minimize upward forces tending to expel the inner conduit.

  2. [On the partition of acupuncture academic schools].

    PubMed

    Yang, Pengyan; Luo, Xi; Xia, Youbing

    2016-05-01

    Nowadays, extensive attention has been paid to the research of acupuncture academic schools; however, a widely accepted method of partitioning acupuncture academic schools is still lacking. In this paper, the methods of partitioning acupuncture academic schools through history are reviewed, and three typical methods, the "partition of five schools", the "partition of eighteen schools" and the "two-stage based partition", are summarized. After a deep analysis of the disadvantages and advantages of these three methods, a new method of partitioning acupuncture academic schools, called the "three-stage based partition", is proposed. In this method, after the overall acupuncture academic schools are divided into an ancient stage, a modern stage and a contemporary stage, each school is divided into its sub-school category. It is believed that this method of partition can not only remedy the weaknesses of current methods but also explore a new model of inheritance and development from a different aspect through the differentiation and interaction of acupuncture academic schools at the three stages.

  3. Chaos M-ary modulation and demodulation method based on Hamilton oscillator and its application in communication.

    PubMed

    Fu, Yongqing; Li, Xingyuan; Li, Yanan; Yang, Wei; Song, Hailiang

    2013-03-01

    Chaotic communication has aroused general interest in recent years, but its performance is limited by the restriction of chaos synchronization. In this paper, a new chaotic M-ary digital modulation and demodulation method is proposed. By using the region-controllable characteristics of the spatiotemporal chaotic Hamilton map in the phase plane and the unique sensitivity of chaos to initial values, a zone mapping method is proposed. It establishes a mapping between M-ary digital information and regions of the Hamilton map phase plane, thus realizing M-ary chaotic modulation. In addition, a zone partition demodulation method is proposed based on the structural characteristics of the Hamilton-modulated information, which separates the M-ary information from the phase trajectory of the chaotic Hamilton map, and a theoretical analysis of the zone partition demodulator's boundary range is given. Finally, a communication system based on the two methods is constructed on a personal computer. The simulations show that, for high-speed transmission and without chaos synchronization, the proposed chaotic M-ary modulation and demodulation method outperforms some conventional M-ary modulation methods, such as quadrature phase shift keying and M-ary pulse amplitude modulation, in bit error rate. It also offers improvements in bandwidth efficiency, transmission efficiency and anti-noise performance, while the system complexity is low and the chaotic signal is easy to generate.

  4. Generalized monogamy relations of concurrence for N -qubit systems

    NASA Astrophysics Data System (ADS)

    Zhu, Xue-Na; Fei, Shao-Ming

    2015-12-01

    We present a different kind of monogamy relations based on concurrence and concurrence of assistance. For N-qubit systems ABC_1...C_{N-2}, the monogamy relations satisfied by the concurrence of N-qubit pure states under the partition AB and C_1...C_{N-2}, as well as under the partition ABC_1 and C_2...C_{N-2}, are established, which gives rise to restrictions on the entanglement distribution and trade-off among the subsystems.

  5. Interaction of Airspace Partitions and Traffic Flow Management Delay

    NASA Technical Reports Server (NTRS)

    Palopo, Kee; Chatterji, Gano B.; Lee, Hak-Tae

    2010-01-01

    To ensure that air traffic demand does not exceed airport and airspace capacities, traffic management restrictions, such as delaying aircraft on the ground, assigning them different routes and metering them in the airspace, are implemented. To reduce the delays resulting from these restrictions, revising the partitioning of airspace has been proposed to distribute capacity to yield a more efficient airspace configuration. The capacity of an airspace partition, commonly referred to as a sector, is limited by the number of flights that an air traffic controller can safely manage within the sector. Where viable, re-partitioning of the airspace distributes the flights over more efficient sectors and reduces individual sector demand. This increases the overall airspace efficiency, but requires additional resources in some sectors in terms of controllers and equipment, which is undesirable. This study examines the tradeoff between the number of sectors designed for a specified amount of traffic on a clear-weather day and the delays needed for accommodating the traffic demand. Results show that most of the delays are caused by airport arrival and departure capacity constraints. Some delays caused by airspace capacity constraints can be eliminated by re-partitioning the airspace. Analyses show that about 360 high-altitude sectors, which is approximately today's operational number of 373 sectors, are adequate for delays to be driven solely by airport capacity constraints for the current daily air traffic demand. For a marginal increase of 15 seconds of average delay, the number of sectors can be reduced to 283. In addition, simulations of traffic growth of 15% and 20% with forecasted airport capacities in the years 2018 and 2025 show that delays will continue to be governed by airport capacities. On clear-weather days, for small increases in traffic demand, increasing sector capacities will have almost no effect on delays.

  6. A modified anomaly detection method for capsule endoscopy images using non-linear color conversion and Higher-order Local Auto-Correlation (HLAC).

    PubMed

    Hu, Erzhong; Nosato, Hirokazu; Sakanashi, Hidenori; Murakawa, Masahiro

    2013-01-01

    Capsule endoscopy is a patient-friendly form of endoscopy broadly utilized in gastrointestinal examination. However, the efficacy of diagnosis is restricted by the large quantity of images. This paper presents a modified anomaly detection method, by which both known and unknown anomalies in capsule endoscopy images of the small intestine are expected to be detected. To achieve this goal, the paper introduces feature extraction using a non-linear color conversion and Higher-order Local Auto-Correlation (HLAC) features, and makes use of image partitioning and a subspace method for anomaly detection. Experiments are conducted on several major anomalies with combinations of the proposed techniques. As a result, the proposed method achieved 91.7% and 100% detection accuracy for swelling and bleeding, respectively, demonstrating the effectiveness of the proposed method.

  7. Nonequilibrium thermodynamics of restricted Boltzmann machines.

    PubMed

    Salazar, Domingos S P

    2017-08-01

    In this work, we analyze the nonequilibrium thermodynamics of a class of neural networks known as restricted Boltzmann machines (RBMs) in the context of unsupervised learning. We show how the network is described as a discrete Markov process and how the detailed balance condition and the Maxwell-Boltzmann equilibrium distribution are sufficient conditions for a complete thermodynamics description, including nonequilibrium fluctuation theorems. Numerical simulations in a fully trained RBM are performed and the heat exchange fluctuation theorem is verified with excellent agreement to the theory. We observe how the contrastive divergence functional, mostly used in unsupervised learning of RBMs, is closely related to nonequilibrium thermodynamic quantities. We also use the framework to interpret the estimation of the partition function of RBMs with the annealed importance sampling method from a thermodynamics standpoint. Finally, we argue that unsupervised learning of RBMs is equivalent to a work protocol in a system driven by the laws of thermodynamics in the absence of labeled data.
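
    The role of the partition function can be made tangible at toy scale: for a handful of binary units, Z is computable by exact enumeration, which is precisely the quantity that annealed importance sampling approximates for realistic RBMs. A minimal sketch with random placeholder weights:

```python
# Exact partition function of a tiny restricted Boltzmann machine by brute
# force enumeration of all visible/hidden states. For realistic sizes this is
# intractable, which is why methods such as annealed importance sampling are
# used to estimate Z; the weights here are random placeholders.
import itertools
import numpy as np

rng = np.random.default_rng(1)
n_v, n_h = 6, 4                       # visible and hidden units
W = 0.5 * rng.standard_normal((n_v, n_h))
b = 0.1 * rng.standard_normal(n_v)    # visible biases
c = 0.1 * rng.standard_normal(n_h)    # hidden biases

def energy(v, h):
    """RBM energy E(v, h) = -v.W.h - b.v - c.h for binary units."""
    return -(v @ W @ h + b @ v + c @ h)

Z = 0.0
for v in itertools.product([0, 1], repeat=n_v):
    v = np.array(v)
    for h in itertools.product([0, 1], repeat=n_h):
        h = np.array(h)
        Z += np.exp(-energy(v, h))
print("exact partition function Z =", Z)

# Free energy of a visible configuration, marginalizing the hidden units:
# F(v) = -b.v - sum_j log(1 + exp(c_j + (v.W)_j)), with p(v) = exp(-F(v)) / Z.
v = np.array([1, 0, 1, 1, 0, 0])
F = -(b @ v) - np.sum(np.log1p(np.exp(c + v @ W)))
print("p(v) =", np.exp(-F) / Z)
```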

  8. Censored quantile regression with recursive partitioning-based weights

    PubMed Central

    Wey, Andrew; Wang, Lan; Rudser, Kyle

    2014-01-01

    Censored quantile regression provides a useful alternative to the Cox proportional hazards model for analyzing survival data. It directly models the conditional quantile of the survival time and hence is easy to interpret. Moreover, it relaxes the proportionality constraint on the hazard function associated with the popular Cox model and is natural for modeling heterogeneity of the data. Recently, Wang and Wang (2009. Locally weighted censored quantile regression. Journal of the American Statistical Association 103, 1117–1128) proposed a locally weighted censored quantile regression approach that allows for covariate-dependent censoring and is less restrictive than other censored quantile regression methods. However, their kernel smoothing-based weighting scheme requires all covariates to be continuous and encounters practical difficulty with even a moderate number of covariates. We propose a new weighting approach that uses recursive partitioning, e.g. survival trees, that offers greater flexibility in handling covariate-dependent censoring in moderately high dimensions and can incorporate both continuous and discrete covariates. We prove that this new weighting scheme leads to consistent estimation of the quantile regression coefficients and demonstrate its effectiveness via Monte Carlo simulations. We also illustrate the new method using a widely recognized data set from a clinical trial on primary biliary cirrhosis. PMID:23975800

  9. Excess flow shutoff valve

    DOEpatents

    Kiffer, Micah S.; Tentarelli, Stephen Clyde

    2016-02-09

    Excess flow shutoff valve comprising a valve body, a valve plug, a partition, and an activation component where the valve plug, the partition, and activation component are disposed within the valve body. A suitable flow restriction is provided to create a pressure difference between the upstream end of the valve plug and the downstream end of the valve plug when fluid flows through the valve body. The pressure difference exceeds a target pressure difference needed to activate the activation component when fluid flow through the valve body is higher than a desired rate, and thereby closes the valve.

  10. Weighted graph cuts without eigenvectors a multilevel approach.

    PubMed

    Dhillon, Inderjit S; Guan, Yuqiang; Kulis, Brian

    2007-11-01

    A variety of clustering algorithms have recently been proposed to handle data that is not linearly separable; spectral clustering and kernel k-means are two of the main methods. In this paper, we discuss an equivalence between the objective functions used in these seemingly different methods--in particular, a general weighted kernel k-means objective is mathematically equivalent to a weighted graph clustering objective. We exploit this equivalence to develop a fast, high-quality multilevel algorithm that directly optimizes various weighted graph clustering objectives, such as the popular ratio cut, normalized cut, and ratio association criteria. This eliminates the need for any eigenvector computation for graph clustering problems, which can be prohibitive for very large graphs. Previous multilevel graph partitioning methods, such as Metis, have suffered from the restriction of equal-sized clusters; our multilevel algorithm removes this restriction by using kernel k-means to optimize weighted graph cuts. Experimental results show that our multilevel algorithm outperforms a state-of-the-art spectral clustering algorithm in terms of speed, memory usage, and quality. We demonstrate that our algorithm is applicable to large-scale clustering tasks such as image segmentation, social network analysis and gene network analysis.
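
    A bare-bones version of the core routine, weighted kernel k-means operating directly on a kernel matrix, can be sketched as follows (illustrative only; for the normalized-cut connection described above one would take the weights to be node degrees and build a suitably shifted kernel from the graph affinity matrix).

```python
# Weighted kernel k-means on a kernel matrix K with point weights w.
# Standalone sketch of the routine the paper links to graph-cut objectives.
import numpy as np

def weighted_kernel_kmeans(K, w, k, n_iter=50, seed=0):
    n = K.shape[0]
    rng = np.random.default_rng(seed)
    labels = rng.integers(0, k, size=n)
    for _ in range(n_iter):
        dist = np.empty((n, k))
        for c in range(k):
            mask = labels == c
            if not mask.any():                       # re-seed an empty cluster
                mask[rng.integers(0, n)] = True
            wc = w[mask]
            sc = wc.sum()
            # ||phi(x_i) - m_c||^2 up to the constant K_ii term
            second = K[:, mask] @ wc / sc
            third = wc @ K[np.ix_(mask, mask)] @ wc / sc**2
            dist[:, c] = -2.0 * second + third
        new_labels = dist.argmin(axis=1)
        if np.array_equal(new_labels, labels):
            break
        labels = new_labels
    return labels

# Toy usage on two Gaussian blobs with an RBF kernel and unit weights.
rng = np.random.default_rng(3)
X = np.vstack([rng.normal(0, 0.3, (30, 2)), rng.normal(2, 0.3, (30, 2))])
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-sq / 0.5)
print(weighted_kernel_kmeans(K, np.ones(len(X)), k=2))
```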

  11. On N = 1 partition functions without R-symmetry

    DOE PAGES

    Knodel, Gino; Liu, James T.; Zayas, Leopoldo A. Pando

    2015-03-25

    Here, we examine the dependence of four-dimensional Euclidean N = 1 partition functions on coupling constants. In particular, we focus on backgrounds without R-symmetry, which arise in the rigid limit of old minimal supergravity. Backgrounds preserving a single supercharge may be classified as having either trivial or SU(2) structure, with the former including S^4. We show that, in the absence of additional symmetries, the partition function depends non-trivially on all couplings in the trivial structure case, and (anti)-holomorphically on couplings in the SU(2) structure case. In both cases, this allows for ambiguities in the form of finite counterterms, which in principle render the partition function unphysical. However, we argue that on dimensional grounds, ambiguities are restricted to finite powers in relevant couplings, and can therefore be kept under control. On the other hand, for backgrounds preserving supercharges of opposite chiralities, the partition function is completely independent of all couplings. In this case, the background admits an R-symmetry, and the partition function is physical, in agreement with the results obtained in the rigid limit of new minimal supergravity. Based on a systematic analysis of supersymmetric invariants, we also demonstrate that N = 1 localization is not possible for backgrounds without R-symmetry.

  12. Robust Intratumor Partitioning to Identify High-Risk Subregions in Lung Cancer: A Pilot Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Jia; Gensheimer, Michael F.; Dong, Xinzhe

    2016-08-01

    Purpose: To develop an intratumor partitioning framework for identifying high-risk subregions from 18F-fluorodeoxyglucose positron emission tomography (FDG-PET) and computed tomography (CT) imaging and to test whether tumor burden associated with the high-risk subregions is prognostic of outcomes in lung cancer. Methods and Materials: In this institutional review board-approved retrospective study, we analyzed the pretreatment FDG-PET and CT scans of 44 lung cancer patients treated with radiation therapy. A novel intratumor partitioning method was developed, based on a 2-stage clustering process: first, at the patient level, each tumor was over-segmented into many superpixels by k-means clustering of integrated PET and CT images; next, tumor subregions were identified by merging previously defined superpixels via population-level hierarchical clustering. The volume associated with each of the subregions was evaluated using Kaplan-Meier analysis regarding its prognostic capability in predicting overall survival (OS) and out-of-field progression (OFP). Results: Three spatially distinct subregions were identified within each tumor that were highly robust to uncertainty in PET/CT co-registration. Among these, the volume of the most metabolically active and metabolically heterogeneous solid component of the tumor was predictive of OS and OFP on the entire cohort, with a concordance index (CI) of 0.66-0.67. When restricting the analysis to patients with stage III disease (n=32), the same subregion achieved an even higher CI of 0.75 (hazard ratio 3.93, log-rank P=.002) for predicting OS, and a CI of 0.76 (hazard ratio 4.84, log-rank P=.002) for predicting OFP. In comparison, conventional imaging markers, including tumor volume, maximum standardized uptake value, and metabolic tumor volume using a threshold of 50% of the maximum standardized uptake value, were not predictive of OS or OFP, with CI mostly below 0.60 (log-rank P>.05). Conclusion: We propose a robust intratumor partitioning method to identify clinically relevant, high-risk subregions in lung cancer. We envision that this approach will be applicable to identifying useful imaging biomarkers in many cancer types.
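
    A schematic version of the two-stage partitioning pipeline (patient-level k-means over-segmentation followed by population-level hierarchical clustering of superpixel summaries) can be sketched with scikit-learn; the synthetic data and feature choices below are placeholders, not the study's PET/CT features.

```python
# Schematic two-stage intratumor partitioning: (1) per-patient k-means
# over-segmentation of voxel-level feature vectors into superpixels,
# (2) population-level hierarchical clustering of superpixel summaries into
# a small number of shared subregions. Synthetic data and features are
# placeholders for the integrated PET/CT intensities used in the study.
import numpy as np
from sklearn.cluster import AgglomerativeClustering, KMeans

rng = np.random.default_rng(0)
patients = [rng.normal(size=(500, 2)) + rng.normal(scale=2, size=(1, 2))
            for _ in range(10)]        # per-patient voxel features (toy "PET, CT")

# Stage 1: over-segment each patient's tumor into superpixels.
superpixel_features = []
for voxels in patients:
    km = KMeans(n_clusters=20, n_init=5, random_state=0).fit(voxels)
    for c in range(20):
        members = voxels[km.labels_ == c]
        superpixel_features.append(
            np.r_[members.mean(axis=0), members.std(axis=0)])
superpixel_features = np.array(superpixel_features)

# Stage 2: merge superpixels across the population into shared subregions.
agg = AgglomerativeClustering(n_clusters=3).fit(superpixel_features)
print("superpixels per subregion:", np.bincount(agg.labels_))
```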

  13. Third order maximum-principle-satisfying direct discontinuous Galerkin methods for time dependent convection diffusion equations on unstructured triangular meshes

    DOE PAGES

    Chen, Zheng; Huang, Hongying; Yan, Jue

    2015-12-21

    We develop 3rd order maximum-principle-satisfying direct discontinuous Galerkin methods [8], [9], [19] and [21] for convection diffusion equations on unstructured triangular meshes. We carefully calculate the normal derivative numerical flux across element edges and prove that, with a proper choice of the parameter pair (β0, β1) in the numerical flux formula, the quadratic polynomial solution satisfies a strict maximum principle. The polynomial solution is bounded within the given range and third order accuracy is maintained. There is no geometric restriction on the meshes and obtuse triangles are allowed in the partition. Finally, a sequence of numerical examples is carried out to demonstrate the accuracy and capability of the maximum-principle-satisfying limiter.

  14. Partitioning Strategy Using Static Analysis Techniques

    NASA Astrophysics Data System (ADS)

    Seo, Yongjin; Soo Kim, Hyeon

    2016-08-01

    Flight software is the software used in satellites' on-board computers. It has requirements such as real-time operation and reliability, and the IMA architecture is used to satisfy them. The IMA architecture introduces the concept of partitions, which affects the configuration of flight software: software that was previously loaded on a single system must now be divided into many partitions when it is loaded. To deal with this issue, existing studies use experience-based partitioning methods; however, these methods cannot be reused. In this respect, this paper proposes a partitioning method that is reusable and consistent.

  15. Brain Network Regional Synchrony Analysis in Deafness

    PubMed Central

    Xu, Lei; Liang, Mao-Jin

    2018-01-01

    Deafness, the most common auditory disease, has greatly affected people for a long time. The major treatment for deafness is cochlear implantation (CI). However, to date there is still a lack of an objective and precise indicator for evaluating the effectiveness of cochlear implantation. The goal of this EEG-based study is to effectively distinguish CI children from prelingually deafened children without cochlear implantation. The proposed method is based on functional connectivity analysis, focusing on brain network regional synchrony. Specifically, we first compute the functional connectivity between each channel pair. Then, we quantify the brain network synchrony among regions of interest (ROIs), where both intraregional synchrony and interregional synchrony are computed. Finally, the synchrony values are concatenated to form the feature vector for an SVM classifier. In addition, we develop a new ROI partition method for the 128-channel EEG recording system; both the existing ROI partition method and the proposed ROI partition method are used in the experiments. Compared with existing EEG signal classification methods, our proposed method achieves significant improvements, as large as 87.20% and 86.30%, when the existing ROI partition method and the proposed ROI partition method are used, respectively. This further demonstrates that the new ROI partition method is comparable to the existing ROI partition method. PMID:29854776

  16. A Novel Coarsening Method for Scalable and Efficient Mesh Generation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoo, A; Hysom, D; Gunney, B

    2010-12-02

    In this paper, we propose a novel mesh coarsening method called the brick coarsening method. The proposed method can be used in conjunction with any graph partitioner and scales to very large meshes. This method reduces the problem space by decomposing the original mesh into fixed-size blocks of nodes called bricks, layered in a similar way to conventional brick laying, and then assigning each node of the original mesh to the appropriate brick. Our experiments indicate that the proposed method scales to very large meshes while allowing a simple RCB partitioner to produce higher-quality partitions with significantly fewer edge cuts. Our results further indicate that the proposed brick-coarsening method allows more complicated partitioners like PT-Scotch to scale to very large problem sizes while still maintaining good partitioning performance with a relatively good edge-cut metric. Graph partitioning is an important problem that has many scientific and engineering applications in such areas as VLSI design, scientific computing, and resource management. Given a graph G = (V,E), where V is the set of vertices and E is the set of edges, the (k-way) graph partitioning problem is to partition the vertices of the graph (V) into k disjoint groups such that each group contains a roughly equal number of vertices and the number of edges connecting vertices in different groups is minimized. Graph partitioning plays a key role in large-scale scientific computing, especially in mesh-based computations, as it is used as a tool to minimize the volume of communication and to ensure well-balanced load across computing nodes. The impact of graph partitioning on the reduction of communication can be easily seen, for example, in different iterative methods to solve a sparse system of linear equations. Here, a graph partitioning technique is applied to the matrix, which is basically a graph in which each edge is a non-zero entry in the matrix, to allocate groups of vertices to processors in such a way that many of the matrix-vector multiplications can be performed locally on each processor, and hence to minimize communication. Furthermore, a good graph partitioning scheme ensures an equal amount of computation is performed on each processor. Graph partitioning is a well-known NP-complete problem, and thus the most commonly used graph partitioning algorithms employ some form of heuristics. These algorithms vary in terms of their complexity, partition generation time, and the quality of partitions, and they tend to trade off these factors. A significant challenge we are currently facing at the Lawrence Livermore National Laboratory is how to partition very large meshes on massive-size distributed memory machines like IBM BlueGene/P, where scalability becomes a big issue. For example, we have found that ParMetis, a very popular graph partitioning tool, can only scale to 16K processors. An ideal graph partitioning method in such an environment should be fast and scale to very large meshes, while producing high quality partitions. This is an extremely challenging task, as to scale to that level, the partitioning algorithm should be simple and be able to produce partitions that minimize inter-processor communication and balance the load imposed on the processors. Our goals in this work are two-fold: (1) to develop a new scalable graph partitioning method with good load balancing and communication reduction capability; and (2) to study the performance of the proposed partitioning method on very large parallel machines using actual data sets and compare the performance to that of existing methods. The proposed method achieves the desired scalability by reducing the mesh size. For this, it coarsens an input mesh into a smaller mesh by coalescing the vertices and edges of the original mesh into a set of mega-vertices and mega-edges. A new coarsening method called the brick algorithm is developed in this research. In the brick algorithm, the zones in a given mesh are first grouped into fixed-size blocks called bricks. These bricks are then laid in a way similar to the conventional brick laying technique, which reduces the number of neighboring blocks each block needs to communicate with. Contributions of this research are as follows: (1) we have developed a novel method that scales to a really large problem size while producing high-quality mesh partitions; (2) we measured the performance and scalability of the proposed method on a machine of massive size using a set of actual large complex data sets, where we have scaled to a mesh with 110 million zones using our method (to the best of our knowledge, this is the largest complex mesh that a partitioning method has been successfully applied to); and (3) we have shown that the proposed method can reduce the number of edge cuts by as much as 65%.
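
    The brick idea above can be illustrated on a small structured mesh. The sketch below is a hypothetical simplification (the production method targets large unstructured meshes and partitioners such as RCB or PT-Scotch): zones are grouped into fixed-size bricks with alternate brick rows offset as in brick laying, and the coarse brick graph a partitioner would then operate on is assembled.

```python
# Illustration of brick-style coarsening on a structured nx-by-ny zone mesh:
# zones are grouped into bricks of size BW x BH, with every other brick row
# offset by half a brick (as in brick laying), and the coarse brick graph is
# what a partitioner would then operate on. Simplified sketch only.
nx, ny = 24, 12          # zones in x and y
BW, BH = 4, 2            # brick width and height in zones

def brick_id(i, j):
    """Map zone (i, j) to its brick, offsetting odd brick rows by BW // 2."""
    row = j // BH
    shift = (BW // 2) if row % 2 else 0
    col = (i + shift) // BW
    return row * ((nx + BW) // BW + 1) + col   # unique id per (row, col)

# Build the coarse brick graph: bricks are adjacent if any of their zones are.
edges = set()
for j in range(ny):
    for i in range(nx):
        b = brick_id(i, j)
        for di, dj in ((1, 0), (0, 1)):
            ni, nj = i + di, j + dj
            if ni < nx and nj < ny:
                nb = brick_id(ni, nj)
                if nb != b:
                    edges.add((min(b, nb), max(b, nb)))

bricks = {brick_id(i, j) for j in range(ny) for i in range(nx)}
print(f"{nx * ny} zones coarsened into {len(bricks)} bricks, "
      f"{len(edges)} coarse edges")
```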

  17. 21 CFR 1301.73 - Physical security controls for non-practitioners; compounders for narcotic treatment programs...

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 9 2011-04-01 2011-04-01 false Physical security controls for non-practitioners... and Drugs DRUG ENFORCEMENT ADMINISTRATION, DEPARTMENT OF JUSTICE REGISTRATION OF MANUFACTURERS... such as walls or partitions, by traffic control lines or restricted space designation. The employee...

  18. Method for chemical amplification based on fluid partitioning in an immiscible liquid

    DOEpatents

    Anderson, Brian L.; Colston, Bill W.; Elkin, Christopher J.

    2015-06-02

    A system for nucleic acid amplification of a sample comprises partitioning the sample into partitioned sections and performing PCR on the partitioned sections of the sample. Another embodiment of the invention provides a system for nucleic acid amplification and detection of a sample comprising partitioning the sample into partitioned sections, performing PCR on the partitioned sections of the sample, and detecting and analyzing the partitioned sections of the sample.

  19. Method for chemical amplification based on fluid partitioning in an immiscible liquid

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, Brian L.; Colston, Bill W.; Elkin, Christopher J.

    A system for nucleic acid amplification of a sample comprises partitioning the sample into partitioned sections and performing PCR on the partitioned sections of the sample. Another embodiment of the invention provides a system for nucleic acid amplification and detection of a sample comprising partitioning the sample into partitioned sections, performing PCR on the partitioned sections of the sample, and detecting and analyzing the partitioned sections of the sample.

  20. Third order maximum-principle-satisfying direct discontinuous Galerkin methods for time dependent convection diffusion equations on unstructured triangular meshes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Zheng; Huang, Hongying; Yan, Jue

    We develop 3rd order maximum-principle-satisfying direct discontinuous Galerkin methods [8], [9], [19] and [21] for convection diffusion equations on unstructured triangular meshes. We carefully calculate the normal derivative numerical flux across element edges and prove that, with a proper choice of the parameter pair (β0, β1) in the numerical flux formula, the quadratic polynomial solution satisfies a strict maximum principle. The polynomial solution is bounded within the given range and third order accuracy is maintained. There is no geometric restriction on the meshes and obtuse triangles are allowed in the partition. Finally, a sequence of numerical examples is carried out to demonstrate the accuracy and capability of the maximum-principle-satisfying limiter.

  1. The prediction of blood-tissue partitions, water-skin partitions and skin permeation for agrochemicals.

    PubMed

    Abraham, Michael H; Gola, Joelle M R; Ibrahim, Adam; Acree, William E; Liu, Xiangli

    2014-07-01

    There is considerable interest in the blood-tissue distribution of agrochemicals, and a number of researchers have developed experimental methods for in vitro distribution. These methods involve the determination of saline-blood and saline-tissue partitions; not only are they indirect, but they do not yield the required in vivo distribution. The authors set out equations for gas-tissue and blood-tissue distribution, for partition from water into skin and for permeation from water through human skin. Together with Abraham descriptors for the agrochemicals, these equations can be used to predict values for all of these processes. The present predictions compare favourably with experimental in vivo blood-tissue distribution where available. The predictions require no more than simple arithmetic. The present method represents a much easier and much more economic way of estimating blood-tissue partitions than the method that uses saline-blood and saline-tissue partitions. It has the added advantages of yielding the required in vivo partitions and being easily extended to the prediction of partition of agrochemicals from water into skin and permeation from water through skin. © 2013 Society of Chemical Industry.
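
    The "simple arithmetic" claim refers to evaluating a linear free energy relationship of the Abraham form, log SP = c + eE + sS + aA + bB + vV; the sketch below evaluates that general form with purely hypothetical coefficients and descriptor values, not the published blood-tissue equations or any real agrochemical.

```python
# Evaluate an Abraham-type linear free energy relationship,
#   log SP = c + e*E + s*S + a*A + b*B + v*V,
# where E, S, A, B, V are solute descriptors and c, e, s, a, b, v are system
# coefficients. All numbers below are hypothetical placeholders used only to
# show that the prediction reduces to simple arithmetic.

def abraham_log_sp(desc, coef):
    """Log of the predicted property for one solute in one system."""
    return (coef["c"] + coef["e"] * desc["E"] + coef["s"] * desc["S"]
            + coef["a"] * desc["A"] + coef["b"] * desc["B"] + coef["v"] * desc["V"])

hypothetical_coef = {"c": -0.50, "e": 0.30, "s": -0.60,
                     "a": -1.20, "b": -3.00, "v": 2.80}
hypothetical_solute = {"E": 0.80, "S": 1.10, "A": 0.30, "B": 0.70, "V": 1.50}

print("predicted log SP:",
      round(abraham_log_sp(hypothetical_solute, hypothetical_coef), 3))
```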

  2. A strategy to load balancing for non-connectivity MapReduce job

    NASA Astrophysics Data System (ADS)

    Zhou, Huaping; Liu, Guangzong; Gui, Haixia

    2017-09-01

    MapReduce has been widely used for large-scale and complex datasets as a distributed programming model. The original hash partitioning function in MapReduce often results in data skew when the data distribution is uneven. To solve the imbalance of data partitioning, we propose a strategy that changes the remaining partitioning indices when data is skewed. In the Map phase, we count the amount of data that will be distributed to each reducer; the JobTracker then monitors the global partitioning information and dynamically modifies the original partitioning function according to the data skew model, so that the Partitioner can redirect the partitions that would cause data skew to other reducers with less load in the next partitioning process, eventually balancing the load of each node. Finally, we experimentally compare our method with existing methods on both synthetic and real datasets; the experimental results show that our strategy solves the problem of data skew with better stability and efficiency than the hash method and the sampling method for non-connectivity MapReduce tasks.
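
    A toy, stand-alone version of the skew-aware repartitioning idea is sketched below: per-key counts gathered in the Map phase are used to move the heaviest keys off overloaded reducers. The function and thresholds are illustrative assumptions; the paper modifies Hadoop's Partitioner and JobTracker rather than using a helper like this.

```python
# Toy skew-aware repartitioning: count per-partition load from the map-side
# key histogram, then remap keys whose default hash partition is overloaded
# to the currently least-loaded reducer.
from collections import Counter

def build_partition_map(key_counts, n_reducers, skew_factor=1.5):
    """Return {key: reducer}, overriding the default hash for skewed keys."""
    default = {k: hash(k) % n_reducers for k in key_counts}
    load = Counter()
    for k, cnt in key_counts.items():
        load[default[k]] += cnt
    avg = sum(key_counts.values()) / n_reducers
    overrides = {}
    # Move keys off overloaded reducers, heaviest keys first.
    for k, cnt in sorted(key_counts.items(), key=lambda kv: -kv[1]):
        r = default[k]
        if load[r] > skew_factor * avg:
            target = min(range(n_reducers), key=lambda x: load[x])
            if target != r:
                load[r] -= cnt
                load[target] += cnt
                overrides[k] = target
    return {**default, **overrides}

# Example: key "a" is hot and would skew whichever reducer hash() picks for it.
counts = {"a": 10_000, "b": 500, "c": 450, "d": 480, "e": 520}
print(build_partition_map(counts, n_reducers=3))
```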

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, J; Gensheimer, M; Dong, X

    Purpose: To develop an intra-tumor partitioning framework for identifying high-risk subregions from 18F-fluorodeoxyglucose positron emission tomography (FDG-PET) and CT imaging, and to test whether tumor burden associated with the high-risk subregions is prognostic of outcomes in lung cancer. Methods: In this institutional review board-approved retrospective study, we analyzed the pre-treatment FDG-PET and CT scans of 44 lung cancer patients treated with radiotherapy. A novel, intra-tumor partitioning method was developed based on a two-stage clustering process: first at patient-level, each tumor was over-segmented into many superpixels by k-means clustering of integrated PET and CT images; next, tumor subregions were identified by merging previously defined superpixels via population-level hierarchical clustering. The volume associated with each of the subregions was evaluated using Kaplan-Meier analysis regarding its prognostic capability in predicting overall survival (OS) and out-of-field progression (OFP). Results: Three spatially distinct subregions were identified within each tumor, which were highly robust to uncertainty in PET/CT co-registration. Among these, the volume of the most metabolically active and metabolically heterogeneous solid component of the tumor was predictive of OS and OFP on the entire cohort, with a concordance index or CI = 0.66–0.67. When restricting the analysis to patients with stage III disease (n = 32), the same subregion achieved an even higher CI = 0.75 (HR = 3.93, logrank p = 0.002) for predicting OS, and a CI = 0.76 (HR = 4.84, logrank p = 0.002) for predicting OFP. In comparison, conventional imaging markers including tumor volume, SUVmax and MTV50 were not predictive of OS or OFP, with CI mostly below 0.60 (p < 0.001). Conclusion: We propose a robust intra-tumor partitioning method to identify clinically relevant, high-risk subregions in lung cancer. We envision that this approach will be applicable to identifying useful imaging biomarkers in many cancer types.

  4. Partitioning of Alkali Metal Salts and Boric Acid from Aqueous Phase into the Polyamide Active Layers of Reverse Osmosis Membranes.

    PubMed

    Wang, Jingbo; Kingsbury, Ryan S; Perry, Lamar A; Coronell, Orlando

    2017-02-21

    The partition coefficient of solutes into the polyamide active layer of reverse osmosis (RO) membranes is one of the three membrane properties (together with solute diffusion coefficient and active layer thickness) that determine solute permeation. However, no well-established method exists to measure solute partition coefficients into polyamide active layers. Further, the few studies that measured partition coefficients for inorganic salts report values significantly higher than one (∼3-8), which is contrary to expectations from Donnan theory and the observed high rejection of salts. As such, we developed a benchtop method to determine solute partition coefficients into the polyamide active layers of RO membranes. The method uses a quartz crystal microbalance (QCM) to measure the change in the mass of the active layer caused by the uptake of the partitioned solutes. The method was evaluated using several inorganic salts (alkali metal salts of chloride) and a weak acid of common concern in water desalination (boric acid). All partition coefficients were found to be lower than 1, in general agreement with expectations from Donnan theory. Results reported in this study advance the fundamental understanding of contaminant transport through RO membranes, and can be used in future studies to decouple the contributions of contaminant partitioning and diffusion to contaminant permeation.
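
    Assuming the partition coefficient is defined as the ratio of the solute concentration inside the active layer to that in the contacting solution, a back-of-the-envelope calculation from a QCM mass change could look like the sketch below; all numbers, variable names, and unit conventions are illustrative assumptions, not values from the study.

```python
def partition_coefficient(delta_mass_ng_cm2, layer_thickness_nm, solution_conc_g_L, molar_mass_g_mol):
    """Partition coefficient K = (solute concentration inside the active layer) /
    (solute concentration in the bulk solution), from a QCM mass uptake."""
    # Moles of solute taken up per unit membrane area (mol/cm^2).
    n_per_area = delta_mass_ng_cm2 * 1e-9 / molar_mass_g_mol
    # Active-layer volume per unit area (cm^3/cm^2 = cm).
    layer_cm = layer_thickness_nm * 1e-7
    conc_in_layer = n_per_area / layer_cm                         # mol/cm^3
    conc_in_bulk = solution_conc_g_L / molar_mass_g_mol / 1000.0  # mol/cm^3
    return conc_in_layer / conc_in_bulk

# Illustrative numbers only (not measurements from the study): 20 ng/cm^2 of NaCl
# taken up by a 100 nm active layer equilibrated with a 0.5 M solution.
print(partition_coefficient(delta_mass_ng_cm2=20.0, layer_thickness_nm=100.0,
                            solution_conc_g_L=58.44 * 0.5, molar_mass_g_mol=58.44))
```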

  5. Domain decomposition by the advancing-partition method for parallel unstructured grid generation

    NASA Technical Reports Server (NTRS)

    Banihashemi, legal representative, Soheila (Inventor); Pirzadeh, Shahyar Z. (Inventor)

    2012-01-01

    In a method for domain decomposition for generating unstructured grids, a surface mesh is generated for a spatial domain. A location of a partition plane dividing the domain into two sections is determined. Triangular faces on the surface mesh that intersect the partition plane are identified. A partition grid of tetrahedral cells, dividing the domain into two sub-domains, is generated using a marching process in which a front comprises only faces of new cells which intersect the partition plane. The partition grid is generated until no active faces remain on the front. Triangular faces on each side of the partition plane are collected into two separate subsets. Each subset of triangular faces is renumbered locally and a local/global mapping is created for each sub-domain. A volume grid is generated for each sub-domain. The partition grid and volume grids are then merged using the local-global mapping.

  6. Methods and Systems for Authorizing an Effector Command in an Integrated Modular Environment

    NASA Technical Reports Server (NTRS)

    Sunderland, Dean E. (Inventor); Ahrendt, Terry J. (Inventor); Moore, Tim (Inventor)

    2013-01-01

    Methods and systems are provided for authorizing a command in an integrated modular environment in which a plurality of partitions control actions of a plurality of effectors. A first identifier, a second identifier, and a third identifier are determined. The first identifier identifies a first partition of the plurality of partitions from which the command originated. The second identifier identifies a first effector of the plurality of effectors for which the command is intended. The third identifier identifies a second partition of the plurality of partitions that is responsible for controlling the first effector. The first identifier and the third identifier are compared to determine whether the first partition is the same as the second partition for authorization of the command.
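
    As a rough illustration of the comparison logic described above, the sketch below authorizes a command only when the originating partition matches the partition responsible for the targeted effector; the data structures and names are hypothetical and not taken from the patent.

```python
def authorize_effector_command(command, partition_table):
    """Authorize a command only if the partition that issued it (first identifier)
    is the partition responsible for the targeted effector (third identifier)."""
    originating_partition = command["source_partition"]       # first identifier
    target_effector = command["effector_id"]                  # second identifier
    controlling_partition = partition_table[target_effector]  # third identifier
    return originating_partition == controlling_partition

# Illustrative mapping of effectors to the partitions allowed to control them.
table = {"rudder": "P2", "elevator": "P3"}
print(authorize_effector_command({"source_partition": "P2", "effector_id": "rudder"}, table))  # True
print(authorize_effector_command({"source_partition": "P1", "effector_id": "rudder"}, table))  # False
```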

  7. Domain Decomposition By the Advancing-Partition Method for Parallel Unstructured Grid Generation

    NASA Technical Reports Server (NTRS)

    Pirzadeh, Shahyar Z.; Zagaris, George

    2009-01-01

    A new method of domain decomposition has been developed for generating unstructured grids in subdomains either sequentially or using multiple computers in parallel. Domain decomposition is a crucial and challenging step for parallel grid generation. Prior methods are generally based on auxiliary, complex, and computationally intensive operations for defining partition interfaces and usually produce grids of lower quality than those generated in single domains. The new technique, referred to as "Advancing Partition," is based on the Advancing-Front method, which partitions a domain as part of the volume mesh generation in a consistent and "natural" way. The benefits of this approach are: 1) the process of domain decomposition is highly automated, 2) partitioning of domain does not compromise the quality of the generated grids, and 3) the computational overhead for domain decomposition is minimal. The new method has been implemented in NASA's unstructured grid generation code VGRID.

  8. Domain Decomposition By the Advancing-Partition Method

    NASA Technical Reports Server (NTRS)

    Pirzadeh, Shahyar Z.

    2008-01-01

    A new method of domain decomposition has been developed for generating unstructured grids in subdomains either sequentially or using multiple computers in parallel. Domain decomposition is a crucial and challenging step for parallel grid generation. Prior methods are generally based on auxiliary, complex, and computationally intensive operations for defining partition interfaces and usually produce grids of lower quality than those generated in single domains. The new technique, referred to as "Advancing Partition," is based on the Advancing-Front method, which partitions a domain as part of the volume mesh generation in a consistent and "natural" way. The benefits of this approach are: 1) the process of domain decomposition is highly automated, 2) partitioning of domain does not compromise the quality of the generated grids, and 3) the computational overhead for domain decomposition is minimal. The new method has been implemented in NASA's unstructured grid generation code VGRID.

  9. Construction and Analysis of Multi-Rate Partitioned Runge-Kutta Methods

    DTIC Science & Technology

    2012-06-01

    Construction and Analysis of Multi-Rate Partitioned Runge-Kutta Methods, Master's thesis by Patrick R. Mugg, June 2012 (Thesis Advisor: Francis Giraldo; Second Reader: Hong...). The most widely known and used procedure for analyzing stability is the Von Neumann method, such that Von Neumann's stability analysis looks at

  10. Robust Intratumor Partitioning to Identify High-Risk Subregions in Lung Cancer: A Pilot Study.

    PubMed

    Wu, Jia; Gensheimer, Michael F; Dong, Xinzhe; Rubin, Daniel L; Napel, Sandy; Diehn, Maximilian; Loo, Billy W; Li, Ruijiang

    2016-08-01

    To develop an intratumor partitioning framework for identifying high-risk subregions from (18)F-fluorodeoxyglucose positron emission tomography (FDG-PET) and computed tomography (CT) imaging and to test whether tumor burden associated with the high-risk subregions is prognostic of outcomes in lung cancer. In this institutional review board-approved retrospective study, we analyzed the pretreatment FDG-PET and CT scans of 44 lung cancer patients treated with radiation therapy. A novel, intratumor partitioning method was developed, based on a 2-stage clustering process: first at the patient level, each tumor was over-segmented into many superpixels by k-means clustering of integrated PET and CT images; next, tumor subregions were identified by merging previously defined superpixels via population-level hierarchical clustering. The volume associated with each of the subregions was evaluated using Kaplan-Meier analysis regarding its prognostic capability in predicting overall survival (OS) and out-of-field progression (OFP). Three spatially distinct subregions were identified within each tumor that were highly robust to uncertainty in PET/CT co-registration. Among these, the volume of the most metabolically active and metabolically heterogeneous solid component of the tumor was predictive of OS and OFP on the entire cohort, with a concordance index or CI of 0.66-0.67. When restricting the analysis to patients with stage III disease (n=32), the same subregion achieved an even higher CI of 0.75 (hazard ratio 3.93, log-rank P=.002) for predicting OS, and a CI of 0.76 (hazard ratio 4.84, log-rank P=.002) for predicting OFP. In comparison, conventional imaging markers, including tumor volume, maximum standardized uptake value, and metabolic tumor volume using threshold of 50% standardized uptake value maximum, were not predictive of OS or OFP, with CI mostly below 0.60 (log-rank P>.05). We propose a robust intratumor partitioning method to identify clinically relevant, high-risk subregions in lung cancer. We envision that this approach will be applicable to identifying useful imaging biomarkers in many cancer types. Copyright © 2016 Elsevier Inc. All rights reserved.
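
    A minimal sketch of the two-stage clustering idea (patient-level k-means over-segmentation followed by population-level hierarchical merging) is given below, assuming scikit-learn and SciPy; the toy per-voxel features and parameter choices are placeholders, and the sketch omits the spatial-contiguity and image-registration details of the actual pipeline.

```python
import numpy as np
from sklearn.cluster import KMeans
from scipy.cluster.hierarchy import linkage, fcluster

def two_stage_partition(patient_images, n_superpixels=50, n_subregions=3):
    """Stage 1: over-segment each tumor into superpixels by k-means on per-voxel
    PET/CT feature vectors. Stage 2: pool superpixel-level mean features across
    patients and merge them into a few subregions by hierarchical clustering."""
    superpixel_features, owners = [], []
    for pid, voxels in enumerate(patient_images):  # voxels: (n_voxels, n_features)
        km = KMeans(n_clusters=n_superpixels, n_init=10, random_state=0).fit(voxels)
        for label in range(n_superpixels):
            superpixel_features.append(voxels[km.labels_ == label].mean(axis=0))
            owners.append(pid)
    Z = linkage(np.asarray(superpixel_features), method="ward")
    subregion_labels = fcluster(Z, t=n_subregions, criterion="maxclust")
    return np.asarray(owners), subregion_labels

# Toy data: two "patients", each with 500 voxels described by (PET SUV, CT HU).
rng = np.random.default_rng(0)
patients = [rng.normal(size=(500, 2)), rng.normal(size=(500, 2))]
owners, labels = two_stage_partition(patients, n_superpixels=20, n_subregions=3)
print(labels[:10])
```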

  11. New approach to canonical partition functions computation in Nf=2 lattice QCD at finite baryon density

    NASA Astrophysics Data System (ADS)

    Bornyakov, V. G.; Boyda, D. L.; Goy, V. A.; Molochkov, A. V.; Nakamura, Atsushi; Nikolaev, A. A.; Zakharov, V. I.

    2017-05-01

    We propose and test a new approach to the computation of canonical partition functions in lattice QCD at finite density. The procedure consists of a few steps. We first compute numerically the quark number density for imaginary chemical potential iμ_q^I. We then restore the grand canonical partition function for imaginary chemical potential using a fitting procedure for the quark number density. Finally, we compute the canonical partition functions using a high-precision numerical Fourier transformation. Additionally, we compute the canonical partition functions using the known method of the hopping parameter expansion and compare the results obtained by the two methods in the deconfining as well as the confining phase. The agreement between the two methods indicates the validity of the new approach. Our numerical results are obtained in two-flavor lattice QCD with clover-improved Wilson fermions.
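
    The toy Python sketch below illustrates the fit, integrate, and Fourier-transform workflow of the abstract with fabricated "density" data; volume and temperature factors, the lattice observables, and the actual fitting ansatz are all omitted or assumed, so this is only a schematic of the numerical steps, not the authors' analysis.

```python
import numpy as np

# (1) fit the quark number density measured at imaginary chemical potential
#     theta = mu_q^I / T, (2) integrate the fit to reconstruct ln Z_GC(theta),
# (3) Fourier-transform Z_GC(theta) to obtain canonical partition functions Z_n.
theta = np.linspace(0.0, 2.0 * np.pi, 256, endpoint=False)
n_density = 0.8 * np.sin(theta) - 0.1 * np.sin(2 * theta)   # toy "measured" density

# (1)-(2): fit a truncated Fourier (sine) series and integrate it analytically.
coeffs = [2.0 / len(theta) * np.sum(n_density * np.sin(k * theta)) for k in (1, 2, 3)]
log_Zgc = sum(-c / k * (np.cos(k * theta) - 1.0) for k, c in zip((1, 2, 3), coeffs))

# (3): numerical Fourier transform of Z_GC(theta); Z_n are reported relative to Z_0.
Zgc = np.exp(log_Zgc)
Z_n = {n: np.real(np.mean(Zgc * np.exp(-1j * n * theta))) for n in range(0, 6)}
print({n: round(z / Z_n[0], 6) for n, z in Z_n.items()})
```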

  12. Alterations in internal partitioning of carbon in soybean plants in response to nitrogen stress

    NASA Technical Reports Server (NTRS)

    Rufty, T. W. Jr; Raper, C. D. Jr; Huber, S. C.

    1984-01-01

    Alterations in internal partitioning of carbon were evaluated in plants exposed to limited nitrogen supply. Vegetative, nonnodulated soybean plants (Glycine max (L.) Merrill, 'Ransom') were grown for 21 days with 1.0 mM NO3- and then exposed to solutions containing 1.0, 0.1, or 0.0 mM NO3- for a 25-day treatment period. In nitrogen-limited plants, there were decreases in emergence of new leaves and in the expansion rate and final area at full expansion of individual leaves. As indicated by alterations in accumulation of dry weight, a larger proportion of available carbon in the plant was partitioned to the roots with decreased availability of nitrogen. Partitioning of reduced nitrogen to the root also was increased and, in plants devoid of an external supply, considerable redistribution of reduced nitrogen from leaves to the root occurred. The general decrease in growth potential and sink strength for nutrients in leaves of nitrogen-limited plants suggested that factors other than simply availability of nitrogen likely were involved in the restriction of growth in the leaf canopy and the associated increase in carbon allocation to the roots.

  13. A novel partitioning method for block-structured adaptive meshes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fu, Lin, E-mail: lin.fu@tum.de; Litvinov, Sergej, E-mail: sergej.litvinov@aer.mw.tum.de; Hu, Xiangyu Y., E-mail: xiangyu.hu@tum.de

    We propose a novel partitioning method for block-structured adaptive meshes utilizing the meshless Lagrangian particle concept. With the observation that an optimum partitioning has high analogy to the relaxation of a multi-phase fluid to steady state, physically motivated model equations are developed to characterize the background mesh topology and are solved by multi-phase smoothed-particle hydrodynamics. In contrast to well established partitioning approaches, all optimization objectives are implicitly incorporated and achieved during the particle relaxation to stationary state. Distinct partitioning sub-domains are represented by colored particles and separated by a sharp interface with a surface tension model. In order to obtain the particle relaxation, special viscous and skin friction models, coupled with a tailored time integration algorithm are proposed. Numerical experiments show that the present method has several important properties: generation of approximately equal-sized partitions without dependence on the mesh-element type, optimized interface communication between distinct partitioning sub-domains, continuous domain decomposition which is physically localized and implicitly incremental. Therefore it is particularly suitable for load-balancing of high-performance CFD simulations.

  14. A novel partitioning method for block-structured adaptive meshes

    NASA Astrophysics Data System (ADS)

    Fu, Lin; Litvinov, Sergej; Hu, Xiangyu Y.; Adams, Nikolaus A.

    2017-07-01

    We propose a novel partitioning method for block-structured adaptive meshes utilizing the meshless Lagrangian particle concept. With the observation that an optimum partitioning has high analogy to the relaxation of a multi-phase fluid to steady state, physically motivated model equations are developed to characterize the background mesh topology and are solved by multi-phase smoothed-particle hydrodynamics. In contrast to well established partitioning approaches, all optimization objectives are implicitly incorporated and achieved during the particle relaxation to stationary state. Distinct partitioning sub-domains are represented by colored particles and separated by a sharp interface with a surface tension model. In order to obtain the particle relaxation, special viscous and skin friction models, coupled with a tailored time integration algorithm are proposed. Numerical experiments show that the present method has several important properties: generation of approximately equal-sized partitions without dependence on the mesh-element type, optimized interface communication between distinct partitioning sub-domains, continuous domain decomposition which is physically localized and implicitly incremental. Therefore it is particularly suitable for load-balancing of high-performance CFD simulations.

  15. The Partition of Multi-Resolution LOD Based on Qtm

    NASA Astrophysics Data System (ADS)

    Hou, M.-L.; Xing, H.-Q.; Zhao, X.-S.; Chen, J.

    2011-08-01

    The partition hierarchy of the Quaternary Triangular Mesh (QTM) determines the accuracy of spatial analysis and applications based on QTM. To address the problem that the partition depth of QTM is limited by the level of the computer hardware, this paper discusses a new method, a multi-resolution LOD (Level of Detail) based on QTM. The method lets the resolution of the cells vary with the viewpoint position by partitioning the QTM cells and selecting the particular area according to the viewpoint; by handling the cracks caused by different subdivision levels, it satisfies the requirement of effectively unlimited local partitioning.

  16. Correlation functions in first-order phase transitions

    NASA Astrophysics Data System (ADS)

    Garrido, V.; Crespo, D.

    1997-09-01

    Most of the physical properties of systems underlying first-order phase transitions can be obtained from the spatial correlation functions. In this paper, we obtain expressions that allow us to calculate all the correlation functions from the droplet size distribution. Nucleation and growth kinetics is considered, and exact solutions are obtained for the case of isotropic growth by using self-similarity properties. The calculation is performed by using the particle size distribution obtained by a recently developed model (populational Kolmogorov-Johnson-Mehl-Avrami model). Since this model is less restrictive than that used in previously existing theories, the result is that the correlation functions can be obtained for any dependence of the kinetic parameters. The validity of the method is tested by comparison with the exact correlation functions, which had been obtained in the available cases by the time-cone method. Finally, the correlation functions corresponding to the microstructure developed in partitioning transformations are obtained.

  17. Dominant partition method. [based on a wave function formalism

    NASA Technical Reports Server (NTRS)

    Dixon, R. M.; Redish, E. F.

    1979-01-01

    By use of the L'Huillier, Redish, and Tandy (LRT) wave function formalism, a partially connected method, the dominant partition method (DPM) is developed for obtaining few body reductions of the many body problem in the LRT and Bencze, Redish, and Sloan (BRS) formalisms. The DPM maps the many body problem to a fewer body one by using the criterion that the truncated formalism must be such that consistency with the full Schroedinger equation is preserved. The DPM is based on a class of new forms for the irreducible cluster potential, which is introduced in the LRT formalism. Connectivity is maintained with respect to all partitions containing a given partition, which is referred to as the dominant partition. Degrees of freedom corresponding to the breakup of one or more of the clusters of the dominant partition are treated in a disconnected manner. This approach for simplifying the complicated BRS equations is appropriate for physical problems where a few body reaction mechanism prevails.

  18. Analysis of Partitioned Methods for the Biot System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bukac, Martina; Layton, William; Moraiti, Marina

    2015-02-18

    In this work, we present a comprehensive study of several partitioned methods for the coupling of flow and mechanics. We derive energy estimates for each method for the fully-discrete problem. We write the obtained stability conditions in terms of a key control parameter defined as a ratio of the coupling strength and the speed of propagation. Depending on the parameters in the problem, we give the choice of partitioned method that allows the largest time step. (C) 2015 Wiley Periodicals, Inc.

  19. Comparing methods for partitioning a decade of carbon dioxide and water vapor fluxes in a temperate forest

    Treesearch

    Benjamin N. Sulman; Daniel Tyler Roman; Todd M. Scanlon; Lixin Wang; Kimberly A. Novick

    2016-01-01

    The eddy covariance (EC) method is routinely used to measure net ecosystem fluxes of carbon dioxide (CO2) and evapotranspiration (ET) in terrestrial ecosystems. It is often desirable to partition CO2 flux into gross primary production (GPP) and ecosystem respiration (RE), and to partition ET into evaporation and...

  20. Significant Scales in Community Structure

    NASA Astrophysics Data System (ADS)

    Traag, V. A.; Krings, G.; van Dooren, P.

    2013-10-01

    Many complex networks show signs of modular structure, uncovered by community detection. Although many methods succeed in revealing various partitions, it remains difficult to detect at what scale some partition is significant. This problem shows foremost in multi-resolution methods. We here introduce an efficient method for scanning for resolutions in one such method. Additionally, we introduce the notion of ``significance'' of a partition, based on subgraph probabilities. Significance is independent of the exact method used, so could also be applied in other methods, and can be interpreted as the gain in encoding a graph by making use of a partition. Using significance, we can determine ``good'' resolution parameters, which we demonstrate on benchmark networks. Moreover, optimizing significance itself also shows excellent performance. We demonstrate our method on voting data from the European Parliament. Our analysis suggests the European Parliament has become increasingly ideologically divided and that nationality plays no role.
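
    As a rough illustration of scanning resolution parameters (though not of the subgraph-probability "significance" measure itself), the sketch below assumes networkx ≥ 2.8 and simply reports how the Louvain partition changes with the resolution parameter on a synthetic benchmark graph.

```python
import networkx as nx
from networkx.algorithms.community import louvain_communities

# Scan resolution parameters and report how the induced partition changes.
G = nx.planted_partition_graph(l=4, k=25, p_in=0.3, p_out=0.02, seed=1)
for resolution in (0.25, 0.5, 1.0, 2.0, 4.0):
    parts = louvain_communities(G, resolution=resolution, seed=1)
    sizes = sorted((len(c) for c in parts), reverse=True)
    print(f"resolution={resolution:>4}: {len(parts)} communities, sizes={sizes}")
```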

  1. Innovative Bayesian and Parsimony Phylogeny of Dung Beetles (Coleoptera, Scarabaeidae, Scarabaeinae) Enhanced by Ontology-Based Partitioning of Morphological Characters

    PubMed Central

    Tarasov, Sergei; Génier, François

    2015-01-01

    Scarabaeine dung beetles are the dominant dung feeding group of insects and are widely used as model organisms in conservation, ecology and developmental biology. Due to the conflicts among 13 recently published phylogenies dealing with the higher-level relationships of dung beetles, the phylogeny of this lineage remains largely unresolved. In this study, we conduct rigorous phylogenetic analyses of dung beetles, based on an unprecedented taxon sample (110 taxa) and detailed investigation of morphology (205 characters). We provide the description of morphology and thoroughly illustrate the used characters. Along with parsimony, traditionally used in the analysis of morphological data, we also apply the Bayesian method with a novel approach that uses anatomy ontology for matrix partitioning. This approach allows for heterogeneity in evolutionary rates among characters from different anatomical regions. Anatomy ontology generates a number of parameter-partition schemes which we compare using Bayes factor. We also test the effect of inclusion of autapomorphies in the morphological analysis, which hitherto has not been examined. Generally, schemes with more parameters were favored in the Bayesian comparison suggesting that characters located on different body regions evolve at different rates and that partitioning of the data matrix using anatomy ontology is reasonable; however, trees from the parsimony and all the Bayesian analyses were quite consistent. The hypothesized phylogeny reveals many novel clades and provides additional support for some clades recovered in previous analyses. Our results provide a solid basis for a new classification of dung beetles, in which the taxonomic limits of the tribes Dichotomiini, Deltochilini and Coprini are restricted and many new tribes must be described. Based on the consistency of the phylogeny with biogeography, we speculate that dung beetles may have originated in the Mesozoic contrary to the traditional view pointing to a Cenozoic origin. PMID:25781019

  2. Improving Unstructured Mesh Partitions for Multiple Criteria Using Mesh Adjacencies

    DOE PAGES

    Smith, Cameron W.; Rasquin, Michel; Ibanez, Dan; ...

    2018-02-13

    The scalability of unstructured mesh based applications depends on partitioning methods that quickly balance the computational work while reducing communication costs. Zhou et al. [SIAM J. Sci. Comput., 32 (2010), pp. 3201–3227; J. Supercomput., 59 (2012), pp. 1218–1228] demonstrated the combination of (hyper)graph methods with vertex and element partition improvement for PHASTA CFD scaling to hundreds of thousands of processes. Our work generalizes partition improvement to support balancing combinations of all the mesh entity dimensions (vertices, edges, faces, regions) in partitions with imbalances exceeding 70%. Improvement results are then presented for multiple entity dimensions on up to one million processes on meshes with over 12 billion tetrahedral elements.

  3. Improving Unstructured Mesh Partitions for Multiple Criteria Using Mesh Adjacencies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Cameron W.; Rasquin, Michel; Ibanez, Dan

    The scalability of unstructured mesh based applications depends on partitioning methods that quickly balance the computational work while reducing communication costs. Zhou et al. [SIAM J. Sci. Comput., 32 (2010), pp. 3201–3227; J. Supercomput., 59 (2012), pp. 1218–1228] demonstrated the combination of (hyper)graph methods with vertex and element partition improvement for PHASTA CFD scaling to hundreds of thousands of processes. Our work generalizes partition improvement to support balancing combinations of all the mesh entity dimensions (vertices, edges, faces, regions) in partitions with imbalances exceeding 70%. Improvement results are then presented for multiple entity dimensions on up to one million processes on meshes with over 12 billion tetrahedral elements.

  4. Multilevel algorithms for nonlinear optimization

    NASA Technical Reports Server (NTRS)

    Alexandrov, Natalia; Dennis, J. E., Jr.

    1994-01-01

    Multidisciplinary design optimization (MDO) gives rise to nonlinear optimization problems characterized by a large number of constraints that naturally occur in blocks. We propose a class of multilevel optimization methods motivated by the structure and number of constraints and by the expense of the derivative computations for MDO. The algorithms are an extension to the nonlinear programming problem of the successful class of local Brown-Brent algorithms for nonlinear equations. Our extensions allow the user to partition constraints into arbitrary blocks to fit the application, and they separately process each block and the objective function, restricted to certain subspaces. The methods use trust regions as a globalization strategy, and they have been shown to be globally convergent under reasonable assumptions. The multilevel algorithms can be applied to all classes of MDO formulations. Multilevel algorithms for solving nonlinear systems of equations are a special case of the multilevel optimization methods. In this case, they can be viewed as a trust-region globalization of the Brown-Brent class.

  5. Overlapped Partitioning for Ensemble Classifiers of P300-Based Brain-Computer Interfaces

    PubMed Central

    Onishi, Akinari; Natsume, Kiyohisa

    2014-01-01

    A P300-based brain-computer interface (BCI) enables a wide range of people to control devices that improve their quality of life. Ensemble classifiers with naive partitioning were recently applied to the P300-based BCI and these classification performances were assessed. However, they were usually trained on a large amount of training data (e.g., 15300). In this study, we evaluated ensemble linear discriminant analysis (LDA) classifiers with a newly proposed overlapped partitioning method using 900 training data. In addition, the classification performances of the ensemble classifier with naive partitioning and a single LDA classifier were compared. One of three conditions for dimension reduction was applied: the stepwise method, principal component analysis (PCA), or none. The results show that an ensemble stepwise LDA (SWLDA) classifier with overlapped partitioning achieved a better performance than the commonly used single SWLDA classifier and an ensemble SWLDA classifier with naive partitioning. This result implies that the performance of the SWLDA is improved by overlapped partitioning and the ensemble classifier with overlapped partitioning requires less training data than that with naive partitioning. This study contributes towards reducing the required amount of training data and achieving better classification performance. PMID:24695550

  6. Overlapped partitioning for ensemble classifiers of P300-based brain-computer interfaces.

    PubMed

    Onishi, Akinari; Natsume, Kiyohisa

    2014-01-01

    A P300-based brain-computer interface (BCI) enables a wide range of people to control devices that improve their quality of life. Ensemble classifiers with naive partitioning were recently applied to the P300-based BCI and these classification performances were assessed. However, they were usually trained on a large amount of training data (e.g., 15300). In this study, we evaluated ensemble linear discriminant analysis (LDA) classifiers with a newly proposed overlapped partitioning method using 900 training data. In addition, the classification performances of the ensemble classifier with naive partitioning and a single LDA classifier were compared. One of three conditions for dimension reduction was applied: the stepwise method, principal component analysis (PCA), or none. The results show that an ensemble stepwise LDA (SWLDA) classifier with overlapped partitioning achieved a better performance than the commonly used single SWLDA classifier and an ensemble SWLDA classifier with naive partitioning. This result implies that the performance of the SWLDA is improved by overlapped partitioning and the ensemble classifier with overlapped partitioning requires less training data than that with naive partitioning. This study contributes towards reducing the required amount of training data and achieving better classification performance.
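
    A minimal sketch of the contrast between naive and overlapped partitioning of training data for an LDA ensemble is shown below, assuming scikit-learn; the toy features, overlap fraction, and score-averaging rule are assumptions and differ from the stepwise LDA pipeline used in the study.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def ensemble_lda(X, y, n_members=5, overlap=0.5, seed=0):
    """Train an ensemble of LDA classifiers on (possibly overlapping) partitions
    of the training data and predict by averaging decision scores."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))
    block = len(X) // n_members
    size = int(block * (1 + overlap))  # overlap=0 reproduces naive partitioning
    members = []
    for m in range(n_members):
        chunk = np.take(idx, range(m * block, m * block + size), mode="wrap")
        members.append(LinearDiscriminantAnalysis().fit(X[chunk], y[chunk]))
    def predict(X_new):
        scores = np.mean([clf.decision_function(X_new) for clf in members], axis=0)
        return (scores > 0).astype(int)
    return predict

# Toy binary data standing in for 900 P300 feature vectors.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (450, 10)), rng.normal(0.8, 1, (450, 10))])
y = np.array([0] * 450 + [1] * 450)
predict = ensemble_lda(X, y, n_members=5, overlap=0.5)
print(predict(X[:5]), predict(X[-5:]))
```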

  7. Using Optimisation Techniques to Granulise Rough Set Partitions

    NASA Astrophysics Data System (ADS)

    Crossingham, Bodie; Marwala, Tshilidzi

    2007-11-01

    This paper presents an approach to optimise rough set partition sizes using various optimisation techniques. Three optimisation techniques are implemented to perform the granularisation process, namely, genetic algorithm (GA), hill climbing (HC) and simulated annealing (SA). These optimisation methods maximise the classification accuracy of the rough sets. The proposed rough set partition method is tested on a set of demographic properties of individuals obtained from the South African antenatal survey. The three techniques are compared in terms of their computational time, accuracy and number of rules produced when applied to the Human Immunodeficiency Virus (HIV) data set. The results of the optimised methods are compared to a well-known non-optimised discretisation method, equal-width-bin partitioning (EWB). The accuracies achieved after optimising the partitions using GA, HC and SA are 66.89%, 65.84% and 65.48% respectively, compared to the accuracy of EWB of 59.86%. In addition to rough sets providing the plausibilities of the estimated HIV status, they also provide the linguistic rules describing how the demographic parameters drive the risk of HIV.
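
    The sketch below illustrates the general idea of optimising partition (bin) boundaries with hill climbing and comparing against equal-width-bin partitioning; the majority-vote accuracy score stands in for the rough-set classification accuracy, and all names and data are illustrative rather than taken from the paper.

```python
import numpy as np

def equal_width_bins(x, n_bins):
    return np.linspace(x.min(), x.max(), n_bins + 1)[1:-1]

def accuracy(edges, x, y):
    """Score a discretisation by majority-vote accuracy inside each bin
    (a stand-in for the rough-set classification accuracy in the paper)."""
    bins = np.digitize(x, edges)
    correct = sum(max(np.bincount(y[bins == b])) for b in np.unique(bins))
    return correct / len(y)

def hill_climb(x, y, n_bins=4, step=0.05, iters=200, seed=0):
    rng = np.random.default_rng(seed)
    edges = equal_width_bins(x, n_bins)
    best = accuracy(edges, x, y)
    for _ in range(iters):
        cand = np.sort(edges + rng.normal(0, step * x.std(), size=edges.shape))
        score = accuracy(cand, x, y)
        if score > best:
            edges, best = cand, score
    return edges, best

# Toy one-dimensional demographic feature with a noisy binary outcome.
rng = np.random.default_rng(42)
x = rng.uniform(0, 1, 1000)
y = (x > 0.37).astype(int) ^ (rng.uniform(size=1000) < 0.1).astype(int)
print("EWB accuracy:", accuracy(equal_width_bins(x, 4), x, y))
print("HC accuracy :", hill_climb(x, y)[1])
```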

  8. Use of strainrange partitioning to predict high temperature low-cycle fatigue life. [of metallic materials

    NASA Technical Reports Server (NTRS)

    Hirschberg, M. H.; Halford, G. R.

    1976-01-01

    The fundamental concepts of the strainrange partitioning approach to high temperature, low low-cycle fatigue are reviewed. Procedures are presented by which the partitioned strainrange versus life relationships for any material can be generated. Laboratory tests are suggested for further verifying the ability of the method of strainrange partitioning to predict life.

  9. Improved 3-D turbomachinery CFD algorithm

    NASA Technical Reports Server (NTRS)

    Janus, J. Mark; Whitfield, David L.

    1988-01-01

    The building blocks of a computer algorithm developed for the time-accurate flow analysis of rotating machines are described. The flow model is a finite volume method utilizing a high resolution approximate Riemann solver for interface flux definitions. This block LU implicit numerical scheme possesses apparent unconditional stability. Multi-block composite gridding is used to orderly partition the field into a specified arrangement. Block interfaces, including dynamic interfaces, are treated such as to mimic interior block communication. Special attention is given to the reduction of in-core memory requirements by placing the burden on secondary storage media. Broad applicability is implied, although the results presented are restricted to that of an even blade count configuration. Several other configurations are presently under investigation, the results of which will appear in subsequent publications.

  10. A regional strategy for ecological sustainability: A case study in Southwest China.

    PubMed

    Wu, Xue; Liu, Shiliang; Cheng, Fangyan; Hou, Xiaoyun; Zhang, Yueqiu; Dong, Shikui; Liu, Guohua

    2018-03-01

    Partitioning, a method considering environmental protection and development potential, is an effective way to provide regional management strategies to maintain ecological sustainability. In this study, we provide a large-scale regional division approach and present a strategy for Southwest China, which also has extremely high development potential because of the "Western development" policy. Based on the superposition of 15 factors, including species diversity, pattern restriction, agricultural potential, accessibility, urbanization potential, and topographical limitations, the environmental value and development benefit in the region were quantified spatially by weighting the sum of indicators within environmental and development categories. By comparing the scores with their respective median values, the study area was divided into four different strategy zones: Conserve zones (34.94%), Construction zones (32.95%), Conflict zones (16.96%), and Low-tension zones (15.16%). The Conflict zones in which environmental value and development benefit were both higher than the respective medians were separated further into the following 5 levels: Extreme conflict (36.20%), Serious conflict (28.07%), Moderate conflict (12.28%), Minor conflict (6.55%), and Slight conflict (16.91%). We found that 9.04% of nature reserves were in Conflict zones, and thus should be given more attention. This study provides a simple and feasible method for regional partitioning, as well as comprehensive support that weighs both the environmental value and development benefit for China's current Ecological Red Line and space planning and for regional management in similar situations. Copyright © 2017 Elsevier B.V. All rights reserved.
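
    Only the Conflict zone is defined explicitly in the abstract (both scores above their medians), so the quadrant assignments for the other three zones in the sketch below are assumptions; it merely illustrates the median-based partitioning step.

```python
import numpy as np

def strategy_zones(env_value, dev_benefit):
    """Assign each cell to a strategy zone by comparing its environmental value
    and development benefit with the respective medians."""
    env_med, dev_med = np.median(env_value), np.median(dev_benefit)
    zones = np.empty(len(env_value), dtype=object)
    zones[(env_value >= env_med) & (dev_benefit < dev_med)] = "Conserve"      # assumed
    zones[(env_value < env_med) & (dev_benefit >= dev_med)] = "Construction"  # assumed
    zones[(env_value >= env_med) & (dev_benefit >= dev_med)] = "Conflict"     # per abstract
    zones[(env_value < env_med) & (dev_benefit < dev_med)] = "Low-tension"    # assumed
    return zones

# Toy scores for nine grid cells (weighted indicator sums would go here).
env = np.array([0.9, 0.8, 0.2, 0.4, 0.7, 0.1, 0.6, 0.3, 0.5])
dev = np.array([0.2, 0.9, 0.8, 0.3, 0.7, 0.1, 0.4, 0.6, 0.5])
print(dict(zip(range(9), strategy_zones(env, dev))))
```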

  11. Restricted ADP movement in cardiomyocytes: Cytosolic diffusion obstacles are complemented with a small number of open mitochondrial voltage-dependent anion channels.

    PubMed

    Simson, Päivo; Jepihhina, Natalja; Laasmaa, Martin; Peterson, Pearu; Birkedal, Rikke; Vendelin, Marko

    2016-08-01

    Adequate intracellular energy transfer is crucial for proper cardiac function. In energy starved failing hearts, partial restoration of energy transfer can rescue mechanical performance. There are two types of diffusion obstacles that interfere with energy transfer from mitochondria to ATPases: mitochondrial outer membrane (MOM) with voltage-dependent anion channel (VDAC) permeable to small hydrophilic molecules and cytoplasmatic diffusion barriers grouping ATP-producers and -consumers. So far, there is no method developed to clearly distinguish the contributions of cytoplasmatic barriers and MOM to the overall diffusion restriction. Furthermore, the number of open VDACs in vivo remains unknown. The aim of this work was to establish the partitioning of intracellular diffusion obstacles in cardiomyocytes. We studied the response of mitochondrial oxidative phosphorylation of permeabilized rat cardiomyocytes to changes in extracellular ADP by recording 3D image stacks of NADH autofluorescence. Using cell-specific mathematical models, we determined the permeability of MOM and cytoplasmatic barriers. We found that only ~2% of VDACs are accessible to cytosolic ADP and cytoplasmatic diffusion barriers reduce the apparent diffusion coefficient by 6-10×. In cardiomyocytes, diffusion barriers in the cytoplasm and by the MOM restrict ADP/ATP diffusion to similar extents suggesting a major role of both barriers in energy transfer and other intracellular processes. Copyright © 2016 Elsevier Ltd. All rights reserved.

  12. Air Traffic Sector Configuration Change Frequency

    NASA Technical Reports Server (NTRS)

    Chatterji, Gano Broto; Drew, Michael

    2009-01-01

    Several techniques for partitioning airspace have been developed in the literature. The question of whether a region of airspace created by such methods can be used with other days of traffic, and the number of times a different partition is needed during the day, are examined in this paper. Both these aspects are examined for the Fort Worth Center airspace sectors. A Mixed Integer Linear Programming method is used with actual air traffic data of ten high-volume low-weather-delay days for creating sectors. Nine solutions were obtained for each two-hour period of the day by partitioning the center airspace into two through 18 sectors in steps of two sectors. Actual track-data were played back with the generated partitions for creating histograms of the traffic-counts. The best partition for each two-hour period was then identified based on the nine traffic-count distributions. Numbers of sectors in such partitions were analyzed to determine the number of times a different configuration is needed during the day. One to three partitions were selected for the 24-hour period, and traffic data from ten days were played back to test if the traffic-counts stayed below the threshold values associated with these partitions. Results show that these partitions are robust and can be used for longer durations than they were designed for.

  13. Unsupervised hierarchical partitioning of hyperspectral images: application to marine algae identification

    NASA Astrophysics Data System (ADS)

    Chen, B.; Chehdi, K.; De Oliveria, E.; Cariou, C.; Charbonnier, B.

    2015-10-01

    In this paper, a new unsupervised top-down hierarchical classification method to partition airborne hyperspectral images is proposed. The unsupervised approach is preferred because the difficulty of area access and the human and financial resources required to obtain ground truth data constitute serious handicaps, especially over the large areas that can be covered by airborne or satellite images. The developed classification approach allows i) a successive partitioning of the data into several levels or partitions in which the main classes are first identified, ii) an automatic estimation of the number of classes at each level without any end-user help, iii) a nonsystematic subdivision of all classes of a partition Pj to form a partition Pj+1, and iv) a stable partitioning result for the same data set from one run of the method to another. The proposed approach was validated on synthetic and real hyperspectral images related to the identification of several marine algae species. In addition to highly accurate and consistent results (correct classification rate over 99%), this approach is completely unsupervised. It estimates, at each level, the optimal number of classes and the final partition without any end-user intervention.

  14. Software Model Checking of ARINC-653 Flight Code with MCP

    NASA Technical Reports Server (NTRS)

    Thompson, Sarah J.; Brat, Guillaume; Venet, Arnaud

    2010-01-01

    The ARINC-653 standard defines a common interface for Integrated Modular Avionics (IMA) code. In particular, ARINC-653 Part 1 specifies a process- and partition-management API that is analogous to POSIX threads, but with certain extensions and restrictions intended to support the implementation of high reliability flight code. MCP is a software model checker, developed at NASA Ames, that provides capabilities for model checking C and C++ source code. In this paper, we present recent work aimed at implementing extensions to MCP that support ARINC-653, and we discuss the challenges and opportunities that consequently arise. Providing support for ARINC-653's time and space partitioning is nontrivial, though implicit benefits for partial-order reduction are possible as a consequence of the API's strict interprocess communication policy.

  15. ADHM and the 4d quantum Hall effect

    NASA Astrophysics Data System (ADS)

    Barns-Graham, Alec; Dorey, Nick; Lohitsiri, Nakarin; Tong, David; Turner, Carl

    2018-04-01

    Yang-Mills instantons are solitonic particles in d = 4 + 1 dimensional gauge theories. We construct and analyse the quantum Hall states that arise when these particles are restricted to the lowest Landau level. We describe the ground state wavefunctions for both Abelian and non-Abelian quantum Hall states. Although our model is purely bosonic, we show that the excitations of this 4d quantum Hall state are governed by the Nekrasov partition function of a certain five dimensional supersymmetric gauge theory with Chern-Simons term. The partition function can also be interpreted as a variant of the Hilbert series of the instanton moduli space, counting holomorphic sections rather than holomorphic functions. It is known that the Hilbert series of the instanton moduli space can be rewritten using mirror symmetry of 3d gauge theories in terms of Coulomb branch variables. We generalise this approach to include the effect of a five dimensional Chern-Simons term. We demonstrate that the resulting Coulomb branch formula coincides with the corresponding Higgs branch Molien integral which, in turn, reproduces the standard formula for the Nekrasov partition function.

  16. Antisense Suppression of the Small Chloroplast Protein CP12 in Tobacco Alters Carbon Partitioning and Severely Restricts Growth

    PubMed Central

    Howard, Thomas P.; Fryer, Michael J.; Singh, Prashant; Metodiev, Metodi; Lytovchenko, Anna; Obata, Toshihiro; Fernie, Alisdair R.; Kruger, Nicholas J.; Quick, W. Paul; Lloyd, Julie C.; Raines, Christine A.

    2011-01-01

    The thioredoxin-regulated chloroplast protein CP12 forms a multienzyme complex with the Calvin-Benson cycle enzymes phosphoribulokinase (PRK) and glyceraldehyde-3-phosphate dehydrogenase (GAPDH). PRK and GAPDH are inactivated when present in this complex, a process shown in vitro to be dependent upon oxidized CP12. The importance of CP12 in vivo in higher plants, however, has not been investigated. Here, antisense suppression of CP12 in tobacco (Nicotiana tabacum) was observed to impact on NAD-induced PRK and GAPDH complex formation but had little effect on enzyme activity. Additionally, only minor changes in photosynthetic carbon fixation were observed. Despite this, antisense plants displayed changes in growth rates and morphology, including dwarfism and reduced apical dominance. The hypothesis that CP12 is essential to separate oxidative pentose phosphate pathway activity from Calvin-Benson cycle activity, as proposed in cyanobacteria, was tested. No evidence was found to support this role in tobacco. Evidence was seen, however, for a restriction to malate valve capacity, with decreases in NADP-malate dehydrogenase activity (but not protein levels) and pyridine nucleotide content. Antisense repression of CP12 also led to significant changes in carbon partitioning, with increased carbon allocation to the cell wall and the organic acids malate and fumarate and decreased allocation to starch and soluble carbohydrates. Severe decreases were also seen in 2-oxoglutarate content, a key indicator of cellular carbon sufficiency. The data presented here indicate that in tobacco, CP12 has a role in redox-mediated regulation of carbon partitioning from the chloroplast and provides strong in vivo evidence that CP12 is required for normal growth and development in plants. PMID:21865489

  17. Life prediction of thermal-mechanical fatigue using strainrange partitioning

    NASA Technical Reports Server (NTRS)

    Halford, G. R.; Manson, S. S.

    1975-01-01

    This paper describes the applicability of the method of Strainrange Partitioning to the life prediction of thermal-mechanical strain-cycling fatigue. An in-phase test on 316 stainless steel is analyzed as an illustrative example. The observed life is in excellent agreement with the life predicted by the method using the recently proposed Step-Stress Method of experimental partitioning, the Interaction Damage Rule, and the life relationships determined at an isothermal temperature of 705 C. Implications of the present study are discussed relative to the general thermal fatigue problem.

  18. Life prediction of thermal-mechanical fatigue using strain-range partitioning

    NASA Technical Reports Server (NTRS)

    Halford, G. R.; Manson, S. S.

    1975-01-01

    The applicability of the method of Strainrange Partitioning to the life prediction of thermal-mechanical strain-cycling fatigue is described. An in-phase test on 316 stainless steel is analyzed as an illustrative example. The observed life is in excellent agreement with the life predicted by the method using the recently proposed Step-Stress Method of experimental partitioning, the Interaction Damage Rule, and the life relationships determined at an isothermal temperature of 705 C. Implications of the study are discussed relative to the general thermal fatigue problem.

  19. Choosing the best partition of the output from a large-scale simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Challacombe, Chelsea Jordan; Casleton, Emily Michele

    Data partitioning becomes necessary when a large-scale simulation produces more data than can be feasibly stored. The goal is to partition the data, typically so that every element belongs to one and only one partition, and store summary information about the partition, either a representative value plus an estimate of the error or a distribution. Once the partitions are determined and the summary information stored, the raw data is discarded. This process can be performed in situ, meaning while the simulation is running. When creating the partitions there are many decisions that researchers must make. For instance, how to determine when an adequate number of partitions has been created, how the partitions are created with respect to dividing the data, or how many variables should be considered simultaneously. In addition, decisions must be made for how to summarize the information within each partition. Because of the combinatorial number of possible ways to partition and summarize the data, a method of comparing the different possibilities will help guide researchers into choosing a good partitioning and summarization scheme for their application.
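
    A minimal sketch of the partition-and-summarize idea is shown below, assuming equal-sized chunks and a mean-plus-standard-error summary (one of the many possible summarization choices the abstract alludes to).

```python
import numpy as np

def summarize_partitions(data, n_partitions):
    """Partition a 1-D stream of simulation output into equal-sized chunks and
    keep only a representative value plus an error estimate for each chunk."""
    chunks = np.array_split(np.asarray(data), n_partitions)
    return [
        {"n": len(c), "mean": float(c.mean()), "stderr": float(c.std(ddof=1) / np.sqrt(len(c)))}
        for c in chunks
    ]

# Toy "simulation output"; the raw array would be discarded after summarization.
rng = np.random.default_rng(0)
raw = rng.normal(loc=2.0, scale=0.5, size=10_000)
for i, s in enumerate(summarize_partitions(raw, n_partitions=4)):
    print(i, s)
```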

  20. Strainrange partitioning life predictions of the long time metal properties council creep-fatigue tests

    NASA Technical Reports Server (NTRS)

    Saltsman, J. F.; Halford, G. R.

    1979-01-01

    The method of strainrange partitioning is used to predict the cyclic lives of the Metal Properties Council's long time creep-fatigue interspersion tests of several steel alloys. Comparisons are made with predictions based upon the time- and cycle-fraction approach. The method of strainrange partitioning is shown to give consistently more accurate predictions of cyclic life than is given by the time- and cycle-fraction approach.

  1. Implementation of spectral clustering with partitioning around medoids (PAM) algorithm on microarray data of carcinoma

    NASA Astrophysics Data System (ADS)

    Cahyaningrum, Rosalia D.; Bustamam, Alhadi; Siswantining, Titin

    2017-03-01

    Microarray technology has become one of the essential tools in the life sciences for observing gene expression levels, including the expression of genes in people with carcinoma. Carcinoma is a cancer that forms in epithelial tissue. These data can be analyzed, for example, to identify hereditary gene expression and to build classifications that can be used to improve the diagnosis of carcinoma. Microarray data are usually high dimensional, so most methods require long computing times to perform the grouping. This study therefore uses spectral clustering, which can work with arbitrary objects and reduces the dimensionality of the data. Spectral clustering is based on the spectral decomposition of a matrix that represents the data as a graph. After the dimensionality is reduced, the data are partitioned. One well-known partitioning method is Partitioning Around Medoids (PAM), which minimizes the objective function by iteratively swapping non-medoid points with medoid points until convergence. The objective of this research is to implement spectral clustering and the PAM partitioning algorithm to obtain groups of 7457 carcinoma genes based on their similarity values. The result of this study is two groups of carcinoma genes.
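
    A hedged sketch of the pipeline, spectral embedding for dimension reduction followed by a simple PAM swap loop, is given below, assuming scikit-learn; the toy expression matrix and parameters are placeholders for the 7457-gene data set, and the PAM implementation is a textbook version rather than the authors' code.

```python
import numpy as np
from sklearn.manifold import SpectralEmbedding

def pam(points, k, iters=100, seed=0):
    """Partitioning Around Medoids: greedily swap a medoid with a non-medoid point
    whenever the swap lowers the total distance of all points to their nearest
    medoid, until no improving swap remains."""
    rng = np.random.default_rng(seed)
    dist = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    medoids = list(rng.choice(len(points), size=k, replace=False))
    cost = dist[:, medoids].min(axis=1).sum()
    for _ in range(iters):
        improved = False
        for i in range(k):
            for cand in range(len(points)):
                if cand in medoids:
                    continue
                trial = medoids[:i] + [cand] + medoids[i + 1:]
                trial_cost = dist[:, trial].min(axis=1).sum()
                if trial_cost < cost:
                    medoids, cost, improved = trial, trial_cost, True
        if not improved:
            break
    return dist[:, medoids].argmin(axis=1), medoids

# Toy stand-in for a genes-by-samples expression matrix (rows = genes).
rng = np.random.default_rng(1)
expr = np.vstack([rng.normal(0, 1, (60, 30)), rng.normal(3, 1, (60, 30))])
embedded = SpectralEmbedding(n_components=2, affinity="rbf").fit_transform(expr)
labels, medoids = pam(embedded, k=2)
print(np.bincount(labels), medoids)
```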

  2. Construction of exponentially fitted symplectic Runge-Kutta-Nyström methods from partitioned Runge-Kutta methods

    NASA Astrophysics Data System (ADS)

    Monovasilis, Theodore; Kalogiratou, Zacharoula; Simos, T. E.

    2014-10-01

    In this work we derive exponentially fitted symplectic Runge-Kutta-Nyström (RKN) methods from symplectic exponentially fitted partitioned Runge-Kutta (PRK) methods (for the approximate solution of general problems of this category see [18]-[40] and references therein). We construct RKN methods from PRK methods with up to five stages and fourth algebraic order.
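
    For readers unfamiliar with symplectic partitioned Runge-Kutta methods, the sketch below shows the classical Störmer-Verlet scheme, a two-stage symplectic PRK for separable Hamiltonians; it is not the exponentially fitted RKN construction of the paper, only a minimal illustration of the class of methods being extended.

```python
import numpy as np

def stormer_verlet(q, p, grad_V, h, steps):
    """Stormer-Verlet: the classic two-stage symplectic partitioned Runge-Kutta
    method for separable Hamiltonians H(q, p) = p^2/2 + V(q)."""
    traj = [(q, p)]
    for _ in range(steps):
        p_half = p - 0.5 * h * grad_V(q)   # half kick
        q = q + h * p_half                 # drift
        p = p_half - 0.5 * h * grad_V(q)   # half kick
        traj.append((q, p))
    return np.array(traj)

# Harmonic oscillator V(q) = q^2/2: the energy should stay bounded over long times.
traj = stormer_verlet(q=1.0, p=0.0, grad_V=lambda q: q, h=0.1, steps=1000)
energy = 0.5 * traj[:, 1] ** 2 + 0.5 * traj[:, 0] ** 2
print(f"energy drift: {energy.max() - energy.min():.2e}")
```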

  3. Canonical partition functions: ideal quantum gases, interacting classical gases, and interacting quantum gases

    NASA Astrophysics Data System (ADS)

    Zhou, Chi-Chun; Dai, Wu-Sheng

    2018-02-01

    In statistical mechanics, for a system with a fixed number of particles, e.g. a finite-size system, strictly speaking, the thermodynamic quantity needs to be calculated in the canonical ensemble. Nevertheless, the calculation of the canonical partition function is difficult. In this paper, based on the mathematical theory of the symmetric function, we suggest a method for the calculation of the canonical partition function of ideal quantum gases, including ideal Bose, Fermi, and Gentile gases. Moreover, we express the canonical partition functions of interacting classical and quantum gases given by the classical and quantum cluster expansion methods in terms of the Bell polynomial in mathematics. The virial coefficients of ideal Bose, Fermi, and Gentile gases are calculated from the exact canonical partition function. The virial coefficients of interacting classical and quantum gases are calculated from the canonical partition function by using the expansion of the Bell polynomial, rather than calculated from the grand canonical potential.
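
    The canonical partition functions of ideal Bose and Fermi gases can also be evaluated with the standard recursion over particle number shown below; this is not the symmetric-function/Bell-polynomial formulation of the paper, but it computes the same quantities for a small test spectrum.

```python
import numpy as np

def canonical_Z(levels, beta, N, statistics="bose"):
    """Canonical partition function Z_N of an ideal quantum gas from the standard
    recursion Z_N = (1/N) * sum_k (+/-1)^(k+1) * z(k*beta) * Z_{N-k}, where
    z(beta) = sum_i exp(-beta * eps_i) is the one-particle partition function."""
    sign = 1.0 if statistics == "bose" else -1.0  # "fermi" flips the sign
    z = lambda b: np.sum(np.exp(-b * np.asarray(levels)))
    Z = [1.0]                                     # Z_0 = 1
    for n in range(1, N + 1):
        Z.append(sum(sign ** (k + 1) * z(k * beta) * Z[n - k] for k in range(1, n + 1)) / n)
    return Z[N]

# Five equally spaced single-particle levels, three particles.
levels = [0.0, 1.0, 2.0, 3.0, 4.0]
print("bosons  :", canonical_Z(levels, beta=1.0, N=3, statistics="bose"))
print("fermions:", canonical_Z(levels, beta=1.0, N=3, statistics="fermi"))
```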

  4. Causal Mediation Analysis for the Cox Proportional Hazards Model with a Smooth Baseline Hazard Estimator.

    PubMed

    Wang, Wei; Albert, Jeffrey M

    2017-08-01

    An important problem within the social, behavioral, and health sciences is how to partition an exposure effect (e.g. treatment or risk factor) among specific pathway effects and to quantify the importance of each pathway. Mediation analysis based on the potential outcomes framework is an important tool to address this problem and we consider the estimation of mediation effects for the proportional hazards model in this paper. We give precise definitions of the total effect, natural indirect effect, and natural direct effect in terms of the survival probability, hazard function, and restricted mean survival time within the standard two-stage mediation framework. To estimate the mediation effects on different scales, we propose a mediation formula approach in which simple parametric models (fractional polynomials or restricted cubic splines) are utilized to approximate the baseline log cumulative hazard function. Simulation study results demonstrate low bias of the mediation effect estimators and close-to-nominal coverage probability of the confidence intervals for a wide range of complex hazard shapes. We apply this method to the Jackson Heart Study data and conduct sensitivity analysis to assess the impact on the mediation effects inference when the no unmeasured mediator-outcome confounding assumption is violated.

  5. 40 CFR 799.6756 - TSCA partition coefficient (n-octanol/water), generator column method.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... method, or any other reliable quantitative procedure must be used for those compounds that do not absorb... any other reliable quantitative method, aqueous solutions from the generator column enter a collecting... Solubilities and Octanol-Water Partition Coefficients of Hydrophobic Substances,” Journal of Research of the...

  6. 40 CFR 799.6756 - TSCA partition coefficient (n-octanol/water), generator column method.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... method, or any other reliable quantitative procedure must be used for those compounds that do not absorb... any other reliable quantitative method, aqueous solutions from the generator column enter a collecting... Solubilities and Octanol-Water Partition Coefficients of Hydrophobic Substances,” Journal of Research of the...

  7. Cytotoxicity of Sargassum angustifolium Partitions against Breast and Cervical Cancer Cell Lines

    PubMed Central

    Vaseghi, Golnaz; Sharifi, Mohsen; Dana, Nasim; Ghasemi, Ahmad; Yegdaneh, Afsaneh

    2018-01-01

    Background: Marine organisms produce a variety of compounds with pharmacological activities, including anticancer effects. This study examines the cytotoxicity of the hexane (HEX), dichloromethane (DCM), and butanol (BUTOH) partitions of Sargassum angustifolium. Materials and Methods: S. angustifolium was collected from Bushehr, on the southwest coastline of the Persian Gulf. The plant was extracted by maceration with methanol-ethyl acetate. The extract was evaporated under vacuum and partitioned by the Kupchan method to yield the HEX, DCM, and BUTOH partitions. The cytotoxic activity of the extract (150, 450, and 900 μg/ml) was investigated against MCF-7 (breast cancer), HeLa (cervical cancer), and human umbilical vein endothelial cell lines by the mitochondrial tetrazolium (MTT) assay after 72 h. Results: The survival of HeLa and MCF-7 cells decreased as the extract concentration increased from 150 μg/ml to 900 μg/ml. The median growth inhibitory concentration of the HEX partition was 71 and 77 μg/ml against HeLa and MCF-7, respectively, and that of the dichloromethane partition was 36 and 88 μg/ml against HeLa and MCF-7, respectively; the BUTOH partition gave 25 μg/ml against MCF-7. Conclusion: This study shows that different partitions of S. angustifolium have cytotoxic activity against cancer cell lines. PMID:29657928

  8. The r-Stirling Numbers.

    DTIC Science & Technology

    1982-12-01

    partitions, the restriction being that the first r elements must be in distinct cycles and distinct subsets, respectively. The combinatorial and algebraic ... symmetric functions. The Stirling numbers of the first kind, for fixed n, are the elementary symmetric functions of the numbers 1, ..., n (see, e.g., [4] or [5]). The r-Stirling numbers of the first kind are the elementary symmetric functions of the numbers r, ..., n.

  9. Convergence Analysis of the Graph Allen-Cahn Scheme

    DTIC Science & Technology

    2016-02-01

    Convergence Analysis of the Graph Allen-Cahn Scheme, by Xiyang Luo and Andrea L. Bertozzi. Abstract: Graph partitioning problems have a wide range of... optimization, convergence and monotonicity are shown for a class of schemes under a graph-independent timestep restriction. We also analyze the effects of... spectral truncation, a common technique used to save computational cost. Convergence of the scheme with spectral truncation is also proved under a

  10. A Recursive Method for Calculating Certain Partition Functions.

    ERIC Educational Resources Information Center

    Woodrum, Luther; And Others

    1978-01-01

    Describes a simple recursive method for calculating the partition function and average energy of a system consisting of N electrons and L energy levels. Also, presents an efficient APL computer program to utilize the recursion relation. (Author/GA)

  11. An automated and objective method for age partitioning of reference intervals based on continuous centile curves.

    PubMed

    Yang, Qian; Lew, Hwee Yeong; Peh, Raymond Hock Huat; Metz, Michael Patrick; Loh, Tze Ping

    2016-10-01

    Reference intervals are the most commonly used decision support tool when interpreting quantitative laboratory results. They may require partitioning to better describe subpopulations that display significantly different reference values. Partitioning by age is particularly important for the paediatric population since there are marked physiological changes associated with growth and maturation. However, most partitioning methods are either technically complex or require prior knowledge of the underlying physiology/biological variation of the population. There is growing interest in the use of continuous centile curves, which provides seamless laboratory reference values as a child grows, as an alternative to rigidly described fixed reference intervals. However, the mathematical functions that describe these curves can be complex and may not be easily implemented in laboratory information systems. Hence, the use of fixed reference intervals is expected to continue for a foreseeable time. We developed a method that objectively proposes optimised age partitions and reference intervals for quantitative laboratory data (http://research.sph.nus.edu.sg/pp/ppResult.aspx), based on the sum of gradient that best describes the underlying distribution of the continuous centile curves. It is hoped that this method may improve the selection of age intervals for partitioning, which is receiving increasing attention in paediatric laboratory medicine. Copyright © 2016 Royal College of Pathologists of Australasia. Published by Elsevier B.V. All rights reserved.

  12. Determination of air-loop volume and radon partition coefficient for measuring radon in water sample.

    PubMed

    Lee, Kil Yong; Burnett, William C

    A simple method for the direct determination of the air-loop volume in a RAD7 system as well as the radon partition coefficient was developed allowing for an accurate measurement of the radon activity in any type of water. The air-loop volume may be measured directly using an external radon source and an empty bottle with a precisely measured volume. The partition coefficient and activity of radon in the water sample may then be determined via the RAD7 using the determined air-loop volume. Activity ratios instead of absolute activities were used to measure the air-loop volume and the radon partition coefficient. In order to verify this approach, we measured the radon partition coefficient in deionized water in the temperature range of 10-30 °C and compared the values to those calculated from the well-known Weigel equation. The results were within 5% variance throughout the temperature range. We also applied the approach for measurement of the radon partition coefficient in synthetic saline water (0-75 ppt salinity) as well as tap water. The radon activity of the tap water sample was determined by this method as well as the standard RAD-H2O and BigBottle RAD-H2O. The results have shown good agreement between this method and the standard methods.
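
    The paper's protocol is not reproduced in the abstract; the sketch below only illustrates the closed-loop mass-balance arithmetic that an activity-ratio approach of this kind implies. The function names and all numbers are invented for illustration.

    def air_loop_volume(v_bottle, c_ratio):
        """Closed air-loop volume from two activity-concentration readings taken with an
        external radon source: C1 without and C2 with an attached empty bottle of known
        volume.  Radon conservation gives C1*V = C2*(V + V_bottle), so
        V = V_bottle * r / (1 - r) with r = C2/C1 (an activity ratio)."""
        return v_bottle * c_ratio / (1.0 - c_ratio)

    def radon_partition_coefficient(c_air, a_total, v_air, v_water):
        """k = C_water / C_air from the closed-loop mass balance
        A_total = C_air*V_air + C_water*V_water."""
        c_water = (a_total - c_air * v_air) / v_water
        return c_water / c_air

    # Illustrative numbers only (litres and Bq/L).
    v_loop = air_loop_volume(v_bottle=0.25, c_ratio=0.80)            # -> 1.0 L
    k = radon_partition_coefficient(c_air=100.0, a_total=111.25,
                                    v_air=v_loop + 0.05,             # loop plus sample headspace
                                    v_water=0.25)
    print(f"air-loop volume: {v_loop:.2f} L, partition coefficient: {k:.2f}")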

  13. A Partitioning and Bounded Variable Algorithm for Linear Programming

    ERIC Educational Resources Information Center

    Sheskin, Theodore J.

    2006-01-01

    An interesting new partitioning and bounded variable algorithm (PBVA) is proposed for solving linear programming problems. The PBVA is a variant of the simplex algorithm which uses a modified form of the simplex method followed by the dual simplex method for bounded variables. In contrast to the two-phase method and the big M method, the PBVA does…

  14. MSTor version 2013: A new version of the computer code for the multi-structural torsional anharmonicity, now with a coupled torsional potential

    NASA Astrophysics Data System (ADS)

    Zheng, Jingjing; Meana-Pañeda, Rubén; Truhlar, Donald G.

    2013-08-01

    We present an improved version of the MSTor program package, which calculates partition functions and thermodynamic functions of complex molecules involving multiple torsions; the method is based on either a coupled torsional potential or an uncoupled torsional potential. The program can also carry out calculations in the multiple-structure local harmonic approximation. The program package also includes seven utility codes that can be used as stand-alone programs to calculate reduced moment of inertia matrices by the method of Kilpatrick and Pitzer, to generate conformational structures, to calculate, either analytically or by Monte Carlo sampling, volumes for torsional subdomains defined by Voronoi tessellation of the conformational subspace, to generate template input files for the MSTor calculation and Voronoi calculation, and to calculate one-dimensional torsional partition functions using the torsional eigenvalue summation method. Restrictions: There is no limit on the number of torsions that can be included in either the Voronoi calculation or the full MS-T calculation. In practice, the range of problems that can be addressed with the present method consists of all multitorsional problems for which one can afford to calculate all the conformational structures and their frequencies. Unusual features: The method can be applied to transition states as well as stable molecules. The program package also includes the hull program for the calculation of Voronoi volumes, the symmetry program for determining point group symmetry of a molecule, and seven utility codes that can be used as stand-alone programs to calculate reduced moment-of-inertia matrices by the method of Kilpatrick and Pitzer, to generate conformational structures, to calculate, either analytically or by Monte Carlo sampling, volumes of the torsional subdomains defined by Voronoi tessellation of the conformational subspace, to generate template input files, and to calculate one-dimensional torsional partition functions using the torsional eigenvalue summation method. Additional comments: The program package includes a manual, installation script, and input and output files for a test suite. Running time: There are 26 test runs. The running time of the test runs on a single processor of the Itasca computer is less than 2 s. References: [1] MS-T(C) method: Quantum Thermochemistry: Multi-Structural Method with Torsional Anharmonicity Based on a Coupled Torsional Potential, J. Zheng and D.G. Truhlar, Journal of Chemical Theory and Computation 9 (2013) 1356-1367, DOI: http://dx.doi.org/10.1021/ct3010722. [2] MS-T(U) method: Practical Methods for Including Torsional Anharmonicity in Thermochemical Calculations of Complex Molecules: The Internal-Coordinate Multi-Structural Approximation, J. Zheng, T. Yu, E. Papajak, I, M. Alecu, S.L. Mielke, and D.G. Truhlar, Physical Chemistry Chemical Physics 13 (2011) 10885-10907.
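
    The torsional eigenvalue summation utility is mentioned only by name. As a minimal sketch of what such a summation looks like, the snippet below Boltzmann-sums user-supplied 1-D torsional eigenvalues relative to the lowest level; the eigenvalue ladder is illustrative and the code is not part of MSTor.

    import numpy as np

    # Boltzmann constant in cm^-1 / K, convenient when the eigenvalues of a
    # one-dimensional torsional Hamiltonian are given in wavenumbers.
    KB_CM = 0.6950348

    def torsional_partition_function(eigenvalues_cm, temperature_k):
        """q_tor(T) = sum_j exp(-(E_j - E_0) / kB T) over 1-D torsional eigenvalues,
        with energies measured from the torsional ground level."""
        e = np.asarray(eigenvalues_cm, dtype=float)
        e = e - e.min()
        return float(np.sum(np.exp(-e / (KB_CM * temperature_k))))

    if __name__ == "__main__":
        # Illustrative eigenvalue ladder (cm^-1) for a hindered rotor.
        levels = [0.0, 95.0, 185.0, 270.0, 350.0, 430.0, 520.0, 640.0]
        for T in (298.15, 500.0, 1000.0):
            print(T, torsional_partition_function(levels, T))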

  15. The challenge for genetic epidemiologists: how to analyze large numbers of SNPs in relation to complex diseases.

    PubMed

    Heidema, A Geert; Boer, Jolanda M A; Nagelkerke, Nico; Mariman, Edwin C M; van der A, Daphne L; Feskens, Edith J M

    2006-04-21

    Genetic epidemiologists have taken the challenge to identify genetic polymorphisms involved in the development of diseases. Many have collected data on large numbers of genetic markers but are not familiar with available methods to assess their association with complex diseases. Statistical methods have been developed for analyzing the relation between large numbers of genetic and environmental predictors to disease or disease-related variables in genetic association studies. In this commentary we discuss logistic regression analysis, neural networks, including the parameter decreasing method (PDM) and genetic programming optimized neural networks (GPNN) and several non-parametric methods, which include the set association approach, combinatorial partitioning method (CPM), restricted partitioning method (RPM), multifactor dimensionality reduction (MDR) method and the random forests approach. The relative strengths and weaknesses of these methods are highlighted. Logistic regression and neural networks can handle only a limited number of predictor variables, depending on the number of observations in the dataset. Therefore, they are less useful than the non-parametric methods to approach association studies with large numbers of predictor variables. GPNN on the other hand may be a useful approach to select and model important predictors, but its performance to select the important effects in the presence of large numbers of predictors needs to be examined. Both the set association approach and random forests approach are able to handle a large number of predictors and are useful in reducing these predictors to a subset of predictors with an important contribution to disease. The combinatorial methods give more insight in combination patterns for sets of genetic and/or environmental predictor variables that may be related to the outcome variable. As the non-parametric methods have different strengths and weaknesses we conclude that to approach genetic association studies using the case-control design, the application of a combination of several methods, including the set association approach, MDR and the random forests approach, will likely be a useful strategy to find the important genes and interaction patterns involved in complex diseases.
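
    None of the listed methods is spelled out in the abstract. As one concrete illustration, the random forests approach ranks SNPs by variable importance in a case-control setting; the sketch below does this with scikit-learn on purely synthetic genotype data (the causal-SNP indices and effect sizes are invented).

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(0)

    # Synthetic case-control data: 500 subjects, 200 SNPs coded 0/1/2 (minor-allele count).
    n_subjects, n_snps = 500, 200
    X = rng.integers(0, 3, size=(n_subjects, n_snps))

    # Disease risk driven by an interaction of two causal SNPs plus noise.
    logit = 0.8 * X[:, 10] * X[:, 42] - 1.5
    p = 1.0 / (1.0 + np.exp(-logit))
    y = rng.binomial(1, p)

    forest = RandomForestClassifier(n_estimators=500, random_state=0)
    forest.fit(X, y)

    # Rank SNPs by impurity-based importance; the causal pair should tend to surface near the top.
    ranking = np.argsort(forest.feature_importances_)[::-1]
    print("top SNP indices:", ranking[:10])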

  16. Prediction of pH-dependent properties of DNA triple helices.

    PubMed

    Hüsler, P L; Klump, H H

    1995-02-20

    The thermodynamic properties of two triple helices were investigated by UV thermal denaturation, differential scanning calorimetry, and pH titrations. Starting from the grand partition function and using matrix methods we present a formalism that describes pH effects on the thermal stability of triple helices. The formalism can be used over a wide pH range and is not restricted to the limiting case where the pH is larger or smaller than the pKa of cytosine. Furthermore, it covers nearest neighbor electrostatic effects of closely spaced cytosines in the Hoogsteen strand which can shift the pKa of cytosine to lower pH values. A procedure is employed to predict enthalpy and entropy changes for triplex formation. These values are in accordance with the results obtained by differential scanning calorimetry.

  17. Multilevel Green's function interpolation method for scattering from composite metallic and dielectric objects.

    PubMed

    Shi, Yan; Wang, Hao Gang; Li, Long; Chan, Chi Hou

    2008-10-01

    A multilevel Green's function interpolation method based on two kinds of multilevel partitioning schemes--the quasi-2D and the hybrid partitioning scheme--is proposed for analyzing electromagnetic scattering from objects comprising both conducting and dielectric parts. The problem is formulated using the surface integral equation for homogeneous dielectric and conducting bodies. A quasi-2D multilevel partitioning scheme is devised to improve the efficiency of the Green's function interpolation. In contrast to previous multilevel partitioning schemes, noncubic groups are introduced to discretize the whole EM structure in this quasi-2D multilevel partitioning scheme. Based on the detailed analysis of the dimension of the group in this partitioning scheme, a hybrid quasi-2D/3D multilevel partitioning scheme is proposed to effectively handle objects with fine local structures. Selection criteria for some key parameters relating to the interpolation technique are given. The proposed algorithm is ideal for the solution of problems involving objects such as missiles, microstrip antenna arrays, photonic bandgap structures, etc. Numerical examples are presented to show that CPU time is between O(N) and O(N log N) while the computer memory requirement is O(N).

  18. Strainrange partitioning behavior of the nickel-base superalloys, Rene' 80 and IN 100

    NASA Technical Reports Server (NTRS)

    Halford, G. R.; Nachtigall, A. J.

    1978-01-01

    A study was made to assess the ability of the method of Strainrange Partitioning (SRP) to both correlate and predict high-temperature, low cycle fatigue lives of nickel-base superalloys for gas turbine applications. The partitioned strainrange versus life relationships for uncoated Rene' 80 and cast IN 100 were also determined from the ductility-normalized Strainrange Partitioning equations. These were used to predict the cyclic lives of the baseline tests. The life predictability of the method was verified for cast IN 100 by applying the baseline results to the cyclic life prediction of a series of complex strain cycling tests with multiple hold periods at constant strain. It was concluded that the method of SRP can correlate and predict the cyclic lives of laboratory specimens of the nickel-base superalloys evaluated in this program.

  19. Calcic amphibole thermobarometry in metamorphic and igneous rocks: New calibrations based on plagioclase/amphibole Al-Si partitioning and amphibole/liquid Mg partitioning

    NASA Astrophysics Data System (ADS)

    Molina, J. F.; Moreno, J. A.; Castro, A.; Rodríguez, C.; Fershtater, G. B.

    2015-09-01

    Dependencies of plagioclase/amphibole Al-Si partitioning, D_{Al/Si}^{plg/amp}, and amphibole/liquid Mg partitioning, D_{Mg}^{amp/liq}, on temperature, pressure and phase compositions are investigated employing robust regression methods based on MM-estimators. A database with 92 amphibole-plagioclase pairs - temperature range: 650-1050 °C; amphibole compositional limits: > 0.02 apfu (23 O) Ti and > 0.05 apfu Al - and 148 amphibole-glass pairs - temperature range: 800-1100 °C; amphibole compositional limit: Ca^{M4}/(Ca^{M4} + Na^{M4}) > 0.75 - compiled from experiments in the literature was used for the calculations (amphibole normalization scheme: 13-CNK method).
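
    The calibrations themselves are not given here; the sketch below only illustrates the robust-regression step on a synthetic partition-coefficient-versus-inverse-temperature data set, using scikit-learn's Huber regressor as a stand-in for the MM-estimators used in the paper. All values are invented.

    import numpy as np
    from sklearn.linear_model import HuberRegressor

    rng = np.random.default_rng(1)

    # Synthetic calibration set: ln D (a partition coefficient) vs 1000/T, with outliers,
    # mimicking the kind of thermometer calibration described above.
    T = rng.uniform(900.0, 1350.0, size=120)                 # temperature in K
    x = 1000.0 / T
    ln_D = -2.0 + 4.5 * x + rng.normal(0.0, 0.05, size=T.size)
    ln_D[:5] += 1.0                                          # contaminate with outliers

    X = x.reshape(-1, 1)
    robust = HuberRegressor(max_iter=2000).fit(X, ln_D)      # robust alternative to least squares

    print("intercept:", robust.intercept_, "slope:", robust.coef_[0])
    # Invert the calibration: estimate T from an observed ln D value.
    ln_D_obs = 1.5
    T_est = 1000.0 * robust.coef_[0] / (ln_D_obs - robust.intercept_)
    print("estimated T (K):", T_est)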

  20. A Novel Method for Discovering Fuzzy Sequential Patterns Using the Simple Fuzzy Partition Method.

    ERIC Educational Resources Information Center

    Chen, Ruey-Shun; Hu, Yi-Chung

    2003-01-01

    Discusses sequential patterns, data mining, knowledge acquisition, and fuzzy sequential patterns described by natural language. Proposes a fuzzy data mining technique to discover fuzzy sequential patterns by using the simple partition method which allows the linguistic interpretation of each fuzzy set to be easily obtained. (Author/LRW)
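
    The simple fuzzy partition method is only named in the abstract. The sketch below shows what a simple (uniform triangular) fuzzy partition of a numeric attribute looks like, which is the usual prerequisite for linguistically interpretable fuzzy patterns; ranges and labels are illustrative.

    import numpy as np

    def triangular_partition(lo, hi, n_sets):
        """Return n_sets triangular membership functions evenly spaced on [lo, hi].

        Adjacent sets overlap by half a width so that memberships at any point sum
        to 1, which keeps each fuzzy set readable as a linguistic term ("low", ...).
        """
        centers = np.linspace(lo, hi, n_sets)
        half = centers[1] - centers[0]

        def membership(x):
            x = np.asarray(x, dtype=float)
            mu = np.clip(1.0 - np.abs(x[..., None] - centers) / half, 0.0, 1.0)
            # Saturate the outer sets so values beyond [lo, hi] stay fully "low"/"high".
            mu[..., 0] = np.where(x < centers[0], 1.0, mu[..., 0])
            mu[..., -1] = np.where(x > centers[-1], 1.0, mu[..., -1])
            return mu

        return centers, membership

    centers, mu = triangular_partition(0.0, 100.0, 3)   # e.g. "low", "medium", "high" purchases
    print(centers)
    print(mu([10.0, 50.0, 80.0]))                       # membership of three values in each set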

  1. MSTor: A program for calculating partition functions, free energies, enthalpies, entropies, and heat capacities of complex molecules including torsional anharmonicity

    NASA Astrophysics Data System (ADS)

    Zheng, Jingjing; Mielke, Steven L.; Clarkson, Kenneth L.; Truhlar, Donald G.

    2012-08-01

    We present a Fortran program package, MSTor, which calculates partition functions and thermodynamic functions of complex molecules involving multiple torsional motions by the recently proposed MS-T method. This method interpolates between the local harmonic approximation in the low-temperature limit, and the limit of free internal rotation of all torsions at high temperature. The program can also carry out calculations in the multiple-structure local harmonic approximation. The program package also includes six utility codes that can be used as stand-alone programs to calculate reduced moment of inertia matrices by the method of Kilpatrick and Pitzer, to generate conformational structures, to calculate, either analytically or by Monte Carlo sampling, volumes for torsional subdomains defined by Voronoi tessellation of the conformational subspace, to generate template input files, and to calculate one-dimensional torsional partition functions using the torsional eigenvalue summation method. Catalogue identifier: AEMF_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEMF_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 77 434 No. of bytes in distributed program, including test data, etc.: 3 264 737 Distribution format: tar.gz Programming language: Fortran 90, C, and Perl Computer: Itasca (HP Linux cluster, each node has two-socket, quad-core 2.8 GHz Intel Xeon X5560 “Nehalem EP” processors), Calhoun (SGI Altix XE 1300 cluster, each node containing two quad-core 2.66 GHz Intel Xeon “Clovertown”-class processors sharing 16 GB of main memory), Koronis (Altix UV 1000 server with 190 6-core Intel Xeon X7542 “Westmere” processors at 2.66 GHz), Elmo (Sun Fire X4600 Linux cluster with AMD Opteron cores), and Mac Pro (two 2.8 GHz Quad-core Intel Xeon processors) Operating system: Linux/Unix/Mac OS RAM: 2 Mbytes Classification: 16.3, 16.12, 23 Nature of problem: Calculation of the partition functions and thermodynamic functions (standard-state energy, enthalpy, entropy, and free energy as functions of temperatures) of complex molecules involving multiple torsional motions. Solution method: The multi-structural approximation with torsional anharmonicity (MS-T). The program also provides results for the multi-structural local harmonic approximation [1]. Restrictions: There is no limit on the number of torsions that can be included in either the Voronoi calculation or the full MS-T calculation. In practice, the range of problems that can be addressed with the present method consists of all multi-torsional problems for which one can afford to calculate all the conformations and their frequencies. Unusual features: The method can be applied to transition states as well as stable molecules. The program package also includes the hull program for the calculation of Voronoi volumes and six utility codes that can be used as stand-alone programs to calculate reduced moment-of-inertia matrices by the method of Kilpatrick and Pitzer, to generate conformational structures, to calculate, either analytically or by Monte Carlo sampling, volumes for torsional subdomain defined by Voronoi tessellation of the conformational subspace, to generate template input files, and to calculate one-dimensional torsional partition functions using the torsional eigenvalue summation method. 
Additional comments: The program package includes a manual, installation script, and input and output files for a test suite. Running time: There are 24 test runs. The running time of the test runs on a single processor of the Itasca computer is less than 2 seconds. J. Zheng, T. Yu, E. Papajak, I.M. Alecu, S.L. Mielke, D.G. Truhlar, Practical methods for including torsional anharmonicity in thermochemical calculations of complex molecules: The internal-coordinate multi-structural approximation, Phys. Chem. Chem. Phys. 13 (2011) 10885-10907.

  2. Finite element modeling of diffusion and partitioning in biological systems: the infinite composite medium problem.

    PubMed

    Missel, P J

    2000-01-01

    Four methods are proposed for modeling diffusion in heterogeneous media where diffusion and partition coefficients take on differing values in each subregion. The exercise was conducted to validate finite element modeling (FEM) procedures in anticipation of modeling drug diffusion with regional partitioning into ocular tissue, though the approach can be useful for other organs, or for modeling diffusion in laminate devices. Partitioning creates a discontinuous value in the dependent variable (concentration) at an intertissue boundary that is not easily handled by available general-purpose FEM codes, which allow for only one value at each node. The discontinuity is handled using a transformation on the dependent variable based upon the region-specific partition coefficient. Methods were evaluated by their ability to reproduce a known exact result, for the problem of the infinite composite medium (Crank, J. The Mathematics of Diffusion, 2nd ed. New York: Oxford University Press, 1975, pp. 38-39.). The most physically intuitive method is based upon the concept of chemical potential, which is continuous across an interphase boundary (method III). This method makes the equation of the dependent variable highly nonlinear. This can be linearized easily by a change of variables (method IV). Results are also given for a one-dimensional problem simulating bolus injection into the vitreous, predicting time disposition of drug in vitreous and retina.
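
    As a rough, self-contained illustration of the change-of-variables idea (method IV above), the sketch below solves 1-D diffusion across an interface with an explicit finite-difference scheme, working in a chemical-potential-like variable u = C/S that is continuous across the boundary. The two-region parameters are invented, and no attempt is made to reproduce the paper's finite element setup.

    import numpy as np

    # Two-region 1-D composite medium: region A on the left, region B on the right.
    # C = S(x) * u, with u continuous at the interface, so the partition
    # coefficient K = S_B / S_A emerges from the solubility ratio.
    nx, L = 200, 1.0
    x = np.linspace(0.0, L, nx)
    dx = x[1] - x[0]

    D = np.where(x < 0.5, 1.0e-2, 2.0e-3)      # diffusivities (arbitrary units)
    S = np.where(x < 0.5, 1.0, 3.0)            # solubilities -> partition coefficient K = 3

    u = np.where(x < 0.5, 1.0, 0.0)            # initial condition: solute only in region A

    dt = 0.2 * dx * dx / D.max()               # explicit stability limit, with margin
    DS_face = 0.5 * (D[:-1] * S[:-1] + D[1:] * S[1:])   # conductance at cell faces

    for _ in range(20000):
        flux = -DS_face * np.diff(u) / dx      # J = -D S du/dx
        div = np.zeros_like(u)
        div[1:-1] = (flux[:-1] - flux[1:]) / dx
        u[1:-1] += dt * div[1:-1] / S[1:-1]    # S du/dt = -dJ/dx (end nodes held fixed)

    C = S * u                                  # back-transform to concentration
    # C is discontinuous at x = 0.5, with the jump ratio approaching K.
    print(C[nx // 2 - 1], C[nx // 2 + 1])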

  3. A new strategy for genome assembly using short sequence reads and reduced representation libraries.

    PubMed

    Young, Andrew L; Abaan, Hatice Ozel; Zerbino, Daniel; Mullikin, James C; Birney, Ewan; Margulies, Elliott H

    2010-02-01

    We have developed a novel approach for using massively parallel short-read sequencing to generate fast and inexpensive de novo genomic assemblies comparable to those generated by capillary-based methods. The ultrashort (<100 base) sequences generated by this technology pose specific biological and computational challenges for de novo assembly of large genomes. To account for this, we devised a method for experimentally partitioning the genome using reduced representation (RR) libraries prior to assembly. We use two restriction enzymes independently to create a series of overlapping fragment libraries, each containing a tractable subset of the genome. Together, these libraries allow us to reassemble the entire genome without the need of a reference sequence. As proof of concept, we applied this approach to sequence and assembled the majority of the 125-Mb Drosophila melanogaster genome. We subsequently demonstrate the accuracy of our assembly method with meaningful comparisons against the current available D. melanogaster reference genome (dm3). The ease of assembly and accuracy for comparative genomics suggest that our approach will scale to future mammalian genome-sequencing efforts, saving both time and money without sacrificing quality.

  4. Recent Advances and Perspectives on Nonadiabatic Mixed Quantum-Classical Dynamics.

    PubMed

    Crespo-Otero, Rachel; Barbatti, Mario

    2018-05-16

    Nonadiabatic mixed quantum-classical (NA-MQC) dynamics methods form a class of computational theoretical approaches in quantum chemistry tailored to investigate the time evolution of nonadiabatic phenomena in molecules and supramolecular assemblies. NA-MQC is characterized by a partition of the molecular system into two subsystems: one to be treated quantum mechanically (usually but not restricted to electrons) and another to be dealt with classically (nuclei). The two subsystems are connected through nonadiabatic couplings terms to enforce self-consistency. A local approximation underlies the classical subsystem, implying that direct dynamics can be simulated, without needing precomputed potential energy surfaces. The NA-MQC split allows reducing computational costs, enabling the treatment of realistic molecular systems in diverse fields. Starting from the three most well-established methods-mean-field Ehrenfest, trajectory surface hopping, and multiple spawning-this review focuses on the NA-MQC dynamics methods and programs developed in the last 10 years. It stresses the relations between approaches and their domains of application. The electronic structure methods most commonly used together with NA-MQC dynamics are reviewed as well. The accuracy and precision of NA-MQC simulations are critically discussed, and general guidelines to choose an adequate method for each application are delivered.

  5. Dietary fat and not calcium supplementation or dairy product consumption is associated with changes in anthropometrics during a randomized, placebo-controlled energy-restriction trial

    PubMed Central

    2011-01-01

    Insufficient calcium intake has been proposed to cause unbalanced energy partitioning leading to obesity. However, weight loss interventions including dietary calcium or dairy product consumption have not reported changes in lipid metabolism measured by the plasma lipidome. Methods The objective of this study was to determine the relationships between dairy product or supplemental calcium intake with changes in the plasma lipidome and body composition during energy restriction. A secondary objective of this study was to explore the relationships among calculated macronutrient composition of the energy restricted diet to changes in the plasma lipidome, and body composition during energy restriction. Overweight adults (n = 61) were randomized into one of three intervention groups including a deficit of 500kcal/d: 1) placebo; 2) 900 mg/d calcium supplement; and 3) 3-4 servings of dairy products/d plus a placebo supplement. Plasma fatty acid methyl esters of cholesterol ester, diacylglycerol, free fatty acids, lysophosphatidylcholine, phosphatidylcholine, phosphatidylethanolamine and triacylglycerol were quantified by capillary gas chromatography. Results After adjustments for energy and protein (g/d) intake, there was no significant effect of treatment on changes in weight, waist circumference or body composition. Plasma lipidome did not differ among dietary treatment groups. Stepwise regression identified correlations between reported intake of monounsaturated fat (% of energy) and changes in % lean mass (r = -0.44, P < 0.01) and % body fat (r = 0.48, P < 0.001). Polyunsaturated fat intake was associated with the % change in waist circumference (r = 0.44, P < 0.01). Dietary saturated fat was not associated with any changes in anthropometrics or the plasma lipidome. Conclusions Dairy product consumption or calcium supplementation during energy restriction over the course of 12 weeks did not affect plasma lipids. Independent of calcium and dairy product consumption, short-term energy restriction altered body composition. Reported dietary fat composition of energy restricted diets was associated with the degree of change in body composition in these overweight and obese individuals. PMID:21970320

  6. Implementation of hybrid clustering based on partitioning around medoids algorithm and divisive analysis on human Papillomavirus DNA

    NASA Astrophysics Data System (ADS)

    Arimbi, Mentari Dian; Bustamam, Alhadi; Lestari, Dian

    2017-03-01

    Data clustering can be executed through partition or hierarchical methods for many types of data, including DNA sequences. The two approaches can be combined by running a partition algorithm in the first level and a hierarchical algorithm in the second level, called hybrid clustering. In the partition phase, popular methods such as PAM, K-means, or Fuzzy c-means could be applied; in this study we selected partitioning around medoids (PAM) for the partition stage. Following the partition algorithm, in the hierarchical stage we applied the divisive analysis algorithm (DIANA) in order to obtain more specific cluster and sub-cluster structures. The number of main clusters is determined using the Davies-Bouldin Index (DBI): we choose the number of clusters that minimizes the DBI value. In this work, we cluster 1252 HPV DNA sequences from GenBank. Feature extraction is performed first, followed by normalization and genetic distance calculation using Euclidean distance. In our implementation, we used the hybrid PAM and DIANA approach in the R open source programming tool. We obtained 3 main clusters with an average DBI value of 0.979 using PAM in the first stage. After executing DIANA in the second stage, we obtained 4 sub-clusters for Cluster-1, 9 sub-clusters for Cluster-2, and 2 sub-clusters for Cluster-3, with DBI values of 0.972, 0.771, and 0.768 for each main cluster, respectively. Since the second stage produces lower DBI values than the first stage, we conclude that this hybrid approach can improve the accuracy of our clustering results.
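
    The partition-then-score workflow above can be sketched in a few lines. scikit-learn does not ship PAM or DIANA, so the sketch below uses k-means as a stand-in for the partition stage, purely to show how the Davies-Bouldin index drives the choice of the number of main clusters; the feature matrix is random placeholder data, not HPV sequence features.

    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.metrics import davies_bouldin_score

    rng = np.random.default_rng(42)
    X = rng.normal(size=(300, 12))          # placeholder for extracted sequence features

    scores = {}
    for k in range(2, 8):
        labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
        scores[k] = davies_bouldin_score(X, labels)   # lower is better

    best_k = min(scores, key=scores.get)
    print("DBI per k:", scores)
    print("chosen number of main clusters:", best_k)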

  7. Scoring and staging systems using Cox linear regression modeling and recursive partitioning.

    PubMed

    Lee, J W; Um, S H; Lee, J B; Mun, J; Cho, H

    2006-01-01

    Scoring and staging systems are used to determine the order and class of data according to predictors. Systems used for medical data, such as the Child-Turcotte-Pugh scoring and staging systems for ordering and classifying patients with liver disease, are often derived strictly from physicians' experience and intuition. We construct objective and data-based scoring/staging systems using statistical methods. We consider Cox linear regression modeling and recursive partitioning techniques for censored survival data. In particular, to obtain a target number of stages we propose cross-validation and amalgamation algorithms. We also propose an algorithm for constructing scoring and staging systems by integrating local Cox linear regression models into recursive partitioning, so that we can retain the merits of both methods such as superior predictive accuracy, ease of use, and detection of interactions between predictors. The staging system construction algorithms are compared by cross-validation evaluation of real data. The data-based cross-validation comparison shows that Cox linear regression modeling is somewhat better than recursive partitioning when there are only continuous predictors, while recursive partitioning is better when there are significant categorical predictors. The proposed local Cox linear recursive partitioning has better predictive accuracy than Cox linear modeling and simple recursive partitioning. This study indicates that integrating local linear modeling into recursive partitioning can significantly improve prediction accuracy in constructing scoring and staging systems.
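
    A bare-bones version of the Cox-based scoring step can be sketched with the lifelines package on synthetic data; the cross-validation, amalgamation, and recursive-partitioning components of the paper are not reproduced here.

    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    rng = np.random.default_rng(7)
    n = 400

    # Synthetic survival data with two predictors.
    df = pd.DataFrame({
        "age": rng.normal(60, 10, n),
        "marker": rng.normal(0, 1, n),
    })
    hazard = np.exp(0.03 * (df["age"] - 60) + 0.8 * df["marker"])
    df["time"] = rng.exponential(1.0 / hazard)
    df["event"] = rng.binomial(1, 0.8, n)            # some observations censored

    cph = CoxPHFitter()
    cph.fit(df, duration_col="time", event_col="event")

    # Use the linear predictor as a prognostic score, then cut it into a target number of stages.
    score = np.asarray(cph.predict_partial_hazard(df)).ravel()
    df["stage"] = pd.qcut(score, q=3, labels=["I", "II", "III"])
    print(df["stage"].value_counts())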

  8. The Use of Binary Search Trees in External Distribution Sorting.

    ERIC Educational Resources Information Center

    Cooper, David; Lynch, Michael F.

    1984-01-01

    Suggests new method of external distribution called tree partitioning that involves use of binary tree to split incoming file into successively smaller partitions for internal sorting. Number of disc accesses during a tree-partitioning sort were calculated in simulation using files extracted from British National Bibliography catalog files. (19…
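
    A compact way to see the idea: choose pivot keys, use them to split the incoming file into partitions that each fit in memory, sort each partition internally, and concatenate. The sketch below uses Python's bisect over sorted pivots in place of an explicit binary tree (the search is equivalent), with toy in-memory data standing in for an external file.

    import bisect
    import random

    def tree_partition_sort(records, n_partitions):
        """Distribution sort: route each record to a partition via binary search
        over pivot keys, then sort each partition internally and concatenate."""
        # Choose pivots from a sample, as an external sort would in a first pass.
        sample = sorted(random.sample(records, min(len(records), 100)))
        step = max(1, len(sample) // n_partitions)
        pivots = sample[step::step][: n_partitions - 1]

        buckets = [[] for _ in range(len(pivots) + 1)]
        for r in records:
            buckets[bisect.bisect_right(pivots, r)].append(r)   # binary search on pivots

        out = []
        for b in buckets:              # each bucket would be an on-disk run in the external case
            out.extend(sorted(b))      # internal sort of a partition that fits in memory
        return out

    data = [random.randint(0, 10_000) for _ in range(5_000)]
    assert tree_partition_sort(data, 8) == sorted(data)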

  9. A physics-motivated Centroidal Voronoi Particle domain decomposition method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fu, Lin, E-mail: lin.fu@tum.de; Hu, Xiangyu Y., E-mail: xiangyu.hu@tum.de; Adams, Nikolaus A., E-mail: nikolaus.adams@tum.de

    2017-04-15

    In this paper, we propose a novel domain decomposition method for large-scale simulations in continuum mechanics by merging the concepts of Centroidal Voronoi Tessellation (CVT) and Voronoi Particle dynamics (VP). The CVT is introduced to achieve a high-level compactness of the partitioning subdomains by the Lloyd algorithm which monotonically decreases the CVT energy. The number of computational elements between neighboring partitioning subdomains, which scales the communication effort for parallel simulations, is optimized implicitly as the generated partitioning subdomains are convex and simply connected with small aspect-ratios. Moreover, Voronoi Particle dynamics employing physical analogy with a tailored equation of state is developed, which relaxes the particle system towards the target partition with good load balance. Since the equilibrium is computed by an iterative approach, the partitioning subdomains exhibit locality and the incremental property. Numerical experiments reveal that the proposed Centroidal Voronoi Particle (CVP) based algorithm produces high-quality partitioning with high efficiency, independently of computational-element types. Thus it can be used for a wide range of applications in computational science and engineering.

  10. A physics-motivated Centroidal Voronoi Particle domain decomposition method

    NASA Astrophysics Data System (ADS)

    Fu, Lin; Hu, Xiangyu Y.; Adams, Nikolaus A.

    2017-04-01

    In this paper, we propose a novel domain decomposition method for large-scale simulations in continuum mechanics by merging the concepts of Centroidal Voronoi Tessellation (CVT) and Voronoi Particle dynamics (VP). The CVT is introduced to achieve a high-level compactness of the partitioning subdomains by the Lloyd algorithm which monotonically decreases the CVT energy. The number of computational elements between neighboring partitioning subdomains, which scales the communication effort for parallel simulations, is optimized implicitly as the generated partitioning subdomains are convex and simply connected with small aspect-ratios. Moreover, Voronoi Particle dynamics employing physical analogy with a tailored equation of state is developed, which relaxes the particle system towards the target partition with good load balance. Since the equilibrium is computed by an iterative approach, the partitioning subdomains exhibit locality and the incremental property. Numerical experiments reveal that the proposed Centroidal Voronoi Particle (CVP) based algorithm produces high-quality partitioning with high efficiency, independently of computational-element types. Thus it can be used for a wide range of applications in computational science and engineering.
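
    Lloyd's algorithm, the core of the CVT step described above, alternates between assigning sample points of the domain to their nearest generator and moving each generator to the centroid of its assigned points. The sketch below does this by Monte Carlo sampling of the unit square; it ignores the Voronoi-particle dynamics and load-balancing parts of the method.

    import numpy as np

    def lloyd_cvt(n_generators, n_samples=200_000, n_iters=50, seed=0):
        """Approximate a centroidal Voronoi tessellation of the unit square."""
        rng = np.random.default_rng(seed)
        generators = rng.random((n_generators, 2))
        samples = rng.random((n_samples, 2))          # Monte Carlo proxy for the domain

        for _ in range(n_iters):
            # Assign each sample to its nearest generator (its Voronoi cell).
            d2 = ((samples[:, None, :] - generators[None, :, :]) ** 2).sum(axis=2)
            owner = d2.argmin(axis=1)
            # Move each generator to the centroid of its cell; each Lloyd iteration
            # decreases (or leaves unchanged) the CVT energy.
            for g in range(n_generators):
                cell = samples[owner == g]
                if len(cell):
                    generators[g] = cell.mean(axis=0)
        return generators

    print(lloyd_cvt(8, n_samples=20_000, n_iters=30))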

  11. Monogamy relations of concurrence for any dimensional quantum systems

    NASA Astrophysics Data System (ADS)

    Zhu, Xue-Na; Li-Jost, Xianqing; Fei, Shao-Ming

    2017-11-01

    We study monogamy relations for arbitrary dimensional multipartite systems. Monogamy relations based on concurrence and concurrence of assistance for any dimensional m_1 ⊗ m_2 ⊗ ... ⊗ m_N quantum states are derived, which give rise to restrictions on the entanglement distributions among the subsystems. Besides, we give the lower bound of concurrence for four-partite mixed states. The approach can be readily generalized to arbitrary multipartite systems.

  12. Partition functions for heterotic WZW conformal field theories

    NASA Astrophysics Data System (ADS)

    Gannon, Terry

    1993-08-01

    Thus far in the search for, and classification of, "physical" modular invariant partition functions Σ N_{LR} χ_L χ_R^*, the attention has been focused on the symmetric case where the holomorphic and anti-holomorphic sectors, and hence the characters χ_L and χ_R, are associated with the same Kac-Moody algebras ĝ_L = ĝ_R and levels κ_L = κ_R. In this paper we consider the more general possibility where (ĝ_L, κ_L) may not equal (ĝ_R, κ_R). We discuss which choices of algebras and levels may correspond to well-defined conformal field theories, we find the "smallest" such heterotic (i.e. asymmetric) partition functions, and we give a method, generalizing the Roberts-Terao-Warner lattice method, for explicitly constructing many other modular invariants. We conclude the paper by proving that this new lattice method will succeed in generating all the heterotic partition functions, for all choices of algebras and levels.

  13. Resource partitioning facilitates coexistence in sympatric cetaceans in the California Current.

    PubMed

    Fossette, Sabrina; Abrahms, Briana; Hazen, Elliott L; Bograd, Steven J; Zilliacus, Kelly M; Calambokidis, John; Burrows, Julia A; Goldbogen, Jeremy A; Harvey, James T; Marinovic, Baldo; Tershy, Bernie; Croll, Donald A

    2017-11-01

    Resource partitioning is an important process driving habitat use and foraging strategies in sympatric species that potentially compete. Differences in foraging behavior are hypothesized to contribute to species coexistence by facilitating resource partitioning, but little is known on the multiple mechanisms for partitioning that may occur simultaneously. Studies are further limited in the marine environment, where the spatial and temporal distribution of resources is highly dynamic and subsequently difficult to quantify. We investigated potential pathways by which foraging behavior may facilitate resource partitioning in two of the largest co-occurring and closely related species on Earth, blue ( Balaenoptera musculus ) and humpback ( Megaptera novaeangliae ) whales. We integrated multiple long-term datasets (line-transect surveys, whale-watching records, net sampling, stable isotope analysis, and remote-sensing of oceanographic parameters) to compare the diet, phenology, and distribution of the two species during their foraging periods in the highly productive waters of Monterey Bay, California, USA within the California Current Ecosystem. Our long-term study reveals that blue and humpback whales likely facilitate sympatry by partitioning their foraging along three axes: trophic, temporal, and spatial. Blue whales were specialists foraging on krill, predictably targeting a seasonal peak in krill abundance, were present in the bay for an average of 4.7 months, and were spatially restricted at the continental shelf break. In contrast, humpback whales were generalists apparently feeding on a mixed diet of krill and fishes depending on relative abundances, were present in the bay for a more extended period (average of 6.6 months), and had a broader spatial distribution at the shelf break and inshore. Ultimately, competition for common resources can lead to behavioral, morphological, and physiological character displacement between sympatric species. Understanding the mechanisms for species coexistence is both fundamental to maintaining biodiverse ecosystems, and provides insight into the evolutionary drivers of morphological differences in closely related species.

  14. High Performance Computing Based Parallel Hierarchical Modal Association Clustering (HPAR HMAC)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patlolla, Dilip R; Surendran Nair, Sujithkumar; Graves, Daniel A.

    For many applications, clustering is a crucial step in order to gain insight into the makeup of a dataset. The best approach to a given problem often depends on a variety of factors, such as the size of the dataset, time restrictions, and soft clustering requirements. The HMAC algorithm seeks to combine the strengths of 2 particular clustering approaches: model-based and linkage-based clustering. One particular weakness of HMAC is its computational complexity. HMAC is not practical for mega-scale data clustering. For high-definition imagery, a user would have to wait months or years for a result; for a 16-megapixel image, the estimated runtime skyrockets to over a decade! To improve the execution time of HMAC, it is reasonable to consider a multi-core implementation that utilizes available system resources. An existing implementation (Ray and Cheng 2014) divides the dataset into N partitions, one for each thread, prior to executing the HMAC algorithm. This implementation benefits from 2 types of optimization: parallelization and divide-and-conquer. By running each partition in parallel, the program is able to accelerate computation by utilizing more system resources. Although the parallel implementation provides considerable improvement over the serial HMAC, it still suffers from poor computational complexity, O(N^2). Once the maximum number of cores on a system is exhausted, the program exhibits slower behavior. We now consider a modification to HMAC that involves a recursive partitioning scheme. Our modification aims to exploit the divide-and-conquer benefits seen by the parallel HMAC implementation. At each level in the recursion tree, partitions are divided into 2 sub-partitions until a threshold size is reached. When a partition can no longer be divided without falling below the threshold size, the base HMAC algorithm is applied. This results in a significant speedup over the parallel HMAC.

  15. Targeted Repression of Essential Genes To Arrest Growth and Increase Carbon Partitioning and Biofuel Titers in Cyanobacteria.

    PubMed

    Shabestary, Kiyan; Anfelt, Josefine; Ljungqvist, Emil; Jahn, Michael; Yao, Lun; Hudson, Elton P

    2018-06-08

    Photoautotrophic production of fuels and chemicals by cyanobacteria typically gives lower volumetric productivities and titers than heterotrophic production. Cyanobacteria cultures become light limited above an optimal cell density, so that this substrate is not supplied to all cells sufficiently. Here, we investigate genetic strategies for a two-phase cultivation, where biofuel-producing Synechocystis cultures are limited to an optimal cell density through inducible CRISPR interference (CRISPRi) repression of cell growth. Fixed CO 2 is diverted to ethanol or n-butanol. Among the most successful strategies was partial repression of citrate synthase gltA. Strong repression (>90%) of gltA at low culture densities increased carbon partitioning to n-butanol 5-fold relative to a nonrepression strain, but sacrificed volumetric productivity due to severe growth restriction. CO 2 fixation continued for at least 3 days after growth was arrested. By targeting sgRNAs to different regions of the gltA gene, we could modulate GltA expression and carbon partitioning between growth and product to increase both specific and volumetric productivity. These growth arrest strategies can be useful for improving performance of other photoautotrophic processes.

  16. Genotypic distribution of a specialist model microorganism, Methanosaeta, along an estuarine gradient: does metabolic restriction limit niche differentiation potential?

    PubMed

    Carbonero, Franck; Oakley, Brian B; Hawkins, Robert J; Purdy, Kevin J

    2012-05-01

    A reductionist ecological approach of using a model genus was adopted in order to understand how microbial community structure is driven by metabolic properties. The distribution along an estuarine gradient of the highly specialised genus Methanosaeta was investigated and compared to the previously determined distribution of the more metabolically flexible Desulfobulbus. Methanosaeta genotypic distribution along the Colne estuary (Essex, UK) was determined by DNA- and RNA-based denaturing gradient gel electrophoresis and 16S rRNA gene sequence analyses. Methanosaeta distribution was monotonic, with a consistently diverse community and no apparent niche partitioning either in DNA or RNA analyses. This distribution pattern contrasts markedly with the previously described niche partitioning and sympatric differentiation of the model generalist, Desulfobulbus. To explain this difference, it is hypothesised that Methanosaeta's strict metabolic needs limit its adaptation potential, thus populations do not partition into spatially distinct groups and so do not appear to be constrained by gross environmental factors such as salinity. Thus, at least for these two model genera, it appears that metabolic flexibility may be an important factor in spatial distribution and this may be applicable to other microbes.

  17. Comparative phylogeography of reef fishes from the Gulf of Aden to the Arabian Sea reveals two cryptic lineages

    NASA Astrophysics Data System (ADS)

    DiBattista, Joseph D.; Gaither, Michelle R.; Hobbs, Jean-Paul A.; Saenz-Agudelo, Pablo; Piatek, Marek J.; Bowen, Brian W.; Rocha, Luiz A.; Howard Choat, J.; McIlwain, Jennifer H.; Priest, Mark A.; Sinclair-Taylor, Tane H.; Berumen, Michael L.

    2017-06-01

    The Arabian Sea is a heterogeneous region with high coral cover and warm stable conditions at the western end (Djibouti), in contrast to sparse coral cover, cooler temperatures, and upwelling at the eastern end (southern Oman). We tested for barriers to dispersal across this region (including the Gulf of Aden and Gulf of Oman), using mitochondrial DNA surveys of 11 reef fishes. Study species included seven taxa from six families with broad distributions across the Indo-Pacific and four species restricted to the Arabian Sea (and adjacent areas). Nine species showed no significant genetic partitions, indicating connectivity among contrasting environments spread across 2000 km. One butterflyfish ( Chaetodon melannotus) and a snapper ( Lutjanus kasmira) showed phylogenetic divergences of d = 0.008 and 0.048, respectively, possibly indicating cryptic species within these broadly distributed taxa. These genetic partitions at the western periphery of the Indo-Pacific reflect similar partitions recently discovered at the eastern periphery of the Indo-Pacific (the Hawaiian and the Marquesan Archipelagos), indicating that these disjunctive habitats at the ends of the range may serve as evolutionary incubators for coral reef organisms.

  18. KmL3D: a non-parametric algorithm for clustering joint trajectories.

    PubMed

    Genolini, C; Pingault, J B; Driss, T; Côté, S; Tremblay, R E; Vitaro, F; Arnaud, C; Falissard, B

    2013-01-01

    In cohort studies, variables are measured repeatedly and can be considered as trajectories. A classic way to work with trajectories is to cluster them in order to detect the existence of homogeneous patterns of evolution. Since cohort studies usually measure a large number of variables, it might be interesting to study the joint evolution of several variables (also called joint-variable trajectories). To date, the only way to cluster joint-trajectories is to cluster each trajectory independently, then to cross the partitions obtained. This approach is unsatisfactory because it does not take into account a possible co-evolution of variable-trajectories. KmL3D is an R package that implements a version of k-means dedicated to clustering joint-trajectories. It provides facilities for the management of missing values, offers several quality criteria and its graphic interface helps the user to select the best partition. KmL3D can work with any number of joint-variable trajectories. In the restricted case of two joint trajectories, it proposes 3D tools to visualize the partitioning and then export 3D dynamic rotating-graphs to PDF format. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
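
    KmL3D itself is an R package; the underlying idea of clustering joint trajectories, treating each subject's stacked variable-by-time block as a single feature vector for k-means, can be sketched as follows with synthetic trajectories (standardized per variable so that no variable dominates the distance).

    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(3)
    n_subjects, n_times, n_vars = 120, 10, 2      # two joint variable-trajectories

    # Two synthetic evolution patterns for the pair of variables.
    t = np.linspace(0, 1, n_times)
    group = rng.integers(0, 2, n_subjects)
    traj = np.empty((n_subjects, n_vars, n_times))
    traj[:, 0, :] = np.where(group[:, None] == 0, 1.0 + t, 2.0 - t) + rng.normal(0, 0.2, (n_subjects, n_times))
    traj[:, 1, :] = np.where(group[:, None] == 0, 0.5 * t, 1.0 - 0.5 * t) + rng.normal(0, 0.2, (n_subjects, n_times))

    # Standardize each variable, then flatten to (subject x [var1 trajectory, var2 trajectory]).
    z = (traj - traj.mean(axis=(0, 2), keepdims=True)) / traj.std(axis=(0, 2), keepdims=True)
    X = z.reshape(n_subjects, n_vars * n_times)

    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
    acc = (labels == group).mean()
    print("agreement with the generating groups (up to label swap):", max(acc, 1 - acc))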

  19. Task-specific image partitioning.

    PubMed

    Kim, Sungwoong; Nowozin, Sebastian; Kohli, Pushmeet; Yoo, Chang D

    2013-02-01

    Image partitioning is an important preprocessing step for many of the state-of-the-art algorithms used for performing high-level computer vision tasks. Typically, partitioning is conducted without regard to the task in hand. We propose a task-specific image partitioning framework to produce a region-based image representation that will lead to a higher task performance than that reached using any task-oblivious partitioning framework and existing supervised partitioning framework, albeit few in number. The proposed method partitions the image by means of correlation clustering, maximizing a linear discriminant function defined over a superpixel graph. The parameters of the discriminant function that define task-specific similarity/dissimilarity among superpixels are estimated based on structured support vector machine (S-SVM) using task-specific training data. The S-SVM learning leads to a better generalization ability while the construction of the superpixel graph used to define the discriminant function allows a rich set of features to be incorporated to improve discriminability and robustness. We evaluate the learned task-aware partitioning algorithms on three benchmark datasets. Results show that task-aware partitioning leads to better labeling performance than the partitioning computed by the state-of-the-art general-purpose and supervised partitioning algorithms. We believe that the task-specific image partitioning paradigm is widely applicable to improving performance in high-level image understanding tasks.

  20. Partition coefficients of methylated DNA bases obtained from free energy calculations with molecular electron density derived atomic charges.

    PubMed

    Lara, A; Riquelme, M; Vöhringer-Martinez, E

    2018-05-11

    Partition coefficients serve in various areas, such as pharmacology and environmental sciences, to predict the hydrophobicity of different substances. Recently, they have also been used to address the accuracy of force fields for various organic compounds and specifically the methylated DNA bases. In this study, atomic charges were derived by different partitioning methods (Hirshfeld and Minimal Basis Iterative Stockholder) directly from the electron density obtained by electronic structure calculations in a vacuum, with an implicit solvation model or with explicit solvation taking the dynamics of the solute and the solvent into account. To test the ability of these charges to describe electrostatic interactions in force fields for condensed phases, the original atomic charges of the AMBER99 force field were replaced with the new atomic charges and combined with different solvent models to obtain the hydration and chloroform solvation free energies by molecular dynamics simulations. Chloroform-water partition coefficients derived from the obtained free energies were compared to experimental and previously reported values obtained with the GAFF or the AMBER-99 force field. The results show that good agreement with experimental data is obtained when the polarization of the electron density by the solvent has been taken into account, and when the energy needed to polarize the electron density of the solute has been considered in the transfer free energy. These results were further confirmed by hydration free energies of polar and aromatic amino acid side chain analogs. Comparison of the two partitioning methods, Hirshfeld-I and Minimal Basis Iterative Stockholder (MBIS), revealed some deficiencies in the Hirshfeld-I method related to the unstable isolated anionic nitrogen pro-atom used in the method. Hydration free energies and partitioning coefficients obtained with atomic charges from the MBIS partitioning method accounting for polarization by the implicit solvation model are in good agreement with the experimental values. © 2018 Wiley Periodicals, Inc.
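
    The step from simulated solvation free energies to a chloroform-water partition coefficient is worth making explicit: with the usual convention that the transfer free energy from water to the organic phase equals -RT ln P, one gets log10 P = (ΔG_water - ΔG_organic)/(RT ln 10). The sketch below uses that relation with illustrative numbers, not the paper's values.

    import math

    R = 8.314462618e-3   # gas constant in kJ mol^-1 K^-1

    def log_partition_coefficient(dg_water, dg_organic, temperature=298.15):
        """log10 P for organic/water partitioning from solvation free energies (kJ/mol).

        Transfer free energy water -> organic is dG_org - dG_water = -RT ln P,
        so log10 P = (dG_water - dG_org) / (RT ln 10).
        """
        return (dg_water - dg_organic) / (R * temperature * math.log(10))

    # Illustrative values for a methylated base analogue (kJ/mol).
    print(log_partition_coefficient(dg_water=-45.0, dg_organic=-55.0))   # ~1.75 -> prefers chloroform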

  1. Spectral (Finite) Volume Method for Conservation Laws on Unstructured Grids II: Extension to Two Dimensional Scalar Equation

    NASA Technical Reports Server (NTRS)

    Wang, Z. J.; Liu, Yen; Kwak, Dochan (Technical Monitor)

    2002-01-01

    The framework for constructing a high-order, conservative Spectral (Finite) Volume (SV) method is presented for two-dimensional scalar hyperbolic conservation laws on unstructured triangular grids. Each triangular grid cell forms a spectral volume (SV), and the SV is further subdivided into polygonal control volumes (CVs) to support high-order data reconstructions. Cell-averaged solutions from these CVs are used to reconstruct a high order polynomial approximation in the SV. Each CV is then updated independently with a Godunov-type finite volume method and a high-order Runge-Kutta time integration scheme. A universal reconstruction is obtained by partitioning all SVs in a geometrically similar manner. The convergence of the SV method is shown to depend on how an SV is partitioned. A criterion based on the Lebesgue constant has been developed and used successfully to determine the quality of various partitions. Symmetric, stable, and convergent linear, quadratic, and cubic SVs have been obtained, and many different types of partitions have been evaluated. The SV method is tested for both linear and non-linear model problems with and without discontinuities.

  2. Iron Partitioning in Ferropericlase and Consequences for the Magma Ocean.

    NASA Astrophysics Data System (ADS)

    Braithwaite, J. W. H.; Stixrude, L. P.; Holmstrom, E.; Pinilla, C.

    2016-12-01

    The relative buoyancy of crystals and liquid is likely to exert a strong influence on the thermal and chemical evolution of the magma ocean. Theory indicates that liquids approach, but do not exceed, the density of iso-chemical crystals in the deep mantle. The partitioning of heavy elements, such as Fe, is therefore likely to control whether crystals sink or float. While some experimental results exist, our knowledge of silicate liquid-crystal element partitioning is still limited in the deep mantle. We have developed a method for computing the Mg-Fe partitioning of Fe in such systems. We have focused initially on ferropericlase, as a relatively simple system where the buoyancy effects of Fe partitioning are likely to be large. The method is based on molecular dynamics driven by density functional theory (spin polarized, PBEsol+U). We compute the free energy of Mg-for-Fe substitution in simulations of liquid and B1 crystalline phases via adiabatic switching. We investigate the dependence of partitioning on pressure, temperature, and iron concentration. We find that the liquid is denser than the coexisting crystalline phase at all conditions studied. We also find that the high-spin to low-spin transitions in the crystal and the liquid have an important influence on partitioning behavior.

  3. An iterative network partition algorithm for accurate identification of dense network modules

    PubMed Central

    Sun, Siqi; Dong, Xinran; Fu, Yao; Tian, Weidong

    2012-01-01

    A key step in network analysis is to partition a complex network into dense modules. Currently, modularity is one of the most popular benefit functions used to partition network modules. However, recent studies suggested that it has an inherent limitation in detecting dense network modules. In this study, we observed that despite the limitation, modularity has the advantage of preserving the primary network structure of the undetected modules. Thus, we have developed a simple iterative Network Partition (iNP) algorithm to partition a network. The iNP algorithm provides a general framework in which any modularity-based algorithm can be implemented in the network partition step. Here, we tested iNP with three modularity-based algorithms: multi-step greedy (MSG), spectral clustering and Qcut. Compared with the original three methods, iNP achieved a significant improvement in the quality of network partition in a benchmark study with simulated networks, identified more modules with significantly better enrichment of functionally related genes in both yeast protein complex network and breast cancer gene co-expression network, and discovered more cancer-specific modules in the cancer gene co-expression network. As such, iNP should have a broad application as a general method to assist in the analysis of biological networks. PMID:22121225
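
    The iNP idea, partition the network and then re-partition each detected module until modules stop splitting, can be sketched with any modularity-based routine plugged into the partition step. The sketch below uses networkx's greedy modularity communities as that plug-in rather than the MSG, spectral clustering, or Qcut implementations tested in the paper; the graph and the size threshold are toy choices.

    import networkx as nx
    from networkx.algorithms.community import greedy_modularity_communities

    def iterative_partition(graph, min_size=4):
        """Recursively re-partition detected modules until they no longer split
        (or fall below min_size), in the spirit of the iNP framework."""
        modules = [set(c) for c in greedy_modularity_communities(graph)]
        if len(modules) <= 1:
            return [set(graph.nodes())]
        final = []
        for m in modules:
            if len(m) <= min_size:
                final.append(m)
                continue
            sub = graph.subgraph(m)
            parts = iterative_partition(sub, min_size)
            final.extend(parts if len(parts) > 1 else [m])
        return final

    # Toy example: three dense cliques loosely joined.
    G = nx.connected_caveman_graph(3, 8)
    for module in iterative_partition(G):
        print(sorted(module))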

  4. Partitioning of monomethylmercury between freshwater algae and water.

    PubMed

    Miles, C J; Moye, H A; Phlips, E J; Sargent, B

    2001-11-01

    Phytoplankton-water monomethylmercury (MeHg) partition constants (KpI) have been determined in the laboratory for two green algae, Selenastrum capricornutum and Cosmarium botrytis, the blue-green alga Schizothrix calcicola, and the diatom Thallasiosira spp., algal species that are commonly found in natural surface waters. Two methods were used to determine KpI, the Freundlich isotherm method and the flow-through/dialysis bag method. Both methods yielded KpI values of about 10^6.6 for S. capricornutum and were not significantly different. The KpI for the four algae studied were similar except for Schizothrix, which was significantly lower than S. capricornutum. The KpI for MeHg and S. capricornutum (exponential growth) was not significantly different in systems with predominantly MeHgOH or MeHgCl species. This is consistent with other studies that show metal speciation controls uptake kinetics, but the reactivity with intracellular components controls steady-state concentrations. Partitioning constants determined with exponential and stationary phase S. capricornutum cells at the same conditions were not significantly different, while the partitioning constant for exponential phase, phosphorus-limited cells was significantly lower, suggesting that P-limitation alters the ecophysiology of S. capricornutum sufficiently to impact partitioning, which may then ultimately affect mercury levels in higher trophic species.
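
    The Freundlich-isotherm route to a partition constant is a log-log linear fit: Cs = Kf·Cw^n implies log Cs = log Kf + n·log Cw. A minimal sketch on synthetic sorption data (all values invented):

    import numpy as np

    # Synthetic sorption data: algal-phase concentration Cs vs dissolved concentration Cw.
    rng = np.random.default_rng(5)
    Cw = np.logspace(-3, 0, 12)                       # dissolved MeHg (arbitrary units)
    true_logKf, true_n = 6.6, 0.95
    Cs = 10**true_logKf * Cw**true_n * 10**rng.normal(0, 0.05, Cw.size)

    # Freundlich fit: log Cs = log Kf + n * log Cw, a straight line in log-log space.
    n_fit, logKf_fit = np.polyfit(np.log10(Cw), np.log10(Cs), 1)
    print(f"log Kf = {logKf_fit:.2f}, n = {n_fit:.2f}")
    # n close to 1 means partitioning is effectively linear, i.e. a single Kp describes it.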

  5. A Multi-Objective Partition Method for Marine Sensor Networks Based on Degree of Event Correlation.

    PubMed

    Huang, Dongmei; Xu, Chenyixuan; Zhao, Danfeng; Song, Wei; He, Qi

    2017-09-21

    Existing marine sensor networks acquire data from sea areas that are geographically divided, and store the data independently in their affiliated sea area data centers. In the case of marine events across multiple sea areas, the current network structure needs to retrieve data from multiple data centers, and thus severely affects real-time decision making. In this study, in order to provide a fast data retrieval service for a marine sensor network, we use all the marine sensors as the vertices, establish the edge based on marine events, and abstract the marine sensor network as a graph. Then, we construct a multi-objective balanced partition method to partition the abstract graph into multiple regions and store them in the cloud computing platform. This method effectively increases the correlation of the sensors and decreases the retrieval cost. On this basis, an incremental optimization strategy is designed to dynamically optimize existing partitions when new sensors are added into the network. Experimental results show that the proposed method can achieve the optimal layout for distributed storage in the process of disaster data retrieval in the China Sea area, and effectively optimize the result of partitions when new buoys are deployed, which eventually will provide efficient data access service for marine events.

  6. Comparison of Source Partitioning Methods for CO2 and H2O Fluxes Based on High Frequency Eddy Covariance Data

    NASA Astrophysics Data System (ADS)

    Klosterhalfen, Anne; Moene, Arnold; Schmidt, Marius; Ney, Patrizia; Graf, Alexander

    2017-04-01

    Source partitioning of eddy covariance (EC) measurements of CO2 into respiration and photosynthesis is routinely used for a better understanding of the exchange of greenhouse gases, especially between terrestrial ecosystems and the atmosphere. The most frequently used methods are usually based either on relations of fluxes to environmental drivers or on chamber measurements. However, they often depend strongly on assumptions or invasive measurements and do usually not offer partitioning estimates for latent heat fluxes into evaporation and transpiration. SCANLON and SAHU (2008) and SCANLON and KUSTAS (2010) proposed an promising method to estimate the contributions of transpiration and evaporation using measured high frequency time series of CO2 and H2O fluxes - no extra instrumentation necessary. This method (SK10 in the following) is based on the spatial separation and relative strength of sources and sinks of CO2 and water vapor among the sub-canopy and canopy. Assuming that air from those sources and sinks is not yet perfectly mixed before reaching EC sensors, partitioning is estimated based on the separate application of the flux-variance similarity theory to the stomatal and non-stomatal components of the regarded fluxes, as well as on additional assumptions on stomatal water use efficiency (WUE). The CO2 partitioning method after THOMAS et al. (2008) (TH08 in the following) also follows the argument that the dissimilarities of sources and sinks in and below a canopy affect the relation between H2O and CO2 fluctuations. Instead of involving assumptions on WUE, TH08 directly screens their scattergram for signals of joint respiration and evaporation events and applies a conditional sampling methodology. In spite of their different main targets (H2O vs. CO2), both methods can yield partitioning estimates on both fluxes. We therefore compare various sub-methods of SK10 and TH08 including own modifications (e.g., cluster analysis) to each other, to established source partitioning methods, and to chamber measurements at various agroecosystems. Further, profile measurements and a canopy-resolving Large Eddy Simulation model are used to test the assumptions involved in SK10. Scanlon, T.M., Kustas, W.P., 2010. Partitioning carbon dioxide and water vapor fluxes using correlation analysis. Agricultural and Forest Meteorology 150 (1), 89-99. Scanlon, T.M., Sahu, P., 2008. On the correlation structure of water vapor and carbon dioxide in the atmospheric surface layer: A basis for flux partitioning. Water Resources Research 44 (10), W10418, 15 pp. Thomas, C., Martin, J.G., Goeckede, M., Siqueira, M.B., Foken, T., Law, B.E., Loescher H.W., Katul, G., 2008. Estimating daytime subcanopy respiration from conditional sampling methods applied to multi-scalar high frequency turbulence time series. Agricultural and Forest Meteorology 148 (8-9), 1210-1229.

  7. Maritime Search and Rescue via Multiple Coordinated UAS

    DTIC Science & Technology

    2016-01-01

    partitioning method uses the underlying probability distribution assumptions to place that probability near the geometric center of the partitions. There...During partitioning the known locations are accommodated, but the unaccounted-for objects are placed into geometrically unfavorable conditions. The...Zeitlin, A.D.: UAS Sense and Avoid Development - the Challenges of Technology, Standards, and Certification. Aerospace Sciences Meeting including

  8. Cost efficient CFD simulations: Proper selection of domain partitioning strategies

    NASA Astrophysics Data System (ADS)

    Haddadi, Bahram; Jordan, Christian; Harasek, Michael

    2017-10-01

    Computational Fluid Dynamics (CFD) is one of the most powerful simulation methods, used for temporally and spatially resolved solutions of fluid flow, heat transfer, mass transfer, etc. One of the challenges of CFD is its extreme hardware demand. Nowadays supercomputers (e.g., High Performance Computing, HPC) featuring multiple CPU cores are applied for solving; the simulation domain is split into partitions, one for each core. Some of the different methods for partitioning are investigated in this paper. As a practical example, a new open-source based solver was utilized for simulating packed bed adsorption, a common separation method within the field of thermal process engineering. Adsorption can, for example, be applied for the removal of trace gases from a gas stream or for the production of pure gases such as hydrogen. For comparing the performance of the partitioning methods, a 60 million cell mesh for a packed bed of spherical adsorbents was created; one second of the adsorption process was simulated. Different partitioning methods available in OpenFOAM® (Scotch, Simple, and Hierarchical) have been used with different numbers of sub-domains. The effect of the different methods and of the number of processor cores on simulation speedup and energy consumption was investigated for two different hardware infrastructures (Vienna Scientific Clusters VSC 2 and VSC 3). As a general recommendation, an optimum number of cells per processor core was calculated. Optimized simulation speed, lower energy consumption and, consequently, the cost effects are reported here.
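
    A minimal back-of-envelope sketch of the kind of recommendation mentioned at the end of the abstract, assuming a hypothetical optimum of 50,000 cells per processor core (the paper derives its own value, which is not quoted here):

```python
# Back-of-envelope sketch: map a 60-million-cell mesh to a core count, given
# an assumed optimum cells-per-core figure (illustrative value only).
total_cells = 60_000_000
cells_per_core = 50_000          # assumed value for illustration only

n_cores = round(total_cells / cells_per_core)
print(f"cores: {n_cores}, average cells per sub-domain: {total_cells // n_cores}")
```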

  9. Polymers as Reference Partitioning Phase: Polymer Calibration for an Analytically Operational Approach To Quantify Multimedia Phase Partitioning.

    PubMed

    Gilbert, Dorothea; Witt, Gesine; Smedes, Foppe; Mayer, Philipp

    2016-06-07

    Polymers are increasingly applied for the enrichment of hydrophobic organic chemicals (HOCs) from various types of samples and media in many analytical partitioning-based measuring techniques. We propose using polymers as a reference partitioning phase and introduce polymer-polymer partitioning as the basis for a deeper insight into partitioning differences of HOCs between polymers, calibrating analytical methods, and consistency checking of existing and calculation of new partition coefficients. Polymer-polymer partition coefficients were determined for polychlorinated biphenyls (PCBs), polycyclic aromatic hydrocarbons (PAHs), and organochlorine pesticides (OCPs) by equilibrating 13 silicones, including polydimethylsiloxane (PDMS) and low-density polyethylene (LDPE) in methanol-water solutions. Methanol as cosolvent ensured that all polymers reached equilibrium while its effect on the polymers' properties did not significantly affect silicone-silicone partition coefficients. However, we noticed minor cosolvent effects on determined polymer-polymer partition coefficients. Polymer-polymer partition coefficients near unity confirmed identical absorption capacities of several PDMS materials, whereas larger deviations from unity were indicated within the group of silicones and between silicones and LDPE. Uncertainty in polymer volume due to imprecise coating thickness or the presence of fillers was identified as the source of error for partition coefficients. New polymer-based (LDPE-lipid, PDMS-air) and multimedia partition coefficients (lipid-water, air-water) were calculated by applying the new concept of a polymer as reference partitioning phase and by using polymer-polymer partition coefficients as conversion factors. The present study encourages the use of polymer-polymer partition coefficients, recognizing that polymers can serve as a linking third phase for a quantitative understanding of equilibrium partitioning of HOCs between any two phases.
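
    A minimal sketch of the bookkeeping behind using a polymer as the reference partitioning phase: log-scale partition coefficients chain through the shared polymer phase, with polymer-polymer coefficients acting as conversion factors. The function name and numeric values below are illustrative assumptions, not data from the study.

```python
# Sketch of the "polymer as reference phase" arithmetic: log K values chain
# through a shared polymer phase. Numbers are placeholders, not measured data.
def combine_log_k(log_k_a_polymer: float, log_k_polymer_b: float) -> float:
    """log K(A/B) = log K(A/polymer) + log K(polymer/B)."""
    return log_k_a_polymer + log_k_polymer_b

log_k_lipid_pdms = 0.4     # hypothetical lipid-PDMS partition coefficient
log_k_pdms_water = 4.8     # hypothetical PDMS-water partition coefficient
log_k_lipid_water = combine_log_k(log_k_lipid_pdms, log_k_pdms_water)
print("estimated log K(lipid/water):", log_k_lipid_water)
```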

  10. Enhancing data locality by using terminal propagation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hendrickson, B.; Leland, R.; Van Driessche, R.

    1995-12-31

    Terminal propagation is a method developed in the circuit placement community for adding constraints to graph partitioning problems. This paper adapts and expands this idea, and applies it to the problem of partitioning data structures among the processors of a parallel computer. We show how the constraints in terminal propagation can be used to encourage partitions in which messages are communicated only between architecturally near processors. We then show how these constraints can be handled in two important partitioning algorithms, spectral bisection and multilevel-KL. We compare the quality of partitions generated by these algorithms to each other and to partitions generated by more familiar techniques.

  11. A partitioning strategy for nonuniform problems on multiprocessors

    NASA Technical Reports Server (NTRS)

    Berger, M. J.; Bokhari, S.

    1985-01-01

    The partitioning of a problem on a domain with unequal work estimates in different subdomains is considered in a way that balances the work load across multiple processors. Such a problem arises, for example, in solving partial differential equations using an adaptive method that places extra grid points in certain subregions of the domain. A binary decomposition of the domain is used to partition it into rectangles requiring equal computational effort. The communication costs of mapping this partitioning onto different multiprocessors (a mesh-connected array, a tree machine, and a hypercube) are then studied. The communication cost expressions can be used to determine the optimal depth of the above partitioning.
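
    A minimal sketch of the binary decomposition idea, assuming a 2-D array of per-cell work estimates and alternating the split axis at each level; tie-breaking and the mapping to processors are omitted.

```python
# Minimal sketch of a binary decomposition of a 2-D work-load array into
# rectangles of roughly equal total work (axis alternation simplified).
import numpy as np

def bisect(work, depth):
    """Recursively split `work` (2-D array) into 2**depth rectangles."""
    if depth == 0:
        return [work]
    axis = depth % 2                       # alternate split direction
    totals = work.sum(axis=1 - axis)       # work per row/column along axis
    cut = int(np.searchsorted(np.cumsum(totals), totals.sum() / 2.0)) + 1
    cut = min(cut, work.shape[axis] - 1)   # keep both halves non-empty
    first, second = np.split(work, [cut], axis=axis)
    return bisect(first, depth - 1) + bisect(second, depth - 1)

grid = np.random.default_rng(1).integers(1, 10, size=(64, 64))  # work estimates
parts = bisect(grid, depth=3)              # 8 rectangles
print([int(p.sum()) for p in parts])       # roughly equal work per rectangle
```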

  12. A Fifth-order Symplectic Trigonometrically Fitted Partitioned Runge-Kutta Method

    NASA Astrophysics Data System (ADS)

    Kalogiratou, Z.; Monovasilis, Th.; Simos, T. E.

    2007-09-01

    Trigonometrically fitted symplectic Partitioned Runge-Kutta (EFSPRK) methods for the numerical integration of Hamiltonian systems with oscillatory solutions are derived. These methods integrate exactly differential systems whose solutions can be expressed as linear combinations of the set of functions sin(wx), cos(wx), w ∈ R. We modify a fifth-order symplectic PRK method with six stages so as to derive an exponentially fitted SPRK method. The methods are tested on the numerical integration of the two-body problem.
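
    For orientation only, the sketch below applies a second-order partitioned symplectic step (Stoermer-Verlet) to the planar two-body problem; it is not the fifth-order trigonometrically fitted SPRK method of the abstract, but it shows the partitioned treatment of positions and momenta and the near-conservation of energy.

```python
# Generic sketch of a partitioned symplectic step (Stoermer-Verlet, order 2)
# for the planar Kepler problem with GM = 1 and unit mass.
import numpy as np

def accel(q):
    r3 = np.linalg.norm(q) ** 3
    return -q / r3                       # gravitational acceleration

def verlet_step(q, p, h):
    p_half = p + 0.5 * h * accel(q)      # kick (momentum update)
    q_new = q + h * p_half               # drift (position update)
    p_new = p_half + 0.5 * h * accel(q_new)
    return q_new, p_new

q, p = np.array([1.0, 0.0]), np.array([0.0, 1.0])   # circular orbit
for _ in range(10000):
    q, p = verlet_step(q, p, h=0.01)
energy = 0.5 * p @ p - 1.0 / np.linalg.norm(q)
print("energy after 10000 steps:", energy)           # stays near -0.5
```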

  13. Experimental Method Development for Estimating Solid-phase Diffusion Coefficients and Material/Air Partition Coefficients of SVOCs

    EPA Science Inventory

    The solid-phase diffusion coefficient (Dm) and material-air partition coefficient (Kma) are key parameters for characterizing the sources and transport of semivolatile organic compounds (SVOCs) in the indoor environment. In this work, a new experimental method was developed to es...

  14. 77 FR 46289 - Technical Corrections to Organizational Names, Addresses, and OMB Control Numbers

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-03

    ...]795.232 Inhalation and dermal pharmacokinetics of commercial hexane. * * * * * (c) * * * (2) * * * (i... to read as follows: Sec. 799.6755 TSCA partition coefficient (n-octanol/water), shake flask method... read as follows: Sec. 799.6756 TSCA partition coefficient (n-octanol/water), generator column method...

  15. Surveillance system and method having an operating mode partitioned fault classification model

    NASA Technical Reports Server (NTRS)

    Bickford, Randall L. (Inventor)

    2005-01-01

    A system and method which partitions a parameter estimation model, a fault detection model, and a fault classification model for a process surveillance scheme into two or more coordinated submodels together providing improved diagnostic decision making for at least one determined operating mode of an asset.

  16. Estimation of octanol/water partition coefficient and aqueous solubility of environmental chemicals using molecular fingerprints and machine learning methods

    EPA Science Inventory

    Octanol/water partition coefficient (logP) and aqueous solubility (logS) are two important parameters in pharmacology and toxicology studies, and experimental measurements are usually time-consuming and expensive. In the present research, novel methods are presented for the estim...

  17. The impact of aerosol composition on the particle to gas partitioning of reactive mercury.

    PubMed

    Rutter, Andrew P; Schauer, James J

    2007-06-01

    A laboratory system was developed to study the gas-particle partitioning of reactive mercury (RM) as a function of aerosol composition in synthetic atmospheric particulate matter. The collection of RM was achieved by filter- and sorbent-based methods. Analyses of the RM collected on the filters and sorbents were performed using thermal extraction combined with cold vapor atomic fluorescence spectroscopy (CVAFS), allowing direct measurement of the RM load on the substrates. Laboratory measurements of the gas-particle partitioning coefficients of RM to atmospheric aerosol particles revealed a strong dependence on aerosol composition, with partitioning coefficients that varied by orders of magnitude depending on the composition of the particles. Particles of sodium nitrate and the chlorides of potassium and sodium had high partitioning coefficients, shifting the RM partitioning toward the particle phase, while ammonium sulfate, levoglucosan, and adipic acid caused the RM to partition toward the gas phase and, therefore, had partitioning coefficients that were lower by orders of magnitude.

  18. A Dual Super-Element Domain Decomposition Approach for Parallel Nonlinear Finite Element Analysis

    NASA Astrophysics Data System (ADS)

    Jokhio, G. A.; Izzuddin, B. A.

    2015-05-01

    This article presents a new domain decomposition method for nonlinear finite element analysis introducing the concept of dual partition super-elements. The method extends ideas from the displacement frame method and is ideally suited for parallel nonlinear static/dynamic analysis of structural systems. In the new method, domain decomposition is realized by replacing one or more subdomains in a "parent system," each with a placeholder super-element, where the subdomains are processed separately as "child partitions," each wrapped by a dual super-element along the partition boundary. The analysis of the overall system, including the satisfaction of equilibrium and compatibility at all partition boundaries, is realized through direct communication between all pairs of placeholder and dual super-elements. The proposed method has particular advantages for matrix solution methods based on the frontal scheme, and can be readily implemented for existing finite element analysis programs to achieve parallelization on distributed memory systems with minimal intervention, thus overcoming memory bottlenecks typically faced in the analysis of large-scale problems. Several examples are presented in this article which demonstrate the computational benefits of the proposed parallel domain decomposition approach and its applicability to the nonlinear structural analysis of realistic structural systems.

  19. Partitioning of functional gene expression data using principal points.

    PubMed

    Kim, Jaehee; Kim, Haseong

    2017-10-12

    DNA microarrays offer motivation and hope for the simultaneous study of variations in multiple genes. Gene expression is a temporal process that allows variations in expression levels with a characterized gene function over a period of time. Temporal gene expression curves can be treated as functional data since they are considered as independent realizations of a stochastic process. This process requires appropriate models to identify patterns of gene functions. Partitioning the functional data can find homogeneous subgroups of entities for the massive numbers of genes within the inherent biological networks. Therefore, it can be a useful technique for the analysis of time-course gene expression data. We propose a new self-consistent partitioning method of functional coefficients for individual expression profiles based on an orthonormal basis system. A principal-points-based functional partitioning method is proposed for time-course gene expression data. The method explores the relationship between genes using Legendre coefficients as principal points to extract the features of gene functions. Our proposed method provides high connectivity after clustering for simulated data and finds significant subsets of genes with increased connectivity. Our approach has the comparative advantages that fewer coefficients are used from the functional data and that the principal points are self-consistent for partitioning. As real data applications, we are able to find partitioned genes from the gene expression profiles in budding yeast data and Escherichia coli data. The proposed method benefits from the use of principal points, dimension reduction, and the choice of an orthogonal basis system, and it provides appropriately connected genes in the resulting subsets. We illustrate our method by applying it to each set of cell-cycle-regulated time-course yeast genes and E. coli genes. The proposed method is able to identify highly connected genes and to explore the complex dynamics of biological systems in functional genomics.
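
    An analogous sketch under stated assumptions (Legendre fitting of each profile followed by ordinary k-means on the coefficients), not the paper's self-consistent principal-points algorithm; the time points and expression profiles are synthetic.

```python
# Analogous sketch: project each expression profile onto a Legendre basis and
# cluster the coefficient vectors. Synthetic data, not the paper's algorithm.
import numpy as np
from numpy.polynomial import legendre
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
t = np.linspace(-1, 1, 18)          # 18 time points, rescaled to [-1, 1]
profiles = np.vstack(
    [np.sin(np.pi * t) + 0.1 * rng.normal(size=t.size) for _ in range(50)] +
    [np.cos(np.pi * t) + 0.1 * rng.normal(size=t.size) for _ in range(50)])

coeffs = np.array([legendre.legfit(t, y, deg=4) for y in profiles])  # 5 per gene
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(coeffs)
print(np.bincount(labels))          # two groups of ~50 profiles each
```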

  20. Evaluation and routine application of the novel restricted-access precolumn packing material Alkyl-Diol Silica: coupled-column high-performance liquid chromatographic analysis of the photoreactive drug 8-methoxypsoralen in plasma.

    PubMed

    Vielhauer, S; Rudolphi, A; Boos, K S; Seidel, D

    1995-04-21

    A fully automated coupled-column HPLC method for on-line sample processing and determination of the photoreactive drug 8-methoxypsoralen (8-MOP) in plasma has been developed. The method is based on the novel internal-surface reversed-phase precolumn packing materials Alkyl-Diol Silica (ADS). This new family of restricted-access materials has a hydrophilic, electroneutral outer particle surface and a hydrophobic internal pore surface. The supports tolerate the direct and repetitive injection of proteinaceous fluids such as plasma and allow a classical C18-, C8- or C4-reversed-phase partitioning at the internal (pore) surface. The total protein load, i.e. the lifetime of the precolumn used in this study (C8-Alkyl-Diol Silica, 25 microns, 25 x 4 mm I.D.), exceeds more than 100 ml of plasma. 8-MOP was detected by its native fluorescence (excitation 312 nm, emission 540 nm). Validation of the method revealed a quantitative and matrix-independent recovery (99.5-101.3% measured at five concentrations between 21.3 and 625.2 ng of 8-MOP per milliliter of plasma), linearity over a wide range of 8-MOP concentrations (1.2-3070 ng of 8-MOP/ml, r = 0.999), low limits of detection (0.39 ng of 8-MOP/ml) and quantitation (0.79 ng of 8-MOP/ml) and a high between-run (C.V. 1.47%, n = 10) and within-run (C.V. 1.33%, n = 10) reproducibility. This paper introduces coupled-column HPLC as a suitable method for on-site analysis of drug plasma profiles (bedside-monitoring).

  1. MUSCLE: multiple sequence alignment with high accuracy and high throughput.

    PubMed

    Edgar, Robert C

    2004-01-01

    We describe MUSCLE, a new computer program for creating multiple alignments of protein sequences. Elements of the algorithm include fast distance estimation using kmer counting, progressive alignment using a new profile function we call the log-expectation score, and refinement using tree-dependent restricted partitioning. The speed and accuracy of MUSCLE are compared with T-Coffee, MAFFT and CLUSTALW on four test sets of reference alignments: BAliBASE, SABmark, SMART and a new benchmark, PREFAB. MUSCLE achieves the highest, or joint highest, rank in accuracy on each of these sets. Without refinement, MUSCLE achieves average accuracy statistically indistinguishable from T-Coffee and MAFFT, and is the fastest of the tested methods for large numbers of sequences, aligning 5000 sequences of average length 350 in 7 min on a current desktop computer. The MUSCLE program, source code and PREFAB test data are freely available at http://www.drive5.com/muscle.

  2. Improving the efficiency of branch-and-bound complete-search NMR assignment using the symmetry of molecules and spectra

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bernal, Andrés; Patiny, Luc; Castillo, Andrés M.

    2015-02-21

    Nuclear magnetic resonance (NMR) assignment of small molecules is presented as a typical example of a combinatorial optimization problem in chemical physics. Three strategies that help improve the efficiency of solution search by the branch and bound method are presented: 1. reduction of the size of the solution space by resorting to a condensed structure formula, wherein symmetric nuclei are grouped together; 2. partitioning of the solution space based on symmetry, which becomes the basis for an efficient branching procedure; and 3. a criterion for selecting input restrictions that leads to increased gaps between branches and thus faster pruning of non-viable solutions. Although the examples chosen to illustrate this work focus on small-molecule NMR assignment, the results are generic and might help solve other combinatorial optimization problems.

  3. Automatic partitioning of unstructured meshes for the parallel solution of problems in computational mechanics

    NASA Technical Reports Server (NTRS)

    Farhat, Charbel; Lesoinne, Michel

    1993-01-01

    Most of the recently proposed computational methods for solving partial differential equations on multiprocessor architectures stem from the 'divide and conquer' paradigm and involve some form of domain decomposition. For those methods which also require grids of points or patches of elements, it is often necessary to explicitly partition the underlying mesh, especially when working with local memory parallel processors. In this paper, a family of cost-effective algorithms for the automatic partitioning of arbitrary two- and three-dimensional finite element and finite difference meshes is presented and discussed in view of a domain decomposed solution procedure and parallel processing. The influence of the algorithmic aspects of a solution method (implicit/explicit computations), and the architectural specifics of a multiprocessor (SIMD/MIMD, startup/transmission time), on the design of a mesh partitioning algorithm are discussed. The impact of the partitioning strategy on load balancing, operation count, operator conditioning, rate of convergence and processor mapping is also addressed. Finally, the proposed mesh decomposition algorithms are demonstrated with realistic examples of finite element, finite volume, and finite difference meshes associated with the parallel solution of solid and fluid mechanics problems on the iPSC/2 and iPSC/860 multiprocessors.

  4. dPCR: A Technology Review

    PubMed Central

    Quan, Phenix-Lan; Sauzade, Martin

    2018-01-01

    Digital Polymerase Chain Reaction (dPCR) is a novel method for the absolute quantification of target nucleic acids. Quantification by dPCR hinges on the fact that the random distribution of molecules in many partitions follows a Poisson distribution. Each partition acts as an individual PCR microreactor and partitions containing amplified target sequences are detected by fluorescence. The proportion of PCR-positive partitions suffices to determine the concentration of the target sequence without a need for calibration. Advances in microfluidics enabled the current revolution of digital quantification by providing efficient partitioning methods. In this review, we compare the fundamental concepts behind the quantification of nucleic acids by dPCR and quantitative real-time PCR (qPCR). We detail the underlying statistics of dPCR and explain how it defines its precision and performance metrics. We review the different microfluidic digital PCR formats, present their underlying physical principles, and analyze the technological evolution of dPCR platforms. We present the novel multiplexing strategies enabled by dPCR and examine how isothermal amplification could be an alternative to PCR in digital assays. Finally, we determine whether the theoretical advantages of dPCR over qPCR hold true by perusing studies that directly compare assays implemented with both methods. PMID:29677144
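
    A minimal sketch of the Poisson calculation that underlies digital quantification: from the fraction p of positive partitions, the mean number of copies per partition is lambda = -ln(1 - p), and the concentration follows from the partition volume. The partition count and volume below are placeholder values, not the specification of any particular platform.

```python
# Poisson arithmetic behind dPCR quantification; placeholder partition specs.
import math

positive, total = 7_500, 20_000
partition_volume_nl = 0.85                     # assumed partition volume

p = positive / total
lam = -math.log(1.0 - p)                       # mean target copies per partition
conc_per_ul = lam / (partition_volume_nl * 1e-3)   # copies per microliter
print(f"lambda = {lam:.3f}, concentration = {conc_per_ul:.0f} copies/uL")
```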

  5. Prediction of distribution coefficient from structure. 1. Estimation method.

    PubMed

    Csizmadia, F; Tsantili-Kakoulidou, A; Panderi, I; Darvas, F

    1997-07-01

    A method has been developed for the estimation of the distribution coefficient (D), which considers the microspecies of a compound. D is calculated from the microscopic dissociation constants (microconstants), the partition coefficients of the microspecies, and the counterion concentration. A general equation for the calculation of D at a given pH is presented. The microconstants are calculated from the structure using Hammett and Taft equations. The partition coefficients of the ionic microspecies are predicted by empirical equations using the dissociation constants and the partition coefficient of the uncharged species, which are estimated from the structure by a Linear Free Energy Relationship method. The algorithm is implemented in a program module called PrologD.
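
    For context, a hedged sketch of the simplest limiting case (a monoprotic acid whose ionized microspecies does not partition into octanol); the PrologD algorithm described above handles the general multi-microspecies case with ion partitioning, which is not reproduced here.

```python
# Textbook limiting formula for a monoprotic acid, assuming the anion does not
# partition into octanol: log D = log P - log10(1 + 10**(pH - pKa)).
import math

def log_d_monoprotic_acid(log_p_neutral: float, pka: float, ph: float) -> float:
    return log_p_neutral - math.log10(1.0 + 10.0 ** (ph - pka))

print(log_d_monoprotic_acid(log_p_neutral=2.26, pka=3.5, ph=7.4))  # hypothetical acid
```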

  6. Nonlinear phase noise tolerance for coherent optical systems using soft-decision-aided ML carrier phase estimation enhanced with constellation partitioning

    NASA Astrophysics Data System (ADS)

    Li, Yan; Wu, Mingwei; Du, Xinwei; Xu, Zhuoran; Gurusamy, Mohan; Yu, Changyuan; Kam, Pooi-Yuen

    2018-02-01

    A novel soft-decision-aided maximum likelihood (SDA-ML) carrier phase estimation method and its simplified version, the decision-aided and soft-decision-aided maximum likelihood (DA-SDA-ML) methods are tested in a nonlinear phase noise-dominant channel. The numerical performance results show that both the SDA-ML and DA-SDA-ML methods outperform the conventional DA-ML in systems with constant-amplitude modulation formats. In addition, modified algorithms based on constellation partitioning are proposed. With partitioning, the modified SDA-ML and DA-SDA-ML are shown to be useful for compensating the nonlinear phase noise in multi-level modulation systems.

  7. "K"-Balance Partitioning: An Exact Method with Applications to Generalized Structural Balance and Other Psychological Contexts

    ERIC Educational Resources Information Center

    Brusco, Michael; Steinley, Douglas

    2010-01-01

    Structural balance theory (SBT) has maintained a venerable status in the psychological literature for more than 5 decades. One important problem pertaining to SBT is the approximation of structural or generalized balance via the partitioning of the vertices of a signed graph into "K" clusters. This "K"-balance partitioning problem also has more…

  8. Study and modeling of the evolution of gas-liquid partitioning of hydrogen sulfide in model solutions simulating winemaking fermentations.

    PubMed

    Mouret, Jean-Roch; Sablayrolles, Jean-Marie; Farines, Vincent

    2015-04-01

    The knowledge of gas-liquid partitioning of aroma compounds during winemaking fermentation could allow optimization of fermentation management, maximizing concentrations of positive markers of aroma and minimizing formation of molecules, such as hydrogen sulfide (H2S), responsible for defects. In this study, the effect of the main fermentation parameters on the gas-liquid partition coefficients (Ki) of H2S was assessed. The Ki for this highly volatile sulfur compound was measured in water by an original semistatic method developed in this work for the determination of gas-liquid partitioning. This novel method was validated and then used to determine the Ki of H2S in synthetic media simulating must, fermenting musts at various steps of the fermentation process, and wine. Ki values were found to be mainly dependent on the temperature but also varied with the composition of the medium, especially with the glucose concentration. Finally, a model was developed to quantify the gas-liquid partitioning of H2S in synthetic media simulating must to wine. This model allowed a very accurate prediction of the partition coefficient of H2S: the difference between observed and predicted values never exceeded 4%.

  9. Optimization model for UDWDM-PON deployment based on physical restrictions and asymmetric user's clustering

    NASA Astrophysics Data System (ADS)

    Arévalo, Germán. V.; Hincapié, Roberto C.; Sierra, Javier E.

    2015-09-01

    UDWDM PON is a leading technology oriented to providing ultra-high bandwidth to final users while exploiting the capacity of the physical channels. One of the main drawbacks of the UDWDM technique is that nonlinear effects, like FWM, become stronger due to the close spectral proximity among channels. This work proposes a model for the optimal deployment of this type of network, taking into account the fiber length limitations imposed by physical restrictions related to the fiber's data transmission as well as the asymmetric distribution of users in a given region. The proposed model employs the data-transmission-related effects in UDWDM PON as restrictions in the optimization problem and also considers the users' asymmetric clustering and the subdivision of the user region through a Voronoi geometric partition technique. The Voronoi dual graph, i.e., the Delaunay triangulation, is used as the planar graph for solving the problem of minimizing the weight of the fiber links.
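
    A minimal sketch of the geometric step only, assuming SciPy is available: build the Voronoi partition of a set of user-cluster centroids and its dual Delaunay triangulation. The optimization model with FWM-related length restrictions is not reproduced, and the coordinates are hypothetical.

```python
# Geometric step only: Voronoi partition of user-cluster centroids and its
# dual Delaunay triangulation. Coordinates are hypothetical.
import numpy as np
from scipy.spatial import Voronoi, Delaunay

centroids = np.random.default_rng(0).uniform(0, 10, size=(12, 2))  # km, hypothetical
vor = Voronoi(centroids)
tri = Delaunay(centroids)

print("Voronoi regions:", len(vor.regions))
print("Delaunay triangles (the planar graph is built from these):", len(tri.simplices))
```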

  10. Total strain version of strainrange partitioning for thermomechanical fatigue at low strains

    NASA Technical Reports Server (NTRS)

    Halford, G. R.; Saltsman, J. F.

    1987-01-01

    A new method is proposed for characterizing and predicting the thermal fatigue behavior of materials. The method is based on three innovations in characterizing high temperature material behavior: (1) the bithermal concept of fatigue testing; (2) advanced, nonlinear, cyclic constitutive models; and (3) the total strain version of traditional strainrange partitioning.

  11. High-performance reconfigurable hardware architecture for restricted Boltzmann machines.

    PubMed

    Ly, Daniel Le; Chow, Paul

    2010-11-01

    Despite the popularity and success of neural networks in research, the number of resulting commercial or industrial applications has been limited. A primary cause for this lack of adoption is that neural networks are usually implemented as software running on general-purpose processors. Hence, a hardware implementation that can exploit the inherent parallelism in neural networks is desired. This paper investigates how the restricted Boltzmann machine (RBM), which is a popular type of neural network, can be mapped to a high-performance hardware architecture on field-programmable gate array (FPGA) platforms. The proposed modular framework is designed to reduce the time complexity of the computations through heavily customized hardware engines. A method to partition large RBMs into smaller congruent components is also presented, allowing the distribution of one RBM across multiple FPGA resources. The framework is tested on a platform of four Xilinx Virtex II-Pro XC2VP70 FPGAs running at 100 MHz through a variety of different configurations. The maximum performance was obtained by instantiating an RBM of 256 × 256 nodes distributed across four FPGAs, which resulted in a computational speed of 3.13 billion connection-updates-per-second and a speedup of 145-fold over an optimized C program running on a 2.8-GHz Intel processor.

  12. Modifications to the Patient Rule-Induction Method that utilize non-additive combinations of genetic and environmental effects to define partitions that predict ischemic heart disease.

    PubMed

    Dyson, Greg; Frikke-Schmidt, Ruth; Nordestgaard, Børge G; Tybjaerg-Hansen, Anne; Sing, Charles F

    2009-05-01

    This article extends the Patient Rule-Induction Method (PRIM) for modeling cumulative incidence of disease developed by Dyson et al. (Genet Epidemiol 31:515-527) to include the simultaneous consideration of non-additive combinations of predictor variables, a significance test of each combination, an adjustment for multiple testing and a confidence interval for the estimate of the cumulative incidence of disease in each partition. We employ the partitioning algorithm component of the Combinatorial Partitioning Method to construct combinations of predictors, permutation testing to assess the significance of each combination, theoretical arguments for incorporating a multiple testing adjustment and bootstrap resampling to produce the confidence intervals. An illustration of this revised PRIM utilizing a sample of 2,258 European male participants from the Copenhagen City Heart Study is presented that assesses the utility of genetic variants in predicting the presence of ischemic heart disease beyond the established risk factors.

  13. Modifications to the Patient Rule-Induction Method that utilize non-additive combinations of genetic and environmental effects to define partitions that predict ischemic heart disease

    PubMed Central

    Dyson, Greg; Frikke-Schmidt, Ruth; Nordestgaard, Børge G.; Tybjærg-Hansen, Anne; Sing, Charles F.

    2009-01-01

    This paper extends the Patient Rule-Induction Method (PRIM) for modeling cumulative incidence of disease developed by Dyson et al. (2007) to include the simultaneous consideration of non-additive combinations of predictor variables, a significance test of each combination, an adjustment for multiple testing and a confidence interval for the estimate of the cumulative incidence of disease in each partition. We employ the partitioning algorithm component of the Combinatorial Partitioning Method (CPM) to construct combinations of predictors, permutation testing to assess the significance of each combination, theoretical arguments for incorporating a multiple testing adjustment and bootstrap resampling to produce the confidence intervals. An illustration of this revised PRIM utilizing a sample of 2258 European male participants from the Copenhagen City Heart Study is presented that assesses the utility of genetic variants in predicting the presence of ischemic heart disease beyond the established risk factors. PMID:19025787

  14. Multiple Attribute Group Decision-Making Methods Based on Trapezoidal Fuzzy Two-Dimensional Linguistic Partitioned Bonferroni Mean Aggregation Operators.

    PubMed

    Yin, Kedong; Yang, Benshuo; Li, Xuemei

    2018-01-24

    In this paper, we investigate multiple attribute group decision making (MAGDM) problems in which decision makers represent their evaluations of alternatives by trapezoidal fuzzy two-dimensional uncertain linguistic variables. To begin with, we introduce the definition, properties, expectation, and operational laws of trapezoidal fuzzy two-dimensional linguistic information. Then, to improve the accuracy of decision making in cases where there is some interrelationship among the attributes, we analyze the partitioned Bonferroni mean (PBM) operator in the trapezoidal fuzzy two-dimensional variable environment and develop two operators: the trapezoidal fuzzy two-dimensional linguistic partitioned Bonferroni mean (TF2DLPBM) aggregation operator and the trapezoidal fuzzy two-dimensional linguistic weighted partitioned Bonferroni mean (TF2DLWPBM) aggregation operator. Furthermore, we develop a novel method to solve MAGDM problems based on the TF2DLWPBM aggregation operator. Finally, a practical example is presented to illustrate the effectiveness of this method and to analyze the impact of different parameters on the results of decision making.
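
    Heavily hedged sketch: one commonly cited crisp form of the partitioned Bonferroni mean for real-valued arguments, shown only to illustrate how interrelated attributes are aggregated within partitions. The paper's TF2DLPBM/TF2DLWPBM operators act on trapezoidal fuzzy two-dimensional linguistic variables and include weights; that machinery, and the exact formula used in the paper, are not reproduced here.

```python
# One commonly cited crisp form of the partitioned Bonferroni mean (PBM);
# an illustrative assumption, not the paper's fuzzy-linguistic operator.
def pbm(values, partitions, p=1.0, q=1.0):
    """values: list of floats; partitions: list of index lists covering them."""
    terms = []
    for block in partitions:
        inner = 0.0
        for i in block:
            others = [values[j] ** q for j in block if j != i]
            inner += values[i] ** p * (sum(others) / max(len(others), 1))
        terms.append((inner / len(block)) ** (1.0 / (p + q)))
    return sum(terms) / len(partitions)

# Attributes 0-1 are interrelated (e.g., price/quality), 2-3 are interrelated.
print(pbm([0.6, 0.8, 0.5, 0.7], partitions=[[0, 1], [2, 3]]))
```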

  15. Multiple Attribute Group Decision-Making Methods Based on Trapezoidal Fuzzy Two-Dimensional Linguistic Partitioned Bonferroni Mean Aggregation Operators

    PubMed Central

    Yin, Kedong; Yang, Benshuo

    2018-01-01

    In this paper, we investigate multiple attribute group decision making (MAGDM) problems in which decision makers represent their evaluations of alternatives by trapezoidal fuzzy two-dimensional uncertain linguistic variables. To begin with, we introduce the definition, properties, expectation, and operational laws of trapezoidal fuzzy two-dimensional linguistic information. Then, to improve the accuracy of decision making in cases where there is some interrelationship among the attributes, we analyze the partitioned Bonferroni mean (PBM) operator in the trapezoidal fuzzy two-dimensional variable environment and develop two operators: the trapezoidal fuzzy two-dimensional linguistic partitioned Bonferroni mean (TF2DLPBM) aggregation operator and the trapezoidal fuzzy two-dimensional linguistic weighted partitioned Bonferroni mean (TF2DLWPBM) aggregation operator. Furthermore, we develop a novel method to solve MAGDM problems based on the TF2DLWPBM aggregation operator. Finally, a practical example is presented to illustrate the effectiveness of this method and to analyze the impact of different parameters on the results of decision making. PMID:29364849

  16. Field-gradient partitioning for fracture and frictional contact in the material point method [Fracture and frictional contact in material point method using damage-field gradients for velocity-field partitioning]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Homel, Michael A.; Herbold, Eric B.

    Contact and fracture in the material point method require grid-scale enrichment or partitioning of material into distinct velocity fields to allow for displacement or velocity discontinuities at a material interface. We present a new method in which a kernel-based damage field is constructed from the particle data. The gradient of this field is used to dynamically repartition the material into contact pairs at each node. Our approach avoids the need to construct and evolve explicit cracks or contact surfaces and is therefore well suited to problems involving complex 3-D fracture with crack branching and coalescence. A straightforward extension of this approach permits frictional ‘self-contact’ between surfaces that are initially part of a single velocity field, enabling more accurate simulation of granular flow, porous compaction, fragmentation, and comminution of brittle materials. Finally, numerical simulations of self-contact and dynamic crack propagation are presented to demonstrate the accuracy of the approach.

  17. Field-gradient partitioning for fracture and frictional contact in the material point method [Fracture and frictional contact in material point method using damage-field gradients for velocity-field partitioning]

    DOE PAGES

    Homel, Michael A.; Herbold, Eric B.

    2016-08-15

    Contact and fracture in the material point method require grid-scale enrichment or partitioning of material into distinct velocity fields to allow for displacement or velocity discontinuities at a material interface. We present a new method in which a kernel-based damage field is constructed from the particle data. The gradient of this field is used to dynamically repartition the material into contact pairs at each node. Our approach avoids the need to construct and evolve explicit cracks or contact surfaces and is therefore well suited to problems involving complex 3-D fracture with crack branching and coalescence. A straightforward extension of this approach permits frictional ‘self-contact’ between surfaces that are initially part of a single velocity field, enabling more accurate simulation of granular flow, porous compaction, fragmentation, and comminution of brittle materials. Finally, numerical simulations of self-contact and dynamic crack propagation are presented to demonstrate the accuracy of the approach.

  18. Accelerated decomposition techniques for large discounted Markov decision processes

    NASA Astrophysics Data System (ADS)

    Larach, Abdelhadi; Chafik, S.; Daoui, C.

    2017-12-01

    Many hierarchical techniques for solving large Markov decision processes (MDPs) are based on partitioning the state space into strongly connected components (SCCs) that can be classified into levels. In each level, smaller problems named restricted MDPs are solved, and these partial solutions are then combined to obtain the global solution. In this paper, we first propose a novel algorithm, a variant of Tarjan's algorithm, that simultaneously finds the SCCs and the levels they belong to. Second, a new definition of the restricted MDPs is presented to improve some hierarchical solutions of discounted MDPs using the value iteration (VI) algorithm based on a list of state-action successors. Finally, a robotic motion-planning example and the experimental results are presented to illustrate the benefit of the proposed decomposition algorithms.
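
    A minimal sketch of the decomposition step only (not the paper's variant of Tarjan's algorithm nor its restricted-MDP value iteration), assuming networkx is available: find the SCCs of a transition graph and assign each SCC a level in the condensation DAG.

```python
# Decomposition step only: SCCs of a state-transition graph and their levels
# in the condensation DAG. The tiny graph below is hypothetical.
import networkx as nx

G = nx.DiGraph([(0, 1), (1, 2), (2, 0),        # SCC {0, 1, 2}
                (2, 3), (3, 4), (4, 3),        # SCC {3, 4}
                (4, 5)])                       # SCC {5}

C = nx.condensation(G)                         # DAG whose nodes are SCCs
level = {}
for scc in nx.topological_sort(C):
    preds = list(C.predecessors(scc))
    level[scc] = 0 if not preds else 1 + max(level[p] for p in preds)

for scc, members in C.nodes(data="members"):
    print(f"SCC {sorted(members)} -> level {level[scc]}")
```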

  19. [Analytic methods for seed models with genotype x environment interactions].

    PubMed

    Zhu, J

    1996-01-01

    Genetic models with genotype effect (G) and genotype x environment interaction effect (GE) are proposed for analyzing generation means of seed quantitative traits in crops. The total genetic effect (G) is partitioned into seed direct genetic effect (G0), cytoplasm genetic effect (C), and maternal plant genetic effect (Gm). Seed direct genetic effect (G0) can be further partitioned into direct additive (A) and direct dominance (D) genetic components. Maternal genetic effect (Gm) can also be partitioned into maternal additive (Am) and maternal dominance (Dm) genetic components. The total genotype x environment interaction effect (GE) can also be partitioned into direct genetic by environment interaction effect (G0E), cytoplasm genetic by environment interaction effect (CE), and maternal genetic by environment interaction effect (GmE). G0E can be partitioned into direct additive by environment interaction (AE) and direct dominance by environment interaction (DE) genetic components. GmE can also be partitioned into maternal additive by environment interaction (AmE) and maternal dominance by environment interaction (DmE) genetic components. Partitions of genetic components are listed for parents, F1, F2 and backcrosses. A set of parents and their reciprocal F1 and F2 seeds is applicable for efficient analysis of seed quantitative traits. The MINQUE(0/1) method can be used for estimating variance and covariance components. Unbiased estimation of covariance components between two traits can also be obtained by the MINQUE(0/1) method. Random genetic effects in seed models are predictable by the Adjusted Unbiased Prediction (AUP) approach with the MINQUE(0/1) method. The jackknife procedure is suggested for estimating the sampling variances of estimated variance and covariance components and of predicted genetic effects, which can be further used in a t-test for each parameter. Unbiasedness and efficiency for estimating variance components and predicting genetic effects are tested by Monte Carlo simulations.

  20. Uncertain Henry's law constants compromise equilibrium partitioning calculations of atmospheric oxidation products

    NASA Astrophysics Data System (ADS)

    Wang, Chen; Yuan, Tiange; Wood, Stephen A.; Goss, Kai-Uwe; Li, Jingyi; Ying, Qi; Wania, Frank

    2017-06-01

    Gas-particle partitioning governs the distribution, removal, and transport of organic compounds in the atmosphere and the formation of secondary organic aerosol (SOA). The large variety of atmospheric species and their wide range of properties make predicting this partitioning equilibrium challenging. Here we expand on earlier work and predict gas-organic and gas-aqueous phase partitioning coefficients for 3414 atmospherically relevant molecules using COSMOtherm, SPARC Performs Automated Reasoning in Chemistry (SPARC), and poly-parameter linear free-energy relationships. The Master Chemical Mechanism generated the structures by oxidizing primary emitted volatile organic compounds. Predictions for gas-organic phase partitioning coefficients (KWIOM/G) by different methods are on average within 1 order of magnitude of each other, irrespective of the numbers of functional groups, except for predictions by COSMOtherm and SPARC for compounds with more than three functional groups, which have a slightly higher discrepancy. Discrepancies between predictions of gas-aqueous partitioning (KW/G) are much larger and increase with the number of functional groups in the molecule. In particular, COSMOtherm often predicts much lower KW/G for highly functionalized compounds than the other methods. While the quantum-chemistry-based COSMOtherm accounts for the influence of intra-molecular interactions on conformation, highly functionalized molecules likely fall outside of the applicability domain of the other techniques, which at least in part rely on empirical data for calibration. Further analysis suggests that atmospheric phase distribution calculations are sensitive to the partitioning coefficient estimation method, in particular to the estimated value of KW/G. The large uncertainty in KW/G predictions for highly functionalized organic compounds needs to be resolved to improve the quantitative treatment of SOA formation.

  1. Programmable partitioning for high-performance coherence domains in a multiprocessor system

    DOEpatents

    Blumrich, Matthias A [Ridgefield, CT; Salapura, Valentina [Chappaqua, NY

    2011-01-25

    A multiprocessor computing system and a method of logically partitioning a multiprocessor computing system are disclosed. The multiprocessor computing system comprises a multitude of processing units, and a multitude of snoop units. Each of the processing units includes a local cache, and the snoop units are provided for supporting cache coherency in the multiprocessor system. Each of the snoop units is connected to a respective one of the processing units and to all of the other snoop units. The multiprocessor computing system further includes a partitioning system for using the snoop units to partition the multitude of processing units into a plurality of independent, memory-consistent, adjustable-size processing groups. Preferably, when the processor units are partitioned into these processing groups, the partitioning system also configures the snoop units to maintain cache coherency within each of said groups.

  2. Random Partition Distribution Indexed by Pairwise Information

    PubMed Central

    Dahl, David B.; Day, Ryan; Tsai, Jerry W.

    2017-01-01

    We propose a random partition distribution indexed by pairwise similarity information such that partitions compatible with the similarities are given more probability. The use of pairwise similarities, in the form of distances, is common in some clustering algorithms (e.g., hierarchical clustering), but we show how to use this type of information to define a prior partition distribution for flexible Bayesian modeling. A defining feature of the distribution is that it allocates probability among partitions within a given number of subsets, but it does not shift probability among sets of partitions with different numbers of subsets. Our distribution places more probability on partitions that group similar items yet keeps the total probability of partitions with a given number of subsets constant. The distribution of the number of subsets (and its moments) is available in closed-form and is not a function of the similarities. Our formulation has an explicit probability mass function (with a tractable normalizing constant) so the full suite of MCMC methods may be used for posterior inference. We compare our distribution with several existing partition distributions, showing that our formulation has attractive properties. We provide three demonstrations to highlight the features and relative performance of our distribution. PMID:29276318

  3. Determination of partition coefficients using 1H NMR spectroscopy and time domain complete reduction to amplitude-frequency table (CRAFT) analysis.

    PubMed

    Soulsby, David; Chica, Jeryl A M

    2017-08-01

    We have developed a simple, direct and novel method for the determination of partition coefficients and partitioning behavior using 1H NMR spectroscopy combined with time domain complete reduction to amplitude-frequency tables (CRAFT). After partitioning into water and 1-octanol using standard methods, aliquots from each layer are directly analyzed using either proton or selective excitation NMR experiments. Signal amplitudes for each compound from each layer are then extracted directly from the time domain data in an automated fashion and analyzed using the CRAFT software. From these amplitudes, log P and log D7.4 values can be calculated directly. Phase, baseline and internal standard issues, which can be problematic when Fourier transformed data are used, are unimportant when using time domain data. Furthermore, analytes can contain impurities because only a single resonance is examined and need not be UV active. Using this approach, we examined a variety of pharmaceutically relevant compounds and determined partition coefficients that are in excellent agreement with literature values. To demonstrate the utility of this approach, we also examined salicylic acid in more detail, demonstrating an aggregation effect as a function of sample loading and partition coefficient behavior as a function of pH value. This method provides a valuable addition to the medicinal chemist's toolbox for determining these important constants. Copyright © 2017 John Wiley & Sons, Ltd.
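
    A minimal sketch of the final arithmetic only, assuming equal detector response for both layers and known aliquot volumes; the CRAFT time-domain fitting that produces the amplitudes is not shown, and the numbers are hypothetical.

```python
# Final arithmetic only: signal amplitudes taken as proportional to
# concentration (equal response assumed), so log P follows directly.
import math

def log_p_from_amplitudes(amp_octanol, amp_water, vol_octanol_ml=1.0, vol_water_ml=1.0):
    c_oct = amp_octanol / vol_octanol_ml   # aliquot-volume-normalized signal
    c_wat = amp_water / vol_water_ml
    return math.log10(c_oct / c_wat)

print(log_p_from_amplitudes(amp_octanol=8.4e6, amp_water=5.1e4))  # ~2.2, hypothetical
```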

  4. Information-theoretic indices usage for the prediction and calculation of octanol-water partition coefficient.

    PubMed

    Persona, Marek; Kutarov, Vladimir V; Kats, Boris M; Persona, Andrzej; Marczewska, Barbara

    2007-01-01

    The paper describes a new method for predicting the octanol-water partition coefficient, based on molecular graph theory. The results obtained using the new method correlate well with experimental values. These results were compared with those obtained using ten other structure-correlation methods. The comparison shows that graph theory can be very useful in structure-correlation research.

  5. Entanglement polygon inequality in qubit systems

    NASA Astrophysics Data System (ADS)

    Qian, Xiao-Feng; Alonso, Miguel A.; Eberly, J. H.

    2018-06-01

    We prove a set of tight entanglement inequalities for arbitrary N-qubit pure states. By focusing on all bi-partite marginal entanglements between each single qubit and its remaining partners, we show that the inequalities provide an upper bound for each marginal entanglement, while the known monogamy relation establishes the lower bound. The restrictions and sharing properties associated with the inequalities are further analyzed with a geometric polytope approach, and examples of three-qubit GHZ-class and W-class entangled states are presented to illustrate the results.
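
    For orientation, the polygon-type bound discussed above can be written in one common notation (this rendering is an assumption, with E_k denoting the bipartite entanglement between qubit k and the remaining N-1 qubits):

    \[ E_k \le \sum_{j \ne k} E_j, \qquad k = 1, \dots, N, \]

    while the monogamy relation supplies the complementary lower bound on each E_k.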

  6. A partitioned correlation function interaction approach for describing electron correlation in atoms

    NASA Astrophysics Data System (ADS)

    Verdebout, S.; Rynkun, P.; Jönsson, P.; Gaigalas, G.; Froese Fischer, C.; Godefroid, M.

    2013-04-01

    The traditional multiconfiguration Hartree-Fock (MCHF) and configuration interaction (CI) methods are based on a single orthonormal orbital basis. For atoms with many closed core shells, or complicated shell structures, a large orbital basis is needed to saturate the different electron correlation effects such as valence, core-valence and correlation within the core shells. The large orbital basis leads to massive configuration state function (CSF) expansions that are difficult to handle, even on large computer systems. We show that it is possible to relax the orthonormality restriction on the orbital basis and break down the originally very large calculations into a series of smaller calculations that can be run in parallel. Each calculation determines a partitioned correlation function (PCF) that accounts for a specific correlation effect. The PCFs are built on optimally localized orbital sets and are added to a zero-order multireference (MR) function to form a total wave function. The expansion coefficients of the PCFs are determined from a low dimensional generalized eigenvalue problem. The interaction and overlap matrices are computed using a biorthonormal transformation technique (Verdebout et al 2010 J. Phys. B: At. Mol. Phys. 43 074017). The new method, called partitioned correlation function interaction (PCFI), converges rapidly with respect to the orbital basis and gives total energies that are lower than the ones from ordinary MCHF and CI calculations. The PCFI method is also very flexible when it comes to targeting different electron correlation effects. Focusing our attention on neutral lithium, we show that by dedicating a PCF to the single excitations from the core, spin- and orbital-polarization effects can be captured very efficiently, leading to highly improved convergence patterns for hyperfine parameters compared with MCHF calculations based on a single orthogonal radial orbital basis. By collecting separately optimized PCFs to correct the MR function, the variational degrees of freedom in the relative mixing coefficients of the CSFs building the PCFs are inhibited. The constraints on the mixing coefficients lead to small off-sets in computed properties such as hyperfine structure, isotope shift and transition rates, with respect to the correct values. By (partially) deconstraining the mixing coefficients one converges to the correct limits and keeps the tremendous advantage of improved convergence rates that comes from the use of several orbital sets. Reducing ultimately each PCF to a single CSF with its own orbital basis leads to a non-orthogonal CI approach. Various perspectives of the new method are given.

  7. A novel method for the determination of adsorption partition coefficients of minor gases in a shale sample by headspace gas chromatography.

    PubMed

    Zhang, Chun-Yun; Hu, Hui-Chao; Chai, Xin-Sheng; Pan, Lei; Xiao, Xian-Ming

    2013-10-04

    A novel method has been developed for the determination of adsorption partition coefficient (Kd) of minor gases in shale. The method uses samples of two different sizes (masses) of the same material, from which the partition coefficient of the gas can be determined from two independent headspace gas chromatographic (HS-GC) measurements. The equilibrium for the model gas (ethane) was achieved in 5 h at 120 °C. The method also involves establishing an equation based on the Kd at higher equilibrium temperature, from which the Kd at lower temperature can be calculated. Although the HS-GC method requires some time and effort, it is simpler and quicker than the isothermal adsorption method that is in widespread use today. As a result, the method is simple and practical and can be a valuable tool for shale gas-related research and applications. Copyright © 2013 Elsevier B.V. All rights reserved.

  8. New Parallel Algorithms for Landscape Evolution Model

    NASA Astrophysics Data System (ADS)

    Jin, Y.; Zhang, H.; Shi, Y.

    2017-12-01

    Most landscape evolution models (LEMs) developed in the last two decades solve the diffusion equation to simulate the transportation of surface sediments. This numerical approach is difficult to parallelize due to the computation of the drainage area for each node, which requires a huge amount of communication when run in parallel. In order to overcome this difficulty, we developed two parallel algorithms for an LEM with a stream net. One algorithm handles the partitioning of the grid with traditional methods and applies an efficient global reduction algorithm to compute drainage areas and transport rates for the stream net; the other algorithm is based on a new partition algorithm, which first partitions the nodes in catchments between processes and then partitions the cells according to the partition of the nodes. Both methods focus on decreasing communication between processes and take advantage of massive computing techniques, and numerical experiments show that they are both adequate for handling large-scale problems with millions of cells. We implemented the two algorithms in our program based on the widely used finite element library deal.II, so that it can be easily coupled with ASPECT.

  9. Convex Regression with Interpretable Sharp Partitions

    PubMed Central

    Petersen, Ashley; Simon, Noah; Witten, Daniela

    2016-01-01

    We consider the problem of predicting an outcome variable on the basis of a small number of covariates, using an interpretable yet non-additive model. We propose convex regression with interpretable sharp partitions (CRISP) for this task. CRISP partitions the covariate space into blocks in a data-adaptive way, and fits a mean model within each block. Unlike other partitioning methods, CRISP is fit using a non-greedy approach by solving a convex optimization problem, resulting in low-variance fits. We explore the properties of CRISP, and evaluate its performance in a simulation study and on a housing price data set. PMID:27635120

  10. Semi-implicit time integration of atmospheric flows with characteristic-based flux partitioning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ghosh, Debojyoti; Constantinescu, Emil M.

    2016-06-23

    This paper presents a characteristic-based flux partitioning for the semi-implicit time integration of atmospheric flows. Nonhydrostatic models require the solution of the compressible Euler equations. The acoustic time scale is significantly faster than the advective scale, yet it is typically not relevant to atmospheric and weather phenomena. The acoustic and advective components of the hyperbolic flux are separated in the characteristic space. High-order, conservative additive Runge-Kutta methods are applied to the partitioned equations so that the acoustic component is integrated in time implicitly with an unconditionally stable method, while the advective component is integrated explicitly. The time step of the overall algorithm is thus determined by the advective scale. Benchmark flow problems are used to demonstrate the accuracy, stability, and convergence of the proposed algorithm. The computational cost of the partitioned semi-implicit approach is compared with that of explicit time integration.
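
    Schematically, in generic additive Runge-Kutta (IMEX) notation rather than the specific scheme of the paper, the partitioned update treats the advective flux explicitly and the acoustic flux implicitly:

    \[ \frac{\partial u}{\partial t} = f_{\mathrm{adv}}(u) + f_{\mathrm{ac}}(u), \qquad U^{(i)} = u^n + \Delta t \sum_{j=1}^{i-1} a_{ij}\, f_{\mathrm{adv}}\bigl(U^{(j)}\bigr) + \Delta t \sum_{j=1}^{i} \tilde{a}_{ij}\, f_{\mathrm{ac}}\bigl(U^{(j)}\bigr), \]

    so that the stable time step is governed by the advective scale alone.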

  11. A comparison of two methods for determining copper partitioning in oxidized sediments

    USGS Publications Warehouse

    Luoma, S.N.

    1986-01-01

    Model estimations of the proportion of Cu in oxidized sediments associated with extractable organic materials show some agreement with the proportion of Cu extracted from those sediments with ammonium hydroxide. Data were from 17 estuaries of widely differing sediment chemistry. The modelling and extraction methods agreed best where organic materials were present in either very high or very low concentrations relative to other sediment components. In the range of component concentrations where the model predicted Cu should be distributed among a variety of components, agreement between the methods was poor. Both approaches indicated that Cu was predominantly partitioned to organic materials in some sediments and predominantly partitioned to other components (most probably iron oxides and manganese oxides) in others, and that these differences were related to the relative abundances of the specific components in the sediment. Although the results of the two methods of estimating Cu partitioning to organics correlated significantly among 24 stations from the 17 estuaries, the variability in the relationship suggested that refinement of parameter values and verification of some important assumptions were essential to the further development of a reasonable model. © 1986.

  12. Clinopyroxene-melt element partitioning during interaction between trachybasaltic magma and siliceous crust: Clues from quartzite enclaves at Mt. Etna volcano

    NASA Astrophysics Data System (ADS)

    Mollo, S.; Blundy, J. D.; Giacomoni, P.; Nazzari, M.; Scarlato, P.; Coltorti, M.; Langone, A.; Andronico, D.

    2017-07-01

    A peculiar characteristic of the paroxysmal sequence that occurred on March 16, 2013 at the New South East Crater of Mt. Etna volcano (eastern Sicily, Italy) was the eruption of siliceous crustal xenoliths representative of the sedimentary basement beneath the volcanic edifice. These xenoliths are quartzites that occur as subspherical bombs enclosed in a thin trachybasaltic lava envelope. At the quartzite-magma interface a reaction corona develops due to the interaction between the Etnean trachybasaltic magma and the partially melted quartzite. Three distinct domains are observed: (i) the trachybasaltic lava itself (Zone 1), including Al-rich clinopyroxene phenocrysts dispersed in a matrix glass, (ii) the hybrid melt (Zone 2), developing at the quartzite-magma interface and feeding the growth of newly-formed Al-poor clinopyroxenes, and (iii) the partially melted quartzite (Zone 3), producing abundant siliceous melt. These features make it possible to quantify the effect of magma contamination by siliceous crust in terms of clinopyroxene-melt element partitioning. Major and trace element partition coefficients have been calculated using the compositions of clinopyroxene rims and glasses next to the crystal surface. Zone 1 and Zone 2 partition coefficients correspond to, respectively, the chemical analyses of Al-rich phenocrysts and matrix glasses, and the chemical analyses of newly-formed Al-poor crystals and hybrid glasses. For clinopyroxenes from both the hybrid layer and the lava flow, expected relationships are observed between the partition coefficient, the valence of the element, and the ionic radius. However, with respect to Zone 1 partition coefficients, values of Zone 2 partition coefficients show a net decrease for transition metals (TE), high-field strength elements (HFSE) and rare earth elements including yttrium (REE + Y), and an increase for large ion lithophile elements (LILE). This variation is associated with coupled substitutions on the M1, M2 and T sites of the type M1(Al, Fe3+) + TAl = M2(Mg, Fe2+) + TSi. The different incorporation of trace elements into clinopyroxenes of hybrid origin is controlled by cation substitution reactions reflecting local charge-balance requirements. According to the lattice strain theory, simultaneous cation exchanges across the M1, M2, and T sites have profound effects on REE + Y and HFSE partitioning. Conversely, both temperature and melt composition have only a minor effect when the thermal path of magma is restricted to 70 °C and the value of non-bridging oxygens per tetrahedral cations (NBO/T) shifts moderately from 0.31 to 0.43. As a consequence, Zone 2 partition coefficients for REE + Y and HFSE diverge significantly from those derived for Zone 1, accounting for limited cation incorporation into the newly-formed clinopyroxenes at the quartzite-magma interface.

  13. Parametric symplectic partitioned Runge-Kutta methods with energy-preserving properties for Hamiltonian systems

    NASA Astrophysics Data System (ADS)

    Wang, Dongling; Xiao, Aiguo; Li, Xueyang

    2013-02-01

    Based on W-transformation, some parametric symplectic partitioned Runge-Kutta (PRK) methods depending on a real parameter α are developed. For α=0, the corresponding methods become the usual PRK methods, including Radau IA-IĀ and Lobatto IIIA-IIIB methods as examples. For any α≠0, the corresponding methods are symplectic and there exists a value α∗ such that energy is preserved in the numerical solution at each step. The existence of the parameter and the order of the numerical methods are discussed. Some numerical examples are presented to illustrate these results.
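
    For context, the algebraic conditions commonly cited in the geometric-integration literature for a partitioned Runge-Kutta pair with coefficients (a_ij, b_i) and (â_ij, b̂_i) to be symplectic are, in LaTeX form (the second condition may be dropped for separable Hamiltonians):

      b_i \hat{a}_{ij} + \hat{b}_j a_{ji} = b_i \hat{b}_j \quad (i, j = 1, \dots, s),
      \qquad
      b_i = \hat{b}_i \quad (i = 1, \dots, s).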

  14. Weak-value amplification and optimal parameter estimation in the presence of correlated noise

    NASA Astrophysics Data System (ADS)

    Sinclair, Josiah; Hallaji, Matin; Steinberg, Aephraim M.; Tollaksen, Jeff; Jordan, Andrew N.

    2017-11-01

    We analytically and numerically investigate the performance of weak-value amplification (WVA) and related parameter estimation methods in the presence of temporally correlated noise. WVA is a special instance of a general measurement strategy that involves sorting data into separate subsets based on the outcome of a second "partitioning" measurement. Using a simplified correlated noise model that can be analyzed exactly together with optimal statistical estimators, we compare WVA to a conventional measurement method. We find that WVA indeed yields a much lower variance of the parameter of interest than the conventional technique does, optimized in the absence of any partitioning measurements. In contrast, a statistically optimal analysis that employs partitioning measurements, incorporating all partitioned results and their known correlations, is found to yield an improvement—typically slight—over the noise reduction achieved by WVA. This result occurs because the simple WVA technique is not tailored to any specific noise environment and therefore does not make use of correlations between the different partitions. We also compare WVA to traditional background subtraction, a familiar technique where measurement outcomes are partitioned to eliminate unknown offsets or errors in calibration. Surprisingly, for the cases we consider, background subtraction turns out to be a special case of the optimal partitioning approach, possessing a similar typically slight advantage over WVA. These results give deeper insight into the role of partitioning measurements (with or without postselection) in enhancing measurement precision, which some have found puzzling. They also resolve previously made conflicting claims about the usefulness of weak-value amplification to precision measurement in the presence of correlated noise. We finish by presenting numerical results to model a more realistic laboratory situation of time-decaying correlations, showing that our conclusions hold for a wide range of statistical models.

  15. A novel method to augment extraction of mangiferin by application of microwave on three phase partitioning.

    PubMed

    Kulkarni, Vrushali M; Rathod, Virendra K

    2015-06-01

    This work reports a novel approach where three phase partitioning (TPP) was combined with microwave for extraction of mangiferin from leaves of Mangifera indica. Soxhlet extraction was used as the reference method, which yielded 57 mg/g in 5 h. Under optimal conditions such as microwave irradiation time 5 min, ammonium sulphate concentration 40% w/v, power 272 W, solute to solvent ratio 1:20, slurry to t-butanol ratio 1:1, soaking time 5 min and duty cycle 50%, the mangiferin yield obtained was 54 mg/g by microwave assisted three phase partitioning extraction (MTPP). Thus, the extraction method developed resulted in a higher extraction yield in a shorter time, making it an interesting alternative prior to down-stream processing.

  16. A Novel Feature Extraction Method for Monitoring (Vehicular) Fuel Storage System Leaks

    DTIC Science & Technology

    2014-10-02

    gives a continuous output of the DPDF with predefined partitions. The resolution of a DPDF depends on the pre-determined signal range and the number of... partitions within that range. Conceptually, the proposed implementation is identical to the creation of a histogram over a moving data window given some...window. The crisp partitions within the specified signal range act as “competing and possible” scenarios or alternatives where we impose a “winner takes all
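
    A hedged sketch of the discrete-PDF idea described in this excerpt: the signal range is divided into predefined partitions (bins), and a histogram over a moving data window is renormalized at each step. The window length, bin count and signal range below are assumptions chosen only for illustration.

      # "Discrete PDF" (DPDF) over a moving data window: the signal range is split
      # into predefined partitions (bins); at each window position the samples are
      # histogrammed and normalized. Bin count and range are illustrative.
      import numpy as np

      def moving_dpdf(signal, window=256, n_bins=16, lo=-1.0, hi=1.0):
          edges = np.linspace(lo, hi, n_bins + 1)      # predefined partitions
          out = []
          for start in range(0, len(signal) - window + 1):
              counts, _ = np.histogram(signal[start:start + window], bins=edges)
              out.append(counts / counts.sum())        # normalize to a discrete PDF
          return np.array(out)

      rng = np.random.default_rng(1)
      x = np.sin(np.linspace(0, 20 * np.pi, 2000)) + 0.1 * rng.normal(size=2000)
      dpdf = moving_dpdf(x, lo=-1.5, hi=1.5)
      print(dpdf.shape)  # (number of window positions, n_bins)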

  17. The Refinement-Tree Partition for Parallel Solution of Partial Differential Equations

    PubMed Central

    Mitchell, William F.

    1998-01-01

    Dynamic load balancing is considered in the context of adaptive multilevel methods for partial differential equations on distributed memory multiprocessors. An approach that periodically repartitions the grid is taken. The important properties of a partitioning algorithm are presented and discussed in this context. A partitioning algorithm based on the refinement tree of the adaptive grid is presented and analyzed in terms of these properties. Theoretical and numerical results are given. PMID:28009355

  18. The Refinement-Tree Partition for Parallel Solution of Partial Differential Equations.

    PubMed

    Mitchell, William F

    1998-01-01

    Dynamic load balancing is considered in the context of adaptive multilevel methods for partial differential equations on distributed memory multiprocessors. An approach that periodically repartitions the grid is taken. The important properties of a partitioning algorithm are presented and discussed in this context. A partitioning algorithm based on the refinement tree of the adaptive grid is presented and analyzed in terms of these properties. Theoretical and numerical results are given.

  19. A New Approach to Parallel Dynamic Partitioning for Adaptive Unstructured Meshes

    NASA Technical Reports Server (NTRS)

    Heber, Gerd; Biswas, Rupak; Gao, Guang R.

    1999-01-01

    Classical mesh partitioning algorithms were designed for rather static situations, and their straightforward application in a dynamical framework may lead to unsatisfactory results, e.g., excessive data migration among processors. Furthermore, special attention should be paid to their amenability to parallelization. In this paper, a novel parallel method for the dynamic partitioning of adaptive unstructured meshes is described. It is based on a linear representation of the mesh using self-avoiding walks.

  20. Systems and methods to control multiple peripherals with a single-peripheral application code

    DOEpatents

    Ransom, Ray M.

    2013-06-11

    Methods and apparatus are provided for enhancing the BIOS of a hardware peripheral device to manage multiple peripheral devices simultaneously without modifying the application software of the peripheral device. The apparatus comprises a logic control unit and a memory in communication with the logic control unit. The memory is partitioned into a plurality of ranges, each range comprising one or more blocks of memory, one range being associated with each instance of the peripheral application and one range being reserved for storage of a data pointer related to each peripheral application of the plurality. The logic control unit is configured to operate multiple instances of the control application by duplicating one instance of the peripheral application for each peripheral device of the plurality and partitioning a memory device into partitions comprising one or more blocks of memory, one partition being associated with each instance of the peripheral application. The method then reserves a range of memory addresses for storage of a data pointer related to each peripheral device of the plurality, and initializes each of the plurality of peripheral devices.

  1. Tensor Spectral Clustering for Partitioning Higher-order Network Structures.

    PubMed

    Benson, Austin R; Gleich, David F; Leskovec, Jure

    2015-01-01

    Spectral graph theory-based methods represent an important class of tools for studying the structure of networks. Spectral methods are based on a first-order Markov chain derived from a random walk on the graph and thus they cannot take advantage of important higher-order network substructures such as triangles, cycles, and feed-forward loops. Here we propose a Tensor Spectral Clustering (TSC) algorithm that allows for modeling higher-order network structures in a graph partitioning framework. Our TSC algorithm allows the user to specify which higher-order network structures (cycles, feed-forward loops, etc.) should be preserved by the network clustering. Higher-order network structures of interest are represented using a tensor, which we then partition by developing a multilinear spectral method. Our framework can be applied to discovering layered flows in networks as well as graph anomaly detection, which we illustrate on synthetic networks. In directed networks, a higher-order structure of particular interest is the directed 3-cycle, which captures feedback loops in networks. We demonstrate that our TSC algorithm produces large partitions that cut fewer directed 3-cycles than standard spectral clustering algorithms.

  2. Tensor Spectral Clustering for Partitioning Higher-order Network Structures

    PubMed Central

    Benson, Austin R.; Gleich, David F.; Leskovec, Jure

    2016-01-01

    Spectral graph theory-based methods represent an important class of tools for studying the structure of networks. Spectral methods are based on a first-order Markov chain derived from a random walk on the graph and thus they cannot take advantage of important higher-order network substructures such as triangles, cycles, and feed-forward loops. Here we propose a Tensor Spectral Clustering (TSC) algorithm that allows for modeling higher-order network structures in a graph partitioning framework. Our TSC algorithm allows the user to specify which higher-order network structures (cycles, feed-forward loops, etc.) should be preserved by the network clustering. Higher-order network structures of interest are represented using a tensor, which we then partition by developing a multilinear spectral method. Our framework can be applied to discovering layered flows in networks as well as graph anomaly detection, which we illustrate on synthetic networks. In directed networks, a higher-order structure of particular interest is the directed 3-cycle, which captures feedback loops in networks. We demonstrate that our TSC algorithm produces large partitions that cut fewer directed 3-cycles than standard spectral clustering algorithms. PMID:27812399

  3. Raster Data Partitioning for Supporting Distributed GIS Processing

    NASA Astrophysics Data System (ADS)

    Nguyen Thai, B.; Olasz, A.

    2015-08-01

    In the geospatial sector, the big data concept has already had an impact. Several studies apply techniques originating in computer science to GIS processing of huge amounts of geospatial data. Other research studies consider geospatial data as always having been big data (Lee and Kang, 2015). Nevertheless, data acquisition methods have improved substantially, increasing not only the amount of raw data but also its spectral, spatial and temporal resolution. A significant portion of big data is geospatial data, and the size of such data is growing rapidly at least by 20% every year (Dasgupta, 2013). Raw data are produced in increasing volumes, in different formats and representations and for different purposes, and only the wealth of information derived from these data sets represents valuable results. However, computing capability and processing speed face limitations, even when semi-automatic or automatic procedures are applied to complex geospatial data (Kristóf et al., 2014). Recently, distributed computing has reached many interdisciplinary areas of computer science, including remote sensing and geographic information processing approaches. Cloud computing, moreover, requires appropriate processing algorithms that can be distributed to handle geospatial big data. The Map-Reduce programming model and distributed file systems have proven their capabilities to process non-GIS big data. But it is sometimes inconvenient or inefficient to rewrite existing algorithms for the Map-Reduce programming model, and GIS data cannot be partitioned like text-based data by lines or bytes. Hence, we would like to find an alternative solution for data partitioning, data distribution and execution of existing algorithms without rewriting or with only minor modifications. This paper focuses on a technical overview of currently available distributed computing environments, as well as on GIS (raster) data partitioning, distribution and distributed processing of GIS algorithms. A proof-of-concept implementation has been made for raster data partitioning, distribution and processing. The first performance results have been compared against the commercial software ERDAS IMAGINE 2011 and 2014. Partitioning methods heavily depend on application areas; therefore, we may consider data partitioning as a preprocessing step before applying processing services on data. As a proof of concept we have implemented a simple tile-based partitioning method splitting an image into smaller grids (NxM tiles) and comparing the processing time to existing methods by NDVI calculation. The concept is demonstrated using our own open-source processing framework.
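
    A minimal sketch of the tile-based partitioning idea used in the proof of concept, under the assumption of synthetic in-memory bands rather than a real raster reader: the red and near-infrared bands are split into an N x M grid of tiles, NDVI is computed tile by tile (a stand-in for distributing tiles to workers), and the tiles are stitched back together. The final assertion checks that tiling does not change the result.

      # Tile-based raster partitioning: split red/NIR bands into an N x M grid of
      # tiles, compute NDVI per tile independently, then stitch the tiles back.
      # The bands here are synthetic; real data would come from a GIS raster reader.
      import numpy as np

      def split_tiles(band, n, m):
          return [np.array_split(row, m, axis=1) for row in np.array_split(band, n, axis=0)]

      def ndvi(red, nir):
          return (nir - red) / np.maximum(nir + red, 1e-9)

      def tiled_ndvi(red, nir, n=4, m=4):
          red_tiles, nir_tiles = split_tiles(red, n, m), split_tiles(nir, n, m)
          rows = [np.hstack([ndvi(r, q) for r, q in zip(rrow, nrow)])
                  for rrow, nrow in zip(red_tiles, nir_tiles)]
          return np.vstack(rows)

      rng = np.random.default_rng(2)
      red = rng.uniform(0.05, 0.3, size=(1000, 1200)).astype(np.float32)
      nir = rng.uniform(0.2, 0.6, size=(1000, 1200)).astype(np.float32)
      assert np.allclose(tiled_ndvi(red, nir), ndvi(red, nir))  # tiling preserves the result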

  4. Improving Design Efficiency for Large-Scale Heterogeneous Circuits

    NASA Astrophysics Data System (ADS)

    Gregerson, Anthony

    Despite increases in logic density, many Big Data applications must still be partitioned across multiple computing devices in order to meet their strict performance requirements. Among the most demanding of these applications is high-energy physics (HEP), which uses complex computing systems consisting of thousands of FPGAs and ASICs to process the sensor data created by experiments at particles accelerators such as the Large Hadron Collider (LHC). Designing such computing systems is challenging due to the scale of the systems, the exceptionally high-throughput and low-latency performance constraints that necessitate application-specific hardware implementations, the requirement that algorithms are efficiently partitioned across many devices, and the possible need to update the implemented algorithms during the lifetime of the system. In this work, we describe our research to develop flexible architectures for implementing such large-scale circuits on FPGAs. In particular, this work is motivated by (but not limited in scope to) high-energy physics algorithms for the Compact Muon Solenoid (CMS) experiment at the LHC. To make efficient use of logic resources in multi-FPGA systems, we introduce Multi-Personality Partitioning, a novel form of the graph partitioning problem, and present partitioning algorithms that can significantly improve resource utilization on heterogeneous devices while also reducing inter-chip connections. To reduce the high communication costs of Big Data applications, we also introduce Information-Aware Partitioning, a partitioning method that analyzes the data content of application-specific circuits, characterizes their entropy, and selects circuit partitions that enable efficient compression of data between chips. We employ our information-aware partitioning method to improve the performance of the hardware validation platform for evaluating new algorithms for the CMS experiment. Together, these research efforts help to improve the efficiency and decrease the cost of the developing large-scale, heterogeneous circuits needed to enable large-scale application in high-energy physics and other important areas.

  5. Niche Partitioning of Feather Mites within a Seabird Host, Calonectris borealis

    PubMed Central

    Stefan, Laura M.; Gómez-Díaz, Elena; Elguero, Eric; Proctor, Heather C.; McCoy, Karen D.; González-Solís, Jacob

    2015-01-01

    According to classic niche theory, species can coexist in heterogeneous environments by reducing interspecific competition via niche partitioning, e.g. trophic or spatial partitioning. However, support for the role of competition on niche partitioning remains controversial. Here, we tested for spatial and trophic partitioning in feather mites, a diverse and abundant group of arthropods. We focused on the two dominant mite species, Microspalax brevipes and Zachvatkinia ovata, inhabiting flight feathers of the Cory’s shearwater, Calonectris borealis. We performed mite counts across and within primary and tail feathers on free-living shearwaters breeding on an oceanic island (Gran Canaria, Canary Islands). We then investigated trophic relationships between the two mite species and the host using stable isotope analyses of carbon and nitrogen on mite tissues and potential host food sources. The distribution of the two mite species showed clear spatial segregation among feathers; M. brevipes showed high preference for the central wing primary feathers, whereas Z. ovata was restricted to the two outermost primaries. Morphological differences between M. brevipes and Z. ovata support an adaptive basis for the spatial segregation of the two mite species. However, the two mites overlap in some central primaries and statistical modeling showed that Z. ovata tends to outcompete M. brevipes. Isotopic analyses indicated similar isotopic values for the two mite species and a strong correlation in carbon signatures between mites inhabiting the same individual host suggesting that diet is mainly based on shared host-associated resources. Among the four candidate tissues examined (blood, feather remains, skin remains and preen gland oil), we conclude that the diet is most likely dominated by preen gland oil, while the contribution of exogenous material to mite diets is less marked. Our results indicate that ongoing competition for space and resources plays a central role in structuring feather mite communities. They also illustrate that symbiotic infracommunities are excellent model systems to study trophic ecology, and can improve our understanding of mechanisms of niche differentiation and species coexistence. PMID:26650672

  6. Quantum mechanical fragment methods based on partitioning atoms or partitioning coordinates.

    PubMed

    Wang, Bo; Yang, Ke R; Xu, Xuefei; Isegawa, Miho; Leverentz, Hannah R; Truhlar, Donald G

    2014-09-16

    Conspectus The development of more efficient and more accurate ways to represent reactive potential energy surfaces is a requirement for extending the simulation of large systems to more complex systems, longer-time dynamical processes, and more complete statistical mechanical sampling. One way to treat large systems is by direct dynamics fragment methods. Another way is by fitting system-specific analytic potential energy functions with methods adapted to large systems. Here we consider both approaches. First we consider three fragment methods that allow a given monomer to appear in more than one fragment. The first two approaches are the electrostatically embedded many-body (EE-MB) expansion and the electrostatically embedded many-body expansion of the correlation energy (EE-MB-CE), which we have shown to yield quite accurate results even when one restricts the calculations to include only electrostatically embedded dimers. The third fragment method is the electrostatically embedded molecular tailoring approach (EE-MTA), which is more flexible than EE-MB and EE-MB-CE. We show that electrostatic embedding greatly improves the accuracy of these approaches compared with the original unembedded approaches. Quantum mechanical fragment methods share with combined quantum mechanical/molecular mechanical (QM/MM) methods the need to treat a quantum mechanical fragment in the presence of the rest of the system, which is especially challenging for those parts of the rest of the system that are close to the boundary of the quantum mechanical fragment. This is a delicate matter even for fragments that are not covalently bonded to the rest of the system, but it becomes even more difficult when the boundary of the quantum mechanical fragment cuts a bond. We have developed a suite of methods for more realistically treating interactions across such boundaries. These methods include redistributing and balancing the external partial atomic charges and the use of tuned fluorine atoms for capping dangling bonds, and we have shown that they can greatly improve the accuracy. Finally we present a new approach that goes beyond QM/MM by combining the convenience of molecular mechanics with the accuracy of fitting a potential function to electronic structure calculations on a specific system. To make the latter practical for systems with a large number of degrees of freedom, we developed a method to interpolate between local internal-coordinate fits to the potential energy. A key issue for the application to large systems is that rather than assigning the atoms or monomers to fragments, we assign the internal coordinates to reaction, secondary, and tertiary sets. Thus, we make a partition in coordinate space rather than atom space. Fits to the local dependence of the potential energy on tertiary coordinates are arrayed along a preselected reaction coordinate at a sequence of geometries called anchor points; the potential energy function is called an anchor points reactive potential. Electrostatically embedded fragment methods and the anchor points reactive potential, because they are based on treating an entire system by quantum mechanical electronic structure methods but are affordable for large and complex systems, have the potential to open new areas for accurate simulations where combined QM/MM methods are inadequate.
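
    For orientation, the generic two-body truncation of a many-body expansion that methods such as EE-MB build on can be written as follows; in the electrostatically embedded variants, the monomer and dimer energies E_i and E_ij are computed in a field of embedding charges, and EE-MB-CE applies the same truncation to the correlation energy only:

      E \;\approx\; \sum_i E_i \;+\; \sum_{i<j} \left( E_{ij} - E_i - E_j \right).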

  7. Greedy feature selection for glycan chromatography data with the generalized Dirichlet distribution

    PubMed Central

    2013-01-01

    Background: Glycoproteins are involved in a diverse range of biochemical and biological processes. Changes in protein glycosylation are believed to occur in many diseases, particularly during cancer initiation and progression. The identification of biomarkers for human disease states is becoming increasingly important, as early detection is key to improving survival and recovery rates. To this end, the serum glycome has been proposed as a potential source of biomarkers for different types of cancers. High-throughput hydrophilic interaction liquid chromatography (HILIC) technology for glycan analysis allows for the detailed quantification of the glycan content in human serum. However, the experimental data from this analysis is compositional by nature. Compositional data are subject to a constant-sum constraint, which restricts the sample space to a simplex. Statistical analysis of glycan chromatography datasets should account for their unusual mathematical properties. As the volume of glycan HILIC data being produced increases, there is a considerable need for a framework to support appropriate statistical analysis. Proposed here is a methodology for feature selection in compositional data. The principal objective is to provide a template for the analysis of glycan chromatography data that may be used to identify potential glycan biomarkers. Results: A greedy search algorithm, based on the generalized Dirichlet distribution, is carried out over the feature space to search for the set of “grouping variables” that best discriminate between known group structures in the data, modelling the compositional variables using beta distributions. The algorithm is applied to two glycan chromatography datasets. Statistical classification methods are used to test the ability of the selected features to differentiate between known groups in the data. Two well-known methods are used for comparison: correlation-based feature selection (CFS) and recursive partitioning (rpart). CFS is a feature selection method, while recursive partitioning is a learning tree algorithm that has been used for feature selection in the past. Conclusions: The proposed feature selection method performs well for both glycan chromatography datasets. It is computationally slower, but results in a lower misclassification rate and a higher sensitivity rate than both correlation-based feature selection and the classification tree method. PMID:23651459

  8. ANALYTICAL METHOD DEVELOPMENTS TO SUPPORT PARTITIONING INTERWELL TRACER TESTING

    EPA Science Inventory

    Partitioning Interwell Tracer Testing (PITT) uses alcohol tracer compounds in estimating subsurface contamination from non-polar pollutants. PITT uses the analysis of water samples for various alcohols as part of the overall measurement process. The water samples may contain many...

  9. Methods for selecting fixed-effect models for heterogeneous codon evolution, with comments on their application to gene and genome data.

    PubMed

    Bao, Le; Gu, Hong; Dunn, Katherine A; Bielawski, Joseph P

    2007-02-08

    Models of codon evolution have proven useful for investigating the strength and direction of natural selection. In some cases, a priori biological knowledge has been used successfully to model heterogeneous evolutionary dynamics among codon sites. These are called fixed-effect models, and they require that all codon sites are assigned to one of several partitions which are permitted to have independent parameters for selection pressure, evolutionary rate, transition to transversion ratio or codon frequencies. For single gene analysis, partitions might be defined according to protein tertiary structure, and for multiple gene analysis partitions might be defined according to a gene's functional category. Given a set of related fixed-effect models, the task of selecting the model that best fits the data is not trivial. In this study, we implement a set of fixed-effect codon models which allow for different levels of heterogeneity among partitions in the substitution process. We describe strategies for selecting among these models by a backward elimination procedure, Akaike information criterion (AIC) or a corrected Akaike information criterion (AICc). We evaluate the performance of these model selection methods via a simulation study, and make several recommendations for real data analysis. Our simulation study indicates that the backward elimination procedure can provide a reliable method for model selection in this setting. We also demonstrate the utility of these models by application to a single-gene dataset partitioned according to tertiary structure (abalone sperm lysin), and a multi-gene dataset partitioned according to the functional category of the gene (flagellar-related proteins of Listeria). Fixed-effect models have advantages and disadvantages. Fixed-effect models are desirable when data partitions are known to exhibit significant heterogeneity or when a statistical test of such heterogeneity is desired. They have the disadvantage of requiring a priori knowledge for partitioning sites. We recommend: (i) selection of models by using backward elimination rather than AIC or AICc, (ii) use a stringent cut-off, e.g., p = 0.0001, and (iii) conduct sensitivity analysis of results. With thoughtful application, fixed-effect codon models should provide a useful tool for large scale multi-gene analyses.
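
    A hedged sketch of the backward-elimination strategy recommended above, with likelihood-ratio tests and the stringent cutoff p = 0.0001. The fit_model callable is a hypothetical stand-in for fitting a fixed-effect codon model and returning its maximized log-likelihood and parameter count; the toy fit at the end only exercises the control flow.

      # Backward elimination over nested fixed-effect models using likelihood-ratio
      # tests (LRTs) with a stringent cutoff. `fit_model` is a hypothetical stand-in
      # for a codon-model likelihood fit under a given set of partition-specific terms.
      from scipy.stats import chi2

      def backward_eliminate(fit_model, full_terms, alpha=1e-4):
          """Drop partition-specific terms one at a time while the LRT is non-significant."""
          current = list(full_terms)
          lnl_cur, k_cur = fit_model(current)
          while len(current) > 1:
              best = None
              for t in current:
                  reduced = [x for x in current if x != t]
                  lnl_red, k_red = fit_model(reduced)
                  lrt = 2.0 * (lnl_cur - lnl_red)
                  p = chi2.sf(lrt, df=k_cur - k_red)
                  if p > alpha and (best is None or p > best[0]):
                      best = (p, reduced, lnl_red, k_red)   # least-needed term so far
              if best is None:
                  break                                     # every removal is significant
              _, current, lnl_cur, k_cur = best
          return current

      # Toy stand-in: each retained term adds 2 parameters and a fixed log-likelihood gain.
      gains = {"omega_by_partition": 40.0, "kappa_by_partition": 1.0, "freqs_by_partition": 0.5}
      def toy_fit(terms):
          return (sum(gains[t] for t in terms), 2 * len(terms))

      print(backward_eliminate(toy_fit, list(gains)))  # keeps only the strongly supported term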

  10. A set partitioning reformulation for the multiple-choice multidimensional knapsack problem

    NASA Astrophysics Data System (ADS)

    Voß, Stefan; Lalla-Ruiz, Eduardo

    2016-05-01

    The Multiple-choice Multidimensional Knapsack Problem (MMKP) is a well-known NP-hard combinatorial optimization problem that has received a lot of attention from the research community as it can be easily translated to several real-world problems arising in areas such as allocating resources, reliability engineering, cognitive radio networks, cloud computing, etc. In this regard, an exact model that is able to provide high-quality feasible solutions for solving it or being partially included in algorithmic schemes is desirable. The MMKP basically consists of finding a subset of objects that maximizes the total profit while observing some capacity restrictions. In this article, a reformulation of the MMKP as a set partitioning problem is proposed to allow for new insights into modelling the MMKP. The computational experimentation provides new insights into the problem itself and shows that the new model is able to improve on the best of the known results for some of the most common benchmark instances.
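
    For reference, the standard 0-1 formulation of the MMKP that such a reformulation starts from can be stated as follows (notation assumed here: n groups G_i, profits p_ij, m resource dimensions with consumptions r_ij^k and capacities c_k):

      \max \sum_{i=1}^{n} \sum_{j \in G_i} p_{ij} x_{ij}
      \quad \text{s.t.} \quad
      \sum_{j \in G_i} x_{ij} = 1 \;\; (i = 1, \dots, n), \qquad
      \sum_{i=1}^{n} \sum_{j \in G_i} r_{ij}^{k} x_{ij} \le c_k \;\; (k = 1, \dots, m), \qquad
      x_{ij} \in \{0, 1\}.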

  11. Chamber identity programs drive early functional partitioning of the heart.

    PubMed

    Mosimann, Christian; Panáková, Daniela; Werdich, Andreas A; Musso, Gabriel; Burger, Alexa; Lawson, Katy L; Carr, Logan A; Nevis, Kathleen R; Sabeh, M Khaled; Zhou, Yi; Davidson, Alan J; DiBiase, Anthony; Burns, Caroline E; Burns, C Geoffrey; MacRae, Calum A; Zon, Leonard I

    2015-08-26

    The vertebrate heart muscle (myocardium) develops from the first heart field (FHF) and expands by adding second heart field (SHF) cells. While both lineages exist already in teleosts, the primordial contributions of FHF and SHF to heart structure and function remain incompletely understood. Here we delineate the functional contribution of the FHF and SHF to the zebrafish heart using the cis-regulatory elements of the draculin (drl) gene. The drl reporters initially delineate the lateral plate mesoderm, including heart progenitors. Subsequent myocardial drl reporter expression restricts to FHF descendants. We harnessed this unique feature to uncover that loss of tbx5a and pitx2 affect relative FHF versus SHF contributions to the heart. High-resolution physiology reveals distinctive electrical properties of each heart field territory that define a functional boundary within the single zebrafish ventricle. Our data establish that the transcriptional program driving cardiac septation regulates physiologic ventricle partitioning, which successively provides mechanical advantages of sequential contraction.

  12. Generalizing the self-healing diffusion Monte Carlo approach to finite temperature: a path for the optimization of low-energy many-body bases.

    PubMed

    Reboredo, Fernando A; Kim, Jeongnim

    2014-02-21

    A statistical method is derived for the calculation of thermodynamic properties of many-body systems at low temperatures. This method is based on the self-healing diffusion Monte Carlo method for complex functions [F. A. Reboredo, J. Chem. Phys. 136, 204101 (2012)] and some ideas of the correlation function Monte Carlo approach [D. M. Ceperley and B. Bernu, J. Chem. Phys. 89, 6316 (1988)]. In order to allow the evolution in imaginary time to describe the density matrix, we remove the fixed-node restriction using complex antisymmetric guiding wave functions. In the process we obtain a parallel algorithm that optimizes a small subspace of the many-body Hilbert space to provide maximum overlap with the subspace spanned by the lowest-energy eigenstates of a many-body Hamiltonian. We show in a model system that the partition function is progressively maximized within this subspace. We show that the subspace spanned by the small basis systematically converges towards the subspace spanned by the lowest energy eigenstates. Possible applications of this method for calculating the thermodynamic properties of many-body systems near the ground state are discussed. The resulting basis can also be used to accelerate the calculation of the ground or excited states with quantum Monte Carlo.

  13. Generalizing the self-healing diffusion Monte Carlo approach to finite temperature: A path for the optimization of low-energy many-body bases

    NASA Astrophysics Data System (ADS)

    Reboredo, Fernando A.; Kim, Jeongnim

    2014-02-01

    A statistical method is derived for the calculation of thermodynamic properties of many-body systems at low temperatures. This method is based on the self-healing diffusion Monte Carlo method for complex functions [F. A. Reboredo, J. Chem. Phys. 136, 204101 (2012)] and some ideas of the correlation function Monte Carlo approach [D. M. Ceperley and B. Bernu, J. Chem. Phys. 89, 6316 (1988)]. In order to allow the evolution in imaginary time to describe the density matrix, we remove the fixed-node restriction using complex antisymmetric guiding wave functions. In the process we obtain a parallel algorithm that optimizes a small subspace of the many-body Hilbert space to provide maximum overlap with the subspace spanned by the lowest-energy eigenstates of a many-body Hamiltonian. We show in a model system that the partition function is progressively maximized within this subspace. We show that the subspace spanned by the small basis systematically converges towards the subspace spanned by the lowest energy eigenstates. Possible applications of this method for calculating the thermodynamic properties of many-body systems near the ground state are discussed. The resulting basis can also be used to accelerate the calculation of the ground or excited states with quantum Monte Carlo.

  14. FNAS phase partitions

    NASA Technical Reports Server (NTRS)

    Vanalstine, James M.

    1993-01-01

    Project NAS8-36955 D.O. #100 initially involved the following tasks: (1) evaluation of various coatings' ability to control wall wetting and surface zeta potential expression; (2) testing various methods to mix and control the demixing of phase systems; and (3) videomicroscopic investigation of cell partition. Three complementary areas were identified for modification and extension of the original contract. They were: (1) identification of new supports for column cell partition; (2) electrokinetic detection of protein adsorption; and (3) emulsion studies related to bioseparations.

  15. Reconstruction of a piecewise constant conductivity on a polygonal partition via shape optimization in EIT

    NASA Astrophysics Data System (ADS)

    Beretta, Elena; Micheletti, Stefano; Perotto, Simona; Santacesaria, Matteo

    2018-01-01

    In this paper, we develop a shape optimization-based algorithm for the electrical impedance tomography (EIT) problem of determining a piecewise constant conductivity on a polygonal partition from boundary measurements. The key tool is to use a distributed shape derivative of a suitable cost functional with respect to movements of the partition. Numerical simulations showing the robustness and accuracy of the method are presented for simulated test cases in two dimensions.

  16. Inference and Analysis of Population Structure Using Genetic Data and Network Theory.

    PubMed

    Greenbaum, Gili; Templeton, Alan R; Bar-David, Shirli

    2016-04-01

    Clustering individuals to subpopulations based on genetic data has become commonplace in many genetic studies. Inference about population structure is most often done by applying model-based approaches, aided by visualization using distance-based approaches such as multidimensional scaling. While existing distance-based approaches suffer from a lack of statistical rigor, model-based approaches entail assumptions of prior conditions such as that the subpopulations are at Hardy-Weinberg equilibria. Here we present a distance-based approach for inference about population structure using genetic data by defining population structure using network theory terminology and methods. A network is constructed from a pairwise genetic-similarity matrix of all sampled individuals. The community partition, a partition of a network to dense subgraphs, is equated with population structure, a partition of the population to genetically related groups. Community-detection algorithms are used to partition the network into communities, interpreted as a partition of the population to subpopulations. The statistical significance of the structure can be estimated by using permutation tests to evaluate the significance of the partition's modularity, a network theory measure indicating the quality of community partitions. To further characterize population structure, a new measure of the strength of association (SA) for an individual to its assigned community is presented. The strength of association distribution (SAD) of the communities is analyzed to provide additional population structure characteristics, such as the relative amount of gene flow experienced by the different subpopulations and identification of hybrid individuals. Human genetic data and simulations are used to demonstrate the applicability of the analyses. The approach presented here provides a novel, computationally efficient model-free method for inference about population structure that does not entail assumption of prior conditions. The method is implemented in the software NetStruct (available at https://giligreenbaum.wordpress.com/software/). Copyright © 2016 by the Genetics Society of America.
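
    A hedged sketch of the workflow described above, using generic networkx routines rather than the authors' NetStruct software: a pairwise genetic-similarity matrix is thresholded into a weighted graph, communities are found by modularity maximization, and the observed modularity is compared against a null distribution obtained by shuffling the pairwise similarities. The threshold, permutation scheme and parameter values are illustrative assumptions.

      # Network-based inference of population structure: threshold a similarity
      # matrix into a weighted graph, detect communities by modularity maximization,
      # and assess significance by permuting the pairwise similarities.
      import numpy as np
      import networkx as nx
      from networkx.algorithms import community

      def build_graph(sim, threshold):
          n = sim.shape[0]
          g = nx.Graph()
          g.add_nodes_from(range(n))
          iu = np.triu_indices(n, k=1)
          for i, j, w in zip(*iu, sim[iu]):
              if w >= threshold:
                  g.add_edge(int(i), int(j), weight=float(w))
          return g

      def structure_from_similarity(sim, threshold=0.5, n_perm=99, seed=0):
          rng = np.random.default_rng(seed)
          g = build_graph(sim, threshold)
          parts = community.greedy_modularity_communities(g, weight="weight")
          q_obs = community.modularity(g, parts, weight="weight")
          # Null model: shuffle the pairwise similarities among pairs, which keeps the
          # weight distribution but destroys any genuine community structure.
          n = sim.shape[0]
          iu = np.triu_indices(n, k=1)
          q_null = []
          for _ in range(n_perm):
              sim_p = np.zeros_like(sim)
              sim_p[iu] = rng.permutation(sim[iu])
              sim_p += sim_p.T
              gp = build_graph(sim_p, threshold)
              pp = community.greedy_modularity_communities(gp, weight="weight")
              q_null.append(community.modularity(gp, pp, weight="weight"))
          p_value = float(np.mean([q >= q_obs for q in q_null]))
          return parts, q_obs, p_value

    Called on a similarity matrix with genuine block structure, the observed modularity should exceed most of the shuffled values, giving a small p-value.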

  17. Separation of soil respiration: a site-specific comparison of partition methods

    NASA Astrophysics Data System (ADS)

    Comeau, Louis-Pierre; Lai, Derrick Y. F.; Jinglan Cui, Jane; Farmer, Jenny

    2018-06-01

    Without accurate data on soil heterotrophic respiration (Rh), assessments of soil carbon (C) sequestration rate and C balance are challenging to produce. Accordingly, it is essential to determine the contribution of the different sources of the total soil CO2 efflux (Rs) in different ecosystems, but to date, there are still many uncertainties and unknowns regarding the soil respiration partitioning procedures currently available. This study compared the suitability and relative accuracy of five different Rs partitioning methods in a subtropical forest: (1) regression between root biomass and CO2 efflux, (2) lab incubations with minimally disturbed soil microcosm cores, (3) root exclusion bags with hand-sorted roots, (4) root exclusion bags with intact soil blocks and (5) soil δ13C-CO2 natural abundance. The relationship between Rh and soil moisture and temperature was also investigated. A qualitative evaluation table of the partition methods with five performance parameters was produced. The Rs was measured weekly from 3 February to 19 April 2017 and found to average 6.1 ± 0.3 Mg C ha-1 yr-1. During this period, the Rh measured with the in situ mesh bags with intact soil blocks and hand-sorted roots was estimated to contribute 49 ± 7 and 79 ± 3 % of Rs, respectively. The Rh percentages estimated with the root biomass regression, microcosm incubation and δ13C-CO2 natural abundance were 54 ± 41, 8-17 and 61 ± 39 %, respectively. Overall, no systematically superior or inferior Rs partition method was found. The paper discusses the strengths and weaknesses of each technique with the conclusion that combining two or more methods optimizes Rh assessment reliability.

  18. Determination of octanol-air partition coefficients and supercooled liquid vapor pressures of PAHs as a function of temperature: Application to gas-particle partitioning in an urban atmosphere

    NASA Astrophysics Data System (ADS)

    Odabasi, Mustafa; Cetin, Eylem; Sofuoglu, Aysun

    Octanol-air partition coefficients (KOA) for 14 polycyclic aromatic hydrocarbons (PAHs) were determined as a function of temperature using the gas chromatographic retention time method. log KOA values at 25 °C ranged over six orders of magnitude, between 6.34 (acenaphthylene) and 12.59 (dibenz[a,h]anthracene). The determined KOA values were within a factor of 0.7 (dibenz[a,h]anthracene) to 15.1 (benz[a]anthracene) of values calculated as the ratio of octanol-water partition coefficient to dimensionless Henry's law constant. Supercooled liquid vapor pressures (PL) of 13 PAHs were also determined using the gas chromatographic retention time technique. Activity coefficients in octanol calculated using KOA and PL ranged between 3.2 and 6.2, indicating near-ideal solution behavior. Atmospheric concentrations measured in this study in Izmir, Turkey, were used to investigate the partitioning of PAHs between the particle and gas phases. Experimental gas-particle partition coefficients (Kp) were compared to the predictions of KOA absorption and KSA (soot-air partition coefficient) models. The octanol-based absorptive partitioning model predicted lower partition coefficients especially for relatively volatile PAHs. Ratios of measured/modeled partition coefficients ranged between 1.1 and 15.5 (4.5±6.0, average±SD) for the KOA model. KSA model predictions were relatively better, and measured-to-modeled ratios ranged between 0.6 and 5.6 (2.3±2.7, average±SD).
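
    The comparison described above rests on relations of the following form, stated here in their commonly used versions (to be checked against the paper itself): the octanol-air partition coefficient as the ratio of the octanol-water partition coefficient to the dimensionless Henry's law constant, and the activity coefficient in octanol obtained from KOA and the supercooled liquid vapor pressure PL, with v_oct the molar volume of octanol:

      K_{OA} = \frac{C_{\text{octanol}}}{C_{\text{air}}} = \frac{K_{OW}}{K_{AW}}, \qquad
      K_{AW} = \frac{H}{RT}, \qquad
      \gamma_{\text{oct}} = \frac{RT}{K_{OA}\, v_{\text{oct}}\, P_L}.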

  19. Cell separation in immunoaffinity partition in aqueous polymer two-phase systems

    NASA Technical Reports Server (NTRS)

    Karr, Laurel J.; Van Alstine, James M.; Snyder, Robert S.; Shafer, Steven G.; Harris, J. Milton

    1989-01-01

    Two methods for immunoaffinity partitioning are described. One technique involves the covalent coupling of poly (ethylene glycol) (PEG) to immunoglobulin G antibody preparations. In the second method PEG-modified Protein A is used to complex with cells and unmodified antibody. The effects of PEG molecular weight, the degree of modification, and varying phase system composition on antibody activity and its affinity for the upper phase are studied. It is observed that both methods resulted in effective cell separation.

  20. METHOD FOR MEASURING AIR-IMMISCIBLE LIQUID PARTITION COEFFICIENTS

    EPA Science Inventory

    The principal objective of this work was to measure nonaqueous phase liquid-air partition coefficients for various gas tracer compounds. Known amounts of trichloroethene (TCE) and tracer, as neat compounds, were introduced into glass vials and allowed to equilibrate. The TCE and ...

  1. Improved parallel data partitioning by nested dissection with applications to information retrieval.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wolf, Michael M.; Chevalier, Cedric; Boman, Erik Gunnar

    The computational work in many information retrieval and analysis algorithms is based on sparse linear algebra. Sparse matrix-vector multiplication is a common kernel in many of these computations. Thus, an important related combinatorial problem in parallel computing is how to distribute the matrix and the vectors among processors so as to minimize the communication cost. We focus on minimizing the total communication volume while keeping the computation balanced across processes. In [1], the first two authors presented a new 2D partitioning method, the nested dissection partitioning algorithm. In this paper, we improve on that algorithm and show that it is a good option for data partitioning in information retrieval. We also show partitioning time can be substantially reduced by using the SCOTCH software, and quality improves in some cases, too.
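
    A hedged sketch of the quantity being minimized, not of the authors' nested dissection algorithm: for a 1D row-wise partition of a sparse matrix (with the input and output vectors partitioned conformally), each entry x_j must be communicated to every other part owning a nonzero in column j, and the total over columns is the communication volume that 2D methods aim to reduce further. The block-row assignment and test matrix are illustrative.

      # Total communication volume of y = A x under a 1D row-wise partition,
      # assuming a square matrix with x and y partitioned like the rows.
      import numpy as np
      from scipy.sparse import random as sparse_random

      def rowwise_comm_volume(A, n_parts):
          A = A.tocsc()
          n = A.shape[0]
          owner = np.minimum(np.arange(n) * n_parts // n, n_parts - 1)  # contiguous row blocks
          volume = 0
          for j in range(A.shape[1]):
              rows = A.indices[A.indptr[j]:A.indptr[j + 1]]
              parts_needing_xj = set(owner[rows])
              parts_needing_xj.discard(owner[j])      # the owner of x_j already has it
              volume += len(parts_needing_xj)
          return volume

      A = sparse_random(2000, 2000, density=0.002, format="csr", random_state=3)
      print(rowwise_comm_volume(A, n_parts=8))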

  2. Isotope partitioning of soil respiration: A Bayesian solution to accommodate multiple sources of variability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ogle, Kiona; Pendall, Elise

    Isotopic methods offer great potential for partitioning trace gas fluxes such as soil respiration into their different source contributions. Traditional partitioning methods face challenges due to variability introduced by different measurement methods, fractionation effects, and end-member uncertainty. To address these challenges, we describe in this paper a hierarchical Bayesian (HB) approach for isotopic partitioning of soil respiration that directly accommodates such variability. We apply our HB method to data from an experiment conducted in a shortgrass steppe ecosystem, where decomposition was previously shown to be stimulated by elevated CO2. Our approach simultaneously fits Keeling plot (KP) models to observations of soil or soil-respired δ13C and [CO2] obtained via chambers and gas wells, corrects the KP intercepts for apparent fractionation (Δ) due to isotope-specific diffusion rates and/or method artifacts, estimates method- and treatment-specific values for Δ, propagates end-member uncertainty, and calculates proportional contributions from two distinct respiration sources (“old” and “new” carbon). The chamber KP intercepts were estimated with greater confidence than the well intercepts, and compared to the theoretical value of 4.4‰, our results suggest that Δ varies between 2 and 5.2‰ depending on method (chambers versus wells) and CO2 treatment. Because elevated CO2 plots were fumigated with 13C-depleted CO2, the source contributions were tightly constrained, and new C accounted for 64% (range = 55–73%) of soil respiration. The contributions were less constrained for the ambient CO2 treatments, but new C accounted for significantly less (47%, range = 15–82%) of soil respiration. Finally, our new HB partitioning approach contrasts with our original analysis (higher contribution of old C under elevated CO2) because it uses additional data sources, accounts for end-member bias, and estimates apparent fractionation effects.
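
    In generic form, the Keeling-plot model fitted here and the subsequent two-end-member partition can be written as below; the correction of the intercept by the apparent fractionation Δ is shown as a subtraction, but the sign convention is an assumption and depends on the measurement method:

      \delta^{13}C_{\text{obs}} = \delta^{13}C_{\text{int}} + \frac{m}{[\mathrm{CO_2}]}, \qquad
      \delta_R = \delta^{13}C_{\text{int}} - \Delta, \qquad
      f_{\text{new}} = \frac{\delta_R - \delta_{\text{old}}}{\delta_{\text{new}} - \delta_{\text{old}}}, \quad
      f_{\text{old}} = 1 - f_{\text{new}}.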

  3. Confocal Raman Microscopy for in Situ Measurement of Octanol-Water Partitioning within the Pores of Individual C18-Functionalized Chromatographic Particles.

    PubMed

    Kitt, Jay P; Harris, Joel M

    2015-05-19

    Octanol-water partitioning is one of the most widely used predictors of hydrophobicity and lipophilicity. Traditional methods for measuring octanol-water partition coefficients (K(ow)), including shake-flasks and generator columns, require hours for equilibration and milliliter quantities of sample solution. These challenges have led to development of smaller-scale methods for measuring K(ow). Recent advances in microfluidics have produced faster and smaller-volume approaches to measuring K(ow). As flowing volumes are reduced, however, separation of water and octanol prior to measurement and detection in small volumes of octanol phase are especially challenging. In this work, we reduce the receiver volume of octanol-water partitioning measurements from current practice by six-orders-of-magnitude, to the femtoliter scale, by using a single octanol-filled reversed-phase, octadecylsilane-modified (C18-silica) chromatographic particle as a collector. The fluid-handling challenges of working in such small volumes are circumvented by eliminating postequilibration phase separation. Partitioning is measured in situ within the pore-confined octanol phase using confocal Raman microscopy, which is capable of detecting and quantifying a wide variety of molecular structures. Equilibration times are fast (less than a minute) because molecular diffusion is efficient over distance scales of micrometers. The demonstrated amount of analyte needed to carry out a measurement is very small, less than 50 fmol, which would be a useful attribute for drug screening applications or testing of small quantities of environmentally sensitive compounds. The method is tested for measurements of pH-dependent octanol-water partitioning of naphthoic acid, and the results are compared to both traditional shake-flask measurements and sorption onto C18-modified silica without octanol present within the pores.

  4. Novel medium-throughput technique for investigating drug-cyclodextrin complexation by pH-metric titration using the partition coefficient method.

    PubMed

    Dargó, Gergő; Boros, Krisztina; Péter, László; Malanga, Milo; Sohajda, Tamás; Szente, Lajos; Balogh, György T

    2018-05-05

    The present study aimed to develop a medium-throughput screening technique for the investigation of cyclodextrin (CD)-active pharmaceutical ingredient (API) complexes. Dual-phase potentiometric lipophilicity measurement, as the gold standard technique, was combined with the partition coefficient method (plotting the reciprocal of the partition coefficients of APIs as a function of CD concentration). A general equation was derived for the determination of stability constants of 1:1 CD-API complexes (K1:1,CD) based solely on the changes of the partition coefficients of the neutral form (logPo/w,N - logPapp,N), without measurement of the actual API concentrations. The experimentally determined logP value (-1.64) of 6-deoxy-6[(5/6)-fluoresceinylthioureido]-HPBCD (FITC-NH-HPBCD) was used to estimate the logP value (≈ -2.5 to -3) of (2-hydroxypropyl)-β-cyclodextrin (HPBCD). The results suggested that the amount of HPBCD in the octanol phase can be considered inconsequential. The decrease of octanol volume due to octanol-CD complexation was taken into account, and a corrected octanol-water phase ratio was introduced. The K1:1,CD values obtained by the developed method showed good accordance with results from other orthogonal methods. Copyright © 2018 Elsevier B.V. All rights reserved.
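
    A sketch of the relation implied by the partition coefficient method, under the assumptions of 1:1 binding, a complex that remains entirely in the aqueous phase, and free CD concentration approximated by the total CD concentration: the apparent partition coefficient decreases with CD concentration, so plotting its reciprocal against [CD] gives a line whose slope-to-intercept ratio is the stability constant.

      P_{\text{app}} = \frac{P}{1 + K_{1:1,\mathrm{CD}}[\mathrm{CD}]}
      \;\;\Longrightarrow\;\;
      \frac{1}{P_{\text{app}}} = \frac{1}{P} + \frac{K_{1:1,\mathrm{CD}}}{P}\,[\mathrm{CD}],
      \qquad
      \log P - \log P_{\text{app}} = \log\!\left(1 + K_{1:1,\mathrm{CD}}[\mathrm{CD}]\right).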

  5. C-Depth Method to Determine Diffusion Coefficient and Partition Coefficient of PCB in Building Materials.

    PubMed

    Liu, Cong; Kolarik, Barbara; Gunnarsen, Lars; Zhang, Yinping

    2015-10-20

    Polychlorinated biphenyls (PCBs) have been found to be persistent in the environment and possibly harmful. Many buildings are characterized by high PCB concentrations. Knowledge about partitioning between primary sources and building materials is critical for exposure assessment and practical remediation of PCB contamination. This study develops a C-depth method to determine the diffusion coefficient (D) and partition coefficient (K), two key parameters governing the partitioning process. For concrete, a primary material studied here, relative standard deviations of results among five data sets are 5-22% for K and 42-66% for D. Compared with existing methods, the C-depth method overcomes the inability of nonlinear regression to provide a unique estimate and does not require assumed correlations for D and K among congeners. Comparison with a more sophisticated two-term approach implies significant uncertainty for D, and smaller uncertainty for K. However, considering uncertainties associated with sampling and chemical analysis, and the impact of environmental factors, the results are acceptable for engineering applications. This was supported by good agreement between model prediction and measurement. Sensitivity analysis indicated that the effective diffusion distance, the contact time of materials with primary sources, and the depth of measured concentrations are critical for determining D, and that the PCB concentration in primary sources is critical for K.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Poerschke, Andrew; Rudd, Armin

    This report investigates the feasibility of using a home-run manifold small-diameter duct system to provide space-conditioning air to individual thermal zones in a low-load home. This compact layout allows duct systems to easily be brought within conditioned space via interior partition walls. Centrally locating the air handling unit in the house significantly reduces duct lengths. The plenum box is designed so that each connected duct receives a similar amount of airflow—regardless of its position on the box. Furthermore, within a reasonable set of length restrictions each duct continues to receive similar airflow.

  7. A mesh partitioning algorithm for preserving spatial locality in arbitrary geometries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nivarti, Girish V., E-mail: g.nivarti@alumni.ubc.ca; Salehi, M. Mahdi; Bushe, W. Kendal

    2015-01-15

    Highlights: •An algorithm for partitioning computational meshes is proposed. •The Morton order space-filling curve is modified to achieve improved locality. •A spatial locality metric is defined to compare results with existing approaches. •Results indicate improved performance of the algorithm in complex geometries. -- Abstract: A space-filling curve (SFC) is a proximity preserving linear mapping of any multi-dimensional space and is widely used as a clustering tool. Equi-sized partitioning of an SFC ignores the loss in clustering quality that occurs due to inaccuracies in the mapping. Often, this results in poor locality within partitions, especially for the conceptually simple, Morton order curves. We present a heuristic that improves partition locality in arbitrary geometries by slicing a Morton order curve at points where spatial locality is sacrificed. In addition, we develop algorithms that evenly distribute points to the extent possible while maintaining spatial locality. A metric is defined to estimate relative inter-partition contact as an indicator of communication in parallel computing architectures. Domain partitioning tests have been conducted on geometries relevant to turbulent reactive flow simulations. The results obtained highlight the performance of our method as an unsupervised and computationally inexpensive domain partitioning tool.
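
    A hedged sketch of the baseline this heuristic improves on: points are quantized, given Morton (Z-order) codes by bit interleaving, sorted along the curve, and the curve is cut into equi-sized slices. The locality-aware cut selection proposed in the paper is not shown; the grid resolution and synthetic points are assumptions.

      # Baseline SFC partitioning: sort 2D points by Morton (Z-order) code obtained
      # by interleaving the bits of quantized coordinates, then cut the sorted curve
      # into equi-sized partitions.
      import numpy as np

      def interleave_bits(x, y, bits=16):
          code = 0
          for b in range(bits):
              code |= ((x >> b) & 1) << (2 * b)
              code |= ((y >> b) & 1) << (2 * b + 1)
          return code

      def morton_partition(points, n_parts, bits=16):
          # Quantize coordinates to a 2^bits x 2^bits grid over the bounding box.
          lo, hi = points.min(axis=0), points.max(axis=0)
          q = ((points - lo) / np.maximum(hi - lo, 1e-12) * (2**bits - 1)).astype(np.int64)
          codes = np.array([interleave_bits(int(x), int(y), bits) for x, y in q])
          order = np.argsort(codes)
          part = np.empty(len(points), dtype=int)
          for p, chunk in enumerate(np.array_split(order, n_parts)):
              part[chunk] = p                      # equi-sized slices of the curve
          return part

      pts = np.random.default_rng(4).uniform(size=(1000, 2))
      labels = morton_partition(pts, n_parts=8)
      print(np.bincount(labels))                   # roughly equal partition sizes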

  8. Intuitionistic uncertain linguistic partitioned Bonferroni means and their application to multiple attribute decision-making

    NASA Astrophysics Data System (ADS)

    Liu, Zhengmin; Liu, Peide

    2017-04-01

    The Bonferroni mean (BM) was originally introduced by Bonferroni and generalised by many other researchers due to its capacity to capture the interrelationship between input arguments. Nevertheless, in many situations, interrelationships do not always exist between all of the attributes. Attributes can be partitioned into several different categories, and members within a partition are interrelated while no interrelationship exists between attributes of different partitions. In this paper, as complements to the existing generalisations of BM, we investigate the partitioned Bonferroni mean (PBM) under intuitionistic uncertain linguistic environments and develop two linguistic aggregation operators: intuitionistic uncertain linguistic partitioned Bonferroni mean (IULPBM) and its weighted form (WIULPBM). Then, motivated by the idea of the geometric mean and the PBM, we further present the partitioned geometric Bonferroni mean (PGBM) and develop two linguistic geometric aggregation operators: intuitionistic uncertain linguistic partitioned geometric Bonferroni mean (IULPGBM) and its weighted form (WIULPGBM). Some properties and special cases of these proposed operators are also investigated and discussed in detail. Based on these operators, an approach for multiple attribute decision-making problems with intuitionistic uncertain linguistic information is developed. Finally, a practical example is presented to illustrate the developed approach, and comparison analyses are conducted with other representative methods to verify its effectiveness and feasibility.

  9. Three-Dimensional High-Order Spectral Finite Volume Method for Unstructured Grids

    NASA Technical Reports Server (NTRS)

    Liu, Yen; Vinokur, Marcel; Wang, Z. J.; Kwak, Dochan (Technical Monitor)

    2002-01-01

    Many areas require a very high-order accurate numerical solution of conservation laws for complex shapes. This paper deals with the extension to three dimensions of the Spectral Finite Volume (SV) method for unstructured grids, which was developed to solve such problems. We first summarize the limitations of traditional methods such as finite-difference, and finite-volume for both structured and unstructured grids. We then describe the basic formulation of the spectral finite volume method. What distinguishes the SV method from conventional high-order finite-volume methods for unstructured triangular or tetrahedral grids is the data reconstruction. Instead of using a large stencil of neighboring cells to perform a high-order reconstruction, the stencil is constructed by partitioning each grid cell, called a spectral volume (SV), into 'structured' sub-cells, called control volumes (CVs). One can show that if all the SV cells are partitioned into polygonal or polyhedral CV sub-cells in a geometrically similar manner, the reconstructions for all the SVs become universal, irrespective of their shapes, sizes, orientations, or locations. It follows that the reconstruction is reduced to a weighted sum of unknowns involving just a few simple adds and multiplies, and those weights are universal and can be pre-determined once for all. The method is thus very efficient, accurate, and yet geometrically flexible. The most critical part of the SV method is the partitioning of the SV into CVs. In this paper we present the partitioning of a tetrahedral SV into polyhedral CVs with one free parameter for polynomial reconstructions up to degree of precision five. (Note that the order of accuracy of the method is one order higher than the reconstruction degree of precision.) The free parameter will be determined by minimizing the Lebesgue constant of the reconstruction matrix or similar criteria to obtain optimized partitions. The details of an efficient, parallelizable code to solve three-dimensional problems for any order of accuracy are then presented. Important aspects of the data structure are discussed. Comparisons with the Discontinuous Galerkin (DG) method are made. Numerical examples for wave propagation problems are presented.

  10. Characterization of Nonlinear Systems with Memory by Means of Volterra Expansions with Frequency Partitioning: Application to a Cicada Mating Call

    DTIC Science & Technology

    2010-06-15

    Nonlinear systems with memory are characterized by means of Volterra expansions with frequency partitioning, with application to a cicada mating call. Authors: Albert H. Nuttall (Adaptive Methods Inc.) and Derke R. Hughes (NUWC Division Newport, NAVSEA Warfare Center). Analysis of a cicada mating call with a distinctly non-white and non-Gaussian excitation gives good results for the estimated first- and second-order kernels.

  11. The use of acoustically tuned resonators to improve the sound transmission loss of double panel partitions

    NASA Astrophysics Data System (ADS)

    Mason, J. M.; Fahy, F. J.

    1986-10-01

    The effectiveness of tuned Helmholtz resonators connected to the partition cavity in double-leaf partitions utilized in situations requiring low weight structures with high transmission loss is investigated as a method of improving sound transmission loss. This is demonstrated by a simple theoretical model and then experimentally verified. Results show that substantial improvements may be obtained at and around the mass-air-mass frequency for a total resonator volume 15 percent of the cavity volume.
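
    The frequency band in question can be estimated with the textbook mass-air-mass formula for a double-leaf partition; the expression and the air properties below are standard approximations and are not taken from the paper.

      # Rough textbook estimate (not from the paper) of the mass-air-mass resonance
      # frequency of a double-leaf partition, around which tuned Helmholtz resonators
      # would be most beneficial. Panel surface densities m1, m2 in kg/m^2, cavity
      # depth d in m; rho0 and c are properties of air at room conditions.
      import math

      def mass_air_mass_frequency(m1, m2, d, rho0=1.21, c=343.0):
          return (1.0 / (2.0 * math.pi)) * math.sqrt(rho0 * c**2 * (m1 + m2) / (d * m1 * m2))

      # Example: two 10 kg/m^2 leaves separated by a 50 mm air cavity (about 120 Hz).
      print(round(mass_air_mass_frequency(10.0, 10.0, 0.05), 1), "Hz")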

  12. The effects of rainfall partitioning and evapotranspiration on the temporal and spatial variation of soil water content in a Mediterranean agroforestry system

    NASA Astrophysics Data System (ADS)

    Biel, C.; Molina, A.; Aranda, X.; Llorens, P.; Savé, R.

    2012-04-01

    Tree plantation for wood production has been proposed to mitigate CO2-related climate change. Although these agroforestry systems can contribute to maintain the agriculture in some areas placed between rainfed crops and secondary forests, water scarcity in Mediterranean climate could restrict its growth, and their presence will affect the water balance. Tree plantations management (species, plant density, irrigation, etc), hence, can be used to affect the water balance, resulting in water availability improvement and buffering of the water cycle. Soil water content and meteorological data are widely used in agroforestry systems as indicators of vegetation water use, and consequently to define water management. However, the available information of ecohydrological processes in this kind of ecosystem is scarce. The present work studies how the temporal and spatial variation of soil water content is affected by transpiration and interception loss fluxes in a Mediterranean rainfed plantation of cherry tree (Prunus avium) located in Caldes de Montbui (Northeast of Spain). From May till December 2011, rainfall partitioning, canopy transpiration, soil water content and meteorological parameters were continuously recorded. Rainfall partitioning was measured in 6 trees, with 6 automatic rain recorders for throughfall and 1 automatic rain recorder for stemflow per tree. Transpiration was monitored in 12 nearby trees by means of heat pulse sap flow sensors. Soil water content was also measured at three different depths under selected trees and at two depths between rows without tree cover influence. This work presents the relationships between rainfall partitioning, transpiration and soil water content evolution under the tree canopy. The effect of tree cover on the soil water content dynamics is also analyzed.
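
    The rainfall-partitioning bookkeeping behind such measurements is simple: interception loss is usually taken as the residual of gross rainfall after throughfall and stemflow, as in the sketch below (values are illustrative, not data from this site).

      # Minimal sketch of the usual rainfall-partitioning bookkeeping: interception
      # loss is taken as the residual of gross rainfall after throughfall and stemflow.
      # Values are illustrative, not data from the Caldes de Montbui plantation.
      def partition_rainfall(gross_mm, throughfall_mm, stemflow_mm):
          interception_mm = gross_mm - throughfall_mm - stemflow_mm
          return {
              "throughfall": throughfall_mm,
              "stemflow": stemflow_mm,
              "interception": interception_mm,
              "interception_fraction": interception_mm / gross_mm if gross_mm else 0.0,
          }

      print(partition_rainfall(gross_mm=24.0, throughfall_mm=19.5, stemflow_mm=0.7))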

  13. Analyzing students’ errors on fractions in the number line

    NASA Astrophysics Data System (ADS)

    Widodo, S.; Ikhwanudin, T.

    2018-05-01

    The objective of this study is to identify the types of errors students make when they deal with fractions on the number line. This study used a qualitative, descriptive method and involved 31 sixth-grade students at a primary school in Purwakarta, Indonesia. The results show four types of student errors: unit confusion, tick-mark interpretation error, partitioning and un-partitioning error, and estimation error. We recommend that teachers strengthen students' understanding of the unit when studying fractions, make students understand tick-mark interpretation, remind students of the importance of partitioning and un-partitioning strategies, and teach effective estimation strategies.

  14. Improved coverage of cDNA-AFLP by sequential digestion of immobilized cDNA.

    PubMed

    Weiberg, Arne; Pöhler, Dirk; Morgenstern, Burkhard; Karlovsky, Petr

    2008-10-13

    cDNA-AFLP is a transcriptomics technique which does not require prior sequence information and can therefore be used as a gene discovery tool. The method is based on selective amplification of cDNA fragments generated by restriction endonucleases, electrophoretic separation of the products and comparison of the band patterns between treated samples and controls. Unequal distribution of restriction sites used to generate cDNA fragments negatively affects the performance of cDNA-AFLP. Some transcripts are represented by more than one fragment while others escape detection, causing redundancy and reducing the coverage of the analysis, respectively. With the goal of improving the coverage of cDNA-AFLP without increasing its redundancy, we designed a modified cDNA-AFLP protocol. Immobilized cDNA is sequentially digested with several restriction endonucleases and the released DNA fragments are collected in mutually exclusive pools. To investigate the performance of the protocol, the software tool MECS (Multiple Enzyme cDNA-AFLP Simulation) was written in Perl. cDNA-AFLP protocols described in the literature and the new sequential digestion protocol were simulated on sets of cDNA sequences from mouse, human and Arabidopsis thaliana. The redundancy and coverage, the total number of PCR reactions, and the average fragment length were calculated for each protocol and cDNA set. Simulation revealed that sequential digestion of immobilized cDNA followed by the partitioning of released fragments into mutually exclusive pools outperformed other cDNA-AFLP protocols in terms of coverage, redundancy, fragment length, and the total number of PCRs. Primers generating 30 to 70 amplicons per PCR provided the highest fraction of electrophoretically distinguishable fragments suitable for normalization. For the A. thaliana, human, and mouse transcriptomes, the use of two marking enzymes and three sequentially applied releasing enzymes for each of the marking enzymes is recommended.
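
    A toy sketch of one simplified reading of the sequential-digestion idea is given below: enzymes are applied in a fixed order and each cDNA is assigned to the pool of the first enzyme that cuts it, so the pools are mutually exclusive. This is not the MECS tool, and the enzyme sites and sequences are invented.

      # Toy sketch (not the MECS tool) of one simplified reading of the sequential-
      # digestion idea: cDNA is immobilized at its 3' end, enzymes are applied in a
      # fixed order, and each cDNA ends up in the pool of the first enzyme that cuts
      # it, so the pools are mutually exclusive. Enzyme sites and sequences are made up.
      ENZYMES = [("EcoRI", "GAATTC"), ("MseI", "TTAA"), ("TaqI", "TCGA")]  # applied in order

      def assign_to_pools(cdnas):
          pools = {name: [] for name, _ in ENZYMES}
          undigested = []
          for seq in cdnas:
              for name, site in ENZYMES:
                  cut = seq.find(site)
                  if cut != -1:
                      # released fragment: everything 5' of the first recognition site
                      pools[name].append(seq[:cut])
                      break
              else:
                  undigested.append(seq)
          return pools, undigested

      pools, leftover = assign_to_pools(["AAGAATTCGGTT", "CCTTAAGG", "GGGGTCGAAA", "ACGTACGT"])
      print(pools, leftover)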

  15. Reprogramming of G protein-coupled receptor recycling and signaling by a kinase switch

    PubMed Central

    Vistein, Rachel; Puthenveedu, Manojkumar A.

    2013-01-01

    The postendocytic recycling of signaling receptors is subject to multiple requirements. Why this is so, considering that many other proteins can recycle without apparent requirements, is a fundamental question. Here we show that cells can leverage these requirements to switch the recycling of the beta-2 adrenergic receptor (B2AR), a prototypic signaling receptor, between sequence-dependent and bulk recycling pathways, based on extracellular signals. This switch is determined by protein kinase A-mediated phosphorylation of B2AR on the cytoplasmic tail. The phosphorylation state of B2AR dictates its partitioning into spatially and functionally distinct endosomal microdomains mediating bulk and sequence-dependent recycling, and also regulates the rate of B2AR recycling and resensitization. Our results demonstrate that G protein-coupled receptor recycling is not always restricted to the sequence-dependent pathway, but may be reprogrammed as needed by physiological signals. Such flexible reprogramming might provide a versatile method for rapidly modulating cellular responses to extracellular signaling. PMID:24003153

  16. Predicting bioconcentration of chemicals into vegetation from soil or air using the molecular connectivity index

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dowdy, D.L.; McKone, T.E.; Hsieh, D.P.H.

    1995-12-31

    Bioconcentration factors (BCFs) are the ratio of chemical concentration found in an exposed organism (in this case a plant) to the concentration in an air or soil exposure medium. The authors examine here the use of molecular connectivity indices (MCIs) as quantitative structure-activity relationships (QSARs) for predicting BCFs for organic chemicals between plants and air or soil. The authors compare the reliability of the octanol-air partition coefficient (K_oa) to the MCI-based prediction method for predicting plant/air partition coefficients. The authors also compare the reliability of the octanol/water partition coefficient (K_ow) to the MCI-based prediction method for predicting plant/soil partition coefficients. The results here indicate that, relative to the use of K_ow or K_oa as predictors of BCFs, the MCI can substantially increase the reliability with which BCFs can be estimated. The authors find that the MCI provides a relatively precise and accurate method for predicting the potential biotransfer of a chemical from environmental media into plants. In addition, the MCI approach is much faster and more cost effective than direct measurements.
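
    As background, a first-order (Randić-type) molecular connectivity index is computed from the hydrogen-suppressed molecular graph as a sum over bonds of 1/sqrt(di*dj); the sketch below shows that calculation. Which MCI variant the authors regressed against BCFs is not restated here.

      # Sketch of a first-order (Randic-type) molecular connectivity index computed
      # from a hydrogen-suppressed molecular graph: sum over bonds of 1/sqrt(di*dj),
      # where di is the degree (number of heavy-atom neighbours) of atom i. Which MCI
      # variant the authors regressed against BCFs is not specified here.
      import math
      from collections import Counter

      def first_order_mci(bonds):
          """bonds: list of (atom_i, atom_j) pairs in the hydrogen-suppressed graph."""
          degree = Counter()
          for i, j in bonds:
              degree[i] += 1
              degree[j] += 1
          return sum(1.0 / math.sqrt(degree[i] * degree[j]) for i, j in bonds)

      # Example: n-butane's heavy-atom skeleton C1-C2-C3-C4.
      print(round(first_order_mci([(1, 2), (2, 3), (3, 4)]), 3))  # 1.914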

  17. An empirical study of statistical properties of variance partition coefficients for multi-level logistic regression models

    USGS Publications Warehouse

    Li, Ji; Gray, B.R.; Bates, D.M.

    2008-01-01

    Partitioning the variance of a response by design levels is challenging for binomial and other discrete outcomes. Goldstein (2003) proposed four definitions for variance partitioning coefficients (VPC) under a two-level logistic regression model. In this study, we explicitly derived formulae for a multi-level logistic regression model and subsequently studied the distributional properties of the calculated VPCs. Using simulations and a vegetation dataset, we demonstrated associations between different VPC definitions, the importance of methods for estimating VPCs (by comparing VPCs obtained using Laplace and penalized quasi-likelihood methods), and bivariate dependence between VPCs calculated at different levels. Such an empirical study lends immediate support to wider applications of VPC in scientific data analysis.
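
    One of the definitions in common use, the latent-variable (threshold) formulation, assigns the level-1 residual the logistic variance pi^2/3; the sketch below computes it. The paper compares several definitions, which are not reproduced here.

      # Sketch of the latent-variable (threshold) definition of the variance partition
      # coefficient for a two-level logistic model: the level-1 residual is assigned
      # the standard logistic variance pi^2/3. This is only one of Goldstein's four
      # definitions; the paper compares several, which are not reproduced here.
      import math

      def vpc_latent(sigma2_u):
          """sigma2_u: level-2 (cluster) variance on the logit scale."""
          return sigma2_u / (sigma2_u + math.pi**2 / 3.0)

      print(round(vpc_latent(0.5), 3))  # share of variance attributable to clusters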

  18. On distributed wavefront reconstruction for large-scale adaptive optics systems.

    PubMed

    de Visser, Cornelis C; Brunner, Elisabeth; Verhaegen, Michel

    2016-05-01

    The distributed-spline-based aberration reconstruction (D-SABRE) method is proposed for distributed wavefront reconstruction with applications to large-scale adaptive optics systems. D-SABRE decomposes the wavefront sensor domain into any number of partitions and solves a local wavefront reconstruction problem on each partition using multivariate splines. D-SABRE accuracy is within 1% of a global approach with a speedup that scales quadratically with the number of partitions. The D-SABRE is compared to the distributed cumulative reconstruction (CuRe-D) method in open-loop and closed-loop simulations using the YAO adaptive optics simulation tool. D-SABRE accuracy exceeds CuRe-D for low levels of decomposition, and D-SABRE proved to be more robust to variations in the loop gain.

  19. Application of a Model for Quenching and Partitioning in Hot Stamping of High-Strength Steel

    NASA Astrophysics Data System (ADS)

    Zhu, Bin; Liu, Zhuang; Wang, Yanan; Rolfe, Bernard; Wang, Liang; Zhang, Yisheng

    2018-04-01

    Application of the quenching and partitioning process in hot stamping has proven to be an effective method to improve the plasticity of advanced high-strength steels (AHSSs). In this study, the hot stamping and partitioning process of the advanced high-strength steel 30CrMnSi2Nb is investigated with a hot stamping mold. Given a specific partitioning time and temperature, the influence of the quenching temperature on the evolution of the microstructural volume fractions and on the mechanical properties of this steel is studied in detail. In addition, a model for the quenching and partitioning process is applied to predict the carbon diffusion and interface migration during partitioning, which determine the retained austenite volume fraction and the final properties of the part. The predicted trends of the retained austenite volume fraction agree with the experimental results. In both cases, the volume fraction of retained austenite first increases and then decreases with increasing quenching temperature. The optimal quenching temperature is approximately 290 °C for 30CrMnSi2Nb with partitioning conditions of 425 °C and 20 seconds. It is suggested that the model can be used to help determine process parameters that retain as much austenite as possible.

  20. Trace element partitioning between plagioclase and melt: An investigation of the impact of experimental and analytical procedures

    NASA Astrophysics Data System (ADS)

    Nielsen, Roger L.; Ustunisik, Gokce; Weinsteiger, Allison B.; Tepley, Frank J.; Johnston, A. Dana; Kent, Adam J. R.

    2017-09-01

    Quantitative models of petrologic processes require accurate partition coefficients. Our ability to obtain accurate partition coefficients is constrained by their dependence on pressure, temperature, and composition, and on the experimental and analytical techniques we apply. The source and magnitude of error in experimental studies of trace element partitioning may go unrecognized if one examines only the processed published data. The most important sources of error are relict crystals and analyses of more than one phase in the analytical volume. Because we have typically published averaged data, identification of compromised data is difficult if not impossible. We addressed this problem by examining unprocessed data from plagioclase/melt partitioning experiments, by comparing models based on those data with existing partitioning models, and by evaluating the degree to which the partitioning models are dependent on the calibration data. We found that partitioning models are dependent on the calibration data in ways that result in erroneous model values, and that the error will be systematic and dependent on the value of the partition coefficient. In effect, use of different calibration datasets will result in partitioning models whose results are systematically biased, so that one can arrive at different and conflicting conclusions depending on how a model is calibrated, defeating the purpose of applying the models. Ultimately this is an experimental data problem, which can be solved if we publish individual analyses (not averages) or use a projection method wherein an independent compositional constraint is used to identify and estimate the uncontaminated composition of each phase.

  1. Fluxpart: Open source software for partitioning carbon dioxide and water vapor fluxes

    USDA-ARS?s Scientific Manuscript database

    The eddy covariance method is regularly used for measuring gas fluxes over agricultural fields and natural ecosystems. For many applications, it is desirable to partition the measured fluxes into constitutive components: the water vapor flux into transpiration and direct evaporation components, and ...

  2. Mapping Pesticide Partition Coefficients By Electromagnetic Induction

    USDA-ARS?s Scientific Manuscript database

    A potential method for reducing pesticide leaching is to base application rates on the leaching potential of a specific chemical and soil combination. However, leaching is determined in part by the partitioning of the chemical between the soil and soil solution, which varies across a field. Standard...

  3. 78 FR 4844 - Notice of Intent To Suspend Certain Pesticide Registrations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-23

    ... water. Guideline 830.7550: Partition coefficient (n-octanol/water), shake flask method (dates 6/16/09, 6/25/09, 3/16/10). Guideline 830.7570: Partition coefficient (n-octanol/water), estimation by liquid...

  4. Partitioning net ecosystem exchange of CO2 into gross primary production and ecosystem respiration in northern high-latitude ecosystems

    NASA Astrophysics Data System (ADS)

    Lund, M.; Zona, D.; Jackowicz-Korczynski, M.; Xu, X.

    2017-12-01

    The eddy covariance methodology is the primary tool for studying landscape-scale land-atmosphere exchange of greenhouse gases. Since the choice of instrumental setup and processing algorithms may influence the results, efforts within the international flux community have been made towards methodological harmonization and standardization. Performing eddy covariance measurements in high-latitude, Arctic tundra sites involves several challenges, related not only to remoteness and harsh climate conditions but also to the choice of processing algorithms. Partitioning of net ecosystem exchange (NEE) of CO2 into gross primary production (GPP) and ecosystem respiration (Reco) in the FLUXNET2015 dataset is performed using either the Nighttime or the Daytime method. These variables, GPP and Reco, are essential for calibration and validation of Earth system models. North of the Arctic Circle, the sun remains visible at local midnight for a period of time, the number of days per year with midnight sun being dependent on latitude. The absence of nighttime conditions during Arctic summers renders the Nighttime method uncertain; however, no extensive assessment of the implications for flux partitioning has yet been made. In this study, we will assess the performance and validity of both partitioning methods along a latitudinal transect of northern sites included in the FLUXNET2015 dataset. We will evaluate the partitioned flux components against model simulations using the Community Land Model (CLM). Our results will be valuable for users interested in simulating Arctic and global carbon cycling.
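
    A minimal sketch of the generic Nighttime approach is shown below: fit a temperature-response model (here Lloyd-Taylor, with commonly used reference constants) to nighttime NEE, extrapolate Reco to all time steps, and take GPP = Reco - NEE. This is not the exact FLUXNET2015 implementation.

      # Minimal sketch of the generic nighttime partitioning idea (not the exact
      # FLUXNET2015 implementation): fit a Lloyd-Taylor temperature response to
      # nighttime NEE (when GPP ~ 0, so NEE ~ Reco), extrapolate Reco to all time
      # steps, and take GPP = Reco - NEE. Reference constants are common defaults.
      import numpy as np
      from scipy.optimize import curve_fit

      def lloyd_taylor(t_air_c, r_ref, e0, t_ref=15.0, t0=-46.02):
          return r_ref * np.exp(e0 * (1.0 / (t_ref - t0) - 1.0 / (t_air_c - t0)))

      def partition_nee(nee, t_air_c, is_night):
          nee, t_air_c, is_night = map(np.asarray, (nee, t_air_c, is_night))
          popt, _ = curve_fit(lloyd_taylor, t_air_c[is_night], nee[is_night],
                              p0=(2.0, 100.0), maxfev=10000)
          reco = lloyd_taylor(t_air_c, *popt)      # ecosystem respiration, all steps
          gpp = reco - nee                         # sign convention: NEE = Reco - GPP
          return gpp, reco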

  5. Radio-metabolite analysis of carbon-11 biochemical partitioning to non-structural carbohydrates for integrated metabolism and transport studies.

    PubMed

    Babst, Benjamin A; Karve, Abhijit A; Judt, Tatjana

    2013-06-01

    Metabolism and phloem transport of carbohydrates are interactive processes, yet each is often studied in isolation from the other. Carbon-11 ((11)C) has been successfully used to study transport and allocation processes dynamically over time. There is a need for techniques to determine metabolic partitioning of newly fixed carbon that are compatible with existing non-invasive (11)C-based methodologies for the study of phloem transport. In this report, we present methods using (11)C-labeled CO2 to trace carbon partitioning to the major non-structural carbohydrates in leaves-sucrose, glucose, fructose and starch. High-performance thin-layer chromatography (HPTLC) was adapted to provide multisample throughput, raising the possibility of measuring different tissues of the same individual plant, or for screening multiple plants. An additional advantage of HPTLC was that phosphor plate imaging of radioactivity had a much higher sensitivity and broader range of sensitivity than radio-HPLC detection, allowing measurement of (11)C partitioning to starch, which was previously not possible. Because of the high specific activity of (11)C and high sensitivity of detection, our method may have additional applications in the study of rapid metabolic responses to environmental changes that occur on a time scale of minutes. The use of this method in tandem with other (11)C assays for transport dynamics and whole-plant partitioning makes a powerful combination of tools to study carbohydrate metabolism and whole-plant transport as integrated processes.

  6. Sequential detection of temporal communities by estrangement confinement.

    PubMed

    Kawadia, Vikas; Sreenivasan, Sameet

    2012-01-01

    Temporal communities are the result of a consistent partitioning of nodes across multiple snapshots of an evolving network, and they provide insights into how dense clusters in a network emerge, combine, split and decay over time. To reliably detect temporal communities we need to not only find a good community partition in a given snapshot but also ensure that it bears some similarity to the partition(s) found in the previous snapshot(s), a particularly difficult task given the extreme sensitivity of community structure yielded by current methods to changes in the network structure. Here, motivated by the inertia of inter-node relationships, we present a new measure of partition distance called estrangement, and show that constraining estrangement enables one to find meaningful temporal communities at various degrees of temporal smoothness in diverse real-world datasets. Estrangement confinement thus provides a principled approach to uncovering temporal communities in evolving networks.

  7. Compact and multiple plasmonic nanofilter based on ultra-broad stopband in partitioned semicircle or semiring stub waveguide

    NASA Astrophysics Data System (ADS)

    Zheng, Mingfei; Li, Hongjian; Chen, Zhiquan; He, Zhihui; Xu, Hui; Zhao, Mingzhuo

    2017-11-01

    We propose a compact plasmonic nanofilter in partitioned semicircle or semiring stub waveguide, and investigate the transmission characteristics of the two novel systems by using the finite-difference time-domain method. An ultra-broad stopband phenomenon is generated by partitioning a single stub into a double stub with a rectangular metal partition, which is caused by the destructive interference superposition of the reflected and transmitted waves from each stub. A tunable stopband is realized in the multiple plasmonic nanofilter by adjusting the width of the partition and the (outer) radius and inner radius of the stub, whose starting wavelength, ending wavelength, center wavelength, bandwidth and total tunable bandwidth are discussed, and specific filtering waveband and optimum structural parameter are obtained. The proposed structures realize asymmetrical stub and achieve ultra-broad stopband, and have potential applications in band-stop nanofilters and high-density plasmonic integrated optical circuits.

  8. Ocean surface partitioning strategies using ocean colour remote Sensing: A review

    NASA Astrophysics Data System (ADS)

    Krug, Lilian Anne; Platt, Trevor; Sathyendranath, Shubha; Barbosa, Ana B.

    2017-06-01

    The ocean surface is organized into regions with distinct properties reflecting the complexity of interactions between environmental forcing and biological responses. The delineation of these functional units, each with unique, homogeneous properties and underlying ecosystem structure and dynamics, can be defined as ocean surface partitioning. The main purposes and applications of ocean partitioning include the evaluation of particular marine environments; generation of more accurate satellite ocean colour products; assimilation of data into biogeochemical and climate models; and establishment of ecosystem-based management practices. This paper reviews the diverse approaches implemented for ocean surface partition into functional units, using ocean colour remote sensing (OCRS) data, including their purposes, criteria, methods and scales. OCRS offers a synoptic, high spatial-temporal resolution, multi-decadal coverage of bio-optical properties, relevant to the applications and value of ocean surface partitioning. In combination with other biotic and/or abiotic data, OCRS-derived data (e.g., chlorophyll-a, optical properties) provide a broad and varied source of information that can be analysed using different delineation methods derived from subjective, expert-based to unsupervised learning approaches (e.g., cluster, fuzzy and empirical orthogonal function analyses). Partition schemes are applied at global to mesoscale spatial coverage, with static (time-invariant) or dynamic (time-varying) representations. A case study, the highly heterogeneous area off SW Iberian Peninsula (NE Atlantic), illustrates how the selection of spatial coverage and temporal representation affects the discrimination of distinct environmental drivers of phytoplankton variability. Advances in operational oceanography and in the subject area of satellite ocean colour, including development of new sensors, algorithms and products, are among the potential benefits from extended use, scope and applications of ocean surface partitioning using OCRS.

  9. Methods for Large-Scale Nonlinear Optimization.

    DTIC Science & Technology

    1980-05-01

    Report fragment from Stanford University (Stanford, California 94305): Methods for Large-Scale Nonlinear Optimization, by Philip E. Gill, Walter Murray, Michael A. Saunders, and Margaret H. Wright. The matrix of a typical iteration can be partitioned so that B is an m x m basis matrix; this partition effectively divides the variables into three classes. The fragment also remarks on the standard of coding and documentation, and notes that a much better way of obtaining mathematical software is from a software library.

  10. Symplectic partitioned Runge-Kutta scheme for Maxwell's equations

    NASA Astrophysics Data System (ADS)

    Huang, Zhi-Xiang; Wu, Xian-Liang

    Using the symplectic partitioned Runge-Kutta (PRK) method, we construct a new scheme for approximating the solution to infinite dimensional nonseparable Hamiltonian systems of Maxwell's equations for the first time. The scheme is obtained by discretizing the Maxwell's equations in the time direction based on symplectic PRK method, and then evaluating the equation in the spatial direction with a suitable finite difference approximation. Several numerical examples are presented to verify the efficiency of the scheme.
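
    As a simpler illustration of a symplectic partitioned Runge-Kutta pair, the sketch below applies the classic Störmer-Verlet (leapfrog) scheme to a separable Hamiltonian; the paper's construction for the nonseparable Hamiltonian form of Maxwell's equations is not reproduced.

      # Simple illustration of a symplectic partitioned Runge-Kutta integrator: the
      # Stormer-Verlet (leapfrog) scheme for a separable Hamiltonian H = p^2/2 + V(q).
      # The paper constructs a PRK scheme for the nonseparable Hamiltonian form of
      # Maxwell's equations; that construction is not reproduced here.
      import numpy as np

      def stormer_verlet(q0, p0, grad_v, dt, n_steps):
          q, p = float(q0), float(p0)
          traj = [(q, p)]
          for _ in range(n_steps):
              p_half = p - 0.5 * dt * grad_v(q)      # half kick
              q = q + dt * p_half                    # full drift
              p = p_half - 0.5 * dt * grad_v(q)      # half kick
              traj.append((q, p))
          return np.array(traj)

      # Harmonic oscillator V(q) = q^2/2: energy stays bounded over long runs.
      orbit = stormer_verlet(q0=1.0, p0=0.0, grad_v=lambda q: q, dt=0.1, n_steps=1000)
      energy = 0.5 * orbit[:, 1]**2 + 0.5 * orbit[:, 0]**2
      print(energy.min(), energy.max())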

  11. Kinetic energy partition method applied to ground state helium-like atoms.

    PubMed

    Chen, Yu-Hsin; Chao, Sheng D

    2017-03-28

    We have used the recently developed kinetic energy partition (KEP) method to solve the quantum eigenvalue problems for helium-like atoms and obtain precise ground state energies and wave-functions. The key to treating properly the electron-electron (repulsive) Coulomb potential energies for the KEP method to be applied is to introduce a "negative mass" term into the partitioned kinetic energy. A Hartree-like product wave-function from the subsystem wave-functions is used to form the initial trial function, and the variational search for the optimized adiabatic parameters leads to a precise ground state energy. This new approach sheds new light on the all-important problem of solving many-electron Schrödinger equations and hopefully opens a new way to predictive quantum chemistry. The results presented here give very promising evidence that an effective one-electron model can be used to represent a many-electron system, in the spirit of density functional theory.

  12. Accurate Segmentation of Cervical Cytoplasm and Nuclei Based on Multiscale Convolutional Network and Graph Partitioning.

    PubMed

    Song, Youyi; Zhang, Ling; Chen, Siping; Ni, Dong; Lei, Baiying; Wang, Tianfu

    2015-10-01

    In this paper, a multiscale convolutional network (MSCN) and graph-partitioning-based method is proposed for accurate segmentation of cervical cytoplasm and nuclei. Specifically, deep learning via the MSCN is explored to extract scale invariant features, and then, segment regions centered at each pixel. The coarse segmentation is refined by an automated graph partitioning method based on the pretrained feature. The texture, shape, and contextual information of the target objects are learned to localize the appearance of distinctive boundary, which is also explored to generate markers to split the touching nuclei. For further refinement of the segmentation, a coarse-to-fine nucleus segmentation framework is developed. The computational complexity of the segmentation is reduced by using superpixel instead of raw pixels. Extensive experimental results demonstrate that the proposed cervical nucleus cell segmentation delivers promising results and outperforms existing methods.

  13. Wavelet compression of multichannel ECG data by enhanced set partitioning in hierarchical trees algorithm.

    PubMed

    Sharifahmadian, Ershad

    2006-01-01

    The set partitioning in hierarchical trees (SPIHT) algorithm is a very effective and computationally simple technique for image and signal compression. Here the author modifies the algorithm to provide even better performance. The enhanced set partitioning in hierarchical trees (ESPIHT) algorithm is faster than SPIHT and reduces the number of bits in the stored or transmitted bit stream. The modification is applied to the compression of multichannel ECG data, together with a specific procedure based on the modified algorithm for more efficient multichannel compression. The method was evaluated on selected records from the MIT-BIH arrhythmia database. According to the experiments, the proposed method attained significant results for the compression of multichannel ECG data, and it can also be used efficiently to compress a single signal stored over a long period.

  14. Sorption capacity of plastic debris for hydrophobic organic chemicals.

    PubMed

    Lee, Hwang; Shim, Won Joon; Kwon, Jung-Hwan

    2014-02-01

    The occurrence of microplastics (MPs) in the ocean is an emerging world-wide concern. Due to high sorption capacity of plastics for hydrophobic organic chemicals (HOCs), sorption may play an important role in the transport processes of HOCs. However, sorption capacity of various plastic materials is rarely documented except in the case of those used for environmental sampling purposes. In this study, we measured partition coefficients between MPs and seawater (KMPsw) for 8 polycyclic aromatic hydrocarbons (PAHs), 4 hexachlorocyclohexanes (HCHs) and 2 chlorinated benzenes (CBs). Three surrogate polymers - polyethylene, polypropylene, and polystyrene - were used as model plastic debris because they are the major components of microplastic debris found. Due to the limited solubility of HOCs in seawater and their long equilibration time, a third-phase partitioning method was used for the determination of KMPsw. First, partition coefficients between polydimethylsiloxane (PDMS) and seawater (KPDMSsw) were measured. For the determination of KMPsw, the distribution of HOCs between PDMS or plastics and solvent mixture (methanol:water=8:2 (v/v)) was determined after apparent equilibrium up to 12 weeks. Plastic debris was prepared in a laboratory by physical crushing; the median longest dimension was 320-440 μm. Partition coefficients between polyethylene and seawater obtained using the third-phase equilibrium method agreed well with experimental partition coefficients between low-density polyethylene and water in the literature. The values of KMPsw were generally in the order of polystyrene, polyethylene, and polypropylene for most of the chemicals tested. The ranges of log KMPsw were 2.04-7.87, 2.18-7.00, and 2.63-7.52 for polyethylene, polypropylene, and polystyrene, respectively. The partition coefficients of plastic debris can be as high as other frequently used partition coefficients, such as 1-octanol-water partition coefficients (Kow) and log KMPsw showed good linear correlations with log Kow. High sorption capacity of microplastics implies the importance of MP-associated transport of HOCs in the marine environment. © 2013 Elsevier B.V. All rights reserved.

  15. Temporal information partitioning: Characterizing synergy, uniqueness, and redundancy in interacting environmental variables

    NASA Astrophysics Data System (ADS)

    Goodwell, Allison E.; Kumar, Praveen

    2017-07-01

    Information theoretic measures can be used to identify nonlinear interactions between source and target variables through reductions in uncertainty. In information partitioning, multivariate mutual information is decomposed into synergistic, unique, and redundant components. Synergy is information shared only when sources influence a target together, uniqueness is information only provided by one source, and redundancy is overlapping shared information from multiple sources. While this partitioning has been applied to provide insights into complex dependencies, several proposed partitioning methods overestimate redundant information and omit a component of unique information because they do not account for source dependencies. Additionally, information partitioning has only been applied to time-series data in a limited context, using basic pdf estimation techniques or a Gaussian assumption. We develop a Rescaled Redundancy measure (Rs) to solve the source dependency issue, and present Gaussian, autoregressive, and chaotic test cases to demonstrate its advantages over existing techniques in the presence of noise, various source correlations, and different types of interactions. This study constitutes the first rigorous application of information partitioning to environmental time-series data, and addresses how noise, pdf estimation technique, or source dependencies can influence detected measures. We illustrate how our techniques can unravel the complex nature of forcing and feedback within an ecohydrologic system with an application to 1 min environmental signals of air temperature, relative humidity, and windspeed. The methods presented here are applicable to the study of a broad range of complex systems composed of interacting variables.
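
    As a point of reference for the decomposition itself, the sketch below implements a simple minimum-mutual-information redundancy baseline for discrete variables; it is the kind of measure the authors argue can overestimate redundancy when sources are dependent, and it is not their Rescaled Redundancy (Rs).

      # Baseline information-partitioning sketch using a minimum-mutual-information
      # redundancy for discrete variables. The paper's Rescaled Redundancy (Rs) is not
      # reproduced here; this baseline is the kind of measure the authors argue can
      # overestimate redundancy when the sources are themselves dependent.
      import numpy as np

      def mutual_info(x, y):
          """Plug-in estimate of I(X;Y) in bits for discrete arrays x, y."""
          mi = 0.0
          for xv in np.unique(x):
              for yv in np.unique(y):
                  pxy = np.mean((x == xv) & (y == yv))
                  if pxy > 0:
                      mi += pxy * np.log2(pxy / (np.mean(x == xv) * np.mean(y == yv)))
          return mi

      def partition_info(x1, x2, y):
          """Synergy/unique/redundant split with a minimum-MI redundancy baseline."""
          x1, x2, y = map(np.asarray, (x1, x2, y))
          joint = np.array([f"{a},{b}" for a, b in zip(x1, x2)])  # encode source pair
          i1, i2, i12 = mutual_info(x1, y), mutual_info(x2, y), mutual_info(joint, y)
          redundancy = min(i1, i2)              # minimum-MI proxy, not the paper's Rs
          unique1, unique2 = i1 - redundancy, i2 - redundancy
          synergy = i12 - unique1 - unique2 - redundancy
          return {"redundant": redundancy, "unique1": unique1,
                  "unique2": unique2, "synergistic": synergy}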

  16. Evidence for melt partitioning between olivine and orthopyroxene in partially molten harzburgite

    NASA Astrophysics Data System (ADS)

    Miller, K.; Zhu, W.; Montesi, L. G.; Le Roux, V.; Gaetani, G. A.

    2013-12-01

    During melting at mid-ocean ridges, melt is driven into an equilibrium, minimum-energy configuration by surface energy gradients between solid-solid and solid-liquid phase boundaries. Such a configuration, where melt is mostly restricted to three and four-grain junctions, acts as a porous medium through which melt can percolate to the surface. For a monomineralic system, melt is distributed evenly among all grains. However, in mineralogical heterogeneous systems, melt partitions unevenly between the various solid phases to minimize the total energy of the system. In a ocean ridge melting environment, where olivine is often juxtaposed against orthopyroxene (opx), lithologic partitioning is expected to turn olivine-rich regions into high-permeability conduits, through which melt can be quickly extracted, drastically increasing the permeability of the mantle [Zhu and Hirth, 2003]. Lithologic partitioning has been demonstrated in experiments using analogue systems [Watson, 1999]; however, to date, no experiment has confirmed its existence in partially molten mantle systems. We present experimental results that determine the degree of melt partitioning between olivine and opx in partially molten harzburgites. Samples were prepared from a powdered mixture of oxides and carbonates and then hot-pressed in a solid-media piston-cylinder apparatus at 1350°C and 1.5GPa [Zhu et al., 2011] to achieve an 82/18 vol. % ratio of olivine to opx. Prior to hot-pressing, basalt was added to the powdered mixtures in various proportions to test for lithologic partitioning across a range of melt fractions. Three-dimensional, 700nm-resolution images of our samples were obtained using synchrotron X-ray microtomography on the 2BM station of the Advanced Photon Source at Argonne National Labs. Image data were filtered using an anisotropic diffusion filter to enhance phase contrast and then segmented to produce binary representations of each phase. In order to quantitatively demonstrate lithologic melt partitioning in our samples, we digitally segment each grain and then fit a sample window, slightly larger than the grain, to calculate the local melt volume fraction. Our results show strong evidence for lithologic partitioning in partially molten harzburgite systems, in a ~2 to 1 ratio of local melt fraction, between olivine and opx across the range of melt fractions tested. We also present permeability, grain size, and connectivity analyses of our samples in order to evaluate the effects of melt partitioning on melt migration rates at mid-ocean ridges, as well as at other locations in the Earth where partial melting occurs. References Watson, E. B. (1999), Lithologic partitioning of fluids and melts, American Minerologist, 84, 1693-1710. Zhu, W., and G. Hirth (2003), A network model for permeability in partially molten rocks, Earth Planet. Sci. Lett., 212(3-4), 407-416, doi:10.1016/S0012-821X(03)00264-4. Zhu, W., G. A. Gaetani, F. Fusseis, L. G. J. Montési, and F. De Carlo (2011), Microtomography of partially molten rocks: three-dimensional melt distribution in mantle peridotite, Science, 332(6025), 88-91, doi:10.1126/science.1202221.

  17. Systems and methods for knowledge discovery in spatial data

    DOEpatents

    Obradovic, Zoran; Fiez, Timothy E.; Vucetic, Slobodan; Lazarevic, Aleksandar; Pokrajac, Dragoljub; Hoskinson, Reed L.

    2005-03-08

    Systems and methods are provided for knowledge discovery in spatial data as well as to systems and methods for optimizing recipes used in spatial environments such as may be found in precision agriculture. A spatial data analysis and modeling module is provided which allows users to interactively and flexibly analyze and mine spatial data. The spatial data analysis and modeling module applies spatial data mining algorithms through a number of steps. The data loading and generation module obtains or generates spatial data and allows for basic partitioning. The inspection module provides basic statistical analysis. The preprocessing module smoothes and cleans the data and allows for basic manipulation of the data. The partitioning module provides for more advanced data partitioning. The prediction module applies regression and classification algorithms on the spatial data. The integration module enhances prediction methods by combining and integrating models. The recommendation module provides the user with site-specific recommendations as to how to optimize a recipe for a spatial environment such as a fertilizer recipe for an agricultural field.

  18. DEVELOPMENT AND APPLICATION OF EQUILIBRIUM PARTITIONING SEDIMENT GUIDELINES IN THE ASSESSMENT OF SEDIMENT PAH CONTAMINATION

    EPA Science Inventory

    The U.S. Environmental Protection Agency used insights and methods from its water quality criteria program to develop ESGs. The discovery that freely-dissolved contaminants were the toxic form led to equilibrium partitioning being chosen to model the distribution of contaminants...

  19. Impact of water use efficiency parameterization on partitioning evapotranspiration with the eddy covariance flux variance method

    USDA-ARS?s Scientific Manuscript database

    Partitioned observations of evapotranspiration (ET) into its constituent components of soil and canopy evaporation (E) and plant transpiration (T) are needed to validate many agricultural water use models. E and T observations are also useful for assessing management practices to reduce crop water ...

  20. Alternative approaches to mixed conifer forest restoration: partitioning the competitive neighborhood

    Treesearch

    Michael I. Premer; Sophan Chhin; Jianwei Zhang

    2017-01-01

    Forest restoration efforts in the intermountain west of North America generally seek to promote the continuation of pine dominance, enhance wildlife habitat, and decrease hazardous fuels, thereby mitigating catastrophic losses from various stressors and disturbances. We propose a method of focal tree release thinning that partitions the...

  1. Coupling Surfactants/Cosolvents with Oxidants: Effects on Site Characterization and DNAPL Remediation

    NASA Astrophysics Data System (ADS)

    Dugan, P. J.; Siegrist, R. L.; Crimi, M. L.

    2004-12-01

    Within the last decade, surfactant-enhanced aquifer remediation (SEAR), and more recently, in-situ chemical oxidation (ISCO) show promise for remediation of dense nonaqueous phase liquid (DNAPL) contamination in the subsurface. DNAPL removal is typically difficult to achieve with one remedial technique; however, coupling of treatments can be a highly effective method for remediation of DNAPL contamination. Little research has been completed to date to evaluate such coupling and the factors that impact appropriate engineering design and remediation performance assessment. Partitioning tracer tests (PTTs) are a promising method for estimating the volume and distribution of DNAPL. PTTs have several useful purposes: locating subsurface DNAPL zones, estimating NAPL saturation or volume within these contaminated zones, and providing a quantitative and qualitative means of assessing remediation performance. PTT theory permits direct calculation of the NAPL saturation from the chromatographic separation of a tracer pulse consisting of suites of partitioning and non-partitioning tracers that travel with the advecting groundwater. The PTT has been used with limited success after surfactant/cosolvent recovery but has not been assessed as a performance assessment tool after ISCO. There are several factors that could potentially impact the feasibility of the PTT after ISCO. First, previous batch experiments indicate that partitioning tracers degrade in the presence of the oxidant potassium permanganate. Secondly, tracer partitioning could be inhibited by manganese dioxide film formation after chemical oxidation of DNAPL. Both of these factors have potential to influence partitioning tracer transport, which could lead to inaccurate estimates of the post-remediation NAPL saturation, and therefore remediation efficiency. There is a need for researching PTTs after surfactant/cosolvent coupling with ISCO. In general, DNAPL-zone characterization methods have significant uncertainty, and assessing remediation efficiency is difficult. Effluent concentrations can be monitored in the extraction fluid during surfactant/cosolvent flushing, as an independent measure of mass removed. However, a challenge with ISCO in terms of performance assessment is that there is no way to directly measure mass destroyed, except through post-remediation characterization (i.e., PTTs or soil cores). Column and 2-D cell studies were conducted to investigate removal of DNAPL with surfactant/cosolvent flushing coupled with ISCO using the oxidant potassium permanganate. Partitioning and non-partitioning tracers were used in the pre- and post-remediation studies to investigate the effect of these remedial techniques on the viability of PTT.
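
    In standard PTT analysis, the retardation of a partitioning tracer relative to a non-partitioning tracer yields the NAPL saturation via S_N = (R - 1)/(R - 1 + K_N); the sketch below applies that relation, taking R simply as a ratio of mean travel times rather than from full breakthrough-curve moments.

      # Sketch of the standard partitioning-tracer-test (PTT) estimate of NAPL
      # saturation: the retardation factor R of a partitioning tracer relative to a
      # non-partitioning tracer, together with its NAPL-water partition coefficient
      # K_N, gives S_N = (R - 1) / (R - 1 + K_N). Here R is taken simply as the ratio
      # of mean tracer travel times; moment analysis of full breakthrough curves is
      # the more careful route and is not shown.
      def napl_saturation(t_partitioning, t_nonpartitioning, k_n):
          retardation = t_partitioning / t_nonpartitioning
          return (retardation - 1.0) / (retardation - 1.0 + k_n)

      # Example: partitioning tracer arrives at 36 h vs 24 h for the conservative
      # tracer, with K_N = 5 for the tracer/NAPL pair.
      print(round(napl_saturation(36.0, 24.0, 5.0), 3))  # ~0.091, i.e. 9.1% saturation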

  2. A New Model for Optimal Mechanical and Thermal Performance of Cement-Based Partition Wall

    PubMed Central

    Huang, Shiping; Hu, Mengyu; Cui, Nannan; Wang, Weifeng

    2018-01-01

    The prefabricated cement-based partition wall has been widely used in assembled buildings because of its high manufacturing efficiency, high-quality surface, and simple and convenient construction process. In this paper, a general porous partition wall that is made from cement-based materials was proposed to meet the optimal mechanical and thermal performance during transportation, construction and its service life. The porosity of the proposed partition wall is formed by elliptic-cylinder-type cavities. The finite element method was used to investigate the mechanical and thermal behaviour, which shows that the proposed model has distinct advantages over the current partition wall that is used in the building industry. It is found that, by controlling the eccentricity of the elliptic-cylinder cavities, the proposed wall stiffness can be adjusted to respond to the imposed loads and to improve the thermal performance, which can be used for the optimum design. Finally, design guidance is provided to obtain the optimal mechanical and thermal performance. The proposed model could be used as a promising candidate for partition wall in the building industry. PMID:29673176

  3. A New Model for Optimal Mechanical and Thermal Performance of Cement-Based Partition Wall.

    PubMed

    Huang, Shiping; Hu, Mengyu; Huang, Yonghui; Cui, Nannan; Wang, Weifeng

    2018-04-17

    The prefabricated cement-based partition wall has been widely used in assembled buildings because of its high manufacturing efficiency, high-quality surface, and simple and convenient construction process. In this paper, a general porous partition wall that is made from cement-based materials was proposed to meet the optimal mechanical and thermal performance during transportation, construction and its service life. The porosity of the proposed partition wall is formed by elliptic-cylinder-type cavities. The finite element method was used to investigate the mechanical and thermal behaviour, which shows that the proposed model has distinct advantages over the current partition wall that is used in the building industry. It is found that, by controlling the eccentricity of the elliptic-cylinder cavities, the proposed wall stiffness can be adjusted to respond to the imposed loads and to improve the thermal performance, which can be used for the optimum design. Finally, design guidance is provided to obtain the optimal mechanical and thermal performance. The proposed model could be used as a promising candidate for partition wall in the building industry.

  4. Scheduling applications for execution on a plurality of compute nodes of a parallel computer to manage temperature of the nodes during execution

    DOEpatents

    Archer, Charles J; Blocksome, Michael A; Peters, Amanda E; Ratterman, Joseph D; Smith, Brian E

    2012-10-16

    Methods, apparatus, and products are disclosed for scheduling applications for execution on a plurality of compute nodes of a parallel computer to manage temperature of the plurality of compute nodes during execution that include: identifying one or more applications for execution on the plurality of compute nodes; creating a plurality of physically discontiguous node partitions in dependence upon temperature characteristics for the compute nodes and a physical topology for the compute nodes, each discontiguous node partition specifying a collection of physically adjacent compute nodes; and assigning, for each application, that application to one or more of the discontiguous node partitions for execution on the compute nodes specified by the assigned discontiguous node partitions.

  5. A comparison of latent class, K-means, and K-median methods for clustering dichotomous data.

    PubMed

    Brusco, Michael J; Shireman, Emilie; Steinley, Douglas

    2017-09-01

    The problem of partitioning a collection of objects based on their measurements on a set of dichotomous variables is a well-established problem in psychological research, with applications including clinical diagnosis, educational testing, cognitive categorization, and choice analysis. Latent class analysis and K-means clustering are popular methods for partitioning objects based on dichotomous measures in the psychological literature. The K-median clustering method has recently been touted as a potentially useful tool for psychological data and might be preferable to its close neighbor, K-means, when the variable measures are dichotomous. We conducted simulation-based comparisons of the latent class, K-means, and K-median approaches for partitioning dichotomous data. Although all 3 methods proved capable of recovering cluster structure, K-median clustering yielded the best average performance, followed closely by latent class analysis. We also report results for the 3 methods within the context of an application to transitive reasoning data, in which it was found that the 3 approaches can exhibit profound differences when applied to real data. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
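
    A compact sketch of K-median clustering for dichotomous data is given below, using L1 distance and coordinate-wise medians (majority votes for 0/1 data, with ties giving 0.5); it shows the generic algorithm, not the paper's simulation design.

      # Compact sketch of K-median clustering for dichotomous (0/1) data: L1 distance
      # and coordinate-wise medians, which for binary variables amount to majority
      # votes. This is the generic algorithm, not the paper's simulation design.
      import numpy as np

      def k_median_binary(x, k, n_iter=50, seed=0):
          rng = np.random.default_rng(seed)
          x = np.asarray(x, dtype=float)
          centers = x[rng.choice(len(x), size=k, replace=False)]
          for _ in range(n_iter):
              dist = np.abs(x[:, None, :] - centers[None, :, :]).sum(axis=2)  # L1
              labels = dist.argmin(axis=1)
              new_centers = np.array([
                  np.median(x[labels == j], axis=0) if np.any(labels == j) else centers[j]
                  for j in range(k)])
              if np.array_equal(new_centers, centers):
                  break
              centers = new_centers
          return labels, centers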

  6. Distributed State Estimation Using a Modified Partitioned Moving Horizon Strategy for Power Systems.

    PubMed

    Chen, Tengpeng; Foo, Yi Shyh Eddy; Ling, K V; Chen, Xuebing

    2017-10-11

    In this paper, a distributed state estimation method based on moving horizon estimation (MHE) is proposed for the large-scale power system state estimation. The proposed method partitions the power systems into several local areas with non-overlapping states. Unlike the centralized approach where all measurements are sent to a processing center, the proposed method distributes the state estimation task to the local processing centers where local measurements are collected. Inspired by the partitioned moving horizon estimation (PMHE) algorithm, each local area solves a smaller optimization problem to estimate its own local states by using local measurements and estimated results from its neighboring areas. In contrast with PMHE, the error from the process model is ignored in our method. The proposed modified PMHE (mPMHE) approach can also take constraints on states into account during the optimization process such that the influence of the outliers can be further mitigated. Simulation results on the IEEE 14-bus and 118-bus systems verify that our method achieves comparable state estimation accuracy but with a significant reduction in the overall computation load.

  7. Partitioning Algorithms for Simultaneously Balancing Iterative and Direct Methods

    DTIC Science & Technology

    2004-03-03

    The load imbalance is defined as the ratio of the highest partition weight over the average partition weight; it is the constraint that must be satisfied while the cut objective is minimized, and the initial partitioning can subsequently be improved [16, 19, 20]. The problem definition considers a graph with a given set of vertices.
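
    The two quantities named in the fragment are easy to state in code: the load imbalance of a partitioning (highest partition weight over the average) and the edge cut (weight of edges whose endpoints land in different partitions). The data layout below is an assumption for illustration.

      # Small sketch of the two quantities named in the fragment: the load imbalance
      # of a partitioning (highest partition weight over the average) and the edge
      # cut (weight of edges whose endpoints land in different partitions). Names and
      # data layout are assumptions for this example.
      from collections import defaultdict

      def load_imbalance(part_of, weight_of):
          totals = defaultdict(float)
          for v, p in part_of.items():
              totals[p] += weight_of.get(v, 1.0)
          avg = sum(totals.values()) / len(totals)
          return max(totals.values()) / avg

      def edge_cut(edges, part_of):
          return sum(w for u, v, w in edges if part_of[u] != part_of[v])

      parts = {"a": 0, "b": 0, "c": 1, "d": 1}
      print(load_imbalance(parts, {"a": 2.0, "b": 1.0, "c": 1.0, "d": 1.0}))  # 1.2
      print(edge_cut([("a", "b", 1.0), ("b", "c", 2.0), ("c", "d", 1.0)], parts))  # 2.0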

  8. Low-derivative operators of the Standard Model effective field theory via Hilbert series methods

    NASA Astrophysics Data System (ADS)

    Lehman, Landon; Martin, Adam

    2016-02-01

    In this work, we explore an extension of Hilbert series techniques to count operators that include derivatives. For sufficiently low-derivative operators, we conjecture an algorithm that gives the number of invariant operators, properly accounting for redundancies due to the equations of motion and integration by parts. Specifically, the conjectured technique can be applied whenever there is only one Lorentz invariant for a given partitioning of derivatives among the fields. At higher numbers of derivatives, equation of motion redundancies can be removed, but the increased number of Lorentz contractions spoils the subtraction of integration by parts redundancies. While restricted, this technique is sufficient to automatically recreate the complete set of invariant operators of the Standard Model effective field theory for dimensions 6 and 7 (for arbitrary numbers of flavors). At dimension 8, the algorithm does not automatically generate the complete operator set; however, it suffices for all but five classes of operators. For these remaining classes, there is a well defined procedure to manually determine the number of invariants. Assuming our method is correct, we derive a set of 535 dimension-8 N f = 1 operators.

  9. Technical evaluation report of the Specialists Meeting on Characterization of Low Cycle High Temperature Fatigue by the Strainrange Partitioning Method

    NASA Technical Reports Server (NTRS)

    Drapier, J. M.; Hirschberg, M. H.

    1979-01-01

    The ability of the Strainrange Partitioning (SRP) method to correlate the creep-fatigue behavior of gas turbine materials and to predict the creep-fatigue life of laboratory specimens subjected to complex cycling conditions was evaluated. A reference body of high-temperature creep-fatigue data was provided that can be used in the evaluation of other SRP and low-cycle, high-temperature fatigue prediction techniques.

  10. Implementation of spectral clustering on microarray data of carcinoma using k-means algorithm

    NASA Astrophysics Data System (ADS)

    Frisca, Bustamam, Alhadi; Siswantining, Titin

    2017-03-01

    Clustering is a data analysis method that aims to place data with similar characteristics in the same group. Spectral clustering is one of the most popular modern clustering algorithms; as an effective clustering technique, it emerged from the concepts of spectral graph theory. The spectral clustering method needs a partitioning algorithm, and several are available, including PAM, SOM, fuzzy c-means, and k-means. Based on research by Capital and Choudhury in 2013, the k-means algorithm with Euclidean distance provides better accuracy than the PAM algorithm, so in this paper we use k-means as our partitioning algorithm. The major advantage of spectral clustering is its dimension reduction, which is especially useful for reducing the dimension of a large microarray dataset. A microarray is a small chip made of a glass plate containing thousands or even tens of thousands of genes in DNA fragments derived from doubled cDNA. Microarray data are widely used to detect cancer, for example carcinoma, in which cancer cells express abnormalities in their genes. The purpose of this research is to group data with high mutual similarity together and to separate data with low similarity. In this research, the carcinoma microarray data comprise 7457 genes. Partitioning with the k-means algorithm yields two clusters.
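
    A minimal sketch of spectral clustering with a k-means partitioning step is shown below: build an RBF affinity, form the symmetric normalized Laplacian, embed the samples in its leading eigenvectors, and run k-means on the embedding. The parameter values are illustrative and not tuned to the carcinoma data.

      # Minimal sketch of spectral clustering with a k-means partitioning step:
      # build an RBF affinity, form the symmetric normalized Laplacian, embed the
      # samples in its leading eigenvectors, and run k-means on the embedding.
      # Parameter values (gamma, k) are illustrative, not tuned to the carcinoma data.
      import numpy as np
      from sklearn.cluster import KMeans

      def spectral_clustering(x, k=2, gamma=0.5):
          x = np.asarray(x, dtype=float)
          sq_dist = ((x[:, None, :] - x[None, :, :]) ** 2).sum(axis=2)
          affinity = np.exp(-gamma * sq_dist)                    # RBF similarity graph
          d_inv_sqrt = 1.0 / np.sqrt(affinity.sum(axis=1))
          lap = np.eye(len(x)) - d_inv_sqrt[:, None] * affinity * d_inv_sqrt[None, :]
          eigvals, eigvecs = np.linalg.eigh(lap)
          embedding = eigvecs[:, :k]                             # smallest eigenvectors
          embedding /= np.linalg.norm(embedding, axis=1, keepdims=True)
          return KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(embedding)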

  11. Hierarchical Modeling and Robust Synthesis for the Preliminary Design of Large Scale Complex Systems

    NASA Technical Reports Server (NTRS)

    Koch, Patrick N.

    1997-01-01

    Large-scale complex systems are characterized by multiple interacting subsystems and the analysis of multiple disciplines. The design and development of such systems inevitably requires the resolution of multiple conflicting objectives. The size of complex systems, however, prohibits the development of comprehensive system models, and thus these systems must be partitioned into their constituent parts. Because simultaneous solution of individual subsystem models is often not manageable iteration is inevitable and often excessive. In this dissertation these issues are addressed through the development of a method for hierarchical robust preliminary design exploration to facilitate concurrent system and subsystem design exploration, for the concurrent generation of robust system and subsystem specifications for the preliminary design of multi-level, multi-objective, large-scale complex systems. This method is developed through the integration and expansion of current design techniques: Hierarchical partitioning and modeling techniques for partitioning large-scale complex systems into more tractable parts, and allowing integration of subproblems for system synthesis; Statistical experimentation and approximation techniques for increasing both the efficiency and the comprehensiveness of preliminary design exploration; and Noise modeling techniques for implementing robust preliminary design when approximate models are employed. Hierarchical partitioning and modeling techniques including intermediate responses, linking variables, and compatibility constraints are incorporated within a hierarchical compromise decision support problem formulation for synthesizing subproblem solutions for a partitioned system. Experimentation and approximation techniques are employed for concurrent investigations and modeling of partitioned subproblems. A modified composite experiment is introduced for fitting better predictive models across the ranges of the factors, and an approach for constructing partitioned response surfaces is developed to reduce the computational expense of experimentation for fitting models in a large number of factors. Noise modeling techniques are compared and recommendations are offered for the implementation of robust design when approximate models are sought. These techniques, approaches, and recommendations are incorporated within the method developed for hierarchical robust preliminary design exploration. This method as well as the associated approaches are illustrated through their application to the preliminary design of a commercial turbofan turbine propulsion system. The case study is developed in collaboration with Allison Engine Company, Rolls Royce Aerospace, and is based on the Allison AE3007 existing engine designed for midsize commercial, regional business jets. For this case study, the turbofan system-level problem is partitioned into engine cycle design and configuration design and a compressor modules integrated for more detailed subsystem-level design exploration, improving system evaluation. The fan and low pressure turbine subsystems are also modeled, but in less detail. Given the defined partitioning, these subproblems are investigated independently and concurrently, and response surface models are constructed to approximate the responses of each. These response models are then incorporated within a commercial turbofan hierarchical compromise decision support problem formulation. Five design scenarios are investigated, and robust solutions are identified. 
The method and solutions identified are verified by comparison with the AE3007 engine. The solutions obtained are similar to the AE3007 cycle and configuration, but are better with respect to many of the requirements.

  12. Confocal Raman Microscopy for In-situ Measurement of Phospholipid-Water Partitioning into Model Phospholipid Bilayers within Individual Chromatographic Particles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kitt, Jay P.; Bryce, David A.; Minteer, Shelley D.

    The phospholipid-water partition coefficient is a commonly measured parameter that correlates with drug efficacy, small-molecule toxicity, and accumulation of molecules in biological systems in the environment. Despite the utility of this parameter, methods for measuring phospholipid-water partition coefficients are limited. This is due to the difficulty of making quantitative measurements in vesicle membranes or supported phospholipid bilayers, both of which are small-volume phases that challenge the sensitivity of many analytical techniques. In this paper, we employ in-situ confocal Raman microscopy to probe the partitioning of a model membrane-active compound, 2-(4-isobutylphenyl) propionic acid or ibuprofen, into both hybrid- and supported-phospholipid bilayers deposited on the pore walls of individual chromatographic particles. The large surface-area-to-volume ratio of chromatographic silica allows interrogation of a significant lipid bilayer area within a very small volume. The local phospholipid concentration within a confocal probe volume inside the particle can be as high as 0.5 M, which overcomes the sensitivity limitations of making measurements in the limited membrane areas of single vesicles or planar supported bilayers. Quantitative determination of ibuprofen partitioning is achieved by using the phospholipid acyl-chains of the within-particle bilayer as an internal standard. This approach is tested for measurements of pH-dependent partitioning of ibuprofen into both hybrid-lipid and supported-lipid bilayers within silica particles, and the results are compared with octanol-water partitioning and with partitioning into individual optically-trapped phospholipid vesicle membranes. Additionally, the impact of ibuprofen partitioning on bilayer structure is evaluated for both within-particle model membranes and compared with the structural impacts of partitioning into vesicle lipid bilayers.

  13. Confocal Raman Microscopy for In-situ Measurement of Phospholipid-Water Partitioning into Model Phospholipid Bilayers within Individual Chromatographic Particles

    DOE PAGES

    Kitt, Jay P.; Bryce, David A.; Minteer, Shelley D.; ...

    2018-05-14

    The phospholipid-water partition coefficient is a commonly measured parameter that correlates with drug efficacy, small-molecule toxicity, and accumulation of molecules in biological systems in the environment. Despite the utility of this parameter, methods for measuring phospholipid-water partition coefficients are limited. This is due to the difficulty of making quantitative measurements in vesicle membranes or supported phospholipid bilayers, both of which are small-volume phases that challenge the sensitivity of many analytical techniques. In this paper, we employ in-situ confocal Raman microscopy to probe the partitioning of a model membrane-active compound, 2-(4-isobutylphenyl) propionic acid or ibuprofen, into both hybrid- and supported-phospholipid bilayers deposited on the pore walls of individual chromatographic particles. The large surface-area-to-volume ratio of chromatographic silica allows interrogation of a significant lipid bilayer area within a very small volume. The local phospholipid concentration within a confocal probe volume inside the particle can be as high as 0.5 M, which overcomes the sensitivity limitations of making measurements in the limited membrane areas of single vesicles or planar supported bilayers. Quantitative determination of ibuprofen partitioning is achieved by using the phospholipid acyl-chains of the within-particle bilayer as an internal standard. This approach is tested for measurements of pH-dependent partitioning of ibuprofen into both hybrid-lipid and supported-lipid bilayers within silica particles, and the results are compared with octanol-water partitioning and with partitioning into individual optically-trapped phospholipid vesicle membranes. Additionally, the impact of ibuprofen partitioning on bilayer structure is evaluated for both within-particle model membranes and compared with the structural impacts of partitioning into vesicle lipid bilayers.

  14. Confocal Raman Microscopy for in Situ Measurement of Phospholipid-Water Partitioning into Model Phospholipid Bilayers within Individual Chromatographic Particles.

    PubMed

    Kitt, Jay P; Bryce, David A; Minteer, Shelley D; Harris, Joel M

    2018-06-05

    The phospholipid-water partition coefficient is a commonly measured parameter that correlates with drug efficacy, small-molecule toxicity, and accumulation of molecules in biological systems in the environment. Despite the utility of this parameter, methods for measuring phospholipid-water partition coefficients are limited. This is due to the difficulty of making quantitative measurements in vesicle membranes or supported phospholipid bilayers, both of which are small-volume phases that challenge the sensitivity of many analytical techniques. In this work, we employ in situ confocal Raman microscopy to probe the partitioning of a model membrane-active compound, 2-(4-isobutylphenyl) propionic acid or ibuprofen, into both hybrid- and supported-phospholipid bilayers deposited on the pore walls of individual chromatographic particles. The large surface-area-to-volume ratio of chromatographic silica allows interrogation of a significant lipid bilayer area within a very small volume. The local phospholipid concentration within a confocal probe volume inside the particle can be as high as 0.5 M, which overcomes the sensitivity limitations of making measurements in the limited membrane areas of single vesicles or planar supported bilayers. Quantitative determination of ibuprofen partitioning is achieved by using the phospholipid acyl-chains of the within-particle bilayer as an internal standard. This approach is tested for measurements of pH-dependent partitioning of ibuprofen into both hybrid-lipid and supported-lipid bilayers within silica particles, and the results are compared with octanol-water partitioning and with partitioning into individual optically trapped phospholipid vesicle membranes. Additionally, the impact of ibuprofen partitioning on bilayer structure is evaluated for both within-particle model membranes and compared with the structural impacts of partitioning into vesicle lipid bilayers.
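
    The internal-standard idea in this record can be sketched numerically. The code below is a minimal illustration, not the authors' procedure: the function name, the calibration ratio, and all numerical values are hypothetical, and the partition coefficient is taken simply as the ratio of membrane-phase to aqueous-phase solute concentrations.

        # Minimal sketch (hypothetical names and numbers, not the authors' code):
        # apparent bilayer/water partition coefficient from Raman band areas, using
        # the lipid acyl-chain band as an internal standard.

        def apparent_partition_coefficient(area_solute, area_acyl, calib_ratio,
                                           conc_lipid_molar, conc_aqueous_molar):
            """K_app = C_membrane / C_aqueous.

            calib_ratio is the relative Raman response (solute band per mole versus
            acyl band per mole of lipid) determined from separate standards.
            """
            mole_ratio = (area_solute / area_acyl) / calib_ratio   # mol solute per mol lipid
            conc_membrane = mole_ratio * conc_lipid_molar          # solute concentration in the lipid phase (M)
            return conc_membrane / conc_aqueous_molar

        # Example with made-up values
        print(apparent_partition_coefficient(0.12, 1.0, 0.35, 0.5, 1e-4))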

  15. Partition functions with spin in AdS2 via quasinormal mode methods

    DOE PAGES

    Keeler, Cynthia; Lisbão, Pedro; Ng, Gim Seng

    2016-10-12

    We extend the results of [1], computing one loop partition functions for massive fields with spin half in AdS2 using the quasinormal mode method proposed by Denef, Hartnoll, and Sachdev [2]. We find the finite representations of SO(2,1) for spin zero and spin half, consisting of a highest weight state |h⟩ and descendants with non-unitary values of h. These finite representations capture the poles and zeroes of the one loop determinants. Together with the asymptotic behavior of the partition functions (which can be easily computed using a large mass heat kernel expansion), these are sufficient to determine the full answer for the one loop determinants. We also discuss extensions to higher dimensional AdS2n and higher spins.

  16. An assessment of the liquid-gas partitioning behavior of major wastewater odorants using two comparative experimental approaches: liquid sample-based vaporization vs. impinger-based dynamic headspace extraction into sorbent tubes.

    PubMed

    Iqbal, Mohammad Asif; Kim, Ki-Hyun; Szulejko, Jan E; Cho, Jinwoo

    2014-01-01

    The gas-liquid partitioning behavior of major odorants (acetic acid, propionic acid, isobutyric acid, n-butyric acid, i-valeric acid, n-valeric acid, hexanoic acid, phenol, p-cresol, indole, skatole, and toluene (as a reference)) commonly found in microbially digested wastewaters was investigated by two experimental approaches. Firstly, a simple vaporization method was applied to measure the target odorants dissolved in liquid samples with the aid of sorbent tube/thermal desorption/gas chromatography/mass spectrometry. As an alternative method, an impinger-based dynamic headspace sampling method was also explored to measure the partitioning of target odorants between the gas and liquid phases with the same detection system. The relative extraction efficiency (in percent) of the odorants by dynamic headspace sampling was estimated against the calibration results derived by the vaporization method. Finally, the concentrations of the major odorants in real digested wastewater samples were also analyzed using both analytical approaches. Through a parallel application of the two experimental methods, we sought to develop an experimental approach capable of assessing the liquid-to-gas phase partitioning behavior of major odorants in a complex wastewater system. The relative sensitivity of the two methods, expressed as the ratio of response factors (RFvap/RFimp) between the vaporization-based and impinger-based liquid standard calibrations, varied widely from 981 (skatole) to 6,022 (acetic acid). Comparison of this relative sensitivity thus highlights the rather low extraction efficiency of the highly soluble and more acidic odorants from wastewater samples in dynamic headspace sampling.
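
    The sensitivity comparison reported above reduces to a ratio of response factors. The sketch below is only an illustration with made-up numbers; a response factor is assumed to be peak area per analyte mass, and the relative extraction efficiency is taken as RFimp/RFvap.

        # Illustration with made-up numbers: response factor (RF) = peak area per
        # analyte mass; the relative extraction efficiency of the impinger-based
        # method is estimated against the vaporization-based calibration.

        def response_factor(peak_area, analyte_mass_ng):
            return peak_area / analyte_mass_ng

        rf_vap = response_factor(peak_area=6.0e5, analyte_mass_ng=10.0)  # vaporization calibration
        rf_imp = response_factor(peak_area=1.2e2, analyte_mass_ng=10.0)  # impinger-based calibration

        sensitivity_ratio = rf_vap / rf_imp              # analogous to the reported RFvap/RFimp values
        extraction_efficiency_pct = 100.0 * rf_imp / rf_vap

        print(f"RFvap/RFimp = {sensitivity_ratio:.0f}, extraction efficiency = {extraction_efficiency_pct:.3f}%")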

  17. Quantitative analysis of molecular partition towards lipid membranes using surface plasmon resonance

    NASA Astrophysics Data System (ADS)

    Figueira, Tiago N.; Freire, João M.; Cunha-Santos, Catarina; Heras, Montserrat; Gonçalves, João; Moscona, Anne; Porotto, Matteo; Salomé Veiga, Ana; Castanho, Miguel A. R. B.

    2017-03-01

    Understanding the interplay between molecules and lipid membranes is fundamental when studying cellular and biotechnological phenomena. Partition between aqueous media and lipid membranes is key to the mechanism of action of many biomolecules and drugs. Quantifying membrane partition, through adequate and robust parameters, is thus essential. Surface Plasmon Resonance (SPR) is a powerful technique for studying 1:1 stoichiometric interactions but has limited application to lipid membrane partition data. We have developed and applied a novel mathematical model for SPR data treatment that enables determination of kinetic and equilibrium partition constants. The method uses two complementary fitting models for association and dissociation sensorgram data. The SPR partition data obtained for the antibody fragment F63, the HIV fusion inhibitor enfuvirtide, and the endogenous drug kyotorphin towards POPC membranes were compared against data from independent techniques. The comprehensive kinetic and partition models were applied to the membrane interaction data of HRC4, a measles virus entry inhibitor peptide, revealing its increased affinity for, and retention in, cholesterol-rich membranes. Overall, our work extends the application of SPR beyond the realm of 1:1 stoichiometric ligand-receptor binding into a new and immense field of applications: the interaction of solutes such as biomolecules and drugs with lipids.

  18. Handling Data Skew in MapReduce Cluster by Using Partition Tuning

    PubMed

    Gao, Yufei; Zhou, Yanjie; Zhou, Bing; Shi, Lei; Zhang, Jiacai

    2017-01-01

    The healthcare industry has generated large amounts of data, and analyzing these has emerged as an important problem in recent years. The MapReduce programming model has been successfully used for big data analytics. However, data skew invariably occurs in big data analytics and seriously affects efficiency. To overcome the data skew problem in MapReduce, we have in the past proposed a data processing algorithm called Partition Tuning-based Skew Handling (PTSH). In comparison with the one-stage partitioning strategy used in the traditional MapReduce model, PTSH uses a two-stage strategy and the partition tuning method to disperse key-value pairs in virtual partitions and recombines each partition in case of data skew. The robustness and efficiency of the proposed algorithm were tested on a wide variety of simulated datasets and real healthcare datasets. The results showed that the PTSH algorithm can handle data skew in MapReduce efficiently and improve the performance of MapReduce jobs in comparison with the native Hadoop, Closer, and locality-aware and fairness-aware key partitioning (LEEN). We also found that the time needed for rule extraction can be reduced significantly by adopting the PTSH algorithm, since it is more suitable for association rule mining (ARM) on healthcare data. © 2017 Yufei Gao et al.

  19. Handling Data Skew in MapReduce Cluster by Using Partition Tuning.

    PubMed

    Gao, Yufei; Zhou, Yanjie; Zhou, Bing; Shi, Lei; Zhang, Jiacai

    2017-01-01

    The healthcare industry has generated large amounts of data, and analyzing these has emerged as an important problem in recent years. The MapReduce programming model has been successfully used for big data analytics. However, data skew invariably occurs in big data analytics and seriously affects efficiency. To overcome the data skew problem in MapReduce, we have in the past proposed a data processing algorithm called Partition Tuning-based Skew Handling (PTSH). In comparison with the one-stage partitioning strategy used in the traditional MapReduce model, PTSH uses a two-stage strategy and the partition tuning method to disperse key-value pairs in virtual partitions and recombines each partition in case of data skew. The robustness and efficiency of the proposed algorithm were tested on a wide variety of simulated datasets and real healthcare datasets. The results showed that the PTSH algorithm can handle data skew in MapReduce efficiently and improve the performance of MapReduce jobs in comparison with the native Hadoop, Closer, and locality-aware and fairness-aware key partitioning (LEEN). We also found that the time needed for rule extraction can be reduced significantly by adopting the PTSH algorithm, since it is more suitable for association rule mining (ARM) on healthcare data.

  20. Handling Data Skew in MapReduce Cluster by Using Partition Tuning

    PubMed Central

    Zhou, Yanjie; Zhou, Bing; Shi, Lei

    2017-01-01

    The healthcare industry has generated large amounts of data, and analyzing these has emerged as an important problem in recent years. The MapReduce programming model has been successfully used for big data analytics. However, data skew invariably occurs in big data analytics and seriously affects efficiency. To overcome the data skew problem in MapReduce, we have in the past proposed a data processing algorithm called Partition Tuning-based Skew Handling (PTSH). In comparison with the one-stage partitioning strategy used in the traditional MapReduce model, PTSH uses a two-stage strategy and the partition tuning method to disperse key-value pairs in virtual partitions and recombines each partition in case of data skew. The robustness and efficiency of the proposed algorithm were tested on a wide variety of simulated datasets and real healthcare datasets. The results showed that the PTSH algorithm can handle data skew in MapReduce efficiently and improve the performance of MapReduce jobs in comparison with the native Hadoop, Closer, and locality-aware and fairness-aware key partitioning (LEEN). We also found that the time needed for rule extraction can be reduced significantly by adopting the PTSH algorithm, since it is more suitable for association rule mining (ARM) on healthcare data. PMID:29065568
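
    The two-stage idea behind PTSH, dispersing key-value pairs over many virtual partitions and then recombining them to balance reducer load, can be sketched as follows. This is not the authors' implementation: the hash-based dispersion, the greedy largest-first packing, and all parameters are assumptions, and whole keys are kept together here even though the paper's tuning strategy differs in detail.

        # Minimal sketch of two-stage partition tuning (not the PTSH authors' code):
        # stage 1 hashes keys into many virtual partitions; stage 2 greedily packs
        # the virtual partitions onto reducers so their loads stay balanced under skew.
        from collections import defaultdict
        import heapq

        def virtual_partition(pairs, num_virtual):
            """Stage 1: disperse key-value pairs into num_virtual virtual partitions."""
            parts = defaultdict(list)
            for key, value in pairs:
                parts[hash(key) % num_virtual].append((key, value))
            return parts

        def recombine(parts, num_reducers):
            """Stage 2: assign virtual partitions (largest first) to the least-loaded reducer."""
            heap = [(0, r, []) for r in range(num_reducers)]   # (load, reducer id, assigned partition ids)
            heapq.heapify(heap)
            for pid, items in sorted(parts.items(), key=lambda kv: len(kv[1]), reverse=True):
                load, rid, assigned = heapq.heappop(heap)
                assigned.append(pid)
                heapq.heappush(heap, (load + len(items), rid, assigned))
            return {rid: assigned for load, rid, assigned in heap}

        # Skewed toy data: key "a" dominates
        pairs = [("a", 1)] * 90 + [("b", 1)] * 5 + [("c", 1)] * 5
        plan = recombine(virtual_partition(pairs, num_virtual=12), num_reducers=3)
        print(plan)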

  1. Partitioning Behavior of Petrodiesel/Biodiesel Blends in Water

    EPA Science Inventory

    The partitioning behavior of six petrodiesel/soybean-biodiesel blends (B0, B20, B40, B60, B80, and B100, where B100 is 100% unblended biodiesel) in water was investigated at various oil loads by the 10-fold dilution method. Five fatty acid methyl esters (FAMEs), C10 - C20 n

  2. ANALYSIS OF A GAS-PHASE PARTITIONING TRACER TEST CONDUCTED IN AN UNSATURATED FRACTURED-CLAY FORMATION

    EPA Science Inventory

    The gas-phase partitioning tracer method was used to estimate non-aqueous phase liquid (NAPL), water, and air saturations in the vadose zone at a chlorinated-solvent contaminated field site in Tucson, AZ. The tracer test was conducted in a fractured-clay system that is the confin...

  3. OCTANOL/WATER PARTITION COEFFICIENTS AND WATER SOLUBILITIES OF PHTHALATE ESTERS

    EPA Science Inventory

    Measurements of the octanol/water partition coefficients (K-ow) and water solubilities of di-n-octyl phthalate (DnOP) and di-n-decyl phthalate (DnDP) by the slow-stirring method are reported. The water solubility was also measured for di-n-hexyl phthalate (DnHP). The log K-ow val...

  4. An Application of the Patient Rule-Induction Method for Evaluating the Contribution of the Apolipoprotein E and Lipoprotein Lipase Genes to Predicting Ischemic Heart Disease

    PubMed Central

    Dyson, Greg; Frikke-Schmidt, Ruth; Nordestgaard, Børge G.; Tybjærg-Hansen, Anne; Sing, Charles F.

    2007-01-01

    Different combinations of genetic and environmental risk factors are known to contribute to the complex etiology of ischemic heart disease (IHD) in different subsets of individuals. We employed the Patient Rule-Induction Method (PRIM) to select the combination of risk factors and risk factor values that identified each of 16 mutually exclusive partitions of individuals having significantly different levels of risk of IHD. PRIM balances two competing objectives: (1) finding partitions where the risk of IHD is high and (2) maximizing the number of IHD cases explained by the partitions. A sequential PRIM analysis was applied to data on the incidence of IHD collected over 8 years for a sample of 5,455 unrelated individuals from the Copenhagen City Heart Study (CCHS) to assess the added value of variation in two candidate susceptibility genes beyond the traditional, lipid and body mass index risk factors for IHD. An independent sample of 362 unrelated individuals also from the city of Copenhagen was used to test the model obtained for each of the hypothesized partitions. PMID:17436307
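
    A minimal sketch of PRIM-style box peeling is given below for illustration only; it is not the study's implementation. The peeling fraction, stopping rule, and toy data are assumptions: at each step the alpha-fraction edge (over all variables) whose removal gives the highest mean outcome inside the remaining box is trimmed.

        # Minimal sketch of PRIM-style box peeling (not the study's implementation):
        # repeatedly trim the alpha-fraction edge, over all variables, that leaves
        # the box with the highest mean outcome, until the support gets too small.
        import numpy as np

        def prim_peel(X, y, alpha=0.1, min_support=0.1):
            """X: (n, p) risk-factor matrix; y: binary outcome (e.g. disease incidence)."""
            n, p = X.shape
            inside = np.ones(n, dtype=bool)
            box = [(-np.inf, np.inf)] * p
            while inside.mean() > min_support:
                best = None
                for j in range(p):
                    for side in ("low", "high"):
                        q = np.quantile(X[inside, j], alpha if side == "low" else 1 - alpha)
                        trial = inside & (X[:, j] > q if side == "low" else X[:, j] < q)
                        if trial.sum() == 0:
                            continue
                        score = y[trial].mean()
                        if best is None or score > best[0]:
                            best = (score, j, side, q, trial)
                if best is None or best[0] <= y[inside].mean():
                    break                        # no peel improves the box mean
                _, j, side, q, trial = best
                box[j] = (q, box[j][1]) if side == "low" else (box[j][0], q)
                inside = trial
            return box, inside

        # Toy example: risk concentrated where x0 is high and x1 is low
        rng = np.random.default_rng(0)
        X = rng.uniform(size=(2000, 3))
        y = (rng.uniform(size=2000) < 0.05 + 0.4 * (X[:, 0] > 0.8) * (X[:, 1] < 0.3)).astype(float)
        box, members = prim_peel(X, y)
        print(box, y[members].mean(), members.mean())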

  5. A similarity based agglomerative clustering algorithm in networks

    NASA Astrophysics Data System (ADS)

    Liu, Zhiyuan; Wang, Xiujuan; Ma, Yinghong

    2018-04-01

    The detection of clusters is beneficial for understanding the organization and function of networks. Clusters, or communities, are usually groups of nodes densely interconnected among themselves but sparsely linked to other clusters. To identify communities, an efficient and effective agglomerative algorithm based on node similarity is proposed. The method first calculates the similarity between each pair of nodes and forms pre-partitions according to the principle that each node belongs to the same community as its most similar neighbor. Each pre-partition is then checked against a community criterion; pre-partitions that do not satisfy it are merged with the partitions to which they have the greatest attraction, until no further changes occur. To measure the attraction of a partition, an attraction index based on the importance of the linked nodes in the network is proposed. The method can therefore better exploit node properties and network structure. To test the performance of the algorithm, both synthetic and empirical networks of different scales are used. Simulation results show that the proposed algorithm obtains superior clustering results compared with six other widely used community detection algorithms.
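
    The pre-partitioning step can be illustrated with a small sketch. This is an interpretation rather than the paper's code: node similarity is taken here as the Jaccard index of closed neighborhoods, and the later attraction-based merging stage is omitted.

        # Minimal sketch (an assumption, not the paper's code) of the pre-partitioning
        # step: each node joins the group of its most similar neighbor, with similarity
        # taken as the Jaccard index of closed neighborhoods.

        def jaccard(adj, u, v):
            a, b = adj[u] | {u}, adj[v] | {v}
            return len(a & b) / len(a | b)

        def pre_partition(adj):
            """adj: dict node -> set of neighbors. Returns a dict node -> group label."""
            parent = {u: u for u in adj}        # union-find over "join most similar neighbor" links
            def find(u):
                while parent[u] != u:
                    parent[u] = parent[parent[u]]
                    u = parent[u]
                return u
            for u in adj:
                if not adj[u]:
                    continue
                best = max(adj[u], key=lambda v: jaccard(adj, u, v))
                parent[find(u)] = find(best)
            return {u: find(u) for u in adj}

        # Two loosely connected triangles
        adj = {1: {2, 3}, 2: {1, 3}, 3: {1, 2, 4}, 4: {3, 5, 6}, 5: {4, 6}, 6: {4, 5}}
        print(pre_partition(adj))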

  6. A scalable geometric multigrid solver for nonsymmetric elliptic systems with application to variable-density flows

    NASA Astrophysics Data System (ADS)

    Esmaily, M.; Jofre, L.; Mani, A.; Iaccarino, G.

    2018-03-01

    A geometric multigrid algorithm is introduced for solving nonsymmetric linear systems resulting from the discretization of the variable density Navier-Stokes equations on nonuniform structured rectilinear grids and high-Reynolds number flows. The restriction operation is defined such that the resulting system on the coarser grids is symmetric, thereby allowing for the use of efficient smoother algorithms. To achieve an optimal rate of convergence, the sequence of interpolation and restriction operations is determined through a dynamic procedure. A parallel partitioning strategy is introduced to minimize communication while maintaining the load balance between all processors. To test the proposed algorithm, we consider two cases: 1) homogeneous isotropic turbulence discretized on uniform grids and 2) turbulent duct flow discretized on stretched grids. Testing the algorithm on systems with up to a billion unknowns shows that the cost varies linearly with the number of unknowns. This O(N) behavior confirms the robustness of the proposed multigrid method regarding ill-conditioning of large systems characteristic of multiscale high-Reynolds number turbulent flows. The robustness of our method to density variations is established by considering cases where density varies sharply in space by a factor of up to 10^4, showing its applicability to two-phase flow problems. Strong and weak scalability studies are carried out, employing up to 30,000 processors, to examine the parallel performance of our implementation. Excellent scalability of our solver is shown for a granularity as low as 10^4 to 10^5 unknowns per processor. At its tested peak throughput, it solves approximately 4 billion unknowns per second employing over 16,000 processors with a parallel efficiency higher than 50%.

  7. A Method for a Multi-Platform Approach to Generate Gridded Surface Evaporation

    NASA Astrophysics Data System (ADS)

    Badger, A.; Livneh, B.; Small, E. E.; Abolafia-Rosenzweig, R.

    2017-12-01

    Evapotranspiration is an integral component of the surface water balance. While there are many estimates of evapotranspiration, there are fewer estimates that partition evapotranspiration into evaporation and transpiration components. This study aims to generate a CONUS-scale, observationally-based soil evaporation dataset by using the time difference of surface soil moisture from the Soil Moisture Active Passive (SMAP) satellite with adjustments for transpiration and a bottom flux out of the surface layer. In concert with SMAP, the Moderate-Resolution Imaging Spectroradiometer (MODIS) satellite, North American Land Data Assimilation Systems (NLDAS) and the Hydrus-1D model are used to fully analyze the surface water balance. A biome-specific estimate of the total terrestrial ET is calculated through a variation of the Penman-Monteith equation with NLDAS forcing and NLDAS Noah Model output for meteorological variables. A root density restriction and SMAP-based soil moisture restriction are applied to obtain terrestrial transpiration estimates. By forcing Hydrus-1D with NLDAS meteorology and our terrestrial transpiration estimates, an estimate of the flux between the soil surface and root zone layers (qbot) will dictate the proportion of water that is available for soil evaporation. After constraining transpiration and the bottom flux from the surface layer, we estimate soil evaporation as the residual of the surface water balance. Application of this method at Fluxnet sites shows soil evaporation estimates of approximately 0–3 mm/day, which are less than the ET estimates. Expanding this methodology to produce a gridded product for CONUS, and eventually a global-scale product, will enable a better understanding of water balance processes and contribute a dataset to validate land-surface models' surface flux processes.
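
    The residual calculation described above can be written compactly. The sketch below is an assumption-laden illustration (hypothetical daily values in mm/day, simplified sign conventions), not the authors' processing chain.

        # Minimal sketch of the residual idea (hypothetical values, mm/day): the
        # surface-layer water balance dS/dt = P - E_soil - T_surf - q_bot is
        # rearranged so soil evaporation is the residual once surface-layer
        # transpiration and the bottom flux are constrained.

        def soil_evaporation(dS_dt, precip, transp_surface, q_bot):
            """All terms in mm/day; q_bot is positive downward (out of the surface layer)."""
            e_soil = precip - transp_surface - q_bot - dS_dt
            return max(e_soil, 0.0)   # evaporation cannot be negative

        print(soil_evaporation(dS_dt=-1.0, precip=0.0, transp_surface=0.5, q_bot=-0.2))
        # -> 0.7 mm/day: surface storage loss not explained by transpiration or upward flux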

  8. Adaptive hybrid simulations for multiscale stochastic reaction networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hepp, Benjamin; Gupta, Ankit; Khammash, Mustafa

    2015-01-21

    The probability distribution describing the state of a Stochastic Reaction Network (SRN) evolves according to the Chemical Master Equation (CME). It is common to estimate its solution using Monte Carlo methods such as the Stochastic Simulation Algorithm (SSA). In many cases, these simulations can take an impractical amount of computational time. Therefore, many methods have been developed that approximate sample paths of the underlying stochastic process and estimate the solution of the CME. A prominent class of these methods includes hybrid methods that partition the set of species and the set of reactions into discrete and continuous subsets. Such a partition separates the dynamics into a discrete and a continuous part. Simulating such a stochastic process can be computationally much easier than simulating the exact discrete stochastic process with SSA. Moreover, the quasi-stationary assumption to approximate the dynamics of fast subnetworks can be applied for certain classes of networks. However, as the dynamics of a SRN evolves, these partitions may have to be adapted during the simulation. We develop a hybrid method that approximates the solution of a CME by automatically partitioning the reactions and species sets into discrete and continuous components and applying the quasi-stationary assumption on identifiable fast subnetworks. Our method does not require any user intervention and it adapts to exploit the changing timescale separation between reactions and/or changing magnitudes of copy-numbers of constituent species. We demonstrate the efficiency of the proposed method by considering examples from systems biology and showing that very good approximations to the exact probability distributions can be achieved in significantly less computational time. This is especially the case for systems with oscillatory dynamics, where the system dynamics change considerably throughout the time-period of interest.

  9. Adaptive hybrid simulations for multiscale stochastic reaction networks.

    PubMed

    Hepp, Benjamin; Gupta, Ankit; Khammash, Mustafa

    2015-01-21

    The probability distribution describing the state of a Stochastic Reaction Network (SRN) evolves according to the Chemical Master Equation (CME). It is common to estimate its solution using Monte Carlo methods such as the Stochastic Simulation Algorithm (SSA). In many cases, these simulations can take an impractical amount of computational time. Therefore, many methods have been developed that approximate sample paths of the underlying stochastic process and estimate the solution of the CME. A prominent class of these methods includes hybrid methods that partition the set of species and the set of reactions into discrete and continuous subsets. Such a partition separates the dynamics into a discrete and a continuous part. Simulating such a stochastic process can be computationally much easier than simulating the exact discrete stochastic process with SSA. Moreover, the quasi-stationary assumption to approximate the dynamics of fast subnetworks can be applied for certain classes of networks. However, as the dynamics of a SRN evolves, these partitions may have to be adapted during the simulation. We develop a hybrid method that approximates the solution of a CME by automatically partitioning the reactions and species sets into discrete and continuous components and applying the quasi-stationary assumption on identifiable fast subnetworks. Our method does not require any user intervention and it adapts to exploit the changing timescale separation between reactions and/or changing magnitudes of copy-numbers of constituent species. We demonstrate the efficiency of the proposed method by considering examples from systems biology and showing that very good approximations to the exact probability distributions can be achieved in significantly less computational time. This is especially the case for systems with oscillatory dynamics, where the system dynamics change considerably throughout the time-period of interest.

  10. Self-assembled monolayers improve protein distribution on holey carbon cryo-EM supports

    PubMed Central

    Meyerson, Joel R.; Rao, Prashant; Kumar, Janesh; Chittori, Sagar; Banerjee, Soojay; Pierson, Jason; Mayer, Mark L.; Subramaniam, Sriram

    2014-01-01

    Poor partitioning of macromolecules into the holes of holey carbon support grids frequently limits structural determination by single particle cryo-electron microscopy (cryo-EM). Here, we present a method to deposit, on gold-coated carbon grids, a self-assembled monolayer whose surface properties can be controlled by chemical modification. We demonstrate the utility of this approach to drive partitioning of ionotropic glutamate receptors into the holes, thereby enabling 3D structural analysis using cryo-EM methods. PMID:25403871

  11. Inference and Analysis of Population Structure Using Genetic Data and Network Theory

    PubMed Central

    Greenbaum, Gili; Templeton, Alan R.; Bar-David, Shirli

    2016-01-01

    Clustering individuals to subpopulations based on genetic data has become commonplace in many genetic studies. Inference about population structure is most often done by applying model-based approaches, aided by visualization using distance-based approaches such as multidimensional scaling. While existing distance-based approaches suffer from a lack of statistical rigor, model-based approaches entail assumptions of prior conditions such as that the subpopulations are at Hardy-Weinberg equilibria. Here we present a distance-based approach for inference about population structure using genetic data by defining population structure using network theory terminology and methods. A network is constructed from a pairwise genetic-similarity matrix of all sampled individuals. The community partition, a partition of a network to dense subgraphs, is equated with population structure, a partition of the population to genetically related groups. Community-detection algorithms are used to partition the network into communities, interpreted as a partition of the population to subpopulations. The statistical significance of the structure can be estimated by using permutation tests to evaluate the significance of the partition’s modularity, a network theory measure indicating the quality of community partitions. To further characterize population structure, a new measure of the strength of association (SA) for an individual to its assigned community is presented. The strength of association distribution (SAD) of the communities is analyzed to provide additional population structure characteristics, such as the relative amount of gene flow experienced by the different subpopulations and identification of hybrid individuals. Human genetic data and simulations are used to demonstrate the applicability of the analyses. The approach presented here provides a novel, computationally efficient model-free method for inference about population structure that does not entail assumption of prior conditions. The method is implemented in the software NetStruct (available at https://giligreenbaum.wordpress.com/software/). PMID:26888080
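
    A minimal sketch of the community-detection and modularity-significance idea is given below. It is not the NetStruct implementation: it uses networkx's greedy modularity communities, degree-preserving edge swaps as the permutation scheme, and a built-in toy graph standing in for a genetic-similarity network.

        # Minimal sketch (assumptions throughout, not the NetStruct implementation):
        # detect communities in a similarity network and assess the partition's
        # modularity against degree-preserving randomized networks.
        import random
        import networkx as nx
        from networkx.algorithms import community

        def modularity_significance(G, n_perm=50, seed=0):
            comms = community.greedy_modularity_communities(G)
            q_obs = community.modularity(G, comms)
            rng = random.Random(seed)
            null = []
            for _ in range(n_perm):
                H = G.copy()
                nx.double_edge_swap(H, nswap=2 * H.number_of_edges(),
                                    max_tries=10**5, seed=rng.randint(0, 10**9))
                null.append(community.modularity(H, community.greedy_modularity_communities(H)))
            p_value = sum(q >= q_obs for q in null) / n_perm
            return comms, q_obs, p_value

        G = nx.karate_club_graph()        # stand-in for a genetic-similarity network
        comms, q, p = modularity_significance(G)
        print(len(comms), round(q, 3), p)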

  12. Binary recursive partitioning: background, methods, and application to psychology.

    PubMed

    Merkle, Edgar C; Shaffer, Victoria A

    2011-02-01

    Binary recursive partitioning (BRP) is a computationally intensive statistical method that can be used in situations where linear models are often used. Instead of imposing many assumptions to arrive at a tractable statistical model, BRP simply seeks to accurately predict a response variable based on values of predictor variables. The method outputs a decision tree depicting the predictor variables that were related to the response variable, along with the nature of the variables' relationships. No significance tests are involved, and the tree's 'goodness' is judged based on its predictive accuracy. In this paper, we describe BRP methods in a detailed manner and illustrate their use in psychological research. We also provide R code for carrying out the methods.
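
    The paper supplies R code; purely as an illustration of the same idea, the sketch below fits a CART-style tree with scikit-learn and judges it by held-out predictive accuracy rather than significance tests. The dataset and depth limit are arbitrary choices, not taken from the paper.

        # Illustration only (the paper itself provides R code): binary recursive
        # partitioning via a CART-style decision tree, judged by predictive accuracy
        # on held-out data rather than by significance tests.
        from sklearn.datasets import load_breast_cancer
        from sklearn.model_selection import train_test_split
        from sklearn.tree import DecisionTreeClassifier, export_text

        X, y = load_breast_cancer(return_X_y=True)
        X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

        tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_train, y_train)
        print(f"held-out accuracy: {tree.score(X_test, y_test):.3f}")
        print(export_text(tree))   # the decision tree relating predictors to the response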

  13. MTR WING, TRA604. ONE OF THE LABORATORY UNITS ALONG THE ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    MTR WING, TRA-604. ONE OF THE LABORATORY UNITS ALONG THE SOUTH SIDE WALL. NOTE SINK, CABINET, TABLE, AND HOOD UNITS. DUCT ABOVE RECEIVES CONTAMINATED AIR AND SENDS IT TO FAN HOUSE AND STACK. NOTE PARTITION WALL BEHIND WORK UNITS. THE HEALTH PHYSICS LAB WAS SIMILARLY EQUIPPED. WINDOW AT LEFT EDGE OF VIEW. CARD IN LOWER RIGHT WAS INSERTED BY INL PHOTOGRAPHER TO COVER AN OBSOLETE SECURITY RESTRICTION PRINTED ON ORIGINAL NEGATIVE. INL NEGATIVE NO. 4225. Unknown Photographer, 2/13/1952 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  14. The use of acoustically tuned resonators to improve the sound transmission loss of double-panel partitions

    NASA Astrophysics Data System (ADS)

    Mason, J. M.; Fahy, F. J.

    1988-07-01

    Double-leaf partitions are often utilized in situations requiring low weight structures with high transmission loss, an example of current interest being the fuselage walls of propeller-driven aircraft. In this case, acoustic excitation is periodic and, if one of the frequencies of excitation lies in the region of the fundamental mass-air-mass frequency of the partition, insulation performance is considerably less than desired. The potential effectiveness of tuned Helmholtz resonators connected to the partition cavity is investigated as a method of improving transmission loss. This is demonstrated by a simple theoretical model and then experimentally verified. Results show that substantial improvements may be obtained at and around the mass-air-mass frequency for a total resonator volume 15 percent of the cavity volume.
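
    For reference, the fundamental mass-air-mass resonance mentioned above is commonly estimated, for normal incidence, by the standard textbook expression below (an assumed form, not quoted from the paper):

        % Mass-air-mass resonance of a double-leaf partition (standard normal-incidence estimate)
        f_0 \approx \frac{1}{2\pi}\sqrt{\frac{\rho_0 c_0^{2}}{d}\left(\frac{1}{m_1}+\frac{1}{m_2}\right)}
        % \rho_0, c_0: air density and speed of sound; d: cavity depth;
        % m_1, m_2: surface masses (mass per unit area) of the two leaves.

    Resonators tuned near f_0 add damping in the cavity in precisely the band where a double-leaf partition performs worst, which is consistent with the reported improvement being concentrated around this frequency.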

  15. Robust and efficient overset grid assembly for partitioned unstructured meshes

    NASA Astrophysics Data System (ADS)

    Roget, Beatrice; Sitaraman, Jayanarayanan

    2014-03-01

    This paper presents a method to perform efficient and automated Overset Grid Assembly (OGA) on a system of overlapping unstructured meshes in a parallel computing environment where all meshes are partitioned into multiple mesh-blocks and processed on multiple cores. The main task of the overset grid assembler is to identify, in parallel, among all points in the overlapping mesh system, at which points the flow solution should be computed (field points), interpolated (receptor points), or ignored (hole points). Point containment search or donor search, an algorithm to efficiently determine the cell that contains a given point, is the core procedure necessary for accomplishing this task. Donor search is particularly challenging for partitioned unstructured meshes because of the complex irregular boundaries that are often created during partitioning.

  16. Generalizing the self-healing diffusion Monte Carlo approach to finite temperature: A path for the optimization of low-energy many-body bases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reboredo, Fernando A.; Kim, Jeongnim

    A statistical method is derived for the calculation of thermodynamic properties of many-body systems at low temperatures. This method is based on the self-healing diffusion Monte Carlo method for complex functions [F. A. Reboredo, J. Chem. Phys. 136, 204101 (2012)] and some ideas of the correlation function Monte Carlo approach [D. M. Ceperley and B. Bernu, J. Chem. Phys. 89, 6316 (1988)]. In order to allow the evolution in imaginary time to describe the density matrix, we remove the fixed-node restriction using complex antisymmetric guiding wave functions. In the process we obtain a parallel algorithm that optimizes a small subspace of the many-body Hilbert space to provide maximum overlap with the subspace spanned by the lowest-energy eigenstates of a many-body Hamiltonian. We show in a model system that the partition function is progressively maximized within this subspace. We show that the subspace spanned by the small basis systematically converges towards the subspace spanned by the lowest energy eigenstates. Possible applications of this method for calculating the thermodynamic properties of many-body systems near the ground state are discussed. The resulting basis can also be used to accelerate the calculation of the ground or excited states with quantum Monte Carlo.

  17. NaSi⇌CaAl exchange equilibrium between plagioclase and amphibole

    NASA Astrophysics Data System (ADS)

    Spear, Frank S.

    1980-03-01

    The exchange equilibrium between plagioclase and amphibole, 2 albite + tschermakite = 2 anorthite + glaucophane, has been calibrated empirically using data from natural amphibolites. The partition coefficient, K_D, for the exchange reaction is (X_an/X_ab)^plag · (X_Na,M4/X_Ca,M4)^amph. Partitioning is systematic between plagioclase and amphibole in suites collected from single exposures, but the solid solutions are highly non-ideal: values of ln K_D range from -3.0 at X_an = 0.30 to -1.0 at X_an = 0.90 in samples from a single roadcut. Changes in both K_D and the topology of the ternary reciprocal exchange diagram occur with increasing metamorphic grade. The temperature dependence of ln K_D is moderate, with ΔH̄ ≃ 35 to 47 kcal at X_an = 0.25; the pressure dependence is small, with ΔV̄ ≃ -0.24 cal/bar. Usefulness of this exchange equilibrium as a geothermometer is restricted by uncertainties in the calculation of the amphibole formula from a microprobe analysis, especially with regard to Na(M4) in amphibole, to approximately ±50 °C.

  18. Intrinsic disorder in the partitioning protein KorB persists after co-operative complex formation with operator DNA and KorA.

    PubMed

    Hyde, Eva I; Callow, Philip; Rajasekar, Karthik V; Timmins, Peter; Patel, Trushar R; Siligardi, Giuliano; Hussain, Rohanah; White, Scott A; Thomas, Christopher M; Scott, David J

    2017-08-30

    The ParB protein, KorB, from the RK2 plasmid is required for DNA partitioning and transcriptional repression. It acts co-operatively with other proteins, including the repressor KorA. Like many multifunctional proteins, KorB contains regions of intrinsically disordered structure, existing in a large ensemble of interconverting conformations. Using NMR spectroscopy, circular dichroism and small-angle neutron scattering, we studied KorB selectively within its binary complexes with KorA and DNA, and within the ternary KorA/KorB/DNA complex. The bound KorB protein remains disordered with a mobile C-terminal domain and no changes in the secondary structure, but increases in the radius of gyration on complex formation. Comparison of wild-type KorB with an N-terminal deletion mutant allows a model of the ensemble average distances between the domains when bound to DNA. We propose that the positive co-operativity between KorB, KorA and DNA results from conformational restriction of KorB on binding each partner, while maintaining disorder. © 2017 The Author(s).

  19. On the ordinary quiver of the symmetric group over a field of characteristic 2

    NASA Astrophysics Data System (ADS)

    Martin, Stuart; Russell, Lee

    1997-11-01

    Let 𝔖_n and 𝔄_n denote the symmetric and alternating groups of degree n ∈ ℕ, respectively. Let p be a prime number and let F be an arbitrary field of characteristic p. We say that a partition of n is p-regular if no p (non-zero) parts of it are equal; otherwise we call it p-singular. Let S^λ_F denote the Specht module corresponding to λ. For λ a p-regular partition of n, let D^λ_F denote the unique irreducible top factor of S^λ_F. Denote by Δ^λ_F = D^λ_F ↓ 𝔄_n its restriction to 𝔄_n. Recall also that, over F, the ordinary quiver of the modular group algebra FG is a finite directed graph defined as follows: the vertices are labelled by the set of all simple FG-modules, L_1, …, L_r, and the number of arrows from L_i to L_j equals dim_F Ext_FG(L_i, L_j). The quiver gives important information about the block structure of G.
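
    The p-regularity condition defined above is easy to check computationally. The sketch below is illustrative only (not from the paper): it enumerates partitions of n and keeps those in which no p non-zero parts are equal.

        # Minimal sketch: enumerate partitions of n and test p-regularity
        # (no p non-zero parts equal), as defined in the record above.

        def partitions(n, max_part=None):
            """Yield partitions of n as non-increasing tuples."""
            if max_part is None:
                max_part = n
            if n == 0:
                yield ()
                return
            for k in range(min(n, max_part), 0, -1):
                for rest in partitions(n - k, k):
                    yield (k,) + rest

        def is_p_regular(partition, p):
            return all(partition.count(part) < p for part in set(partition))

        n, p = 6, 2
        regular = [lam for lam in partitions(n) if is_p_regular(lam, p)]
        print(len(regular), regular)   # 2-regular partitions of 6 are those with distinct parts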

  20. Nature and function of insulator protein binding sites in the Drosophila genome

    PubMed Central

    Schwartz, Yuri B.; Linder-Basso, Daniela; Kharchenko, Peter V.; Tolstorukov, Michael Y.; Kim, Maria; Li, Hua-Bing; Gorchakov, Andrey A.; Minoda, Aki; Shanower, Gregory; Alekseyenko, Artyom A.; Riddle, Nicole C.; Jung, Youngsook L.; Gu, Tingting; Plachetka, Annette; Elgin, Sarah C.R.; Kuroda, Mitzi I.; Park, Peter J.; Savitsky, Mikhail; Karpen, Gary H.; Pirrotta, Vincenzo

    2012-01-01

    Chromatin insulator elements and associated proteins have been proposed to partition eukaryotic genomes into sets of independently regulated domains. Here we test this hypothesis by quantitative genome-wide analysis of insulator protein binding to Drosophila chromatin. We find distinct combinatorial binding of insulator proteins to different classes of sites and uncover a novel type of insulator element that binds CP190 but not any other known insulator proteins. Functional characterization of different classes of binding sites indicates that only a small fraction act as robust insulators in standard enhancer-blocking assays. We show that insulators restrict the spreading of the H3K27me3 mark but only at a small number of Polycomb target regions and only to prevent repressive histone methylation within adjacent genes that are already transcriptionally inactive. RNAi knockdown of insulator proteins in cultured cells does not lead to major alterations in genome expression. Taken together, these observations argue against the concept of a genome partitioned by specialized boundary elements and suggest that insulators are reserved for specific regulation of selected genes. PMID:22767387

  1. Effect of heterogeneity on the characterization of cell membrane compartments: I. Uniform size and permeability.

    PubMed

    Hall, Damien

    2010-03-15

    Observations of the motion of individual molecules in the membrane of a number of different cell types have led to the suggestion that the outer membrane of many eukaryotic cells may be effectively partitioned into microdomains. A major cause of this suggested partitioning is believed to be due to the direct/indirect association of the cytosolic face of the cell membrane with the cortical cytoskeleton. Such intimate association is thought to introduce effective hydrodynamic barriers into the membrane that are capable of frustrating molecular Brownian motion over distance scales greater than the average size of the compartment. To date, the standard analytical method for deducing compartment characteristics has relied on observing the random walk behavior of a labeled lipid or protein at various temporal frequencies and different total lengths of time. Simple theoretical arguments suggest that the presence of restrictive barriers imparts a characteristic turnover to a plot of mean squared displacement versus sampling period that can be interpreted to yield the average dimensions of the compartment expressed as the respective side lengths of a rectangle. In the following series of articles, we used computer simulation methods to investigate how well the conventional analytical strategy coped with heterogeneity in size, shape, and barrier permeability of the cell membrane compartments. We also explored questions relating to the necessary extent of sampling required (with regard to both the recorded time of a single trajectory and the number of trajectories included in the measurement bin) for faithful representation of the actual distribution of compartment sizes found using the SPT technique. In the current investigation, we turned our attention to the analytical characterization of diffusion through cell membrane compartments having both a uniform size and permeability. For this ideal case, we found that (i) an optimum sampling time interval existed for the analysis and (ii) the total length of time for which a trajectory was recorded was a key factor. Copyright (c) 2009 Elsevier Inc. All rights reserved.
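
    The analysis referred to above rests on computing the mean squared displacement (MSD) as a function of the sampling period. The sketch below is a toy illustration with a crude clamped-wall random walk, not the authors' simulation; all parameters are hypothetical.

        # Minimal sketch (hypothetical trajectory): mean squared displacement versus
        # sampling period; the turnover/plateau at large lags reflects confinement by
        # compartment barriers.
        import numpy as np

        def msd(track, max_lag):
            """track: (T, 2) array of positions; returns MSD for lags 1..max_lag."""
            return np.array([np.mean(np.sum((track[lag:] - track[:-lag]) ** 2, axis=1))
                             for lag in range(1, max_lag + 1)])

        # Toy confined random walk inside a 100 nm x 100 nm compartment (crude clamped walls)
        rng = np.random.default_rng(1)
        pos, track = np.array([50.0, 50.0]), np.empty((5000, 2))
        for t in range(5000):
            pos = np.clip(pos + rng.normal(scale=5.0, size=2), 0.0, 100.0)
            track[t] = pos

        curve = msd(track, max_lag=200)
        print(curve[:3], curve[-1])   # the plateau at large lags is set by the compartment size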

  2. Color Image Classification Using Block Matching and Learning

    NASA Astrophysics Data System (ADS)

    Kondo, Kazuki; Hotta, Seiji

    In this paper, we propose block matching and learning for color image classification. In our method, training images are partitioned into small blocks. Given a test image, it is also partitioned into small blocks, and mean-blocks corresponding to each test block are calculated with neighbor training blocks. Our method classifies a test image into the class that has the shortest total sum of distances between mean blocks and test ones. We also propose a learning method for reducing memory requirement. Experimental results show that our classification outperforms other classifiers such as support vector machine with bag of keypoints.
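
    A minimal sketch of the block-based classification idea follows. It is an interpretation, not the authors' code: it scores each class by the summed distance from every test block to its nearest training block of that class (rather than the paper's mean-blocks), and the block size and toy images are arbitrary.

        # Minimal sketch (an interpretation, not the authors' code): partition images
        # into small blocks and classify a test image by the class whose training
        # blocks give the smallest total block-matching distance.
        import numpy as np

        def to_blocks(img, b):
            h, w, c = img.shape
            img = img[:h - h % b, :w - w % b]      # crop to a multiple of the block size
            return img.reshape(h // b, b, w // b, b, c).swapaxes(1, 2).reshape(-1, b * b * c)

        def classify(test_img, train_imgs, train_labels, b=8):
            test_blocks = to_blocks(test_img, b)
            scores = {}
            for label in set(train_labels):
                # pool all training blocks of this class
                pool = np.vstack([to_blocks(im, b) for im, lab in zip(train_imgs, train_labels) if lab == label])
                # for each test block, distance to its nearest training block of the class
                d = np.sqrt(((test_blocks[:, None, :] - pool[None, :, :]) ** 2).sum(-1)).min(axis=1)
                scores[label] = d.sum()
            return min(scores, key=scores.get)

        # Tiny synthetic example: "dark" vs "bright" 16x16 RGB images
        rng = np.random.default_rng(0)
        dark = [rng.uniform(0.0, 0.4, (16, 16, 3)) for _ in range(3)]
        bright = [rng.uniform(0.6, 1.0, (16, 16, 3)) for _ in range(3)]
        print(classify(rng.uniform(0.55, 0.95, (16, 16, 3)), dark + bright, ["dark"] * 3 + ["bright"] * 3))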

  3. The Deformation Behavior Analysis and Mechanical Modeling of Step/Intercritical Quenching and Partitioning-Treated Multiphase Steels

    NASA Astrophysics Data System (ADS)

    Zhao, Hongshan; Li, Wei; Wang, Li; Zhou, Shu; Jin, Xuejun

    2016-08-01

    Two types of multiphase steels containing blocky or fine martensite have been used to study the phase interaction and the TRIP effect. These steels were obtained by step-quenching and partitioning (S-QP820) or intercritical-quenching and partitioning (I-QP800 & I-QP820). The retained austenite (RA) in the S-QP820 specimen containing blocky martensite transformed too early to prevent the local failure at high strain due to the local strain concentration. In contrast, plentiful RA in the I-QP800 specimen containing finely dispersed martensite transformed uniformly at high strain, which led to optimized strength and elongation. By applying a coordinate conversion method to the microhardness test, the load partitioning between ferrite and partitioned martensite was shown to follow the linear mixture law. The mechanical behavior of multiphase S-QP820 steel can be modeled based on the Mecking-Kocks theory, Bouquerel's spherical assumption, and a Gladman-type mixture law. Finally, the transformation-induced martensite hardening effect has been studied on a bake-hardened specimen.

  4. Concerted changes in N and C primary metabolism in alfalfa (Medicago sativa) under water restriction.

    PubMed

    Aranjuelo, Iker; Tcherkez, Guillaume; Molero, Gemma; Gilard, Françoise; Avice, Jean-Christophe; Nogués, Salvador

    2013-02-01

    Although the mechanisms of nodule N(2) fixation in legumes are now well documented, some uncertainty remains on the metabolic consequences of water deficit. In most cases, little consideration is given to other organs and, therefore, the coordinated changes in metabolism in leaves, roots, and nodules are not well known. Here, the effect of water restriction on exclusively N(2)-fixing alfalfa (Medicago sativa L.) plants was investigated, and proteomic, metabolomic, and physiological analyses were carried out. It is shown that the inhibition of nitrogenase activity caused by water restriction was accompanied by concerted alterations in metabolic pathways in nodules, leaves, and roots. The data suggest that nodule metabolism and metabolic exchange between plant organs nearly reached homeostasis in asparagine synthesis and partitioning, as well as the N demand from leaves. Typically, there was (i) a stimulation of the anaplerotic pathway to sustain the provision of C skeletons for amino acid (e.g. glutamate and proline) synthesis; (ii) re-allocation of glycolytic products to alanine and serine/glycine; and (iii) subtle changes in redox metabolites suggesting the implication of a slight oxidative stress. Furthermore, water restriction caused little change in both photosynthetic efficiency and respiratory cost of N(2) fixation by nodules. In other words, the results suggest that under water stress, nodule metabolism follows a compromise between physiological imperatives (N demand, oxidative stress) and the lower input to sustain catabolism.

  5. Predicting solute partitioning in lipid bilayers: Free energies and partition coefficients from molecular dynamics simulations and COSMOmic

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jakobtorweihen, S., E-mail: jakobtorweihen@tuhh.de; Ingram, T.; Gerlach, T.

    2014-07-28

    Quantitative predictions of biomembrane/water partition coefficients are important, as they are a key property in pharmaceutical applications and toxicological studies. Molecular dynamics (MD) simulations are used to calculate free energy profiles for different solutes in lipid bilayers. How to calculate partition coefficients from these profiles is discussed in detail and different definitions of partition coefficients are compared. Importantly, it is shown that the calculated coefficients are in quantitative agreement with experimental results. Furthermore, we compare free energy profiles from MD simulations to profiles obtained by the recent method COSMOmic, which is an extension of the conductor-like screening model for realistic solvation to micelles and biomembranes. The free energy profiles from these molecular methods are in good agreement. Additionally, solute orientations calculated with MD and COSMOmic are compared and again a good agreement is found. Four different solutes are investigated in detail: 4-ethylphenol, propanol, 5-phenylvaleric acid, and dibenz[a,h]anthracene, whereby the latter belongs to the class of polycyclic aromatic hydrocarbons. The convergence of the free energy profiles from biased MD simulations is discussed and the results are shown to be comparable to equilibrium MD simulations. For 5-phenylvaleric acid the influence of the carboxyl group dihedral angle on free energy profiles is analyzed with MD simulations.
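
    One common way to turn a free energy profile into a partition coefficient is to Boltzmann-average the profile over the membrane region, referenced to bulk water. The sketch below uses that definition as an assumption (the record above notes that several definitions are compared); the profile shape, membrane bounds, and temperature are hypothetical.

        # Minimal sketch (one common definition, not necessarily the one used above):
        # membrane/water partition coefficient from a free energy profile G(z),
        # Boltzmann-averaged across the bilayer and referenced to bulk water.
        import numpy as np

        def partition_coefficient_from_pmf(z_nm, dG_kJ_per_mol, membrane=(-2.0, 2.0), T=298.15):
            """K = average of exp(-dG/RT) over the membrane region, with dG = 0 in bulk water."""
            RT = 8.314e-3 * T                              # kJ/mol
            mask = (z_nm >= membrane[0]) & (z_nm <= membrane[1])
            return np.exp(-dG_kJ_per_mol[mask] / RT).mean()

        # Hypothetical profile: ~ -10 kJ/mol well near the headgroups, small barrier at the center
        z = np.linspace(-4.0, 4.0, 401)                    # nm, 0 = bilayer center
        dG = -10.0 * np.exp(-((np.abs(z) - 1.5) ** 2) / 0.5) + 6.0 * np.exp(-(z ** 2) / 0.3)
        print(f"K_mem/w ~ {partition_coefficient_from_pmf(z, dG):.1f}")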

  6. Cooperative mobile agents search using beehive partitioned structure and Tabu Random search algorithm

    NASA Astrophysics Data System (ADS)

    Ramazani, Saba; Jackson, Delvin L.; Selmic, Rastko R.

    2013-05-01

    In search and surveillance operations, deploying a team of mobile agents provides a robust solution that has multiple advantages over using a single agent in efficiency and minimizing exploration time. This paper addresses the challenge of identifying a target in a given environment when using a team of mobile agents by proposing a novel method of mapping and movement of agent teams in a cooperative manner. The approach consists of two parts. First, the region is partitioned into a hexagonal beehive structure in order to provide equidistant movements in every direction and to allow for more natural and flexible environment mapping. Additionally, in search environments that are partitioned into hexagons, mobile agents have an efficient travel path while performing searches due to this partitioning approach. Second, we use a team of mobile agents that move in a cooperative manner and utilize the Tabu Random algorithm to search for the target. Due to the ever-increasing use of robotics and Unmanned Aerial Vehicle (UAV) platforms, the field of cooperative multi-agent search has developed many applications recently that would benefit from the use of the approach presented in this work, including: search and rescue operations, surveillance, data collection, and border patrol. In this paper, the increased efficiency of the Tabu Random Search algorithm method in combination with hexagonal partitioning is simulated, analyzed, and advantages of this approach are presented and discussed.

  7. Schinus terebinthifolius countercurrent chromatography (Part III): Method transfer from small countercurrent chromatography column to preparative centrifugal partition chromatography ones as a part of method development.

    PubMed

    das Neves Costa, Fernanda; Hubert, Jane; Borie, Nicolas; Kotland, Alexis; Hewitson, Peter; Ignatova, Svetlana; Renault, Jean-Hugues

    2017-03-03

    Countercurrent chromatography (CCC) and centrifugal partition chromatography (CPC) are support free liquid-liquid chromatography techniques sharing the same basic principles and features. Method transfer has previously been demonstrated for both techniques but never from one to another. This study aimed to show such a feasibility using fractionation of Schinus terebinthifolius berries dichloromethane extract as a case study. Heptane - ethyl acetate - methanol -water (6:1:6:1, v/v/v/v) was used as solvent system with masticadienonic and 3β-masticadienolic acids as target compounds. The optimized separation methodology previously described in Part I and II, was scaled up from an analytical hydrodynamic CCC column (17.4mL) to preparative hydrostatic CPC instruments (250mL and 303mL) as a part of method development. Flow-rate and sample loading were further optimized on CPC. Mobile phase linear velocity is suggested as a transfer invariant parameter if the CPC column contains sufficient number of partition cells. Copyright © 2017 Elsevier B.V. All rights reserved.

  8. Stable isotope measurements of evapotranspiration partitioning in a maize field

    NASA Astrophysics Data System (ADS)

    Hogan, Patrick; Parajka, Juraj; Oismüller, Markus; Strauss, Peter; Heng, Lee; Blöschl, Günter

    2017-04-01

    Evapotranspiration (ET) is one of the most important processes in describing land surface - atmosphere interactions as it connects the energy and water balances. Furthermore knowledge of the individual components of evapotranspiration is important for ecohydrological modelling and agriculture, particularly for irrigation efficiency and crop productivity. In this study, we tested the application of the stable isotope method for evapotranspiration partitioning to a maize crop during the vegetative stage, using sap flow sensors as a comparison technique. Field scale ET was measured using an eddy covariance device and then partitioned using high frequency in-situ measurements of the isotopic signal of the canopy water vapor. The fraction of transpiration (Ft) calculated with the stable isotope method showed good agreement with the sap flow method. High correlation coefficient values were found between the two techniques, indicating the stable isotope method can successfully be applied in maize. The results show the changes in transpiration as a fraction of evapotranspiration after rain events and during the subsequent drying conditions as well as the relationship between transpiration and solar radiation and vapor pressure deficit.
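
    The transpiration fraction in such studies is usually obtained from a two-end-member isotope mixing model. The form below is the standard expression and is stated here as an assumption rather than quoted from the study:

        % Standard two-end-member isotope mixing model (assumed form): transpiration
        % fraction of evapotranspiration.
        F_T \;=\; \frac{\delta_{ET} - \delta_E}{\delta_T - \delta_E}
        % \delta_{ET}: isotopic composition of the ET flux (e.g. from a Keeling-plot intercept),
        % \delta_E: evaporation end-member, \delta_T: transpiration end-member.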

  9. The Application of Strain Range Partitioning Method to Torsional Creep-Fatigue Interaction

    NASA Technical Reports Server (NTRS)

    Zamrik, S. Y.

    1975-01-01

    The method of strain range partitioning was applied to a series of torsional fatigue tests conducted on tubular 304 stainless steel specimens at 1200 F. Creep strain was superimposed on cycling strain, and the resulting strain range was partitioned into four components; completely reversed plastic shear strain, plastic shear strain followed by creep strain, creep strain followed by plastic strain and completely reversed creep strain. Each strain component was related to the cyclic life of the material. The damaging effects of the individual strain components were expressed by a linear life fraction rule. The plastic shear strain component showed the least detrimental factor when compared to creep strain reversed by plastic strain. In the latter case, a reduction of torsional fatigue life in the order of magnitude of 1.5 was observed.
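
    The linear life fraction rule referred to above is conventionally written as the interaction damage rule below (standard form, stated as an assumption rather than quoted from the report):

        % Linear life-fraction (interaction damage) rule of strain range partitioning
        \frac{1}{N_f} \;=\; \frac{F_{pp}}{N_{pp}} + \frac{F_{pc}}{N_{pc}} + \frac{F_{cp}}{N_{cp}} + \frac{F_{cc}}{N_{cc}}
        % F_ij: fraction of the inelastic strain range of type ij (p = plasticity, c = creep),
        % N_ij: cyclic life if the entire strain range were of type ij.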

  10. Application of the UTCHEM simulator to DNAPL site characterization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Butler, G.W.

    1995-12-31

    Numerical simulation using the University of Texas Chemical Flood Simulator (UTCHEM) was used to evaluate two dense, nonaqueous phase liquid (DNAPL) characterization methods. The methods involved the use of surfactants and partitioning tracers to characterize a suspected trichloroethene (TCE) DNAPL zone beneath a US Air Force Plant in Texas. The simulations were performed using a cross-sectional model of the alluvial aquifer in an area that is believed to contain residual TCE at the base of the aquifer. Characterization simulations compared standard groundwater sampling, an interwell NAPL Solubilization Test, and an interwell NAPL Partitioning Tracer Test. The UTCHEM simulations illustrated how surfactants and partitioning tracers can be used to give definite evidence of the presence and volume of DNAPL in a situation where conventional groundwater sampling can only indicate the existence of the dissolved contaminant plume.
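
    Partitioning tracer tests of this kind are commonly interpreted with the retardation relation below (standard form, stated as an assumption rather than quoted from this report): the lag of the partitioning tracer relative to a conservative tracer yields the NAPL saturation.

        % Common interpretation of interwell partitioning tracer tests (assumed standard form)
        R \;=\; \frac{t_p}{t_n} \;=\; 1 + \frac{K_N\, S_N}{1 - S_N}
        \quad\Longrightarrow\quad
        S_N \;=\; \frac{R - 1}{R - 1 + K_N}
        % t_p, t_n: mean travel times of the partitioning and nonpartitioning tracers,
        % K_N: NAPL-water partition coefficient of the tracer, S_N: NAPL saturation.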

  11. A physically based catchment partitioning method for hydrological analysis

    NASA Astrophysics Data System (ADS)

    Menduni, Giovanni; Riboni, Vittoria

    2000-07-01

    We propose a partitioning method for the topographic surface, which is particularly suitable for distributed hydrological modelling and distributed shallow-landslide modelling. The model provides variable mesh size and appears to be a natural evolution of contour-based digital terrain models. The proposed method allows the drainage network to be derived from the contour lines. The single channels are calculated via a search for the steepest downslope lines. Then, for each network node, the contributing area is determined by means of a search for both steepest upslope and downslope lines. This leads to the basin being partitioned into physically based finite elements delimited by irregular polygons. In particular, the distributed computation of local geomorphological parameters (e.g. aspect, average slope and elevation, main stream length, concentration time) can be performed easily for each single element. The contributing area system, together with the information on the distribution of geomorphological parameters, provides a useful tool for distributed hydrological modelling and the simulation of environmental processes such as erosion, sediment transport and shallow landslides.

  12. Computing black hole partition functions from quasinormal modes

    DOE PAGES

    Arnold, Peter; Szepietowski, Phillip; Vaman, Diana

    2016-07-07

    We propose a method of computing one-loop determinants in black hole space-times (with emphasis on asymptotically anti-de Sitter black holes) that may be used for numerics when completely analytic results are unattainable. The method utilizes the expression for one-loop determinants in terms of quasinormal frequencies determined by Denef, Hartnoll and Sachdev in [1]. A numerical evaluation must face the fact that the sum over the quasinormal modes, indexed by momentum and overtone numbers, is divergent. A necessary ingredient is then a regularization scheme to handle the divergent contributions of individual fixed-momentum sectors to the partition function. To this end, we formulate an effective two-dimensional problem in which a natural refinement of standard heat kernel techniques can be used to account for contributions to the partition function at fixed momentum. We test our method in a concrete case by reproducing the scalar one-loop determinant in the BTZ black hole background. Finally, we discuss the application of such techniques to more complicated spacetimes.

  13. Sensor And Method For Detecting A Superstrate

    NASA Technical Reports Server (NTRS)

    Arndt, G. Dickey (Inventor); Cari, James R. (Inventor); Ngo, Phong H. (Inventor); Fink, Patrick W. (Inventor); Siekierski, James D. (Inventor)

    2006-01-01

    Method and apparatus are provided for determining a superstrate on or near a sensor, e.g., for detecting the presence of an ice superstrate on an airplane wing or a road. In one preferred embodiment, multiple measurement cells are disposed along a transmission line. While the present invention is operable with different types of transmission lines, construction details for a presently preferred coplanar waveguide and a microstrip waveguide are disclosed. A computer simulation is provided as part of the invention for predicting results of a simulated superstrate detector system. The measurement cells may be physically partitioned, nonphysically partitioned with software or firmware, or include a combination of different types of partitions. In one embodiment, a plurality of transmission lines are utilized wherein each transmission line includes a plurality of measurement cells. The plurality of transmission lines may be multiplexed with the signal from each transmission line being applied to the same phase detector. In one embodiment, an inverse problem method is applied to determine the superstrate dielectric for a transmission line with multiple measurement cells.

  14. Accurate potentiometric determination of lipid membrane-water partition coefficients and apparent dissociation constants of ionizable drugs: electrostatic corrections.

    PubMed

    Elsayed, Mustafa M A; Vierl, Ulrich; Cevc, Gregor

    2009-06-01

    To date, potentiometric lipid membrane-water partition coefficient studies have neglected electrostatic interactions; this leads to incorrect results. We herein show how to account properly for such interactions in potentiometric data analysis. We conducted potentiometric titration experiments to determine lipid membrane-water partition coefficients of four illustrative drugs, bupivacaine, diclofenac, ketoprofen and terbinafine. We then analyzed the results conventionally and with an improved analytical approach that considers Coulombic electrostatic interactions. The new analytical approach delivers robust partition coefficient values. In contrast, the conventional data analysis yields apparent partition coefficients of the ionized drug forms that depend on experimental conditions (mainly the lipid-drug ratio and the bulk ionic strength). This is due to changing electrostatic effects originating from bound drug and/or lipid charges. A membrane comprising 10 mol-% mono-charged molecules in a 150 mM (monovalent) electrolyte solution yields results that differ by a factor of 4 from those for uncharged membranes. Allowance for the Coulombic electrostatic interactions is a prerequisite for accurate and reliable determination of lipid membrane-water partition coefficients of ionizable drugs from potentiometric titration data. The same conclusion applies to all analytical methods involving drug binding to a surface.

  15. The effect of cholesterol on the partitioning of 1-octanol into POPC vesicles

    NASA Astrophysics Data System (ADS)

    Zakariaee Kouchaksaraee, Roja

    Microcalorimetry has become a method of choice for sensitive characterization of biomolecular interactions. In this study, isothermal titration calorimetry (ITC) was used to measure the partitioning of 1-octanol into lipid bilayers composed of 1-palmitoyl-2-oleoyl-sn-glycero-3-phosphocholine (POPC), a semi-unsaturated lipid, and cholesterol, a steroid, as a function of cholesterol molar concentration. The ITC instrument measures the heat evolved or absorbed upon titration of a liposome dispersion, at concentrations ranging from 0 to 40% cholesterol, into a suspension of 1-octanol in water. A model function was fit to the data in order to determine the partition coefficient of octanol into POPC bilayers and the enthalpy of interaction. I found that the partition coefficient increases and the heat of interaction becomes less negative with increasing cholesterol content, in contrast to results found by other groups for partitioning of alcohols into lipid-cholesterol bilayers containing saturated lipids. The heat of dilution of vesicles was also measured. Keywords: Partition coefficient; POPC; 1-Octanol; Cholesterol; Isothermal titration calorimetry; Lipid-alcohol interactions. Subject Terms: Calorimetry; Membranes (Biology); Biophysics; Biology -- Technique; Bilayer lipid membranes -- Biotechnology; Lipid membranes -- Biotechnology.

  16. An Introduction to Recursive Partitioning: Rationale, Application, and Characteristics of Classification and Regression Trees, Bagging, and Random Forests

    ERIC Educational Resources Information Center

    Strobl, Carolin; Malley, James; Tutz, Gerhard

    2009-01-01

    Recursive partitioning methods have become popular and widely used tools for nonparametric regression and classification in many scientific fields. Especially random forests, which can deal with large numbers of predictor variables even in the presence of complex interactions, have been applied successfully in genetics, clinical medicine, and…

  17. Co-Clustering by Bipartite Spectral Graph Partitioning for Out-of-Tutor Prediction

    ERIC Educational Resources Information Center

    Trivedi, Shubhendu; Pardos, Zachary A.; Sarkozy, Gabor N.; Heffernan, Neil T.

    2012-01-01

    Learning a more distributed representation of the input feature space is a powerful method to boost the performance of a given predictor. Often this is accomplished by partitioning the data into homogeneous groups by clustering so that separate models could be trained on each cluster. Intuitively each such predictor is a better representative of…

  18. Partitioning error components for accuracy-assessment of near-neighbor methods of imputation

    Treesearch

    Albert R. Stage; Nicholas L. Crookston

    2007-01-01

    Imputation is applied for two quite different purposes: to supply missing data to complete a data set for subsequent modeling analyses or to estimate subpopulation totals. Error properties of the imputed values have different effects in these two contexts. We partition errors of imputation derived from similar observation units as arising from three sources:...

  19. Optimal Clustering in Graphs with Weighted Edges: A Unified Approach to the Threshold Problem.

    ERIC Educational Resources Information Center

    Goetschel, Roy; Voxman, William

    1987-01-01

    Relations on a finite set V are viewed as weighted graphs. Using the language of graph theory, two methods of partitioning V are examined: selecting threshold values and applying them to a maximal weighted spanning forest, and using a parametric linear program to obtain a most adhesive partition. (Author/EM)

  20. Combined node and link partitions method for finding overlapping communities in complex networks

    PubMed Central

    Jin, Di; Gabrys, Bogdan; Dang, Jianwu

    2015-01-01

    Community detection in complex networks is a fundamental data analysis task in various domains, and how to effectively find overlapping communities in real applications is still a challenge. In this work, we propose a new unified model and method for finding the best overlapping communities on the basis of the associated node and link partitions derived from the same framework. Specifically, we first describe a unified model that accommodates node and link communities (partitions) together, and then present a nonnegative matrix factorization method to learn the parameters of the model. Thereafter, we infer the overlapping communities based on the derived node and link communities, i.e., we determine each overlapping community from the corresponding node and link communities by greedily optimizing a local community quality function (conductance). Finally, we introduce a model selection method based on consensus clustering to determine the number of communities. We have evaluated our method on both synthetic and real-world networks with ground truths, and compared it with seven state-of-the-art methods. The experimental results demonstrate the superior performance of our method over the competing ones in detecting overlapping communities for all analysed data sets. Improved performance is particularly pronounced in cases of more complicated networked community structures. PMID:25715829

  1. Ab Initio Predictions of K, He and Ar Partitioning Between Silicate Melt and Liquid Iron Under High Pressure

    NASA Astrophysics Data System (ADS)

    Xiong, Z.; Tsuchiya, T.

    2017-12-01

    Element partitioning is an important property for recording geochemical processes during core-mantle differentiation. However, experimental measurements of element partitioning coefficients under extreme temperature and pressure conditions are still challenging. Theoretical modeling is also not easy, because it requires estimation of the high-temperature Gibbs free energy, which is not directly accessible by the standard molecular dynamics method. We recently developed an original technique to simulate the Gibbs free energy based on the thermodynamic integration method [1]. We apply it to the partitioning of geochemically intriguing trace elements between molten silicate and liquid iron, starting with potassium, helium and argon. Radiogenic potassium in the core can provide energy for Earth's magnetic field and for convection in the mantle and outer core [2]. However, its partitioning behavior between silicate and iron remains unclear under high pressure [3,4]. Our calculations suggest a clear positive temperature dependence of the partitioning coefficient but an insignificant pressure effect. Unlike sulfur and silicon, oxygen dissolved in the metal considerably enhances potassium solubility. Calculated electronic structures reveal the alkali-metallic character of potassium in liquid iron, favoring oxygen with its strong electron affinity. Our results suggest that 40K could serve as a potential radiogenic heat source in the outer core if oxygen is the major light element therein. We now further extend our technique to the partitioning behavior of other elements, helium and argon, to gain insights into the 'helium paradox' and 'missing argon' problems. References: [1] T. Taniuchi and T. Tsuchiya, Phys. Rev. B, in press. [2] B.A. Buffett, H.E. Huppert, J.R. Lister, and A.W. Woods, Geophys. Res. Lett. 29 (1996) 7989-8006. [3] V.R. Murthy, W. Westrenen, and Y. Fei, Nature 426 (2003) 163-165. [4] A. Corgne, S. Keshav, Y. Fei, and W.F. McDonough, Earth Planet. Sci. Lett. 256 (2007) 567-576.
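
    In such calculations, the computed Gibbs free energy of transfer maps onto a partition coefficient through a Boltzmann factor. A minimal sketch of that relation, assuming a hypothetical transfer free energy between silicate melt and liquid iron (the numbers are illustrative, not results of this study):

    import math

    R = 8.314  # J/(mol K)

    # Partition coefficient from a melt-to-metal transfer Gibbs free energy:
    #   D = exp(-dG_transfer / (R * T))
    def partition_coefficient(dG_transfer_j_per_mol, temperature_k):
        return math.exp(-dG_transfer_j_per_mol / (R * temperature_k))

    # Illustrative values: dG = 50 kJ/mol at 4000 K gives D of roughly 0.22.
    print(partition_coefficient(50e3, 4000.0))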

  2. Determination of partition coefficients of biomolecules in a microfluidic aqueous two phase system platform using fluorescence microscopy.

    PubMed

    Silva, D F C; Azevedo, A M; Fernandes, P; Chu, V; Conde, J P; Aires-Barros, M R

    2017-03-03

    Aqueous two phase systems (ATPS) offer great potential for selective separation of a wide range of biomolecules by exploring differences in molecular solubility in each of the two immiscible phases. However, ATPS use has been limited due to the difficulty in predicting the behavior of a given biomolecule in the partition environment together with the empirical and time-consuming techniques that are used for the determination of partition and extraction parameters. In this work, a fast and novel technique based on a microfluidic platform and using fluorescence microscopy was developed to determine the partition coefficients of biomolecules in different ATPS. This method consists of using a microfluidic device with a single microchannel and three inlets. In two of the inlets, solutions containing the ATPS-forming components were loaded, while the third inlet was fed with the FITC-tagged biomolecule of interest prepared in Milli-Q water. Using fluorescence microscopy, it was possible to follow the location of the FITC-tagged biomolecule and, by simply varying the pumping rates of the solutions, to quickly test a wide variety of ATPS compositions. The ATPS is allowed 4 min for stabilization and fluorescence micrographs are used to determine the partition coefficient. The partition coefficients obtained were shown to be consistent with results from macroscale ATPS partitioning. This process allows for faster screening of partition coefficients using only a few microliters of material for each ATPS composition and is amenable to automation. The partitioning behavior of several biomolecules with molecular weights (MW) ranging from 5.8 to 150 kDa, and isoelectric points (pI) ranging from 4.7 to 6.4, was investigated, as well as the effect of the molecular weight of the polymer ATPS component. Copyright © 2016 Elsevier B.V. All rights reserved.
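
    With the two phases imaged, the partition coefficient reduces to a ratio of background-corrected fluorescence intensities. A minimal sketch under that assumption; the intensity values are invented for illustration:

    import numpy as np

    # Partition coefficient of a FITC-tagged biomolecule from mean fluorescence
    # intensities in the top and bottom ATPS phases, after background subtraction.
    def partition_coefficient(intensity_top, intensity_bottom, background=0.0):
        top = np.mean(intensity_top) - background
        bottom = np.mean(intensity_bottom) - background
        return top / bottom

    # Illustrative pixel intensities sampled from a micrograph of each phase.
    print(partition_coefficient([820, 805, 812], [240, 236, 244], background=30.0))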

  3. High-throughput determination of octanol/water partition coefficients using a shake-flask method and novel two-phase solvent system.

    PubMed

    Morikawa, Go; Suzuka, Chihiro; Shoji, Atsushi; Shibusawa, Yoichi; Yanagida, Akio

    2016-01-05

    A high-throughput method for determining the octanol/water partition coefficient (P(o/w)) of a large variety of compounds exhibiting a wide range in hydrophobicity was established. The method combines a simple shake-flask method with a novel two-phase solvent system comprising an acetonitrile-phosphate buffer (0.1 M, pH 7.4)-1-octanol (25:25:4, v/v/v; AN system). The AN system partition coefficients (K(AN)) of 51 standard compounds for which log P(o/w) (at pH 7.4; log D) values had been reported were determined by single two-phase partitioning in test tubes, followed by measurement of the solute concentration in both phases using an automatic flow injection-ultraviolet detection system. The log K(AN) values were closely related to reported log D values, and the relationship could be expressed by the following linear regression equation: log D = 2.8630 log K(AN) - 0.1497 (n = 51). The relationship reveals that log D values (+8 to -8) for a large variety of highly hydrophobic and/or hydrophilic compounds can be estimated indirectly from the narrow range of log K(AN) values (+3 to -3) determined using the present method. Furthermore, log K(AN) values for highly polar compounds for which no log D values have been reported, such as amino acids, peptides, proteins, nucleosides, and nucleotides, can be estimated using the present method. The wide-ranging log D values (+5.9 to -7.5) of these molecules were estimated for the first time from their log K(AN) values and the above regression equation. Copyright © 2015 Elsevier B.V. All rights reserved.
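
    With the published regression, a measured K(AN) converts directly to an estimated log D. A short sketch using the reported coefficients; the K(AN) inputs are illustrative:

    import math

    # Reported relationship: log D = 2.8630 * log K(AN) - 0.1497 (n = 51).
    def log_d_from_k_an(k_an):
        return 2.8630 * math.log10(k_an) - 0.1497

    print(log_d_from_k_an(10.0))   # log K(AN) = +1  ->  log D ~ +2.7
    print(log_d_from_k_an(0.01))   # log K(AN) = -2  ->  log D ~ -5.9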

  4. Computer program for calculating and fitting thermodynamic functions

    NASA Technical Reports Server (NTRS)

    Mcbride, Bonnie J.; Gordon, Sanford

    1992-01-01

    A computer program is described which (1) calculates thermodynamic functions (heat capacity, enthalpy, entropy, and free energy) for several optional forms of the partition function, (2) fits these functions to empirical equations by means of a least-squares fit, and (3) calculates, as a function of temperature, heats of formation and equilibrium constants. The program provides several methods for calculating ideal gas properties. For monatomic gases, three methods are given which differ in the technique used for truncating the partition function. For diatomic and polyatomic molecules, five methods are given which differ in the corrections to the rigid-rotator harmonic-oscillator approximation. A method for estimating thermodynamic functions for some species is also given.
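
    For orientation, the rigid-rotator harmonic-oscillator route from a partition function to thermodynamic functions reduces, for a single vibrational mode, to a few closed-form expressions. The sketch below shows those textbook relations only; it is not the program described above, and the characteristic temperature is illustrative:

    import math

    R = 8.31446  # J/(mol K)

    # Vibrational contributions of one harmonic mode with characteristic
    # temperature theta (K) at temperature T (K), derived from the partition
    # function q = 1 / (1 - exp(-theta/T)).
    def vibrational_contributions(theta, T):
        x = theta / T
        ex = math.exp(x)
        cv = R * x**2 * ex / (ex - 1.0)**2                        # heat capacity
        u = R * T * x / (ex - 1.0)                                # internal energy above 0 K
        s = R * (x / (ex - 1.0) - math.log(1.0 - math.exp(-x)))   # entropy
        return cv, u, s

    print(vibrational_contributions(theta=3000.0, T=1000.0))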

  5. Quantified degree of eccentricity of aortic valve calcification predicts risk of paravalvular regurgitation and response to balloon post-dilation after self-expandable transcatheter aortic valve replacement.

    PubMed

    Park, Jun-Bean; Hwang, In-Chang; Lee, Whal; Han, Jung-Kyu; Kim, Chi-Hoon; Lee, Seung-Pyo; Yang, Han-Mo; Park, Eun-Ah; Kim, Hyung-Kwan; Chiam, Paul T L; Kim, Yong-Jin; Koo, Bon-Kwon; Sohn, Dae-Won; Ahn, Hyuk; Kang, Joon-Won; Park, Seung-Jung; Kim, Hyo-Soo

    2018-05-15

    Limited data exist regarding the impact of aortic valve calcification (AVC) eccentricity on the risk of paravalvular regurgitation (PVR) and response to balloon post-dilation (BPD) after transcatheter aortic valve replacement (TAVR). We investigated the prognostic value of AVC eccentricity in predicting the risk of PVR and response to BPD in patients undergoing TAVR. We analyzed 85 patients with severe aortic stenosis who underwent self-expandable TAVR (43 women; 77.2±7.1 years). AVC was quantified as the total amount of calcification (total AVC load) and as the eccentricity of calcium (EoC) using calcium volume scoring with contrast computed tomography angiography (CTA). The EoC was defined as the maximum absolute difference in calcium volume scores between 2 adjacent sectors (bi-partition method) or between sectors based on leaflets (leaflet-based method). Total AVC load and bi-partition EoC, but not leaflet-based EoC, were significant predictors for the occurrence of ≥moderate PVR, and bi-partition EoC had a better predictive value than total AVC load (area under the curve [AUC]=0.863 versus 0.760, p for difference=0.006). In multivariate analysis, bi-partition EoC was an independent predictor for the risk of ≥moderate PVR regardless of perimeter oversizing index. A greater bi-partition EoC was the only significant parameter to predict poor response to BPD (AUC=0.775, p=0.004). Pre-procedural assessment of AVC eccentricity using CTA as "bi-partition EoC" provides useful predictive information on the risk of significant PVR and response to BPD in patients undergoing TAVR with self-expandable valves. Copyright © 2017 Elsevier B.V. All rights reserved.
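
    One plausible reading of the bi-partition score is to sum calcium volumes over the two contiguous halves of the annulus and take the largest absolute difference over all possible bisections. The sketch below implements that reading; both the interpretation and the sector scores are illustrative assumptions, not the authors' exact definition:

    # Bi-partition eccentricity of calcium (EoC) from calcium volume scores of
    # equal angular sectors around the annulus (even number of sectors assumed).
    def bipartition_eoc(sector_scores):
        n = len(sector_scores)
        assert n % 2 == 0, "expects an even number of sectors"
        best = 0.0
        for start in range(n):
            half = sum(sector_scores[(start + i) % n] for i in range(n // 2))
            other = sum(sector_scores[(start + n // 2 + i) % n] for i in range(n // 2))
            best = max(best, abs(half - other))
        return best

    print(bipartition_eoc([120, 300, 450, 80, 60, 90]))  # illustrative sector scores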

  6. The threshold bootstrap clustering: a new approach to find families or transmission clusters within molecular quasispecies.

    PubMed

    Prosperi, Mattia C F; De Luca, Andrea; Di Giambenedetto, Simona; Bracciale, Laura; Fabbiani, Massimiliano; Cauda, Roberto; Salemi, Marco

    2010-10-25

    Phylogenetic methods produce hierarchies of molecular species, inferring knowledge about taxonomy and evolution. However, there is not yet a consensus methodology that provides a crisp partition of taxa, desirable when considering the problem of intra/inter-patient quasispecies classification or infection transmission event identification. We introduce the threshold bootstrap clustering (TBC), a new methodology for partitioning molecular sequences that does not require phylogenetic tree estimation. The TBC is an incremental partition algorithm, inspired by the stochastic Chinese restaurant process, and takes advantage of resampling techniques and models of sequence evolution. TBC uses as input a multiple alignment of molecular sequences and its output is a crisp partition of the taxa into an automatically determined number of clusters. By varying initial conditions, the algorithm can produce different partitions. We describe a procedure that selects a prime partition among a set of candidate ones and calculates a measure of cluster reliability. TBC was successfully tested for the identification of human immunodeficiency virus type 1 (HIV-1) and hepatitis C virus (HCV) subtypes, and compared with previously established methodologies. It was also evaluated on the problem of HIV-1 intra-patient quasispecies clustering, and for transmission cluster identification, using a set of sequences from patients with known transmission event histories. TBC has been shown to be effective for the subtyping of HIV and HCV, and for identifying intra-patient quasispecies. To some extent, the algorithm was also able to infer clusters corresponding to events of infection transmission. The computational complexity of TBC is quadratic in the number of taxa, lower than that of other established methods; in addition, TBC has been enhanced with a measure of cluster reliability. The TBC can be useful to characterise molecular quasispecies in a broad context.

  7. Comparison of salting-out and sugaring-out liquid-liquid extraction methods for the partition of 10-hydroxy-2-decenoic acid in royal jelly and their co-extracted protein content.

    PubMed

    Tu, Xijuan; Sun, Fanyi; Wu, Siyuan; Liu, Weiyi; Gao, Zhaosheng; Huang, Shaokang; Chen, Wenbin

    2018-01-15

    Homogeneous liquid-liquid extraction (h-LLE) has been receiving considerable attention as a sample preparation method due to its simple and fast partitioning of compounds with a wide range of polarities. To better understand the differences between the two h-LLE approaches, salting-out assisted liquid-liquid extraction (SALLE) and sugaring-out assisted liquid-liquid extraction (SULLE) were compared for the partitioning of 10-hydroxy-2-decenoic acid (10-HDA) from royal jelly and for the co-extraction of proteins. The effects of the amount of phase partition agent and the concentration of acetonitrile (ACN) on the h-LLE were discussed. Results showed that the partition efficiency of 10-HDA depends on the phase ratio in both SALLE and SULLE. Though the partitioning triggered by NaCl and glucose is less efficient than that by MgSO4 in the 50% (v/v) ACN-water mixture, their extraction yields can be improved to be similar to that of MgSO4 SALLE by increasing the initial concentration of ACN in the ACN-water mixture. The content of co-extracted protein was correlated with the water concentration in the obtained upper phase. MgSO4 showed the largest protein co-extraction at low salt concentration. Glucose exhibited large protein co-extraction under high phase ratio conditions. Furthermore, NaCl with a high initial ACN concentration is recommended because it produced a high extraction yield for 10-HDA and the lowest amount of co-extracted protein. These observations would be valuable for the sample preparation of royal jelly. Copyright © 2017 Elsevier B.V. All rights reserved.

  8. Salt Stress and Ethylene Antagonistically Regulate Nucleocytoplasmic Partitioning of COP1 to Control Seed Germination

    PubMed Central

    Shi, Hui; Gu, Juntao; Dong, Jingao; Deng, Xing Wang

    2016-01-01

    Seed germination, a critical stage initiating the life cycle of a plant, is severely affected by salt stress. However, the underlying mechanism of salt inhibition of seed germination (SSG) is unclear. Here, we report that the Arabidopsis (Arabidopsis thaliana) CONSTITUTIVE PHOTOMORPHOGENESIS1 (COP1) counteracts SSG. Genetic assays provide evidence that SSG was stronger in the COP1 loss-of-function mutant than in the wild type. A GUS-COP1 fusion was constitutively localized to the nucleus in radicle cells. Salt treatment caused COP1 to be retained in the cytosol, but the addition of the ethylene precursor 1-aminocyclopropane-1-carboxylate had the reverse effect on the translocation of COP1 to the nucleus, revealing that ethylene and salt exert opposite regulatory effects on the localization of COP1 in germinating seeds. However, loss of function of ETHYLENE INSENSITIVE3 (EIN3) impaired the ethylene-mediated rescue of the salt restriction of COP1 to the nucleus. Further research showed that the interaction between COP1 and LONG HYPOCOTYL5 (HY5) had a role in SSG. Correspondingly, SSG was suppressed in the HY5 loss-of-function mutant. Biochemical detection showed that salt promoted the stabilization of HY5, whereas ethylene restricted its accumulation. Furthermore, salt treatment stimulated and ethylene suppressed transcription of ABA INSENSITIVE5 (ABI5), which was directly transcriptionally regulated by HY5. Together, our results reveal that salt stress and ethylene antagonistically regulate nucleocytoplasmic partitioning of COP1, thereby controlling Arabidopsis seed germination via the COP1-mediated down-regulation of HY5 and ABI5. These findings enhance our understanding of the stress response and have great potential for application in agricultural production. PMID:26850275

  9. Roles of adipose restriction and metabolic factors in progression of steatosis to steatohepatitis in obese, diabetic mice.

    PubMed

    Larter, Claire Z; Yeh, Matthew M; Van Rooyen, Derrick M; Teoh, Narci C; Brooling, John; Hou, Jing Yun; Williams, Jacqueline; Clyne, Matthew; Nolan, Christopher J; Farrell, Geoffrey C

    2009-10-01

    We previously reported that steatohepatitis develops in obese, hypercholesterolemic, diabetic foz/foz mice fed a high-fat (HF) diet for 12 months. We now report earlier onset of steatohepatitis in relation to metabolic abnormalities, and clarify the roles of dietary fat and bodily lipid partitioning on steatosis severity, liver injury and inflammatory recruitment in this novel non-alcoholic steatohepatitis (NASH) model. Foz/foz (Alms1 mutant) and wild-type (WT) mice were fed a HF diet or chow, and metabolic characteristics and liver histology were studied at 2, 6, 12 and 24 weeks. After 12 weeks HF-feeding, foz/foz mice were obese and diabetic with approximately 70% reduction in serum adiponectin. Hepatomegaly developed at this time, corresponding to a plateau in adipose expansion and increased adipose inflammation. Liver histology showed mild inflammation and hepatocyte ballooning as well as steatosis. By 24 weeks, HF-fed foz/foz mice developed severe steatohepatitis (marked steatosis, alanine aminotransferase elevation, ballooning, inflammation, fibrosis), whereas dietary and genetic controls showed only simple steatosis. While steatosis was associated with hepatic lipogenesis, indicated by increased fatty acid synthase activity, steatohepatitis was associated with significantly higher levels of CD36, indicating active fatty acid uptake, possibly under the influence of peroxisome proliferator-activated receptor-gamma. In mice genetically predisposed to obesity and diabetes, HF feeding leads to restriction of adipose tissue for accommodation of excess energy, causing lipid partitioning into liver, and transformation of simple steatosis to fibrosing steatohepatitis. The way in which HF feeding 'saturates' adipose stores, decreases serum adiponectin and causes hepatic inflammation in steatohepatitis may provide clues to pathogenesis of NASH in metabolic syndrome.

  10. Restriction of dietary methyl donors limits methionine availability and affects the partitioning of dietary methionine for creatine and phosphatidylcholine synthesis in the neonatal piglet.

    PubMed

    Robinson, Jason L; McBreairty, Laura E; Randell, Edward W; Brunton, Janet A; Bertolo, Robert F

    2016-09-01

    Methionine is required for protein synthesis and provides a methyl group for >50 critical transmethylation reactions including creatine and phosphatidylcholine synthesis as well as DNA and protein methylation. However, the availability of methionine depends on dietary sources as well as remethylation of demethylated methionine (i.e., homocysteine) by the dietary methyl donors folate and choline (via betaine). By restricting dietary methyl supply, we aimed to determine the extent that dietary methyl donors contribute to methionine availability for protein synthesis and transmethylation reactions in neonatal piglets. Piglets 4-8 days of age were fed a diet deficient (MD-) (n=8) or sufficient (MS+) (n=7) in folate, choline and betaine. After 5 days, dietary methionine was reduced to 80% of requirement in both groups to elicit a response. On day 8, animals were fed [(3)H-methyl]methionine for 6h to measure methionine partitioning into hepatic protein, phosphatidylcholine, creatine and DNA. MD- feeding reduced plasma choline, betaine and folate (P<.05) and increased homocysteine ~3-fold (P<.05). With MD- feeding, hepatic phosphatidylcholine synthesis was 60% higher (P<.05) at the expense of creatine synthesis, which was 30% lower during MD- feeding (P<.05); protein synthesis as well as DNA and protein methylation were unchanged. In the liver, ~30% of dietary label was traced to phosphatidylcholine and creatine together, with ~50% traced to methylation of proteins and ~20% incorporated in synthesized protein. Dietary methyl donors are integral to neonatal methionine requirements and can affect methionine availability for transmethylation pathways. Copyright © 2016 Elsevier Inc. All rights reserved.

  11. Salt Stress and Ethylene Antagonistically Regulate Nucleocytoplasmic Partitioning of COP1 to Control Seed Germination.

    PubMed

    Yu, Yanwen; Wang, Juan; Shi, Hui; Gu, Juntao; Dong, Jingao; Deng, Xing Wang; Huang, Rongfeng

    2016-04-01

    Seed germination, a critical stage initiating the life cycle of a plant, is severely affected by salt stress. However, the underlying mechanism of salt inhibition of seed germination (SSG) is unclear. Here, we report that the Arabidopsis (Arabidopsis thaliana) CONSTITUTIVE PHOTOMORPHOGENESIS1 (COP1) counteracts SSG. Genetic assays provide evidence that SSG was stronger in the COP1 loss-of-function mutant than in the wild type. A GUS-COP1 fusion was constitutively localized to the nucleus in radicle cells. Salt treatment caused COP1 to be retained in the cytosol, but the addition of the ethylene precursor 1-aminocyclopropane-1-carboxylate had the reverse effect on the translocation of COP1 to the nucleus, revealing that ethylene and salt exert opposite regulatory effects on the localization of COP1 in germinating seeds. However, loss of function of ETHYLENE INSENSITIVE3 (EIN3) impaired the ethylene-mediated rescue of the salt restriction of COP1 to the nucleus. Further research showed that the interaction between COP1 and LONG HYPOCOTYL5 (HY5) had a role in SSG. Correspondingly, SSG was suppressed in the HY5 loss-of-function mutant. Biochemical detection showed that salt promoted the stabilization of HY5, whereas ethylene restricted its accumulation. Furthermore, salt treatment stimulated and ethylene suppressed transcription of ABA INSENSITIVE5 (ABI5), which was directly transcriptionally regulated by HY5. Together, our results reveal that salt stress and ethylene antagonistically regulate nucleocytoplasmic partitioning of COP1, thereby controlling Arabidopsis seed germination via the COP1-mediated down-regulation of HY5 and ABI5. These findings enhance our understanding of the stress response and have great potential for application in agricultural production. © 2016 American Society of Plant Biologists. All Rights Reserved.

  12. Reconciling Isotopic Partitioning Estimates of Moisture Fluxes in Semi-arid Landscapes Through a New Modeling Approach for Evaporation

    NASA Astrophysics Data System (ADS)

    Kaushik, A.; Berkelhammer, M. B.; O'Neill, M.; Noone, D.

    2017-12-01

    The partitioning of land surface latent heat flux into evaporation (E) and transpiration (T) remains a challenging problem despite a basic understanding of the underlying mechanisms. Water isotopes are useful tracers for separating evaporation and transpiration contributions because E and T have distinct isotopic ratios. Here we use the isotope-based partitioning method at a semi-arid grassland tall-tower site in Colorado. Our results suggest that under certain conditions evaporation cannot be isotopically distinguished from transpiration without modification of existing partitioning techniques. Over a 4-year period, we measured profiles of stable oxygen and hydrogen isotope ratios of water vapor from the surface to 300 m and soil water down to 1 m along with standard meteorological fluxes. Using these data, we evaluated the contributions of rainfall, equilibration, surface water vapor exchange and sub-surface vapor diffusion to the isotopic composition of evapotranspiration (ET). Applying the standard isotopic approach to find the transpiration portion of ET (i.e., T/ET), we see a significant discrepancy compared with a method to constrain T/ET based on gross primary productivity (GPP). By evaluating the kinetic fractionation associated with soil evaporation and vapor diffusion, we find that a significant proportion (58-84%) of evaporation following precipitation is non-fractionating. This is possible when water from isolated soil layers is being nearly completely evaporated. Non-fractionating evaporation looks isotopically like transpiration and therefore leads to an overestimation of T/ET. Including non-fractionating evaporation reconciles the isotope-based partitioning estimates of T/ET with the GPP method, and may explain the overestimation of T/ET from isotopes compared to other methods. Finally, we examine the application of non-fractionating evaporation to other boundary layer moisture flux processes such as rain evaporation, where complete evaporation of smaller drop pools may produce a similarly weaker kinetic effect.

  13. Dynamic partitioning for hybrid simulation of the bistable HIV-1 transactivation network.

    PubMed

    Griffith, Mark; Courtney, Tod; Peccoud, Jean; Sanders, William H

    2006-11-15

    The stochastic kinetics of a well-mixed chemical system, governed by the chemical Master equation, can be simulated using the exact methods of Gillespie. However, these methods do not scale well as systems become more complex and larger models are built to include reactions with widely varying rates, since the computational burden of simulation increases with the number of reaction events. Continuous models may provide an approximate solution and are computationally less costly, but they fail to capture the stochastic behavior of small populations of macromolecules. In this article we present a hybrid simulation algorithm that dynamically partitions the system into subsets of continuous and discrete reactions, approximates the continuous reactions deterministically as a system of ordinary differential equations (ODE) and uses a Monte Carlo method for generating discrete reaction events according to a time-dependent propensity. Our approach to partitioning is improved such that we dynamically partition the system of reactions, based on a threshold relative to the distribution of propensities in the discrete subset. We have implemented the hybrid algorithm in an extensible framework, utilizing two rigorous ODE solvers to approximate the continuous reactions, and use an example model to illustrate the accuracy and potential speedup of the algorithm when compared with exact stochastic simulation. Software and benchmark models used for this publication can be made available upon request from the authors.
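
    The partitioning step itself can be sketched compactly: reactions are split into discrete and continuous subsets by comparing propensities against a threshold. The sketch below is a simplified illustration of that idea, not the authors' exact criterion:

    import numpy as np

    # Split reactions into a discrete (slow) subset, handled by stochastic event
    # generation, and a continuous (fast) subset, approximated by ODEs. The
    # threshold here is a simple fraction of the largest propensity.
    def partition_reactions(propensities, rel_threshold=0.05):
        a = np.asarray(propensities, dtype=float)
        cutoff = rel_threshold * a.max()
        discrete = np.where(a < cutoff)[0]
        continuous = np.where(a >= cutoff)[0]
        return discrete, continuous

    # Illustrative propensities; the partition would be recomputed as they evolve.
    print(partition_reactions([0.01, 0.3, 12.0, 450.0, 2.1]))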

  14. Overlapping clusters for distributed computation.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mirrokni, Vahab; Andersen, Reid; Gleich, David F.

    2010-11-01

    Scalable, distributed algorithms must address communication problems. We investigate overlapping clusters, or vertex partitions that intersect, for graph computations. This setup stores more of the graph than required but then affords the ease of implementation of vertex partitioned algorithms. Our hope is that this technique allows us to reduce communication in a computation on a distributed graph. The motivation above draws on recent work in communication avoiding algorithms. Mohiyuddin et al. (SC09) design a matrix-powers kernel that gives rise to an overlapping partition. Fritzsche et al. (CSC2009) develop an overlapping clustering for a Schwarz method. Both techniques extend an initial partitioning with overlap. Our procedure generates overlap directly. Indeed, Schwarz methods are commonly used to capitalize on overlap. Elsewhere, overlapping communities (Ahn et al., Nature 2009; Mishra et al., WAW2007) are now a popular model of structure in social networks. These have long been studied in statistics (Cole and Wishart, CompJ 1970). We present two types of results: (i) an estimated swapping probability ρ∞; and (ii) the communication volume of a parallel PageRank solution (link-following α = 0.85) using an additive Schwarz method. The volume ratio is the amount of extra storage for the overlap (2 means we store the graph twice). As the ratio increases, the swapping probability and the PageRank communication volume decrease.

  15. A novel method for measuring polymer-water partition coefficients.

    PubMed

    Zhu, Tengyi; Jafvert, Chad T; Fu, Dafang; Hu, Yue

    2015-11-01

    Low density polyethylene (LDPE) often is used as the sorbent material in passive sampling devices to estimate the average temporal chemical concentration in water bodies or sediment pore water. To calculate water phase chemical concentrations from LDPE concentrations accurately, it is necessary to know the LDPE-water partition coefficients (KPE-w) of the chemicals of interest. However, even moderately hydrophobic chemicals have large KPE-w values, making direct measurement experimentally difficult. In this study we evaluated a simple three phase system from which KPE-w can be determined easily and accurately. In the method, chemical equilibrium distribution between LDPE and a surfactant micelle pseudo-phase is measured, with the ratio of these concentrations equal to the LDPE-micelle partition coefficient (KPE-mic). By employing sufficient mass of polymer and surfactant (Brij 30), the mass of chemical in the water phase remains negligible, albeit in equilibrium. In parallel, the micelle-water partition coefficient (Kmic-w) is determined experimentally. KPE-w is the product of KPE-mic and Kmic-w. The method was applied to measure values of KPE-w for 17 polycyclic aromatic hydrocarbons, 37 polychlorinated biphenyls, and 9 polybrominated diphenylethers. These values were compared to literature values. Mass fraction-based chemical activity coefficients (γ) were determined in each phase and showed that for each chemical, the micelles and LDPE had nearly identical affinity. Copyright © 2014 Elsevier Ltd. All rights reserved.
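
    The core of the three-phase route is multiplicative: the LDPE-water coefficient is the product of the two measured coefficients, K(PE-w) = K(PE-mic) × K(mic-w). A minimal sketch with invented values, not data from the study:

    import math

    # Combine the LDPE-micelle and micelle-water partition coefficients into the
    # LDPE-water partition coefficient and report it on a log10 scale.
    def log_k_pe_w(k_pe_mic, k_mic_w):
        return math.log10(k_pe_mic * k_mic_w)

    print(log_k_pe_w(k_pe_mic=150.0, k_mic_w=2.0e4))  # log K(PE-w) of about 6.5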

  16. An application of the patient rule-induction method for evaluating the contribution of the Apolipoprotein E and Lipoprotein Lipase genes to predicting ischemic heart disease.

    PubMed

    Dyson, Greg; Frikke-Schmidt, Ruth; Nordestgaard, Børge G; Tybjaerg-Hansen, Anne; Sing, Charles F

    2007-09-01

    Different combinations of genetic and environmental risk factors are known to contribute to the complex etiology of ischemic heart disease (IHD) in different subsets of individuals. We employed the Patient Rule-Induction Method (PRIM) to select the combination of risk factors and risk factor values that identified each of 16 mutually exclusive partitions of individuals having significantly different levels of risk of IHD. PRIM balances two competing objectives: (1) finding partitions where the risk of IHD is high and (2) maximizing the number of IHD cases explained by the partitions. A sequential PRIM analysis was applied to data on the incidence of IHD collected over 8 years for a sample of 5,455 unrelated individuals from the Copenhagen City Heart Study (CCHS) to assess the added value of variation in two candidate susceptibility genes beyond the traditional, lipid and body mass index risk factors for IHD. An independent sample of 362 unrelated individuals also from the city of Copenhagen was used to test the model obtained for each of the hypothesized partitions. Copyright (c) 2007 Wiley-Liss, Inc.
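
    PRIM itself proceeds by "peeling": repeatedly trimming a small fraction of observations along whichever risk-factor boundary most raises the outcome rate inside the remaining box. The sketch below is a generic, hedged illustration of that peeling loop on synthetic data; it is not the published CCHS analysis:

    import numpy as np

    # Peel alpha-fraction slices off one variable at a time, keeping the peel that
    # maximizes the mean outcome in the remaining box; stop when support < beta_min
    # or when no peel improves the box mean.
    def prim_peel(X, y, alpha=0.05, beta_min=0.1):
        inside = np.ones(len(y), dtype=bool)
        while inside.mean() > beta_min:
            best_mean, best_mask = -np.inf, None
            for j in range(X.shape[1]):
                lo = np.quantile(X[inside, j], alpha)
                hi = np.quantile(X[inside, j], 1.0 - alpha)
                for mask in (inside & (X[:, j] > lo), inside & (X[:, j] < hi)):
                    if mask.sum() and y[mask].mean() > best_mean:
                        best_mean, best_mask = y[mask].mean(), mask
            if best_mask is None or best_mean <= y[inside].mean():
                break
            inside = best_mask
        return inside

    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 3))
    y = (X[:, 0] > 1.0).astype(float)      # synthetic "risk" concentrated at high X0
    box = prim_peel(X, y)
    print(box.sum(), y[box].mean())        # small box with an elevated event rate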

  17. Anti-tuberculosis and cytotoxic evaluation of the seaweed Sargassum boveanum.

    PubMed

    Akbari, Vajihe; Zafari, Saeed; Yegdaneh, Afsaneh

    2018-02-01

    Marine seaweeds produce a variety of compounds with different biological activities, including antituberculosis and anticancer effects. The aim of this study was to investigate anti-tuberculosis activity of Sargassum boveanum ( S. boveanum ) and cytotoxicity of different fractions of this seaweed. S. boveanum was collected from Persian Gulf. The plant was extracted by maceration with methanol-ethyl acetate solvent. The extract was evaporated and partitioned by Kupchan method to yield hexane, tricholoroethane, chloroform, and butanol partitions. The anti-tuberculosis activity of the crude extract and toxicity of the fractions were investigated using green fluorescent protein reporter microplate assay and 3-(4,5-dimethylthiazol-2-yl)-2,5-diphenyltetrazolium bromide assay methods, respectively. The cell survivals of HeLa cell were decreased by increasing the concentration of the extracts. The IC 50 values of hexane, tricholoroethane, chloroform, and butanol partitions were 150.3 ± 23.10, 437.0 ± 147.3, 110.4 ± 33.67, and 1025.0 ± 15.20 μg/mL, respectively. The crude extract was not active against tuberculosis. This study reveals that different partitions of S. boveanum have cytotoxic activity against the cancer cell lines.

  18. Aqueous solubility, effects of salts on aqueous solubility, and partitioning behavior of hexafluorobenzene: experimental results and COSMO-RS predictions.

    PubMed

    Schröder, Bernd; Freire, Mara G; Varanda, Fatima R; Marrucho, Isabel M; Santos, Luís M N B F; Coutinho, João A P

    2011-07-01

    The aqueous solubility of hexafluorobenzene has been determined, at 298.15 K, using a shake-flask method with a spectrophotometric quantification technique. Furthermore, the solubility of hexafluorobenzene in saline aqueous solutions, at distinct salt concentrations, has been measured. Both salting-in and salting-out effects were observed and found to be dependent on the nature of the cationic/anionic composition of the salt. COSMO-RS, the Conductor-like Screening Model for Real Solvents, has been used to predict the corresponding aqueous solubilities at conditions similar to those used experimentally. The prediction results showed that the COSMO-RS approach is suitable for the prediction of salting-in/-out effects. The salting-in/-out phenomena have been rationalized with the support of COSMO-RS σ-profiles. The prediction potential of COSMO-RS regarding aqueous solubilities and octanol-water partition coefficients has been compared with typically used QSPR-based methods. Up to now, the absence of accurate solubility data for hexafluorobenzene hampered the calculation of the respective partition coefficients. Combining available accurate vapor pressure data with the experimentally determined water solubility, a novel air-water partition coefficient has been derived. Copyright © 2011 Elsevier Ltd. All rights reserved.
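
    In the dilute limit, an air-water partition coefficient derived this way reduces to a Henry's-law ratio of vapor pressure to aqueous solubility, made dimensionless with RT. A hedged sketch of that calculation with illustrative inputs, not the measured hexafluorobenzene values:

    # H = p_sat / S_aq (Pa m3/mol); dimensionless K_aw = H / (R * T) = c_air / c_water.
    R = 8.314  # Pa m3 / (mol K)

    def air_water_partition(p_sat_pa, solubility_mol_m3, temperature_k=298.15):
        henry = p_sat_pa / solubility_mol_m3
        return henry / (R * temperature_k)

    print(air_water_partition(p_sat_pa=10000.0, solubility_mol_m3=5.0))  # ~0.8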

  19. Improved image decompression for reduced transform coding artifacts

    NASA Technical Reports Server (NTRS)

    Orourke, Thomas P.; Stevenson, Robert L.

    1994-01-01

    The perceived quality of images reconstructed from low bit rate compression is severely degraded by the appearance of transform coding artifacts. This paper proposes a method for producing higher quality reconstructed images based on a stochastic model for the image data. Quantization (scalar or vector) partitions the transform coefficient space and maps all points in a partition cell to a representative reconstruction point, usually taken as the centroid of the cell. The proposed image estimation technique selects the reconstruction point within the quantization partition cell which results in a reconstructed image which best fits a non-Gaussian Markov random field (MRF) image model. This approach results in a convex constrained optimization problem which can be solved iteratively. At each iteration, the gradient projection method is used to update the estimate based on the image model. In the transform domain, the resulting coefficient reconstruction points are projected to the particular quantization partition cells defined by the compressed image. Experimental results will be shown for images compressed using scalar quantization of block DCT and using vector quantization of subband wavelet transform. The proposed image decompression provides a reconstructed image with reduced visibility of transform coding artifacts and superior perceived quality.

  20. THE SEDIMENTATION PROPERTIES OF THE SKIN-SENSITIZING ANTIBODIES OF RAGWEED-SENSITIVE PATIENTS

    PubMed Central

    Andersen, Burton R.; Vannier, Wilton E.

    1964-01-01

    The sedimentation coefficients of the skin-sensitizing antibodies to ragweed were evaluated by the moving partition cell method and the sucrose density gradient method. The most reliable results were obtained by sucrose density gradient ultracentrifugation which showed that the major portion of skin-sensitizing antibodies to ragweed sediment with an average value of 7.7S (7.4 to 7.9S). This is about one S unit faster than γ-globulins (6.8S). The data from the moving partition cell method are in agreement with these results. Our studies failed to demonstrate heterogeneity of the skin-sensitizing antibodies with regard to sedimentation rate. PMID:14194391

  1. Electron correlation at the MgF2(110) surface: a comparison of incremental and local correlation methods.

    PubMed

    Hammerschmidt, Lukas; Maschio, Lorenzo; Müller, Carsten; Paulus, Beate

    2015-01-13

    We have applied the Method of Increments and the periodic Local-MP2 approach to the study of the (110) surface of magnesium fluoride, a system of significant interest in heterogeneous catalysis. After careful assessment of the approximations inherent in both methods, the two schemes, though conceptually different, are shown to yield nearly identical results. This remains true even when analyzed in fine detail through partition of the individual contribution to the total energy. This kind of partitioning also provides thorough insight into the electron correlation effects underlying the surface formation process, which are discussed in detail.

  2. Partitioning Ocean Wave Spectra Obtained from Radar Observations

    NASA Astrophysics Data System (ADS)

    Delaye, Lauriane; Vergely, Jean-Luc; Hauser, Daniele; Guitton, Gilles; Mouche, Alexis; Tison, Celine

    2016-08-01

    2D wave spectra of ocean waves can be partitioned into several wave components to better characterize the scene. We present here two methods of component detection: one based on a watershed algorithm and the other based on a Bayesian approach. We tested both methods on a set of simulated data for SWIM, the Ku-band real aperture radar embarked on the CFOSAT (China-France Oceanography Satellite) mission, whose launch is planned for mid-2018. We present the results and the limits of both approaches and show that the Bayesian method can also be applied to other kinds of wave spectrum observations, such as those obtained with the airborne radar wave spectrometer KuROS.

  3. Rapid analysis of dissolved methane, ethylene, acetylene and ethane using partition coefficients and headspace-gas chromatography.

    PubMed

    Lomond, Jasmine S; Tong, Anthony Z

    2011-01-01

    Analysis of dissolved methane, ethylene, acetylene, and ethane in water is crucial in evaluating anaerobic activity and investigating the sources of hydrocarbon contamination in aquatic environments. A rapid chromatographic method based on phase equilibrium between water and its headspace is developed for these analytes. The new method requires minimal sample preparation and no special apparatus except those associated with gas chromatography. Instead of Henry's Law used in similar previous studies, partition coefficients are used for the first time to calculate concentrations of dissolved hydrocarbon gases, which considerably simplifies the calculation involved. Partition coefficients are determined to be 128, 27.9, 1.28, and 96.3 at 30°C for methane, ethylene, acetylene, and ethane, respectively. It was discovered that the volume ratio of gas-to-liquid phase is critical to the accuracy of the measurements. The method performance can be readily improved by reducing the volume ratio of the two phases. Method validation shows less than 6% variation in accuracy and precision except at low levels of methane where interferences occur in ambient air. Method detection limits are determined to be in the low ng/L range for all analytes. The performance of the method is further tested using environmental samples collected from various sites in Nova Scotia.
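
    Recovering the original dissolved concentration from the headspace reading is a one-line mass balance once the partition coefficient and the gas-to-liquid volume ratio are known. A hedged sketch, assuming the coefficient is defined as the gas-to-liquid concentration ratio at equilibrium (both the convention and the numbers are illustrative):

    # Mass balance: C0 * V_l = C_l * V_l + C_g * V_g, with C_l = C_g / K,
    # so C0 = C_g * (1/K + V_g / V_l).
    def dissolved_concentration(c_gas, k, v_gas, v_liquid):
        return c_gas * (1.0 / k + v_gas / v_liquid)

    # Illustrative: headspace reading of 2.0 ug/L, K = 27.9, 5 mL gas over 15 mL water.
    print(dissolved_concentration(c_gas=2.0, k=27.9, v_gas=5.0, v_liquid=15.0))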

  4. Using Replicates in Information Retrieval Evaluation.

    PubMed

    Voorhees, Ellen M; Samarov, Daniel; Soboroff, Ian

    2017-09-01

    This article explores a method for more accurately estimating the main effect of the system in a typical test-collection-based evaluation of information retrieval systems, thus increasing the sensitivity of system comparisons. Randomly partitioning the test document collection allows for multiple tests of a given system and topic (replicates). Bootstrap ANOVA can use these replicates to extract system-topic interactions-something not possible without replicates-yielding a more precise value for the system effect and a narrower confidence interval around that value. Experiments using multiple TREC collections demonstrate that removing the topic-system interactions substantially reduces the confidence intervals around the system effect as well as increases the number of significant pairwise differences found. Further, the method is robust against small changes in the number of partitions used, against variability in the documents that constitute the partitions, and the measure of effectiveness used to quantify system effectiveness.
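
    The replicate construction is simply a random partition of the document collection into equal parts, so each system/topic pair can be scored once per part. A minimal sketch of that step only (scoring runs and the bootstrap ANOVA are outside its scope):

    import numpy as np

    # Randomly partition document identifiers into k near-equal replicates.
    def partition_documents(doc_ids, k, seed=0):
        rng = np.random.default_rng(seed)
        shuffled = rng.permutation(doc_ids)
        return np.array_split(shuffled, k)

    for part in partition_documents([f"doc{i}" for i in range(10)], k=3):
        print(list(part))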

  5. TriageTools: tools for partitioning and prioritizing analysis of high-throughput sequencing data.

    PubMed

    Fimereli, Danai; Detours, Vincent; Konopka, Tomasz

    2013-04-01

    High-throughput sequencing is becoming a popular research tool but carries with it considerable costs in terms of computation time, data storage and bandwidth. Meanwhile, some research applications focusing on individual genes or pathways do not necessitate processing of a full sequencing dataset. Thus, it is desirable to partition a large dataset into smaller, manageable, but relevant pieces. We present a toolkit for partitioning raw sequencing data that includes a method for extracting reads that are likely to map onto pre-defined regions of interest. We show the method can be used to extract information about genes of interest from DNA or RNA sequencing samples in a fraction of the time and disk space required to process and store a full dataset. We report speedup factors between 2.6 and 96, depending on settings and samples used. The software is available at http://www.sourceforge.net/projects/triagetools/.
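
    A generic way to approximate the read-extraction step is shared k-mer counting against the region of interest; the sketch below is only a hedged illustration of that idea, not the TriageTools implementation:

    # Keep reads that share at least min_hits k-mers with a region of interest.
    def kmers(seq, k=16):
        return {seq[i:i + k] for i in range(len(seq) - k + 1)}

    def select_reads(reads, region_seq, k=16, min_hits=2):
        target = kmers(region_seq, k)
        return [r for r in reads if len(kmers(r, k) & target) >= min_hits]

    region = "ACGT" * 20                 # illustrative region of interest
    reads = ["ACGT" * 10, "TTTT" * 10]   # only the first read overlaps the region
    print(select_reads(reads, region))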

  6. Using Replicates in Information Retrieval Evaluation

    PubMed Central

    VOORHEES, ELLEN M.; SAMAROV, DANIEL; SOBOROFF, IAN

    2018-01-01

    This article explores a method for more accurately estimating the main effect of the system in a typical test-collection-based evaluation of information retrieval systems, thus increasing the sensitivity of system comparisons. Randomly partitioning the test document collection allows for multiple tests of a given system and topic (replicates). Bootstrap ANOVA can use these replicates to extract system-topic interactions—something not possible without replicates—yielding a more precise value for the system effect and a narrower confidence interval around that value. Experiments using multiple TREC collections demonstrate that removing the topic-system interactions substantially reduces the confidence intervals around the system effect as well as increases the number of significant pairwise differences found. Further, the method is robust against small changes in the number of partitions used, against variability in the documents that constitute the partitions, and the measure of effectiveness used to quantify system effectiveness. PMID:29905334

  7. SO(N) restricted Schur polynomials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kemp, Garreth, E-mail: garreth.kemp@students.wits.ac.za

    2015-02-15

    We focus on the 1/4-BPS sector of free super Yang-Mills theory with an SO(N) gauge group. This theory has an AdS/CFT (an equivalence between a conformal field theory in d-1 dimensions and type II string theory defined on an AdS space in d dimensions) dual in the form of type IIB string theory with AdS_5 × RP^5 geometry. With the aim of studying excited giant graviton dynamics, we construct an orthogonal basis for this sector of the gauge theory in this work. First, we demonstrate that the counting of states, as given by the partition function, and the counting of restricted Schur polynomials match by restricting to a particular class of Young diagram labels. We then give an explicit construction of these gauge invariant operators and evaluate their two-point function exactly. This paves the way to studying the spectral problem of these operators and their D-brane duals.

  8. Dietary fat and not calcium supplementation or dairy product consumption is associated with changes in anthropometrics during a randomized, placebo-controlled energy-restriction trial.

    PubMed

    Smilowitz, Jennifer T; Wiest, Michelle M; Teegarden, Dorothy; Zemel, Michael B; German, J Bruce; Van Loan, Marta D

    2011-10-05

    Insufficient calcium intake has been proposed to cause unbalanced energy partitioning leading to obesity. However, weight loss interventions including dietary calcium or dairy product consumption have not reported changes in lipid metabolism measured by the plasma lipidome. The objective of this study was to determine the relationships between dairy product or supplemental calcium intake and changes in the plasma lipidome and body composition during energy restriction. A secondary objective of this study was to explore the relationships between the calculated macronutrient composition of the energy-restricted diet and changes in the plasma lipidome and body composition during energy restriction. Overweight adults (n = 61) were randomized into one of three intervention groups including a deficit of 500 kcal/d: 1) placebo; 2) 900 mg/d calcium supplement; and 3) 3-4 servings of dairy products/d plus a placebo supplement. Plasma fatty acid methyl esters of cholesterol ester, diacylglycerol, free fatty acids, lysophosphatidylcholine, phosphatidylcholine, phosphatidylethanolamine and triacylglycerol were quantified by capillary gas chromatography. After adjustments for energy and protein (g/d) intake, there was no significant effect of treatment on changes in weight, waist circumference or body composition. The plasma lipidome did not differ among dietary treatment groups. Stepwise regression identified correlations between reported intake of monounsaturated fat (% of energy) and changes in % lean mass (r = -0.44, P < 0.01) and % body fat (r = 0.48, P < 0.001). Polyunsaturated fat intake was associated with the % change in waist circumference (r = 0.44, P < 0.01). Dietary saturated fat was not associated with any changes in anthropometrics or the plasma lipidome. Dairy product consumption or calcium supplementation during energy restriction over the course of 12 weeks did not affect plasma lipids. Independent of calcium and dairy product consumption, short-term energy restriction altered body composition. Reported dietary fat composition of energy-restricted diets was associated with the degree of change in body composition in these overweight and obese individuals.

  9. A New Algorithm to Optimize Maximal Information Coefficient

    PubMed Central

    Luo, Feng; Yuan, Zheming

    2016-01-01

    The maximal information coefficient (MIC) captures dependences between paired variables, including both functional and non-functional relationships. In this paper, we develop a new method, ChiMIC, to calculate MIC values. The ChiMIC algorithm uses the chi-square test to terminate grid optimization and thereby removes the maximal grid-size restriction of the original ApproxMaxMI algorithm. Computational experiments show that the ChiMIC algorithm maintains the same MIC values for noiseless functional relationships but gives much smaller MIC values for independent variables. For noisy functional relationships, the ChiMIC algorithm reaches the optimal partition much faster. Furthermore, the MCN values based on MIC calculated by ChiMIC capture the complexity of functional relationships better, and the statistical powers of MIC calculated by ChiMIC are higher than those calculated by ApproxMaxMI. Moreover, the computational costs of ChiMIC are much lower than those of ApproxMaxMI. We apply the MIC values to feature selection and obtain better classification accuracy using features selected by the MIC values from ChiMIC. PMID:27333001
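
    The sketch below is only an illustration of the general idea, not the published ChiMIC algorithm: it estimates a normalized mutual information on successively finer equal-width grids and stops refining once a chi-square test of independence no longer rejects at the chosen level. The grid schedule, significance level, and the linear test relationship are assumptions made for the example.

```python
# Illustrative sketch (not the published ChiMIC code): refine a square grid
# while the chi-square test still indicates dependence, and track the largest
# normalized mutual information seen so far.
import numpy as np
from scipy.stats import chi2_contingency

def grid_mi(x, y, bins):
    """Mutual information (bits) of an equal-width bins x bins grid."""
    counts, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = counts / counts.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

def chi_terminated_mic(x, y, alpha=0.05, max_bins=32):
    best = 0.0
    for bins in range(2, max_bins + 1):
        counts, _, _ = np.histogram2d(x, y, bins=bins)
        # skip degenerate tables (an empty row/column breaks the test)
        if (counts.sum(axis=0) == 0).any() or (counts.sum(axis=1) == 0).any():
            break
        _, p, _, _ = chi2_contingency(counts)
        if p > alpha:              # refinement no longer statistically justified
            break
        best = max(best, grid_mi(x, y, bins) / np.log2(bins))
    return best

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 500)
print(chi_terminated_mic(x, x + 0.05 * rng.normal(size=500)))   # monotone functional
print(chi_terminated_mic(x, rng.uniform(-1, 1, 500)))           # independent
```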

  10. Research on Crack Formation in Gypsum Partitions with Doorway by Means of FEM and Fracture Mechanics

    NASA Astrophysics Data System (ADS)

    Kania, Tomasz; Stawiski, Bohdan

    2017-10-01

    Cracking damage in non-loadbearing internal partition walls is a serious problem that frequently occurs in new buildings shortly after they are put into service, or even before construction is complete. Damage in partition walls is sometimes so severe that occupiers cannot accept them. This problem is illustrated by the example of damage in a gypsum partition wall with a doorway, attributed to deflection of the slabs beneath and above it. In searching for the deflection that causes damage in masonry walls, fracture mechanics combined with the Finite Element Method (FEM) has been used. For the description of gypsum behaviour, the smeared cracking material model was selected, in which stresses are transferred across a narrowly opened crack until its width reaches the ultimate value. Cracks in the Finite Element models overlapped the real damage observed in the buildings. In order to avoid cracks under the deflection of large floor slabs, a model of a wall with reinforcement in the doorstep zone and a 40 mm thick elastic junction between the partition and the ceiling was analysed.

  11. Gas-particle partitioning of alcohol vapors on organic aerosols.

    PubMed

    Chan, Lap P; Lee, Alex K Y; Chan, Chak K

    2010-01-01

    Single particle levitation using an electrodynamic balance (EDB) has been found to give accurate and direct hygroscopic measurements (gas-particle partitioning of water) for a number of inorganic and organic aerosol systems. In this paper, we extend the use of an EDB to examine the gas-particle partitioning of volatile to semivolatile alcohols, including methanol, n-butanol, n-octanol, and n-decanol, on levitated oleic acid particles. The measured K(p) agreed with Pankow's absorptive partitioning model. At high n-butanol vapor concentrations (10(3) ppm), the uptake of n-butanol reduced the average molecular weight of the oleic acid particle appreciably and hence increased the K(p) according to Pankow's equation. Moreover, the hygroscopicity of mixed oleic acid/n-butanol particles was higher than the predictions given by the UNIFAC model (molecular group contribution method) and the ZSR equation (additive rule), presumably due to molecular interactions between the chemical species in the mixed particles. Despite the high vapor concentrations used, these findings warrant further research on the partitioning of atmospheric organic vapors (K(p)) near sources and how collectively they affect the hygroscopic properties of organic aerosols.
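
    The molecular-weight dependence mentioned in the abstract can be made explicit with the form in which Pankow's absorptive partitioning coefficient is commonly written, Kp = 760 R T f_om / (MW_om 10^6 ζ pL0). The sketch below is a hedged illustration only; the parameter values (oleic acid molecular weight, activity coefficient of one, an assumed solute vapor pressure) are not taken from the study.

```python
# Hedged sketch of Pankow-style absorptive partitioning; all parameter values
# are illustrative assumptions, not data from the record.
R = 8.206e-5          # gas constant, m3 atm mol-1 K-1

def pankow_kp(T, f_om, mw_om, zeta, p_l0_torr):
    """Kp (m3 ug-1) = 760 R T f_om / (MW_om * 1e6 * zeta * pL0[torr])."""
    return 760.0 * R * T * f_om / (mw_om * 1e6 * zeta * p_l0_torr)

# Uptake of a light solute such as n-butanol lowers the average molecular
# weight of the absorbing phase, which raises Kp for all partitioning solutes.
for mw in (282.0, 250.0, 220.0):   # hypothetical average MW of the particle phase
    print(mw, pankow_kp(T=298.0, f_om=1.0, mw_om=mw, zeta=1.0, p_l0_torr=7.0))
```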

  12. Measuring the critical band for speech.

    PubMed

    Healy, Eric W; Bacon, Sid P

    2006-02-01

    The current experiments were designed to measure the frequency resolution employed by listeners during the perception of everyday sentences. Speech bands having nearly vertical filter slopes and narrow bandwidths were sharply partitioned into various numbers of equal log- or ERBN-width subbands. The temporal envelope from each partition was used to amplitude modulate a corresponding band of low-noise noise, and the modulated carriers were combined and presented to normal-hearing listeners. Intelligibility increased and reached asymptote as the number of partitions increased. In the mid- and high-frequency regions of the speech spectrum, the partition bandwidth corresponding to asymptotic performance matched current estimates of psychophysical tuning across a number of conditions. These results indicate that, in these regions, the critical band for speech matches the critical band measured using traditional psychoacoustic methods and nonspeech stimuli. However, in the low-frequency region, partition bandwidths at asymptote were somewhat narrower than would be predicted based upon psychophysical tuning. It is concluded that, overall, current estimates of psychophysical tuning represent reasonably well the ability of listeners to extract spectral detail from running speech.

  13. Plant interspecies competition for sunlight: a mathematical model of canopy partitioning.

    PubMed

    Nevai, Andrew L; Vance, Richard R

    2007-07-01

    We examine the influence of canopy partitioning on the outcome of competition between two plant species that interact only by mutually shading each other. This analysis is based on a Kolmogorov-type canopy partitioning model for plant species with clonal growth form and fixed vertical leaf profiles (Vance and Nevai in J. Theor. Biol., 2007, to appear). We show that canopy partitioning is necessary for the stable coexistence of the two competing plant species. We also use implicit methods to show that, under certain conditions, the species' nullclines can intersect at most once. We use nullcline endpoint analysis to show that when the nullclines do intersect, and in such a way that they cross, then the resulting equilibrium point is always stable. We also construct surfaces that divide parameter space into regions within which the various outcomes of competition occur, and then study parameter dependence in the locations of these surfaces. The analysis presented here and in a companion paper (Nevai and Vance, The role of leaf height in plant competition for sunlight: analysis of a canopy partitioning model, in review) together shows that canopy partitioning is both necessary and, under appropriate parameter values, sufficient for the stable coexistence of two hypothetical plant species whose structure and growth are described by our model.

  14. Analytical prediction of the interior noise for cylindrical models of aircraft fuselages for prescribed exterior noise fields. Phase 2: Models for sidewall trim, stiffened structures and cabin acoustics with floor partition

    NASA Technical Reports Server (NTRS)

    Pope, L. D.; Wilby, E. G.

    1982-01-01

    An airplane interior noise prediction model is developed to determine the important parameters associated with sound transmission into the interiors of airplanes and to identify appropriate noise control methods. Models for stiffened structures and cabin acoustics with a floor partition are developed. Validation studies are undertaken using three test articles: a ring-stringer stiffened cylinder, an unstiffened cylinder with a floor partition, and a ring-stringer stiffened cylinder with a floor partition and sidewall trim. The noise reductions of the three test articles are computed using the theoretical models and compared to measured values. A statistical analysis of the comparison data indicates that there is no bias in the predictions, although a substantial random error exists, so that a discrepancy of more than five or six dB can be expected for about one out of three predictions.

  15. Spatial coding-based approach for partitioning big spatial data in Hadoop

    NASA Astrophysics Data System (ADS)

    Yao, Xiaochuang; Mokbel, Mohamed F.; Alarabi, Louai; Eldawy, Ahmed; Yang, Jianyu; Yun, Wenju; Li, Lin; Ye, Sijing; Zhu, Dehai

    2017-09-01

    Spatial data partitioning (SDP) plays a powerful role in distributed storage and parallel computing for spatial data. However, the skewed distribution of spatial data and the varying size of spatial vector objects make it challenging to ensure both optimal performance of spatial operations and data balance across the cluster. To tackle this problem, we propose a spatial coding-based approach for partitioning big spatial data in Hadoop. This approach first compresses the whole data set using a spatial coding matrix to create a sensing information set (SIS), including spatial code, size, count and other information. The SIS is then employed to build a spatial partitioning matrix, which is finally used to split all spatial objects into different partitions in the cluster. With this approach, neighbouring spatial objects can be partitioned into the same block while the data skew in the Hadoop distributed file system (HDFS) is minimized. The presented approach is compared, in a case study, against random-sampling-based partitioning using three measures: spatial index quality, data skew in HDFS, and range query performance. The experimental results show that our method based on the spatial coding technique can improve the query performance on big spatial data as well as the data balance in HDFS. We implemented and deployed this approach in Hadoop, and it can also efficiently support other distributed big spatial data systems.
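
    The record does not reproduce the paper's spatial coding matrix, so the sketch below illustrates the same two goals (spatial locality and balanced partition sizes) with a generic space-filling-curve code: objects are keyed by a Z-order (Morton) code and cut into equal-count chunks. The grid resolution and partition count are arbitrary choices for the example.

```python
# Illustrative sketch only: a Z-order (Morton) code keeps neighbouring objects
# in the same partition while balancing the number of objects per partition.
def morton_code(ix, iy, bits=16):
    """Interleave the bits of integer cell coordinates (ix, iy)."""
    code = 0
    for b in range(bits):
        code |= ((ix >> b) & 1) << (2 * b) | ((iy >> b) & 1) << (2 * b + 1)
    return code

def partition_points(points, bounds, n_partitions, bits=16):
    """points: list of (x, y); bounds: (xmin, ymin, xmax, ymax).
    Sort by Morton code, then cut into n_partitions equal-count chunks."""
    xmin, ymin, xmax, ymax = bounds
    scale = (1 << bits) - 1
    keyed = []
    for x, y in points:
        ix = int((x - xmin) / (xmax - xmin) * scale)
        iy = int((y - ymin) / (ymax - ymin) * scale)
        keyed.append((morton_code(ix, iy, bits), (x, y)))
    keyed.sort(key=lambda kv: kv[0])
    size = -(-len(keyed) // n_partitions)          # ceiling division
    return [[p for _, p in keyed[i:i + size]] for i in range(0, len(keyed), size)]

parts = partition_points([(0.1, 0.2), (0.11, 0.19), (0.9, 0.8), (0.88, 0.79)],
                         bounds=(0.0, 0.0, 1.0, 1.0), n_partitions=2)
print([len(p) for p in parts])   # balanced counts, with neighbours grouped
```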

  16. HARP: A Dynamic Inertial Spectral Partitioner

    NASA Technical Reports Server (NTRS)

    Simon, Horst D.; Sohn, Andrew; Biswas, Rupak

    1997-01-01

    Partitioning unstructured graphs is central to the parallel solution of computational science and engineering problems. Spectral partitioners, such as recursive spectral bisection (RSB), have proven effective in generating high-quality partitions of realistically sized meshes. The major problem that hindered their widespread use was their long execution times. This paper presents a new inertial spectral partitioner, called HARP. The main objective of the proposed approach is to quickly partition meshes at runtime in a manner that works efficiently for real applications in the context of distributed-memory machines. The underlying principle of HARP is to find the eigenvectors of the unpartitioned vertices and then project them onto the eigenvectors of the original mesh. Results for various meshes ranging in size from 1000 to 100,000 vertices indicate that HARP can indeed partition meshes rapidly at runtime. Experimental results show that our largest mesh can be partitioned sequentially in only a few seconds on an SP2, which is several times faster than other spectral partitioners, while maintaining the solution quality of the proven RSB method. A parallel version of HARP has also been implemented on the IBM SP2 and Cray T3E. Parallel HARP, running on 64 processors of the SP2 and T3E, can partition a mesh containing more than 100,000 vertices into 64 subgrids in about half a second. These results indicate that graph partitioning can now be truly embedded in dynamically changing real-world applications.
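
    For orientation, the sketch below shows plain recursive spectral bisection, the RSB baseline the record compares against, not HARP's inertial-spectral projection: each subgraph is split by the sign (median) of the Fiedler vector of its Laplacian. The toy graph and dense eigensolver are choices made for the example only.

```python
# Minimal sketch of recursive spectral bisection (the RSB baseline, not HARP).
import numpy as np

def fiedler_split(adj, nodes):
    """Split `nodes` at the median of the Fiedler vector of the Laplacian
    restricted to those nodes."""
    sub = adj[np.ix_(nodes, nodes)]
    lap = np.diag(sub.sum(axis=1)) - sub
    vals, vecs = np.linalg.eigh(lap)
    fiedler = vecs[:, 1]                 # eigenvector of the 2nd smallest eigenvalue
    cut = np.median(fiedler)
    left = [n for n, f in zip(nodes, fiedler) if f <= cut]
    right = [n for n in nodes if n not in left]
    return left, right

def rsb(adj, nodes, levels):
    if levels == 0 or len(nodes) <= 1:
        return [nodes]
    left, right = fiedler_split(adj, nodes)
    return rsb(adj, left, levels - 1) + rsb(adj, right, levels - 1)

# toy "mesh": a 6-cycle
adj = np.zeros((6, 6))
for i in range(6):
    adj[i, (i + 1) % 6] = adj[(i + 1) % 6, i] = 1
print(rsb(adj, list(range(6)), levels=1))   # splits the cycle into two arcs
```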

  17. Item Anomaly Detection Based on Dynamic Partition for Time Series in Recommender Systems

    PubMed Central

    Gao, Min; Tian, Renli; Wen, Junhao; Xiong, Qingyu; Ling, Bin; Yang, Linda

    2015-01-01

    In recent years, recommender systems have become an effective method to cope with information overload. However, recommendation technology still suffers from many problems. One of these problems is shilling attacks: attackers inject spam user profiles to disturb the list of recommended items. Two characteristics are shared by all types of shilling attacks: 1) item abnormality, where the rating of target items is always maximal or minimal; and 2) attack promptness, where only a very short period of time is needed to inject attack profiles. Some papers have proposed item anomaly detection methods based on these two characteristics, but their detection rate, false alarm rate, and universality need to be further improved. To address these issues, this paper proposes an item anomaly detection method based on dynamic partitioning of time series. The method first dynamically partitions the item-rating time series based on important points. Then, the chi-square distribution (χ2) is used to detect abnormal intervals. The experimental results on MovieLens 100K and 1M indicate that this approach has a high detection rate and a low false alarm rate and is stable across different attack models and filler sizes. PMID:26267477

  18. Item Anomaly Detection Based on Dynamic Partition for Time Series in Recommender Systems.

    PubMed

    Gao, Min; Tian, Renli; Wen, Junhao; Xiong, Qingyu; Ling, Bin; Yang, Linda

    2015-01-01

    In recent years, recommender systems have become an effective method to cope with information overload. However, recommendation technology still suffers from many problems. One of these problems is shilling attacks: attackers inject spam user profiles to disturb the list of recommended items. Two characteristics are shared by all types of shilling attacks: 1) item abnormality, where the rating of target items is always maximal or minimal; and 2) attack promptness, where only a very short period of time is needed to inject attack profiles. Some papers have proposed item anomaly detection methods based on these two characteristics, but their detection rate, false alarm rate, and universality need to be further improved. To address these issues, this paper proposes an item anomaly detection method based on dynamic partitioning of time series. The method first dynamically partitions the item-rating time series based on important points. Then, the chi-square distribution (χ2) is used to detect abnormal intervals. The experimental results on MovieLens 100K and 1M indicate that this approach has a high detection rate and a low false alarm rate and is stable across different attack models and filler sizes.
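
    The following is a hedged sketch of the general workflow, not the authors' exact partitioning or test rules: the rating series is cut at simple "important points" (local extrema of a smoothed rating curve), and each interval's rating histogram is compared with the item's overall distribution by a one-sample chi-square test. The smoothing window, minimum interval length, and significance level are assumptions for the example.

```python
# Hedged sketch: cut an item's rating time series at local extrema of a
# smoothed curve and flag intervals whose rating histogram deviates from the
# overall histogram according to a chi-square test.
import numpy as np
from scipy.stats import chisquare

def important_points(ratings, w=5):
    smooth = np.convolve(ratings, np.ones(w) / w, mode="same")
    pts = [i for i in range(1, len(smooth) - 1)
           if (smooth[i] - smooth[i - 1]) * (smooth[i + 1] - smooth[i]) < 0]
    return [0] + pts + [len(ratings)]

def flag_intervals(ratings, levels=(1, 2, 3, 4, 5), alpha=0.01):
    ratings = np.asarray(ratings)
    overall = np.array([(ratings == l).sum() for l in levels], dtype=float)
    overall /= overall.sum()
    flagged = []
    cuts = important_points(ratings)
    for a, b in zip(cuts[:-1], cuts[1:]):
        seg = ratings[a:b]
        if len(seg) < 10:
            continue
        obs = np.array([(seg == l).sum() for l in levels], dtype=float)
        exp = overall * len(seg)
        keep = exp > 0
        if keep.sum() < 2:
            continue
        _, p = chisquare(obs[keep], exp[keep])
        if p < alpha:
            flagged.append((a, b, p))
    return flagged

rng = np.random.default_rng(1)
series = list(rng.integers(1, 6, 200)) + [5] * 30   # simulated push attack at the end
print(flag_intervals(series))                       # intervals dominated by 5s are candidates
```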

  19. A Chebyshev method for state-to-state reactive scattering using reactant-product decoupling: OH + H2 → H2O + H.

    PubMed

    Cvitaš, Marko T; Althorpe, Stuart C

    2013-08-14

    We extend a recently developed wave packet method for computing the state-to-state quantum dynamics of AB + CD → ABC + D reactions [M. T. Cvitaš and S. C. Althorpe, J. Phys. Chem. A 113, 4557 (2009)] to include the Chebyshev propagator. The method uses the further partitioned approach to reactant-product decoupling, which uses artificial decoupling potentials to partition the coordinate space of the reaction into separate reactant, product, and transition-state regions. Separate coordinates and basis sets can then be used that are best adapted to each region. We derive improved Chebyshev partitioning formulas which include Mandelshtam-Taylor-type decoupling potentials, and which are essential for the non-unitary discrete variable representations that must be used in 4-atom reactive scattering calculations. Numerical tests on the full-dimensional OH + H2 → H2O + H reaction for J = 0 show that the new version of the method is as efficient as the previously developed split-operator version. The advantages of the Chebyshev propagator (most notably the ease of parallelization for J > 0) can now be fully exploited in state-to-state reactive scattering calculations on 4-atom reactions.
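
    As a point of reference only, the sketch below runs the commonly used damped Chebyshev recursion, phi_{k+1} = D (2 H_s phi_k - D phi_{k-1}) with a spectrally scaled Hamiltonian, on a 1D model problem. The model Hamiltonian, grid, and damping function are arbitrary; the 4-atom reactant-product-decoupling machinery of the paper is far more involved.

```python
# Minimal 1D sketch of damped Chebyshev wave-packet iteration with a spectrally
# scaled Hamiltonian; all model parameters are arbitrary illustrations.
import numpy as np

n, L = 256, 20.0
x = np.linspace(-L / 2, L / 2, n)
dx = x[1] - x[0]

# finite-difference kinetic energy + harmonic potential (arbitrary units)
T = -0.5 * (np.diag(np.full(n - 1, 1.0), -1) - 2 * np.eye(n)
            + np.diag(np.full(n - 1, 1.0), 1)) / dx**2
H = T + np.diag(0.5 * x**2)

# scale H so its spectrum fits inside [-1, 1]
emin, emax = np.linalg.eigvalsh(H)[[0, -1]]
Hs = (2 * H - (emax + emin) * np.eye(n)) / (emax - emin)

# damping function: 1 in the interior, decaying smoothly near the grid edges
D = np.exp(-0.005 * np.maximum(np.abs(x) - 0.8 * L / 2, 0.0) ** 2)

phi0 = np.exp(-(x - 1.0) ** 2)           # Gaussian start packet
phi0 /= np.linalg.norm(phi0)
phi_prev, phi = phi0, D * (Hs @ phi0)    # k = 0 and k = 1 terms

autocorr = [phi0 @ phi0, phi0 @ phi]
for k in range(2, 200):
    phi_prev, phi = phi, D * (2 * (Hs @ phi) - D * phi_prev)
    autocorr.append(phi0 @ phi)          # Chebyshev-order correlation function

print(autocorr[:5])
```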

  20. Partition resampling and extrapolation averaging: approximation methods for quantifying gene expression in large numbers of short oligonucleotide arrays.

    PubMed

    Goldstein, Darlene R

    2006-10-01

    Studies of gene expression using high-density short oligonucleotide arrays have become a standard in a variety of biological contexts. Of the expression measures that have been proposed to quantify expression in these arrays, multi-chip-based measures have been shown to perform well. As gene expression studies increase in size, however, utilizing multi-chip expression measures is more challenging in terms of computing memory requirements and time. A strategic alternative to exact multi-chip quantification on a full large chip set is to approximate expression values based on subsets of chips. This paper introduces an extrapolation method, Extrapolation Averaging (EA), and a resampling method, Partition Resampling (PR), to approximate expression in large studies. An examination of properties indicates that subset-based methods can perform well compared with exact expression quantification. The focus is on short oligonucleotide chips, but the same ideas apply equally well to any array type for which expression is quantified using an entire set of arrays, rather than for only a single array at a time. Software implementing Partition Resampling and Extrapolation Averaging is under development as an R package for the BioConductor project.

  1. DAPNe with micro-capillary separatory chemistry-coupled to MALDI-MS for the analysis of polar and non-polar lipid metabolism in one cell

    NASA Astrophysics Data System (ADS)

    Hamilton, Jason S.; Aguilar, Roberto; Petros, Robby A.; Verbeck, Guido F.

    2017-05-01

    The cellular metabolome is considered to be a representation of cellular phenotype and cellular response to changes to internal or external events. Methods to expand the coverage of the expansive physiochemical properties that makeup the metabolome currently utilize multi-step extractions and chromatographic separations prior to chemical detection, leading to lengthy analysis times. In this study, a single-step procedure for the extraction and separation of a sample using a micro-capillary as a separatory funnel to achieve analyte partitioning within an organic/aqueous immiscible solvent system is described. The separated analytes are then spotted for MALDI-MS imaging and distribution ratios are calculated. Initially, the method is applied to standard mixtures for proof of partitioning. The extraction of an individual cell is non-reproducible; therefore, a broad chemical analysis of metabolites is necessary and will be illustrated with the one-cell analysis of a single Snu-5 gastric cancer cell taken from a cellular suspension. The method presented here shows a broad partitioning dynamic range as a single-step method for lipid analysis demonstrating a decrease in ion suppression often present in MALDI analysis of lipids.

  2. Interplay between geometry and flow distribution in an airway tree.

    PubMed

    Mauroy, B; Filoche, M; Andrade, J S; Sapoval, B

    2003-04-11

    Uniform flow distribution in a symmetric volume can be realized through a symmetric branched tree. It is shown here, however, by 3D numerical simulation of the Navier-Stokes equations, that the flow partitioning can be highly sensitive to deviations from exact symmetry if inertial effects are present. The flow asymmetry is quantified and found to depend on the Reynolds number. Moreover, for a given Reynolds number, we show that the flow distribution depends on the aspect ratio of the branching elements as well as their angular arrangement. Our results indicate that physiological variability should be severely restricted in order to ensure adequate fluid distribution through a tree.

  3. Environmental Contaminant Property Estimation Using QSARs in an Expert System

    DTIC Science & Technology

    1991-10-15

    An economical method is described for estimating aqueous solubility, octanol/water partition coefficients, vapor pressures, and organic carbon-normalized soil sorption coefficients using QSARs in an expert system (William J. Doucette, Mark S. Holt, Doug J. Denne, Joan E. McLean; Utah State University). Properties governing the transport, fate, and persistence of a chemical include aqueous solubility, the octanol/water partition coefficient, the soil/water sorption coefficient, and the Henry's Law constant.

  4. Recognition of building group patterns in topographic maps based on graph partitioning and random forest

    NASA Astrophysics Data System (ADS)

    He, Xianjin; Zhang, Xinchang; Xin, Qinchuan

    2018-02-01

    Recognition of building group patterns (i.e., the arrangement and form exhibited by a collection of buildings at a given mapping scale) is important to the understanding and modeling of geographic space and is hence essential to a wide range of downstream applications such as map generalization. Most of the existing methods develop rigid rules based on the topographic relationships between building pairs to identify building group patterns and thus their applications are often limited. This study proposes a method to identify a variety of building group patterns that allow for map generalization. The method first identifies building group patterns from potential building clusters based on a machine-learning algorithm and further partitions the building clusters with no recognized patterns based on the graph partitioning method. The proposed method is applied to the datasets of three cities that are representative of the complex urban environment in Southern China. Assessment of the results based on the reference data suggests that the proposed method is able to recognize both regular (e.g., the collinear, curvilinear, and rectangular patterns) and irregular (e.g., the L-shaped, H-shaped, and high-density patterns) building group patterns well, given that the correctness values are consistently nearly 90% and the completeness values are all above 91% for three study areas. The proposed method shows promises in automated recognition of building group patterns that allows for map generalization.

  5. Incorporating profile information in community detection for online social networks

    NASA Astrophysics Data System (ADS)

    Fan, W.; Yeung, K. H.

    2014-07-01

    Community structure is an important feature in the study of complex networks, because nodes in the same community tend to have similar properties. In this paper we extend two popular community detection methods to partition online social networks. In our extended methods, the profile information of users is used for partitioning. We apply the extended methods to several sample networks from Facebook. Compared with the original methods, the community structures we obtain have higher modularity. Our results indicate that users' profile information is consistent, to some extent, with the community structure of their friendship network. To the best of our knowledge, this paper is the first to discuss how profile information can be used to improve community detection in online social networks.
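
    One simple way to fold profile information into a standard modularity method, shown in the hedged sketch below, is to weight each friendship edge by a blend of the structural tie and a profile-similarity score before running weighted modularity optimization. The blend factor, Jaccard similarity, and toy data are assumptions for the example, not the paper's construction.

```python
# Hedged sketch: blend profile similarity into edge weights, then run a
# standard weighted modularity community detector.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

def jaccard(a, b):
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def profile_weighted_communities(edges, profiles, alpha=0.5):
    g = nx.Graph()
    for u, v in edges:
        sim = jaccard(profiles.get(u, []), profiles.get(v, []))
        g.add_edge(u, v, weight=alpha + (1 - alpha) * sim)
    return list(greedy_modularity_communities(g, weight="weight"))

edges = [("a", "b"), ("b", "c"), ("a", "c"), ("c", "d"), ("d", "e"), ("e", "f"), ("d", "f")]
profiles = {"a": ["tennis", "jazz"], "b": ["tennis"], "c": ["jazz"],
            "d": ["chess"], "e": ["chess", "go"], "f": ["go"]}
print(profile_weighted_communities(edges, profiles))
```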

  6. Network immunization under limited budget using graph spectra

    NASA Astrophysics Data System (ADS)

    Zahedi, R.; Khansari, M.

    2016-03-01

    In this paper, we propose a new algorithm that minimizes the worst expected growth of an epidemic by reducing the size of the largest connected component (LCC) of the underlying contact network. The proposed algorithm is applicable to any level of available resources and, despite the greedy approaches of most immunization strategies, selects nodes simultaneously. In each iteration, the proposed method partitions the LCC into two groups. These are the best candidates for communities in that component, and the available resources are sufficient to separate them. Using Laplacian spectral partitioning, the proposed method performs community detection inference with a time complexity that rivals that of the best previous methods. Experiments show that our method outperforms targeted immunization approaches in both real and synthetic networks.
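
    The sketch below follows the overall loop described in the record, but the node-selection rule is a simplified stand-in rather than the authors' method: the largest connected component is split with the Laplacian Fiedler vector, and the budget is spent on nodes with the most edges crossing that spectral cut. The toy graph and tie-breaking are illustrative choices.

```python
# Hedged sketch of budgeted immunization via Laplacian spectral partitioning of
# the largest connected component (LCC); the exact selection rule is simplified.
import networkx as nx
import numpy as np

def immunize(g, budget):
    g = g.copy()
    removed = []
    while budget > 0:
        lcc = max(nx.connected_components(g), key=len)
        if len(lcc) <= 2:
            break
        sub = g.subgraph(lcc)
        nodes = list(sub)
        lap = nx.laplacian_matrix(sub, nodelist=nodes).toarray().astype(float)
        fiedler = np.linalg.eigh(lap)[1][:, 1]
        side = {n: fiedler[i] >= 0 for i, n in enumerate(nodes)}
        # rank nodes by how many of their edges cross the spectral cut
        cross = {n: sum(side[n] != side[m] for m in sub[n]) for n in nodes}
        best = max(cross, key=cross.get)
        if cross[best] == 0:
            break
        g.remove_node(best)
        removed.append(best)
        budget -= 1
    return removed

g = nx.barbell_graph(6, 1)       # two cliques joined through a bridge node
print(immunize(g, budget=1))     # removes a node on the bridge, splitting the LCC
```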

  7. Dual-Level Method for Estimating Multistructural Partition Functions with Torsional Anharmonicity.

    PubMed

    Bao, Junwei Lucas; Xing, Lili; Truhlar, Donald G

    2017-06-13

    For molecules with multiple torsions, an accurate evaluation of the molecular partition function requires consideration of multiple structures and their torsional-potential anharmonicity. We previously developed a method called MS-T for this problem, and it requires an exhaustive conformational search with frequency calculations for all the distinguishable conformers; this can become expensive for molecules with a large number of torsions (and hence a large number of structures) if it is carried out with high-level methods. In the present work, we propose a cost-effective method to approximate the MS-T partition function when there are a large number of structures, and we test it on a transition state that has eight torsions. This new method is a dual-level method that combines an exhaustive conformer search carried out by a low-level electronic structure method (for instance, AM1, which is very inexpensive) and selected calculations with a higher-level electronic structure method (for example, density functional theory with a functional that is suitable for conformational analysis and thermochemistry). To provide a severe test of the new method, we consider a transition state structure that has 8 torsional degrees of freedom; this transition state structure is formed along one of the reaction pathways of the hydrogen abstraction reaction (at carbon-1) of ketohydroperoxide (KHP; its IUPAC name is 4-hydroperoxy-2-pentanone) by the OH radical. We find that our proposed dual-level method is able to significantly reduce the computational cost for computing MS-T partition functions for this test case with a large number of torsions and with a large number of conformers because we carry out high-level calculations for only a fraction of the distinguishable conformers found by the low-level method. In the example studied here, the dual-level method with 40 high-level optimizations (1.8% of the number of optimizations in a coarse-grained full search and 0.13% of the number of optimizations in a fine-grained full search) reproduces the full calculation of the high-level partition function within a factor of 1.0 to 2.0 from 200 to 1000 K. The error in the dual-level method can be further reduced to factors of 0.6 to 1.1 over the whole temperature interval from 200 to 2400 K by optimizing 128 structures (5.9% of the number of optimizations in a coarse-grained full search and 0.41% of the number of optimizations in a fine-grained full search). These factor-of-two or better errors are small compared to errors up to a factor of 1.0 × 10^3 if one neglects multistructural effects for the case under study.
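
    The dual-level idea itself can be illustrated with a simple Boltzmann sum over conformer energies; this is only a schematic of the selection-and-shift strategy, not the MS-T formulas, which also include torsional-anharmonicity and rovibrational factors. All energies below are synthetic.

```python
# Hedged sketch of the dual-level strategy: high-level energies only for the
# lowest-energy conformers found by a cheap search, with the remaining
# low-level energies shifted by the mean high-low difference.
import numpy as np

KB = 3.166812e-6   # Boltzmann constant, hartree/K

def boltzmann_Q(rel_energies, T):
    return np.exp(-np.asarray(rel_energies) / (KB * T)).sum()

rng = np.random.default_rng(2)
low  = np.sort(rng.uniform(0.0, 0.02, 300))            # 300 conformers from a cheap search
high = low + 0.002 + rng.normal(0.0, 5e-4, low.size)   # synthetic "true" high-level energies

n = 40                                                 # high-level optimizations performed
shift = np.mean(high[:n] - low[:n])
dual = np.concatenate([high[:n], low[n:] + shift])

for T in (200.0, 600.0, 1000.0):
    print(T, boltzmann_Q(dual, T) / boltzmann_Q(high, T))   # accuracy of the dual level
```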

  8. Mesoscopic-microscopic spatial stochastic simulation with automatic system partitioning.

    PubMed

    Hellander, Stefan; Hellander, Andreas; Petzold, Linda

    2017-12-21

    The reaction-diffusion master equation (RDME) is a model that allows for efficient on-lattice simulation of spatially resolved stochastic chemical kinetics. Compared to off-lattice hard-sphere simulations with Brownian dynamics or Green's function reaction dynamics, the RDME can be orders of magnitude faster if the lattice spacing can be chosen coarse enough. However, strongly diffusion-controlled reactions mandate a very fine mesh resolution for acceptable accuracy. It is common that reactions in the same model differ in their degree of diffusion control and therefore require different degrees of mesh resolution. This renders mesoscopic simulation inefficient for systems with multiscale properties. Mesoscopic-microscopic hybrid methods address this problem by resolving the most challenging reactions with a microscale, off-lattice simulation. However, all methods to date require manual partitioning of a system, effectively limiting their usefulness as "black-box" simulation codes. In this paper, we propose a hybrid simulation algorithm with automatic system partitioning based on indirect a priori error estimates. We demonstrate the accuracy and efficiency of the method on models of diffusion-controlled networks in 3D.

  9. New gap-filling and partitioning technique for H2O eddy fluxes measured over forests

    NASA Astrophysics Data System (ADS)

    Kang, Minseok; Kim, Joon; Malla Thakuri, Bindu; Chun, Junghwa; Cho, Chunho

    2018-01-01

    The continuous measurement of H2O fluxes using the eddy covariance (EC) technique is still challenging for forests because of large amounts of wet canopy evaporation (EWC), which occur during and following rain events when the EC systems rarely work correctly. We propose a new gap-filling and partitioning technique for the H2O fluxes: a model-statistics hybrid (MSH) method. It enables the recovery of the missing EWC in the traditional gap-filling method and the partitioning of the evapotranspiration (ET) into transpiration and (wet canopy) evaporation. We tested and validated the new method using the data sets from two flux towers, which are located at forests in hilly and complex terrains. The MSH reasonably recovered the missing EWC of 16-41 mm yr-1 and separated it from the ET (14-23 % of the annual ET). Additionally, we illustrated certain advantages of the proposed technique which enable us to understand better how ET responds to environmental changes and how the water cycle is connected to the carbon cycle in a forest ecosystem.

  10. Gas chromatographic quantitation of underivatized amines in the determination of their octanol-0.1 M sodium hydroxide partition coefficients by the shake-flask method.

    PubMed

    Grunewald, G L; Pleiss, M A; Gatchell, C L; Pazhenchevsky, R; Rafferty, M F

    1984-06-01

    The use of gas chromatography (GC) for the determination of 0.1 M sodium hydroxide-octanol partition coefficients (log P) for a wide variety of ethylamines is demonstrated. The conventional shake-flask procedure (SFP) is utilized, with the addition of an internal reference, which is cleanly separated from the desired solute and solvents on a 10% Apiezon L, 2% potassium hydroxide on 80-100 mesh Chromosorb W AW column. The partitioned solute is extracted from the aqueous phase with chloroform and analyzed by GC. The method provides an accurate and highly reproducible means of determining log P values, as demonstrated by the low relative standard errors. The technique is both rapid and extremely versatile. The use of the internal standard method of analysis introduces consistency, since variables like the exact weight of solute are not necessary (unlike the traditional SFP) and the volume of sample injected is not critical. The technique is readily applicable to microgram quantities of solutes, making it ideal for a wide range of volatile, amine-bearing compounds.
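
    The arithmetic behind a shake-flask determination with an internal standard can be sketched as follows; the symbols, peak areas, and volumes are invented illustrations and do not reproduce the paper's exact protocol: the solute/internal-standard area ratio in the aqueous phase before and after equilibration stands in for concentration, and P = (R_before - R_after)/R_after x V_aq/V_oct.

```python
# Hedged sketch of shake-flask log P arithmetic with an internal standard;
# all numbers below are made up for illustration.
import math

def log_p(area_ratio_before, area_ratio_after, v_aq_ml, v_oct_ml):
    """area_ratio_* = solute peak area / internal-standard peak area measured
    in the aqueous (0.1 M NaOH) phase before and after equilibration with octanol."""
    c_i, c_f = area_ratio_before, area_ratio_after   # proportional to concentration
    p = (c_i - c_f) / c_f * (v_aq_ml / v_oct_ml)
    return math.log10(p)

print(log_p(area_ratio_before=1.80, area_ratio_after=0.45,
            v_aq_ml=10.0, v_oct_ml=1.0))   # ~1.48 for these invented numbers
```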

  11. On the composition of an arbitrary collection of SU(2) spins: an enumerative combinatoric approach

    NASA Astrophysics Data System (ADS)

    Gyamfi, J. A.; Barone, V.

    2018-03-01

    The whole enterprise of spin compositions can be recast as simple enumerative combinatoric problems. We show here that enumerative combinatorics (Stanley 2011 Enumerative Combinatorics (Cambridge Studies in Advanced Mathematics vol 1) (Cambridge: Cambridge University Press)) is a natural setting for spin composition and easily leads to very general analytic formulae, many of which were hitherto not present in the literature. Based on it, we propose three general methods for computing spin multiplicities: (1) the multi-restricted composition, (2) the generalized binomial and (3) the generating function methods. Symmetric and anti-symmetric compositions of SU(2) spins are also discussed using generating functions. Of particular importance is the observation that while the common Clebsch-Gordan decomposition, which treats the spins as distinguishable, is related to integer compositions, the symmetric and anti-symmetric compositions (where the spins are treated as indistinguishable) are obtained by considering integer partitions. The integers in question here are none other than the occupation numbers of the Holstein-Primakoff bosons. The pervasiveness of q-analogues in our approach is a testament to the fundamental role they play in spin compositions. In the appendix, some new results on the power series representation of Gaussian polynomials (or q-binomial coefficients), relevant to symmetric and antisymmetric compositions, are presented.
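
    A compact sketch of the M_z-counting route that underlies such multiplicity calculations (not the authors' closed-form expressions): the multiplicity of total spin S in a product of spins equals a_S - a_{S+1}, where a_M is the number of product states with total M_z = M, and the a_M are obtained by convolving the individual M_z ladders.

```python
# Sketch of the M_z-counting route: multiplicity of total spin S is
# a_S - a_{S+1}, with a_M the number of product states with total M_z = M.
from collections import Counter

def mz_counts(spins):
    """spins: iterable of spins (may be half-integer). Returns a Counter keyed
    by twice the total M_z, built by convolving the individual M_z ladders."""
    counts = Counter({0: 1})
    for s in spins:
        two_s = round(2 * s)
        new = Counter()
        for two_m, c in counts.items():
            for two_mi in range(-two_s, two_s + 1, 2):
                new[two_m + two_mi] += c
        counts = new
    return counts

def multiplicities(spins):
    """Map twice the total spin (2S) -> number of irreducible SU(2) multiplets."""
    a = mz_counts(spins)
    return {two_s: a[two_s] - a.get(two_s + 2, 0)
            for two_s in sorted(m for m in a if m >= 0)
            if a[two_s] - a.get(two_s + 2, 0) > 0}

# four spin-1/2 particles: 2 singlets (S=0), 3 triplets (S=1), 1 quintet (S=2)
print(multiplicities([0.5, 0.5, 0.5, 0.5]))   # {0: 2, 2: 3, 4: 1}
```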

  12. Concerted changes in N and C primary metabolism in alfalfa (Medicago sativa) under water restriction

    PubMed Central

    Aranjuelo, Iker

    2013-01-01

    Although the mechanisms of nodule N2 fixation in legumes are now well documented, some uncertainty remains on the metabolic consequences of water deficit. In most cases, little consideration is given to other organs and, therefore, the coordinated changes in metabolism in leaves, roots, and nodules are not well known. Here, the effect of water restriction on exclusively N2-fixing alfalfa (Medicago sativa L.) plants was investigated, and proteomic, metabolomic, and physiological analyses were carried out. It is shown that the inhibition of nitrogenase activity caused by water restriction was accompanied by concerted alterations in metabolic pathways in nodules, leaves, and roots. The data suggest that nodule metabolism and metabolic exchange between plant organs nearly reached homeostasis in asparagine synthesis and partitioning, as well as the N demand from leaves. Typically, there was (i) a stimulation of the anaplerotic pathway to sustain the provision of C skeletons for amino acid (e.g. glutamate and proline) synthesis; (ii) re-allocation of glycolytic products to alanine and serine/glycine; and (iii) subtle changes in redox metabolites suggesting the implication of a slight oxidative stress. Furthermore, water restriction caused little change in both photosynthetic efficiency and respiratory cost of N2 fixation by nodules. In other words, the results suggest that under water stress, nodule metabolism follows a compromise between physiological imperatives (N demand, oxidative stress) and the lower input to sustain catabolism. PMID:23440170

  13. An Enriched Unified Medical Language System Semantic Network with a Multiple Subsumption Hierarchy

    PubMed Central

    Zhang, Li; Perl, Yehoshua; Halper, Michael; Geller, James; Cimino, James J.

    2004-01-01

    Objective: The Unified Medical Language System's (UMLS's) Semantic Network's (SN's) two-tree structure is restrictive because it does not allow a semantic type to be a specialization of several other semantic types. In this article, the SN is expanded into a multiple subsumption structure with a directed acyclic graph (DAG) IS-A hierarchy, allowing a semantic type to have multiple parents. New viable IS-A links are added as warranted. Design: Two methodologies are presented to identify and add new viable IS-A links. The first methodology is based on imposing the characteristic of connectivity on a previously presented partition of the SN. Four transformations are provided to find viable IS-A links in the process of converting the partition's disconnected groups into connected ones. The second methodology identifies new IS-A links through a string matching process involving names and definitions of various semantic types in the SN. A domain expert is needed to review all the results to determine the validity of the new IS-A links. Results: Nineteen new IS-A links are added to the SN, and four new semantic types are also created to support the multiple subsumption framework. The resulting network, called the Enriched Semantic Network (ESN), exhibits a DAG-structured hierarchy. A partition of the ESN containing 19 connected groups is also derived. Conclusion: The ESN is an expanded abstraction of the UMLS compared with the original SN. Its multiple subsumption hierarchy can accommodate semantic types with multiple parents. Its representation thus provides direct access to a broader range of subsumption knowledge. PMID:14764611

  14. The spliced leader RNA gene array in phloem-restricted plant trypanosomatids (Phytomonas) partitions into two major groupings: epidemiological implications.

    PubMed

    Dollet, M; Sturm, N R; Campbell, D A

    2001-03-01

    The arbitrary genus Phytomonas includes a biologically diverse group of kinetoplastids that live in a wide variety of plant environments. To understand better the subdivisions within the phytomonads and the variability within groups, the exon, intron and non-transcribed spacer sequences of the spliced leader RNA gene were compared among isolates of the phloem-restricted members. A total of 29 isolates associated with disease in coconut, oil palm and red ginger (Alpinia purpurata, Zingiberaceae) were examined, all originating from plantations in South America and the Caribbean over a 12-year period. Analysis of non-transcribed spacer sequences revealed 2 main groups, I and II; group II could be further subdivided into 2 subgroups, IIa and IIb. Three classes of spliced leader (SL) RNA gene were seen, with SLI corresponding to group I, SLIIa to group IIa, and SLIIb to group IIb. Two isolates showed some characteristics of both major groups. Group-specific oligonucleotide probes for hybridization studies were tested, and a multiplex amplification scheme was devised to allow direct differentiation between the 2 major groups of phloem-restricted Phytomonas. These results provide tools for the diagnosis and molecular epidemiology of plant trypanosomes that are pathogenic for commercially important flowers and palms.

  15. Assimilation of Endogenous Nicotinamide Riboside Is Essential for Calorie Restriction-mediated Life Span Extension in Saccharomyces cerevisiae*

    PubMed Central

    Lu, Shu-Ping; Kato, Michiko; Lin, Su-Ju

    2009-01-01

    NAD+ (nicotinamide adenine dinucleotide) is an essential cofactor involved in various biological processes including calorie restriction-mediated life span extension. Administration of nicotinamide riboside (NmR) has been shown to ameliorate deficiencies related to aberrant NAD+ metabolism in both yeast and mammalian cells. However, the biological role of endogenous NmR remains unclear. Here we demonstrate that salvaging endogenous NmR is an integral part of NAD+ metabolism. A balanced NmR salvage cycle is essential for calorie restriction-induced life span extension and stress resistance in yeast. Our results also suggest that partitioning of the pyridine nucleotide flux between the classical salvage cycle and the NmR salvage branch might be modulated by the NAD+-dependent Sir2 deacetylase. Furthermore, two novel deamidation steps leading to nicotinic acid mononucleotide and nicotinic acid riboside production are also uncovered that further underscore the complexity and flexibility of NAD+ metabolism. In addition, utilization of extracellular nicotinamide mononucleotide requires prior conversion to NmR mediated by a periplasmic phosphatase Pho5. Conversion to NmR may thus represent a strategy for the transport and assimilation of large nonpermeable NAD+ precursors. Together, our studies provide a molecular basis for how NAD+ homeostasis factors confer metabolic flexibility. PMID:19416965

  16. Assimilation of endogenous nicotinamide riboside is essential for calorie restriction-mediated life span extension in Saccharomyces cerevisiae.

    PubMed

    Lu, Shu-Ping; Kato, Michiko; Lin, Su-Ju

    2009-06-19

    NAD(+) (nicotinamide adenine dinucleotide) is an essential cofactor involved in various biological processes including calorie restriction-mediated life span extension. Administration of nicotinamide riboside (NmR) has been shown to ameliorate deficiencies related to aberrant NAD(+) metabolism in both yeast and mammalian cells. However, the biological role of endogenous NmR remains unclear. Here we demonstrate that salvaging endogenous NmR is an integral part of NAD(+) metabolism. A balanced NmR salvage cycle is essential for calorie restriction-induced life span extension and stress resistance in yeast. Our results also suggest that partitioning of the pyridine nucleotide flux between the classical salvage cycle and the NmR salvage branch might be modulated by the NAD(+)-dependent Sir2 deacetylase. Furthermore, two novel deamidation steps leading to nicotinic acid mononucleotide and nicotinic acid riboside production are also uncovered that further underscore the complexity and flexibility of NAD(+) metabolism. In addition, utilization of extracellular nicotinamide mononucleotide requires prior conversion to NmR mediated by a periplasmic phosphatase Pho5. Conversion to NmR may thus represent a strategy for the transport and assimilation of large nonpermeable NAD(+) precursors. Together, our studies provide a molecular basis for how NAD(+) homeostasis factors confer metabolic flexibility.

  17. 1H NMR investigation of thermally triggered insulin release from poly(N-isopropylacrylamide) microgels.

    PubMed

    Nolan, Christine M; Gelbaum, Leslie T; Lyon, L Andrew

    2006-10-01

    We describe investigations of insulin release from thermoresponsive microgels using variable temperature (1)H NMR. Microgel particles composed of poly(N-isopropylacrylamide) were loaded with the peptide via a swelling technique, and this method was compared to simple equilibrium partitioning. Variable temperature (1)H NMR studies suggest that the swelling loading method results in enhanced entrapment of the peptide versus equilibrium partitioning. A centrifugation-loading assay supports this finding. Pseudo-temperature jump (1)H NMR measurements suggest that the insulin release rate is partially decoupled from microgel collapse. These types of direct release investigations could prove to be useful methods in the future design of controlled macromolecule drug delivery devices.

  18. Hierarchical modeling and robust synthesis for the preliminary design of large scale complex systems

    NASA Astrophysics Data System (ADS)

    Koch, Patrick Nathan

    Large-scale complex systems are characterized by multiple interacting subsystems and the analysis of multiple disciplines. The design and development of such systems inevitably require the resolution of multiple conflicting objectives. The size of complex systems, however, prohibits the development of comprehensive system models, and thus these systems must be partitioned into their constituent parts. Because simultaneous solution of the individual subsystem models is often not manageable, iteration is inevitable and often excessive. In this dissertation these issues are addressed through the development of a method for hierarchical robust preliminary design exploration, which facilitates concurrent system and subsystem design exploration and the concurrent generation of robust system and subsystem specifications for the preliminary design of multi-level, multi-objective, large-scale complex systems. This method is developed through the integration and expansion of current design techniques: (1) hierarchical partitioning and modeling techniques for partitioning large-scale complex systems into more tractable parts and allowing integration of subproblems for system synthesis, (2) statistical experimentation and approximation techniques for increasing both the efficiency and the comprehensiveness of preliminary design exploration, and (3) noise modeling techniques for implementing robust preliminary design when approximate models are employed. The method and associated approaches are illustrated through their application to the preliminary design of a commercial turbofan turbine propulsion system; the turbofan system-level problem is partitioned into engine cycle and configuration design, and a compressor module is integrated for more detailed subsystem-level design exploration, improving system evaluation.

  19. Ion exchange membranes as novel passive sampling material for organic ions: application for the determination of freely dissolved concentrations.

    PubMed

    Oemisch, Luise; Goss, Kai-Uwe; Endo, Satoshi

    2014-11-28

    Many studies in pharmacology, toxicology and environmental science require a method for determining the freely dissolved concentration of a target substance. A recently developed tool for this purpose is equilibrium passive sampling with polymeric materials. However, this method has rarely been applied to ionic organic substances, primarily due to limited availability of convenient sorption materials. This study introduces ion exchange membranes (IEMs) as a novel passive sampling material for organic ions. The partitioning of 4-ethylbenzene-1-sulfonate, 2,4-dichlorophenoxyacetic acid and pentachlorophenol to one anion exchange membrane (FAS) and of difenzoquat, nicotine and verapamil to one cation exchange membrane (FKS) was investigated. All test substances exhibited a sufficiently high affinity for the respective IEM with logarithmic IEM-water partition coefficients >2.3. Sorption equilibrium was established quickly, within several hours for the FAS membrane and within 1-3 days for the FKS membrane. For permanently charged substances the partitioning to the IEMs was independent of pH, but was influenced by the salt composition of the test solution. For all test substances sorption to IEM was dependent on the substance concentration. Bovine serum albumin-water partition coefficients determined by passive sampling with IEMs agree well with those determined by the conventional dialysis method. The results of this study indicate that IEMs exhibit the potential to measure freely dissolved concentrations of organic ions in a simple and time-saving manner. Copyright © 2014 Elsevier B.V. All rights reserved.

  20. Case Study of Airborne Pathogen Dispersion Patterns in Emergency Departments with Different Ventilation and Partition Conditions

    PubMed Central

    Cheong, Chang Heon; Lee, Seonhye

    2018-01-01

    The prevention of airborne infections in emergency departments is a very important issue. This study investigated the effects of architectural features on airborne pathogen dispersion in emergency departments by using a CFD (computational fluid dynamics) simulation tool. The study included three architectural features as the major variables: increased ventilation rate, inlet and outlet diffuser positions, and partitions between beds. The most effective method for preventing pathogen dispersion and reducing the pathogen concentration was found to be increasing the ventilation rate. Installing partitions between the beds and changing the ventilation system’s inlet and outlet diffuser positions contributed only minimally to reducing the concentration of airborne pathogens. PMID:29534043

  1. Case Study of Airborne Pathogen Dispersion Patterns in Emergency Departments with Different Ventilation and Partition Conditions.

    PubMed

    Cheong, Chang Heon; Lee, Seonhye

    2018-03-13

    The prevention of airborne infections in emergency departments is a very important issue. This study investigated the effects of architectural features on airborne pathogen dispersion in emergency departments by using a CFD (computational fluid dynamics) simulation tool. The study included three architectural features as the major variables: increased ventilation rate, inlet and outlet diffuser positions, and partitions between beds. The most effective method for preventing pathogen dispersion and reducing the pathogen concentration was found to be increasing the ventilation rate. Installing partitions between the beds and changing the ventilation system's inlet and outlet diffuser positions contributed only minimally to reducing the concentration of airborne pathogens.

  2. Localization in abelian Chern-Simons theory

    NASA Astrophysics Data System (ADS)

    McLellan, B. D. K.

    2013-02-01

    Chern-Simons theory on a closed contact three-manifold is studied when the Lie group for gauge transformations is compact, connected, and abelian. The abelian Chern-Simons partition function is derived using the Faddeev-Popov gauge fixing method. The partition function is then formally computed using the technique of non-abelian localization. This study leads to a natural identification of the abelian Reidemeister-Ray-Singer torsion as a specific multiple of the natural unit symplectic volume form on the moduli space of flat abelian connections for the class of Sasakian three-manifolds. The torsion part of the abelian Chern-Simons partition function is computed explicitly in terms of Seifert data for a given Sasakian three-manifold.

  3. Adaptive zero-tree structure for curved wavelet image coding

    NASA Astrophysics Data System (ADS)

    Zhang, Liang; Wang, Demin; Vincent, André

    2006-02-01

    We investigate the issue of efficient data organization and representation of the curved wavelet coefficients [curved wavelet transform (WT)]. We present an adaptive zero-tree structure that exploits the cross-subband similarity of the curved wavelet transform. In the embedded zero-tree wavelet (EZW) coder and the set partitioning in hierarchical trees (SPIHT) coder, the parent-child relationship is defined in such a way that a parent has four children restricted to a square of 2×2 pixels. In the adaptive zero-tree structure, by contrast, the parent-child relationship varies according to the curves along which the curved WT is performed. Five child patterns were determined based on different combinations of curve orientation. A new image coder was then developed based on this adaptive zero-tree structure and the set-partitioning technique. Experimental results using synthetic and natural images showed the effectiveness of the proposed adaptive zero-tree structure for encoding of the curved wavelet coefficients. The coding gain of the proposed coder can be up to 1.2 dB in terms of peak SNR (PSNR) compared to the SPIHT coder. Subjective evaluation shows that the proposed coder preserves lines and edges better than the SPIHT coder.
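
    For context, the fixed parent-to-children mapping used by EZW/SPIHT, which the adaptive structure generalizes, is shown below (ignoring the special rules for the coarsest subband); the five curve-dependent child patterns introduced in the paper are not reproduced here.

```python
# The fixed dyadic parent-to-children mapping of EZW/SPIHT (coarsest-subband
# special cases omitted); the adaptive zero-tree structure replaces this fixed
# 2x2 pattern with curve-dependent child patterns.
def spiht_children(x, y, width, height):
    """Children of coefficient (x, y): the 2x2 block at (2x, 2y), kept only if
    it still lies inside the subband pyramid."""
    kids = [(2 * x, 2 * y), (2 * x + 1, 2 * y),
            (2 * x, 2 * y + 1), (2 * x + 1, 2 * y + 1)]
    return [(cx, cy) for cx, cy in kids if cx < width and cy < height]

print(spiht_children(3, 5, width=64, height=64))
# [(6, 10), (7, 10), (6, 11), (7, 11)]
```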

  4. Doppler optical coherence microscopy and tomography applied to inner ear mechanics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Page, Scott; Freeman, Dennis M.; Department of Electrical Engineering and Computer Science, Massachusetts Institute of Technology, Cambridge, Massachusetts

    While it is clear that cochlear traveling waves underlie the extraordinary sensitivity, frequency selectivity, and dynamic range of mammalian hearing, the underlying micromechanical mechanisms remain unresolved. Recent advances in low coherence measurement techniques show promise over traditional laser Doppler vibrometry and video microscopy, which are limited by low reflectivities of cochlear structures and restricted optical access. Doppler optical coherence tomography (DOCT) and Doppler optical coherence microscopy (DOCM) both utilize a broadband source to limit constructive interference of scattered light to a small axial depth called a coherence gate. The coherence gate can be swept axially to image and measure sub-nanometer motions of cochlear structures throughout the cochlear partition. The coherence gate of DOCT is generally narrower than the confocal gate of the focusing optics, enabling increased axial resolution (typically 15 μm) within optical sections of the cochlear partition. DOCM, frequently implemented in the time domain, centers the coherence gate on the focal plane, achieving enhanced lateral and axial resolution when the confocal gate is narrower than the coherence gate. We compare these two complementary systems and demonstrate their utility in studying cellular and micromechanical mechanisms involved in mammalian hearing.

  5. Research of image retrieval technology based on color feature

    NASA Astrophysics Data System (ADS)

    Fu, Yanjun; Jiang, Guangyu; Chen, Fengying

    2009-10-01

    Recently, with the development of communication and computer technology and the improvement of storage technology and digital imaging equipment, far more image resources are available than ever before, and thus a way to locate the desired image quickly and accurately is needed. The early approach was to search a database by keywords, but this method becomes impractical as the number of images grows. In order to overcome the limitations of traditional keyword searching, content-based image retrieval technology arose, and it is now an active research subject. Color image retrieval is an important part of it, and color is the most important feature for color image retrieval. Three key questions on how to make use of the color characteristic are discussed in this paper: the representation of color, the extraction of color features, and the measurement of similarity based on color. On this basis, the extraction of color histogram features is discussed in detail. Considering the advantages and disadvantages of the overall histogram and the partition histogram, a new method based on the partition-overall histogram is proposed. Its basic idea is to divide the image space according to a certain strategy and then calculate the color histogram of each block as the color feature of that block. Users choose the blocks that contain important spatial information and set the corresponding weights. The system calculates the distance between the corresponding blocks chosen by the users, merges the remaining blocks into a partial overall histogram whose distance is also calculated, and then accumulates all the distances as the final distance between two images. The partition-overall histogram combines the advantages of the two methods above: choosing blocks makes the feature carry more spatial information, which improves performance, while the distance between partition-overall histograms remains invariant to rotation and translation. The HSV color space, which matches the visual characteristics of human perception, is used to represent the color features of an image, and, taking advantage of human color perception, the color sectors are quantified with unequal intervals to obtain the feature vector. Finally, image similarity is matched with the histogram intersection algorithm applied to the partition-overall histogram. Users can choose a demonstration image to express the query and can also adjust the weights through relevance feedback to obtain the best search results. An image retrieval system based on these approaches is presented. The experimental results show that image retrieval based on the partition-overall histogram can retain spatial distribution information while extracting color features efficiently, and it is superior to normal color histograms in retrieval precision, with a query precision rate of more than 95%. In addition, the efficient block representation lowers the complexity of the images to be searched, so search efficiency is increased. The image retrieval algorithm based on the partition-overall histogram proposed in this paper is efficient and effective.
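
    A hedged sketch of the block-histogram idea follows: per-block color histograms compared by histogram intersection for the user-selected blocks, with the remaining blocks pooled into one overall histogram. The uniform quantization, 3x3 grid, and blending weight are illustrative assumptions; the paper uses HSV with unequal-interval quantization and user-set weights.

```python
# Hedged sketch of the partition-overall histogram idea (uniform quantization
# and fixed weights are simplifications for illustration).
import numpy as np

def block_histograms(hsv, grid=(3, 3), bins=(8, 3, 3)):
    """hsv: H x W x 3 float array in [0, 1). Returns dict block -> normalized histogram."""
    h, w, _ = hsv.shape
    gy, gx = grid
    hists = {}
    for by in range(gy):
        for bx in range(gx):
            block = hsv[by * h // gy:(by + 1) * h // gy,
                        bx * w // gx:(bx + 1) * w // gx].reshape(-1, 3)
            hist, _ = np.histogramdd(block, bins=bins, range=((0, 1),) * 3)
            hists[(by, bx)] = hist.ravel() / max(hist.sum(), 1)
    return hists

def intersection(h1, h2):
    return np.minimum(h1, h2).sum()

def similarity(hists_a, hists_b, important, weight=0.6):
    """Blend per-block intersection (chosen blocks) with the intersection of
    the pooled histogram of all remaining blocks."""
    per_block = np.mean([intersection(hists_a[b], hists_b[b]) for b in important])
    rest = [b for b in hists_a if b not in important]
    pooled_a = np.mean([hists_a[b] for b in rest], axis=0)
    pooled_b = np.mean([hists_b[b] for b in rest], axis=0)
    return weight * per_block + (1 - weight) * intersection(pooled_a, pooled_b)

rng = np.random.default_rng(3)
img1, img2 = rng.random((60, 60, 3)), rng.random((60, 60, 3))
ha, hb = block_histograms(img1), block_histograms(img2)
print(similarity(ha, hb, important=[(1, 1)]))
```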

  6. A New Method to Quantify the Isotopic Signature of Leaf Transpiration: Implications for Landscape-Scale Evapotranspiration Partitioning Studies

    NASA Astrophysics Data System (ADS)

    Wang, L.; Good, S. P.; Caylor, K. K.

    2010-12-01

    Characterizing the constituent components of evapotranspiration is crucial to a better understanding of ecosystem-level water budgets and water use dynamics. Isotope-based evapotranspiration partitioning methods are promising, but their utility lies in the accurate estimation of the isotopic composition of the underlying transpiration and evaporation. Here we report a new method to quantify the isotopic signature of leaf transpiration under field conditions. This method utilizes a commercially available laser-based isotope analyzer and a transparent leaf chamber modified from a Licor conifer leaf chamber. The method is based on the water vapor mass balance between ambient air and leaf-transpired air. We verified the method using "artificial leaves" and glassline extracted samples. The method provides a new and direct way to estimate leaf transpiration isotopic signatures, and it has wide applications in ecology, hydrology and plant physiology.
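
    The chamber mass balance implied by that description is commonly written as delta_T = (w_out * delta_out - w_in * delta_in) / (w_out - w_in), where w is the water vapor mixing ratio entering and leaving the chamber; the sketch below applies that form under this assumption, with invented numbers rather than values from the study.

```python
# Minimal sketch of a flow-through chamber isotope mass balance; the inputs
# below are invented for illustration.
def delta_transpiration(w_in, d_in, w_out, d_out):
    """w: water vapour mixing ratio (e.g. mmol mol-1) entering/leaving the chamber,
    d: corresponding delta value (per mil). Returns delta of the added (transpired) vapour."""
    return (w_out * d_out - w_in * d_in) / (w_out - w_in)

print(delta_transpiration(w_in=10.0, d_in=-18.0, w_out=14.0, d_out=-15.0))
# -> -7.5 per mil for these made-up numbers
```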

  7. Feeding height stratification among the herbivorous dinosaurs from the Dinosaur Park Formation (upper Campanian) of Alberta, Canada

    PubMed Central

    2013-01-01

    Background Herbivore coexistence on the Late Cretaceous island continent of Laramidia has been a topic of great interest, stemming from the paradoxically high diversity and biomass of these animals in relation to the relatively small landmass available to them. Various hypotheses have been advanced to account for these facts, of which niche partitioning is among the most frequently invoked. However, despite its wide acceptance, this hypothesis has not been rigorously tested. This study uses the fossil assemblage from the Dinosaur Park Formation of Alberta as a model to investigate whether niche partitioning facilitated herbivorous dinosaur coexistence on Laramidia. Specifically, the question of feeding height stratification is examined in light of the role it plays in facilitating modern ungulate coexistence. Results Most herbivorous dinosaur species from the Dinosaur Park Formation were restricted to feeding no higher than approximately 1 m above the ground. There is minimal evidence for feeding height partitioning at this level, with ceratopsids capable of feeding slightly higher than ankylosaurs, but the ecological significance of this is ambiguous. Hadrosaurids were uniquely capable of feeding up to 2 m quadrupedally, or up to 5 m bipedally. There is no evidence for either feeding height stratification within any of these clades, or for change in these ecological relationships through the approximately 1.5 Ma record of the Dinosaur Park Formation. Conclusions Although we cannot reject the possibility, we find no good evidence that feeding height stratification, as revealed by reconstructed maximum feeding heights, played an important role in facilitating niche partitioning among the herbivorous dinosaurs of Laramidia. Most browsing pressure was concentrated in the herb layer, although hadrosaurids were capable of reaching shrubs and low-growing trees that were out of reach from ceratopsids, ankylosaurs, and other small herbivores, effectively dividing the herbivores in terms of relative abundance. Sympatric hadrosaurids may have avoided competing with one another by feeding differentially using bipedal and quadrupedal postures. These ecological relationships evidently proved to be evolutionarily stable because they characterize the herbivore assemblage of the Dinosaur Park Formation through time. If niche partitioning served to facilitate the rich diversity of these animals, it may have been achieved by other means in addition to feeding height stratification. Consideration of other feeding height proxies, including dental microwear and skull morphology, may help to alleviate problems of underdetermination identified here. PMID:23557203

  8. Partitioning of 2,6-Bis(1H-Benzimidazol-2-yl)pyridine Fluorophore into a Phospholipid Bilayer: Complementary Use of Fluorescence Quenching Studies and Molecular Dynamics Simulations

    PubMed Central

    Kyrychenko, Alexander; Sevriukov, Igor Yu.; Syzova, Zoya A.; Ladokhin, Alexey S.; Doroshenko, Andrey O.

    2014-01-01

    Successful use of fluorescence sensing in elucidating the biophysical properties of lipid membranes requires knowledge of the distribution and location of an emitting molecule in the bilayer. We report here that 2,6-bis(1H-benzimidazol-2-yl)pyridine (BBP), which is almost non-fluorescent in aqueous solutions, reveals a strong emission enhancement in the hydrophobic environment of a phospholipid bilayer, making it interesting for fluorescence probing of water content in a lipid membrane. Comparing the fluorescence behavior of BBP in a wide variety of solvents with that in phospholipid vesicles, we suggest that hydrogen bonding interactions between the BBP fluorophore and water molecules play a crucial role in the observed “light switch effect”. Therefore, the loss of water-induced fluorescence quenching inside a membrane is thought to be due to deep penetration of BBP into the hydrophobic, water-free region of the bilayer. Characterized by strong quenching by transition metal ions in solution, BBP also demonstrated significant shielding from the action of the quencher in the presence of phospholipid vesicles. We used the increase in fluorescence intensity, measured upon titration of probe molecules with lipid vesicles, to estimate the partition constant and the Gibbs free energy (ΔG) of transfer of BBP from aqueous buffer into a membrane. Partitioning of BBP revealed a strongly favorable ΔG, which depends only slightly on the lipid composition of the bilayer, varying in a range from -6.5 to -7.0 kcal/mol. To elucidate the binding interactions of the probe with a membrane on the molecular level, the distribution and favorable location of BBP in a POPC bilayer were modeled via atomistic molecular dynamics (MD) simulations using two different approaches: (i) free, diffusion-driven partitioning of the probe molecules into a bilayer and (ii) constrained umbrella sampling of a penetration profile of the dye molecule across a bilayer. Both of these MD approaches agreed with regard to the preferred location of the BBP fluorophore within the interfacial region of the bilayer, located between the hydrocarbon acyl tails and the initial portion of the lipid headgroups. MD simulations also revealed restricted permeability of water molecules into this region of a POPC bilayer, accounting for the strong fluorescence enhancement observed experimentally for the membrane-partitioned form of BBP. PMID:21211898
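
    A sketch of how a partition constant and ΔG can be extracted from such a titration, assuming a mole-fraction partitioning convention (ΔG = −RT ln Kx with [water] ≈ 55.3 M) and hypothetical intensity data; the paper's exact convention and fitting protocol are not assumed here.

      # Illustrative fit of a fluorescence titration to a mole-fraction partition model
      # (one common convention; not necessarily the paper's exact formulation).
      import numpy as np
      from scipy.optimize import curve_fit

      R = 1.987e-3   # kcal mol-1 K-1
      T = 298.15     # K
      W = 55.3       # molar concentration of water (mole-fraction convention)

      def titration(L, F0, Fmax, Kx):
          """Fluorescence intensity vs. accessible lipid concentration L (mol/L)."""
          return F0 + (Fmax - F0) * Kx * L / (W + Kx * L)

      # Hypothetical titration data (lipid in mol/L, relative intensity).
      L = np.array([0, 1e-5, 3e-5, 1e-4, 3e-4, 1e-3, 3e-3])
      F = np.array([1.0, 1.8, 3.0, 6.1, 10.5, 14.0, 15.5])

      (F0, Fmax, Kx), _ = curve_fit(titration, L, F, p0=[1.0, 16.0, 1e5])
      dG = -R * T * np.log(Kx)   # kcal/mol; more negative = more favorable partitioning
      print(f"Kx = {Kx:.3g}, dG = {dG:.2f} kcal/mol")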

  9. Feeding height stratification among the herbivorous dinosaurs from the Dinosaur Park Formation (upper Campanian) of Alberta, Canada.

    PubMed

    Mallon, Jordan C; Evans, David C; Ryan, Michael J; Anderson, Jason S

    2013-04-04

    Herbivore coexistence on the Late Cretaceous island continent of Laramidia has been a topic of great interest, stemming from the paradoxically high diversity and biomass of these animals in relation to the relatively small landmass available to them. Various hypotheses have been advanced to account for these facts, of which niche partitioning is among the most frequently invoked. However, despite its wide acceptance, this hypothesis has not been rigorously tested. This study uses the fossil assemblage from the Dinosaur Park Formation of Alberta as a model to investigate whether niche partitioning facilitated herbivorous dinosaur coexistence on Laramidia. Specifically, the question of feeding height stratification is examined in light of the role it plays in facilitating modern ungulate coexistence. Most herbivorous dinosaur species from the Dinosaur Park Formation were restricted to feeding no higher than approximately 1 m above the ground. There is minimal evidence for feeding height partitioning at this level, with ceratopsids capable of feeding slightly higher than ankylosaurs, but the ecological significance of this is ambiguous. Hadrosaurids were uniquely capable of feeding up to 2 m quadrupedally, or up to 5 m bipedally. There is no evidence for either feeding height stratification within any of these clades, or for change in these ecological relationships through the approximately 1.5 Ma record of the Dinosaur Park Formation. Although we cannot reject the possibility, we find no good evidence that feeding height stratification, as revealed by reconstructed maximum feeding heights, played an important role in facilitating niche partitioning among the herbivorous dinosaurs of Laramidia. Most browsing pressure was concentrated in the herb layer, although hadrosaurids were capable of reaching shrubs and low-growing trees that were out of reach from ceratopsids, ankylosaurs, and other small herbivores, effectively dividing the herbivores in terms of relative abundance. Sympatric hadrosaurids may have avoided competing with one another by feeding differentially using bipedal and quadrupedal postures. These ecological relationships evidently proved to be evolutionarily stable because they characterize the herbivore assemblage of the Dinosaur Park Formation through time. If niche partitioning served to facilitate the rich diversity of these animals, it may have been achieved by other means in addition to feeding height stratification. Consideration of other feeding height proxies, including dental microwear and skull morphology, may help to alleviate problems of underdetermination identified here.

  10. Method And Reactor For Production Of Aluminum By Carbothermic Reduction Of Alumina

    DOEpatents

    Aune, Jan Arthur; Johansen, Kai

    2004-10-19

    A hollow partition wall is employed to feed carbon material to an underflow of a carbothermic reduction furnace used to make aluminum. The partition wall divides a low temperature reaction zone where aluminum oxide is reacted with carbon to form aluminum carbide and a high temperature reaction zone where the aluminum carbide and remaining aluminum oxide are reacted to form aluminum and carbon monoxide.

  11. Seasonal influence on the response of the somatotropic axis to nutrient restriction and re-alimentation in captive Steller sea lions (Eumetopias jubatus).

    PubMed

    Richmond, Julie P; Jeanniard du Dot, Tiphaine; Rosen, David A S; Zinn, Steven A

    2010-03-01

    Fluctuations in availability of prey resources can impede acquisition of sufficient energy for maintenance and growth. By investigating the hormonal mechanisms of the somatotropic axis that link nutrition, fat metabolism, and lean tissue accretion, we can assess the physiological impact of decreased nutrient intake on growth. Further, species that undergo seasonal periods of reduced intake as a part of their normal life history may have a differential seasonal response to nutrient restriction. This experiment evaluated the influence of season and age on the response of the somatotropic axis, including growth hormone (GH), insulin-like growth factor (IGF)-I, and IGF-binding proteins (BP), to reduced nutrient intake and re-alimentation in Steller sea lions. Eight captive females (five juveniles, three sub-adults) were subject to 28-day periods of food restriction, controlled re-feeding, and ad libitum recovery in summer (long-day photoperiod) and winter (short-day photoperiod). Hormone concentrations were insensitive to type of fish fed (low fat pollock vs. high fat herring), but sensitive to energy intake. Body mass, fat, and IGF-I declined, whereas GH and IGFBP-2 increased during feed restriction. Reduced IGF-I and IGFBP with increased GH during controlled re-feeding suggest that animals did not reach positive energy balance until fed ad libitum. Increased IGF-I, IGFBP-2, IGFBP-3, and reduced GH observed in summer reflected seasonal differences in energy partitioning. There was a strong season and age effect in the response to restriction and re-alimentation, indicating that older, larger animals are better able to cope with stress associated with energy deficit, regardless of season.

  12. A partition function-based weighting scheme in force field parameter development using ab initio calculation results in global configurational space.

    PubMed

    Wu, Yao; Dai, Xiaodong; Huang, Niu; Zhao, Lifeng

    2013-06-05

    In force field parameter development using ab initio potential energy surfaces (PES) as target data, an important but often neglected matter is the lack of a weighting scheme with optimal discrimination power to fit the target data. Here, we developed a novel partition function-based weighting scheme, which not only fits the target potential energies exponentially like the general Boltzmann weighting method, but also reduces the effect of fitting errors leading to overfitting. The van der Waals (vdW) parameters of benzene and propane were reparameterized by using the new weighting scheme to fit the high-level ab initio PESs probed by a water molecule in global configurational space. The molecular simulation results indicate that the newly derived parameters are capable of reproducing experimental properties in a broader range of temperatures, which supports the partition function-based weighting scheme. Our simulation results also suggest that structural properties are more sensitive to vdW parameters than partial atomic charge parameters in these systems although the electrostatic interactions are still important in energetic properties. As no prerequisite conditions are required, the partition function-based weighting method may be applied in developing any types of force field parameters. Copyright © 2013 Wiley Periodicals, Inc.
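
    The abstract contrasts the new scheme with general Boltzmann weighting but does not give its functional form; the sketch below only illustrates the shared idea of exponential, partition-function-normalized weights on target energies in a least-squares fit. The temperature-like parameter kT and the data points are placeholders, and this is not the authors' exact scheme.

      # Exponential, partition-function-normalized weights on ab initio target
      # energies in a least-squares force-field fit (generic Boltzmann-style idea).
      import numpy as np

      def boltzmann_weights(E_target, kT=0.6):       # kT in kcal/mol, arbitrary choice
          E = np.asarray(E_target, dtype=float)
          w = np.exp(-(E - E.min()) / kT)            # emphasize low-energy configurations
          return w / w.sum()                         # normalize by the "partition function"

      def weighted_sse(E_model, E_target, kT=0.6):
          """Weighted sum of squared errors between model and target PES points."""
          w = boltzmann_weights(E_target, kT)
          d = np.asarray(E_model, dtype=float) - np.asarray(E_target, dtype=float)
          return float(np.sum(w * d**2))

      # Placeholder PES scan: a fit would minimize weighted_sse over vdW parameters.
      E_target = np.array([0.0, 0.4, 1.1, 2.5, 5.0])
      E_model = np.array([0.1, 0.3, 1.4, 2.2, 6.0])
      print(weighted_sse(E_model, E_target))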

  13. Partitioning of nitroxides in dispersed systems investigated by ultrafiltration, EPR and NMR spectroscopy.

    PubMed

    Krudopp, Heimke; Sönnichsen, Frank D; Steffen-Heins, Anja

    2015-08-15

    The partitioning behavior of paramagnetic nitroxides in dispersed systems can be determined by deconvolution of electron paramagnetic resonance (EPR) spectra, giving results equivalent to those of the validated methods of ultrafiltration (UF) and pulsed-field gradient nuclear magnetic resonance spectroscopy (PFG-NMR). The partitioning behavior of nitroxides of increasing lipophilicity was investigated in anionic, cationic and nonionic micellar systems and in 10 wt% o/w emulsions. Apart from EPR spectra deconvolution, PFG-NMR was used in micellar solutions as a non-destructive approach, while UF is based on separating a very small volume of the aqueous phase. Depending on their substituents, the proportions of nitroxides solubilized in the micellar or emulsion interface increased with increasing nitroxide lipophilicity for all emulsifiers used. Comparing the different approaches, EPR deconvolution and UF revealed comparable nitroxide proportions solubilized in the interfaces; those proportions were higher than found with PFG-NMR. For the PFG-NMR self-diffusion experiments the reduced nitroxides were used, revealing highly dynamic behavior of hydroxylamines and emulsifiers. Deconvolution of EPR spectra turned out to be the preferred method for measuring the partitioning behavior of paramagnetic molecules, as it enables distinguishing between several populations at their individual solubilization sites. Copyright © 2015 Elsevier Inc. All rights reserved.
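
    A much-simplified linear analogue of the spectral deconvolution step, recovering two population fractions from reference spectra by non-negative least squares; real EPR analysis fits lineshape parameters, and the toy "spectra" below are synthetic placeholders.

      # Simplified linear "deconvolution" of a composite spectrum into two reference
      # components (e.g., aqueous vs. interface-solubilized nitroxide).
      import numpy as np
      from scipy.optimize import nnls

      def component_fractions(observed, ref_free, ref_bound):
          """Return (fraction_free, fraction_bound), assuming both references are
          normalized to the same integrated intensity per spin."""
          A = np.column_stack([ref_free, ref_bound])
          coeffs, _ = nnls(A, observed)           # non-negative mixing coefficients
          return tuple(coeffs / coeffs.sum())

      # Toy derivative-shaped curves standing in for EPR lineshapes.
      x = np.linspace(-1, 1, 400)
      g = lambda c, w: -(x - c) / w**2 * np.exp(-((x - c) ** 2) / (2 * w**2))
      free, bound = g(-0.1, 0.05), g(0.15, 0.12)
      mix = 0.3 * free + 0.7 * bound + np.random.normal(0, 0.01, x.size)
      print(component_fractions(mix, free, bound))   # roughly (0.3, 0.7)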

  14. Applying graph partitioning methods in measurement-based dynamic load balancing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bhatele, Abhinav; Fourestier, Sebastien; Menon, Harshitha

    Load imbalance leads to an increasing waste of resources as an application is scaled to more and more processors. Achieving the best parallel efficiency for a program requires optimal load balancing, which is an NP-hard problem. However, finding near-optimal solutions to this problem for complex computational science and engineering applications is becoming increasingly important. Charm++, a migratable-objects-based programming model, provides a measurement-based dynamic load balancing framework. This framework instruments and then migrates over-decomposed objects to balance computational load and communication at runtime. This paper explores the use of graph partitioning algorithms, traditionally used for partitioning physical domains/meshes, for measurement-based dynamic load balancing of parallel applications. In particular, we present repartitioning methods developed in a graph partitioning toolbox called SCOTCH that consider the previous mapping to minimize migration costs. We also discuss a new imbalance reduction algorithm for graphs with irregular load distributions. We compare several load balancing algorithms using microbenchmarks on Intrepid and Ranger and evaluate the effect of communication, number of cores and number of objects on the benefit achieved from load balancing. New algorithms developed in SCOTCH lead to better performance compared to the METIS partitioners for several cases, both in terms of application execution time and the number of objects migrated.
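
    The sketch below is only a toy illustration of measurement-based rebalancing that starts from the previous object-to-processor mapping so that most objects stay put; it is not Charm++ or SCOTCH code, and the data structures are assumptions.

      # Toy greedy refinement driven by measured per-object loads, starting from
      # the previous mapping so that migration volume stays small.
      from collections import defaultdict

      def refine(mapping, load, tol=1.05):
          """mapping: {obj: proc}; load: {obj: measured cost}. Returns a new mapping."""
          procs = set(mapping.values())
          proc_load = defaultdict(float)
          for obj, p in mapping.items():
              proc_load[p] += load[obj]
          avg = sum(proc_load.values()) / len(procs)
          new = dict(mapping)
          moved = True
          while moved:
              moved = False
              hot = max(procs, key=lambda p: proc_load[p])
              cold = min(procs, key=lambda p: proc_load[p])
              if proc_load[hot] <= tol * avg:
                  break
              # move the lightest object that still helps, to limit migration
              for o in sorted((o for o, p in new.items() if p == hot), key=lambda o: load[o]):
                  if load[o] <= 0:
                      continue
                  if proc_load[hot] - load[o] >= proc_load[cold] + load[o]:
                      new[o] = cold
                      proc_load[hot] -= load[o]
                      proc_load[cold] += load[o]
                      moved = True
                      break
          return new

      print(refine({"a": 0, "b": 0, "c": 0, "d": 1}, {"a": 5, "b": 4, "c": 1, "d": 2}))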

  15. Bayesian clustering of DNA sequences using Markov chains and a stochastic partition model.

    PubMed

    Jääskinen, Väinö; Parkkinen, Ville; Cheng, Lu; Corander, Jukka

    2014-02-01

    In many biological applications it is necessary to cluster DNA sequences into groups that represent underlying organismal units, such as named species or genera. In metagenomics this grouping needs typically to be achieved on the basis of relatively short sequences which contain different types of errors, making the use of a statistical modeling approach desirable. Here we introduce a novel method for this purpose by developing a stochastic partition model that clusters Markov chains of a given order. The model is based on a Dirichlet process prior and we use conjugate priors for the Markov chain parameters which enables an analytical expression for comparing the marginal likelihoods of any two partitions. To find a good candidate for the posterior mode in the partition space, we use a hybrid computational approach which combines the EM-algorithm with a greedy search. This is demonstrated to be faster and yield highly accurate results compared to earlier suggested clustering methods for the metagenomics application. Our model is fairly generic and could also be used for clustering of other types of sequence data for which Markov chains provide a reasonable way to compress information, as illustrated by experiments on shotgun sequence type data from an Escherichia coli strain.
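
    With conjugate Dirichlet priors on the transition probabilities, the marginal likelihood of the pooled transition counts in a cluster has a closed form, which is what makes alternative partitions directly comparable; the sketch below assumes a DNA alphabet, a symmetric prior and toy sequences, and is not the authors' software.

      # Collapsed (marginal) log-likelihood of first-order Markov transition counts
      # under a symmetric Dirichlet prior, summed over clusters of a partition.
      import numpy as np
      from scipy.special import gammaln

      ALPHABET = "ACGT"

      def transition_counts(seq, k=len(ALPHABET)):
          idx = {c: i for i, c in enumerate(ALPHABET)}
          n = np.zeros((k, k))
          for a, b in zip(seq[:-1], seq[1:]):
              n[idx[a], idx[b]] += 1
          return n

      def log_marginal(counts, alpha=1.0):
          """Row-wise Dirichlet-multinomial marginal likelihood of transition counts."""
          k = counts.shape[1]
          row_tot = counts.sum(axis=1)
          return float(np.sum(gammaln(k * alpha) - gammaln(k * alpha + row_tot)
                              + np.sum(gammaln(alpha + counts) - gammaln(alpha), axis=1)))

      def partition_score(clusters):
          # score = sum over clusters of the marginal likelihood of pooled counts
          return sum(log_marginal(sum(transition_counts(s) for s in c)) for c in clusters)

      seqs = ["ACGTACGTAC", "ACGTACGAAC", "GGGTTTGGGT"]
      print(partition_score([[seqs[0], seqs[1]], [seqs[2]]]))
      print(partition_score([[seqs[0], seqs[2]], [seqs[1]]]))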

  16. Alternative measures of lipophilicity: from octanol-water partitioning to IAM retention.

    PubMed

    Giaginis, Costas; Tsantili-Kakoulidou, Anna

    2008-08-01

    This review describes lipophilicity parameters currently used in drug design and QSAR studies. After a short historical overview, the complex nature of lipophilicity as the outcome of polar/nonpolar inter- and intramolecular interactions is analysed and considered as the background for the discussion of the different lipophilicity descriptors. The first part focuses on octanol-water partitioning of neutral and ionisable compounds, evaluates the efficiency of predictions and provides a short description of the experimental methods for the determination of distribution coefficients. The next part is dedicated to reversed-phase chromatographic techniques, HPLC and TLC, in lipophilicity assessment. The two methods are evaluated for their efficiency in simulating octanol-water partitioning, and the progress achieved in the refinement of suitable chromatographic conditions, in particular in the field of HPLC, is outlined. Liposomes as direct models of biological membranes are examined and phospholipophilicity is compared to the traditional lipophilicity concept. Difficulties associated with liposome-water partitioning are discussed. The last part focuses on Immobilised Artificial Membrane (IAM) chromatography as an alternative which combines membrane simulation with rapid measurements. IAM chromatographic retention is compared to octanol-water and liposome-water partitioning as well as to reversed-phase retention, and its potential to predict biopartitioning and biological activities is discussed.
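
    For the ionisable compounds discussed above, the standard textbook link between the partition coefficient of the neutral species (log P) and the pH-dependent distribution coefficient (log D) can be written down directly; the sketch assumes a monoprotic compound, that only the neutral form partitions, and illustrative numbers.

      # Textbook relation between log P (neutral species) and the pH-dependent
      # distribution coefficient log D for a monoprotic acid or base.
      import math

      def log_d(log_p, pka, ph, acid=True):
          ionized = 10 ** (ph - pka) if acid else 10 ** (pka - ph)
          return log_p - math.log10(1 + ionized)

      print(log_d(log_p=2.5, pka=4.2, ph=7.4, acid=True))   # carboxylic-acid-like example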

  17. Does History Repeat Itself? Wavelets and the Phylodynamics of Influenza A

    PubMed Central

    Tom, Jennifer A.; Sinsheimer, Janet S.; Suchard, Marc A.

    2012-01-01

    Unprecedented global surveillance of viruses will result in massive sequence data sets that require new statistical methods. These data sets press the limits of Bayesian phylogenetics as the high-dimensional parameters that comprise a phylogenetic tree increase the already sizable computational burden of these techniques. This burden often results in partitioning the data set, for example, by gene, and inferring the evolutionary dynamics of each partition independently, a compromise that results in stratified analyses that depend only on data within a given partition. However, parameter estimates inferred from these stratified models are likely strongly correlated, considering they rely on data from a single data set. To overcome this shortfall, we exploit the existing Monte Carlo realizations from stratified Bayesian analyses to efficiently estimate a nonparametric hierarchical wavelet-based model and learn about the time-varying parameters of effective population size that reflect levels of genetic diversity across all partitions simultaneously. Our methods are applied to complete genome influenza A sequences that span 13 years. We find that broad peaks and trends, as opposed to seasonal spikes, in the effective population size history distinguish individual segments from the complete genome. We also address hypotheses regarding intersegment dynamics within a formal statistical framework that accounts for correlation between segment-specific parameters. PMID:22160768

  18. Determination of water saturation using gas phase partitioning tracers and time-lapse electrical conductivity measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, Timothy C.; Oostrom, Martinus; Truex, Michael J.

    2013-05-21

    Water saturation is an important indicator of contaminant distribution and plays a governing role in contaminant transport within the vadose zone. Understanding the water saturation distribution is critical for both remediation and contaminant flux monitoring in unsaturated environments. In this work we propose and demonstrate a method of remotely determining water saturation levels using gas phase partitioning tracers and time-lapse bulk electrical conductivity measurements. The theoretical development includes the partitioning chemistry for the tracers we demonstrate (ammonia and carbon dioxide), as well as a review of the petrophysical relationship governing how these tracers influence bulk conductivity. We also investigate methods of utilizing secondary information provided by electrical conductivity breakthrough magnitudes induced by the tracers. We test the method on clean, well characterized, intermediate-scale sand columns under controlled conditions. Results demonstrate the capability to predict partitioning coefficients and accurately monitor gas breakthrough curves along the length of the column according to the corresponding electrical conductivity response, leading to accurate water saturation estimates. This work is motivated by the need to develop effective characterization and monitoring techniques for contaminated deep vadose zone environments, and provides a proof-of-concept toward uniquely characterizing and monitoring water saturation levels at the field scale and in three-dimensions using electrical resistivity tomography.

  19. SAMPL5: 3D-RISM partition coefficient calculations with partial molar volume corrections and solute conformational sampling.

    PubMed

    Luchko, Tyler; Blinov, Nikolay; Limon, Garrett C; Joyce, Kevin P; Kovalenko, Andriy

    2016-11-01

    Implicit solvent methods for classical molecular modeling are frequently used to provide fast, physics-based hydration free energies of macromolecules. Less commonly considered is the transferability of these methods to other solvents. The Statistical Assessment of Modeling of Proteins and Ligands 5 (SAMPL5) distribution coefficient dataset and the accompanying explicit solvent partition coefficient reference calculations provide a direct test of solvent model transferability. Here we use the 3D reference interaction site model (3D-RISM) statistical-mechanical solvation theory, with a well tested water model and a new united atom cyclohexane model, to calculate partition coefficients for the SAMPL5 dataset. The cyclohexane model performed well in training and testing (R=0.98 for amino acid neutral side chain analogues) but only if a parameterized solvation free energy correction was used. In contrast, the same protocol, using single solute conformations, performed poorly on the SAMPL5 dataset, obtaining R=0.73 compared to the reference partition coefficients, likely due to the much larger solute sizes. Including solute conformational sampling through molecular dynamics coupled with 3D-RISM (MD/3D-RISM) improved agreement with the reference calculation to R=0.93. Since our initial calculations only considered partition coefficients and not distribution coefficients, solute sampling provided little benefit comparing against experiment, where ionized and tautomer states are more important. Applying a simple pKa correction improved agreement with experiment from R=0.54 to R=0.66, despite a small number of outliers. Better agreement is possible by accounting for tautomers and improving the ionization correction.
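
    For reference, the conversion such calculations rely on, from solvation free energies in the two solvents to a partition coefficient, is a one-liner; the free-energy values below are placeholders rather than SAMPL5 results.

      # Partition coefficient from solvation (transfer) free energies; the numbers
      # are placeholders, not results from the study.
      import math

      R = 8.314462618e-3   # kJ mol-1 K-1
      T = 298.15

      def log_p(dG_water, dG_cyclohexane):
          """log P(cyclohexane/water) from solvation free energies in kJ/mol."""
          return (dG_water - dG_cyclohexane) / (math.log(10) * R * T)

      print(log_p(dG_water=-20.0, dG_cyclohexane=-28.0))   # ~ +1.4, favors cyclohexane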

  20. SAMPL5: 3D-RISM partition coefficient calculations with partial molar volume corrections and solute conformational sampling

    NASA Astrophysics Data System (ADS)

    Luchko, Tyler; Blinov, Nikolay; Limon, Garrett C.; Joyce, Kevin P.; Kovalenko, Andriy

    2016-11-01

    Implicit solvent methods for classical molecular modeling are frequently used to provide fast, physics-based hydration free energies of macromolecules. Less commonly considered is the transferability of these methods to other solvents. The Statistical Assessment of Modeling of Proteins and Ligands 5 (SAMPL5) distribution coefficient dataset and the accompanying explicit solvent partition coefficient reference calculations provide a direct test of solvent model transferability. Here we use the 3D reference interaction site model (3D-RISM) statistical-mechanical solvation theory, with a well tested water model and a new united atom cyclohexane model, to calculate partition coefficients for the SAMPL5 dataset. The cyclohexane model performed well in training and testing (R=0.98 for amino acid neutral side chain analogues) but only if a parameterized solvation free energy correction was used. In contrast, the same protocol, using single solute conformations, performed poorly on the SAMPL5 dataset, obtaining R=0.73 compared to the reference partition coefficients, likely due to the much larger solute sizes. Including solute conformational sampling through molecular dynamics coupled with 3D-RISM (MD/3D-RISM) improved agreement with the reference calculation to R=0.93. Since our initial calculations only considered partition coefficients and not distribution coefficients, solute sampling provided little benefit comparing against experiment, where ionized and tautomer states are more important. Applying a simple pKa correction improved agreement with experiment from R=0.54 to R=0.66, despite a small number of outliers. Better agreement is possible by accounting for tautomers and improving the ionization correction.

  1. Partition dataset according to amino acid type improves the prediction of deleterious non-synonymous SNPs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Jing; Li, Yuan-Yuan; Shanghai Center for Bioinformation Technology, Shanghai 200235

    2012-03-02

    Highlights: (1) Proper dataset partitioning can improve the prediction of deleterious nsSNPs. (2) Partitioning according to the original residue type at the nsSNP site is a good criterion. (3) A similar strategy is expected to be promising in other machine learning problems. -- Abstract: Many non-synonymous SNPs (nsSNPs) are associated with diseases, and numerous machine learning methods have been applied to train classifiers for sorting disease-associated nsSNPs from neutral ones. The continuously accumulating nsSNP data allow us to further explore better prediction approaches. In this work, we partitioned the training data into 20 subsets according to either the original or the substituted amino acid type at the nsSNP site. Using support vector machines (SVM), training classification models on each subset resulted in an overall accuracy of 76.3% or 74.9%, depending on the two partition criteria, while training on the whole dataset obtained an accuracy of only 72.6%. Moreover, when the dataset was instead randomly divided into 20 subsets, the corresponding accuracy was only 73.2%. Our results demonstrate that properly partitioning the whole training dataset into subsets, i.e., according to the residue type at the nsSNP site, improves the performance of the trained classifiers significantly, which should be valuable in developing better tools for predicting the disease association of nsSNPs.
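
    The partition-then-train idea reduces to grouping records by the original residue at the nsSNP site and fitting one classifier per group; the sketch below uses scikit-learn's SVC with a placeholder record format and features, not the authors' pipeline or data.

      # Sketch of dataset partitioning by residue type: one SVM per subset,
      # with prediction routed by the same key. Record format is hypothetical.
      from collections import defaultdict
      from sklearn.svm import SVC

      def train_per_residue(records):
          """records: iterable of (original_residue, feature_vector, label)."""
          by_residue = defaultdict(lambda: ([], []))
          for res, x, y in records:
              by_residue[res][0].append(x)
              by_residue[res][1].append(y)
          models = {}
          for res, (X, y) in by_residue.items():
              models[res] = SVC(kernel="rbf").fit(X, y)   # one classifier per subset
          return models

      def predict(models, res, x):
          return models[res].predict([x])[0]              # route by residue type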

  2. K-Partite RNA Secondary Structures

    NASA Astrophysics Data System (ADS)

    Jiang, Minghui; Tejada, Pedro J.; Lasisi, Ramoni O.; Cheng, Shanhong; Fechser, D. Scott

    RNA secondary structure prediction is a fundamental problem in structural bioinformatics. The prediction problem is difficult because RNA secondary structures may contain pseudoknots formed by crossing base pairs. We introduce k-partite secondary structures as a simple classification of RNA secondary structures with pseudoknots. An RNA secondary structure is k-partite if it is the union of k pseudoknot-free sub-structures. Most known RNA secondary structures are either bipartite or tripartite. We show that there exists a constant number k such that any secondary structure can be modified into a k-partite secondary structure with approximately the same free energy. This offers a partial explanation of the prevalence of k-partite secondary structures with small k. We give a complete characterization of the computational complexities of recognizing k-partite secondary structures for all k ≥ 2, and show that this recognition problem is essentially the same as the k-colorability problem on circle graphs. We present two simple heuristics, iterated peeling and first-fit packing, for finding k-partite RNA secondary structures. For maximizing the number of base pair stackings, our iterated peeling heuristic achieves a constant approximation ratio of at most k for 2 ≤ k ≤ 5, and at most 6/(1 − (1 − 6/k)^k) ≤ 6/(1 − e^(−6)) < 6.01491 for k ≥ 6. Experiments on sequences from PseudoBase show that our first-fit packing heuristic outperforms the leading method HotKnots in predicting RNA secondary structures with pseudoknots. Source code, data set, and experimental results are available at http://www.cs.usu.edu/ mjiang/rna/kpartite/.
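
    A first-fit packing heuristic in the spirit described above can be written in a few lines: scan the base pairs in a fixed order and put each one into the first layer where it crosses nothing. This is an illustrative sketch, not the authors' implementation or their stacking-maximizing variant.

      # Two pairs (i, j) and (k, l) "cross" if i < k < j < l; a layer is
      # pseudoknot-free when no two of its pairs cross.
      def crosses(p, q):
          (i, j), (k, l) = sorted([p, q])
          return i < k < j < l

      def first_fit_layers(pairs):
          layers = []
          for pair in sorted(pairs):                      # scan pairs in a fixed order
              for layer in layers:
                  if not any(crosses(pair, other) for other in layer):
                      layer.append(pair)
                      break
              else:
                  layers.append([pair])                   # open a new layer
          return layers

      # Example: (0, 5) and (2, 8) cross, so they end up in different layers.
      print(first_fit_layers([(0, 5), (2, 8), (10, 14), (11, 13)]))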

  3. Partitioning evapotranspiration via continuous sampling of water vapor isotopes over common row crops and candidate biofuel crops.

    NASA Astrophysics Data System (ADS)

    Miller, J. N.; Black, C. K.; Bernacchi, C.

    2014-12-01

    Global demand for renewable energy is accelerating land conversion from common row crops such as maize and soybean to cellulosic biofuel crops such as miscanthus and switchgrass. This land conversion is expected to alter ecohydrology via changes in evapotranspiration (ET). However, whether evapotranspiration will shift toward partitioning more moisture through soil evaporation (E) or through plant transpiration (T) is uncertain. To investigate how land conversion from maize to miscanthus affects ET partitioning we measured the isotopic composition of water vapor via continuous air sampling. We obtained continuous diurnal measurements of δ2H and δ18O for miscanthus and maize on multiple days over the course of the growing season. Water vapor isotopes drawn from two heights were measured at 2 Hz using a cavity ringdown spectrometer and partitioned into components of E and T using a simple mixing equation. A second approach to partitioning was accomplished by subtracting transpiration measurements, obtained through sap flow sensors, from total ET, measured via eddy covariance. Preliminary results reveal that both methods compare favorably and that transpiration dominates variations in ET in miscanthus fields more so than in fields of maize.
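
    The "simple mixing equation" referred to above is, in its usual two-end-member form, a one-line calculation; the delta values in the example are invented.

      # Two-end-member isotopic mixing used to split evapotranspiration (ET) into
      # evaporation (E) and transpiration (T).
      def transpiration_fraction(delta_et, delta_e, delta_t):
          """T/ET from the isotopic composition of the ET flux and the two end-members."""
          return (delta_et - delta_e) / (delta_t - delta_e)

      # e.g. d18O: ET vapor at -14 per mil, evaporation end-member -24, transpiration -10
      print(transpiration_fraction(-14.0, -24.0, -10.0))   # ~0.71 of ET is transpiration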

  4. Fasting Insulin is Better Partitioned according to Family History of Type 2 Diabetes Mellitus than Post Glucose Load Insulin of Oral Glucose Tolerance Test in Young Adults.

    PubMed

    Francis, Saritha; Chandran, Sindhu Padinjareveedu; Nesheera, K K; Jacob, Jose

    2017-05-01

    Hyperinsulinemia is contributed to by insulin resistance, hepatic insulin uptake, insulin secretion and the rate of insulin degradation. A family history of type 2 diabetes mellitus has been reported to cause hyperinsulinemia. The aim was to correlate fasting insulin with post-glucose-load Oral Glucose Tolerance Test (OGTT) insulin in young adults and to partition both according to family history of type 2 diabetes. In this observational cross-sectional study, clinical evaluation and biochemical assays of insulin, diabetes-related parameters and secondary clinical influences on type 2 diabetes were performed on volunteers to determine their inclusion as participants (n=90) or their exclusion. Cut-off levels of quantitative biochemical variables were fixed such that they included the effects of insulin resistance but excluded other secondary clinical influences. Distributions were analysed by the Shapiro-Wilk test and equality of variances by Levene's test; log10 transformations were used to convert groups to Gaussian distributions and to equalize variances in the groups compared. When the groups compared had Gaussian distributions and equal variances, parametric methods were used; otherwise, non-parametric methods were used. Fasting insulin correlated significantly with 30, 60 and 120 minute OGTT insulin, showing that hyperinsulinemia in the fasting state was related to hyperinsulinemia in the post-glucose-load states. When fasting and post-glucose-load OGTT insulin were partitioned into those without and with a family history of type 2 diabetes, the maximum difference was seen in fasting insulin (p<0.001), followed by 120 (p=0.001) and 60 (p=0.002) minute OGTT insulin. The 30 minute insulin could not be partitioned (p=0.574). Fasting, 60 and 120 minute OGTT insulin can be partitioned according to family history of type 2 diabetes, demonstrating stratification and heterogeneity in the insulin sample. Of these, fasting insulin was better partitioned and could be used for baseline reference interval calculations.
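
    A minimal sketch of the decision logic described above (normality and equal-variance checks, then a parametric or non-parametric group comparison), using generic SciPy tests and hypothetical insulin values; it is not the study's SAS/JMP analysis.

      # Test distributional assumptions, then pick t-test or Mann-Whitney U.
      import numpy as np
      from scipy import stats

      def compare_groups(a, b, alpha=0.05):
          a, b = np.log10(a), np.log10(b)   # log10 transform (applied where needed in the study)
          normal = stats.shapiro(a).pvalue > alpha and stats.shapiro(b).pvalue > alpha
          equal_var = stats.levene(a, b).pvalue > alpha
          if normal and equal_var:
              return "t-test", stats.ttest_ind(a, b, equal_var=True).pvalue
          return "Mann-Whitney U", stats.mannwhitneyu(a, b).pvalue

      # Hypothetical fasting insulin values, without and with family history.
      fasting_no_fh = np.array([4.1, 5.2, 6.0, 4.8, 5.5, 6.3, 4.9, 5.1])
      fasting_fh = np.array([7.9, 9.4, 8.8, 10.2, 7.5, 9.9, 8.1, 9.0])
      print(compare_groups(fasting_no_fh, fasting_fh))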

  5. A Model for Partitioning CO2 Flux and Calculating Transformation of Soil C Fractions

    NASA Astrophysics Data System (ADS)

    Zhang, S.; Noormets, A.; Tu, C.; King, J.

    2011-12-01

    It has been recognized that mechanistic understanding of soil organic carbon (SOC) mineralization requires partitioning of SOM into different sub-pools whose turnover kinetics differ. Different fractionation methods have been developed to separate and analyze SOC fractions with different turnover rates, but some recent studies have called into question earlier assumptions about the chemical structure of C compounds and their recalcitrance to decomposition. To our knowledge, there is also no model that brings together the information on various indicators of recalcitrance in a kinetic model framework. Here we deploy an analytical framework to partition soil net CO2 emissions among three density fractions (F1, F2, and F3, in order of increasing density) in a peat soil and to follow mineralization-related transformations (from lighter to heavier fractions). We followed the changes in total C content [C] and 13C of each of the three density fractions through a 3-month incubation study. We partitioned the CO2 produced by the soil between the different fractions using 13C and [C] change data. Applying this approach to a factorial experiment, we found that partitioning of CO2 emission and transformation rates among fractions differed between the organic topsoil and the deeper sandy soil. At a depth of 45-75 cm, almost no C was released through CO2 emission for any of the three fractions, while at 0-30 cm, emission reached 0.2 g C/g soil over the incubation period, an average of 99% of which was from F2. The mineralization-related transformation rate at 45-75 cm was 0.02 g soil/g soil, with no significant differences among fractions. At 0-30 cm, out of one gram of initial bulk soil, an average of 0.31 g of F1 transformed to F2, whereas no F2 was transformed to F3. Although the current study was carried out on a high-organic soil, the partitioning method is applicable to all soil types.

  6. Partitioning of fluoranthene between free and bound forms in stormwater runoff and other urban discharges using passive dosing.

    PubMed

    Birch, Heidi; Mayer, Philipp; Lützhøft, Hans-Christian Holten; Mikkelsen, Peter Steen

    2012-11-15

    Partitioning of fluoranthene in stormwater runoff and other urban discharges was measured by a new analytical method based on passive dosing. Samples were collected at the inlet (n = 11) and outlet (n = 8) from a stormwater retention pond in Albertslund (Denmark), and for comparison samples were also obtained at a municipal wastewater treatment plant, a power plant, a contaminated site and a waste deposit in Copenhagen (n = 1 at each site). The freely dissolved concentration of (14)C-fluoranthene in the samples was controlled by equilibrium partitioning from a pre-loaded polymer and the total sample concentration measured. The measurements yielded free fractions of fluoranthene in stormwater in the range 0.04-0.15 in the inlet during the first part of the runoff events increasing to 0.3-0.5 at the end of the events and in the outlet from the retention pond. The enhanced capacity of the different stormwater samples for carrying fluoranthene was 2-23 relative to pure water and decreasing during rain events. The enhanced capacity of stormwater showed a different relationship with suspended solid concentrations than the other types of urban discharges. Partitioning of fluoranthene to dissolved organic carbon was lower than partitioning to particulate organic carbon. Partitioning of fluoranthene to particulate organic matter in the 19 stormwater samples yielded a log K(POM) of 5.18. The presented results can be used in stormwater quality modeling and assessment of efficiency of stormwater treatment systems. This work also shows the potential of the passive dosing method to obtain conversion factors between total concentrations, which are needed for comparison with water quality criteria, and freely dissolved concentrations, which are more related to toxicity and obtained by the use of most passive samplers. Copyright © 2012 Elsevier Ltd. All rights reserved.

  7. Evapotranspiration Partitioning Using Rapid Measurements of Isotopic Composition of Water Vapor in a Semi Arid Evergreen Forest

    NASA Astrophysics Data System (ADS)

    Meuth, J. A.; Dominguez, F.

    2011-12-01

    Evapotranspiration partitioning into transpiration and evaporation is an important step in understanding the relative contribution of the vegetated land surface to total atmospheric moisture in an area. This type of study has rarely been done over long time periods focusing on small time scales of variation. The relative contributions of whole-canopy transpiration and soil evaporation to total evapotranspiration were determined in a mid-latitude semi-arid evergreen forest using stable isotope measurements of atmospheric water vapor. We used a cavity ringdown spectrometer to collect continuous 5-second average isotopic and water vapor measurements throughout the ecosystem boundary layer. In addition, we analyzed the isotopic composition of liquid water extracted from soil, leaf and stem samples to obtain relative contributions of transpiration and evaporation to whole-canopy evapotranspiration. The results from this method provided many time periods throughout the day with statistically significant data. This method can be used to follow daily, monthly, or yearly cycles of evapotranspiration partitioning with relative ease and accuracy.

  8. End-to-end QoS bounds for RTP-based service subnetworks

    NASA Astrophysics Data System (ADS)

    Pitts, Jonathan M.; Schormans, John A.

    1999-11-01

    With the increasing focus on traffic prioritization to support voice-data integration in corporate intranets, practical methods are needed to dimension and manage cost efficient service partitions. This is particularly important for the provisioning of real time, delay sensitive services such as telephony and voice/video conferencing applications. Typically these can be provided over RTP/UDP/IP or ATM DBR/SBR bearers but, irrespective of the specific networking technology, the switches or routers need to implement some form of virtual buffer management with queue scheduling mechanisms to provide partitioning. The key requirement is for operators of such networks to be able to dimension the partitions and virtual buffer sizes for efficient resource utilization, instead of simply over-dimensioning. This paper draws on recent work at Queen Mary, University of London, supported by the UK Engineering and Physical Sciences Research Council, to investigate approximate analytical methods for assessing end to end delay variation bounds in cell based and packet based networks.

  9. ORGANIC-HIGH IONIC STRENGTH AQUEOUS SOLVENT SYSTEMS FOR SPIRAL COUNTER-CURRENT CHROMATOGRAPHY: GRAPHIC OPTIMIZATION OF PARTITION COEFFICIENT

    PubMed Central

    Zeng, Yun; Liu, Gang; Ma, Ying; Chen, Xiaoyuan; Ito, Yoichiro

    2012-01-01

    A new series of organic-high ionic strength aqueous two-phase solvents systems was designed for separation of highly polar compounds by spiral high-speed counter-current chromatography. A total of 21 solvent systems composed of 1-butanol-ethanol-saturated ammonium sulfate-water at various volume ratios are arranged according to an increasing order of polarity. Selection of the two-phase solvent system for a single compound or a multiple sample mixture can be achieved by two steps of partition coefficient measurements using a graphic method. The capability of the method is demonstrated by optimization of partition coefficient for seven highly polar samples including tartrazine (K=0.77), tryptophan (K=1.00), methyl green (K= 0.93), tyrosine (0.81), metanephrine (K=0.89), tyramine (K=0.98), and normetanephrine (K=0.96). Three sulfonic acid components in D&C Green No. 8 were successfully separated by HSCCC using the graphic selection of the two-phase solvent system. PMID:23467197

  10. Bladder cancer treatment response assessment in CT urography using two-channel deep-learning network

    NASA Astrophysics Data System (ADS)

    Cha, Kenny H.; Hadjiiski, Lubomir M.; Chan, Heang-Ping; Samala, Ravi K.; Cohan, Richard H.; Caoili, Elaine M.; Weizer, Alon Z.; Alva, Ajjai

    2018-02-01

    We are developing a CAD system for bladder cancer treatment response assessment in CT. We trained a 2-Channel Deep-Learning Convolutional Neural Network (2Ch-DCNN) to identify responders (T0 disease) and nonresponders to chemotherapy. The 87 lesions from 82 cases generated 18,600 training paired ROIs that were extracted from segmented bladder lesions in the pre- and post-treatment CT scans and partitioned for 2-fold cross validation. The paired ROIs were input to two parallel channels of the 2Ch-DCNN. We compared the 2Ch-DCNN with our hybrid pre-/post-treatment ROI DCNN method and the assessments by 2 experienced abdominal radiologists. Each radiologist estimated the likelihood of stage T0 after viewing each pre-post-treatment CT pair. Receiver operating characteristic analysis was performed and the area under the curve (AUC) and the partial AUC at sensitivity <90% (AUC0.9) were compared. The test AUCs were 0.76±0.07 and 0.75±0.07 for the 2 partitions, respectively, for the 2Ch-DCNN, and were 0.75±0.08 and 0.75±0.07 for the hybrid ROI method. The AUCs for Radiologist 1 were 0.67±0.09 and 0.75±0.07 for the 2 partitions, respectively, and were 0.79±0.07 and 0.70±0.09 for Radiologist 2. For the 2Ch-DCNN, the AUC0.9s were 0.43 and 0.39 for the 2 partitions, respectively, and were 0.19 and 0.28 for the hybrid ROI method. For Radiologist 1, the AUC0.9s were 0.14 and 0.34 for partitions 1 and 2, respectively, and were 0.33 and 0.23 for Radiologist 2. Our study demonstrated the feasibility of using a 2Ch-DCNN for the estimation of bladder cancer treatment response in CT.

  11. Integrating spot short-term measurements of carbon emissions and backward dietary energy partition calculations to estimate intake in lactating dairy cows fed ad libitum or restricted.

    PubMed

    Pereira, A B D; Utsumi, S A; Dorich, C D; Brito, A F

    2015-12-01

    The objective of this study was to use spot short-term measurements of CH4 (QCH4) and CO2 (QCO2) integrated with backward dietary energy partition calculations to estimate dry matter intake (DMI) in lactating dairy cows. Twelve multiparous cows averaging 173±37d in milk and 4 primiparous cows averaging 179±27d in milk were blocked by days in milk, parity, and DMI (as a percentage of body weight) and, within each block, randomly assigned to 1 of 2 treatments: ad libitum intake (AL) or restricted intake (RI=90% DMI) according to a crossover design. Each experimental period lasted 22d, with 14d for treatment adaptation and 8d for data and sample collection. Diets contained (dry matter basis): 40% corn silage, 12% grass-legume haylage, and 48% concentrate. Spot short-term gas measurements were taken in 5-min sampling periods from 15 cows (1 cow refused sampling) using a portable, automated, open-circuit gas quantification system (GreenFeed, C-Lock Inc., Rapid City, SD) with intervals of 12h between the 2 daily samples. Sampling points were advanced by 2h from one day to the next to yield 16 gas samples per cow over 8d to account for diurnal variation in QCH4 and QCO2. The following equations were used sequentially to estimate DMI: (1) heat production (MJ/d)=(4.96 + 16.07 ÷ respiratory quotient) × QCO2; respiratory quotient=0.95; (2) metabolizable energy intake (MJ/d)=(heat production + milk energy) ± tissue energy balance; (3) digestible energy (DE) intake (MJ/d)=metabolizable energy + CH4 energy + urinary energy; (4) gross energy (GE) intake (MJ/d)=DE + [(DE ÷ in vitro true dry matter digestibility) - DE]; and (5) DMI (kg/d)=GE intake estimated ÷ diet GE concentration. Data were analyzed using the MIXED procedure of SAS (SAS Institute Inc., Cary, NC) and the Fit Model procedure in JMP (α=0.05; SAS Institute Inc.). Cows significantly differed in DMI measured (23.8 vs. 22.4kg/d for AL and RI, respectively). Dry matter intake estimated using QCH4 and QCO2 coupled with dietary backward energy partition calculations (Equations 1 to 5 above) was highest in cows fed for AL (22.5 vs. 20.2kg/d). The resulting R² values were 0.28 between DMI measured and DMI estimated by gaseous measurements, and 0.36 between DMI measured and DMI predicted by the National Research Council model (2001). Results showed that spot short-term measurements of QCH4 and QCO2 coupled with dietary backward estimations of energy partition underestimated DMI by 7.8%. However, the approach proposed herein was able to significantly discriminate differences in DMI between cows fed for AL or RI. Copyright © 2015 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
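
    The five equations above translate directly into code; the sketch below simply chains them in order. Units follow the abstract (MJ/d for energy terms, digestibility as a fraction, diet GE in MJ/kg of DM); QCO2 is assumed to be expressed in units consistent with Equation 1 as printed, and all example values are invented rather than taken from the study.

      # Backward dietary energy partition chain (Equations 1-5 of the abstract).
      def estimate_dmi(qco2, milk_energy, tissue_energy, ch4_energy, urinary_energy,
                       ivtdmd, diet_ge, rq=0.95):
          # tissue_energy carries its own sign (positive if tissue energy was gained)
          hp = (4.96 + 16.07 / rq) * qco2                  # (1) heat production, MJ/d
          me = hp + milk_energy + tissue_energy            # (2) metabolizable energy intake
          de = me + ch4_energy + urinary_energy            # (3) digestible energy intake
          ge = de + (de / ivtdmd - de)                     # (4) gross energy intake
          return ge / diet_ge                              # (5) DMI, kg/d

      print(estimate_dmi(qco2=6.0, milk_energy=95.0, tissue_energy=5.0,
                         ch4_energy=22.0, urinary_energy=12.0,
                         ivtdmd=0.80, diet_ge=18.5))       # roughly 18 kg/d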

  12. Interlaboratory Validation of the Leaching Environmental Assessment Framework (LEAF) Method 1313 and Method 1316

    EPA Science Inventory

    This document summarizes the results of an interlaboratory study conducted to generate precision estimates for two parallel batch leaching methods which are part of the Leaching Environmental Assessment Framework (LEAF). These methods are: (1) Method 1313: Liquid-Solid Partition...

  13. Fundamental studies on the feasibility of deep eutectic solvents for the selective partition of glaucarubinone present in the roots of Simarouba glauca.

    PubMed

    Kholiya, Faisal; Bhatt, Nidhi; Rathod, Meena R; Meena, Ramavatar; Prasad, Kamalesh

    2015-07-14

    Several deep eutectic solvents prepared by the complexation of choline chloride as the hydrogen bond acceptor and hydrogen bond donors such as urea, thiourea, ethylene glycol, and glycerol were employed to partition glaucarubinone, an antimalarial compound present in the roots of the plant Simarouba glauca. Among all the solvents, the deep eutectic solvent consisting of a mixture of choline chloride and urea was the most suitable for selectively partitioning the antimalarial compound from the extract. Analytical tools such as high-performance liquid chromatography and electrospray ionization mass spectrometry were used for characterization, and glaucarubinone extracted from the roots of the plant by a conventional solvent extraction method was used as a reference for comparison. The hydrogen bonds and other noncovalent interactions formed between glaucarubinone and the deep eutectic solvents could be responsible for the selective partitioning of the drug molecule. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. [Determination of equilibrium solubility and n-octanol/water partition coefficient of pulchinenosiden D by HPLC].

    PubMed

    Rao, Xiao-Yong; Yin, Shan; Zhang, Guo-Song; Luo, Xiao-Jian; Jian, Hui; Feng, Yu-Lin; Yang, Shi-Lin

    2014-05-01

    The aim was to determine the equilibrium solubility of pulchinenosiden D in different solvents and its n-octanol/water partition coefficients. The shake-flask method was combined with high performance liquid chromatography (HPLC) to determine the n-octanol/water partition coefficients of pulchinenosiden D, and the equilibrium solubility of pulchinenosiden D in six organic solvents and in buffer solutions of different pH was determined by HPLC analysis. The n-octanol/water partition coefficients of pulchinenosiden D at different pH values were greater than zero, and the equilibrium solubility of pulchinenosiden D increased with increasing pH of the buffer solution. The maximum equilibrium solubility of pulchinenosiden D was 255.89 g·L(-1) in methanol, and the minimum was 0.20 g·L(-1) in acetonitrile. Under gastrointestinal physiological conditions, pulchinenosiden D exists in its molecular (un-ionized) state and has good absorption but poor water solubility, so increasing the dissolution rate of pulchinenosiden D may enhance its bioavailability.

  15. Is the gas-particle partitioning in alpha-pinene secondary organic aerosol reversible?

    NASA Astrophysics Data System (ADS)

    Grieshop, Andrew P.; Donahue, Neil M.; Robinson, Allen L.

    2007-07-01

    This paper discusses the reversibility of gas-particle partitioning in secondary organic aerosol (SOA) formed from α-pinene ozonolysis in a smog chamber. Previously, phase partitioning has been studied quantitatively via SOA production experiments and qualitatively by perturbing temperature and observing particle evaporation. In this work, two methods were used to isothermally dilute the SOA: an external dilution sampler and an in-chamber technique. Dilution caused some evaporation of SOA, but repartitioning took place on a time scale of tens of minutes to hours-consistent with an uptake coefficient on the order of 0.001-0.01. However, given sufficient time, α-pinene SOA repartitions reversibly based on comparisons with data from conventional SOA yield experiments. Further, aerosol mass spectrometer (AMS) data indicate that the composition of SOA varies with partitioning. These results suggest that oligomerization observed in high-concentration laboratory experiments may be a reversible process and underscore the complexity of the kinetics of formation and evaporation of SOA.

  16. Thermal expansion and cation partitioning of MnFe2O4 (Jacobsite) from 1.6 to 1276 K studied by using neutron powder diffraction

    NASA Astrophysics Data System (ADS)

    Levy, Davide; Pastero, Linda; Hoser, Andreas; Viscovo, Gabriele

    2015-01-01

    MnFe2O4 is a low-cost and stable magnetic spinel ferrite. In this phase, the influence of the inversion degree on the magnetic properties is still not well understood. To understand this relationship, Mn-ferrite was synthesized by a chemical co-precipitation method modified in our laboratory and studied by using the Neutron Powder Diffraction from 1.6 K to 1243 K. A full refinement of both crystal and magnetic structures was performed in order to correlate the high-temperature cation partitioning, the Curie transition and the structure changes of the Mn-ferrite. In this work three main temperature intervals are detected, characterized by different Mn-ferrite behaviors: first, ranging from 1.6 K to 573 K, where MnFe2O4 is magnetic; second, from 573 K to 623 K, where MnFe2O4 becomes paramagnetic without cation partitioning; and lastly, from 673 K to 1243 K, where cation partitioning occurs.

  17. A parallel algorithm for multi-level logic synthesis using the transduction method. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Lim, Chieng-Fai

    1991-01-01

    The Transduction Method has been shown to be a powerful tool in the optimization of multilevel networks. Many tools such as the SYLON synthesis system (X90), (CM89), (LM90) have been developed based on this method. A parallel implementation is presented of SYLON-XTRANS (XM89) on an eight processor Encore Multimax shared memory multiprocessor. It minimizes multilevel networks consisting of simple gates through parallel pruning, gate substitution, gate merging, generalized gate substitution, and gate input reduction. This implementation, called Parallel TRANSduction (PTRANS), also uses partitioning to break large circuits up and performs inter- and intra-partition dynamic load balancing. With this, good speedups and high processor efficiencies are achievable without sacrificing the resulting circuit quality.

  18. Novel methods for predicting gas-particle partitioning during the formation of secondary organic aerosol

    NASA Astrophysics Data System (ADS)

    Wania, F.; Lei, Y. D.; Wang, C.; Abbatt, J. P. D.; Goss, K.-U.

    2014-12-01

    Several methods have been presented in the literature to predict an organic chemical's equilibrium partitioning between the water insoluble organic matter (WIOM) component of aerosol and the gas phase, Ki,WIOM, as a function of temperature. They include (i) polyparameter linear free energy relationships calibrated with empirical aerosol sorption data, as well as (ii) the solvation models implemented in SPARC and (iii) the quantum-chemical software COSMOtherm, which predict solvation equilibria from molecular structure alone. We demonstrate that these methods can be used to predict Ki,WIOM for large numbers of individual molecules implicated in secondary organic aerosol (SOA) formation, including those with multiple functional groups. Although very different in their theoretical foundations, these methods give remarkably consistent results for the products of the reaction of normal alkanes with OH, i.e. their partition coefficients Ki,WIOM generally agree within one order of magnitude over a range of more than ten orders of magnitude. This level of agreement is much better than that achieved by different vapour pressure estimation methods that are more commonly used in the SOA community. Also, in contrast to the agreement between vapour pressure estimates, the agreement between the Ki,WIOM estimates does not deteriorate with increasing number of functional groups. Furthermore, these partitioning coefficients Ki,WIOM predicted SOA mass yields in agreement with those measured in chamber experiments of the oxidation of normal alkanes. If a Ki,WIOM prediction method was based on one or more surrogate molecules representing the solvation properties of the mixed OM phase of SOA, the choice of those molecule(s) was found to have a relatively minor effect on the predicted Ki,WIOM, as long as the molecule(s) are not very polar. This suggests that a single surrogate molecule, such as 1-octanol or a hypothetical SOA structure proposed by Kalberer et al. (2004), may often be sufficient to represent the WIOM component of the SOA phase, greatly simplifying the prediction. The presented methods could substitute for vapour-pressure-based methods in studies such as the explicit modelling of SOA formation from single precursor molecules in chamber experiments.

  19. Comments on "The multisynapse neural network and its application to fuzzy clustering".

    PubMed

    Yu, Jian; Hao, Pengwei

    2005-05-01

    In the above-mentioned paper, Wei and Fahn proposed a neural architecture, the multisynapse neural network, to solve constrained optimization problems including high-order, logarithmic, and sinusoidal forms, etc. As one of its main applications, a fuzzy bidirectional associative clustering network (FBACN) was proposed for fuzzy-partition clustering according to the objective-functional method. The connection between the objective-functional-based fuzzy c-partition algorithms and FBACN is the Lagrange multiplier approach. Unfortunately, the Lagrange multiplier approach was incorrectly applied, so that FBACN does not equivalently minimize its corresponding constrained objective function. Additionally, Wei and Fahn adopted the traditional definition of a fuzzy c-partition, which is not satisfied by FBACN. Therefore, FBACN cannot solve constrained optimization problems either.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Do, Hainam (E-mail: h.do@nottingham.ac.uk); Wheatley, Richard J. (E-mail: richard.wheatley@nottingham.ac.uk)

    A robust and model-free Monte Carlo simulation method is proposed to address the challenge of computing the classical density of states and partition function of solids. Starting from the minimum configurational energy, the algorithm partitions the entire energy range in the increasing energy direction (“upward”) into subdivisions whose integrated density of states is known. When combined with the density of states computed from the “downward” energy partitioning approach [H. Do, J. D. Hirst, and R. J. Wheatley, J. Chem. Phys. 135, 174105 (2011)], the equilibrium thermodynamic properties can be evaluated at any temperature and in any phase. The method is illustrated in the context of the Lennard-Jones system and can readily be extended to other molecular systems and clusters for which the structures are known.
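
    Once an integrated density of states is available on an energy grid, the step from it to the partition function and thermal averages is direct summation; the sketch below uses a synthetic g(E) and reduced units, and is not the upward/downward partitioning algorithm itself.

      # Partition function and thermal averages from a tabulated density of states.
      import numpy as np

      def thermal_averages(E, g, T, kB=1.0):
          beta = 1.0 / (kB * T)
          w = g * np.exp(-beta * (E - E.min()))        # shift keeps exponentials tame;
          Z = w.sum()                                  # Z is relative to the ground state
          U = (E * w).sum() / Z                        # internal energy <E>
          C = kB * beta**2 * ((E**2 * w).sum() / Z - U**2)   # heat capacity
          return Z, U, C

      E = np.linspace(0.0, 10.0, 1001)                 # reduced energy grid
      g = E**2                                         # toy density of states, not LJ data
      for T in (0.5, 1.0, 2.0):
          print(T, thermal_averages(E, g, T))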

  1. Efficient O(N) integration for all-electron electronic structure calculation using numeric basis functions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Havu, V.; Fritz Haber Institute of the Max Planck Society, Berlin; Blum, V.

    2009-12-01

    We consider the problem of developing O(N) scaling grid-based operations needed in many central operations when performing electronic structure calculations with numeric atom-centered orbitals as basis functions. We outline the overall formulation of localized algorithms, and specifically the creation of localized grid batches. The choice of the grid partitioning scheme plays an important role in the performance and memory consumption of the grid-based operations. Three different top-down partitioning methods are investigated, and compared with formally more rigorous yet much more expensive bottom-up algorithms. We show that a conceptually simple top-down grid partitioning scheme achieves essentially the same efficiency as the more rigorous bottom-up approaches.

  2. High-performance parallel analysis of coupled problems for aircraft propulsion

    NASA Technical Reports Server (NTRS)

    Felippa, C. A.; Farhat, C.; Lanteri, S.; Gumaste, U.; Ronaghi, M.

    1994-01-01

    Applications of high-performance parallel computation to the analysis of complete jet engines, treated as a multidisciplinary coupled problem, are described. The coupled problem involves the interaction of structures with gas dynamics, heat conduction and heat transfer in aircraft engines. The methodology issues addressed include: consistent discrete formulation of coupled problems with emphasis on coupling phenomena; the effect of partitioning strategies, augmentation and temporal solution procedures; sensitivity of the response to problem parameters; and methods for interfacing multiscale discretizations in different single fields. The computer implementation issues addressed include: parallel treatment of coupled systems; domain decomposition and mesh partitioning strategies; data representation in object-oriented form and mapping to hardware-driven representation; and tradeoff studies between partitioning schemes and fully coupled treatment.

  3. Check-Standard Testing Across Multiple Transonic Wind Tunnels with the Modern Design of Experiments

    NASA Technical Reports Server (NTRS)

    Deloach, Richard

    2012-01-01

    This paper reports the result of an analysis of wind tunnel data acquired in support of the Facility Analysis Verification & Operational Reliability (FAVOR) project. The analysis uses methods referred to collectively at Langley Research Center as the Modern Design of Experiments (MDOE). These methods quantify the total variance in a sample of wind tunnel data and partition it into explained and unexplained components. The unexplained component is further partitioned into random and systematic components. This analysis was performed on data acquired in similar wind tunnel tests executed in four different U.S. transonic facilities. The measurement environment of each facility was quantified and compared.
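    A minimal sketch of the kind of variance partitioning described above, under simplifying assumptions: a quadratic response model supplies the explained component, the residual is the unexplained component, and replicated set points let the residual be split into a random part (pure error) and a systematic part (lack of fit). The data, model form, and variable names are invented for illustration and are not MDOE itself.

      # Hedged sketch: partitioning total variance in replicated wind-tunnel-style
      # data into explained (model) and unexplained parts, and the unexplained
      # part into random (pure error from replicates) and systematic (lack of fit)
      # components. Data and model form are illustrative.
      import numpy as np

      rng = np.random.default_rng(1)
      alpha = np.repeat(np.linspace(-4, 10, 8), 4)          # angle of attack, 4 replicates each
      cl = 0.1 * alpha + 0.002 * alpha**2 + rng.normal(0, 0.02, alpha.size)

      # explained vs unexplained: fit a simple quadratic response model
      X = np.column_stack([np.ones_like(alpha), alpha, alpha**2])
      beta, *_ = np.linalg.lstsq(X, cl, rcond=None)
      fit = X @ beta
      ss_total = np.sum((cl - cl.mean())**2)
      ss_model = np.sum((fit - cl.mean())**2)
      ss_resid = np.sum((cl - fit)**2)

      # unexplained -> random (pure error) + systematic (lack of fit), via replicates
      ss_pure = sum(np.sum((cl[alpha == a] - cl[alpha == a].mean())**2)
                    for a in np.unique(alpha))
      ss_lof = ss_resid - ss_pure

      print(f"total={ss_total:.4f} explained={ss_model:.4f} unexplained={ss_resid:.4f}")
      print(f"  random (pure error)={ss_pure:.4f} systematic (lack of fit)={ss_lof:.4f}")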

  4. The challenge to unify treatment of high-temperature fatigue - A partisan proposal based on strainrange partitioning

    NASA Technical Reports Server (NTRS)

    Manson, S. S.

    1972-01-01

    The strainrange partitioning concept divides the imposed strain into four basic ranges involving time-dependent and time-independent components. It is shown that some of the results presented at the symposium can be better correlated on the basis of this concept than by alternative methods. It is also suggested that methods of data generation and analysis can be helpfully guided by this approach. Potential applicability of the concept to the treatment of frequency and hold-time effects, environmental influence, crack initiation and growth, thermal fatigue, and code specifications is briefly considered. A required experimental program is outlined.

  5. Ray tracing a three-dimensional scene using a hierarchical data structure

    DOEpatents

    Wald, Ingo; Boulos, Solomon; Shirley, Peter

    2012-09-04

    Ray tracing a three-dimensional scene made up of geometric primitives that are spatially partitioned into a hierarchical data structure. One example embodiment is a method for ray tracing a three-dimensional scene made up of geometric primitives that are spatially partitioned into a hierarchical data structure. In this example embodiment, the hierarchical data structure includes at least a parent node and a corresponding plurality of child nodes. The method includes a first act of determining that a first active ray in the packet hits the parent node and a second act of descending to each of the plurality of child nodes.
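    The sketch below is a much-simplified, hedged illustration of the traversal idea in the abstract: if any active ray in a packet hits a parent node's bounding box, traversal descends to each of its children. The node layout, slab test, and placeholder leaf handling are assumptions for illustration, not the patented embodiment.

      # Hedged sketch: packet traversal of a simple bounding-volume hierarchy.
      # If any active ray in the packet hits a parent's box, descend to all
      # children; at leaves, record candidate ray-primitive tests. Layout is
      # hypothetical and much simplified.
      from dataclasses import dataclass, field

      @dataclass
      class Node:
          box_min: tuple          # axis-aligned bounding box
          box_max: tuple
          children: list = field(default_factory=list)
          primitives: list = field(default_factory=list)   # only at leaves

      def ray_hits_box(origin, inv_dir, box_min, box_max):
          """Slab test for one ray (assumes nonzero direction components)."""
          t_near, t_far = 0.0, float("inf")
          for o, inv, lo, hi in zip(origin, inv_dir, box_min, box_max):
              t1, t2 = (lo - o) * inv, (hi - o) * inv
              t_near, t_far = max(t_near, min(t1, t2)), min(t_far, max(t1, t2))
          return t_near <= t_far

      def traverse(node, packet, hit_out):
          """packet: list of (origin, inv_dir) rays; hit_out collects (ray_idx, prim)."""
          if not any(ray_hits_box(o, d, node.box_min, node.box_max) for o, d in packet):
              return                      # no active ray hits this node
          if node.primitives:             # leaf: record candidate tests for rays that reach it
              for i, (o, d) in enumerate(packet):
                  if ray_hits_box(o, d, node.box_min, node.box_max):
                      for prim in node.primitives:
                          hit_out.append((i, prim))   # stand-in for a real ray-primitive test
              return
          for child in node.children:     # descend to each child of the parent
              traverse(child, packet, hit_out)

      # tiny example: one parent with two leaf children and a two-ray packet
      leaf1 = Node((0, 0, 0), (1, 1, 1), primitives=["tri_A"])
      leaf2 = Node((2, 0, 0), (3, 1, 1), primitives=["tri_B"])
      root = Node((0, 0, 0), (3, 1, 1), children=[leaf1, leaf2])
      packet = [((0.5, 0.5, -1.0), (1e9, 1e9, 1.0)),      # ray along +z through leaf1
                ((2.5, 0.5, -1.0), (1e9, 1e9, 1.0))]      # ray along +z through leaf2
      hits = []
      traverse(root, packet, hits)
      print(hits)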

  6. Temperature-mortality relationship in dairy cattle in France based on an iso-hygro-thermal partition of the territory

    NASA Astrophysics Data System (ADS)

    Morignat, Eric; Gay, Emilie; Vinard, Jean-Luc; Calavas, Didier; Hénaux, Viviane

    2017-11-01

    Global warming, and more specifically its health impact on populations, is of increasing concern. The aim of our study was to evaluate the impact of temperature on dairy cattle mortality in France during the warm season (April-August). We therefore devised and implemented a spatial partitioning method to divide France into areas in which weather conditions were homogeneous, combining a multiple factor analysis with a clustering method using both weather and spatial data. We then used time-series regressions (2001-2008) to model the relationship between the temperature humidity index (an index representing the temperature corrected by the relative humidity) and dairy cattle mortality within these areas. We found a significant effect of heat on dairy cattle mortality, but also an effect of cooler temperatures (to a lesser extent in some areas), which leads to a U-shaped relationship in the studied areas. Our partitioning approach based on weather criteria, associated with classic clustering methods, may contribute to better estimating temperature effects, a critical issue for animal health and welfare. Beyond its use in animal health, this approach can also be of interest in several situations in the context of human health.

  7. A path integral methodology for obtaining thermodynamic properties of nonadiabatic systems using Gaussian mixture distributions

    NASA Astrophysics Data System (ADS)

    Raymond, Neil; Iouchtchenko, Dmitri; Roy, Pierre-Nicholas; Nooijen, Marcel

    2018-05-01

    We introduce a new path integral Monte Carlo method for investigating nonadiabatic systems in thermal equilibrium and demonstrate an approach to reducing stochastic error. We derive a general path integral expression for the partition function in a product basis of continuous nuclear and discrete electronic degrees of freedom without the use of any mapping schemes. We separate our Hamiltonian into a harmonic portion and a coupling portion; the partition function can then be calculated as the product of a Monte Carlo estimator (of the coupling contribution to the partition function) and a normalization factor (that is evaluated analytically). A Gaussian mixture model is used to evaluate the Monte Carlo estimator in a computationally efficient manner. Using two model systems, we demonstrate our approach to reduce the stochastic error associated with the Monte Carlo estimator. We show that the selection of the harmonic oscillators comprising the sampling distribution directly affects the efficiency of the method. Our results demonstrate that our path integral Monte Carlo method's deviation from exact Trotter calculations is dominated by the choice of the sampling distribution. By improving the sampling distribution, we can drastically reduce the stochastic error leading to lower computational cost.

  8. SL(2, C) group action on cohomological field theories

    NASA Astrophysics Data System (ADS)

    Basalaev, Alexey

    2018-01-01

    We introduce the SL(2, C) group action on a partition function of a cohomological field theory via a certain Givental's action. Restricted to the small phase space, we describe the action via the explicit formulae on a CohFT genus g potential. We prove that applied to the total ancestor potential of a simple-elliptic singularity the action introduced coincides with the transformation of Milanov-Ruan changing the primitive form (cf. Milanov and Ruan in Gromov-Witten theory of elliptic orbifold P1 and quasi-modular forms, arXiv:1106.2321, 2011).

  9. [Study for the revision of analytical method for tris (2,3-dibromopropyl) phosphate with restriction in textiles].

    PubMed

    Mimura, Mayumi; Nakashima, Harunobu; Yoshida, Jin; Yoshida, Toshiaki; Kawakami, Tsuyoshi; Isama, Kazuo

    2014-01-01

    The official analytical method for tris(2,3-dibromopropyl)phosphate (TDBPP), which is banned from use in textile products by the "Act on Control of Household Products Containing Harmful Substances", requires revision. This study examined an analytical method for TDBPP by GC/MS using a capillary column. Thermal decomposition of TDBPP was observed in GC/MS measurement using a capillary column, unlike in the case of gas chromatography/flame photometric detector (GC/FPD) measurement based on a direct injection method using a capillary megabore column. A quadratic curve, Y = 2572X^1.416, was obtained for the calibration curve of GC/FPD in the concentration range 2.0-100 μg/mL. The detection limit was 1.0 μg/mL at S/N = 3. The reproducibility for repetitive injections was satisfactory. A pretreatment method was established using methanol extraction, followed by liquid-liquid partition and purification with a Florisil cartridge column. The recovery rate of this method was ~100%. TDBPP was not detected in any of the five commercial products analyzed in this study. To understand the cause of TDBPP decomposition during GC/MS (electron ionization; EI) measurement using a capillary column, GC/MS (chemical ionization; CI), GC/FPD, and gas chromatography/flame ionization detector (GC/FID) measurements were conducted. It was suggested that TDBPP might thermally decompose both during GC injection, especially with a splitless injection method, and in the column or ion sources. For the GC/MS measurement, an injection port with a quartz liner was used and the column length was halved (15 m); thus, a single peak could be obtained.

  10. Joint multifractal analysis based on the partition function approach: analytical analysis, numerical simulation and empirical application

    NASA Astrophysics Data System (ADS)

    Xie, Wen-Jie; Jiang, Zhi-Qiang; Gu, Gao-Feng; Xiong, Xiong; Zhou, Wei-Xing

    2015-10-01

    Many complex systems generate multifractal time series which are long-range cross-correlated. Numerous methods have been proposed to characterize the multifractal nature of these long-range cross correlations. However, several important issues about these methods are not well understood and most methods consider only one moment order. We study the joint multifractal analysis based on partition function with two moment orders, which was initially invented to investigate fluid fields, and derive analytically several important properties. We apply the method numerically to binomial measures with multifractal cross correlations and bivariate fractional Brownian motions without multifractal cross correlations. For binomial multifractal measures, the explicit expressions of mass function, singularity strength and multifractal spectrum of the cross correlations are derived, which agree excellently with the numerical results. We also apply the method to stock market indexes and unveil intriguing multifractality in the cross correlations of index volatilities.
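    As a hedged companion to the description above, the sketch below computes a joint partition function chi(q1, q2; s) for two measures on the same support and estimates the joint mass exponent tau(q1, q2) from the slope of log chi versus log s. The synthetic input series, box sizes, and normalisation choices are assumptions for illustration and follow one common convention rather than the paper's exact formulation.

      # Hedged sketch: joint partition-function multifractal analysis of two
      # measures defined on the same support. For box size s, chi(q1, q2; s) is
      # the sum over boxes of mu1^q1 * mu2^q2, and the joint mass exponent
      # tau(q1, q2) is the slope of log chi against log s. Input series are synthetic.
      import numpy as np

      def joint_mass_exponent(x1, x2, q1, q2, box_sizes):
          """Estimate tau(q1, q2) for two non-negative series of equal length."""
          x1 = np.asarray(x1, float) / np.sum(x1)      # normalise to measures
          x2 = np.asarray(x2, float) / np.sum(x2)
          log_s, log_chi = [], []
          for s in box_sizes:
              n = (len(x1) // s) * s                   # drop the incomplete last box
              mu1 = x1[:n].reshape(-1, s).sum(axis=1)
              mu2 = x2[:n].reshape(-1, s).sum(axis=1)
              keep = (mu1 > 0) & (mu2 > 0)
              chi = np.sum(mu1[keep]**q1 * mu2[keep]**q2)
              log_s.append(np.log(s / len(x1)))
              log_chi.append(np.log(chi))
          slope, _ = np.polyfit(log_s, log_chi, 1)
          return slope                                  # tau(q1, q2)

      rng = np.random.default_rng(0)
      a = rng.lognormal(size=4096)
      b = a * rng.lognormal(sigma=0.3, size=4096)       # correlated second measure
      print(joint_mass_exponent(a, b, q1=2, q2=2, box_sizes=[8, 16, 32, 64, 128]))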

  11. Communication: Charge-population based dispersion interactions for molecules and materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stöhr, Martin; Department Chemie, Technische Universität München, Lichtenbergstr. 4, D-85748 Garching; Michelitsch, Georg S.

    2016-04-21

    We introduce a system-independent method to derive effective atomic C6 coefficients and polarizabilities in molecules and materials purely from charge population analysis. This enables the use of dispersion-correction schemes in electronic structure calculations without recourse to electron-density partitioning schemes and expands their applicability to semi-empirical methods and tight-binding Hamiltonians. We show that the accuracy of our method is on par with established electron-density partitioning based approaches in describing intermolecular C6 coefficients as well as dispersion energies of weakly bound molecular dimers, organic crystals, and supramolecular complexes. We showcase the utility of our approach by incorporating the recently developed many-body dispersion method [Tkatchenko et al., Phys. Rev. Lett. 108, 236402 (2012)] into the semi-empirical density functional tight-binding method, and we propose the latter as a viable technique to study hybrid organic-inorganic interfaces.

  12. Dosimetry and prescription in liver radioembolization with 90Y microspheres: 3D calculation of tumor-to-liver ratio from global 99mTc-MAA SPECT information

    NASA Astrophysics Data System (ADS)

    Mañeru, Fernando; Abós, Dolores; Bragado, Laura; Fuentemilla, Naiara; Caudepón, Fernando; Pellejero, Santiago; Miquelez, Santiago; Rubio, Anastasio; Goñi, Elena; Hernández-Vitoria, Araceli

    2017-12-01

    Dosimetry in liver radioembolization with 90Y microspheres is a fundamental tool, both for the optimization of each treatment and for improving knowledge of the treatment effects in the tissues. Different options are available for estimating the administered activity and the tumor/organ dose, among them the so-called partition method. The key factor in the partition method is the tumor/normal tissue activity uptake ratio (T/N), which is obtained by a single-photon emission computed tomography (SPECT) scan during a pre-treatment simulation. The less clear the distinction between healthy and tumor parenchyma within the liver, the more difficult it becomes to estimate the T/N ratio; therefore the use of the method is limited. This study presents a methodology to calculate the T/N ratio using global information from the SPECT. The T/N ratio is estimated by establishing uptake thresholds consistent with previously performed volumetry. This dose calculation method was validated against 3D voxel dosimetry, and was also compared with the standard partition method based on freehand regions of interest (ROI) outlining on SPECT slices. Both comparisons were done on a sample of 20 actual cases of hepatocellular carcinoma treated with resin microspheres. The proposed method and the voxel dosimetry method yield similar results, while the ROI-based method tends to over-estimate the dose to normal tissues. In addition, the variability associated with the ROI-based method is greater than that of the other methods. The proposed method is simpler than either the ROI or voxel dosimetry approaches and avoids the subjectivity associated with the manual selection of regions.

  13. Evapotranspiration partitioning for three agro-ecosystems with contrasting moisture conditions: a comparison of an isotope method and a two-source model calculation

    NASA Astrophysics Data System (ADS)

    Wei, Z.; Lee, X.; Wen, X.; Xiao, W.

    2017-12-01

    Quantification of the contribution of transpiration (T) to evapotranspiration (ET) is a requirement for understanding changes in carbon assimilation and water cycling in a changing environment. So far, few studies have examined seasonal variability of T/ET and compared different ET partitioning methods under natural conditions across diverse agro-ecosystems. In this study, we apply a two-source model to partition ET for three agro-ecosystems (rice, wheat and corn). The model-estimated T/ET ranges from 0 to 1, with a near continuous increase over time in the early growing season when leaf area index (LAI) is less than 2.5 and then convergence towards a stable value beyond LAI of 2.5. The seasonal change in T/ET can be described well as a function of LAI, implying that LAI is a first-order factor affecting ET partitioning. The two-source model results show that the growing-season (May-September for rice, April-June for wheat, and June-September for corn) T/ET is 0.50, 0.84 and 0.64, while an isotopic approach shows that T/ET is 0.74, 0.93 and 0.81 for rice, wheat and corn, respectively. The two-source model results are supported by soil lysimeter and eddy covariance measurements made during the same time period for wheat (0.87). Uncertainty analysis suggests that further improvements to the Craig-Gordon model prediction of the evaporation isotope composition and to measurement of the isotopic composition of ET are necessary to achieve accurate flux partitioning at the ecosystem scale using water isotopes as tracers.

  14. Partitioning net ecosystem carbon exchange into net assimilation and respiration using 13CO2 measurements: A cost-effective sampling strategy

    NASA Astrophysics Data System (ADS)

    OgéE, J.; Peylin, P.; Ciais, P.; Bariac, T.; Brunet, Y.; Berbigier, P.; Roche, C.; Richard, P.; Bardoux, G.; Bonnefond, J.-M.

    2003-06-01

    The current emphasis on global climate studies has led the scientific community to set up a number of sites for measuring the long-term biosphere-atmosphere net CO2 exchange (net ecosystem exchange, NEE). Partitioning this flux into its elementary components, net assimilation (FA), and respiration (FR), remains necessary in order to get a better understanding of biosphere functioning and design better surface exchange models. Noting that FR and FA have different isotopic signatures, we evaluate the potential of isotopic 13CO2 measurements in the air (combined with CO2 flux and concentration measurements) to partition NEE into FR and FA on a routine basis. The study is conducted at a temperate coniferous forest where intensive isotopic measurements in air, soil, and biomass were performed in summer 1997. The multilayer soil-vegetation-atmosphere transfer model MuSICA is adapted to compute 13CO2 flux and concentration profiles. Using MuSICA as a "perfect" simulator and taking advantage of the very dense spatiotemporal resolution of the isotopic data set (341 flasks over a 24-hour period) enable us to test each hypothesis and estimate the performance of the method. The partitioning works better in midafternoon when isotopic disequilibrium is strong. With only 15 flasks, i.e., two 13CO2 nighttime profiles (to estimate the isotopic signature of FR) and five daytime measurements (to perform the partitioning) we get mean daily estimates of FR and FA that agree with the model within 15-20%. However, knowledge of the mesophyll conductance seems crucial and may be a limitation to the method.

  15. Experimental determination of the partitioning coefficient of β-pinene oxidation products in SOAs.

    PubMed

    Hohaus, Thorsten; Gensch, Iulia; Kimmel, Joel; Worsnop, Douglas R; Kiendler-Scharr, Astrid

    2015-06-14

    The composition of secondary organic aerosols (SOAs) formed by β-pinene ozonolysis was experimentally investigated in the Juelich aerosol chamber. Partitioning of oxidation products between gas and particles was measured through concurrent concentration measurements in both phases. Partitioning coefficients (Kp) of 2.23 × 10⁻⁵ ± 3.20 × 10⁻⁶ m³ μg⁻¹ for nopinone, 4.86 × 10⁻⁴ ± 1.80 × 10⁻⁴ m³ μg⁻¹ for apoverbenone, 6.84 × 10⁻⁴ ± 1.52 × 10⁻⁴ m³ μg⁻¹ for oxonopinone and 2.00 × 10⁻³ ± 1.13 × 10⁻³ m³ μg⁻¹ for hydroxynopinone were derived, showing higher values for more oxygenated species. The observed Kp values were compared with values predicted using two different semi-empirical approaches. Both methods led to an underestimation of the partitioning coefficients with systematic differences between the methods. Assuming that the deviation between the experiment and the model is due to non-ideality of the mixed solution in particles, activity coefficients of 4.82 × 10⁻² for nopinone, 2.17 × 10⁻³ for apoverbenone, 3.09 × 10⁻¹ for oxonopinone and 7.74 × 10⁻¹ for hydroxynopinone would result using the vapour pressure estimation technique that leads to higher Kp. We discuss that such large non-ideality for nopinone could arise due to particle phase processes lowering the effective nopinone vapour pressure such as diol- or dimer formation. The observed high partitioning coefficients compared to modelled results imply an underestimation of SOA mass by applying equilibrium conditions.

  16. Heavy metal partitioning of suspended particulate matter-water and sediment-water in the Yangtze Estuary.

    PubMed

    Feng, Chenghong; Guo, Xiaoyu; Yin, Su; Tian, Chenhao; Li, Yangyang; Shen, Zhenyao

    2017-10-01

    The partitioning of ten heavy metals (As, Cd, Co, Cr, Cu, Hg, Ni, Pb, Sb, and Zn) between the water, suspended particulate matter (SPM), and sediments in seven channel sections during three hydrologic seasons in the Yangtze Estuary was comprehensively investigated. Special attention was paid to the role of tides, influential factors (concentrations of SPM and dissolved organic carbon, and particle size), and heavy metal speciation. The SPM-water and sediment-water partition coefficients (Kp) of the heavy metals exhibited similar changes along the channel sections, though the former were larger throughout the estuary. Because of the higher salinity, the Kp values of most of the metals were higher in the north branch than in the south branch. The Kp values of Cd, Co, and As generally decreased from the wet season to the dry season. Both the diagonal line method and paired samples t-test showed that no specific phase transfer of heavy metals existed during the flood and ebb tides, but the sediment-water Kp was more concentrated for the diagonal line method, owing to the relatively smaller tidal influences on the sediment. The partition coefficients (especially the Kp for SPM-water) had negative correlations with the dissolved organic carbon (DOC) but positive correlations were noted with the particle size for most of the heavy metals in sediment. Two types of significant correlations were observed between Kp and metal speciation (i.e., exchangeable, carbonate, reducible, organic, and residual fractions), which can be used to identify the dominant phase-partition mechanisms (e.g., adsorption or desorption) of heavy metals. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. Establishing pediatric reference intervals for 13 biochemical analytes derived from normal subjects in a pediatric endocrinology clinic in Korea.

    PubMed

    Cho, Sun-Mi; Lee, Sang-Guk; Kim, Ho Seong; Kim, Jeong-Ho

    2014-12-01

    Defining pediatric reference intervals is one of the most difficult tasks for laboratory physicians. The continuously changing physiology of growing children makes their laboratory values moving targets. In addition, ethnic and behavioral differences might also cause variations. The aim of this study was to establish age- and sex-specific partitioned reference intervals for 13 serum biochemical analytes in Korean children. A total of 2474 patients, girls aged 2-14 years and boys aged 2-16 years, who underwent a short stature workup but were diagnosed as normal at the Pediatric Endocrinology Clinic of Severance Hospital (Seoul, Korea) between September 2010 and June 2012 were included in this study. The levels of serum calcium, inorganic phosphorus, blood urea nitrogen, creatinine, uric acid, glucose, total cholesterol, total protein, albumin, alkaline phosphatase, aspartate aminotransferase, alanine aminotransferase, and total bilirubin were measured using a Hitachi 7600 analyzer (Hitachi High-Technologies Corporation, Tokyo, Japan). Reference intervals were partitioned according to sex or age subgroups using the Harris and Boyd method. Most analytes except calcium and albumin required partitioning either by sex or age. Age-specific partitioned reference intervals for alkaline phosphatase, creatinine, and total bilirubin were established for both males and females after being partitioned by sex. Additional age-specific partitioning of aspartate aminotransferase in females and total protein and uric acid in males was also required. Inorganic phosphorus, total cholesterol, alanine aminotransferase, blood urea nitrogen, and glucose were partitioned only by sex. This study provided updated age- and sex-specific pediatric reference intervals for 13 basic serum chemistry analytes from a sufficient number of healthy children by using a modern analytical chemistry platform. Copyright © 2014 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
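    For context, the check below is a hedged sketch of the Harris and Boyd style criterion commonly used for this kind of partitioning decision: a z statistic for the difference in subgroup means is compared with the rule-of-thumb critical value 3 * sqrt(n_avg / 120). The summary statistics are invented placeholders, not values from the study.

      # Hedged sketch of the Harris & Boyd criterion commonly used to decide
      # whether reference intervals should be partitioned by sex or age: compute
      # z for the difference in subgroup means and compare it with the usual
      # rule-of-thumb critical value z* = 3 * sqrt(n_avg / 120). Numbers below
      # are illustrative, not the study's data.
      import math

      def harris_boyd(mean1, sd1, n1, mean2, sd2, n2):
          z = abs(mean1 - mean2) / math.sqrt(sd1**2 / n1 + sd2**2 / n2)
          z_crit = 3.0 * math.sqrt(((n1 + n2) / 2.0) / 120.0)
          return z, z_crit, z > z_crit    # True -> partition the reference interval

      # e.g. alkaline phosphatase (hypothetical summary statistics), boys vs girls
      z, z_crit, partition = harris_boyd(mean1=310, sd1=95, n1=620,
                                         mean2=265, sd2=80, n2=590)
      print(f"z = {z:.2f}, critical = {z_crit:.2f}, partition = {partition}")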

  18. Development of TLSER model and QSAR model for predicting partition coefficients of hydrophobic organic chemicals between low density polyethylene film and water.

    PubMed

    Liu, Huihui; Wei, Mengbi; Yang, Xianhai; Yin, Cen; He, Xiao

    2017-01-01

    Partition coefficients are vital parameters for accurately measuring chemical concentrations with passive sampling devices. Given the wide use of low density polyethylene (LDPE) film in passive sampling, we developed a theoretical linear solvation energy relationship (TLSER) model and a quantitative structure-activity relationship (QSAR) model for the prediction of the partition coefficient of chemicals between LDPE and water (Kpew). For chemicals with an octanol-water partition coefficient (log KOW) < 8, a TLSER model with Vx (McGowan volume) and qA- (the most negative charge on O, N, S, X atoms) as descriptors was developed, but the model had a relatively low determination coefficient (R²) and cross-validated coefficient (Q²). In order to further explore the theoretical mechanisms involved in the partition process, a QSAR model with four descriptors (MLOGP (Moriguchi octanol-water partition coefficient), P_VSA_s_3 (P_VSA-like on I-state, bin 3), Hy (hydrophilic factor) and NssO (number of atoms of type ssO)) was established, and statistical analysis indicated that the model had satisfactory goodness-of-fit, robustness and predictive ability. For chemicals with log KOW > 8, a TLSER model with Vx and a QSAR model with MLOGP as descriptor were developed. This is the first paper to explore models for such highly hydrophobic chemicals. The applicability domain of the models, characterized by the Euclidean distance-based method and Williams plot, covered a large number of structurally diverse chemicals, which included nearly all the common hydrophobic organic compounds. Additionally, through mechanism interpretation, we explored the structural features governing the partition behavior of chemicals between LDPE and water. Copyright © 2016 Elsevier B.V. All rights reserved.
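    The snippet below is a generic, hedged sketch of the workflow such QSAR models rely on: an ordinary least-squares fit of log K on a few descriptors, reported with R² for goodness of fit and a leave-one-out Q² for internal validation. The descriptor matrix and response values are synthetic placeholders, not the TLSER or QSAR models of the paper.

      # Hedged sketch: fitting a small MLR-type QSAR for a partition coefficient
      # and reporting R^2 and leave-one-out Q^2. Descriptor values and "measured"
      # log K are synthetic placeholders.
      import numpy as np

      rng = np.random.default_rng(3)
      n = 40
      descriptors = rng.normal(size=(n, 3))                 # e.g. volume, charge, hydrophilicity proxies
      log_k = 1.5 * descriptors[:, 0] - 0.8 * descriptors[:, 1] + rng.normal(0, 0.3, n)

      X = np.column_stack([np.ones(n), descriptors])

      def fit_predict(X_train, y_train, X_test):
          coef, *_ = np.linalg.lstsq(X_train, y_train, rcond=None)
          return X_test @ coef

      # goodness of fit (R^2)
      pred = fit_predict(X, log_k, X)
      ss_res = np.sum((log_k - pred)**2)
      ss_tot = np.sum((log_k - log_k.mean())**2)
      r2 = 1 - ss_res / ss_tot

      # leave-one-out cross-validation (Q^2)
      press = 0.0
      for i in range(n):
          mask = np.arange(n) != i
          press += (log_k[i] - fit_predict(X[mask], log_k[mask], X[i]))**2
      q2 = 1 - press / ss_tot

      print(f"R^2 = {r2:.3f}, Q^2 = {q2:.3f}")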

  19. Balancing a U-Shaped Assembly Line by Applying Nested Partitions Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bhagwat, Nikhil V.

    2005-01-01

    In this study, we applied the Nested Partitions method to a U-line balancing problem and conducted experiments to evaluate the application. From the results, it is quite evident that the Nested Partitions method provided near optimal solutions (optimal in some cases). In addition, the execution time is quite short compared to the Branch and Bound algorithm. However, for larger data sets, the algorithm took significantly longer times for execution. One of the reasons could be the way in which the random samples are generated. In the present study, a random sample is a solution in itself, which requires assignment of tasks to various stations. The time taken to assign tasks to stations is directly proportional to the number of tasks. Thus, if the number of tasks increases, the time taken to generate random samples for the different regions also increases. The performance index for the Nested Partitions method in the present study was the number of stations in the random solutions (samples) generated. The total idle time for the samples can be used as another performance index. The ULINO method is known to have used a combination of bounds to come up with good solutions. This approach of combining different performance indices can be used to evaluate the random samples and obtain even better solutions. Here, we used deterministic time values for the tasks. In industries where the majority of tasks are performed manually, the stochastic version of the problem could be of vital importance. Experimenting with different objective functions (the number of stations was used in this study) could be of significance to industries where the cost associated with creating a new station is not the same. For such industries, the results obtained by using the present approach will not be of much value. Labor costs, task incompletion costs or a combination of those can be effectively used as alternate objective functions.

  20. Mitochondrial thermogenesis and obesity.

    PubMed

    Gambert, Ségolène; Ricquier, Daniel

    2007-11-01

    Thermogenesis is activated at the expense of carbon molecules. Mitochondria play a dominant role in oxidation and parallel heat production since the recovery of oxidation energy is less than perfect. Recent data on mitochondriogenesis and mitochondrial thermogenesis may boost research into certain aspects of obesity. Recent studies have outlined the unexpected decreased thermogenesis that limits fat loss during prolonged food restriction. Activation of fat oxidation in skeletal muscle remains a strategy against fat accumulation, however. Certain adipose depots have the potential to promote thermogenesis, either using mitochondrial uncoupling protein or independently. Peroxisome proliferator-activated receptor gamma coactivators alpha and beta are important regulators of mitochondrial thermogenesis. Brain mitochondria are involved in the control of refeeding after starvation. This dual action of mitochondria informs their role in thermogenesis and energy partitioning. The importance of thyroid hormones in mitochondrial thermogenesis is also confirmed. The clinical and research implications of these findings are that the mechanisms inhibiting adaptive thermogenesis during diet restriction should be investigated. An important field of research is the contribution of transcriptional coactivators to adipocyte plasticity, since adipocytes have an underestimated ability to oxidise fatty acids in addition to their role in triglyceride storage.

  1. Task partitioning in a robot swarm: object retrieval as a sequence of subtasks with direct object transfer.

    PubMed

    Pini, Giovanni; Brutschy, Arne; Scheidler, Alexander; Dorigo, Marco; Birattari, Mauro

    2014-01-01

    We study task partitioning in the context of swarm robotics. Task partitioning is the decomposition of a task into subtasks that can be tackled by different workers. We focus on the case in which a task is partitioned into a sequence of subtasks that must be executed in a certain order. This implies that the subtasks must interface with each other, and that the output of a subtask is used as input for the subtask that follows. A distinction can be made between task partitioning with direct transfer and with indirect transfer. We focus our study on the first case: The output of a subtask is directly transferred from an individual working on that subtask to an individual working on the subtask that follows. As a test bed for our study, we use a swarm of robots performing foraging. The robots have to harvest objects from a source, situated in an unknown location, and transport them to a home location. When a robot finds the source, it memorizes its position and uses dead reckoning to return there. Dead reckoning is appealing in robotics, since it is a cheap localization method and it does not require any additional external infrastructure. However, dead reckoning leads to errors that grow in time if not corrected periodically. We compare a foraging strategy that does not make use of task partitioning with one that does. We show that cooperation through task partitioning can be used to limit the effect of dead reckoning errors. This results in improved capability of locating the object source and in increased performance of the swarm. We use the implemented system as a test bed to study benefits and costs of task partitioning with direct transfer. We implement the system with real robots, demonstrating the feasibility of our approach in a foraging scenario.

  2. Assessing the effects of architectural variations on light partitioning within virtual wheat–pea mixtures

    PubMed Central

    Barillot, Romain; Escobar-Gutiérrez, Abraham J.; Fournier, Christian; Huynh, Pierre; Combes, Didier

    2014-01-01

    Background and Aims Predicting light partitioning in crop mixtures is a critical step in improving the productivity of such complex systems, and light interception has been shown to be closely linked to plant architecture. The aim of the present work was to analyse the relationships between plant architecture and light partitioning within wheat–pea (Triticum aestivum–Pisum sativum) mixtures. An existing model for wheat was utilized and a new model for pea morphogenesis was developed. Both models were then used to assess the effects of architectural variations in light partitioning. Methods First, a deterministic model (L-Pea) was developed in order to obtain dynamic reconstructions of pea architecture. The L-Pea model is based on L-systems formalism and consists of modules for ‘vegetative development’ and ‘organ extension’. A tripartite simulator was then built up from pea and wheat models interfaced with a radiative transfer model. Architectural parameters from both plant models, selected on the basis of their contribution to leaf area index (LAI), height and leaf geometry, were then modified in order to generate contrasting architectures of wheat and pea. Key results By scaling down the analysis to the organ level, it could be shown that the number of branches/tillers and length of internodes significantly determined the partitioning of light within mixtures. Temporal relationships between light partitioning and the LAI and height of the different species showed that light capture was mainly related to the architectural traits involved in plant LAI during the early stages of development, and in plant height during the onset of interspecific competition. Conclusions In silico experiments enabled the study of the intrinsic effects of architectural parameters on the partitioning of light in crop mixtures of wheat and pea. The findings show that plant architecture is an important criterion for the identification/breeding of plant ideotypes, particularly with respect to light partitioning. PMID:24907314

  3. Molecular phylogeny of the aquatic beetle family Noteridae (Coleoptera: Adephaga) with an emphasis on data partitioning strategies.

    PubMed

    Baca, Stephen M; Toussaint, Emmanuel F A; Miller, Kelly B; Short, Andrew E Z

    2017-02-01

    The first molecular phylogenetic hypothesis for the aquatic beetle family Noteridae is inferred using DNA sequence data from five gene fragments (mitochondrial and nuclear): COI, H3, 16S, 18S, and 28S. Our analysis is the most comprehensive phylogenetic reconstruction of Noteridae to date, and includes 53 species representing all subfamilies, tribes and 16 of the 17 genera within the family. We examine the impact of data partitioning on phylogenetic inference by comparing two different algorithm-based partitioning strategies: one using predefined subsets of the dataset, and another recently introduced method, which uses the k-means algorithm to iteratively divide the dataset into clusters of sites evolving at similar rates across sampled loci. We conducted both maximum likelihood and Bayesian inference analyses using these different partitioning schemes. Resulting trees are strongly incongruent with prior classifications of Noteridae. We recover variant tree topologies and support values among the implemented partitioning schemes. Bayes factors calculated with marginal likelihoods of Bayesian analyses support a priori partitioning over k-means and unpartitioned data strategies. Our study substantiates the importance of data partitioning in phylogenetic inference, and underscores the use of comparative analyses to determine optimal analytical strategies. Our analyses recover Noterini Thomson to be paraphyletic with respect to three other tribes. The genera Suphisellus Crotch and Hydrocanthus Say are also recovered as paraphyletic. Following the results of the preferred partitioning scheme, we here propose a revised classification of Noteridae, comprising two subfamilies, three tribes and 18 genera. The following taxonomic changes are made: Notomicrinae sensu n. (= Phreatodytinae syn. n.) is expanded to include the tribe Phreatodytini; Noterini sensu n. (= Neohydrocoptini syn. n., Pronoterini syn. n., Tonerini syn. n.) is expanded to include all genera of the Noterinae; The genus Suphisellus Crotch is expanded to include species of Pronoterus Sharp syn. n.; and the former subgenus Sternocanthus Guignot stat. rev. is resurrected from synonymy and elevated to genus rank. Copyright © 2016 Elsevier Inc. All rights reserved.

  4. A computing method for spatial accessibility based on grid partition

    NASA Astrophysics Data System (ADS)

    Ma, Linbing; Zhang, Xinchang

    2007-06-01

    An accessibility computing method based on grid partition is put forward in this paper. Two important factors affecting traffic, the density of the road network and the relative spatial resistance of different land uses, were integrated into the computation of the traffic cost of each grid cell. The A* algorithm was introduced to search for the optimal traffic-cost path over the grid; a detailed search process and the definition of the heuristic evaluation function are described in the paper. The method can therefore be implemented simply, and its data sources are easy to obtain. Moreover, by changing the heuristic search information, more reasonable results can be obtained. To validate the approach, a software package was developed in C# in the ArcEngine 9 environment. Applying the computing method, a case study on the accessibility of business districts in Guangzhou city was carried out.
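    A minimal sketch of the search step described above, under stated assumptions: each grid cell carries a traversal cost combining road-network density and a land-use resistance factor, and A* with an admissible Manhattan-distance heuristic returns the minimum accumulated cost between two cells. The cost grids and weights are illustrative, not the paper's data or exact cost definition.

      # Hedged sketch: A* search over a grid whose per-cell traversal cost mixes
      # road-network density and a land-use resistance factor. Grids and weights
      # are illustrative placeholders.
      import heapq
      import numpy as np

      def a_star_cost(cost_grid, start, goal):
          """Return the minimum accumulated cost from start to goal (4-neighbour moves)."""
          rows, cols = cost_grid.shape
          min_cell = cost_grid.min()
          def heuristic(c):   # admissible: Manhattan distance times cheapest cell cost
              return (abs(c[0] - goal[0]) + abs(c[1] - goal[1])) * min_cell
          open_heap = [(heuristic(start), 0.0, start)]
          best = {start: 0.0}
          while open_heap:
              _, g, cell = heapq.heappop(open_heap)
              if cell == goal:
                  return g
              if g > best.get(cell, float("inf")):
                  continue                      # stale heap entry
              r, c = cell
              for nr, nc in ((r+1, c), (r-1, c), (r, c+1), (r, c-1)):
                  if 0 <= nr < rows and 0 <= nc < cols:
                      ng = g + cost_grid[nr, nc]
                      if ng < best.get((nr, nc), float("inf")):
                          best[(nr, nc)] = ng
                          heapq.heappush(open_heap, (ng + heuristic((nr, nc)), ng, (nr, nc)))
          return float("inf")

      road_density = np.random.default_rng(0).uniform(0.2, 1.0, (50, 50))
      landuse_resistance = np.ones((50, 50))            # e.g. higher for water or steep cells
      cost = landuse_resistance / road_density          # denser roads -> cheaper to traverse
      print(a_star_cost(cost, start=(0, 0), goal=(49, 49)))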

  5. [A Simultaneous Determination Method with Acetonitrile-n-Hexane Partitioning and Solid-Phase Extraction for Pesticide Residues in Livestock and Marine Products by GC-MS].

    PubMed

    Yoshizaki, Mayuko; Kobayashi, Yukari; Shimizu, Masanori; Maruyama, Kouichi

    2015-01-01

    A simultaneous determination method was examined for 312 pesticides (including isomers) in muscle of livestock and marine products by GC-MS. The pesticide residues extracted from samples with acetone and n-hexane were purified by acetonitrile-n-hexane partitioning, and C18 and SAX/PSA solid-phase extraction without using GPC. Matrix components such as cholesterol were effectively removed. In recovery tests performed by this method using pork, beef, chicken and shrimp, 237-257 pesticides showed recoveries within the range of 70-120% in each sample. Validity was confirmed for 214 of the target pesticides by means of a validation test using pork. In comparison with the Japanese official method using GPC, the treatment time of samples and the quantity of solvent were reduced substantially.

  6. Estimating Atomic Contributions to Hydration and Binding Using Free Energy Perturbation.

    PubMed

    Irwin, Benedict W J; Huggins, David J

    2018-06-12

    We present a general method called atom-wise free energy perturbation (AFEP), which extends a conventional molecular dynamics free energy perturbation (FEP) simulation to give the contribution to a free energy change from each atom. AFEP is derived from an expansion of the Zwanzig equation used in the exponential averaging method, by defining that the system total energy can be partitioned into contributions from each atom. A partitioning method is assumed and used to group terms in the expansion to correspond to individual atoms. AFEP is applied to six example free energy changes to demonstrate the method. First, it is applied to the hydration free energies of methane, methanol, methylamine, methanethiol, and caffeine in water. Finally, AFEP is applied to the binding free energy of human immunodeficiency virus type 1 protease to lopinavir, and AFEP reveals the contribution of each atom to the binding free energy, indicating candidate areas of the molecule to improve to produce a more strongly binding inhibitor. FEP gives a single value for the free energy change and is already a very useful method. AFEP gives a free energy change for each "part" of the system being simulated, where a part can mean individual atoms, chemical groups, amino acids, or larger partitions depending on what the user is trying to measure. This method should have various applications in molecular dynamics studies of physical, chemical, or biochemical phenomena, specifically in the field of computational drug discovery.
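    The toy sketch below illustrates the underlying exponential-averaging (Zwanzig) estimator and one naive way to attribute the total free energy change to atoms when the perturbation energy is assumed to be an additive sum of per-atom terms; it is not the AFEP expansion itself, and all energies are synthetic.

      # Hedged toy illustration: exponential-averaging (Zwanzig) free energy
      # perturbation, plus one *naive* per-atom attribution in which the total
      # free energy change is split in proportion to each atom's reweighted
      # perturbation energy. This is only a simple stand-in assuming the
      # perturbation energy is an additive sum of per-atom terms.
      import numpy as np

      rng = np.random.default_rng(7)
      kT = 2.5                                   # kJ/mol, roughly 300 K
      n_frames, n_atoms = 5000, 4

      # per-atom perturbation energies dU_a(x) sampled from the reference ensemble
      dU_atoms = rng.normal(loc=[0.5, -0.2, 1.0, 0.1], scale=0.6, size=(n_frames, n_atoms))
      dU_total = dU_atoms.sum(axis=1)

      # Zwanzig / exponential averaging for the total free energy change
      weights = np.exp(-dU_total / kT)
      dF_total = -kT * np.log(weights.mean())

      # naive per-atom attribution: split dF in proportion to reweighted <dU_a>
      # (assumes the reweighted total is nonzero)
      w = weights / weights.sum()
      dU_a_rw = (w[:, None] * dU_atoms).sum(axis=0)
      dF_atoms = dF_total * dU_a_rw / dU_a_rw.sum()

      print(f"dF_total = {dF_total:.3f} kJ/mol")
      print("per-atom attribution:", np.round(dF_atoms, 3))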

  7. Analysis of polychlorinated biphenyls in transformer oil by using liquid-liquid partitioning in a microfluidic device.

    PubMed

    Aota, Arata; Date, Yasumoto; Terakado, Shingo; Sugiyama, Hideo; Ohmura, Naoya

    2011-10-15

    Polychlorinated biphenyls (PCBs) that are present in transformer oil are a common global problem because of their toxicity and environmental persistence. The development of a rapid, low-cost method for measurement of PCBs in oil has been a matter of priority because of the large number of PCB-contaminated transformers still in service. Although one of the rapid, low-cost methods involves an immunoassay, which uses multilayer column separation, hexane evaporation, dimethyl sulfoxide (DMSO) partitioning, antigen-antibody reaction, and a measurement system, there is a demand for more cost-effective and simpler procedures. In this paper, we report a DMSO partitioning method that utilizes a microfluidic device with microrecesses along the microchannel. In this method, PCBs are extracted and enriched into the DMSO confined in the microrecesses under the oil flow condition. The enrichment factor was estimated to be 2.69, which agreed well with the anticipated value. The half-maximal inhibitory concentration of PCBs in oil was found to be 0.38 mg/kg, which satisfies the much stricter criterion of 0.5 mg/kg in Japan. The developed method can realize the pretreatment of oil without the use of centrifugation for phase separation. Furthermore, the amount of expensive reagents required can be reduced considerably. Therefore, our method can serve as a powerful tool for achieving a simpler, low-cost procedure and an on-site analysis system. © 2011 American Chemical Society

  8. Prolonged restricted sitting effects in UH-60 helicopters.

    PubMed

    Games, Kenneth E; Lakin, Joni M; Quindry, John C; Weimar, Wendi H; Sefton, JoEllen M

    2015-01-01

    Advances in flight technologies and the demand for long-range flight have increased mission lengths for U.S. Army Black Hawk UH-60 crewmembers. Prolonged mission times have increased reports of pilot discomfort and symptoms of paresthesia thought to be due to UH-60 seat design and areas of locally high pressure. Discomfort created by the seat system decreases situational awareness, putting aviators and support crew at risk of injury. Therefore, the purpose of this study was to examine the effects of prolonged restricted sitting in a UH-60 on discomfort, sensory function, and vascular measures in the lower extremities. Fifteen healthy men (age = 23.4 ± 3.1 yr) meeting physical flight status requirements sat in an unpadded UH-60 pilot's seat for 4 h while completing a common cognitive task. During the session, subjective discomfort, sensory function, and vascular function were measured. Across 4 h of restricted sitting, subjective discomfort increased on the Category Partitioning Scale (30.27 point increase) and McGill Pain Questionnaire (8.53 point increase); lower extremity sensory function was diminished along the S1 dermatome; and skin temperature decreased on both the lateral (2.85°C decrease) and anterior (2.78°C decrease) aspects of the ankle. The results suggest that prolonged sitting in a UH-60 seat increases discomfort, potentially through a peripheral nervous or vascular system mechanism. Further research is needed to understand the etiology and onset of pain and paresthesia during prolonged sitting in UH-60 pilot seats. Games KE, Lakin JM, Quindry JC, Weimar WH, Sefton JM. Prolonged restricted sitting effects in UH-60 helicopters.

  9. Soil-solution partitioning of DOC in acid organic soils: Results from a UK field acidification and alkalization experiment

    NASA Astrophysics Data System (ADS)

    Oulehle, Filip; Jones, Timothy; Burden, Annette; Evans, Chris

    2013-04-01

    Dissolved organic carbon (DOC) is an important component of the global carbon (C) cycle and has profound impacts on water chemistry and metabolism in lakes and rivers. Reported increases of DOC concentration in surface waters across Europe and Northern America have been attributed to several drivers; from changing climate and land-use to eutrophication and declining acid deposition. The last of these suggests that acidic deposition suppressed the solubility of DOC, and that this historic suppression is now being reversed by reducing emissions of acidifying pollutants. We studied a set of four parallel acidification and alkalization experiments in organic rich soils which, after three years of manipulation, have shown clear soil solution DOC responses to acidity change. We tested whether these DOC concentration changes were related to changes in the acid/base properties of DOC. Based on laboratory determination of DOC site density (S.D. = amount of carboxylic groups per milligram DOC) and charge density (C.D. = organic acid anion concentration per milligram DOC) we found that the change in DOC soil-solution partitioning was tightly related to the change in degree of dissociation (α = C.D./S.D. ratio) of organic acids (R2=0.74, p<0.01). Carbon turnover in soil organic matter (SOM), determined by soil respiration and β-D-glucosidase enzyme activity measurements, also appears to have some impact on DOC leaching, via constraints on the actual supply of available DOC from SOM; when the turnover rate of C in SOM is low, the effect of α on DOC leaching is reduced. Thus, differences in the magnitude of DOC changes seen across different environments might be explained by interactions between physicochemical restrictions of DOC soil-solution partitioning, and SOM carbon turnover effects on DOC supply.

  10. There's No Place Like Home: Crown-of-Thorns Outbreaks in the Central Pacific Are Regionally Derived and Independent Events

    PubMed Central

    Timmers, Molly A.; Bird, Christopher E.; Skillings, Derek J.; Smouse, Peter E.; Toonen, Robert J.

    2012-01-01

    One of the most significant biological disturbances on a tropical coral reef is a population outbreak of the fecund, corallivorous crown-of-thorns sea star, Acanthaster planci. Although the factors that trigger an initial outbreak may vary, successive outbreaks within and across regions are assumed to spread via the planktonic larvae released from a primary outbreak. This secondary outbreak hypothesis is predominantly based on the high dispersal potential of A. planci and the assertion that outbreak populations (a rogue subset of the larger population) are genetically more similar to each other than they are to low-density non-outbreak populations. Here we use molecular techniques to evaluate the spatial scale at which A. planci outbreaks can propagate via larval dispersal in the central Pacific Ocean by inferring the location and severity of gene flow restrictions from the analysis of mtDNA control region sequence (656 specimens, 17 non-outbreak and six outbreak locations, six archipelagos, and three regions). Substantial regional, archipelagic, and subarchipelagic-scale genetic structuring of A. planci populations indicate that larvae rarely realize their dispersal potential and outbreaks in the central Pacific do not spread across the expanses of open ocean. On a finer scale, genetic partitioning was detected within two of three islands with multiple sampling sites. The finest spatial structure was detected at Pearl & Hermes Atoll, between the lagoon and forereef habitats (<10 km). Despite using a genetic marker capable of revealing subtle partitioning, we found no evidence that outbreaks were a rogue genetic subset of a greater population. Overall, outbreaks that occur at similar times across population partitions are genetically independent and likely due to nutrient inputs and similar climatic and ecological conditions that conspire to fuel plankton blooms. PMID:22363570

  11. Root-shoot growth responses during interspecific competition quantified using allometric modelling.

    PubMed

    Robinson, David; Davidson, Hazel; Trinder, Clare; Brooker, Rob

    2010-12-01

    Plant competition studies are restricted by the difficulty of quantifying root systems of competitors. Analyses are usually limited to above-ground traits. Here, a new approach to address this issue is reported. Root system weights of competing plants can be estimated from: shoot weights of competitors; combined root weights of competitors; and slopes (scaling exponents, α) and intercepts (allometric coefficients, β) of ln-regressions of root weight on shoot weight of isolated plants. If competition induces no change in root : shoot growth, α and β values of competing and isolated plants will be equal. Measured combined root weight of competitors will equal that estimated allometrically from measured shoot weights of each competing plant. Combined root weights can be partitioned directly among competitors. If, as will be more usual, competition changes relative root and shoot growth, the competitors' combined root weight will not equal that estimated allometrically and cannot be partitioned directly. However, if the isolated-plant α and β values are adjusted until the estimated combined root weight of competitors matches the measured combined root weight, the latter can be partitioned among competitors using their new α and β values. The approach is illustrated using two herbaceous species, Dactylis glomerata and Plantago lanceolata. Allometric modelling revealed a large and continuous increase in the root : shoot ratio by Dactylis, but not Plantago, during competition. This was associated with a superior whole-plant dry weight increase in Dactylis, which was ultimately 2·5-fold greater than that of Plantago. Whole-plant growth dominance of Dactylis over Plantago, as deduced from allometric modelling, occurred 14-24 d earlier than suggested by shoot data alone. Given reasonable assumptions, allometric modelling can analyse competitive interactions in any species mixture, and overcomes a long-standing problem in studies of competition.
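    To make the partitioning step concrete, the hedged sketch below estimates each competitor's root mass from its shoot mass using isolated-plant allometries ln(root) = beta + alpha * ln(shoot), rescales the scaling exponents by a single common factor until the estimated combined root mass matches the measured total, and then partitions that total. The single-factor adjustment and all masses are assumptions for illustration; the adjustment scheme actually used in the study may differ.

      # Hedged sketch of the allometric partitioning idea: predict competitors'
      # root masses from shoot masses via isolated-plant ln-regressions, adjust
      # the allometric parameters by a common factor until the predicted combined
      # root mass matches the measured total, then partition that total.
      import numpy as np
      from scipy.optimize import brentq

      # isolated-plant allometry per species: ln(root) = beta + alpha * ln(shoot)
      alpha = {"Dactylis": 0.95, "Plantago": 0.90}
      beta  = {"Dactylis": -0.60, "Plantago": -0.80}

      # competing plants: measured shoot masses (g) and measured combined root mass (g)
      shoots = {"Dactylis": 3.2, "Plantago": 2.1}
      root_total_measured = 2.4

      def combined_root(k):
          """Combined root mass if every scaling exponent is multiplied by k."""
          return sum(np.exp(beta[sp]) * shoots[sp] ** (k * alpha[sp]) for sp in shoots)

      # find the adjustment factor k that reproduces the measured combined root mass
      k = brentq(lambda kk: combined_root(kk) - root_total_measured, 0.1, 5.0)

      roots = {sp: np.exp(beta[sp]) * shoots[sp] ** (k * alpha[sp]) for sp in shoots}
      print(f"adjustment factor k = {k:.3f}")
      for sp, r in roots.items():
          print(f"{sp}: root = {r:.2f} g, root:shoot = {r / shoots[sp]:.2f}")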

  12. An additional k-means clustering step improves the biological features of WGCNA gene co-expression networks.

    PubMed

    Botía, Juan A; Vandrovcova, Jana; Forabosco, Paola; Guelfi, Sebastian; D'Sa, Karishma; Hardy, John; Lewis, Cathryn M; Ryten, Mina; Weale, Michael E

    2017-04-12

    Weighted Gene Co-expression Network Analysis (WGCNA) is a widely used R software package for the generation of gene co-expression networks (GCN). WGCNA generates both a GCN and a derived partitioning of clusters of genes (modules). We propose k-means clustering as an additional processing step to conventional WGCNA, which we have implemented in the R package km2gcn (k-means to gene co-expression network, https://github.com/juanbot/km2gcn). We assessed our method on networks created from UKBEC data (10 different human brain tissues), on networks created from GTEx data (42 human tissues, including 13 brain tissues), and on simulated networks derived from GTEx data. We observed substantially improved module properties, including: (1) few or zero misplaced genes; (2) increased counts of replicable clusters in alternate tissues (×3.1 on average); (3) improved enrichment of Gene Ontology terms (seen in 48/52 GCNs); (4) improved cell type enrichment signals (seen in 21/23 brain GCNs); and (5) more accurate partitions in simulated data according to a range of similarity indices. The results obtained from our investigations indicate that our k-means method, applied as an adjunct to standard WGCNA, results in better network partitions. These improved partitions enable more fruitful downstream analyses, as gene modules are more biologically meaningful.

  13. Partitioning of Evapotranspiration Using a Stable Water Isotope Technique in a High Temperature Agricultural Production System

    NASA Astrophysics Data System (ADS)

    Lu, X.; Liang, L.; Wang, L.; Jenerette, D.; Grantz, D. A.

    2015-12-01

    Agricultural production in the hot and arid low desert systems of southern California relies heavily on irrigation. A better understanding of how much and to what extent the irrigation water is transpired by crops relative to being lost through evaporation will contribute to better management of increasingly limited agricultural water resources. In this study, we examined the evapotranspiration (ET) partitioning over a field of forage sorghum (S. bicolor) during a growing season with several irrigation cycles. In several field campaigns we used continuous measurements of near-surface variations in the stable isotopic composition of water vapor (δ²H). We employed custom-built transparent chambers coupled with a laser-based isotope analyzer and used Keeling plot and mass balance methods for surface flux partitioning. The preliminary results show that δT is more enriched than δE in the early growing season, and becomes less enriched than δE later in the season as canopy cover increases. There is an increase in the contribution of transpiration to ET as (1) leaf area index increases and (2) soil surface moisture declines. These results are consistent with theory, and extend these measurements to an environment that experiences extreme soil surface temperatures. The data further support the use of chamber-based methods with stable isotopic analysis for characterization of ET partitioning in challenging field environments.
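    As a hedged sketch of the isotope-based partitioning referred to above: a Keeling plot (delta versus 1/concentration) gives the isotopic composition of total ET as its intercept, and a two-member mixing model then yields T/ET = (delta_ET - delta_E)/(delta_T - delta_E). All measurement values and the assumed end-members delta_E and delta_T below are synthetic placeholders.

      # Hedged sketch: isotope-based ET partitioning. A Keeling plot (delta vs
      # 1/concentration) gives the isotopic composition of total ET as the
      # intercept, and a two-member mixing model then yields T/ET. Values are
      # synthetic; in practice delta_E and delta_T come from Craig-Gordon-type
      # models and leaf/stem water measurements.
      import numpy as np

      # near-surface water-vapour measurements
      vapour_ppm  = np.array([14000, 15500, 17000, 19000, 21000], float)
      delta_2H    = np.array([-118.0, -113.5, -109.8, -105.6, -102.3])   # per mil

      # Keeling plot: delta = m * (1/c) + delta_ET
      slope, delta_ET = np.polyfit(1.0 / vapour_ppm, delta_2H, 1)

      delta_E = -150.0   # per mil, isotopic composition of soil evaporation (assumed)
      delta_T = -60.0    # per mil, isotopic composition of transpiration (assumed)

      T_over_ET = (delta_ET - delta_E) / (delta_T - delta_E)
      print(f"delta_ET (Keeling intercept) = {delta_ET:.1f} per mil, T/ET = {T_over_ET:.2f}")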

  14. Epidemic Reconstruction in a Phylogenetics Framework: Transmission Trees as Partitions of the Node Set

    PubMed Central

    Hall, Matthew; Woolhouse, Mark; Rambaut, Andrew

    2015-01-01

    The use of genetic data to reconstruct the transmission tree of infectious disease epidemics and outbreaks has been the subject of an increasing number of studies, but previous approaches have usually either made assumptions that are not fully compatible with phylogenetic inference, or, where they have based inference on a phylogeny, have employed a procedure that requires this tree to be fixed. At the same time, the coalescent-based models of the pathogen population that are employed in the methods usually used for time-resolved phylogeny reconstruction are a considerable simplification of epidemic process, as they assume that pathogen lineages mix freely. Here, we contribute a new method that is simultaneously a phylogeny reconstruction method for isolates taken from an epidemic, and a procedure for transmission tree reconstruction. We observe that, if one or more samples is taken from each host in an epidemic or outbreak and these are used to build a phylogeny, a transmission tree is equivalent to a partition of the set of nodes of this phylogeny, such that each partition element is a set of nodes that is connected in the full tree and contains all the tips corresponding to samples taken from one and only one host. We then implement a Monte Carlo Markov Chain (MCMC) procedure for simultaneous sampling from the spaces of both trees, utilising a newly-designed set of phylogenetic tree proposals that also respect node partitions. We calculate the posterior probability of these partitioned trees based on a model that acknowledges the population structure of an epidemic by employing an individual-based disease transmission model and a coalescent process taking place within each host. We demonstrate our method, first using simulated data, and then with sequences taken from the H7N7 avian influenza outbreak that occurred in the Netherlands in 2003. We show that it is superior to established coalescent methods for reconstructing the topology and node heights of the phylogeny and performs well for transmission tree reconstruction when the phylogeny is well-resolved by the genetic data, but caution that this will often not be the case in practice and that existing genetic and epidemiological data should be used to configure such analyses whenever possible. This method is available for use by the research community as part of BEAST, one of the most widely-used packages for reconstruction of dated phylogenies. PMID:26717515

  15. Aqueous two-phase partition applied to the isolation of plasma membranes and Golgi apparatus from cultured mammalian cells.

    PubMed

    Morré, D M; Morre, D J

    2000-06-23

    Partitioning in dextran-poly(ethylene)glycol (PEG) aqueous-aqueous phase systems represents a mature technology with many applications to separations of cells and to the preparation of membranes from mammalian cells. Most applications to membrane isolation and purification have focused on plasma membranes, plasma membrane domains and separation of right side-out and inside-out plasma membrane vesicles. The method exploits a combination of membrane properties, including charge and hydrophobicity. Purification is based upon differential distributions of the constituents in a sample between the two principal compartments of the two phases (upper and lower) and at the interface. The order of affinity of animal cell membranes for the upper phase is: endoplasmic reticulum

  16. Modeling of adipose/blood partition coefficient for environmental chemicals.

    PubMed

    Papadaki, K C; Karakitsios, S P; Sarigiannis, D A

    2017-12-01

    A Quantitative Structure Activity Relationship (QSAR) model was developed in order to predict the adipose/blood partition coefficient of environmental chemical compounds. The first step of QSAR modeling was the collection of inputs. Input data included the experimental values of the adipose/blood partition coefficient and two sets of molecular descriptors for 67 organic chemical compounds: a) the descriptors from the Linear Free Energy Relationship (LFER) and b) the PaDEL descriptors. The datasets were split into training and prediction sets and were analysed using two statistical methods: Genetic Algorithm based Multiple Linear Regression (GA-MLR) and Artificial Neural Networks (ANN). The models with LFER and PaDEL descriptors, coupled with ANN, produced satisfactory performance results. The fitting performance (R²) of the models, using LFER and PaDEL descriptors, was 0.94 and 0.96, respectively. The Applicability Domain (AD) of the models was assessed and then the models were applied to a large number of chemical compounds with unknown values of the adipose/blood partition coefficient. In conclusion, the proposed models were checked for fitting, validity and applicability. It was demonstrated that they are stable, reliable and capable of predicting the adipose/blood partition coefficient of "data poor" chemical compounds that fall within the applicability domain. Copyright © 2017. Published by Elsevier Ltd.
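
    As a rough illustration of the workflow the abstract describes (descriptor matrix, training/prediction split, regression, R²), the following sketch uses synthetic descriptors and plain linear regression from scikit-learn in place of the study's LFER/PaDEL descriptors and GA-MLR/ANN models.

```python
# Minimal sketch of a generic QSAR workflow: train/test split, fit a
# regression on molecular descriptors, report R^2. The descriptor matrix is
# random placeholder data, not the LFER or PaDEL descriptors of the study,
# and plain linear regression stands in for the GA-MLR / ANN models.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
X = rng.normal(size=(67, 5))              # 67 compounds x 5 hypothetical descriptors
true_w = np.array([0.8, -0.5, 0.3, 0.0, 1.2])
y = X @ true_w + rng.normal(scale=0.2, size=67)   # synthetic log K(adipose/blood)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = LinearRegression().fit(X_tr, y_tr)
print("training R2:", r2_score(y_tr, model.predict(X_tr)))
print("prediction R2:", r2_score(y_te, model.predict(X_te)))
```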

  17. Aqueous two-phase partition applied to the isolation of plasma membranes and Golgi apparatus from cultured mammalian cells

    NASA Technical Reports Server (NTRS)

    Morre, D. M.; Morre, D. J.

    2000-01-01

    Partitioning in dextran-poly(ethylene)glycol (PEG) aqueous-aqueous phase systems represents a mature technology with many applications to separations of cells and to the preparation of membranes from mammalian cells. Most applications to membrane isolation and purification have focused on plasma membranes, plasma membrane domains and separation of right side-out and inside-out plasma membrane vesicles. The method exploits a combination of membrane properties, including charge and hydrophobicity. Purification is based upon differential distributions of the constituents in a sample between the two principal compartments of the two phases (upper and lower) and at the interface. The order of affinity of animal cell membranes for the upper phase is: endoplasmic reticulum

  18. Exponentially fitted symplectic Runge-Kutta-Nyström methods derived by partitioned Runge-Kutta methods

    NASA Astrophysics Data System (ADS)

    Monovasilis, Th.; Kalogiratou, Z.; Simos, T. E.

    2013-10-01

    In this work we derive symplectic EF/TF RKN methods from symplectic EF/TF PRK methods. EF/TF symplectic RKN methods are also constructed directly from classical symplectic RKN methods. Several numerical examples are given in order to decide which is the most favourable implementation.

  19. Interspecific resource partitioning in sympatric ursids

    USGS Publications Warehouse

    Belant, Jerrold L.; Kielland, Knut; Follmann, Erich H.; Adams, Layne G.

    2006-01-01

    The fundamental niche of a species is rarely if ever realized because the presence of other species restricts it to a narrower range of ecological conditions. The effects of this narrower range of conditions define how resources are partitioned. Resource partitioning has been inferred but not demonstrated previously for sympatric ursids. We estimated assimilated diet in relation to body condition (body fat and lean and total body mass) and reproduction for sympatric brown bears (Ursus arctos) and American black bears (U. americanus) in south‐central Alaska, 1998–2000. Based on isotopic analysis of blood and keratin in claws, salmon (Oncorhynchus spp.) predominated in brown bear diets (>53% annually) whereas black bears assimilated 0–25% salmon annually. Black bears did not exploit salmon during a year with below average spawning numbers, probably because brown bears deterred black bear access to salmon. Proportion of salmon in assimilated diet was consistent across years for brown bears and represented the major portion of their diet. Body size of brown bears in the study area approached mean body size of several coastal brown bear populations, demonstrating the importance of salmon availability to body condition. Black bears occurred at a comparable density (mass : mass), but body condition varied and was related directly to the amount of salmon assimilated in their diet. Both species gained most lean body mass during spring and all body fat during summer when salmon were present. Improved body condition (i.e., increased percentage body fat) from salmon consumption reduced catabolism of lean body mass during hibernation, resulting in better body condition the following spring. Further, black bear reproduction was directly related to body condition; reproductive rates were reduced when body condition was lower. High body fat content across years for brown bears was reflected in consistently high reproductive levels. We suggest that the fundamental niche of black bears was constrained by brown bears through partitioning of food resources, which varied among years. Reduced exploitation of salmon caused black bears to rely more extensively on less reliable or nutritious food sources (e.g., moose [Alces alces], berries) resulting in lowered body condition and subsequent reproduction.

  20. Hydrolysis of glyoxal in water-restricted environments: formation of organic aerosol precursors through formic acid catalysis.

    PubMed

    Hazra, Montu K; Francisco, Joseph S; Sinha, Amitabha

    2014-06-12

    The hydrolysis of glyoxal involving one to three water molecules and also in the presence of a water molecule and formic acid has been investigated. Our results show that glyoxal-diol is the major product of the hydrolysis and that formic acid, through its ability to facilitate intermolecular hydrogen atom transfer, is considerably more efficient than water as a catalyst in the hydrolysis process. Additionally, once the glyoxal-diol is formed, the barrier for further hydrolysis to form the glyoxal-tetrol is effectively reduced to zero in the presence of a single water and formic acid molecule. There are two important implications arising from these findings. First, the results suggest that under the catalytic influence of formic acid, glyoxal hydrolysis can impact the growth of atmospheric aerosols. As a result of enhanced hydrogen bonding, mediated through their polar OH functional groups, the diol and tetrol products are expected to have significantly lower vapor pressure than the parent glyoxal molecule; hence they can more readily partition into the particle phase and contribute to the growth of secondary organic aerosols. In addition, our findings provide insight into how glyoxal-diol and glyoxal-tetrol might be formed under atmospheric conditions associated with water-restricted environments and strongly suggest that the formation of these precursors for secondary organic aerosol growth is not likely restricted solely to the bulk aqueous phase as is currently assumed.

  1. Photosynthate partitioning to starch in Arabidopsis thaliana is insensitive to light intensity but sensitive to photoperiod due to a restriction on growth in the light in short photoperiods.

    PubMed

    Mengin, Virginie; Pyl, Eva-Theresa; Alexandre Moraes, Thiago; Sulpice, Ronan; Krohn, Nicole; Encke, Beatrice; Stitt, Mark

    2017-11-01

    Photoperiod duration can be predicted from previous days, but irradiance fluctuates in an unpredictable manner. To investigate how allocation to starch responds to changes in these two environmental variables, Arabidopsis Col-0 was grown in a 6 h and a 12 h photoperiod at three different irradiances. The absolute rate of starch accumulation increased when photoperiod duration was shortened and when irradiance was increased. The proportion of photosynthate allocated to starch increased strongly when photoperiod duration was decreased but only slightly when irradiance was decreased. There was a small increase in the daytime level of sucrose and twofold increases in glucose, fructose and glucose 6-phosphate at a given irradiance in short photoperiods compared to long photoperiods. The rate of starch accumulation correlated strongly with sucrose and glucose levels in the light, irrespective of whether these sugars were responding to a change in photoperiod or irradiance. Whole plant carbon budget modelling revealed a selective restriction of growth in the light period in short photoperiods. It is proposed that photoperiod sensing, possibly related to the duration of the night, restricts growth in the light period in short photoperiods, increasing allocation to starch and providing more carbon reserves to support metabolism and growth in the long night. © 2017 John Wiley & Sons Ltd.

  2. Quantum speedup of Monte Carlo methods.

    PubMed

    Montanaro, Ashley

    2015-09-08

    Monte Carlo methods use random sampling to estimate numerical quantities which are hard to compute deterministically. One important example is the use in statistical physics of rapidly mixing Markov chains to approximately compute partition functions. In this work, we describe a quantum algorithm which can accelerate Monte Carlo methods in a very general setting. The algorithm estimates the expected output value of an arbitrary randomized or quantum subroutine with bounded variance, achieving a near-quadratic speedup over the best possible classical algorithm. Combining the algorithm with the use of quantum walks gives a quantum speedup of the fastest known classical algorithms with rigorous performance bounds for computing partition functions, which use multiple-stage Markov chain Monte Carlo techniques. The quantum algorithm can also be used to estimate the total variation distance between probability distributions efficiently.

  3. Sensitivity evaluation of dynamic speckle activity measurements using clustering methods.

    PubMed

    Etchepareborda, Pablo; Federico, Alejandro; Kaufmann, Guillermo H

    2010-07-01

    We evaluate and compare the use of competitive neural networks, self-organizing maps, the expectation-maximization algorithm, K-means, and fuzzy C-means techniques as partitional clustering methods when the sensitivity of the activity measurement of dynamic speckle images needs to be improved. The temporal history of the acquired intensity generated by each pixel is analyzed in a wavelet decomposition framework, and it is shown that the mean energy of its corresponding wavelet coefficients provides a suitable feature space for clustering purposes. The sensitivity obtained by using the evaluated clustering techniques is also compared with the well-known methods of Konishi-Fujii, weighted generalized differences, and wavelet entropy. The performance of the partitional clustering approach is evaluated using simulated dynamic speckle patterns and also experimental data.
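
    The sketch below illustrates the feature-extraction-plus-clustering idea on synthetic data: each pixel's intensity time history is decomposed with a discrete wavelet transform, the mean energy per sub-band forms the feature vector, and K-means partitions the pixels. The wavelet choice, decomposition level and data are assumptions for illustration; the paper's other clustering variants are not reproduced.

```python
# Illustrative sketch: wavelet mean-energy features per pixel time history,
# then K-means clustering into "high activity" and "low activity" groups.
import numpy as np
import pywt
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
n_pixels, n_frames, levels = 1000, 256, 4

# Synthetic stack: half the pixels fluctuate strongly, half weakly.
activity = rng.random(n_pixels) > 0.5
signals = np.where(activity[:, None],
                   rng.normal(scale=2.0, size=(n_pixels, n_frames)),
                   rng.normal(scale=0.5, size=(n_pixels, n_frames)))

def wavelet_energy_features(x, wavelet="db4", level=levels):
    coeffs = pywt.wavedec(x, wavelet, level=level)
    return np.array([np.mean(c ** 2) for c in coeffs])  # mean energy per sub-band

features = np.array([wavelet_energy_features(s) for s in signals])
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)
print("cluster sizes:", np.bincount(labels))
```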

  4. Efficient estimation of diffusion during dendritic solidification

    NASA Technical Reports Server (NTRS)

    Yeum, K. S.; Poirier, D. R.; Laxmanan, V.

    1989-01-01

    A very efficient finite difference method has been developed to estimate the solute redistribution during solidification with diffusion in the solid. This method is validated by comparing the computed results with the results of an analytical solution derived by Kobayashi (1988) for the assumptions of a constant diffusion coefficient, a constant equilibrium partition ratio, and a parabolic rate of the advancement of the solid/liquid interface. The flexibility of the method is demonstrated by applying it to the dendritic solidification of a Pb-15 wt pct Sn alloy, for which the equilibrium partition ratio and diffusion coefficient vary substantially during solidification. The fraction eutectic at the end of solidification is also obtained by estimating the fraction solid, in greater resolution, where the concentration of solute in the interdendritic liquid reaches the eutectic composition of the alloy.

  5. Quantum speedup of Monte Carlo methods

    PubMed Central

    Montanaro, Ashley

    2015-01-01

    Monte Carlo methods use random sampling to estimate numerical quantities which are hard to compute deterministically. One important example is the use in statistical physics of rapidly mixing Markov chains to approximately compute partition functions. In this work, we describe a quantum algorithm which can accelerate Monte Carlo methods in a very general setting. The algorithm estimates the expected output value of an arbitrary randomized or quantum subroutine with bounded variance, achieving a near-quadratic speedup over the best possible classical algorithm. Combining the algorithm with the use of quantum walks gives a quantum speedup of the fastest known classical algorithms with rigorous performance bounds for computing partition functions, which use multiple-stage Markov chain Monte Carlo techniques. The quantum algorithm can also be used to estimate the total variation distance between probability distributions efficiently. PMID:26528079

  6. Normalized Cut Algorithm for Automated Assignment of Protein Domains

    NASA Technical Reports Server (NTRS)

    Samanta, M. P.; Liang, S.; Zha, H.; Biegel, Bryan A. (Technical Monitor)

    2002-01-01

    We present a novel computational method for automatic assignment of protein domains from structural data. At the core of our algorithm lies a recently proposed clustering technique that has been very successful for image-partitioning applications. This graph-theory-based clustering method uses the notion of a normalized cut to partition an undirected graph into its strongly connected components. A computer implementation of our method, tested on the standard comparison set of proteins from the literature, shows a high success rate (84%), better than most existing alternatives. In addition, several other features of our algorithm, such as reliance on few adjustable parameters, linear run time with respect to the size of the protein, and reduced complexity compared to other graph-theory-based algorithms, make it an attractive tool for structural biologists.
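
    The following is a minimal two-way normalized-cut sketch (the Shi-Malik spectral relaxation) on a small hypothetical affinity matrix; it illustrates the kind of graph partitioning referred to above but is not the authors' protein-domain pipeline.

```python
# Minimal two-way normalized-cut sketch via the spectral relaxation:
# solve L y = lambda D y and threshold the second eigenvector.
import numpy as np
from scipy.linalg import eigh

def normalized_cut_bipartition(W):
    """W: symmetric non-negative affinity matrix. Returns a boolean label per node."""
    d = W.sum(axis=1)
    D = np.diag(d)
    L = D - W                       # unnormalized graph Laplacian
    vals, vecs = eigh(L, D)         # generalized eigenproblem L y = lambda D y
    fiedler = vecs[:, 1]            # eigenvector of the second-smallest eigenvalue
    return fiedler > np.median(fiedler)

# Two 4-node clusters weakly connected to each other (hypothetical graph).
W = np.zeros((8, 8))
W[:4, :4] = 1.0
W[4:, 4:] = 1.0
np.fill_diagonal(W, 0.0)
W[3, 4] = W[4, 3] = 0.1
print(normalized_cut_bipartition(W))
```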

  7. Improve load balancing and coding efficiency of tiles in high efficiency video coding by adaptive tile boundary

    NASA Astrophysics Data System (ADS)

    Chan, Chia-Hsin; Tu, Chun-Chuan; Tsai, Wen-Jiin

    2017-01-01

    High efficiency video coding (HEVC) not only improves coding efficiency drastically compared to the well-known H.264/AVC but also introduces coding tools for parallel processing, one of which is tiles. Tile partitioning is allowed to be arbitrary in HEVC, but how to decide tile boundaries remains an open issue. An adaptive tile boundary (ATB) method is proposed to select a better tile partitioning to improve load balancing (ATB-LoadB) and coding efficiency (ATB-Gain) with a unified scheme. Experimental results show that, compared to ordinary uniform-space partitioning, the proposed ATB can save up to 17.65% of encoding time in parallel encoding scenarios and can reduce the total bit rate by up to 0.8% for improved coding efficiency.

  8. Investigation of the seismic resistance of interior building partitions, phase 1

    NASA Astrophysics Data System (ADS)

    Anderson, R. W.; Yee, Y. C.; Savulian, G.; Barclay, B.; Lee, G.

    1981-02-01

    The effective participation of wood-framed interior shear wall partitions in determining the ultimate resistance capacity of two- and three-story masonry apartment buildings to seismic loading was investigated. Load vs. deflection tests were performed on 8 ft by 8 ft wall panel specimens constructed of four different facing materials, including wood lath and plaster, gypsum lath and plaster, and gypsum wallboard with joints placed either horizontally or vertically. The wood lath and plaster construction was found to be significantly stronger and stiffer than the other three specimens. Analyses of the test panels using finite element methods to predict their static resistance characteristics indicate that the facing material acts as the primary shear-resisting structural element. The resistance of shear wall partitions to lateral loads was assessed.

  9. Implementation of a partitioned algorithm for simulation of large CSI problems

    NASA Technical Reports Server (NTRS)

    Alvin, Kenneth F.; Park, K. C.

    1991-01-01

    The implementation of a partitioned numerical algorithm for determining the dynamic response of coupled structure/controller/estimator finite-dimensional systems is reviewed. The partitioned approach leads to a set of coupled first and second-order linear differential equations which are numerically integrated with extrapolation and implicit step methods. The present software implementation, ACSIS, utilizes parallel processing techniques at various levels to optimize performance on a shared-memory concurrent/vector processing system. A general procedure for the design of controller and filter gains is also implemented, which utilizes the vibration characteristics of the structure to be solved. Also presented are: example problems; a user's guide to the software; the procedures and algorithm scripts; a stability analysis for the algorithm; and the source code for the parallel implementation.

  10. An Energy-Based Approach for Detection and Characterization of Subtle Entities Within Laser Scanning Point-Clouds

    NASA Astrophysics Data System (ADS)

    Arav, Reuma; Filin, Sagi

    2016-06-01

    Airborne laser scans present an optimal tool to describe geomorphological features in natural environments. However, a challenge arises in the detection of such phenomena, as they are embedded in the topography, tend to blend into their surroundings and leave only a subtle signature within the data. Most object-recognition studies address mainly urban environments and follow a general pipeline in which the data are partitioned into segments with uniform properties. These approaches are restricted to man-made domains and can handle only a limited set of features that conform to well-defined geometric forms. As natural environments present a more complex set of features, interpretation of such data remains largely manual. In this paper, we propose a data-aware detection scheme that is not bound to specific domains or shapes. We define the recognition question as an energy optimization problem, solved by variational means. Our approach, based on the level-set method, characterizes geometrically local surfaces within the data and uses these characteristics as a potential field for minimization. The main advantage here is that it allows topological changes of the evolving curves, such as merging and breaking. We demonstrate the proposed methodology on the detection of collapse sinkholes.

  11. a Super Voxel-Based Riemannian Graph for Multi Scale Segmentation of LIDAR Point Clouds

    NASA Astrophysics Data System (ADS)

    Li, Minglei

    2018-04-01

    Automatically segmenting LiDAR points into independent partitions has become a topic of great importance in photogrammetry, remote sensing and computer vision. In this paper, we cast the problem of point cloud segmentation as a graph optimization problem by constructing a Riemannian graph. The scale space of the observed scene is explored by an octree-based over-segmentation with different depths. The over-segmentation produces many super voxels which restrict the structure of the scene and will be used as nodes of the graph. The Kruskal coordinates are used to compute edge weights that are proportional to the geodesic distance between nodes. Then we compute the edge-weight matrix in which the elements reflect the sectional curvatures associated with the geodesic paths between super voxel nodes on the scene surface. The final segmentation results are generated by clustering similar super voxels and cutting off the weak edges in the graph. The performance of this method was evaluated on LiDAR point clouds for both indoor and outdoor scenes. Additionally, extensive comparisons to state-of-the-art techniques show that our algorithm outperforms them on many metrics.

  12. Interlaboratory Validation of the Leaching Environmental Assessment Framework (LEAF) Method 1314 and Method 1315

    EPA Science Inventory

    This report summarizes the results of an interlaboratory study conducted to generate precision estimates for two leaching methods under review by the U.S. EPA’s OSWER for inclusion into the EPA’s SW-846: Method 1314: Liquid-Solid Partitioning as a Function of Liquid...

  13. Load Balancing Unstructured Adaptive Grids for CFD Problems

    NASA Technical Reports Server (NTRS)

    Biswas, Rupak; Oliker, Leonid

    1996-01-01

    Mesh adaption is a powerful tool for efficient unstructured-grid computations but causes load imbalance among processors on a parallel machine. A dynamic load balancing method is presented that balances the workload across all processors with a global view. After each parallel tetrahedral mesh adaption, the method first determines if the new mesh is sufficiently unbalanced to warrant a repartitioning. If so, the adapted mesh is repartitioned, with new partitions assigned to processors so that the redistribution cost is minimized. The new partitions are accepted only if the remapping cost is compensated by the improved load balance. Results indicate that this strategy is effective for large-scale scientific computations on distributed-memory multiprocessors.
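
    A hedged sketch of the decision logic described above follows: repartition only when the measured imbalance exceeds a tolerance, and accept the new mapping only if the expected saving over the remaining steps outweighs the estimated remapping cost. The thresholds, cost model and numbers are hypothetical, not the paper's actual criteria.

```python
# Toy sketch of a repartitioning decision after mesh adaptation:
# (1) repartition only if the load imbalance exceeds a tolerance,
# (2) accept the remap only if the expected gain outweighs the data-movement cost.
def should_repartition(part_loads, imbalance_tol=1.10):
    avg = sum(part_loads) / len(part_loads)
    return max(part_loads) / avg > imbalance_tol

def accept_remap(old_runtime_per_step, new_runtime_per_step,
                 remap_cost, remaining_steps):
    # Accept only if the cumulative time saved before the next adaptation
    # compensates for the one-off redistribution cost.
    saving = (old_runtime_per_step - new_runtime_per_step) * remaining_steps
    return saving > remap_cost

loads = [120_000, 95_000, 180_000, 101_000]   # e.g. tetrahedra per processor
if should_repartition(loads):
    print("repartition; accept remap:",
          accept_remap(old_runtime_per_step=1.8, new_runtime_per_step=1.25,
                       remap_cost=40.0, remaining_steps=200))
```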

  14. SEU System Analysis: Not Just the Sum of All Parts

    NASA Technical Reports Server (NTRS)

    Berg, Melanie D.; Label, Kenneth

    2014-01-01

    Single event upset (SEU) analysis of complex systems is challenging. Currently, system SEU analysis is performed by component level partitioning and then either: the most dominant SEU cross-sections (SEUs) are used in system error rate calculations; or the partition SEUs are summed to eventually obtain a system error rate. In many cases, system error rates are overestimated because these methods generally overlook system level derating factors. The problem with overestimating is that it can cause overdesign and consequently negatively affect the following: cost, schedule, functionality, and validation/verification. The scope of this presentation is to discuss the risks involved with our current scheme of SEU analysis for complex systems; and to provide alternative methods for improvement.

  15. Achieving microaggregation for secure statistical databases using fixed-structure partitioning-based learning automata.

    PubMed

    Fayyoumi, Ebaa; Oommen, B John

    2009-10-01

    We consider the microaggregation problem (MAP) that involves partitioning a set of individual records in a microdata file into a number of mutually exclusive and exhaustive groups. This problem, which seeks for the best partition of the microdata file, is known to be NP-hard and has been tackled using many heuristic solutions. In this paper, we present the first reported fixed-structure-stochastic-automata-based solution to this problem. The newly proposed method leads to a lower value of the information loss (IL), obtains a better tradeoff between the IL and the disclosure risk (DR) when compared with state-of-the-art methods, and leads to a superior value of the scoring index, which is a criterion involving a combination of the IL and the DR. The scheme has been implemented, tested, and evaluated for different real-life and simulated data sets. The results clearly demonstrate the applicability of learning automata to the MAP and its ability to yield a solution that obtains the best tradeoff between IL and DR when compared with the state of the art.

  16. Modelling a real-world buried valley system with vertical non-stationarity using multiple-point statistics

    NASA Astrophysics Data System (ADS)

    He, Xiulan; Sonnenborg, Torben O.; Jørgensen, Flemming; Jensen, Karsten H.

    2017-03-01

    Stationarity has traditionally been a requirement of geostatistical simulations. A common way to deal with non-stationarity is to divide the system into stationary sub-regions and subsequently merge the realizations for each region. Recently, the so-called partition approach that has the flexibility to model non-stationary systems directly was developed for multiple-point statistics simulation (MPS). The objective of this study is to apply the MPS partition method with conventional borehole logs and high-resolution airborne electromagnetic (AEM) data, for simulation of a real-world non-stationary geological system characterized by a network of connected buried valleys that incise deeply into layered Miocene sediments (case study in Denmark). The results show that, based on fragmented information of the formation boundaries, the MPS partition method is able to simulate a non-stationary system including valley structures embedded in a layered Miocene sequence in a single run. Besides, statistical information retrieved from the AEM data improved the simulation of the geology significantly, especially for the deep-seated buried valley sediments where borehole information is sparse.

  17. Efficient Deterministic Finite Automata Minimization Based on Backward Depth Information.

    PubMed

    Liu, Desheng; Huang, Zhiping; Zhang, Yimeng; Guo, Xiaojun; Su, Shaojing

    2016-01-01

    Obtaining a minimal automaton is a fundamental issue in the theory and practical implementation of deterministic finite automata (DFAs). A minimization algorithm is presented in this paper that consists of two main phases. In the first phase, the backward depth information is built, and the state set of the DFA is partitioned into many blocks. In the second phase, the state set is refined using a hash table. The minimization algorithm has a lower time complexity, O(n), than a naive comparison of transitions, O(n²). Few states need to be refined by the hash table, because most states have been partitioned by the backward depth information in the coarse partition. This method achieves greater generality than previous methods because building the backward depth information is independent of the topological complexity of the DFA. The proposed algorithm can be applied not only to the minimization of acyclic automata or simple cyclic automata, but also to automata with high topological complexity. Overall, the proposal has three advantages: lower time complexity, greater generality, and scalability. A comparison to Hopcroft's algorithm demonstrates experimentally that the algorithm runs faster than traditional algorithms.
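
    For illustration, the sketch below minimizes a small hypothetical DFA by iterative partition refinement (the classical Moore-style rule). It shows the coarse-partition-then-refine structure the abstract describes, but it does not implement the paper's backward-depth and hash-table scheme.

```python
# Sketch of DFA minimization by iterative partition refinement (Moore-style):
# start from the accepting / non-accepting split, then repeatedly split blocks
# whose states disagree on which block each symbol leads to.
def minimize_dfa(states, alphabet, delta, accepting):
    """delta: dict (state, symbol) -> state. Returns the blocks of the
    coarsest partition consistent with acceptance and transitions."""
    partition = [b for b in (set(accepting), set(states) - set(accepting)) if b]
    changed = True
    while changed:
        changed = False
        block_of = {s: i for i, b in enumerate(partition) for s in b}
        new_partition = []
        for block in partition:
            groups = {}
            for s in block:
                # Signature: which current block each symbol leads to.
                sig = tuple(block_of[delta[(s, a)]] for a in alphabet)
                groups.setdefault(sig, set()).add(s)
            new_partition.extend(groups.values())
            if len(groups) > 1:
                changed = True
        partition = new_partition
    return [frozenset(b) for b in partition]

# Hypothetical 4-state DFA over {0,1} in which q1 and q2 are equivalent.
states = {"q0", "q1", "q2", "q3"}
delta = {("q0", "0"): "q1", ("q0", "1"): "q2",
         ("q1", "0"): "q3", ("q1", "1"): "q3",
         ("q2", "0"): "q3", ("q2", "1"): "q3",
         ("q3", "0"): "q3", ("q3", "1"): "q3"}
print(minimize_dfa(states, ["0", "1"], delta, accepting={"q3"}))
```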

  18. Parallel adaptive wavelet collocation method for PDEs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nejadmalayeri, Alireza, E-mail: Alireza.Nejadmalayeri@gmail.com; Vezolainen, Alexei, E-mail: Alexei.Vezolainen@Colorado.edu; Brown-Dymkoski, Eric, E-mail: Eric.Browndymkoski@Colorado.edu

    2015-10-01

    A parallel adaptive wavelet collocation method for solving a large class of Partial Differential Equations is presented. The parallelization is achieved by developing an asynchronous parallel wavelet transform, which allows one to perform parallel wavelet transform and derivative calculations with only one data synchronization at the highest level of resolution. The data are stored using a tree-like structure with tree roots starting at an a priori defined level of resolution. Both static and dynamic domain partitioning approaches are developed. For the dynamic domain partitioning, trees are considered to be the minimum quanta of data to be migrated between the processes. This allows fully automated and efficient handling of non-simply connected partitioning of a computational domain. Dynamic load balancing is achieved via domain repartitioning during the grid adaptation step and reassigning trees to the appropriate processes to ensure approximately the same number of grid points on each process. The parallel efficiency of the approach is discussed based on parallel adaptive wavelet-based Coherent Vortex Simulations of homogeneous turbulence with linear forcing at effective non-adaptive resolutions up to 2048³ using as many as 2048 CPU cores.

  19. Clustering Financial Time Series by Network Community Analysis

    NASA Astrophysics Data System (ADS)

    Piccardi, Carlo; Calatroni, Lisa; Bertoni, Fabio

    In this paper, we describe a method for clustering financial time series which is based on community analysis, a recently developed approach for partitioning the nodes of a network (graph). A network with N nodes is associated to the set of N time series. The weight of the link (i, j), which quantifies the similarity between the two corresponding time series, is defined according to a metric based on symbolic time series analysis, which has recently proved effective in the context of financial time series. Then, searching for network communities allows one to identify groups of nodes (and then time series) with strong similarity. A quantitative assessment of the significance of the obtained partition is also provided. The method is applied to two distinct case-studies concerning the US and Italy Stock Exchange, respectively. In the US case, the stability of the partitions over time is also thoroughly investigated. The results favorably compare with those obtained with the standard tools typically used for clustering financial time series, such as the minimal spanning tree and the hierarchical tree.

  20. A comparison of approaches for estimating bottom-sediment mass in large reservoirs

    USGS Publications Warehouse

    Juracek, Kyle E.

    2006-01-01

    Estimates of sediment and sediment-associated constituent loads and yields from drainage basins are necessary for the management of reservoir-basin systems to address important issues such as reservoir sedimentation and eutrophication. One method for the estimation of loads and yields requires a determination of the total mass of sediment deposited in a reservoir. This method involves a sediment volume-to-mass conversion using bulk-density information. A comparison of four computational approaches (partition, mean, midpoint, strategic) for using bulk-density information to estimate total bottom-sediment mass in four large reservoirs indicated that the differences among the approaches were not statistically significant. However, the lack of statistical significance may be a result of the small sample size. Compared to the partition approach, which was presumed to provide the most accurate estimates of bottom-sediment mass, the results achieved using the strategic, mean, and midpoint approaches differed by as much as ±4, ±20, and ±44 percent, respectively. It was concluded that the strategic approach may merit further investigation as a less time consuming and less costly alternative to the partition approach.

  1. Prediction of soil organic carbon partition coefficients by soil column liquid chromatography.

    PubMed

    Guo, Rongbo; Liang, Xinmiao; Chen, Jiping; Wu, Wenzhong; Zhang, Qing; Martens, Dieter; Kettrup, Antonius

    2004-04-30

    To avoid the limitation of the widely used prediction methods of soil organic carbon partition coefficients (KOC) from hydrophobic parameters, e.g., the n-octanol/water partition coefficients (KOW) and the reversed phase high performance liquid chromatographic (RP-HPLC) retention factors, the soil column liquid chromatographic (SCLC) method was developed for KOC prediction. The real soils were used as the packing materials of RP-HPLC columns, and the correlations between the retention factors of organic compounds on soil columns (ksoil) and KOC measured by batch equilibrium method were studied. Good correlations were achieved between ksoil and KOC for three types of soils with different properties. All the square of the correlation coefficients (R2) of the linear regression between log ksoil and log KOC were higher than 0.89 with standard deviations of less than 0.21. In addition, the prediction of KOC from KOW and the RP-HPLC retention factors on cyanopropyl (CN) stationary phase (kCN) was comparatively evaluated for the three types of soils. The results show that the prediction of KOC from kCN and KOW is only applicable to some specific types of soils. The results obtained in the present study proved that the SCLC method is appropriate for the KOC prediction for different types of soils, however the applicability of using hydrophobic parameters to predict KOC largely depends on the properties of soil concerned.

  2. Comparison of prediction methods for octanol-air partition coefficients of diverse organic compounds.

    PubMed

    Fu, Zhiqiang; Chen, Jingwen; Li, Xuehua; Wang, Ya'nan; Yu, Haiying

    2016-04-01

    The octanol-air partition coefficient (KOA) is needed for assessing multimedia transport and bioaccumulability of organic chemicals in the environment. As experimental determination of KOA for various chemicals is costly and laborious, development of KOA estimation methods is necessary. We investigated three methods for KOA prediction: conventional quantitative structure-activity relationship (QSAR) models based on molecular structural descriptors, group contribution models based on atom-centered fragments, and a novel model that predicts KOA via the solvation free energy from the air to the octanol phase (ΔGO(0)), with a collection of 939 experimental KOA values for 379 compounds at different temperatures (263.15-323.15 K) as validation or training sets. The developed models were evaluated with the OECD guidelines on QSAR model validation and applicability domain (AD) description. Results showed that although the ΔGO(0) model is theoretically sound and has a broad AD, its prediction accuracy is the poorest. The QSAR models perform better than the group contribution models, and have similar predictability and accuracy to the conventional method that estimates KOA from the octanol-water partition coefficient and Henry's law constant. One QSAR model, which can predict KOA at different temperatures, was recommended for application in assessing the long-range transport potential of chemicals. Copyright © 2016 Elsevier Ltd. All rights reserved.

  3. Bayesian approach to estimate AUC, partition coefficient and drug targeting index for studies with serial sacrifice design.

    PubMed

    Wang, Tianli; Baron, Kyle; Zhong, Wei; Brundage, Richard; Elmquist, William

    2014-03-01

    The current study presents a Bayesian approach to non-compartmental analysis (NCA), which provides accurate and precise estimates of AUC(0-∞) and any AUC(0-∞)-based NCA parameter or derivation. In order to assess the performance of the proposed method, 1,000 simulated datasets were generated in different scenarios. A Bayesian method was used to estimate the tissue and plasma AUC(0-∞) values and the tissue-to-plasma AUC(0-∞) ratio. The posterior medians and the coverage of 95% credible intervals for the true parameter values were examined. The method was applied to laboratory data from a mouse brain distribution study with a serial sacrifice design for illustration. The Bayesian NCA approach is accurate and precise in point estimation of AUC(0-∞) and the partition coefficient under a serial sacrifice design. It also provides a consistently good variance estimate, even considering the variability of the data and the physiological structure of the pharmacokinetic model. The application in the case study obtained a physiologically reasonable posterior distribution of AUC, with a posterior median close to the value estimated by classic Bailer-type methods. This Bayesian NCA approach for sparse data analysis provides statistical inference on the variability of AUC(0-∞)-based parameters such as the partition coefficient and drug targeting index, so that the comparison of these parameters following destructive sampling becomes statistically feasible.
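
    For context, the sketch below computes the classical (non-Bayesian) NCA quantities that the approach above places posterior distributions on: AUC(0-∞) by the trapezoidal rule with log-linear extrapolation of the terminal phase, and a tissue-to-plasma partition coefficient as a ratio of AUC(0-∞) values. The concentration-time data are invented for illustration.

```python
# Classical NCA sketch: AUC(0-t_last) by the linear trapezoidal rule, plus
# C_last / lambda_z extrapolation, with lambda_z from a log-linear fit of the
# last few sampling points. Data are synthetic.
import numpy as np

def auc_0_inf(t, c, n_terminal=3):
    auc_last = np.sum(np.diff(t) * (c[1:] + c[:-1]) / 2.0)   # trapezoids
    slope, _ = np.polyfit(t[-n_terminal:], np.log(c[-n_terminal:]), 1)
    lambda_z = -slope                                        # terminal rate constant
    return auc_last + c[-1] / lambda_z

t = np.array([0.25, 0.5, 1, 2, 4, 8, 12])                    # h
plasma = np.array([4.1, 3.6, 3.0, 2.1, 1.1, 0.30, 0.08])     # mg/L
brain = np.array([0.6, 1.0, 1.3, 1.2, 0.8, 0.25, 0.07])      # mg/L

kp = auc_0_inf(t, brain) / auc_0_inf(t, plasma)              # tissue-to-plasma AUC ratio
print(f"partition coefficient Kp ≈ {kp:.2f}")
```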

  4. A new basal sauropod from the pre-Toarcian Jurassic of South Africa: evidence of niche-partitioning at the sauropodomorph–sauropod boundary?

    PubMed Central

    McPhee, Blair W.; Bonnan, Matthew F.; Yates, Adam M.; Neveling, Johann; Choiniere, Jonah N.

    2015-01-01

    The early evolution of sauropod dinosaurs remains poorly understood, with a paucity of unequivocal sauropod taxa known from the first twenty million years of the Jurassic. Recently, the Early Jurassic of South Africa has yielded an assemblage of dental and post-cranial remains displaying a more apomorphic character suite than any other similarly aged sauropodomorph. These remains are interpreted as a new species of basal sauropod and recovered cladistically as the sister taxon to Vulcanodon + more derived Sauropoda, underscoring its importance for our understanding of this pivotal period of sauropod evolution. Key changes in the dentition, axial skeleton and forelimb of this new species suggest a genuine functional distinction occurring at the sauropodiform-sauropod boundary. With reference to these changes, we propose a scenario in which interdependent refinements of the locomotory and feeding apparatus occurred in tandem with, or were effected by, restrictions in the amount of vertical forage initially available to the earliest sauropods. The hypothesized instance of niche-partitioning between basal sauropodan taxa and higher-browsing non-sauropodan sauropodomorphs may partially explain the rarity of true sauropods in the basal rocks of the Jurassic, while having the added corollary of couching the origins of Sauropoda in terms of an ecologically delimited ‘event’. PMID:26288028

  5. Light and dark adaptation mechanisms in the compound eyes of Myrmecia ants that occupy discrete temporal niches.

    PubMed

    Narendra, Ajay; Greiner, Birgit; Ribi, Willi A; Zeil, Jochen

    2016-08-15

    Ants of the Australian genus Myrmecia partition their foraging niche temporally, allowing them to be sympatric with overlapping foraging requirements. We used histological techniques to study the light and dark adaptation mechanisms in the compound eyes of diurnal (Myrmecia croslandi), crepuscular (M. tarsata, M. nigriceps) and nocturnal ants (M. pyriformis). We found that, except in the day-active species, all ants have a variable primary pigment cell pupil that constricts the crystalline cone in bright light to control for light flux. We show for the nocturnal M. pyriformis that the constriction of the crystalline cone by the primary pigment cells is light dependent whereas the opening of the aperture is regulated by an endogenous rhythm. In addition, in the light-adapted eyes of all species, the retinular cell pigment granules radially migrate towards the rhabdom, a process that in both the day-active M. croslandi and the night-active M. pyriformis is driven by ambient light intensity. Visual system properties thus do not restrict crepuscular and night-active ants to their temporal foraging niche, while day-active ants require high light intensities to operate. We discuss the ecological significance of these adaptation mechanisms and their role in temporal niche partitioning. © 2016. Published by The Company of Biologists Ltd.

  6. Two modified symplectic partitioned Runge-Kutta methods for solving the elastic wave equation

    NASA Astrophysics Data System (ADS)

    Su, Bo; Tuo, Xianguo; Xu, Ling

    2017-08-01

    Based on a modified strategy, two modified symplectic partitioned Runge-Kutta (PRK) methods are proposed for the temporal discretization of the elastic wave equation. The two symplectic schemes are similar in form but are different in nature. After the spatial discretization of the elastic wave equation, the ordinary Hamiltonian formulation for the elastic wave equation is presented. The PRK scheme is then applied for time integration. An additional term associated with spatial discretization is inserted into the different stages of the PRK scheme. Theoretical analyses are conducted to evaluate the numerical dispersion and stability of the two novel PRK methods. A finite difference method is used to approximate the spatial derivatives, since the two schemes are independent of the spatial discretization technique used. The numerical solutions computed by the two new schemes are compared with those computed by a conventional symplectic PRK method. The numerical results verify the new methods and are superior to those generated by conventional methods in seismic wave modeling.
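
    As a point of reference for what a symplectic partitioned Runge-Kutta integrator does, the sketch below applies the classical Störmer-Verlet (leapfrog) scheme, itself a symplectic PRK method, to the 1D wave equation semi-discretized with central differences, and reports the energy drift. It is a standard scheme under assumed parameters, not the modified PRK methods of the paper.

```python
# Symplectic PRK sketch: Stormer-Verlet (kick-drift-kick) for u_tt = c^2 u_xx,
# semi-discretized by second-order central differences with fixed ends.
import numpy as np

c, L, nx = 1.0, 1.0, 201
x = np.linspace(0.0, L, nx)
dx = x[1] - x[0]
dt = 0.9 * dx / c                         # CFL-limited step
nt = 400

u = np.exp(-200.0 * (x - 0.5 * L) ** 2)   # initial displacement (Gaussian pulse)
v = np.zeros_like(u)                      # initial velocity

def accel(u):
    a = np.zeros_like(u)
    a[1:-1] = c ** 2 * (u[2:] - 2.0 * u[1:-1] + u[:-2]) / dx ** 2
    return a                              # fixed (Dirichlet) boundaries

def energy(u, v):
    return 0.5 * np.sum(v ** 2) * dx + 0.5 * c ** 2 * np.sum(np.diff(u) ** 2) / dx

e0 = energy(u, v)
for _ in range(nt):
    v_half = v + 0.5 * dt * accel(u)      # kick
    u = u + dt * v_half                   # drift
    v = v_half + 0.5 * dt * accel(u)      # kick
print(f"relative energy drift after {nt} steps: {abs(energy(u, v) - e0) / e0:.2e}")
```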

  7. On a modification method of Lefschetz thimbles

    NASA Astrophysics Data System (ADS)

    Tsutsui, Shoichiro; Doi, Takahiro M.

    2018-03-01

    QCD at finite density is not yet well understood, as standard Monte Carlo simulation suffers from the sign problem. In order to overcome the sign problem, the method of Lefschetz thimbles has been explored. Basically, the original sign problem can be made less severe in a complexified theory due to the constancy of the imaginary part of the action on each thimble. However, global phase factors assigned to each thimble still remain. Their interference is not negligible in a situation where a large number of thimbles contribute to the partition function, and this could also lead to a sign problem. In this study, we propose a method to resolve this problem by modifying the structure of the Lefschetz thimbles such that only a single thimble is relevant to the partition function. It can be shown that observables measured in the original and modified theories are connected by a simple identity. We demonstrate that our method works well in a toy model.

  8. A novel method to partition evapotranspiration based on the concept of underlying water use efficiency

    NASA Astrophysics Data System (ADS)

    Zhou, Sha; Yu, Bofu; Zhang, Yao; Huang, Yuefei; Wang, Guangqian

    2017-04-01

    Evapotranspiration (ET) is dominated by transpiration (T) in the terrestrial water cycle. However, continuous measurement of transpiration is still difficult, and the effect of vegetation on ET partitioning is unclear. The concept of underlying water use efficiency (uWUE) was used to develop a new method for ET partitioning by assuming that the maximum, or the potential uWUE is related to T while the averaged or apparent uWUE is related to ET. T/ET was thus estimated as the ratio of the apparent over the potential uWUE using half-hourly flux data from 17 AmeriFlux sites. The estimated potential uWUE was shown to be essentially constant for the 14 sites with a single vegetation type, and was broadly consistent with the uWUE evaluated at the leaf scale. The annual T/ET was the highest for croplands, i.e., 0.69 for corn and 0.62 for soybean, followed by grasslands (0.60) and evergreen needle leaf forests (0.56), and was the lowest for deciduous broadleaf forests (0.52). The enhanced vegetation index (EVI) was shown to be significantly correlated with T/ET and could explain about 75% of the variation in T/ET among the 71 site-years. The coefficients of determination between EVI and T/ET were 0.84 and 0.82 for corn and soybean, respectively, and 0.77 for deciduous broadleaf forests and grasslands, but only 0.37 for evergreen needle leaf forests. This ET partitioning method is sound in principle and simple to apply in practice, and would enhance the value and role of global FLUXNET in estimating T/ET variations and monitoring ecosystem dynamics.
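
    The sketch below illustrates the partitioning idea on synthetic half-hourly flux data, taking uWUE = GPP·√VPD/ET, approximating the potential uWUE by an upper quantile and the apparent uWUE by a mean, so that T/ET ≈ uWUE_apparent/uWUE_potential. The column names, quantile choice and filtering thresholds are illustrative assumptions, not the authors' exact estimation procedure.

```python
# Sketch of uWUE-based ET partitioning on synthetic half-hourly flux data:
# T/ET ~= apparent uWUE / potential uWUE, with uWUE = GPP * sqrt(VPD) / ET.
# The quantile proxy for the potential uWUE and the daytime filter are
# illustrative assumptions.
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
n = 48 * 120                                   # half-hourly records, one season
flux = pd.DataFrame({
    "GPP": rng.gamma(4.0, 2.0, n),             # umol CO2 m-2 s-1
    "VPD": rng.gamma(3.0, 0.4, n),             # kPa
    "ET":  rng.gamma(3.0, 0.05, n),            # mm per half hour
})
daytime = flux[(flux.GPP > 1.0) & (flux.ET > 0.01)]

uwue = daytime.GPP * np.sqrt(daytime.VPD) / daytime.ET
uwue_potential = uwue.quantile(0.95)           # proxy for the potential uWUE
uwue_apparent = uwue.mean()                    # proxy for the apparent uWUE
print(f"estimated T/ET ≈ {uwue_apparent / uwue_potential:.2f}")
```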

  9. Multi-jagged: A scalable parallel spatial partitioning algorithm

    DOE PAGES

    Deveci, Mehmet; Rajamanickam, Sivasankaran; Devine, Karen D.; ...

    2015-03-18

    Geometric partitioning is fast and effective for load-balancing dynamic applications, particularly those requiring geometric locality of data (particle methods, crash simulations). We present, to our knowledge, the first parallel implementation of a multidimensional-jagged geometric partitioner. In contrast to the traditional recursive coordinate bisection algorithm (RCB), which recursively bisects subdomains perpendicular to their longest dimension until the desired number of parts is obtained, our algorithm does recursive multi-section with a given number of parts in each dimension. By computing multiple cut lines concurrently and intelligently deciding when to migrate data while computing the partition, we minimize data movement compared to efficient implementations of recursive bisection. We demonstrate the algorithm's scalability and quality relative to the RCB implementation in Zoltan on both real and synthetic datasets. Our experiments show that the proposed algorithm performs and scales better than RCB in terms of run-time without degrading the load balance. Lastly, our implementation partitions 24 billion points into 65,536 parts within a few seconds and exhibits near-perfect weak scaling up to 6K cores.
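
    A toy one-dimensional version of the multi-section idea is sketched below: all cut positions along a coordinate are computed in a single pass from weighted prefix sums so that each part receives roughly equal weight. This is only an illustration of the concept, not the parallel multi-jagged partitioner implemented in Zoltan.

```python
# 1-D multi-section sketch: choose n_parts - 1 cut positions at once so that
# each part carries roughly the same total weight (via weighted prefix sums).
import numpy as np

def multisection_cuts(coords, weights, n_parts):
    order = np.argsort(coords)
    sorted_coords = coords[order]
    cum_w = np.cumsum(weights[order])
    targets = cum_w[-1] * np.arange(1, n_parts) / n_parts
    cut_idx = np.searchsorted(cum_w, targets)
    return sorted_coords[cut_idx]              # n_parts - 1 cut positions

rng = np.random.default_rng(3)
x = rng.random(100_000)
w = rng.random(100_000)
cuts = multisection_cuts(x, w, n_parts=4)
part = np.searchsorted(cuts, x)                # assign each point to a part
print("cuts:", np.round(cuts, 3))
print("part weights:", np.bincount(part, weights=w).round(1))
```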

  10. The role of leaf height in plant competition for sunlight: analysis of a canopy partitioning model.

    PubMed

    Nevai, Andrew L; Vance, Richard R

    2008-01-01

    A global method of nullcline endpoint analysis is employed to determine the outcome of competition for sunlight between two hypothetical plant species with clonal growth form that differ solely in the height at which they place their leaves above the ground. This difference in vertical leaf placement, or canopy partitioning, produces species differences in sunlight energy capture and stem metabolic maintenance costs. The competitive interaction between these two species is analyzed by considering a special case of a canopy partitioning model (RR Vance and AL Nevai, J. Theor. Biol. 2007, 245:210-219; AL Nevai and RR Vance, J. Math. Biol. 2007, 55:105-145). Nullcline endpoint analysis is used to partition parameter space into regions within which either competitive exclusion or competitive coexistence occurs. The principal conclusion is that two clonal plant species which compete for sunlight and place their leaves at different heights above the ground but differ in no other way can, under suitable parameter values, experience stable coexistence even though they occupy an environment which varies neither over horizontal space nor through time.

  11. High-speed extended-term time-domain simulation for online cascading analysis of power system

    NASA Astrophysics Data System (ADS)

    Fu, Chuan

    A high-speed extended-term (HSET) time domain simulator (TDS), intended to become a part of an energy management system (EMS), has been newly developed for use in online extended-term dynamic cascading analysis of power systems. HSET-TDS includes the following attributes for providing situational awareness of high-consequence events: (i) online analysis, including n-1 and n-k events, (ii) ability to simulate both fast and slow dynamics for 1-3 hours in advance, (iii) inclusion of rigorous protection-system modeling, (iv) intelligence for corrective action ID, storage, and fast retrieval, and (v) high-speed execution. Very fast on-line computational capability is the most desired attribute of this simulator. Based on the process of solving the differential-algebraic equations describing the dynamics of a power system, HSET-TDS seeks to develop computational efficiency at each of the following hierarchical levels: (i) hardware, (ii) strategies, (iii) integration methods, (iv) nonlinear solvers, and (v) linear solver libraries. This thesis first describes the Hammer-Hollingsworth 4 (HH4) implicit integration method. Like the trapezoidal rule, HH4 is symmetrically A-stable, but it possesses greater high-order precision (h⁴) than the trapezoidal rule. Such precision enables larger integration steps and therefore improves simulation efficiency for variable step size implementations. This thesis provides the underlying theory on which we advocate use of HH4 over other numerical integration methods for power system time-domain simulation. Second, motivated by the need to perform high-speed extended-term time domain simulation (HSET-TDS) for on-line purposes, this thesis presents principles for designing numerical solvers of the differential-algebraic systems associated with power system time-domain simulation, including DAE construction strategies (Direct Solution Method), integration methods (HH4), nonlinear solvers (Very Dishonest Newton), and linear solvers (SuperLU). We have implemented a design appropriate for HSET-TDS, and we compare it to various solvers, including the commercial-grade PSSE program, with respect to computational efficiency and accuracy, using as examples the New England 39-bus system, the expanded 8775-bus system, and the PJM 13029-bus system. Third, we have explored a stiffness-decoupling method, intended to be part of a parallel design of time domain simulation software for supercomputers. The stiffness-decoupling method combines the advantages of implicit methods (A-stability) and explicit methods (less computation). With the new stiffness detection method proposed herein, the stiffness can be captured. The expanded 975-bus system is used to test simulation efficiency. Finally, several parallel strategies for supercomputer deployment to simulate power system dynamics are proposed and compared. Design A partitions the task by scale using the stiffness-decoupling method, waveform relaxation, and a parallel linear solver. Design B partitions the task along the time axis using a highly precise integration method, the Kuntzmann-Butcher method of order 8 (KB8). The strategy of partitioning events partitions the whole simulation along the time axis through a simulated sequence of cascading events. Of all the strategies proposed, the strategy of partitioning cascading events is recommended, since the sub-tasks for each processor are totally independent and therefore require minimal communication.
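
    As a small illustration of the kind of integrator named above, the sketch below takes one step of the two-stage Gauss implicit Runge-Kutta method (the fourth-order scheme usually attributed to Hammer and Hollingsworth, here labelled HH4) on a linear test system, where the implicit stage equations reduce to a single linear solve. The test problem and step size are arbitrary; this is not the HSET-TDS solver.

```python
# One step of the 2-stage Gauss implicit Runge-Kutta method (often cited as
# the 4th-order Hammer-Hollingsworth scheme) for a linear system y' = M y,
# where the implicit stage equations become a single linear solve.
# Toy illustration only, not the HSET-TDS power-system solver.
import numpy as np

S3 = np.sqrt(3.0)
A = np.array([[0.25, 0.25 - S3 / 6.0],
              [0.25 + S3 / 6.0, 0.25]])
b = np.array([0.5, 0.5])

def hh4_step(M, y, h):
    n = y.size
    # Stage equations: K = [M y; M y] + h (A kron M) K
    lhs = np.eye(2 * n) - h * np.kron(A, M)
    rhs = np.concatenate([M @ y, M @ y])
    K = np.linalg.solve(lhs, rhs).reshape(2, n)
    return y + h * (b[:, None] * K).sum(axis=0)

# Damped oscillator y'' + 0.2 y' + y = 0 written as a first-order system.
M = np.array([[0.0, 1.0], [-1.0, -0.2]])
y, h = np.array([1.0, 0.0]), 0.05
for _ in range(200):
    y = hh4_step(M, y, h)
print("state after 10 s:", y)
```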

  12. Chromatographic and spectroscopic methods for the determination of solvent properties of room temperature ionic liquids.

    PubMed

    Poole, Colin F

    2004-05-28

    Room temperature ionic liquids are novel solvents with favorable environmental and technical features. Synthetic routes to over 200 room temperature ionic liquids are known but for most ionic liquids physicochemical data are generally lacking or incomplete. Chromatographic and spectroscopic methods afford suitable tools for the study of solvation properties under conditions that approximate infinite dilution. Gas-liquid chromatography is suitable for the determination of gas-liquid partition coefficients and activity coefficients as well as thermodynamic constants derived from either of these parameters and their variation with temperature. The solvation parameter model can be used to define the contribution from individual intermolecular interactions to the gas-liquid partition coefficient. Application of chemometric procedures to a large database of system constants for ionic liquids indicates their unique solvent properties: low cohesion for ionic liquids with weakly associated ions compared with non-ionic liquids of similar polarity; greater hydrogen-bond basicity than typical polar non-ionic solvents; and a range of dipolarity/polarizability that encompasses the same range as occupied by the most polar non-ionic liquids. These properties can be crudely related to ion structures but further work is required to develop a comprehensive approach for the design of ionic liquids for specific applications. Data for liquid-liquid partition coefficients is scarce by comparison with gas-liquid partition coefficients. Preliminary studies indicate the possibility of using the solvation parameter model for interpretation of liquid-liquid partition coefficients determined by shake-flask procedures as well as the feasibility of using liquid-liquid chromatography for the convenient and rapid determination of liquid-liquid partition coefficients. Spectroscopic measurements of solvatochromic and fluorescent probe molecules in room temperature ionic liquids provide insights into solvent intermolecular interactions although interpretation of the different and generally uncorrelated "polarity" scales is sometimes ambiguous. All evidence points to the ionic liquids as a unique class of polar solvents suitable for technical development. In terms of designer solvents, however, further work is needed to fill the gaps in our knowledge of the relationship between ion structures and physicochemical properties.

  13. Functional roles of the major chloroplast lipids in the violaxanthin cycle.

    PubMed

    Yamamoto, Harry Y

    2006-08-01

    Monogalactosyldiacylglyceride (MGDG) and digalactosyldiacylglyceride (DGDG) are the major membrane lipids of chloroplasts. The question of the specialized functions of these unique lipids has received limited attention. One function is to support violaxanthin de-epoxidase (VDE) activity, an enzyme of the violaxanthin cycle. To understand better the properties of this system, the effects of galactolipids and phosphatidylcholines on VDE activity were examined by two independent methods. The results show that the micelle-forming lipid (MGDG) and bilayer-forming lipids (DGDG and phosphatidylcholines) support VDE activity differently. MGDG supported rapid and complete de-epoxidation starting at a threshold lipid concentration (10 microM) coincident with complete solubilization of violaxanthin. In contrast, DGDG supported slow but nevertheless complete to nearly complete de-epoxidation at a lower lipid concentration (6.7 microM) that did not completely solubilize violaxanthin. Phosphatidylcholines showed similar effects to DGDG except that de-epoxidation was incomplete. Since VDE requires solubilized violaxanthin, aggregated violaxanthin in DGDG at low concentration must become solubilized as de-epoxidation proceeds. High lipid concentrations had lower activity, possibly due to formation of multilayered structures (liposomes) that restrict the accessibility of violaxanthin to VDE. MGDG micelles do not present such restrictions. The results indicate that VDE operates throughout the lipid phase of the single bilayer thylakoid membrane and is not limited to putative MGDG micelle domains. Additionally, the results also explain the differential partitioning of violaxanthin between the envelope and thylakoid as due to the relative solubilities of violaxanthin and zeaxanthin in MGDG, DGDG and phospholipids. The violaxanthin cycle is hypothesized to be a linked system of the thylakoid and envelope for signal transduction of light stress.

  14. High-Order Implicit-Explicit Multi-Block Time-stepping Method for Hyperbolic PDEs

    NASA Technical Reports Server (NTRS)

    Nielsen, Tanner B.; Carpenter, Mark H.; Fisher, Travis C.; Frankel, Steven H.

    2014-01-01

    This work seeks to explore and improve the current time-stepping schemes used in computational fluid dynamics (CFD) in order to reduce overall computational time. A high-order scheme has been developed using a combination of implicit and explicit (IMEX) time-stepping Runge-Kutta (RK) schemes which increases numerical stability with respect to the time step size, resulting in decreased computational time. The IMEX scheme alone does not yield the desired increase in numerical stability, but when used in conjunction with an overlapping partitioned (multi-block) domain a significant increase in stability is observed. To show this, the Overlapping-Partition IMEX (OP IMEX) scheme is applied to both one-dimensional (1D) and two-dimensional (2D) problems, the nonlinear viscous Burgers' equation and the 2D advection equation, respectively. The method uses two different summation by parts (SBP) derivative approximations, second-order and fourth-order accurate. The Dirichlet boundary conditions are imposed using the Simultaneous Approximation Term (SAT) penalty method. The 6-stage additive Runge-Kutta IMEX time integration schemes are fourth-order accurate in time. An increase in numerical stability 65 times greater than the fully explicit scheme is demonstrated to be achievable with the OP IMEX method applied to the 1D Burgers' equation. Results from the 2D, purely convective, advection equation show stability increases on the order of 10 times the explicit scheme using the OP IMEX method. Also, the domain partitioning method in this work shows potential for breaking the computational domain into manageable sizes such that implicit solutions for full three-dimensional CFD simulations can be computed using direct solving methods rather than the standard iterative methods currently used.

  15. How psychological framing affects economic market prices in the lab and field.

    PubMed

    Sonnemann, Ulrich; Camerer, Colin F; Fox, Craig R; Langer, Thomas

    2013-07-16

    A fundamental debate in social sciences concerns how individual judgments and choices, resulting from psychological mechanisms, are manifested in collective economic behavior. Economists emphasize the capacity of markets to aggregate information distributed among traders into rational equilibrium prices. However, psychologists have identified pervasive and systematic biases in individual judgment that they generally assume will affect collective behavior. In particular, recent studies have found that judged likelihoods of possible events vary systematically with the way the entire event space is partitioned, with probabilities of each of N partitioned events biased toward 1/N. Thus, combining events into a common partition lowers perceived probability, and unpacking events into separate partitions increases their perceived probability. We look for evidence of such bias in various prediction markets, in which prices can be interpreted as probabilities of upcoming events. In two highly controlled experimental studies, we find clear evidence of partition dependence in a 2-h laboratory experiment and a field experiment on National Basketball Association (NBA) and Federation Internationale de Football Association (FIFA World Cup) sports events spanning several weeks. We also find evidence consistent with partition dependence in nonexperimental field data from prediction markets for economic derivatives (guessing the values of important macroeconomic statistics) and horse races. Results in any one of the studies might be explained by a specialized alternative theory, but no alternative theories can explain the results of all four studies. We conclude that psychological biases in individual judgment can affect market prices, and understanding those effects requires combining a variety of methods from psychology and economics.

  16. How psychological framing affects economic market prices in the lab and field

    PubMed Central

    Sonnemann, Ulrich; Camerer, Colin F.; Fox, Craig R.; Langer, Thomas

    2013-01-01

    A fundamental debate in social sciences concerns how individual judgments and choices, resulting from psychological mechanisms, are manifested in collective economic behavior. Economists emphasize the capacity of markets to aggregate information distributed among traders into rational equilibrium prices. However, psychologists have identified pervasive and systematic biases in individual judgment that they generally assume will affect collective behavior. In particular, recent studies have found that judged likelihoods of possible events vary systematically with the way the entire event space is partitioned, with probabilities of each of N partitioned events biased toward 1/N. Thus, combining events into a common partition lowers perceived probability, and unpacking events into separate partitions increases their perceived probability. We look for evidence of such bias in various prediction markets, in which prices can be interpreted as probabilities of upcoming events. In two highly controlled experimental studies, we find clear evidence of partition dependence in a 2-h laboratory experiment and a field experiment on National Basketball Association (NBA) and Federation Internationale de Football Association (FIFA World Cup) sports events spanning several weeks. We also find evidence consistent with partition dependence in nonexperimental field data from prediction markets for economic derivatives (guessing the values of important macroeconomic statistics) and horse races. Results in any one of the studies might be explained by a specialized alternative theory, but no alternative theories can explain the results of all four studies. We conclude that psychological biases in individual judgment can affect market prices, and understanding those effects requires combining a variety of methods from psychology and economics. PMID:23818628

  17. Pesticides in the atmosphere: a comparison of gas-particle partitioning and particle size distribution of legacy and current-use pesticides

    NASA Astrophysics Data System (ADS)

    Degrendele, C.; Okonski, K.; Melymuk, L.; Landlová, L.; Kukučka, P.; Audy, O.; Kohoutek, J.; Čupr, P.; Klánová, J.

    2016-02-01

    This study presents a comparison of the seasonal variation, gas-particle partitioning, and particle-phase size distribution of organochlorine pesticides (OCPs) and current-use pesticides (CUPs) in air. Two years (2012/2013) of weekly air samples were collected at a background site in the Czech Republic using a high-volume air sampler. To study the particle-phase size distribution, air samples were also collected at an urban and a rural site in the area of Brno, Czech Republic, using a cascade impactor separating atmospheric particulates into six size fractions. Major differences were found in the atmospheric distribution of OCPs and CUPs. The atmospheric concentrations of CUPs were driven by agricultural activities, while secondary sources such as volatilization from surfaces governed the atmospheric concentrations of OCPs. Moreover, clear differences were observed in gas-particle partitioning; CUP partitioning was influenced by adsorption onto mineral surfaces, while OCPs partitioned to aerosols mainly through absorption. A predictive method for estimating gas-particle partitioning has been derived and is proposed for polar and non-polar pesticides. Finally, while OCPs and the majority of CUPs were largely found on fine particles, four CUPs (carbendazim, isoproturon, prochloraz, and terbuthylazine) had higher concentrations on coarse particles (> 3.0 µm), which may be related to the pesticide application technique. This finding is particularly important and should be further investigated, given that large particles entail lower risks from inhalation (regardless of the toxicity of the pesticide) and lower potential for long-range atmospheric transport.

  18. The design and research of anti-color-noise chaos M-ary communication system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fu, Yongqing, E-mail: fuyongqing@hrbeu.edu.cn; Li, Xingyuan; Li, Yanan

    Previously, a novel chaos M-ary digital communication method based on a spatiotemporal chaotic Hamilton oscillator was proposed. Because it requires no chaos synchronization, it improves bandwidth efficiency, transmission efficiency, and anti-white-noise performance compared with traditional communication methods. In this paper, the influence of channel noise on chaotic modulation signals and the construction of an anti-color-noise chaotic M-ary communication system are studied. The formula for the zone partition demodulator's decision boundary in additive white Gaussian noise is derived, and the problem of determining this boundary in additive colored noise is studied in depth. An approach to constructing an anti-color-noise chaos M-ary communication system is then proposed, in which a pre-distortion filter is added after the chaos baseband modulator in the transmitter and a whitening filter is added before the zone partition demodulator in the receiver. Finally, the chaos M-ary communication system based on the Hamilton oscillator is constructed and simulated under different channel noise conditions. The results show that the proposed method improves the anti-color-noise performance of the whole communication system compared with the former system, and it has better anti-fading and interference-rejection performance than a quadrature phase-shift keying (QPSK) system.
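
    The abstract does not reproduce the derived boundary formula. As standard detection-theory background only (a sketch under textbook assumptions, not the paper's chaotic zone-partition result), for two equiprobable signal levels s_i < s_{i+1} observed in additive white Gaussian noise, the maximum-likelihood decision boundary is the midpoint, and colored noise with power spectral density S_n(f) is commonly handled by whitening ahead of the demodulator:

    \[
      b_{i,i+1} = \frac{s_i + s_{i+1}}{2},
      \qquad
      H_w(f) = \frac{1}{\sqrt{S_n(f)}} ,
    \]

    where H_w(f) is the whitening filter; the paper's pre-distortion/whitening pair plays an analogous role for the chaotic modulation.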

  19. Applications of Space-Filling-Curves to Cartesian Methods for CFD

    NASA Technical Reports Server (NTRS)

    Aftosmis, Michael J.; Berger, Marsha J.; Murman, Scott M.

    2003-01-01

    The proposed paper presents a variety of novel uses of Space-Filling-Curves (SFCs) for Cartesian mesh methods in CFD. While these techniques are demonstrated using non-body-fitted Cartesian meshes, most are applicable on general body-fitted meshes, both structured and unstructured. We demonstrate the use of a single O(N log N) SFC-based reordering to produce single-pass (O(N)) algorithms for mesh partitioning, multigrid coarsening, and inter-mesh interpolation. The inter-mesh interpolation operator has many practical applications, including warm starts on modified geometry, or as an inter-grid transfer operator on remeshed regions in moving-body simulations. Exploiting the compact construction of these operators, we further show that these algorithms are highly amenable to parallelization. Examples using the SFC-based mesh partitioner show nearly linear speedup to 512 CPUs even when using multigrid as a smoother. Partition statistics are presented showing that the SFC partitions are, on average, within 10% of ideal even with only around 50,000 cells in each subdomain. The inter-mesh interpolation operator also has linear asymptotic complexity and can be used to map a solution with N unknowns to another mesh with M unknowns with O(max(M,N)) operations. This capability is demonstrated both on moving-body simulations and in mapping solutions to perturbed meshes for finite-difference-based gradient design methods.
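
    To illustrate the kind of single-pass, SFC-based partitioning described above (a sketch only: the Morton/Z-order curve, cell-coordinate encoding, and partition count below are assumptions for illustration and may differ from the paper's curve and implementation), one can order cells along the curve and cut the sorted list into contiguous, near-equal chunks:

    import numpy as np

    def morton_index(ix, iy, bits=16):
        """Interleave the bits of integer cell coordinates (ix, iy) into a Z-order key."""
        key = 0
        for b in range(bits):
            key |= ((ix >> b) & 1) << (2 * b)
            key |= ((iy >> b) & 1) << (2 * b + 1)
        return key

    def sfc_partition(cells, n_parts):
        """Sort cells by Morton key, then cut the curve into contiguous chunks."""
        keys = np.array([morton_index(ix, iy) for ix, iy in cells])
        order = np.argsort(keys)                 # O(N log N) reordering
        chunks = np.array_split(order, n_parts)  # single pass over the sorted curve
        part_of = np.empty(len(cells), dtype=int)
        for p, chunk in enumerate(chunks):
            part_of[chunk] = p
        return part_of

    # Example: a 64x64 grid of cells split into 8 partitions.
    cells = [(ix, iy) for ix in range(64) for iy in range(64)]
    parts = sfc_partition(cells, 8)
    print(np.bincount(parts))   # partition sizes, near-equal by construction

    Because cells adjacent on the curve tend to be adjacent in space, contiguous cuts yield compact subdomains without any graph-based optimization.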

  20. Odour-causing compounds in air samples: gas-liquid partition coefficients and determination using solid-phase microextraction and GC with mass spectrometric detection.

    PubMed

    Godayol, Anna; Alonso, Mònica; Sanchez, Juan M; Anticó, Enriqueta

    2013-03-01

    A quantification method based on solid-phase microextraction followed by GC coupled to MS was developed for the determination of gas-liquid partition coefficients and for the air monitoring of a group of odour-causing compounds that had previously been found in wastewater samples, including dimethyl disulphide, phenol, indole, skatole, octanal, nonanal, benzothiazole and some terpenes. Using a divinylbenzene/carboxen/polydimethylsiloxane fibre, adsorption kinetics were studied to define an extraction time that would avoid coating saturation. It was found that for an extraction time of 10 min, external calibration could be performed in the range of 0.4-100 μg/m³, with detection limits between 0.1 and 20 μg/m³. Inter-day precision of the developed method was evaluated (n = 5) and RSD values between 12 and 24% were obtained for all compounds. The proposed method has been applied to the analysis of air samples surrounding a wastewater treatment plant in Catalonia (Spain). In all air samples evaluated, dimethyl disulphide, limonene and phenol were detected, and the first two were the compounds that showed the highest partition coefficients.

  1. Displacement Threshold Energy and Recovery in an Al-Ti Nanolayered System with Intrinsic Point Defect Partitioning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerboth, Matthew D.; Setyawan, Wahyu; Henager, Charles H.

    2014-01-07

    A method is established and validated using molecular dynamics (MD) to determine the displacement threshold energies as Ed in nanolayered, multilayered systems of dissimilar metals. The method is applied to specifically oriented nanolayered films of Al-Ti where the crystal structure and interface orientations are varied in atomic models and Ed is calculated. Methods for defect detection are developed and discussed based on prior research in the literature and based on specific crystallographic directions available in the nanolayered systems. These are compared and contrasted to similar calculations in corresponding bulk materials, including fcc Al, fcc Ti, hcp Al, and hcp Ti.more » In all cases, the calculated Ed in the multilayers are intermediate to the corresponding bulk values but exhibit some important directionality. In the nanolayer, defect detection demonstrated systematic differences in the behavior of Ed in each layer. Importantly, collision cascade damage exhibits significant defect partitioning within the Al and Ti layers that is hypothesized to be an intrinsic property of dissimilar nanolayered systems. This type of partitioning could be partly responsible for observed asymmetric radiation damage responses in many multilayered systems. In addition, a pseudo-random direction was introduced to approximate the average Ed without performing numerous simulations with random directions.« less
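
    The abstract does not describe the search procedure itself. A common way to estimate Ed along a fixed crystallographic direction is a bisection on the primary knock-on atom (PKA) energy, sketched below; the run_cascade helper is hypothetical, standing in for an MD cascade simulation that reports whether a stable defect survives relaxation, and the energy bracket and tolerance are illustrative:

    def run_cascade(energy_eV, direction):
        """Hypothetical stand-in for an MD cascade run along `direction`.
        Returns True if a stable Frenkel pair remains after the cascade."""
        raise NotImplementedError("wrap your MD engine here")

    def displacement_threshold(direction, e_lo=5.0, e_hi=200.0, tol=1.0):
        """Bisection on PKA energy (eV): smallest energy leaving a stable defect.
        Assumes no defect forms at e_lo and one always forms at e_hi."""
        while e_hi - e_lo > tol:
            e_mid = 0.5 * (e_lo + e_hi)
            if run_cascade(e_mid, direction):
                e_hi = e_mid   # defect formed: threshold is at or below e_mid
            else:
                e_lo = e_mid   # no defect: threshold is above e_mid
        return e_hi

    # Example (requires a real MD backend):
    # Ed_100 = displacement_threshold(direction=(1, 0, 0))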

  2. A novel aqueous micellar two-phase system composed of surfactant and sorbitol for purification of pectinase enzyme from Psidium guajava and recycling phase components.

    PubMed

    Amid, Mehrnoush; Murshid, Fara Syazana; Manap, Mohd Yazid; Hussin, Muhaini

    2015-01-01

    A novel aqueous two-phase system (ATPS) composed of a surfactant and sorbitol was employed for the first time to purify pectinase from Psidium guajava. The influences of different parameters, including the type and concentration of the surfactant and the concentration and composition of the surfactant/sorbitol ratio, on the partitioning behavior and recovery of pectinase were investigated. Moreover, the effects of system pH and crude load on the purification fold and yield of purified pectinase were studied. The experimental results indicated that the pectinase partitioned into the surfactant-rich top phase, while the impurities partitioned into the sorbitol-rich bottom phase, in an ATPS composed of 26% (w/w) Triton X-100 and 23% (w/w) sorbitol at a tie-line length (TLL) of 54.2% and a crude load of 20% (w/w) at pH 6.0. The enzyme was successfully recovered by this method with a high purification factor of 15.2 and a yield of 98.3%, while the phase components were also recovered and recycled at rates above 96%. This study demonstrated that this novel ATPS method can be used as an efficient and economical alternative to traditional ATPS for the purification and recovery of this valuable enzyme.

  3. Chemical amplification based on fluid partitioning

    DOEpatents

    Anderson, Brian L [Lodi, CA; Colston, Jr., Billy W.; Elkin, Chris [San Ramon, CA

    2006-05-09

    A system for nucleic acid amplification of a sample comprises partitioning the sample into partitioned sections and performing PCR on the partitioned sections of the sample. Another embodiment of the invention provides a system for nucleic acid amplification and detection of a sample comprising partitioning the sample into partitioned sections, performing PCR on the partitioned sections of the sample, and detecting and analyzing the partitioned sections of the sample.

  4. Partitioning sources of variation in vertebrate species richness

    USGS Publications Warehouse

    Boone, R.B.; Krohn, W.B.

    2000-01-01

    Aim: To explore biogeographic patterns of terrestrial vertebrates in Maine, USA using techniques that would describe local and spatial correlations with the environment. Location: Maine, USA. Methods: We delineated the ranges within Maine (86,156 km²) of 275 species using literature and expert review. Ranges were combined into species richness maps and compared to geomorphology, climate, and woody plant distributions. Methods were adapted that compared the richness of all vertebrate classes to each environmental correlate, rather than assessing a single explanatory theory. We partitioned variation in species richness into components using tree and multiple linear regression. Methods were used that allowed for useful comparisons between tree and linear regression results. For both methods we partitioned variation into broad-scale (spatially autocorrelated) and fine-scale (spatially uncorrelated) explained and unexplained components. By partitioning variance, and using both tree and linear regression in analyses, we explored the degree of variation in species richness for each vertebrate group that could be explained by the relative contribution of each environmental variable. Results: In tree regression, climate variation explained richness better (92% of mean deviance explained for all species) than woody plant variation (87%) and geomorphology (86%). Reptiles were highly correlated with environmental variation (93%), followed by mammals, amphibians, and birds (each with 82-84% of deviance explained). In multiple linear regression, climate was most closely associated with total vertebrate richness (78%), followed by woody plants (67%) and geomorphology (56%). Again, reptiles were closely correlated with the environment (95%), followed by mammals (73%), amphibians (63%), and birds (57%). Main conclusions: Comparing the variation explained by tree and multiple linear regression quantified the importance of nonlinear relationships and local interactions between species richness and environmental variation, identifying, for example, the importance of linear relationships between reptiles and the environment and of nonlinear relationships between birds and woody plants. Conservation planners should capture climatic variation in broad-scale designs; temperatures may shift during climate change, but the underlying correlations between the environment and species richness will presumably remain.
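
    As a schematic of the variance-partitioning step (a sketch under generic assumptions, not the authors' exact tree/linear-regression workflow; the predictors, data, and function names below are illustrative), one can fit richness against environmental predictors, against spatial terms, and against both, and decompose R² into pure and shared fractions:

    import numpy as np

    def r_squared(X, y):
        """R^2 of an ordinary least-squares fit with an intercept column added."""
        X1 = np.column_stack([np.ones(len(y)), X])
        beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
        resid = y - X1 @ beta
        return 1.0 - resid.var() / y.var()

    def partition_variation(env, space, y):
        """Classic variation partitioning: pure environment, pure space, shared, unexplained."""
        r_env = r_squared(env, y)
        r_spc = r_squared(space, y)
        r_all = r_squared(np.column_stack([env, space]), y)
        return {"pure_env": r_all - r_spc,
                "pure_space": r_all - r_env,
                "shared": r_env + r_spc - r_all,
                "unexplained": 1.0 - r_all}

    # Toy example with synthetic data (illustrative only).
    rng = np.random.default_rng(0)
    space = rng.normal(size=(200, 2))              # e.g., trend-surface terms
    env = space @ np.array([[0.5, 0.1], [0.2, 0.4]]) + rng.normal(size=(200, 2))
    y = env @ np.array([1.0, -0.5]) + 0.3 * space[:, 0] + rng.normal(size=200)
    print(partition_variation(env, space, y))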

  5. Comment on: "Split kinetic energy method for quantum systems with competing potentials", Ann. Phys. 327 (2012) 2061

    NASA Astrophysics Data System (ADS)

    Fernández, Francisco M.

    2018-06-01

    We show that the kinetic-energy partition method (KEP) is a particular example of the well known Rayleigh-Ritz variational method. We discuss some of the KEP results and compare them with those coming from other approaches.
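
    As standard background on the connection being made (not the comment's derivation itself), the linear Rayleigh-Ritz method expands the trial state in a fixed basis {phi_j}, psi = sum_j c_j phi_j, and minimization of the Rayleigh quotient leads to a generalized eigenvalue problem:

    \[
      E[\psi] = \frac{\langle\psi|\hat H|\psi\rangle}{\langle\psi|\psi\rangle} \ge E_0,
      \qquad
      \sum_j \left(H_{ij} - E\, S_{ij}\right) c_j = 0,
      \quad H_{ij} = \langle\phi_i|\hat H|\phi_j\rangle,\ \ S_{ij} = \langle\phi_i|\phi_j\rangle .
    \]

    The comment's point is that the KEP ansatz amounts to a particular choice of this basis, so its results inherit the variational bound above.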

  6. CRYOGENIC TRAPPING OF OXIDIZED MERCURY SPECIES FROM COMBUSTION FLUE GAS. (R827649)

    EPA Science Inventory

    To further understand the speciation and partitioning of mercury species in combustion systems, it is necessary to be able to identify and quantitate the various forms of oxidized mercury. Currently accepted methods for speciating mercury (Ontario Hydro Method, EPA Method 29, ...

  7. Chemical amplification based on fluid partitioning in an immiscible liquid

    DOEpatents

    Anderson, Brian L.; Colston, Bill W.; Elkin, Christopher J.

    2010-09-28

    A system for nucleic acid amplification of a sample comprises partitioning the sample into partitioned sections and performing PCR on the partitioned sections of the sample. Another embodiment of the invention provides a system for nucleic acid amplification and detection of a sample comprising partitioning the sample into partitioned sections, performing PCR on the partitioned sections of the sample, and detecting and analyzing the partitioned sections of the sample.

  8. Variable length adjacent partitioning for PTS based PAPR reduction of OFDM signal

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ibraheem, Zeyid T.; Rahman, Md. Mijanur; Yaakob, S. N.

    2015-05-15

    Peak-to-average power ratio (PAPR) is a major drawback in OFDM communication. It drives the power amplifier into its nonlinear operating region, resulting in loss of data integrity. As such, there is strong motivation to find techniques to reduce PAPR. Partial Transmit Sequence (PTS) is an attractive scheme for this purpose. Judicious partitioning of the OFDM data frame into disjoint subsets is a pivotal component of any PTS scheme. Among the existing partitioning techniques, adjacent partitioning is characterized by an attractive trade-off between cost and performance. With the aim of determining the effects of length variability of adjacent partitions, we investigated the performance of variable-length adjacent partitioning (VL-AP) and fixed-length adjacent partitioning in comparison with other partitioning schemes such as pseudorandom partitioning. Simulation results with different modulation and partitioning scenarios showed that fixed-length adjacent partitioning performed better than variable-length adjacent partitioning. As expected, simulation results showed slightly better performance for the pseudorandom partitioning technique than for the fixed- and variable-length adjacent partitioning schemes. However, as the pseudorandom technique incurs high computational complexity, adjacent partitioning schemes are still seen as favorable candidates for PAPR reduction.
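
    To make fixed-length adjacent partitioning in a PTS scheme concrete (a sketch with illustrative parameters: the subcarrier count, number of sub-blocks, and phase-factor alphabet below are assumptions, not those of the paper), the code splits an OFDM symbol into contiguous sub-blocks, searches the phase factors exhaustively, and reports the best PAPR:

    import numpy as np
    from itertools import product

    def papr_db(x):
        """Peak-to-average power ratio of a time-domain signal, in dB."""
        p = np.abs(x) ** 2
        return 10.0 * np.log10(p.max() / p.mean())

    def pts_adjacent(X, V=4, phases=(1, -1, 1j, -1j)):
        """Fixed-length adjacent PTS: split the N subcarriers into V contiguous
        blocks, IFFT each block once, then search phase combinations."""
        N = len(X)
        blocks = np.array_split(np.arange(N), V)   # adjacent (contiguous) sub-blocks
        sub_time = []
        for idx in blocks:
            Xv = np.zeros(N, dtype=complex)
            Xv[idx] = X[idx]
            sub_time.append(np.fft.ifft(Xv))
        best_papr, best_b = np.inf, None
        for b in product(phases, repeat=V):        # exhaustive phase search
            x = sum(bv * sv for bv, sv in zip(b, sub_time))
            p = papr_db(x)
            if p < best_papr:
                best_papr, best_b = p, b
        return best_papr, best_b

    # Example: one random QPSK OFDM symbol with 64 subcarriers.
    rng = np.random.default_rng(1)
    X = rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j], size=64)
    print("original PAPR %.2f dB" % papr_db(np.fft.ifft(X)))
    print("PTS PAPR      %.2f dB" % pts_adjacent(X)[0])

    A pseudorandom variant differs only in how the subcarrier indices are assigned to blocks, which is why its extra cost lies in bookkeeping and side information rather than in the phase search itself.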

  9. Flexible ordering of antibody class switch and V(D)J joining during B-cell ontogeny

    PubMed Central

    Kumar, Satyendra; Wuerffel, Robert; Achour, Ikbel; Lajoie, Bryan; Sen, Ranjan; Dekker, Job; Feeney, Ann J.; Kenter, Amy L.

    2013-01-01

    V(D)J joining is mediated by RAG recombinase during early B-lymphocyte development in the bone marrow (BM). Activation-induced deaminase initiates isotype switching in mature B cells of secondary lymphoid structures. Previous studies questioned the strict ontological partitioning of these processes. We show that pro-B cells undergo robust switching to a subset of immunoglobulin H (IgH) isotypes. Chromatin studies reveal that in pro-B cells, the spatial organization of the Igh locus may restrict switching to this subset of isotypes. We demonstrate that in the BM, V(D)J joining and switching are interchangeably inducible, providing an explanation for the hyper-IgE phenotype of Omenn syndrome. PMID:24240234

  10. 43 CFR 2094.1 - Methods of measuring; restrictions.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    43 Public Lands: Interior 2 (2013-10-01) ... Resource Values; Shore Space § 2094.1 Methods of measuring; restrictions. (a) In the consideration of... (b) The same method of measuring shore space will be used in the case of special surveys, where legal...

  11. 43 CFR 2094.1 - Methods of measuring; restrictions.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    43 Public Lands: Interior 2 (2012-10-01) ... Resource Values; Shore Space § 2094.1 Methods of measuring; restrictions. (a) In the consideration of... (b) The same method of measuring shore space will be used in the case of special surveys, where legal...

  12. Three list scheduling temporal partitioning algorithm of time space characteristic analysis and compare for dynamic reconfigurable computing

    NASA Astrophysics Data System (ADS)

    Chen, Naijin

    2013-03-01

    Level Based Partitioning (LBP), Cluster Based Partitioning (CBP), and Enhanced Static List (ESL) temporal partitioning algorithms based on the adjacency matrix and adjacency list are designed and implemented in this paper. The partitioning time and memory occupation of the three algorithms are also compared. Experimental results show that the LBP algorithm has the shortest partitioning time and better parallelism; with respect to memory occupation and partitioning time, the algorithms based on the adjacency list have shorter partitioning times and lower memory occupation.
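
    As a minimal sketch of what a level-based temporal partition looks like (an assumed adjacency-list representation of the task graph; this is an illustration of the general idea, not the paper's implementation, and it ignores resource constraints), each node is assigned its ASAP level by topological traversal and nodes of equal level form one temporal partition:

    from collections import defaultdict, deque

    def level_based_partition(adj):
        """adj: dict node -> list of successor nodes (a DAG).
        Returns dict level -> list of nodes, grouping nodes by ASAP level."""
        indeg = defaultdict(int)
        nodes = set(adj)
        for u, succs in adj.items():
            nodes.update(succs)
            for v in succs:
                indeg[v] += 1
        level = {u: 0 for u in nodes if indeg[u] == 0}
        queue = deque(level)
        while queue:
            u = queue.popleft()
            for v in adj.get(u, []):
                level[v] = max(level.get(v, 0), level[u] + 1)
                indeg[v] -= 1
                if indeg[v] == 0:
                    queue.append(v)
        parts = defaultdict(list)
        for u, lv in level.items():
            parts[lv].append(u)
        return dict(parts)

    # Example task graph: a -> b, a -> c, b -> d, c -> d.
    print(level_based_partition({"a": ["b", "c"], "b": ["d"], "c": ["d"]}))
    # {0: ['a'], 1: ['b', 'c'], 2: ['d']}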

  13. Instantons on ALE spaces and orbifold partitions

    NASA Astrophysics Data System (ADS)

    Dijkgraaf, Robbert; Sułkowski, Piotr

    2008-03-01

    We consider N = 4 theories on ALE spaces of A_{k-1} type. As is well known, their partition functions coincide with A_{k-1} affine characters. We show that these partition functions are equal to the generating functions of some peculiar classes of partitions which we introduce under the name 'orbifold partitions'. These orbifold partitions turn out to be related to the generalized Frobenius partitions introduced by G. E. Andrews some years ago. We relate the orbifold partitions to the blended partitions and interpret them explicitly in terms of a free fermion system.
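
    For orientation only (standard background, not a result of the paper), generating functions of this kind refine the classical Euler product for ordinary partitions,

    \[
      \sum_{n \ge 0} p(n)\, q^{n} = \prod_{k \ge 1} \frac{1}{1 - q^{k}} ,
    \]

    with restricted or colored variants obtained by modifying which parts, multiplicities, or colorings are admitted in the product.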

  14. Venous tree separation in the liver: graph partitioning using a non-ising model.

    PubMed

    O'Donnell, Thomas; Kaftan, Jens N; Schuh, Andreas; Tietjen, Christian; Soza, Grzegorz; Aach, Til

    2011-01-01

    Entangled tree-like vascular systems are commonly found in the body (e.g., in the peripheries and lungs). Separation of these systems in medical images may be formulated as a graph partitioning problem given an imperfect segmentation and specification of the tree roots. In this work, we show that the ubiquitous Ising-model approaches (e.g., Graph Cuts, Random Walker) are not appropriate for tackling this problem and propose a novel method based on recursive minimal paths for doing so. To motivate our method, we focus on the intertwined portal and hepatic venous systems in the liver. Separation of these systems is critical for liver intervention planning, in particular when resection is involved. We apply our method to 34 clinical datasets, each containing well over a hundred vessel branches, demonstrating its effectiveness.
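
    A minimal sketch of one ingredient of such a separation (root-based labeling by shortest-path cost on the vessel graph; this is a simplification for illustration, not the authors' recursive minimal-path algorithm): each vessel node is assigned to the root, portal or hepatic, that reaches it with the lowest accumulated edge cost.

    import heapq

    def label_by_nearest_root(adj, roots):
        """adj: dict node -> list of (neighbor, edge_cost); roots: dict root -> label.
        Multi-source Dijkstra: each node gets the label of the cheapest-reaching root."""
        dist, label = {}, {}
        heap = [(0.0, r, lab) for r, lab in roots.items()]
        heapq.heapify(heap)
        while heap:
            d, u, lab = heapq.heappop(heap)
            if u in dist:                  # already settled by a cheaper root
                continue
            dist[u], label[u] = d, lab
            for v, w in adj.get(u, []):
                if v not in dist:
                    heapq.heappush(heap, (d + w, v, lab))
        return label

    # Toy vessel graph: two roots competing for a shared branch point "x".
    adj = {
        "portal_root": [("x", 2.0)],
        "hepatic_root": [("x", 3.0), ("y", 1.0)],
        "x": [("portal_root", 2.0), ("hepatic_root", 3.0), ("z", 1.0)],
        "y": [("hepatic_root", 1.0)],
        "z": [("x", 1.0)],
    }
    print(label_by_nearest_root(adj, {"portal_root": "portal", "hepatic_root": "hepatic"}))

    Edge costs derived from vessel radius or image intensity would play the role of the weights here; the paper's contribution is precisely in how such path costs are defined and applied recursively.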

  15. A study of methods of prediction and measurement of the transmission sound through the walls of light aircraft

    NASA Technical Reports Server (NTRS)

    Forssen, B.; Wang, Y. S.; Crocker, M. J.

    1981-01-01

    Several aspects were studied. SEA theory was used to develop a theoretical model to predict the transmission loss through an aircraft window. This work mainly consisted of writing two computer programs. One program predicts the sound transmission through a plexiglass window (the case of a single partition). The other applies to a plexiglass window with a window shade added (the case of a double partition with an air gap). The sound transmission through a structure was measured in experimental studies using several different methods so that the accuracy and complexity of the methods could be compared. The measurements were conducted on a simple model of a fuselage (a cylindrical shell), on a real aircraft fuselage, and on stiffened panels.
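
    For context (standard single-partition background, not the SEA model developed in this work), the field-incidence mass law gives a first estimate of the transmission loss of a single limp panel of surface mass m (in kg/m²) at frequency f (in Hz), well below the coincidence frequency:

    \[
      \mathrm{TL} \approx 20 \log_{10}(m f) - 47 \ \text{dB} .
    \]

    Double partitions with an air gap, such as the window-plus-shade case, can exceed this single-panel estimate above the mass-air-mass resonance, which is what the second program is intended to capture.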

  16. A study of methods of prediction and measurement of the transmission sound through the walls of light aircraft

    NASA Astrophysics Data System (ADS)

    Forssen, B.; Wang, Y. S.; Crocker, M. J.

    1981-12-01

    Several aspects were studied. SEA theory was used to develop a theoretical model to predict the transmission loss through an aircraft window. This work mainly consisted of writing two computer programs. One program predicts the sound transmission through a plexiglass window (the case of a single partition). The other applies to a plexiglass window with a window shade added (the case of a double partition with an air gap). The sound transmission through a structure was measured in experimental studies using several different methods so that the accuracy and complexity of the methods could be compared. The measurements were conducted on a simple model of a fuselage (a cylindrical shell), on a real aircraft fuselage, and on stiffened panels.

  17. Anharmonic effects in the quantum cluster equilibrium method

    NASA Astrophysics Data System (ADS)

    von Domaros, Michael; Perlt, Eva

    2017-03-01

    The well-established quantum cluster equilibrium (QCE) model provides a statistical thermodynamic framework for applying high-level ab initio calculations of finite cluster structures to macroscopic liquid phases via the partition function. So far, the harmonic approximation has been applied throughout the calculations. In this article, we apply an important correction in the evaluation of the one-particle partition function and account for anharmonicity. To this end, we implemented an analytical approximation to the Morse partition function and the derivatives of its logarithm with respect to temperature, which are required for the evaluation of thermodynamic quantities. This anharmonic QCE approach has been applied to liquid hydrogen chloride; cluster distributions, the molar volume, the volumetric thermal expansion coefficient, and the isobaric heat capacity have been calculated. An improved description of all properties is observed if anharmonic effects are considered.
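
    To make the anharmonic correction concrete (a sketch under textbook assumptions: a single Morse oscillator summed directly over its bound levels, rather than the paper's analytical approximation; the spectroscopic constants below are approximate literature values for HCl and are not taken from the paper), one can compare the harmonic and Morse vibrational partition functions:

    import numpy as np

    KB_CM = 0.695035  # Boltzmann constant in cm^-1 per K

    def q_harmonic(we, T):
        """Harmonic vibrational partition function (energy zero at the ground level)."""
        return 1.0 / (1.0 - np.exp(-we / (KB_CM * T)))

    def q_morse(we, wexe, T):
        """Morse vibrational partition function: direct sum over the bound levels
        E_n = we*(n+1/2) - wexe*(n+1/2)^2, measured from the ground level E_0."""
        n_max = int(np.floor(we / (2.0 * wexe) - 0.5))   # highest bound level
        n = np.arange(n_max + 1)
        E = we * (n + 0.5) - wexe * (n + 0.5) ** 2
        return np.exp(-(E - E[0]) / (KB_CM * T)).sum()

    # Approximate spectroscopic constants for HCl (cm^-1), illustrative only.
    we, wexe, T = 2990.9, 52.8, 300.0
    print("harmonic q_vib:", q_harmonic(we, T))
    print("Morse q_vib:   ", q_morse(we, wexe, T))

    The temperature derivatives of ln q, which enter the internal energy and heat capacity, differ between the two forms, which is why the anharmonic treatment changes the predicted thermodynamic properties.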

  18. Lossless medical image compression using geometry-adaptive partitioning and least square-based prediction.

    PubMed

    Song, Xiaoying; Huang, Qijun; Chang, Sheng; He, Jin; Wang, Hao

    2018-06-01

    To improve compression rates for lossless compression of medical images, an efficient algorithm based on irregular segmentation and region-based prediction is proposed in this paper. Considering that the first step of a region-based compression algorithm is segmentation, this paper proposes a hybrid method that combines geometry-adaptive partitioning and quadtree partitioning to achieve adaptive irregular segmentation of medical images. Then, least-square (LS)-based predictors are adaptively designed for each region (regular subblock or irregular subregion). The proposed adaptive algorithm not only exploits the spatial correlation between pixels but also utilizes local structure similarity, resulting in efficient compression performance. Experimental results show that the average compression performance of the proposed algorithm is 10.48, 4.86, 3.58, and 0.10% better than that of JPEG 2000, CALIC, EDP, and JPEG-LS, respectively.
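
    As a sketch of region-wise least-squares prediction (a generic causal-neighbor predictor fitted per block; the neighbor set, block handling, and toy image are illustrative and do not reproduce the paper's geometry-adaptive segmentation), a predictor is fitted inside one region and its residuals, which are what would be entropy-coded, are returned:

    import numpy as np

    def ls_predict_block(block):
        """Fit a least-squares predictor from the causal neighbors W, N, NW, NE of
        each pixel inside one region/block, then return the prediction residuals."""
        H, W_ = block.shape
        feats, targets = [], []
        for y in range(1, H):
            for x in range(1, W_ - 1):
                feats.append([block[y, x - 1],      # W
                              block[y - 1, x],      # N
                              block[y - 1, x - 1],  # NW
                              block[y - 1, x + 1]]) # NE
                targets.append(block[y, x])
        A = np.asarray(feats, dtype=float)
        b = np.asarray(targets, dtype=float)
        coef, *_ = np.linalg.lstsq(A, b, rcond=None)   # per-block LS predictor
        residuals = b - A @ coef
        return coef, residuals

    # Toy 16x16 block with a smooth gradient plus noise (illustrative).
    rng = np.random.default_rng(0)
    yy, xx = np.mgrid[0:16, 0:16]
    block = 2.0 * xx + 3.0 * yy + rng.normal(0, 1, (16, 16))
    coef, res = ls_predict_block(block)
    print("predictor coefficients:", np.round(coef, 3))
    print("residual std vs. block std:", res.std(), block.std())

    Fitting a separate predictor per segmented region is what lets the scheme adapt to local structure instead of using one global predictor.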

  19. Differential partition of virulent Aeromonas salmonicida and attenuated derivatives possessing specific cell surface alterations in polymer aqueous-phase systems

    NASA Technical Reports Server (NTRS)

    Van Alstine, J. M.; Trust, T. J.; Brooks, D. E.

    1986-01-01

    Two-polymer aqueous-phase systems in which partitioning of biological matter between the phases occurs according to surface properties such as hydrophobicity, charge, and lipid composition are used to compare the surface properties of strains of the fish pathogen Aeromonas salmonicida. The differential ability of strains to produce a surface protein array crucial to their virulence, the A layer, and to produce smooth lipopolysaccharide is found to be important in the partitioning behavior of Aeromonas salmonicida. The presence of the A layer is shown to decrease the surface hydrophilicity of the pathogen, and to increase specifically its surface affinity for fatty acid esters of polyethylene glycol. The method has application to the analysis of surface properties crucial to bacterial virulence, and to the selection of strains and mutants with specific surface characteristics.

  20. Spatial-temporal causal modeling: a data centric approach to climate change attribution (Invited)

    NASA Astrophysics Data System (ADS)

    Lozano, A. C.

    2010-12-01

    Attribution of climate change has been predominantly based on simulations using physical climate models. These approaches rely heavily on the employed models and are thus subject to their shortcomings. Given the physical models' limitations in describing the complex climate system, we propose an alternative approach to climate change attribution that is data centric in the sense that it relies on actual measurements of climate variables and human and natural forcing factors. We present a novel class of methods to infer causality from spatial-temporal data, as well as a procedure to incorporate extreme value modeling into our methodology in order to address the attribution of extreme climate events. We develop a collection of causal modeling methods using spatio-temporal data that combine graphical modeling techniques with the notion of Granger causality. “Granger causality” is an operational definition of causality from econometrics, based on the premise that if a variable causally affects another, then past values of the former should be helpful in predicting future values of the latter. In its basic version, our methodology makes use of the spatial relationship between the various data points, but treats each location as being identically distributed and builds a unique causal graph that is common to all locations. A more flexible framework is then proposed that is less restrictive than having a single causal graph common to all locations, while avoiding the brittleness due to data scarcity that might arise if one were to independently learn a different graph for each location. The solution we propose can be viewed as finding a middle ground by partitioning the locations into subsets that share the same causal structures and pooling the observations from all the time series belonging to the same subset in order to learn more robust causal graphs. More precisely, we make use of relationships between locations (e.g., a neighboring relationship) by defining a relational graph in which related locations are connected (note that this relational graph, which represents relationships among the different locations, is distinct from the causal graph, which represents causal relationships among the individual variables, such as temperature and pressure, within a multivariate time series). We then define a hidden Markov Random Field (hMRF), assigning a hidden state to each node (location), with the state assignment guided by the prior information encoded in the relational graph. Nodes that share the same state in the hMRF model will have the same causal graph. State assignment can thus shed light on unknown relations among locations (e.g., teleconnections). While the model has been described in terms of hard location partitioning to facilitate its exposition, a soft partitioning is in fact maintained throughout learning. This leads to a form of transfer learning, which makes our model applicable even in situations where partitioning the locations might not seem appropriate. We first validate the effectiveness of our methodology on synthetic datasets, and then apply it to actual climate measurement data. The experimental results show that our approach offers a useful alternative to the simulation-based approach to climate modeling and attribution, and has the capability to provide valuable scientific insights from a new perspective.
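
    A minimal sketch of the pairwise Granger test underlying this framework (ordinary lagged regressions compared with an F-statistic; the variable names, lag order, and toy data are illustrative, and the paper's graphical, spatially pooled formulation is considerably richer):

    import numpy as np

    def lagged_design(series_list, p):
        """Stack p lags of each series into a design matrix, with an intercept column."""
        T = len(series_list[0])
        cols = [np.ones(T - p)]
        for s in series_list:
            for k in range(1, p + 1):
                cols.append(s[p - k:T - k])
        return np.column_stack(cols)

    def granger_f(x, y, p=2):
        """Does x Granger-cause y? Compare the restricted model (lags of y only)
        with the full model (lags of y and x) and form the usual F-statistic."""
        T = len(y)
        target = y[p:]
        X_r = lagged_design([y], p)          # restricted model
        X_f = lagged_design([y, x], p)       # full model
        rss = []
        for X in (X_r, X_f):
            beta, *_ = np.linalg.lstsq(X, target, rcond=None)
            rss.append(np.sum((target - X @ beta) ** 2))
        df1 = p
        df2 = (T - p) - X_f.shape[1]
        return ((rss[0] - rss[1]) / df1) / (rss[1] / df2)

    # Toy example: x drives y with a one-step lag.
    rng = np.random.default_rng(0)
    x = rng.normal(size=500)
    y = np.zeros(500)
    for t in range(1, 500):
        y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + 0.1 * rng.normal()
    print("F(x -> y):", granger_f(x, y))
    print("F(y -> x):", granger_f(y, x))

    In the spatial setting described above, such tests are not run independently per location; locations pooled into the same hMRF state share the lagged-regression evidence, which is what yields the more robust causal graphs.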
