Forgács, Bálint; Bohrn, Isabel; Baudewig, Jürgen; Hofmann, Markus J; Pléh, Csaba; Jacobs, Arthur M
2012-11-15
The right hemisphere's role in language comprehension is supported by results from several neuropsychological and neuroimaging studies. Special interest surrounds right temporoparietal structures, which are thought to be involved in processing novel metaphorical expressions, primarily due to the coarse semantic coding of concepts. In this event-related fMRI experiment we aimed to assess the extent of semantic distance processing in the comprehension of figurative meaning, to clarify the role of the right hemisphere. Four categories of German noun-noun compound words were presented in a semantic decision task: a) conventional metaphors; b) novel metaphors; c) conventional literal; and d) novel literal expressions, controlled for length, frequency, imageability, arousal, and emotional valence. Conventional literal and metaphorical compounds increased BOLD signal change in right temporoparietal regions, suggesting combinatorial semantic processing, in line with the coarse semantic coding theory, but at odds with the graded salience hypothesis. Both novel literal and novel metaphorical expressions increased activity in left inferior frontal areas, presumably as a result of phonetic, morphosyntactic, and semantic unification processes, challenging predictions regarding right hemispheric involvement in processing unusual meanings. Meanwhile, both conventional and novel metaphorical expressions induced BOLD signal change in left-hemispheric regions, suggesting that even novel metaphor processing involves more than linking semantically distant concepts. Copyright © 2012 Elsevier Inc. All rights reserved.
2015-03-01
the current program has failed to meet targeted retention across communities while overpaying nearly $5,300,000 during FY-2013, according to Eric...Kelso. This thesis examines the potential improvements of applying uniform-price auction, Quality Adjusted Discount (QUAD), and Combinatorial Retention ...responses, we developed individual quality scores and reservation prices to apply three auction mechanisms to the retention goals and costs of the
NASA Astrophysics Data System (ADS)
Besold, Tarek R.; Kühnberger, Kai-Uwe; Plaza, Enric
2017-10-01
Concept blending - a cognitive process which allows for the combination of certain elements (and their relations) from originally distinct conceptual spaces into a new unified space combining these previously separate elements, and enables reasoning and inference over the combination - is taken as a key element of creative thought and combinatorial creativity. In this article, we summarise our work towards the development of a computational-level and algorithmic-level account of concept blending, combining approaches from computational analogy-making and case-based reasoning (CBR). We present the theoretical background, as well as an algorithmic proposal integrating higher-order anti-unification matching and generalisation from analogy with amalgams from CBR. The feasibility of the approach is then exemplified in two case studies.
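The analogy component mentioned above rests on computing a shared generalisation of two structures. A minimal sketch of that idea follows, under stated assumptions: this is plain first-order anti-unification (least general generalisation) with an invented tuple term encoding, not the higher-order anti-unification the article actually uses.

```python
# First-order anti-unification sketch: compute the least general
# generalisation of two terms. Terms are tuples ("f", arg1, ...) or atoms;
# generated variables are "?0", "?1", ... . Illustrative simplification only;
# the article's approach is higher-order and is not reproduced here.

def anti_unify(s, t, subst=None, counter=None):
    if subst is None:
        subst, counter = {}, [0]
    if s == t:
        return s
    if (isinstance(s, tuple) and isinstance(t, tuple)
            and s[0] == t[0] and len(s) == len(t)):
        return (s[0],) + tuple(anti_unify(a, b, subst, counter)
                               for a, b in zip(s[1:], t[1:]))
    key = (s, t)
    if key not in subst:          # the same mismatch maps to the same variable
        subst[key] = f"?{counter[0]}"
        counter[0] += 1
    return subst[key]

# Generalising two analogous facts yields a shared schema:
g = anti_unify(("does", "horse", "run"), ("does", "bird", "fly"))
print(g)  # ('does', '?0', '?1')
```

Repeated mismatches are mapped to the same variable, which is what makes the generalisation "least general" rather than collapsing everything to fresh variables.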
Rights-Based Education for South Asian Sponsored Wives in International Arranged Marriages
ERIC Educational Resources Information Center
Merali, Noorfarah
2008-01-01
The Family Class Category of Canada's Immigration Policy exists with the key objective of family unification. Among Canada's second largest immigrant group, the South Asians, the cultural practice of arranged marriage is applied across international borders, leading to spousal sponsorship. Existing research on South Asian sponsored wives suggests…
A Joint Prosodic Origin of Language and Music
Brown, Steven
2017-01-01
Vocal theories of the origin of language rarely make a case for the precursor functions that underlay the evolution of speech. The vocal expression of emotion is unquestionably the best candidate for such a precursor, although most evolutionary models of both language and speech ignore emotion and prosody altogether. I present here a model for a joint prosodic precursor of language and music in which ritualized group-level vocalizations served as the ancestral state. This precursor combined not only affective and intonational aspects of prosody, but also holistic and combinatorial mechanisms of phrase generation. From this common stage, there was a bifurcation to form language and music as separate, though homologous, specializations. This separation of language and music was accompanied by their (re)unification in songs with words. PMID:29163276
Consciousness and values in the quantum universe
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stapp, H.P.
1985-01-01
Application of quantum mechanical description to neurophysiological processes appears to provide for a natural unification of the physical and humanistic sciences. The categories of thought used to represent physical and psychical processes become united, and the mechanical conception of man created by classical physics is replaced by a profoundly different quantum conception. This revised image of man allows human values to be rooted in contemporary science.
Seven Modeling Perspectives on Teaching and Learning: Some Interrelations and Cognitive Effects
ERIC Educational Resources Information Center
Easley, J. A., Jr.
1977-01-01
The categories of models associated with the seven perspectives are designated as combinatorial models, sampling models, cybernetic models, game models, critical thinking models, ordinary language analysis models, and dynamic structural models. (DAG)
Learning to Predict Combinatorial Structures
NASA Astrophysics Data System (ADS)
Vembu, Shankar
2009-12-01
The major challenge in designing a discriminative learning algorithm for predicting structured data is to address the computational issues arising from the exponential size of the output space. Existing algorithms make different assumptions to ensure efficient, polynomial time estimation of model parameters. For several combinatorial structures, including cycles, partially ordered sets, permutations and other graph classes, these assumptions do not hold. In this thesis, we address the problem of designing learning algorithms for predicting combinatorial structures by introducing two new assumptions: (i) The first assumption is that a particular counting problem can be solved efficiently. The consequence is a generalisation of the classical ridge regression for structured prediction. (ii) The second assumption is that a particular sampling problem can be solved efficiently. The consequence is a new technique for designing and analysing probabilistic structured prediction models. These results can be applied to solve several complex learning problems including but not limited to multi-label classification, multi-category hierarchical classification, and label ranking.
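The first assumption's consequence builds on classical ridge regression, whose closed-form estimator can be sketched as follows. This shows only the plain vector-output estimator, not the thesis's structured-prediction generalisation; the data and regularisation value are illustrative.

```python
import numpy as np

# Classical ridge regression in closed form:
#   w = (X^T X + lam * I)^{-1} X^T y
# This is the baseline estimator that structured-prediction variants extend.

def ridge_fit(X, y, lam=1.0):
    d = X.shape[1]
    # Solve the regularised normal equations instead of forming the inverse.
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.01 * rng.normal(size=100)

w = ridge_fit(X, y, lam=1e-3)
print(np.round(w, 2))  # approximately [ 1.  -2.   0.5]
```

The regulariser `lam` keeps the normal equations well conditioned; in the structured setting the same quadratic objective is kept while the output space changes.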
An Investigation of Partizan Misère Games
NASA Astrophysics Data System (ADS)
Allen, Meghan Rose
2010-08-01
Combinatorial games are played under two different play conventions: normal play, where the last player to move wins, and misère play, where the last player to move loses. Combinatorial games are also classified into impartial positions and partizan positions, where a position is impartial if both players have the same available moves and partizan otherwise. Misère play games lack many of the useful calculational and theoretical properties of normal play games. Until Plambeck's indistinguishability quotient and misère monoid theory were developed in 2004, research on misère play games had stalled. This thesis investigates partizan combinatorial misère play games, by taking Plambeck's indistinguishability and misère monoid theory for impartial positions and extending it to partizan ones, as well as examining the difficulties in constructing a category of misère play games in a similar manner to Joyal's category of normal play games. This thesis succeeds in finding an infinite set of positions which each have finite misère monoid, examining conditions on positions for when * + * is equivalent to 0, finding a set of positions which have a Tweedledum-Tweedledee type strategy, and the two most important results of this thesis: giving necessary and sufficient conditions on a set of positions Upsilon such that the misère monoid of Upsilon is the same as the misère monoid of * and giving a construction theorem which builds all positions ξ such that the misère monoid of ξ is the same as the misère monoid of *.
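The normal-play/misère-play distinction in the opening sentence can be made concrete with a brute-force outcome solver for Nim-like heap positions. This is a small illustrative sketch only; the thesis's misère monoid and indistinguishability machinery is far beyond it.

```python
from functools import lru_cache

# Brute-force outcome of a Nim-like position under both play conventions:
# normal play (last player to move wins) and misère play (last player to
# move loses). A position is a sorted tuple of heap sizes; a move removes
# one or more tokens from a single heap.

def moves(pos):
    for i, h in enumerate(pos):
        for take in range(1, h + 1):
            yield tuple(sorted(pos[:i] + (h - take,) + pos[i + 1:]))

@lru_cache(maxsize=None)
def first_player_wins(pos, misere):
    opts = list(moves(pos))
    if not opts:
        # Mover cannot move: loses under normal play, wins under misère play.
        return misere
    # Win iff some move leads to a position the opponent loses from.
    return any(not first_player_wins(o, misere) for o in opts)

# (2, 2) is a first-player loss under both conventions; (1, 1) differs:
print(first_player_wins((2, 2), False), first_player_wins((2, 2), True))  # False False
print(first_player_wins((1, 1), False), first_player_wins((1, 1), True))  # False True
```

The `(1, 1)` example shows why misère theory cannot simply reuse normal-play values: the same position has different outcomes under the two conventions.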
Semantic Structures of One-Step Word Problems Involving Multiplication or Division.
ERIC Educational Resources Information Center
Schmidt, Siegbert; Weiser, Werner
1995-01-01
Proposes a four-category classification of semantic structures of one-step word problems involving multiplication and division: forming the n-th multiple of measures, combinatorial multiplication, composition of operators, and multiplication by formula. This classification is compatible with semantic structures of addition and subtraction word…
The Cause of Category-Based Distortions in Spatial Memory: A Distribution Analysis
ERIC Educational Resources Information Center
Sampaio, Cristina; Wang, Ranxiao Frances
2017-01-01
Recall of remembered locations reliably reflects a compromise between a target's true position and its region's prototypical position. The effect is quite robust, and a standard interpretation for these data is that the metric and categorical codings blend in a Bayesian combinatory fashion. However, there has been no direct experimental evidence…
Supersymmetry, Supergravity, and Unification
NASA Astrophysics Data System (ADS)
Nath, Pran
2016-12-01
Dedication; Preface; 1. A brief history of unification; 2. Gravitation; 3. Non-abelian gauge theory; 4. Spontaneous breaking of global and local symmetries; 5. The Standard Model; 6. Anomalies; 7. Effective Lagrangians; 8. Supersymmetry; 9. Grand unification; 10. MSSM Lagrangian; 11. N = 1 supergravity; 12. Coupling of supergravity with matter and gauge fields; 13. Supergravity grand unification; 14. Phenomenology of supergravity grand unification; 15. CP violation in supergravity unified theories; 16. Proton stability in supergravity unified theories; 17. Cosmology, astroparticle physics and SUGRA unification; 18. Extended supergravities and supergravities from superstrings; 19. Specialized topics; 20. The future of unification; 21. Appendices; 22. Notations, conventions, and formulae; 23. Physical constants; 24. List of books and reviews for further reading; Index.
Gravitational effects in models of grand unification
NASA Astrophysics Data System (ADS)
Reeb, David
Grand unified theories constitute an attractive idea bringing further coherence into our understanding of the fundamental forces of Nature beyond the well-accepted Standard Model. This dissertation contains a systematic study of the unification of gauge couplings associated with these forces in the presence of one or several effective dimension-5 operators c H G_μν G^μν / (4 M_Pl), which are induced into the grand unified theory through gravitational interactions at the Planck scale. These operators alter the usually assumed condition for gauge coupling unification and can, depending on the Higgs content H of the theory and on its vacuum expectation value, lead to grand unification in models other than commonly believed and at scales M_X significantly different than naively expected. After presenting a general framework to treat such effects, we compute, for the case of SU(5) and SO(10) unification groups, the associated group theory constants necessary for the study of concrete models. We investigate the size of these effects in non-supersymmetric unification models and find that there exist regions of natural Wilson coefficients c in parameter space that achieve successful unification of the gauge couplings, while easily satisfying the bounds on the unification scale coming from the non-observation of proton decay. Both of these requirements are widely assumed to be violated in non-supersymmetric models of grand unification, but, as we show, can be fulfilled due to the effects coming from gravitational dimension-5 operators. A comparison to supersymmetric unification models shows that their parameter space for successful grand unification is no more natural than the one for the non-supersymmetric models. The main conclusion of this dissertation is that fairly minimal unification models are possible, i.e., with small unification groups and without supersymmetric particles.
Whereas the observation of proton decay seems to be the only possible evidence for grand unification presently reachable, we should know within the next few years whether or not low-energy supersymmetry is realized in Nature. This dissertation includes previously published co-authored material.
SAT Encoding of Unification in EL
NASA Astrophysics Data System (ADS)
Baader, Franz; Morawska, Barbara
Unification in Description Logics has been proposed as a novel inference service that can, for example, be used to detect redundancies in ontologies. In a recent paper, we have shown that unification in EL is NP-complete, and thus of a complexity that is considerably lower than in other Description Logics of comparably restricted expressive power. In this paper, we introduce a new NP-algorithm for solving unification problems in EL, which is based on a reduction to satisfiability in propositional logic (SAT). The advantage of this new algorithm is, on the one hand, that it allows us to employ highly optimized state-of-the-art SAT solvers when implementing an EL-unification algorithm. On the other hand, this reduction provides us with a proof of the fact that EL-unification is in NP that is much simpler than the one given in our previous paper on EL-unification.
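For intuition about what a unification problem is, a classical Robinson-style syntactic unification sketch follows. Note the hedge: this is first-order term unification, not the paper's EL-unification over description-logic concept terms, and it does not reproduce the SAT reduction; the tuple term encoding is an assumption for illustration.

```python
# Robinson-style syntactic unification of first-order terms.
# Terms: ("f", arg1, ...) compound; uppercase strings are variables;
# lowercase strings are constants. Returns a substitution dict or None.

def is_var(t):
    return isinstance(t, str) and t[:1].isupper()

def walk(t, s):
    # Follow variable bindings in substitution s.
    while is_var(t) and t in s:
        t = s[t]
    return t

def occurs(v, t, s):
    t = walk(t, s)
    if t == v:
        return True
    return isinstance(t, tuple) and any(occurs(v, a, s) for a in t[1:])

def unify(a, b, s=None):
    s = dict(s or {})
    a, b = walk(a, s), walk(b, s)
    if a == b:
        return s
    if is_var(a):
        if occurs(a, b, s):   # occurs check prevents infinite terms
            return None
        s[a] = b
        return s
    if is_var(b):
        return unify(b, a, s)
    if (isinstance(a, tuple) and isinstance(b, tuple)
            and a[0] == b[0] and len(a) == len(b)):
        for x, y in zip(a[1:], b[1:]):
            s = unify(x, y, s)
            if s is None:
                return None
        return s
    return None

print(unify(("f", "X", "b"), ("f", "a", "Y")))  # {'X': 'a', 'Y': 'b'}
```

First-order unification like this is decidable in polynomial time; the interest of EL-unification is that it works modulo the equational semantics of concept constructors, which is what pushes the problem to NP.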
Moseley, Rachel L.; Pulvermüller, Friedemann
2014-01-01
Noun/verb dissociations in the literature defy interpretation due to the confound between lexical category and semantic meaning; nouns and verbs typically describe concrete objects and actions. Abstract words, pertaining to neither, are a critical test case: dissociations along lexical-grammatical lines would support models purporting lexical category as the principle governing brain organisation, whilst semantic models predict dissociation between concrete words but not abstract items. During fMRI scanning, participants read orthogonalised word categories of nouns and verbs, with or without concrete, sensorimotor meaning. Analysis of inferior frontal/insula, precentral and central areas revealed an interaction between lexical class and semantic factors with clear category differences between concrete nouns and verbs but not abstract ones. Though the brain stores the combinatorial and lexical-grammatical properties of words, our data show that topographical differences in brain activation, especially in the motor system and inferior frontal cortex, are driven by semantics and not by lexical class. PMID:24727103
NASA Astrophysics Data System (ADS)
Cho, Gi-Chol; Hagiwara, Kaoru
1998-02-01
String theory predicts the unification of the gauge couplings and gravity. The minimal supersymmetric Standard Model, however, gives the unification scale ~2×10^16 GeV, which is significantly smaller than the string scale ~5×10^17 GeV of the weak coupling heterotic string theory. We study the unification scale of the non-supersymmetric minimal Standard Model quantitatively at the two-loop level. We find that the unification scale should be at most ~4×10^16 GeV and the desired Kac-Moody level of the hypercharge coupling should be 1.33 ≲ k_Y ≲ 1.35.
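The kind of running behind these scale estimates can be illustrated at one loop (the abstract's analysis is two-loop). The input couplings and beta coefficients below are standard approximate Standard Model numbers with GUT-normalised U(1)_Y, not values taken from this paper.

```python
import math

# One-loop running of the SM gauge couplings:
#   1/alpha_i(mu) = 1/alpha_i(MZ) - b_i/(2*pi) * ln(mu/MZ)
# with approximate MZ-scale inputs and SM one-loop coefficients.

MZ = 91.19                                  # GeV
inv_alpha = {1: 59.0, 2: 29.6, 3: 8.45}     # approximate 1/alpha_i(MZ)
b = {1: 41 / 10, 2: -19 / 6, 3: -7}         # SM one-loop beta coefficients

def inv_alpha_at(i, mu):
    return inv_alpha[i] - b[i] / (2 * math.pi) * math.log(mu / MZ)

def crossing_scale(i, j):
    # Solve 1/alpha_i(mu) = 1/alpha_j(mu) for mu.
    t = 2 * math.pi * (inv_alpha[i] - inv_alpha[j]) / (b[i] - b[j])
    return MZ * math.exp(t)

# The pairwise crossings spread from roughly 10^13 to 10^17 GeV:
# the three SM couplings do not meet at a single point at one loop.
for pair in [(1, 2), (1, 3), (2, 3)]:
    print(pair, f"{crossing_scale(*pair):.2e} GeV")
```

The spread of pairwise crossing scales is exactly why threshold corrections, extra matter, or modified normalisations (as in the abstracts here) matter for achieving a single unification scale.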
International Scientific Terminology and Neologisms in the Course of Unification.
ERIC Educational Resources Information Center
Stoberski, Zygmunt
1978-01-01
Provides a list of international medical and pharmaceutical terminology in three stages of development: (1) established international terms; (2) neologisms in the course of unification; and (3) recent neologisms in the course of unification. (AM)
PUP: An Architecture to Exploit Parallel Unification in Prolog
1988-03-01
environment stacking model similar to the Warren Abstract Machine [23] since it has been shown to be superior to other known models (see [21]). The storage...execute in groups of independent operations. Unifications belonging to different groups may not overlap. Also unification operations belonging to the...since all parallel operations on the unification units must complete before any of the units can start executing the next group of parallel
Internationalization or globalization of higher education.
Rezaei, Habibolah; Yousefi, Alireza; Larijani, Bagher; Dehnavieh, Reza; Rezaei, Nima; Adibi, Peyman
2018-01-01
Studies about globalization and internationalization demonstrate different attitudes in explaining these concepts. Since there is no consensus among Iranian specialists about these concepts, the purpose of this study is to explain the concepts of internationalization and globalization in Iran. This study is a systematic review done in the first half of 2016. To explain the concepts of globalization and internationalization, articles in the Scientific Information Database, Magiran database, and Google Scholar were searched with keywords such as globalization, scientific exchange, international cooperation, curriculum exchange, student exchange, faculty exchange, multinational cooperation, transnational cooperation, and collaborative research. Articles used in this study were in Persian and were devoted to internationalization and globalization between 2001 and 2016. The criterion for discarding articles was duplication. As many as 180 Persian articles were found on this topic. After discarding repetitive articles, 64 remained. Among those, 39 articles mentioned the differences between globalization and internationalization. Definitions of globalization were categorized in four categories, including globalization, globalizing, globalization of higher education, and globalizing of higher education. Definitions about internationalization were categorized in five categories such as internationalization, internationalization of higher education, internationalization of the curriculum, internationalization of curriculum studies, and internationalization of curriculum profession. The spectrum of the globalization of higher education moves from dissonance and multipolarization to unification and single polarization of the world. One end of the spectrum, which is unification and single polarization of the world, is interpreted as globalization. The other side of the spectrum, which is dissonance and multipolarization, is interpreted as globalizing.
The definition of internalization is the same as that of globalizing. In other words, it is possible to say that internalization is similar to globalizing but different from globalization.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-25
... Proposed Information Collection to OMB; Family Unification Program (FUP) AGENCY: Office of the Chief... Paperwork Reduction Act. The Department is soliciting public comments on the subject proposal. Application for the Family Unification Program: Makes Housing Choice Vouchers available to eligible families to...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-02
... Proposed Information Collection to OMB Family Unification Program (FUP) AGENCY: Office of the Chief... Paperwork Reduction Act. The Department is soliciting public comments on the subject proposal. Application for the Family Unification Program: Makes Housing Choice Vouchers available to eligible families to...
NASA Astrophysics Data System (ADS)
Barger, V.; Jiang, Jing; Langacker, Paul; Li, Tianjun
We use a new approach to study string scale gauge coupling unification systematically, allowing both the possibility of noncanonical U(1)Y normalization and the existence of vector-like particles whose quantum numbers are the same as those of the Standard Model (SM) fermions and their Hermitian conjugates and the SM adjoint particles. We first give all the independent sets (Yi) of particles that can be employed to achieve SU(3)C and SU(2)L string scale gauge coupling unification and calculate their masses. Second, for a noncanonical U(1)Y normalization, we obtain string scale SU(3)C × SU(2)L × U(1)Y gauge coupling unification by choosing suitable U(1)Y normalizations for each of the Yi sets. Alternatively, for the canonical U(1)Y normalization, we achieve string scale gauge coupling unification by considering suitable combinations of the Yi sets or by introducing additional independent sets (Zi), that do not affect the SU(3)C × SU(2)L unification at tree level, and then choosing suitable combinations, one from the Yi sets and one from the Zi sets. We also briefly discuss string scale gauge coupling unification in models with higher Kac-Moody levels for SU(2)L or SU(3)C.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-04
.... This information collection will support research on the role of Family Unification Program vouchers in... Use: This information collection will support research on the role of Family Unification Program... agencies (PHA) that have an allotment of Family Unification program vouchers (n=300) to determine whether...
Anti-gravity: The key to 21st century physics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Noyes, H.P.
1993-01-01
The masses, coupling constants, and cosmological parameters obtained using our discrete and combinatorial physics based on discrimination between bit-strings indicate that we can achieve the unification of quantum mechanics with relativity which had become the goal of twentieth century physics. To broaden our case we show that limitations on measurement of the position and velocity of an individual massive particle observed in a colliding beam scattering experiment imply real, rational commutation relations between position and velocity. Prior to this limit being pushed down to quantum effects, the lower bound is set by the available technology, but is otherwise scale invariant. Replacing force by force per unit mass and force per unit charge allows us to take over the Feynman-Dyson proof of the Maxwell Equations and extend it to weak gravity. The crossing symmetry of the individual scattering processes when one or more particles are replaced by anti-particles predicts both Coulomb attraction (for charged particles) and a Newtonian repulsion between any particle and its anti-particle. Previous quantum results remain intact, and predict the expected relativistic fine structure and spin dependencies. Experimental confirmation of this anti-gravity prediction would inaugurate the physics of the twenty-first century.
Vector-like quarks and leptons, SU(5) ⊗ SU(5) grand unification, and proton decay
NASA Astrophysics Data System (ADS)
Lee, Chang-Hun; Mohapatra, Rabindra N.
2017-02-01
SU(5) ⊗ SU(5) provides a minimal grand unification scheme for fermions and gauge forces if there are vector-like quarks and leptons in nature. We explore the gauge coupling unification in a non-supersymmetric model of this type, and study its implications for proton decay. The properties of vector-like quarks and intermediate scales that emerge from coupling unification play a central role in suppressing proton decay. We find that in this model, the familiar decay mode p → e⁺π⁰ may have a partial lifetime within the reach of currently planned experiments.
Unification of gauge and Yukawa couplings
NASA Astrophysics Data System (ADS)
Abdalgabar, Ammar; Khojali, Mohammed Omer; Cornell, Alan S.; Cacciapaglia, Giacomo; Deandrea, Aldo
2018-01-01
The unification of gauge and top Yukawa couplings is an attractive feature of gauge-Higgs unification models in extra dimensions. This feature is usually considered difficult to obtain based on simple group theory analyses. We reconsider a minimal toy model including the renormalisation group running at one loop. Our results show that the gauge couplings unify asymptotically at high energies, and that this may result from the presence of a UV fixed point. The Yukawa coupling in our toy model is enhanced at low energies, showing that a genuine unification of gauge and Yukawa couplings may be achieved.
NASA Astrophysics Data System (ADS)
Jansen, Peter A.; Watter, Scott
2012-03-01
Connectionist language modelling typically has difficulty with syntactic systematicity, or the ability to generalise language learning to untrained sentences. This work develops an unsupervised connectionist model of infant grammar learning. Following the semantic bootstrapping hypothesis, the network distils word categories using a developmentally plausible infant-scale database of grounded sensorimotor conceptual representations, as well as a biologically plausible semantic co-occurrence activation function. The network then uses this knowledge to acquire an early benchmark clausal grammar using correlational learning, and further acquires separate conceptual and grammatical category representations. The network displays strongly systematic behaviour indicative of the general acquisition of the combinatorial systematicity present in the grounded infant-scale language stream, outperforms previous contemporary models that contain primarily noun and verb word categories, and successfully generalises broadly to novel untrained sensorimotor grounded sentences composed of unfamiliar nouns and verbs. Limitations as well as implications for later grammar learning are discussed.
A model of comprehensive unification
NASA Astrophysics Data System (ADS)
Reig, Mario; Valle, José W. F.; Vaquera-Araujo, C. A.; Wilczek, Frank
2017-11-01
Comprehensive - that is, gauge and family - unification using spinors has many attractive features, but it has been challenged to explain chirality. Here, by combining an orbifold construction with more traditional ideas, we address that difficulty. Our candidate model features three chiral families and leads to an acceptable result for quantitative unification of couplings. A potential target for accelerator and astronomical searches emerges.
SU(5) unification with TeV-scale leptoquarks
Cox, Peter; Kusenko, Alexander; Sumensari, Olcyr; ...
2017-03-07
It has previously been noted that SU(5) unification can be achieved via the simple addition of light scalar leptoquarks from two split 10 multiplets. We explore the parameter space of this model in detail and find that unification requires at least one leptoquark to have mass below ≈ 16 TeV. We point out that introducing splitting of the 24 allows the unification scale to be raised beyond 10^16 GeV, while a U(1) PQ symmetry can be imposed to forbid dangerous proton decay mediated by the light leptoquarks. The latest bounds from LHC searches are combined and we find that a leptoquark as light as 400 GeV is still permitted. Finally, we discuss the interesting possibility that the leptoquarks required for unification could also be responsible for the 2.6σ deviation observed in the ratio R_K at LHCb.
ERIC Educational Resources Information Center
Blankenship, Glen
This manual is designed to offer support for the instructional resources guides on "Germany since Unification." It provides the basis for a full-day inservice training session on the use of those materials. The format can be modified to meet the needs of leaders, audiences, and time frames. Using the materials developed by teachers and…
Candiano, Giovanni; Santucci, Laura; Petretto, Andrea; Lavarello, Chiara; Inglese, Elvira; Bruschi, Maurizio; Ghiggeri, Gian Marco; Boschetti, Egisto; Righetti, Pier Giorgio
2015-01-01
Combinatorial peptide ligand libraries (CPLLs) tend to bind complex molecules such as dyes due to their aromatic, heterocyclic, hydrophobic, and ionic nature, which may affect protein capture specificity. In this experimental work Alcian Blue 8GX, a positively charged phthalocyanine dye well known to bind to glycoproteins and glucosaminoglycans, was adsorbed on a chemically modified CPLL solid phase, and the behavior of the resulting conjugate was then investigated. The control and dye-adsorbed beads were used to harvest the human urinary proteome at physiological pH, resulting in a total of 1151 gene products identified after capture. Although the Alcian Blue-modified CPLL incremented the total protein capture by 115 species, it particularly enriched some families among the harvested proteins, such as glycoproteins and nucleotide-binding proteins. This study shows that it is possible, via the two combined harvest mechanisms, to drive the CPLL capture toward the enrichment of specific protein categories.
Dissection of combinatorial control by the Met4 transcriptional complex.
Lee, Traci A; Jorgensen, Paul; Bognar, Andrew L; Peyraud, Caroline; Thomas, Dominique; Tyers, Mike
2010-02-01
Met4 is the transcriptional activator of the sulfur metabolic network in Saccharomyces cerevisiae. Lacking DNA-binding ability, Met4 must interact with proteins called Met4 cofactors to target promoters for transcription. Two types of DNA-binding cofactors (Cbf1 and Met31/Met32) recruit Met4 to promoters and one cofactor (Met28) stabilizes the DNA-bound Met4 complexes. To dissect this combinatorial system, we systematically deleted each category of cofactor(s) and analyzed Met4-activated transcription on a genome-wide scale. We defined a core regulon for Met4, consisting of 45 target genes. Deletion of both Met31 and Met32 eliminated activation of the core regulon, whereas loss of Met28 or Cbf1 interfered with only a subset of targets that map to distinct sectors of the sulfur metabolic network. These transcriptional dependencies roughly correlated with the presence of Cbf1 promoter motifs. Quantitative analysis of in vivo promoter binding properties indicated that varying levels of cooperativity and interdependency exist between members of this combinatorial system. Cbf1 was the only cofactor to remain fully bound to target promoters under all conditions, whereas other factors exhibited different degrees of regulated binding in a promoter-specific fashion. Taken together, Met4 cofactors use a variety of mechanisms to allow differential transcription of target genes in response to various cues.
NASA Astrophysics Data System (ADS)
Popov, Oleg; White, G. A.
2017-10-01
Leptoquarks have been proposed as a possible explanation of anomalies in B̄ → D*τν̄ decays, the apparent anomalies in muon (g − 2) experiments, and a violation of lepton universality. Motivated by this, we examine other motivations for leptoquarks: radiatively induced neutrino masses in the presence of a discrete symmetry that prevents a tree-level see-saw mechanism, gauge coupling unification, and vacuum stability at least up to the unification scale. We present a new model for radiatively generating a neutrino mass which can significantly improve gauge coupling unification at one loop. We discuss this, and other models, in the light of recent work on flavour anomalies.
NASA Astrophysics Data System (ADS)
Wu, Chenglai; Liu, Xiaohong; Lin, Zhaohui; Rhoades, Alan M.; Ullrich, Paul A.; Zarzycki, Colin M.; Lu, Zheng; Rahimi-Esfarjani, Stefan R.
2017-10-01
The reliability of climate simulations and projections, particularly in regions with complex terrain, is greatly limited by model resolution. In this study we evaluate the variable-resolution Community Earth System Model (VR-CESM) with a high-resolution (0.125°) refinement over the Rocky Mountain region. The VR-CESM results are compared with observations, as well as with a CESM simulation at a quasi-uniform 1° resolution (UNIF) and a Canadian Regional Climate Model version 5 (CRCM5) simulation at a 0.11° resolution. We find that VR-CESM is effective at capturing the observed spatial patterns of temperature, precipitation, and snowpack in the Rocky Mountains, with performance comparable to CRCM5, while UNIF is unable to do so. VR-CESM and CRCM5 simulate the seasonal variations of precipitation better than UNIF, although VR-CESM still overestimates winter precipitation whereas CRCM5 and UNIF underestimate it. All simulations distribute more winter precipitation along the windward (west) flanks of mountain ridges, with the greatest overestimation in VR-CESM. VR-CESM simulates much greater snow water equivalent peaks than CRCM5 and UNIF, although the peaks are still 10-40% less than observations. Moreover, the frequency of heavy precipitation events (daily precipitation ≥ 25 mm) in VR-CESM and CRCM5 is comparable to observations, whereas the same events in UNIF are an order of magnitude less frequent. In addition, VR-CESM captures the observed occurrence frequency and seasonal variation of rain-on-snow days and performs better than UNIF and CRCM5. These results demonstrate VR-CESM's capability in regional climate modeling over mountainous regions and its promising applications for climate change studies.
Grand unification and low scale implications: D2 parity for unification and neutrino masses
NASA Astrophysics Data System (ADS)
Tavartkiladze, Zurab
2014-06-01
The Grand Unified SU(5)-SU(5)' model, augmented with D2 parity, is considered. The latter plays a crucial role for phenomenology. The model has several novel properties and gives interesting phenomenological implications. The charged leptons, together with right-handed (or sterile) neutrinos, emerge as composite states. Within the considered scenario, we study charged fermion and neutrino mass generation. Moreover, we show that the model gives successful gauge coupling unification.
German Perceptions of the United States at Unification
1991-01-01
Country        +5 to −5   No comment (%)   Yes (%)   No (%)   No comment (%)
France         +2.8       A                9         87       4
Austria        +3.3       3                20        76       4
Soviet Union   +1.3       3                36        61       4
Italy          +1.8       4                6         89       5
Poland         +0.1       4                58        39       3
USA            +1.6       5                1         95       4
Sweden         +2.8       4                3         92       5
Cuba           +0.3       4                1         94       5
England        +2.0       4                2         93       5
Hungary        +2.0       3                38        59       3
Question 2: "I'd like you to evaluate several countries from different standpoints. There are five categories that you can use to rate each country according to a numbering system."
Higgs boson, sparticle masses and neutralino Dark Matter in Yukawa unified models
NASA Astrophysics Data System (ADS)
Un, Cem Salih
This dissertation collects the results that we obtain for a class of Yukawa unified SO(10) grand unified theories with non-universal soft supersymmetry breaking (SSB) gaugino mass parameters. As has long been known, in contrast to its non-supersymmetric version, the supersymmetric SO(10) grand unified theory predicts Yukawa coupling unification as well as gauge coupling and matter field unification. The models considered in this thesis are assumed to be in the framework of gravity-mediated supersymmetry breaking, and boundary conditions among the SSB terms are set by the group theoretical structure and breaking patterns of SO(10) at the grand unification scale (MGUT). In addition, we assume universality in the SSB mass terms assigned to the sfermion generations. Since Yukawa coupling unification implies contradictory mass relations for the first two generations, we consider a model with a larger Higgs sector. In this case, we assume that the MSSM Higgs doublets solely reside in the 10-dimensional representation (10_H) of SO(10) and that the extra Higgs fields couple negligibly to the third generation sfermions in order to maintain Yukawa coupling unification for the third generation (when we mention Yukawa unification throughout this thesis, we mean Yukawa unification for the third family, a.k.a. t-b-τ Yukawa unification). First we consider a supersymmetric grand unified model in which SO(10) breaks into the MSSM via non-renormalizable dimension-5 operators involving non-singlet F-terms. In our case, we consider an F-term belonging to the 54-dimensional representation of SO(10); it develops a non-zero vacuum expectation value that non-trivially generates the SSB gaugino masses such that M1 : M2 : M3 = −1 : −3 : 2. We consider the case with μ, M1, M2 > 0 and M3 < 0, such that μM2 > 0 and μM3 < 0 always hold.
This model with non-universal and relative-sign gaugino masses has one less parameter than the standard approach to Yukawa coupling unification, obtained by setting the masses of the Higgs doublets equal to each other at MGUT. We also briefly show that Yukawa unification is possible with one less parameter still, if one considers a case in which all scalars of the MSSM, including the Higgs doublets, are assigned the same SSB mass term. In the case of relative-sign SSB mass terms, the gaugino mass relation forms a subspace of SU(4)c × SU(2)L × SU(2)R (4-2-2). Even though 4-2-2 does not require gauge coupling unification, if one assumes that 4-2-2 breaks into the MSSM at an energy scale ∼ MGUT, then it can accommodate gauge coupling unification as well as Yukawa unification. As a generalization of the previous model, 4-2-2 also results in a heavy spectrum for the colored particles (∼3 TeV). We conclude this thesis by considering the anomalous magnetic moment of the muon (muon g − 2). First, we examine the conditions necessary for consistency with the experimental measurements. Since the supersymmetric contribution to muon g − 2 scales as 1/M, where M is the mass of the sparticle running in the loop, the MSSM needs light smuons and gauginos (bino and wino), while the 125 GeV Higgs boson requires heavier spectra. To resolve this conflict, we consider a case in which the first two generations of sfermions are split from the third generation in their SSB mass terms. Similarly, the MSSM Higgs doublets have masses different from each other, while universality in the gaugino masses is retained. We show that our results can simultaneously be consistent with the 125 GeV Higgs boson and with muon g − 2 within a 1σ deviation from its theoretical value. (Abstract shortened by UMI.)
Comment on "SU(5) octet scalar at the LHC"
NASA Astrophysics Data System (ADS)
Doršner, Ilja
2015-06-01
I address the validity of results presented in [S. Khalil, S. Salem, and M. Allam, Phys. Rev. D 89, 095011 (2014)] with regard to unification of gauge couplings within a particular SU(5) framework. The scalar sector of the proposed SU(5) model contains one 5-dimensional, one 24-dimensional, and one 45-dimensional representation. The authors discuss one specific unification scenario that supports the case for the LHC accessible color octet scalar. I show that the unification analysis in question is based on (i) an erroneous assumption related to the issue of nucleon stability and (ii) an incorrect input for the applicable set of renormalization group equations. This, in my view, invalidates the aforementioned gauge coupling unification study. I also question a source of the fermion mass relations presented in that work.
Model-based object classification using unification grammars and abstract representations
NASA Astrophysics Data System (ADS)
Liburdy, Kathleen A.; Schalkoff, Robert J.
1993-04-01
The design and implementation of a high level computer vision system which performs object classification is described. General object labelling and functional analysis require models of classes which display a wide range of geometric variations. A large representational gap exists between abstract criteria such as `graspable' and current geometric image descriptions. The vision system developed and described in this work addresses this problem and implements solutions based on a fusion of semantics, unification, and formal language theory. Object models are represented using unification grammars, which provide a framework for the integration of structure and semantics. A methodology for the derivation of symbolic image descriptions capable of interacting with the grammar-based models is described and implemented. A unification-based parser developed for this system achieves object classification by determining if the symbolic image description can be unified with the abstract criteria of an object model. Future research directions are indicated.
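The core operation of such a parser, deciding whether a symbolic image description and an abstract model criterion can be made identical by binding variables, can be sketched as plain first-order term unification. The toy encoding below (tuples as compound terms, capitalized strings as variables) is an assumption for illustration, not the paper's actual grammar representation, and it omits the occurs check for brevity.

```python
def is_var(t):
    # Convention for this sketch: capitalized strings are variables.
    return isinstance(t, str) and t[:1].isupper()

def walk(t, subst):
    # Follow variable bindings to their current value.
    while is_var(t) and t in subst:
        t = subst[t]
    return t

def unify(a, b, subst=None):
    """Return a substitution unifying a and b, or None on failure.

    Omits the occurs check, as many practical unifiers do.
    """
    if subst is None:
        subst = {}
    a, b = walk(a, subst), walk(b, subst)
    if a == b:
        return subst
    if is_var(a):
        return {**subst, a: b}
    if is_var(b):
        return {**subst, b: a}
    if isinstance(a, tuple) and isinstance(b, tuple) and len(a) == len(b):
        # Compound terms unify element-wise, threading the substitution.
        for x, y in zip(a, b):
            subst = unify(x, y, subst)
            if subst is None:
                return None
        return subst
    return None

# A hypothetical abstract criterion unifying with an image description:
s = unify(("graspable", "Object"), ("graspable", ("handle", "cup")))
# s == {"Object": ("handle", "cup")}
```

Classification then amounts to checking whether `unify` succeeds (returns a substitution) for some object model; distinct structures such as `unify("a", "b")` fail and return `None`.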
Effect of the Implicit Combinatorial Model on Combinatorial Reasoning in Secondary School Pupils.
ERIC Educational Resources Information Center
Batanero, Carmen; And Others
1997-01-01
Elementary combinatorial problems may be classified into three different combinatorial models: (1) selection; (2) partition; and (3) distribution. The main goal of this research was to determine the effect of the implicit combinatorial model on pupils' combinatorial reasoning before and after instruction. Gives an analysis of variance of the…
Unity of quark and lepton interactions with symplectic gauge symmetry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rajpoot, S.
1982-07-01
Properties of symplectic groups are reviewed and the gauge structure of Sp(2n) derived. The electroweak unification of leptons within Sp(8) gauge symmetry and grand unification of quarks and leptons within Sp(10) gauge symmetry are discussed.
Coverage of Unification Issue in North and South Korean Papers.
ERIC Educational Resources Information Center
Chang, Won H.
1981-01-01
Concludes that while the North Korean press gives more space to stories concerning the unification of the two Koreas than does the South, it is also more hostile and less varied in its coverage of the issue. (FL)
Karim, A.K.M. Rezaul; Proulx, Michael J.; Likova, Lora T.
2016-01-01
Reviewing the relevant literature in visual psychophysics and visual neuroscience, we propose a three-stage model of directionality bias in visuospatial functioning. We call this model the 'Perception-Action-Laterality' (PAL) hypothesis. We analyzed the research findings for a wide range of visuospatial tasks, showing that there are two major directionality trends: clockwise versus anticlockwise. These preferences appear to be combinatorial, such that a majority of people fall into the first category, demonstrating a preference for stimuli/objects arranged from left-to-right rather than from right-to-left, while people in the second category show the opposite trend. These perceptual biases can guide sensorimotor integration and action, creating two corresponding turner groups in the population. In support of PAL, we propose a second model explaining the origins of the biases: how neurogenetic factors and cultural factors interact in a biased competition framework to determine the direction and extent of the biases. This dynamic model can explain not only the two major categories of biases, but also the unbiased, unreliably biased or mildly biased cases in visuospatial functioning. PMID:27350096
The role of the left anterior temporal lobe in semantic composition vs. semantic memory.
Westerlund, Masha; Pylkkänen, Liina
2014-05-01
The left anterior temporal lobe (LATL) is robustly implicated in semantic processing by a growing body of literature. However, these results have emerged from two distinct bodies of work, addressing two different processing levels. On the one hand, the LATL has been characterized as a 'semantic hub' that binds features of concepts across a distributed network, based on results from semantic dementia and hemodynamic findings on the categorization of specific compared to basic exemplars. On the other, the LATL has been implicated in combinatorial operations in language, as shown by increased activity in this region associated with the processing of sentences and of basic phrases. The present work aimed to reconcile these two literatures by independently manipulating combination and concept specificity within a minimal MEG paradigm. Participants viewed simple nouns that denoted either low specificity (fish) or high specificity categories (trout) presented in either combinatorial (spotted fish/trout) or non-combinatorial contexts (xhsl fish/trout). By combining these paradigms from the two literatures, we directly compared the engagement of the LATL in semantic memory vs. semantic composition. Our results indicate that although noun specificity subtly modulates the LATL activity elicited by single nouns, it most robustly affects the size of the composition effect when these nouns are adjectivally modified, with low specificity nouns eliciting a much larger effect. We conclude that these findings are compatible with an account in which the specificity and composition effects arise from a shared mechanism of meaning specification. Copyright © 2014 Elsevier Ltd. All rights reserved.
Das, Ravi; Bhattacharjee, Shatabdi; Patel, Atit A; Harris, Jenna M; Bhattacharya, Surajit; Letcher, Jamin M; Clark, Sarah G; Nanda, Sumit; Iyer, Eswar Prasad R; Ascoli, Giorgio A; Cox, Daniel N
2017-12-01
Transcription factors (TFs) have emerged as essential cell autonomous mediators of subtype specific dendritogenesis; however, the downstream effectors of these TFs remain largely unknown, as are the cellular events that TFs control to direct morphological change. As dendritic morphology is largely dictated by the organization of the actin and microtubule (MT) cytoskeletons, elucidating TF-mediated cytoskeletal regulatory programs is key to understanding molecular control of diverse dendritic morphologies. Previous studies in Drosophila melanogaster have demonstrated that the conserved TFs Cut and Knot exert combinatorial control over aspects of dendritic cytoskeleton development, promoting actin and MT-based arbor morphology, respectively. To investigate transcriptional targets of Cut and/or Knot regulation, we conducted systematic neurogenomic studies, coupled with in vivo genetic screens utilizing multi-fluor cytoskeletal and membrane marker reporters. These analyses identified a host of putative Cut and/or Knot effector molecules, and a subset of these putative TF targets converge on modulating dendritic cytoskeletal architecture, which are grouped into three major phenotypic categories, based upon neuromorphometric analyses: complexity enhancer, complexity shifter, and complexity suppressor. Complexity enhancer genes normally function to promote higher order dendritic growth and branching with variable effects on MT stabilization and F-actin organization, whereas complexity shifter and complexity suppressor genes normally function in regulating proximal-distal branching distribution or in restricting higher order branching complexity, respectively, with spatially restricted impacts on the dendritic cytoskeleton. Collectively, we implicate novel genes and cellular programs by which TFs distinctly and combinatorially govern dendritogenesis via cytoskeletal modulation. Copyright © 2017 by the Genetics Society of America.
Unification of Gauge Couplings in the E{sub 6}SSM
DOE Office of Scientific and Technical Information (OSTI.GOV)
Athron, P.; King, S. F.; Luo, R.
2010-02-10
We argue that in the two-loop approximation gauge coupling unification in the exceptional supersymmetric standard model (E{sub 6}SSM) can be achieved for any phenomenologically reasonable value of alpha{sub 3}(M{sub Z}) consistent with the experimentally measured central value.
Unification of Fundamental Forces
NASA Astrophysics Data System (ADS)
Salam, Abdus; Taylor, John C. (Foreword)
2005-10-01
Foreword John C. Taylor; 1. Unification of fundamental forces Abdus Salam; 2. History unfolding: an introduction to the two 1968 lectures by W. Heisenberg and P. A. M. Dirac Abdus Salam; 3. Theory, criticism, and a philosophy Werner Heisenberg; 4. Methods in theoretical physics Paul Adrian Maurice Dirac.
Exploring the Use of Enterprise Content Management Systems in Unification Types of Organizations
NASA Astrophysics Data System (ADS)
Izza Arshad, Noreen; Mehat, Mazlina; Ariff, Mohamed Imran Mohamed
2014-03-01
The aim of this paper is to better understand how highly standardized and integrated businesses, known as unification types of organizations, use Enterprise Content Management Systems (ECMS) to support their business processes. A multiple case study approach was used to study the ways two unification organizations use their ECMS in their daily work practices. Arising from these case studies are insights into the differing ways in which ECMS is used to support businesses. Based on a comparison of the two cases, this study proposes that unification organizations may use ECMS in four ways: (1) collaboration; (2) information sharing that supports a standardized process structure; (3) building custom workflows that support integrated and standardized processes; and (4) providing links and access to information systems. These findings may guide organizations that are highly standardized and integrated to achieve their intended ECMS use, to understand reasons for ECMS failures and underutilization, and to exploit technology investments.
NASA Astrophysics Data System (ADS)
Negoda, S. A.
2002-01-01
For the future legal regime of space activities it is vital to preserve the existing principles and main provisions of international space law. Related national legislations are developing rapidly and are becoming a serious instrument for the legal regulation of space activities, including projects with foreign party involvement. Quite often, partners in international space projects agree to choose the domestic law of one of them to define certain organizational and/or contractual issues of the project (dispute settlement, for example), and such practice is likely to spread widely. This could help to preserve important existing provisions of international space law (the responsibility of states for their national activities, for instance) and support the development of international space private law. We believe that special laws and regulations of national legislations alone cannot regulate modern space activities. Becoming more and more commercial, space activities are turning into a real part of "down to Earth" commercial activities; therefore, in many countries provisions of civil, commercial, investment and other branches of national law are applied to such activities, which could lower their possible risks and help to control them. Such unification seems to be suitable in the following fields: 1) implementation of provisions of international space law in national space laws; 2) definition of unified terminology, accepted by the national laws of all parties; 3) unification in national legislations of certain standards (insurance rates and rules, for instance); 4) unification in national laws of issues related to liability (for instance, a mutual waiver of liability in certain types of activities); 5) implementation in national laws of unified rules and procedures for the settlement of space-related commercial disputes; 6) unification of mechanisms for the protection of space-related intellectual property.
Special attention is paid to provisions of private law (including conflict-of-laws norms). Two points are emphasized: 1) the risk of conflicts between parties and national laws in light of the expanding application of national laws' provisions to space activities, and 2) that unification and further development of international space private law will help to maintain the authority of international public space law and to keep a proper hierarchy between these branches.
Explanatory Unification by Proofs in School Mathematics
ERIC Educational Resources Information Center
Komatsu, Kotaro; Fujita, Taro; Jones, Keith; Naoki, Sue
2018-01-01
Kitcher's idea of 'explanatory unification', while originally proposed in the philosophy of science, may also be relevant to mathematics education, as a way of enhancing student thinking and achieving classroom activity that is closer to authentic mathematical practice. There is, however, no mathematics education research treating explanatory…
Educational Systems and Rising Inequality: Eastern Germany after Unification
ERIC Educational Resources Information Center
von Below, Susanne; Powell, Justin J. W.; Roberts, Lance W.
2013-01-01
Educational systems considerably influence educational opportunities and the resulting social inequalities. Contrasting institutional regulations of both structures and contents, the authors present a typology of educational system types in Germany to analyze their effects on social inequality in eastern Germany after unification. After 1990, the…
The disadvantage of combinatorial communication.
Lachmann, Michael; Bergstrom, Carl T.
2004-01-01
Combinatorial communication allows rapid and efficient transfer of detailed information, yet combinatorial communication is used by few, if any, non-human species. To complement recent studies illustrating the advantages of combinatorial communication, we highlight a critical disadvantage. We use the concept of information value to show that deception poses a greater and qualitatively different threat to combinatorial signalling than to non-combinatorial systems. This additional potential for deception may represent a strategic barrier that has prevented widespread evolution of combinatorial communication. Our approach has the additional benefit of drawing clear distinctions among several types of deception that can occur in communication systems. PMID:15556886
Unifying electromagnetism and gravitation without curvature
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schuetze, D.
1985-10-01
This paper is devoted to a five-dimensional unification of the gravitational theory of Hayashi and Shirafuji with electromagnetism. Interference effects are found between the gravitational contributions of matter spin and electromagnetism. This unification reduces to the classical Kaluza-Klein theory if contributions of the torsion tensor associated with spin are neglected.
ERIC Educational Resources Information Center
Matthiessen, Christian; Kasper, Robert
Consisting of two separate papers, "Representational Issues in Systemic Functional Grammar," by Christian Matthiessen and "Systemic Grammar and Functional Unification Grammar," by Robert Kasper, this document deals with systemic aspects of natural language processing and linguistic theory and with computational applications of…
Vygotsky's Analysis of Children's Meaning Making Processes
ERIC Educational Resources Information Center
Mahn, Holbrook
2012-01-01
Vygotsky's work is extensive and covers many aspects of the development of children's meaning-making processes in social and cultural contexts. However, his main focus is on the examination of the unification of speaking and thinking processes. His investigation centers on the analysis of the entity created by this unification: an internal…
Corpus-Based Optimization of Language Models Derived from Unification Grammars
NASA Technical Reports Server (NTRS)
Rayner, Manny; Hockey, Beth Ann; James, Frankie; Bratt, Harry; Bratt, Elizabeth O.; Gawron, Mark; Goldwater, Sharon; Dowding, John; Bhagat, Amrita
2000-01-01
We describe a technique which makes it feasible to improve the performance of a language model derived from a manually constructed unification grammar, using low-quality untranscribed speech data and a minimum of human annotation. The method is evaluated on a medium-vocabulary spoken language command and control task.
Authenticity and Unification in Quechua Language Planning.
ERIC Educational Resources Information Center
Hornberger, Nancy H.; King, Kendall
1998-01-01
Examines the potentially problematic tension between the goals of authenticity and unification in Quechua-language planning. One case study examines the orthographic debate that arose in Peru, and the second case study concerns two indigenous communities in Saraguro in the Southern Ecuadorian highlands where Spanish predominates but two Quichua…
Experience with Acore: Implementing GHC with Actors
1990-10-01
Our unification algorithm must differentiate between the two and perform the unification only from Foo to bar, no matter which of the two initially...variable Y. We :passive-unify it to X. Since X is uninstantiated, it buffers the :passive-unify message. Now the guard goal, Y = ok, is executed, and the
Schizoaffective disorder--an ongoing challenge for psychiatric nosology.
Jäger, M; Haack, S; Becker, T; Frasch, K
2011-04-01
Schizoaffective disorder is a common diagnosis in mental health services. The present article aims to provide an overview of diagnostic reliability, symptomatology, outcome, neurobiology and treatment of schizoaffective disorder. Literature was identified by searches in "Medline" and "Cochrane Library". The diagnosis of schizoaffective disorder has a low reliability. There are marked differences between the current diagnostic systems. With respect to psychopathological symptoms, no clear boundaries were found between schizophrenia, schizoaffective disorder and affective disorders. Common neurobiological factors were found across the traditional diagnostic categories. Schizoaffective disorder according to ICD-10 criteria, but not to DSM-IV criteria, shows a more favorable outcome than schizophrenia. With regard to treatment, only a small and heterogeneous database exists. Due to the low reliability and questionable validity there is a substantial need for revision and unification of the current diagnostic concepts of schizoaffective disorder. If future diagnostic systems return to Kraepelin's dichotomous classification of non-organic psychosis or adopt a dimensional diagnostic approach, schizoaffective disorder will disappear from the psychiatric nomenclature. A nosological model with multiple diagnostic entities, however, would be compatible with retaining the diagnostic category of schizoaffective disorder. Copyright © 2010 Elsevier Masson SAS. All rights reserved.
Scott-Phillips, Thomas C; Blythe, Richard A
2013-11-06
In a combinatorial communication system, some signals consist of the combinations of other signals. Such systems are more efficient than equivalent, non-combinatorial systems, yet despite this they are rare in nature. Why? Previous explanations have focused on the adaptive limits of combinatorial communication, or on its purported cognitive difficulties, but neither of these explains the full distribution of combinatorial communication in the natural world. Here, we present a nonlinear dynamical model of the emergence of combinatorial communication that, unlike previous models, considers how initially non-communicative behaviour evolves to take on a communicative function. We derive three basic principles about the emergence of combinatorial communication. We hence show that the interdependence of signals and responses places significant constraints on the historical pathways by which combinatorial signals might emerge, to the extent that anything other than the most simple form of combinatorial communication is extremely unlikely. We also argue that these constraints can be bypassed if individuals have the socio-cognitive capacity to engage in ostensive communication. Humans, but probably no other species, have this ability. This may explain why language, which is massively combinatorial, is such an extreme exception to nature's general trend for non-combinatorial communication.
ERIC Educational Resources Information Center
Berggreen, Ingeborg
1990-01-01
Discusses consequences of European unification in the Federal Republic of Germany. Focuses on the relationships between the European Community, the federal government of Germany, and the German states. Suggests that the German states are aware of their responsibility to give education and culture a European dimension. (NL)
77 FR 14538 - Announcement of Funding Awards Family Unification Program (FUP) Fiscal Year (FY) 2010
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-12
... announcement contains the consolidated names and addresses of the award recipients for this year under the FUP.... Appendix A Fiscal Year 2010 Funding Awards for the Family Unification Program Recipient Address City State... DEPARTMENT OF HOUSING AND URBAN DEVELOPMENT [Docket No. 5415-FA-15] Announcement of Funding Awards...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-17
... Information Collection for Public Comment for the Family Unification Program (FUP) AGENCY: Office of the Assistant Secretary for Public and Indian Housing, HUD. ACTION: Notice. SUMMARY: The proposed information... Department is soliciting public comments on the subject proposal. DATES: Comments Due Date: February 15, 2011...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-26
... Information Collection for Public Comment for the Family Unification Program (FUP) AGENCY: Office of the Assistant Secretary for Public and Indian Housing, HUD. ACTION: Notice. SUMMARY: The proposed information... Department is soliciting public comments on the subject proposal. DATES: Comments Due Date: December 27, 2011...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-19
... DEPARTMENT OF STATE [Public Notice: 7017] Culturally Significant Object Imported for Exhibition Determinations: ``E Ku Ana Ka Paia: Unification, Responsibility and the Ku Images'' SUMMARY: Notice is hereby... object to be included in the exhibition ``E Ku Ana Ka Paia: Unification, Responsibility and the Ku Images...
NASA Astrophysics Data System (ADS)
Wells, James D.; Zhang, Zhengkang
2018-05-01
Dismissing traditional naturalness concerns while embracing the Higgs boson mass measurement and unification motivates careful analysis of trans-TeV supersymmetric theories. We take an effective field theory (EFT) approach, matching the Minimal Supersymmetric Standard Model (MSSM) onto the Standard Model (SM) EFT by integrating out heavy superpartners, and evolving MSSM and SMEFT parameters according to renormalization group equations in each regime. Our matching calculation is facilitated by the recent covariant diagrams formulation of functional matching techniques, with the full one-loop SUSY threshold corrections encoded in just 30 diagrams. Requiring consistent matching onto the SMEFT with its parameters (those in the Higgs potential in particular) measured at low energies, and in addition requiring unification of bottom and tau Yukawa couplings at the scale of gauge coupling unification, we detail the solution space of superpartner masses from the TeV scale to well above. We also provide detailed views of parameter space where Higgs coupling measurements have probing capability at future colliders beyond the reach of direct superpartner searches at the LHC.
Vosse, Theo; Kempen, Gerard
2009-12-01
We introduce a novel computer implementation of the Unification-Space parser (Vosse and Kempen in Cognition 75:105-143, 2000) in the form of a localist neural network whose dynamics is based on interactive activation and inhibition. The wiring of the network is determined by Performance Grammar (Kempen and Harbusch in Verb constructions in German and Dutch. Benjamins, Amsterdam, 2003), a lexicalist formalism with feature unification as binding operation. While the network is processing input word strings incrementally, the evolving shape of parse trees is represented in the form of changing patterns of activation in nodes that code for syntactic properties of words and phrases, and for the grammatical functions they fulfill. The system is capable, at least qualitatively and rudimentarily, of simulating several important dynamic aspects of human syntactic parsing, including garden-path phenomena and reanalysis, effects of complexity (various types of clause embeddings), fault-tolerance in case of unification failures and unknown words, and predictive parsing (expectation-based analysis, surprisal effects). English is the target language of the parser described.
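The interactive-activation dynamics the abstract describes can be sketched in miniature. The node names, parameters, and update rule below are invented for illustration and are not the Vosse-Kempen implementation: competing candidates receive external input, inhibit one another in proportion to rival activation, and decay passively.

```python
# Toy sketch (not the Unification-Space parser itself): interactive
# activation with lateral inhibition among competing attachment candidates.
def run_competition(inputs, steps=200, decay=0.1, inhibition=0.5, rate=0.1):
    """Iterate activation updates; competing nodes inhibit each other."""
    act = {name: 0.0 for name in inputs}
    for _ in range(steps):
        total = sum(act.values())
        new_act = {}
        for name, external in inputs.items():
            # excitation from input, inhibition from rivals, passive decay
            rivals = total - act[name]
            delta = rate * (external - inhibition * rivals) - decay * act[name]
            new_act[name] = min(1.0, max(0.0, act[name] + delta))
        act = new_act
    return act

# Two hypothetical candidate attachments; the better-supported one wins.
result = run_competition({"np_attach": 0.8, "vp_attach": 0.4})
```

Under this update rule the stronger candidate suppresses its rival, a qualitative analogue of the winner-take-all behaviour behind garden-path effects and reanalysis.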
ERIC Educational Resources Information Center
Duarte, Robert; Nielson, Janne T.; Dragojlovic, Veljko
2004-01-01
A group of techniques aimed at synthesizing a large number of structurally diverse compounds is called combinatorial synthesis. The synthesis of chemiluminescent esters by parallel combinatorial synthesis and by mix-and-split combinatorial synthesis is described as a laboratory experiment.
Building a Larger Tent for Public Health: Implications of the SOPHE-AAHE Unification
ERIC Educational Resources Information Center
Goodman, Robert Mark
2013-01-01
The unification of the American Association for Health Education (AAHE) and the Society for Public Health Education (SOPHE) generates a long-desired synergy, a ramping up of our leadership influence in promoting health. It also serves as an ongoing opportunity to reflect on how we synergize the distinct philosophic, scientific, and practical…
Principles of the Unification of Our Agency
ERIC Educational Resources Information Center
Roth, Klas
2011-01-01
Do we need principles of the unification of our agency, our mode of acting? Immanuel Kant and Christine Korsgaard argue that the reflective structure of our mind forces us to have some conception of ourselves, others and the world--including our agency--and that it is through will and reason, and in particular principles of our agency, that we…
A Solution to Moldova’s Transdniestrian Conflict: Regional Complex Interdependence
2003-06-01
a. Reunion with Romania in 1918 and the Transdniestrian Factor...Russia (Soviet Union) and Romania for Bessarabia (MSSR), when Transdniestria was used as a psychological check factor. a. Reunion with Romania in 1918...fate of Bessarabia, resulting in unification with Romania in 1918. However, that unification was not a welcome outcome for Bessarabians but rather
Lyndon, Johnson "Will" Seek and Accept a New Term as Northern Vermont University
ERIC Educational Resources Information Center
Spaulding, Jeb
2017-01-01
The Vermont State Colleges Board of Trustees has decided to combine two small state colleges--Lyndon State College and Johnson State College--that have been the access institutions for the northern and often most economically challenged region of Vermont. Northern Vermont University will result from the unification. The unification will multiply…
Assessing Structural Understanding in Children's Combinatorial Problem Solving.
ERIC Educational Resources Information Center
English, Lyn
1999-01-01
Assesses children's structural understanding of combinatorial problems when presented in a variety of task situations. Provides an explanatory model of students' combinatorial understandings that informs teaching and assessment. Addresses several components of children's structural understanding of elementary combinatorial problems. (Contains 50…
Hypergraph-Based Combinatorial Optimization of Matrix-Vector Multiplication
ERIC Educational Resources Information Center
Wolf, Michael Maclean
2009-01-01
Combinatorial scientific computing plays an important enabling role in computational science, particularly in high performance scientific computing. In this thesis, we will describe our work on optimizing matrix-vector multiplication using combinatorial techniques. Our research has focused on two different problems in combinatorial scientific…
Signals, cues and the nature of mimicry
2017-01-01
‘Mimicry’ is used in the evolutionary and ecological literature to describe diverse phenomena. Many are textbook examples of natural selection's power to produce stunning adaptations. However, there remains a lack of clarity over how mimetic resemblances are conceptually related to each other. The result is that categories denoting the traditional subdivisions of mimicry are applied inconsistently across studies, hindering attempts at conceptual unification. This review critically examines the logic by which mimicry can be conceptually organized and analysed. It highlights the following three evolutionarily relevant distinctions. (i) Are the model's traits being mimicked signals or cues? (ii) Does the mimic signal a fitness benefit or fitness cost in order to manipulate the receiver's behaviour? (iii) Is the mimic's signal deceptive? The first distinction divides mimicry into two broad categories: ‘signal mimicry’ and ‘cue mimicry’. ‘Signal mimicry’ occurs when mimic and model share the same receiver, and ‘cue mimicry’ when mimic and model have different receivers or when there is no receiver for the model's trait. ‘Masquerade’ fits conceptually within cue mimicry. The second and third distinctions divide both signal and cue mimicry into four types each. These are the three traditional mimicry categories (aggressive, Batesian and Müllerian) and a fourth, often overlooked category for which the term ‘rewarding mimicry’ is suggested. Rewarding mimicry occurs when the mimic's signal is non-deceptive (as in Müllerian mimicry) but where the mimic signals a fitness benefit to the receiver (as in aggressive mimicry). The existence of rewarding mimicry is a logical extension of the criteria used to differentiate the three well-recognized forms of mimicry. These four forms of mimicry are not discrete, immutable types, but rather help to define important axes along which mimicry can vary. PMID:28202806
Combinatorial structures to modeling simple games and applications
NASA Astrophysics Data System (ADS)
Molinero, Xavier
2017-09-01
We connect three different topics: combinatorial structures, game theory and chemistry. In particular, we establish the basis for representing certain simple games, defined as influence games, and molecules, defined from atoms, using combinatorial structures. First, we characterize simple games as influence games using influence graphs, which lets us model simple games as combinatorial structures (from the viewpoint of structures or graphs). Second, we formally define molecules as combinations of atoms, which lets us model molecules as combinatorial structures (from the viewpoint of combinations). It remains open to generate such combinatorial structures using specific techniques such as genetic algorithms, (meta-)heuristic algorithms and parallel programming, among others.
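A minimal sketch of the influence-game idea, under assumed conventions (a linear-threshold spread rule over an invented three-node graph): a coalition wins if the influence it seeds eventually activates every node.

```python
# Hedged sketch of an influence game: a coalition "wins" if, spreading
# influence under a linear-threshold rule, it eventually activates every
# node. The graph and thresholds below are made up for illustration.
def spreads_everywhere(coalition, influencers, thresholds):
    """influencers[node]: nodes that can influence it;
    thresholds[node]: how many active influencers it needs."""
    active = set(coalition)
    changed = True
    while changed:
        changed = False
        for node, need in thresholds.items():
            if node not in active:
                if sum(1 for n in influencers[node] if n in active) >= need:
                    active.add(node)
                    changed = True
    return len(active) == len(thresholds)

# "a" influences "b" and "c"; "b" also influences "c"; nothing influences "a".
influencers = {"a": set(), "b": {"a"}, "c": {"a", "b"}}
thresholds = {"a": 1, "b": 1, "c": 2}
win = spreads_everywhere({"a"}, influencers, thresholds)
lose = spreads_everywhere({"c"}, influencers, thresholds)
```

Here the coalition {a} is winning while {c} is losing, so node "a" acts as a veto player in the induced simple game.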
Biana: a software framework for compiling biological interactions and analyzing networks.
Garcia-Garcia, Javier; Guney, Emre; Aragues, Ramon; Planas-Iglesias, Joan; Oliva, Baldo
2010-01-27
Background: The analysis and usage of biological data is hindered by the spread of information across multiple repositories and the difficulties posed by different nomenclature systems and storage formats. In particular, there is an important need for data unification in the study and use of protein-protein interactions. Without good integration strategies, it is difficult to analyze the whole set of available data and its properties. Results: We introduce BIANA (Biologic Interactions and Network Analysis), a tool for biological information integration and network management. BIANA is a Python framework designed to achieve two major goals: i) the integration of multiple sources of biological information, including biological entities and their relationships, and ii) the management of biological information as a network where entities are nodes and relationships are edges. Moreover, BIANA uses properties of proteins and genes to infer latent biomolecular relationships by transferring edges to entities sharing similar properties. BIANA is also provided as a plugin for Cytoscape, which allows users to visualize and interactively manage the data. A web interface to BIANA providing basic functionalities is also available. The software can be downloaded under the GNU GPL license from http://sbi.imim.es/web/BIANA.php. Conclusions: BIANA's approach to data unification solves many of the nomenclature issues common to systems dealing with biological data. BIANA can easily be extended to handle new specific data repositories and new specific data types. The unification protocol allows BIANA to be a flexible tool suitable for different user requirements: non-expert users can use a suggested unification protocol, while expert users can define their own specific unification rules. PMID:20105306
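As a hedged sketch of the kind of unification step described (this is not BIANA's actual API), the following merges records into a single entity whenever they share an external identifier, using a small union-find; the record names and identifiers are invented.

```python
# Hypothetical illustration of identifier-based record unification:
# records from different repositories are merged into one entity
# whenever they share an external identifier (union-find over records).
def merge_records(records):
    """records: dict record_id -> set of external identifiers.
    Returns merged groups (sets of record_ids), largest first."""
    parent = {rid: rid for rid in records}

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    def union(x, y):
        parent[find(x)] = find(y)

    seen = {}  # identifier -> first record carrying it
    for rid, idents in records.items():
        for ident in idents:
            if ident in seen:
                union(rid, seen[ident])
            else:
                seen[ident] = rid

    groups = {}
    for rid in records:
        groups.setdefault(find(rid), set()).add(rid)
    return sorted(groups.values(), key=len, reverse=True)

records = {
    "uniprot:P04637": {"P04637", "TP53"},
    "geneid:7157": {"7157", "TP53"},      # shares the gene symbol
    "uniprot:P38398": {"P38398", "BRCA1"},
}
groups = merge_records(records)
```

The first two records unify through the shared symbol "TP53"; the stricter or looser the identifier sets used, the stricter or looser the resulting unification protocol, mirroring the expert/non-expert distinction in the abstract.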
Results from the ESA-funded project 'Height System Unification with GOCE'
NASA Astrophysics Data System (ADS)
Sideris, M. G.; Rangelova, E. V.; Gruber, T.; Rummel, R. F.; Woodworth, P. L.; Hughes, C. W.; Ihde, J.; Liebsch, G.; Schäfer, U.; Rülke, A.; Gerlach, C.; Haagmans, R.
2013-12-01
The paper summarizes the main results of a project, supported by the European Space Agency, whose main goal is to identify the impact of GOCE gravity field models on height system unification. In particular, the Technical University Munich, the University of Calgary and the National Oceanography Centre in Liverpool, together with the Bavarian Academy of Sciences, the Federal German Agency for Cartography and Geodesy, and the Geodetic Surveys of Canada, USA and Mexico, have investigated the role of GOCE-derived gravity and geoid models for regional and global height datum connection. GOCE provides three important components of height unification: highly accurate potential differences (geopotential numbers), a global geoid- or quasi-geoid-based reference surface for elevations that is independent of inaccuracies and inconsistencies of local and regional data, and a consistent way to refer all the relevant gravimetric, topographic and oceanographic data to the same datum. We introduce briefly the methodology that has been applied in order to unify height systems in North America, the North Atlantic Ocean and Europe, and present results obtained using the available GOCE-derived satellite-only geopotential models, and their combination with terrestrial data and ocean models. The effects of various factors, such as data noise, omission errors, indirect bias terms, ocean models and temporal variations, on height datum unification are also presented, highlighting their magnitude and importance in the estimation of offsets between vertical datums. Based on the experiences gained in this project, a general roadmap has been developed for height datum unification in regions with good, as well as poor, coverage in gravity and geodetic height and tide gauge control stations.
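One elementary step of such a datum connection can be illustrated numerically. Assuming stations with a GNSS ellipsoidal height h, a levelled height H in the local datum, and a GOCE-based geoid undulation N, the local datum offset can be estimated as the mean of h - H - N; the station values below are invented.

```python
# Hedged toy of one step in height datum unification: the offset of a
# local vertical datum relative to a geoid-based reference, estimated
# from co-located GNSS/levelling/geoid data. Numbers are illustrative.
from statistics import mean

def datum_offset(stations):
    """stations: list of (h, H, N) tuples in metres:
    h = GNSS ellipsoidal height, H = levelled height, N = geoid undulation."""
    return mean(h - H - N for h, H, N in stations)

stations = [
    (125.432, 80.118, 45.012),
    (98.754, 53.431, 45.021),
    (210.117, 164.801, 45.010),
]
offset = datum_offset(stations)  # roughly 0.30 m in this invented example
```

In practice the estimate is complicated by exactly the factors the abstract lists: data noise, omission error of the geopotential model, and indirect bias terms.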
Unification of Forces: The Road to Jointness?
1991-05-15
tend to resist large change--or innovation. Because organizations value "predictability, stability, and certainty," incremental change is the...preferred mode of behavior for organizations. Unification of the forces would be a large, rather than an incremental, change; thus, the services would...coordinating planning and budgeting, providing unified direction, accounting for and controlling weapons and equipment acquisition, eliminating duplication of
Gauge coupling unification and light exotica in string theory.
Raby, Stuart; Wingerter, Akin
2007-08-03
In this Letter we consider the consequences for the CERN Large Hadron Collider of light vectorlike exotica with fractional electric charge. It is shown that such states are found in orbifold constructions of the heterotic string. Moreover, these exotica are consistent with gauge coupling unification at one loop, even though they do not come in complete multiplets of SU(5).
Korean Unification: The Way Forward
2009-03-01
THE WAY FORWARD by Brian A. Forster, March 2009. Thesis Advisor: Robert Weiner; Second Reader: Christopher P. Twomey...possibility of a unified Korean nation. NUMBER OF PAGES: 109. SUBJECT TERMS: Korean Unification, The Republic of Korea, The Democratic People's
Code of Federal Regulations, 2012 CFR
2012-01-01
... Italy lives on in the millions of American women and men of Italian descent who strengthen and enrich... we mark this important milestone in Italian history, we also honor the joint efforts of Americans and... 150th Anniversary of the Unification of Italy. I encourage all Americans to learn more about the history...
Gauge coupling unification and nonequilibrium thermal dark matter.
Mambrini, Yann; Olive, Keith A; Quevillon, Jérémie; Zaldívar, Bryan
2013-06-14
We study a new mechanism for the production of dark matter in the Universe which does not rely on thermal equilibrium. Dark matter is populated from the thermal bath subsequent to inflationary reheating via a massive mediator whose mass is above the reheating scale T_RH. To this end, we consider models with an extra U(1) gauge symmetry broken at some intermediate scale (M_int ≃ 10^10-10^12 GeV). We show that not only does the model allow for gauge coupling unification (at a higher scale associated with grand unification) but it can provide a dark matter candidate which is a standard model singlet but charged under the extra U(1). The intermediate scale gauge boson(s) which are predicted in several E6/SO(10) constructions can be a natural mediator between dark matter and the thermal bath. We show that the dark matter abundance, while never having achieved thermal equilibrium, is fixed shortly after the reheating epoch by the relation T_RH^3/M_int^4. As a consequence, we show that the unification of gauge couplings which determines M_int also fixes the reheating temperature, which can be as high as T_RH ≃ 10^11 GeV.
Fast Combinatorial Algorithm for the Solution of Linearly Constrained Least Squares Problems
Van Benthem, Mark H.; Keenan, Michael R.
2008-11-11
A fast combinatorial algorithm can significantly reduce the computational burden when solving general equality and inequality constrained least squares problems with large numbers of observation vectors. The combinatorial algorithm provides a mathematically rigorous solution and operates at great speed by reorganizing the calculations to take advantage of the combinatorial nature of the problems to be solved. The combinatorial algorithm exploits the structure that exists in large-scale problems in order to minimize the number of arithmetic operations required to obtain a solution.
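The reorganization the abstract refers to can be sketched in a deliberately simplified setting with invented data: when many right-hand sides must each be fitted against a subset of the columns of A, right-hand sides sharing the same "passive" column set are grouped and solved with a single factorization instead of one solve per vector.

```python
# Hedged sketch of the core grouping idea (not the published algorithm):
# group right-hand sides that share a passive column set and solve each
# group with one least-squares call.
import numpy as np

def grouped_solve(A, B, passive_sets):
    """A: (m, n); B: (m, k); passive_sets[j]: indices of columns allowed
    to be nonzero in solution column j. Returns X with A @ X ~= B."""
    n = A.shape[1]
    k = B.shape[1]
    X = np.zeros((n, k))
    groups = {}
    for j, pset in enumerate(passive_sets):
        groups.setdefault(tuple(sorted(pset)), []).append(j)
    for pset, cols in groups.items():
        idx = list(pset)
        # one factorization covers every right-hand side in the group
        sol, *_ = np.linalg.lstsq(A[:, idx], B[:, cols], rcond=None)
        X[np.ix_(idx, cols)] = sol
    return X

A = np.array([[1.0, 0.0, 1.0], [0.0, 1.0, 1.0], [1.0, 1.0, 0.0]])
B = A @ np.array([[1.0, 2.0, 0.0], [0.0, 3.0, 0.0], [2.0, 0.0, 4.0]])
X = grouped_solve(A, B, [(0, 2), (0, 1), (2,)])
```

In an active-set NNLS solver the passive sets change between iterations, and the speedup comes from regrouping at every iteration so that identical subproblems are factored only once.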
Su, Zhangli
2016-01-01
Combinatorial patterns of histone modifications are key indicators of different chromatin states. Most of the current approaches rely on the usage of antibodies to analyze combinatorial histone modifications. Here we detail an antibody-free method named MARCC (Matrix-Assisted Reader Chromatin Capture) to enrich combinatorial histone modifications. The combinatorial patterns are enriched on native nucleosomes extracted from cultured mammalian cells and prepared by micrococcal nuclease digestion. Such enrichment is achieved by recombinant chromatin-interacting protein modules, or so-called reader domains, which can bind in a combinatorial modification-dependent manner. The enriched chromatin can be quantified by western blotting or mass spectrometry for the co-existence of histone modifications, while the associated DNA content can be analyzed by qPCR or next-generation sequencing. Altogether, MARCC provides a reproducible, efficient and customizable solution to enrich and analyze combinatorial histone modifications. PMID:26131849
A New Approach for Proving or Generating Combinatorial Identities
ERIC Educational Resources Information Center
Gonzalez, Luis
2010-01-01
A new method for proving, in an immediate way, many combinatorial identities is presented. The method is based on a simple recursive combinatorial formula involving n + 1 arbitrary real parameters. Moreover, this formula enables one not only to prove, but also generate many different combinatorial identities (not being required to know them "a…
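The abstract does not reproduce the article's recursive formula, so as a generic stand-in the following verifies a classic combinatorial identity (Vandermonde's convolution) with binomial coefficients built from the Pascal recursion:

```python
# Generic illustration (not the article's formula): verify Vandermonde's
# identity sum_k C(m,k)*C(n,r-k) = C(m+n,r) with Pascal-rule binomials.
from functools import lru_cache

@lru_cache(maxsize=None)
def C(n, k):
    """Binomial coefficient via C(n,k) = C(n-1,k-1) + C(n-1,k)."""
    if k < 0 or k > n:
        return 0
    if k == 0 or k == n:
        return 1
    return C(n - 1, k - 1) + C(n - 1, k)

def vandermonde_holds(m, n, r):
    return sum(C(m, k) * C(n, r - k) for k in range(r + 1)) == C(m + n, r)

ok = all(vandermonde_holds(m, n, r)
         for m in range(8) for n in range(8) for r in range(m + n + 1))
```

Checking an identity exhaustively over a small range like this is a quick sanity test before attempting a formal proof.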
Wieberger, Florian; Kolb, Tristan; Neuber, Christian; Ober, Christopher K; Schmidt, Hans-Werner
2013-04-08
In this article we present several developed and improved combinatorial techniques to optimize the processing conditions and material properties of organic thin films. The combinatorial approach allows investigation of multi-variable dependencies and is well suited to studying organic thin films for high-performance applications. In this context we develop and establish the reliable preparation of gradients of material composition, temperature, exposure, and immersion time. Furthermore we demonstrate the smart application of combinations of composition and processing gradients to create combinatorial libraries. First, a binary combinatorial library is created by applying two gradients perpendicular to each other. A third gradient is carried out in very small areas and arranged matrix-like over the entire binary combinatorial library, resulting in a ternary combinatorial library. Ternary combinatorial libraries allow the identification of precise trends for the optimization of multi-variable-dependent processes, as demonstrated for the lithographic patterning process. Here we verify conclusively the strong interaction and thus the interdependency of variables in the preparation and properties of complex organic thin film systems. The established gradient preparation techniques are not limited to lithographic patterning. It is possible to utilize and transfer the reported combinatorial techniques to other multi-variable-dependent processes and to investigate and optimize thin film layers and devices for optical, electro-optical, and electronic applications.
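The layout of such a ternary library can be sketched as a simple grid construction; the gradient values below (blend ratio, bake temperature, exposure dose) are invented for illustration.

```python
# Hedged sketch of a ternary combinatorial library as a grid: two
# orthogonal gradients span the plate, and a third variable is stepped
# matrix-like within small areas. All values are illustrative.
from itertools import product

def ternary_library(compositions, temperatures, exposures):
    """Map each (composition, temperature, exposure) combination to a cell id."""
    return {combo: i for i, combo in enumerate(
        product(compositions, temperatures, exposures))}

library = ternary_library(
    compositions=[0.0, 0.25, 0.5, 0.75, 1.0],  # blend ratio gradient
    temperatures=[90, 110, 130],               # bake temperature (°C)
    exposures=[20, 40, 80],                    # exposure dose (mJ/cm²)
)
```

A single plate prepared this way screens all 45 combinations in one experiment, which is the point of the combinatorial approach.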
Zhang, H H; Gao, S; Chen, W; Shi, L; D'Souza, W D; Meyer, R R
2013-03-21
An important element of radiation treatment planning for cancer therapy is the selection of beam angles (out of all possible coplanar and non-coplanar angles in relation to the patient) in order to maximize the delivery of radiation to the tumor site and minimize radiation damage to nearby organs-at-risk. This category of combinatorial optimization problem is particularly difficult because direct evaluation of the quality of treatment corresponding to any proposed selection of beams requires the solution of a large-scale dose optimization problem involving many thousands of variables that represent doses delivered to volume elements (voxels) in the patient. However, if the quality of angle sets can be accurately estimated without expensive computation, a large number of angle sets can be considered, increasing the likelihood of identifying a very high quality set. Using a computationally efficient surrogate beam set evaluation procedure based on single-beam data extracted from plans employing equally-spaced beams (eplans), we have developed a global search metaheuristic process based on the Nested Partitions framework for this combinatorial optimization problem. The surrogate scoring mechanism allows us to assess thousands of beam set samples within a clinically acceptable time frame. Tests on difficult clinical cases demonstrate that the beam sets obtained via our method are of superior quality. PMID:23459411
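The surrogate evaluation idea can be sketched as follows (this is not the authors' code; the per-angle scores and the plain random-sampling strategy are invented): a beam set is scored by summing precomputed single-beam scores, so thousands of candidate sets can be screened without solving any dose optimization problem.

```python
# Hedged sketch of surrogate beam-set screening: each angle carries a
# precomputed individual quality score, a set is scored additively, and
# many random sets are screened cheaply. Scores below are invented.
import random

def best_beam_set(scores, set_size, n_samples, seed=0):
    """scores: dict angle -> surrogate quality (higher is better)."""
    rng = random.Random(seed)
    angles = list(scores)
    best, best_score = None, float("-inf")
    for _ in range(n_samples):
        candidate = tuple(sorted(rng.sample(angles, set_size)))
        s = sum(scores[a] for a in candidate)
        if s > best_score:
            best, best_score = candidate, s
    return best, best_score

# Invented per-angle scores, as if extracted from equally-spaced plans.
scores = {a: 1.0 / (1 + abs(a - 180) / 90) for a in range(0, 360, 10)}
chosen, score = best_beam_set(scores, set_size=5, n_samples=2000)
```

Only the few top-ranked sets surviving this screen would then be handed to the expensive full dose optimization; the Nested Partitions search in the paper replaces the naive sampling shown here.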
ERIC Educational Resources Information Center
Woyshner, Christine
2011-01-01
This articles discusses the unification of Alabama's black and white Parent-Teacher Associations from 1954 to 1971. Alabama was one of the last PTA state units to desegregate in the late 1960s, along with Arkansas, Georgia, Louisiana, Mississippi, South Carolina, and Texas. It was also the only state in which white members launched a successful…
Dynamic combinatorial libraries: new opportunities in systems chemistry.
Hunt, Rosemary A R; Otto, Sijbren
2011-01-21
Combinatorial chemistry is a tool for selecting molecules with special properties. Dynamic combinatorial chemistry started off aiming to be just that. However, unlike ordinary combinatorial chemistry, the interconnectedness of dynamic libraries gives them an extra dimension. An understanding of these molecular networks at systems level is essential for their use as a selection tool and creates exciting new opportunities in systems chemistry. In this feature article we discuss selected examples and considerations related to the advanced exploitation of dynamic combinatorial libraries for their originally conceived purpose of identifying strong binding interactions. Also reviewed are examples illustrating a trend towards increasing complexity in terms of network behaviour and reversible chemistry. Finally, new applications of dynamic combinatorial chemistry in self-assembly, transport and self-replication are discussed.
Properties of Low-mass AGN as They Relate to Unification and Massive AGN
NASA Astrophysics Data System (ADS)
Hood, Carol E.
2011-01-01
Current unification models of AGN suggest the observational differences between Type 1 and Type 2 objects are solely due to the orientation angle of the object. Observations have proved consistent with predictions and continue to strengthen the case for unification; however, many are still searching for "true" Type 2 objects, including predictions of their formation due to low luminosity or low accretion rate. Low-mass (< 10^6 solar masses) AGN provide interesting environments in which these unification models can be studied. We also aim to compare the properties of low-mass AGN with their more massive counterparts to look for structural similarities and differences over a more substantial range of luminosities and accretion rates than previously studied. We present an in-depth multi-wavelength study of one of the prototypical low-mass AGN, POX 52, investigating the properties of the central engine along with those of the host galaxy. This includes data from the VLA, Spitzer, 2MASS, HST, GALEX, XMM, and Chandra, providing us with one of the most comprehensive looks into low-mass AGN. Unlike the other prototypical low-mass AGN, NGC 4395, POX 52 resides in a dwarf elliptical galaxy, accreting at ≈ 0.35 of the Eddington limit. Additionally, we examine a sample of 41 Type 1 and Type 2 objects, including POX 52 and NGC 4395, with the Spitzer IRS, and a sub-sample of those with XMM, to study the absorption properties of low-mass AGN, to test the validity of unification models in the low-mass regime, and to investigate possible structural differences between objects with low and high mass black holes and accretion rates. We will discuss the IR spectral shape and present emission-line diagnostics for Type 1 and Type 2 AGNs at low masses.
Contemporary data communications and local networking principles
NASA Astrophysics Data System (ADS)
Chartrand, G. A.
1982-08-01
The most important issue in data communications today is networking, which can be roughly divided into two categories: local networking and distributed processing. The most sought-after aspect of local networking is office automation. Office automation really is the grand unification of all local communications, and not a new type of business office as the name might imply. This unification is the ability to have voice, data, and video carried by the same medium and managed by the same network resources. There are many different ways this unification can be done, and many manufacturers are designing systems to accomplish the task. Distributed processing attempts to share resources between computer systems and peripheral subsystems from the same or different manufacturers. There are several companies that are trying to solve both networking problems with the same network architecture.
cDREM: inferring dynamic combinatorial gene regulation.
Wise, Aaron; Bar-Joseph, Ziv
2015-04-01
Genes are often combinatorially regulated by multiple transcription factors (TFs). Such combinatorial regulation plays an important role in development and facilitates the ability of cells to respond to different stresses. While a number of approaches have utilized sequence and ChIP-based datasets to study combinatorial regulation, these have often ignored the combinatorial logic and the dynamics associated with such regulation. Here we present cDREM, a new method for reconstructing dynamic models of combinatorial regulation. cDREM integrates time series gene expression data with (static) protein interaction data. The method is based on a hidden Markov model and utilizes the sparse group Lasso to identify small subsets of combinatorially active TFs, their time of activation, and the logical function they implement. We tested cDREM on yeast and human data sets. Using yeast we show that the predicted combinatorial sets agree with other high throughput genomic datasets and improve upon prior methods developed to infer combinatorial regulation. Applying cDREM to study human response to flu, we were able to identify several combinatorial TF sets, some of which were known to regulate immune response while others represent novel combinations of important TFs.
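A toy illustration of the combinatorial logic cDREM infers (this is not cDREM itself, and the TF names and activity patterns are invented): a target responds only when a required subset of TFs is simultaneously active, i.e. AND logic over a TF set.

```python
# Toy sketch of combinatorial (AND-logic) regulation over a time series:
# the target is on only at time points where every required TF is active.
def target_trajectory(tf_activity, required):
    """tf_activity: dict TF -> list of 0/1 activity per time point."""
    timepoints = len(next(iter(tf_activity.values())))
    return [int(all(tf_activity[tf][t] for tf in required))
            for t in range(timepoints)]

tf_activity = {
    "TFA": [0, 1, 1, 1, 0],
    "TFB": [0, 0, 1, 1, 1],
    "TFC": [1, 1, 0, 1, 1],
}
# AND of TFA and TFB: the target is on only where both are active.
expr = target_trajectory(tf_activity, required={"TFA", "TFB"})  # [0, 0, 1, 1, 0]
```

Inferring both the subset ({TFA, TFB} rather than any TF alone) and the activation window (time points 2-3) from expression data is exactly the problem the sparse group Lasso formulation addresses.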
Lee, M L; Schneider, G
2001-01-01
Natural products were analyzed to determine whether they contain appealing novel scaffold architectures for potential use in combinatorial chemistry. Ring systems were extracted and clustered on the basis of structural similarity. Several such potential scaffolds for combinatorial chemistry were identified that are not present in current trade drugs. For one of these scaffolds a virtual combinatorial library was generated. Pharmacophoric properties of natural products, trade drugs, and the virtual combinatorial library were assessed using a self-organizing map. Obviously, current trade drugs and natural products have several topological pharmacophore patterns in common. These features can be systematically explored with selected combinatorial libraries based on a combination of natural product-derived and synthetic molecular building blocks.
Proof-Term Synthesis on Dependent-Type Systems via Explicit Substitutions
1999-11-01
oriented functional language OCaml, in about 50 lines. We have also implemented a higher-order unification algorithm for ground expressions. The soundness... OCaml, and it is electronically available by contacting the author. The underlying theory of the method proposed here is the λσ⇑-calculus. We believe...CORNES, Conception d'un langage de haut niveau de représentation de preuves : récurrence par filtrage de motifs, unification en présence de types
ERIC Educational Resources Information Center
Pinquart, Martin; Silbereisen, Rainer K.; Juang, Linda P.
2004-01-01
Abrupt social change, such as the breakdown of a political system of the former communist states, presents a major adaptive challenge to the individual. The authors analyzed whether commitment to the old political system and high self-efficacy beliefs measured before German unification would predict change in psychological distress in East German…
LHC phenomenology of natural MSSM with non-universal gaugino masses at the unification scale
NASA Astrophysics Data System (ADS)
Abe, Hiroyuki; Kawamura, Junichiro; Omura, Yuji
2015-08-01
In this letter, we study collider phenomenology in the supersymmetric Standard Model with a certain type of non-universal gaugino masses at the gauge coupling unification scale, motivated by the little hierarchy problem. In this scenario, the wino mass in particular is relatively large compared to the gluino mass at the unification scale, and the heavy wino can relax the fine-tuning of the higgsino mass parameter, the so-called μ-parameter. Besides, it enhances the lightest Higgs boson mass through the relatively large left-right mixing of top squarks induced by the renormalization group (RG) effect. A 125 GeV Higgs boson can then be accommodated even if the top squarks are lighter than 1 TeV and the μ-parameter is within a few hundred GeV. The right-handed top squark tends to be lighter than the other sfermions due to the RG running, so we focus on the top squark search at the LHC. Since the top squark is almost right-handed and the higgsinos are nearly degenerate, the 2b + E_T^miss channel is the most sensitive to this scenario. We work out current and expected experimental bounds on the lightest top squark mass and model parameters at the gauge coupling unification scale.
Use of combinatorial chemistry to speed drug discovery.
Rádl, S
1998-10-01
IBC's International Conference on Integrating Combinatorial Chemistry into the Discovery Pipeline was held September 14-15, 1998. The program started with a pre-conference workshop on High-Throughput Compound Characterization and Purification. The agenda of the main conference was divided into sessions of Synthesis, Automation and Unique Chemistries; Integrating Combinatorial Chemistry, Medicinal Chemistry and Screening; Combinatorial Chemistry Applications for Drug Discovery; and Information and Data Management. This meeting was an excellent opportunity to see how big pharma, biotech and service companies are addressing the current bottlenecks in combinatorial chemistry to speed drug discovery. (c) 1998 Prous Science. All rights reserved.
Karim, A K M Rezaul; Proulx, Michael J; Likova, Lora T
2016-09-01
Orientation bias and directionality bias are two fundamental functional characteristics of the visual system. Reviewing the relevant literature in visual psychophysics and visual neuroscience, we propose here a three-stage model of directionality bias in visuospatial functioning. We call this model the 'Perception-Action-Laterality' (PAL) hypothesis. We analyzed the research findings for a wide range of visuospatial tasks, showing that there are two major directionality trends in perceptual preference: clockwise versus anticlockwise. It appears these preferences are combinatorial, such that a majority of people fall in the first category demonstrating a preference for stimuli/objects arranged from left-to-right rather than from right-to-left, while people in the second category show an opposite trend. These perceptual biases can guide sensorimotor integration and action, creating two corresponding turner groups in the population. In support of PAL, we propose another model explaining the origins of the biases: how the neurogenetic factors and the cultural factors interact in a biased competition framework to determine the direction and extent of biases. This dynamic model can explain not only the two major categories of biases in terms of direction and strength, but also the unbiased, unreliably biased or mildly biased cases in visuospatial functioning. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Pati, Jogesh C.
2017-03-01
By way of paying tribute to Abdus Salam, I first recall the ideas of higher unification which the two of us introduced in 1972-73 to remove certain shortcomings in the status of particle physics prevailing then, and then present their current role in theory as well as experiments. These attempts initiated the idea of grand unification and provided the core symmetry-structure G(2, 2, 4) = SU(2)L × SU(2)R × SU(4)-color towards such a unification. Embodied with quark-lepton unification and left-right symmetry, the symmetry G(2, 2, 4) is uniquely chosen as being the minimal one that permits members of a family to belong to a single multiplet. The minimal extension of G(2, 2, 4) to a simple group is given by the attractive SO(10)-symmetry that was suggested a year later. The new concepts, and the many advantages introduced by this core symmetry (which are, of course, retained by SO(10) as well) are noted. These include explanations of the observed: (i) (rather weird) electroweak and color quantum numbers of the members of a family; (ii) quantization of electric charge; (iii) electron-proton charge-ratio being - 1; (iv) the co-existence of quarks and leptons; (v) likewise that of the three basic forces — the weak, electromagnetic and strong; (vi) the non-trivial cancelation of the triangle anomalies within each family; and opening the door for (vii) the appealing concept of parity being an exact symmetry of nature at the fundamental level. In addition, as a distinguishing feature, both because of SU(4)-color and independently because of SU(2)R as well, the symmetry G(2, 2, 4) introduced, to my knowledge, for the first time in the literature: (viii) a new kind of matter — the right-handed (RH) neutrino (νR) — as a compelling member of each family, and together with it; (ix) (B-L) as a local symmetry. 
The RH neutrinos — contrary to prejudices held in the 1970’s against neutrinos being massive and thereby against the existence of νR’s as well — have in fact turned out to be an asset. They are needed (a) to understand naturally the tiny mass-scales observed in neutrino oscillations, by combining the seesaw mechanism with the unification ideas based on the symmetry SU(4)-color, and (b) to implement the attractive mechanism of baryogenesis via leptogenesis. The quantitative success of the attempts at understanding both (a) and (b) is discussed in Sec. 6. These provide clear support simultaneously for the following three features: (i) the seesaw mechanism, (ii) the SU(4)-color route to higher unification based on a symmetry like SO(10) or a string-derived G(2, 2, 4) symmetry in 4D, as opposed to alternative symmetries like SU(5) or even [SU(3)]3, and (iii) the (B-L)-breaking scale being close to the unification scale ~2 × 10^16 GeV. The observed dramatic meeting of the three gauge couplings in the context of low-energy supersymmetry, at a scale MU ~ 2 × 10^16 GeV, providing strong evidence in favor of the ideas of both grand unification and supersymmetry, is discussed in Sec. 3. The implications of such a meeting in the context of string-unification are briefly mentioned. Weighing the possibility of a stringy origin of gauge coupling unification against the familiar problem of doublet-triplet splitting in supersymmetric SO(10) (or SU(5)), I discuss the common advantages as well as relative merits and demerits of an effective SO(10) versus a string-derived G(2, 2, 4) symmetry in 4D. In Sec. 7, I discuss the hallmark prediction of grand unification, viz. proton decay, which is a generic feature of most models of grand unification.
I present results of works carried out in collaboration with Babu and Wilczek and most recently with Babu and Tavartkiladze on expectations for decay modes and lifetimes for proton decay, including upper limits for such lifetimes, in the context of a well-motivated class of supersymmetric SO(10)-models. In view of such expectations, I stress the pressing need for having the next-generation large underground detectors — like DUNE and HyperKamiokande — coupled to long-baseline neutrino beams to search simultaneously with high sensitivity for (a) proton decay, (b) neutrino oscillations and (c) supernova neutrinos. It is remarked that the potential for major discoveries through these searches would be high. Some concluding remarks on the invaluable roles of neutrinos and especially of proton decay in probing physics at the highest energy scales are made in the last section. The remarkable success of a class of supersymmetric grand unification models (discussed here) in explaining a large set of distinct phenomena is summarized. Noticing such a success and yet its limitations in addressing some fundamental issues within its premises, such as an understanding of the origin of the three families, and most importantly, the realization of a well-understood unified quantum theory of gravity describing reality, some wishes are expressed on the possible emergence and the desirable role of a string-derived grand-unified bridge between string/M-theory in higher dimensions and the world of phenomena at low energies.
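The meeting of the three gauge couplings referred to above can be sketched numerically at one loop (a back-of-the-envelope illustration, not the detailed analysis of the works cited; the input values at M_Z are approximate):

```python
import math

# One-loop MSSM beta coefficients, with GUT-normalized hypercharge.
b = (33 / 5, 1.0, -3.0)
MZ = 91.19                          # GeV
inv_alpha_MZ = (59.0, 29.6, 8.47)   # approximate 1/alpha_i at M_Z

def inv_alpha(i, mu):
    """alpha_i^{-1}(mu) = alpha_i^{-1}(M_Z) - b_i/(2 pi) ln(mu/M_Z)."""
    return inv_alpha_MZ[i] - b[i] / (2 * math.pi) * math.log(mu / MZ)

# Scale where alpha_1 and alpha_2 meet:
ln_ratio = 2 * math.pi * (inv_alpha_MZ[0] - inv_alpha_MZ[1]) / (b[0] - b[1])
MU = MZ * math.exp(ln_ratio)
print(f"M_U ~ {MU:.1e} GeV")                       # of order 2e16
print(abs(inv_alpha(2, MU) - inv_alpha(0, MU)))    # alpha_3 nearly joins too
```

Even this crude one-loop estimate lands near the MU ~ 2 × 10^16 GeV scale quoted in the abstract, with alpha_3 missing the meeting point by well under one unit of 1/alpha.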
Microbatteries for Combinatorial Studies of Conventional Lithium-Ion Batteries
NASA Technical Reports Server (NTRS)
West, William; Whitacre, Jay; Bugga, Ratnakumar
2003-01-01
Integrated arrays of microscopic solid-state batteries have been demonstrated in a continuing effort to develop microscopic sources of power and of voltage reference circuits to be incorporated into low-power integrated circuits. Perhaps even more importantly, arrays of microscopic batteries can be fabricated and tested in combinatorial experiments directed toward optimization and discovery of battery materials. The value of the combinatorial approach to optimization and discovery has been proven in the optoelectronic, pharmaceutical, and bioengineering industries. Depending on the specific application, the combinatorial approach can involve the investigation of hundreds or even thousands of different combinations; hence, it is time-consuming and expensive to attempt to implement the combinatorial approach by building and testing full-size, discrete cells and batteries. The conception of microbattery arrays makes it practical to bring the advantages of the combinatorial approach to the development of batteries.
MIFT: GIFT Combinatorial Geometry Input to VCS Code
1977-03-01
BRL Report No. 1967. MIFT: GIFT Combinatorial Geometry Input to VCS Code, by Albert E... Type of report: final. A module of the Vehicle Code System (VCS) called MORSE was modified to accept the GIFT combinatorial geometry package. GIFT, as opposed to the geometry package
Neural Meta-Memes Framework for Combinatorial Optimization
NASA Astrophysics Data System (ADS)
Song, Li Qin; Lim, Meng Hiot; Ong, Yew Soon
In this paper, we present a Neural Meta-Memes Framework (NMMF) for combinatorial optimization. NMMF is a framework which models basic optimization algorithms as memes and manages them dynamically when solving combinatorial problems. NMMF encompasses neural networks which serve as the overall planner/coordinator to balance the workload between memes. We show the efficacy of the proposed NMMF through empirical study on a class of combinatorial problem, the quadratic assignment problem (QAP).
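The quadratic assignment problem (QAP) mentioned above can be made concrete with a minimal sketch (illustrative only; NMMF itself manages multiple such memes with a neural-network coordinator): one basic optimization "meme", a greedy pairwise-swap local search, applied to a tiny QAP instance with made-up flow and distance matrices.

```python
from itertools import combinations, permutations

def qap_cost(perm, flow, dist):
    """QAP objective: total flow-weighted distance when facility i is
    placed at location perm[i]."""
    n = len(perm)
    return sum(flow[i][j] * dist[perm[i]][perm[j]]
               for i in range(n) for j in range(n))

def two_swap_meme(perm, flow, dist):
    """A basic optimization 'meme': greedy local search over pairwise swaps."""
    perm, best = list(perm), qap_cost(perm, flow, dist)
    improved = True
    while improved:
        improved = False
        for i, j in combinations(range(len(perm)), 2):
            perm[i], perm[j] = perm[j], perm[i]
            c = qap_cost(perm, flow, dist)
            if c < best:
                best, improved = c, True
            else:
                perm[i], perm[j] = perm[j], perm[i]   # undo the swap
    return perm, best

flow = [[0, 3, 1], [3, 0, 2], [1, 2, 0]]   # made-up symmetric instance
dist = [[0, 1, 4], [1, 0, 2], [4, 2, 0]]
perm, cost = two_swap_meme([2, 0, 1], flow, dist)
exact = min(qap_cost(list(p), flow, dist) for p in permutations(range(3)))
print(cost, exact)
```

On this tiny instance the swap meme reaches the exact optimum; on realistic QAP sizes a single meme gets trapped in local optima, which is the motivation for a framework that coordinates several memes.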
On the extensive unification of digital-to-analog converters and kernels
NASA Astrophysics Data System (ADS)
Liao, Yanchu
2012-09-01
System administrators agree that scalable communication is an interesting new topic in the field of steganography, and leading analysts concur. After years of unfortunate research into context-free grammar, we argue the intuitive unification of fiber-optic cables and context-free grammar. Our focus here is not on whether sensor networks and randomized algorithms can collaborate to accomplish this aim, but rather on introducing an analysis of DHTs [2] (Soupy Coil).
Dark Matter after LHC Run I: Clues to Unification
NASA Astrophysics Data System (ADS)
Olive, Keith A.
2017-03-01
After the results of Run I, can we still `guarantee' the discovery of supersymmetry at the LHC? It is shown that viable dark matter models in CMSSM-like models tend to lie in strips (co-annihilation, funnel, focus point). The role of grand unification in constructing supersymmetric models is discussed and it is argued that non-supersymmetric GUTs such as SO(10) may provide solutions to many of the standard problems addressed by supersymmetry.
THE COMMON MARKET AND EUROPEAN UNIFICATION,
A study of the Common Market; its past problems, current difficulties, and future possibilities are presented. The study consists of seven sections, each of which may be read independently: (1) an introduction to the Common Market; (2) the Common Market and internal trade; (3) external economic...European Economic Community agriculture; and (7) the Common Market and European political unification. Statistical tables showing import and export data of the Common Market countries are appended. (Author)
Deductive Synthesis of the Unification Algorithm,
1981-06-01
DEDUCTIVE SYNTHESIS OF THE UNIFICATION ALGORITHM. Zohar Manna, Richard Waldinger. Computer Science Department / Artificial Intelligence Center... "theorem proving," Artificial Intelligence Journal, Vol. 9, No. 1, pp. 1-35. Boyer, R. S. and J. S. Moore [Jan. 1975], "Proving theorems about LISP...d'Intelligence Artificielle, U.E.R. de Luminy, Université d'Aix-Marseille II. Green, C. C. [May 1969], "Application of theorem proving to problem
Dark matter in E 6 Grand unification
NASA Astrophysics Data System (ADS)
Schwichtenberg, Jakob
2018-02-01
We discuss fermionic dark matter in non-supersymmetric E 6 Grand Unification. The fundamental representation of E 6 contains, in addition to the standard model fermions, exotic fermions and we argue that one of them is a viable, interesting dark matter candidate. Its stability is guaranteed by a discrete remnant symmetry, which is an unbroken subgroup of the E 6 gauge symmetry. We compute the symmetry breaking scales and the effect of possible threshold corrections by solving the renormalization group equations numerically after imposing gauge coupling unification. Since the Yukawa couplings of the exotic and the standard model fermions have a common origin, the mass of the dark matter particles is constrained. We find a mass range of 3 · 109 GeV ≲ m DM ≲ 1 · 1013 GeV for our E 6 dark matter candidate, which is within the reach of next-generation direct detection experiments.
NASA Astrophysics Data System (ADS)
Hati, Chandan; Patra, Sudhanwa; Reig, Mario; Valle, José W. F.; Vaquera-Araujo, C. A.
2017-07-01
We consider the possibility of gauge coupling unification within the simplest realizations of the SU(3)c × SU(3)L × SU(3)R × U(1)X gauge theory. We present a first exploration of the renormalization group equations governing the "bottom-up" evolution of the gauge couplings in a generic model with free normalization for the generators. Interestingly, we find that for a SU(3)c × SU(3)L × SU(3)R × U(1)X symmetry breaking scale MX as low as a few TeV one can achieve unification in the presence of leptonic octets. We briefly comment on possible grand unified theory frameworks which can embed the SU(3)c × SU(3)L × SU(3)R × U(1)X model as well as possible implications, such as lepton flavor violating physics at the LHC.
FOREWORD: Focus on Combinatorial Materials Science Focus on Combinatorial Materials Science
NASA Astrophysics Data System (ADS)
Chikyo, Toyohiro
2011-10-01
About 15 years have passed since the introduction of modern combinatorial synthesis and high-throughput techniques for the development of novel inorganic materials; however, similar methods existed before. The most famous was reported in 1970 by Hanak who prepared composition-spread films of metal alloys by sputtering mixed-material targets. Although this method was innovative, it was rarely used because of the large amount of data to be processed. This problem is solved in the modern combinatorial material research, which is strongly related to computer data analysis and robotics. This field is still at the developing stage and may be enriched by new methods. Nevertheless, given the progress in measurement equipment and procedures, we believe the combinatorial approach will become a major and standard tool of materials screening and development. The first article of this journal, published in 2000, was titled 'Combinatorial solid state materials science and technology', and this focus issue aims to reintroduce this topic to the Science and Technology of Advanced Materials audience. It covers recent progress in combinatorial materials research describing new results in catalysis, phosphors, polymers and metal alloys for shape memory materials. Sophisticated high-throughput characterization schemes and innovative synthesis tools are also presented, such as spray deposition using nanoparticles or ion plating. On a technical note, data handling systems are introduced to familiarize researchers with the combinatorial methodology. We hope that through this focus issue a wide audience of materials scientists can learn about recent and future trends in combinatorial materials science and high-throughput experimentation.
A 2-categorical state sum model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baratin, Aristide, E-mail: abaratin@uwaterloo.ca; Freidel, Laurent, E-mail: lfreidel@perimeterinstitute.ca
It has long been argued that higher categories provide the proper algebraic structure underlying state sum invariants of 4-manifolds. This idea has been refined recently, by proposing to use 2-groups and their representations as specific examples of 2-categories. The challenge has been to make these proposals fully explicit. Here, we give a concrete realization of this program. Building upon our earlier work with Baez and Wise on the representation theory of 2-groups, we construct a four-dimensional state sum model based on a categorified version of the Euclidean group. We define and explicitly compute the simplex weights, which may be viewed as a categorified analogue of Racah-Wigner 6j-symbols. These weights solve a hexagon equation that encodes the formal invariance of the state sum under the Pachner moves of the triangulation. This result unravels the combinatorial formulation of the Feynman amplitudes of quantum field theory on flat spacetime proposed in A. Baratin and L. Freidel [Classical Quantum Gravity 24, 2027–2060 (2007)], which was shown to lead after gauge-fixing to Korepanov's invariant of 4-manifolds.
Antolini, Ermete
2017-02-13
Combinatorial chemistry and high-throughput screening represent an innovative and rapid tool to prepare and evaluate a large number of new materials, saving time and expense for research and development. Considering that the activity and selectivity of catalysts depend on complex kinetic phenomena, making their development largely empirical in practice, they are prime candidates for combinatorial discovery and optimization. This review presents an overview of recent results of combinatorial screening of low-temperature fuel cell electrocatalysts for methanol oxidation. Optimum catalyst compositions obtained by combinatorial screening were compared with those of bulk catalysts, and the effect of the library geometry on the screening of catalyst composition is highlighted.
Combinatorial Nano-Bio Interfaces.
Cai, Pingqiang; Zhang, Xiaoqian; Wang, Ming; Wu, Yun-Long; Chen, Xiaodong
2018-06-08
Nano-bio interfaces are emerging from the convergence of engineered nanomaterials and biological entities. Despite rapid growth, clinical translation of biomedical nanomaterials is heavily compromised by the lack of comprehensive understanding of biophysicochemical interactions at nano-bio interfaces. In the past decade, a few investigations have adopted a combinatorial approach toward decoding nano-bio interfaces. Combinatorial nano-bio interfaces comprise the design of nanocombinatorial libraries and high-throughput bioevaluation. In this Perspective, we address challenges in combinatorial nano-bio interfaces and call for multiparametric nanocombinatorics (composition, morphology, mechanics, surface chemistry), multiscale bioevaluation (biomolecules, organelles, cells, tissues/organs), and the recruitment of computational modeling and artificial intelligence. Leveraging combinatorial nano-bio interfaces will shed light on precision nanomedicine and its potential applications.
Koyama, Michihisa; Tsuboi, Hideyuki; Endou, Akira; Takaba, Hiromitsu; Kubo, Momoji; Del Carpio, Carlos A; Miyamoto, Akira
2007-02-01
Computational chemistry can provide fundamental knowledge regarding various aspects of materials. While its impact in scientific research is greatly increasing, its contributions to industrially important issues are far from satisfactory. In order to realize industrial innovation by computational chemistry, a new concept "combinatorial computational chemistry" has been proposed by introducing the concept of combinatorial chemistry to computational chemistry. This combinatorial computational chemistry approach enables theoretical high-throughput screening for materials design. In this manuscript, we review the successful applications of combinatorial computational chemistry to deNO(x) catalysts, Fischer-Tropsch catalysts, lanthanoid complex catalysts, and cathodes of the lithium ion secondary battery.
The Rebirth of a World Power? German Unification and the Future of European Security
1990-12-01
German unification...in the balance of power system. Germany (and Prussia) was used as a territorial "shock absorber" to maintain a buffer between the great powers--a...European statecraft was not pressured by internal problems which later forced governments to take actions that aroused the suspicions and fears of the
North Korean Paradoxes. Circumstances, Costs, and Consequences of Korean Unification
2005-01-01
consequences of Korean unification / Charles Wolf, Jr., Kamil Akramov. "MG-333." Includes bibliographic references. ISBN 0-8330-3762-5 (pbk.: alk...Institute of Technology; NBER, National Bureau for Economic Research; OECD, Organization for Economic Cooperation and Development; OSD, Office of the Secretary of...opment (OECD) with a per capita income over $10,000 and the other a "lights-out" but nuclear-capable dynastic state--is the riddle; how to link economic
Singh, Narender; Guha, Rajarshi; Giulianotti, Marc; Pinilla, Clemencia; Houghten, Richard; Medina-Franco, Jose L.
2009-01-01
A multiple criteria approach is presented, that is used to perform a comparative analysis of four recently developed combinatorial libraries to drugs, Molecular Libraries Small Molecule Repository (MLSMR) and natural products. The compound databases were assessed in terms of physicochemical properties, scaffolds and fingerprints. The approach enables the analysis of property space coverage, degree of overlap between collections, scaffold and structural diversity and overall structural novelty. The degree of overlap between combinatorial libraries and drugs was assessed using the R-NN curve methodology, which measures the density of chemical space around a query molecule embedded in the chemical space of a target collection. The combinatorial libraries studied in this work exhibit scaffolds that were not observed in the drug, MLSMR and natural products collections. The fingerprint-based comparisons indicate that these combinatorial libraries are structurally different to current drugs. The R-NN curve methodology revealed that a proportion of molecules in the combinatorial libraries are located within the property space of the drugs. However, the R-NN analysis also showed that there are a significant number of molecules in several combinatorial libraries that are located in sparse regions of the drug space. PMID:19301827
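The R-NN curve idea, measuring the density of a target collection's chemical space around a query molecule, can be sketched as follows (a simplified stand-in using Euclidean distance on made-up descriptor vectors, not the published implementation):

```python
import numpy as np

def rnn_curve(query, collection, radii):
    """For each radius R, the fraction of reference compounds whose
    descriptor vectors lie within distance R of the query; a proxy for
    how densely the collection populates chemical space around it."""
    d = np.linalg.norm(collection - query, axis=1)
    return np.array([(d <= r).mean() for r in radii])

rng = np.random.default_rng(1)
drugs = rng.normal(size=(500, 8))        # hypothetical "drug space" descriptors
radii = np.linspace(0.5, 6.0, 12)

central = rnn_curve(np.zeros(8), drugs, radii)      # query in a dense region
sparse = rnn_curve(np.full(8, 4.0), drugs, radii)   # query in a sparse region
print(central[-1], sparse[-1])
```

A query embedded in the dense part of the reference space accumulates neighbors quickly as the radius grows, while a structurally novel query sees an almost flat curve, which is how the analysis above distinguishes library members inside drug-like space from those in sparse regions of it.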
Smooth Constrained Heuristic Optimization of a Combinatorial Chemical Space
2015-05-01
ARL-TR-7294 • MAY 2015. US Army Research Laboratory. Smooth Constrained Heuristic Optimization of a Combinatorial Chemical Space, by Berend Christopher...
Preparation of cherry-picked combinatorial libraries by string synthesis.
Furka, Arpád; Dibó, Gábor; Gombosuren, Naran
2005-03-01
String synthesis [1-3] is an efficient and cheap manual method for preparation of combinatorial libraries by using macroscopic solid support units. Sorting the units between two synthetic steps is an important operation of the procedure. The software developed to guide sorting can be used only when complete combinatorial libraries are prepared. Since very often only selected components of the full libraries are needed, new software was constructed that guides sorting in preparation of non-complete combinatorial libraries. Application of the software is described in details.
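The sorting operation described above can be illustrated with a small sketch (hypothetical sequences and building blocks; this is not the software described in the paper): before each coupling step, the support units of a cherry-picked, non-complete library are grouped into reaction vessels by the building block their target sequence requires at that position.

```python
from collections import defaultdict

# Cherry-picked members of a 3-position library (not the full library).
targets = ["ACD", "AGD", "TCD", "TGG"]   # support unit k carries sequence k

def sorting_plan(targets):
    """For each synthetic step, map each building block to the list of
    support units that must be sorted into that reaction vessel."""
    plan = []
    for pos in range(len(targets[0])):
        vessels = defaultdict(list)
        for unit, seq in enumerate(targets):
            vessels[seq[pos]].append(unit)
        plan.append(dict(vessels))
    return plan

for step, vessels in enumerate(sorting_plan(targets)):
    print(step, vessels)
```

For a complete library the grouping is fixed in advance by the split-and-pool scheme; the point of guiding software is precisely that for an arbitrary cherry-picked subset the vessel assignments differ at every step and must be recomputed as above.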
2008-08-01
services, DIDS and DMS, are deployable on the TanGrid system and are accessible via two APIs, a Java client and a servlet based interface. Additionally...but required the user to instantiate an IGraph object with several Java Maps containing the nodes, node attributes, edge types, and the connections...restrictions imposed by the bulk ingest process. Finally, once the bulk ingest process was available in the GraphUnification Java Archives (JAR), DC was
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chizhov, M. V., E-mail: mih@phys.uni-sofia.bg; Bednyakov, V. A., E-mail: bedny@jinr.ru
The gauge coupling unification can be achieved at a unification scale around 5×10^13 GeV if the Standard Model scalar sector is extended with extra Higgs-like doublets. The relevant new scalar degrees of freedom in the form of chiral Z* and W* vector bosons might “be visible” already at about 700 GeV. Their eventual preferred coupling to the heavy quarks explains the non-observation of these bosons in the first LHC run and provides promising expectations for the second LHC run.
1983-02-28
DNA 5433F-2: Unification of Electromagnetic Specifications and Standards. Part II: Recommendations for Revisions of Existing...
Distinct signals of the gauge-Higgs unification in e+e- collider experiments
NASA Astrophysics Data System (ADS)
Funatsu, Shuichiro; Hatanaka, Hisaki; Hosotani, Yutaka; Orikasa, Yuta
2017-12-01
Effects of Kaluza-Klein excited neutral vector bosons (Z′ bosons) in the gauge-Higgs unification on e+e− → qq̄, ℓ+ℓ− cross sections are studied, particularly in future e+e− collider experiments with polarized beams. Significant deviations from the standard model are predicted in the energy and polarization dependence of σ(μ+μ−), the lepton forward-backward asymmetry, Rb(μ) ≡ σ(bb̄)/σ(μ+μ−), and the left-right asymmetry.
Validation of an Instrument and Testing Protocol for Measuring the Combinatorial Analysis Schema.
ERIC Educational Resources Information Center
Staver, John R.; Harty, Harold
1979-01-01
Designs a testing situation to examine the presence of combinatorial analysis, to establish construct validity in the use of an instrument, Combinatorial Analysis Behavior Observation Scheme (CABOS), and to investigate the presence of the schema in young adolescents. (Author/GA)
Rad, Mostafa; Karimi Moonaghi, Hossein
2016-01-01
Introduction: Students’ incivility is an impolite and disturbing behavior in education and if ignored could lead to behavioral complexities and eventually violence and aggression in classrooms. This study aimed to reveal the experiences of Iranian educators regarding the management of such behaviors. Methods: In this qualitative study, qualitative content analysis method was used to evaluate the experiences and perceptions of nursing educators and students. A total of 22 persons (14 educators and 8 students) were selected through purposive sampling and individually interviewed. Results: Categories of unification of educators regarding behavioral management, teaching-learning strategy, friendship strategy and training through role playing, authority, appropriate decision-making and freedom, stronger relationships between students, reflection, and interactive educational environment were some strategies used by teachers for management of incivility. Conclusion: Educators suggested some strategies which could be used depending on the uniqueness of behaviors and the given situation. Educators and managers of medical fields can use these approaches in their classrooms to control uncivil behaviors. PMID:26989663
Microbial genome analysis: the COG approach.
Galperin, Michael Y; Kristensen, David M; Makarova, Kira S; Wolf, Yuri I; Koonin, Eugene V
2017-09-14
For the past 20 years, the Clusters of Orthologous Genes (COG) database had been a popular tool for microbial genome annotation and comparative genomics. Initially created for the purpose of evolutionary classification of protein families, the COG have been used, apart from straightforward functional annotation of sequenced genomes, for such tasks as (i) unification of genome annotation in groups of related organisms; (ii) identification of missing and/or undetected genes in complete microbial genomes; (iii) analysis of genomic neighborhoods, in many cases allowing prediction of novel functional systems; (iv) analysis of metabolic pathways and prediction of alternative forms of enzymes; (v) comparison of organisms by COG functional categories; and (vi) prioritization of targets for structural and functional characterization. Here we review the principles of the COG approach and discuss its key advantages and drawbacks in microbial genome analysis. Published by Oxford University Press 2017. This work is written by US Government employees and is in the public domain in the US.
NASA Astrophysics Data System (ADS)
Brouwer, Harm; Crocker, Matthew W.
2016-03-01
The Mirror System Hypothesis (MSH) on the evolution of the language-ready brain draws upon the parallel dorsal-ventral stream architecture for vision [1]. The dorsal 'how' stream provides a mapping of parietally-mediated affordances onto the motor system (supporting preshape), whereas the ventral 'what' stream engages in object recognition and visual scene analysis (supporting pantomime and verbal description). Arbib attempts to integrate this MSH perspective with a recent conceptual dorsal-ventral stream model of auditory language comprehension [5] (henceforth, the B&S model). In the B&S model, the dorsal stream engages in time-dependent combinatorial processing, which subserves syntactic structuring and linkage to action, whereas the ventral stream performs time-independent unification of conceptual schemata. These streams are integrated in the left Inferior Frontal Gyrus (lIFG), which is assumed to subserve cognitive control, and no linguistic processing functions. Arbib criticizes the B&S model on two grounds: (i) the time-independence of the semantic processing in the ventral stream (by arguing that semantic processing is just as time-dependent as syntactic processing), and (ii) the absence of linguistic processing in the lIFG (reconciling syntactic and semantic representations is very much linguistic processing proper). Here, we provide further support for these two points of criticism on the basis of insights from the electrophysiology of language. In the course of our argument, we also sketch the contours of an alternative model that may prove better suited for integration with the MSH.
Bifurcation-based approach reveals synergism and optimal combinatorial perturbation.
Liu, Yanwei; Li, Shanshan; Liu, Zengrong; Wang, Ruiqi
2016-06-01
Cells accomplish the process of fate decisions and form terminal lineages through a series of binary choices in which cells switch stable states from one branch to another as the interacting strengths of regulatory factors continuously vary. Various combinatorial effects may occur because almost all regulatory processes are managed in a combinatorial fashion. Combinatorial regulation is crucial for cell fate decisions because it may effectively integrate many different signaling pathways to meet the higher regulation demand during cell development. However, whether the contribution of combinatorial regulation to the state transition is better than that of a single regulator and, if so, what the optimal combination strategy is are significant issues from the point of view of both biology and mathematics. Using the approaches of combinatorial perturbations and bifurcation analysis, we provide a general framework for the quantitative analysis of synergism in molecular networks. Different from the known methods, the bifurcation-based approach depends only on stable state responses to stimuli because the state transition induced by combinatorial perturbations occurs between stable states. More importantly, an optimal combinatorial perturbation strategy can be determined by investigating the relationship between the bifurcation curve of a synergistic perturbation pair and the level set of a specific objective function. The approach is applied to two models, i.e., a theoretical multistable decision model and a biologically realistic CREB model, to show its validity, although the approach holds for a general class of biological systems.
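The bifurcation-based viewpoint can be illustrated with a toy bistable switch (a minimal sketch with invented parameters, not one of the models analyzed in the abstract): a state transition appears as a change in the number of steady states when a perturbation pushes the system across a saddle-node bifurcation.

```python
import numpy as np

def steady_states(s, gamma, k=4.0, K=1.0, n=2):
    """Steady states of dx/dt = s + k*x^n/(K^n + x^n) - gamma*x,
    located as sign changes of the right-hand side on a fine grid."""
    x = np.linspace(0.0, 10.0, 200001)
    f = s + k * x**n / (K**n + x**n) - gamma * x
    crossings = np.where(np.diff(np.sign(f)) != 0)[0]
    return x[crossings]

# Reference point: three steady states (low stable, unstable, high stable).
roots = steady_states(s=0.05, gamma=1.5)
print(len(roots))  # 3 -> bistable

# A stimulus perturbation past the saddle-node leaves only the high state.
roots_pushed = steady_states(s=0.8, gamma=1.5)
print(len(roots_pushed))  # 1 -> the system has switched
```

In the paper's framework, scanning pairs of parameters for such root-count changes traces out the bifurcation curve of a perturbation pair, whose intersection with the level set of a cost function then picks the optimal combinatorial perturbation.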
750 GeV diphotons: implications for supersymmetric unification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hall, Lawrence J.; Harigaya, Keisuke; Nomura, Yasunori
2016-03-03
A recent signal of 750 GeV diphotons at the LHC can be explained within the framework of supersymmetric unification by the introduction of vector quarks and leptons with Yukawa couplings to a singlet S that describes the 750 GeV resonance. We study the most general set of theories that allow successful gauge coupling unification, and find that these Yukawa couplings are severely constrained by renormalization group behavior: they are independent of ultraviolet physics and flow to values at the TeV scale that we calculate precisely. As a consequence the vector quarks and leptons must be light; typically in the region of 375 GeV to 700 GeV, and in certain cases up to 1 TeV. The 750 GeV resonance may have a width less than the experimental resolution; alternatively, with the mass splitting between scalar and pseudoscalar components of S arising from one-loop diagrams involving vector fermions, we compute an apparent width of tens of GeV.
Cosmological density fluctuations produced by vacuum strings
NASA Astrophysics Data System (ADS)
Vilenkin, A.
1981-04-01
Consideration is given to the possible role of vacuum domain strings produced in the grand unification phase transition in the early universe in the generation of the density fluctuations giving rise to galaxies. The cosmological evolution of the strings formed in the grand unification phase transition is analyzed, with attention given to possible mechanisms for the damping out of oscillations produced by tension in convoluted strings and closed loops. The cosmological density fluctuations introduced by infinite strings and closed loops smaller than the horizon are then shown to be capable of giving rise to mass condensations on a scale of approximately 10^9 solar masses at the time of the decoupling of radiation from matter, around which the galaxies condense. Differences between the present theory and that suggested by Zel'dovich (1980) are pointed out, and it is noted that string formation at the grand unification phase transition is possible only if the manifold of the degenerate vacua of the gauge theory is not simply connected.
Combinatorial enzyme technology for the conversion of agricultural fibers to functional properties
USDA-ARS?s Scientific Manuscript database
The concept of combinatorial chemistry has received little attention in agriculture and food research, although its applications in this area were described more than fifteen years ago (1, 2). More recently, interest in the use of combinatorial chemistry in agrochemical discovery has been revitalize...
An Investigation into Post-Secondary Students' Understanding of Combinatorial Questions
ERIC Educational Resources Information Center
Bulone, Vincent William
2017-01-01
The purpose of this dissertation was to study aspects of how post-secondary students understand combinatorial problems. Within this dissertation, I considered understanding through two different lenses: i) student connections to previous problems; and ii) common combinatorial distinctions such as ordered versus unordered and repetitive versus…
Olbert, Charles M; Gala, Gary J; Tupler, Larry A
2014-05-01
Heterogeneity within psychiatric disorders is both theoretically and practically problematic: For many disorders, it is possible for 2 individuals to share very few or even no symptoms in common yet share the same diagnosis. Polythetic diagnostic criteria have long been recognized to contribute to this heterogeneity, yet no unified theoretical understanding of the coherence of symptom criteria sets currently exists. A general framework for analyzing the logical and mathematical structure, coherence, and diversity of Diagnostic and Statistical Manual diagnostic categories (DSM-5 and DSM-IV-TR) is proposed, drawing from combinatorial mathematics, set theory, and information theory. Theoretical application of this framework to 18 diagnostic categories indicates that in most categories, 2 individuals with the same diagnosis may share no symptoms in common, and that any 2 theoretically possible symptom combinations will share on average less than half their symptoms. Application of this framework to 2 large empirical datasets indicates that patients who meet symptom criteria for major depressive disorder and posttraumatic stress disorder tend to share approximately three-fifths of symptoms in common. For both disorders in each of the datasets, pairs of individuals who shared no common symptoms were observed. Any 2 individuals with either diagnosis were unlikely to exhibit identical symptomatology. The theoretical and empirical results stemming from this approach have substantive implications for etiological research into, and measurement of, psychiatric disorders.
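The combinatorial core of this argument is easy to reproduce. As a minimal sketch (the 5-of-9 rule resembles the DSM criteria for major depressive disorder, but is used here purely for illustration), one can enumerate the minimal qualifying symptom profiles and the smallest possible overlap between two diagnosed individuals:

```python
from itertools import combinations

n, k = 9, 5  # polythetic rule: at least k of n criteria (illustrative)

# All minimal qualifying presentations: choose exactly k of the n criteria.
presentations = list(combinations(range(n), k))
print(len(presentations))  # C(9,5) = 126 distinct minimal symptom profiles

# Smallest possible symptom overlap between two diagnosed individuals.
min_shared = min(len(set(a) & set(b))
                 for a in presentations for b in presentations)
print(min_shared)  # max(0, 2k - n) = 1 shared symptom
```

With a lower k relative to n, max(0, 2k - n) drops to zero, which is the zero-overlap situation the framework identifies for most diagnostic categories.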
SO(10) supersymmetric grand unified theories
NASA Astrophysics Data System (ADS)
Dermisek, Radovan
The origin of the fermion mass hierarchy is one of the most challenging problems in elementary particle physics. In the standard model fermion masses and mixing angles are free parameters. Supersymmetric grand unified theories provide a beautiful framework for physics beyond the standard model. In addition to gauge coupling unification these theories provide relations between quark and lepton masses within families, and with additional family symmetry the hierarchy between families can be generated. We present a predictive SO(10) supersymmetric grand unified model with D3 × U(1) family symmetry. The hierarchy in fermion masses is generated by the family symmetry breaking D3 × U(1) → Z_N → nothing. This model fits the low energy data in the charged fermion sector quite well. We discuss the prediction of this model for the proton lifetime in light of recent SuperKamiokande results and present a clear picture of the allowed spectra of supersymmetric particles. Finally, the detailed discussion of the Yukawa coupling unification of the third generation particles is provided. We find a narrow region is consistent with t, b, τ Yukawa unification for μ > 0 (suggested by b → sγ and the anomalous magnetic moment of the muon) with A0 ≈ -1.9 m16, m10 ≈ 1.4 m16, m16 ≳ 1200 GeV and μ, M1/2 ≈ 100-500 GeV. Demanding Yukawa unification thus makes definite predictions for Higgs and sparticle masses.
Roadmap of left-right models based on GUTs
NASA Astrophysics Data System (ADS)
Chakrabortty, Joydeep; Maji, Rinku; Patra, Sunando Kumar; Srivastava, Tripurari; Mohanty, Subhendra
2018-05-01
We perform a detailed study of the grand unified theories SO(10) and E(6) with left-right intermediate gauge symmetries of the form SU(N)_L ⊗ SU(N)_R ⊗ G. Proton decay lifetime constrains the unification scale to be ≳ 10^16 GeV and, as discussed in this paper, unwanted cosmological relics can be evaded if the intermediate symmetry scale is ≳ 10^12 GeV. With these conditions, we study the renormalization group evolution of the gauge couplings and do a comparative analysis of all possible left-right models where unification can occur. Both the D-parity conserved and broken scenarios as well as the supersymmetric (SUSY) and nonsupersymmetric (non-SUSY) versions are considered. In addition to the fermion and scalar representations at each stage of the symmetry breaking, contributing to the β functions, we list the intermediate left-right groups that successfully meet these requirements. We make use of the dimension-5 kinetic mixing effective operators for achieving unification and a large intermediate scale. A significant result in the supersymmetric case is that to achieve successful unification for some breaking patterns, the scale of SUSY breaking needs to be at least a few TeV. In some of these cases, the intermediate scale can be as low as ~10^12 GeV, for the SUSY scale to be ~30 TeV. This has important consequences in the collider searches for SUSY particles and phenomenology of the lightest neutralino as dark matter.
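For orientation, the kind of renormalization-group computation involved can be sketched at one loop for the textbook single-step MSSM case (approximate inputs at MZ, no thresholds or kinetic mixing, and not the two-step left-right chains studied in the paper); it already lands near the quoted ≳ 10^16 GeV unification scale:

```python
import numpy as np

MZ = 91.19  # GeV
# Approximate inverse couplings at MZ: U(1)_Y (GUT-normalized), SU(2)_L, SU(3)_c.
alpha_inv_MZ = np.array([59.0, 29.6, 8.45])
b = np.array([33.0 / 5.0, 1.0, -3.0])  # one-loop MSSM beta coefficients

def alpha_inv(log_mu):
    """One-loop running: alpha_i^-1(mu) = alpha_i^-1(MZ) - b_i/(2*pi) * ln(mu/MZ)."""
    return alpha_inv_MZ - b / (2 * np.pi) * (log_mu - np.log(MZ))

# Scale at which alpha_1 and alpha_2 cross:
log_mu_gut = np.log(MZ) + 2 * np.pi * (alpha_inv_MZ[0] - alpha_inv_MZ[1]) / (b[0] - b[1])
mu_gut = np.exp(log_mu_gut)
print(f"{mu_gut:.1e} GeV")  # ≈ 2e16 GeV, the conventional MSSM GUT scale

a1, a2, a3 = alpha_inv(log_mu_gut)
print(abs(a3 - a2) / a2 < 0.05)  # alpha_3 joins to within a few percent
```

The analysis in the paper repeats this exercise with the β coefficients of each intermediate left-right gauge group and matching conditions at the intermediate scale.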
Xu, Huayong; Yu, Hui; Tu, Kang; Shi, Qianqian; Wei, Chaochun; Li, Yuan-Yuan; Li, Yi-Xue
2013-01-01
We are witnessing rapid progress in the development of methodologies for building combinatorial gene regulatory networks involving both TFs (Transcription Factors) and miRNAs (microRNAs). There are a few tools available to do these jobs but most of them are not easy to use and not accessible online. A web server is especially needed in order to allow users to upload experimental expression datasets and build combinatorial regulatory networks corresponding to their particular contexts. In this work, we compiled putative TF-gene, miRNA-gene and TF-miRNA regulatory relationships from forward-engineering pipelines and curated them as built-in data libraries. We streamlined the R codes of our two separate forward-and-reverse engineering algorithms for combinatorial gene regulatory network construction and formalized them as two major functional modules. As a result, we released the cGRNB (combinatorial Gene Regulatory Networks Builder): a web server for constructing combinatorial gene regulatory networks through integrated engineering of seed-matching sequence information and gene expression datasets. The cGRNB enables two major network-building modules, one for MPGE (miRNA-perturbed gene expression) datasets and the other for parallel miRNA/mRNA expression datasets. A miRNA-centered two-layer combinatorial regulatory cascade is the output of the first module, and a comprehensive genome-wide network involving all three types of combinatorial regulation (TF-gene, TF-miRNA, and miRNA-gene) is the output of the second module. In this article we propose cGRNB, a web server for building combinatorial gene regulatory networks through integrated engineering of seed-matching sequence information and gene expression datasets. Since parallel miRNA/mRNA expression datasets are rapidly accumulating with the advance of next-generation sequencing techniques, cGRNB will be a very useful tool for researchers to build combinatorial gene regulatory networks based on expression datasets.
The cGRNB web-server is free and available online at http://www.scbit.org/cgrnb.
Combinatorial effects on clumped isotopes and their significance in biogeochemistry
NASA Astrophysics Data System (ADS)
Yeung, Laurence Y.
2016-01-01
The arrangement of isotopes within a collection of molecules records their physical and chemical histories. Clumped-isotope analysis interrogates these arrangements, i.e., how often rare isotopes are bound together, which in many cases can be explained by equilibrium and/or kinetic isotope fractionation. However, purely combinatorial effects, rooted in the statistics of pairing atoms in a closed system, are also relevant, and not well understood. Here, I show that combinatorial isotope effects are most important when two identical atoms are neighbors on the same molecule (e.g., O2, N2, and D-D clumping in CH4). When the two halves of an atom pair are either assembled with different isotopic preferences or drawn from different reservoirs, combinatorial effects cause depletions in clumped-isotope abundance that are most likely between zero and -1‰, although they could potentially be -10‰ or larger for D-D pairs. These depletions are of similar magnitude, but of opposite sign, to low-temperature equilibrium clumped-isotope effects for many small molecules. Enzymatic isotope-pairing reactions, which can have site-specific isotopic fractionation factors and atom reservoirs, should express this class of combinatorial isotope effect, although it is not limited to biological reactions. Chemical-kinetic isotope effects, which are related to a bond-forming transition state, arise independently and express second-order combinatorial effects related to the abundance of the rare isotope. Heteronuclear moieties (e.g., C–O and C–H) are insensitive to direct combinatorial influences, but secondary combinatorial influences are evident. In general, both combinatorial and chemical-kinetic factors are important for calculating and interpreting clumped-isotope signatures of kinetically controlled reactions.
I apply this analytical framework to isotope-pairing reactions relevant to geochemical oxygen, carbon, and nitrogen cycling that may be influenced by combinatorial clumped-isotope effects. These isotopic signatures, manifest as either directly bound isotope 'clumps' or as features of a molecule's isotopic anatomy, are linked to molecular mechanisms and may eventually provide additional information about biogeochemical cycling on environmentally relevant spatial scales.
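The two-reservoir combinatorial depletion described above can be written down directly (a sketch with illustrative abundances, not measured data): pairing atoms from reservoirs whose rare-isotope fractions differ slightly yields a clumped abundance below the stochastic expectation of the pooled reservoir.

```python
def combinatorial_depletion_permil(f1, f2):
    """Clumped-isotope anomaly (in permil) for a homonuclear pair (e.g. O2)
    whose two atoms come from reservoirs with rare-isotope fractions f1, f2,
    relative to the stochastic expectation of the pooled reservoir."""
    f_mean = (f1 + f2) / 2.0
    return (f1 * f2 / f_mean**2 - 1.0) * 1000.0

# Identical reservoirs: no combinatorial effect.
print(combinatorial_depletion_permil(0.002, 0.002))  # 0.0

# A ~3% difference between reservoirs: a sub-permil depletion,
# in the zero to -1 permil range discussed in the text.
print(round(combinatorial_depletion_permil(0.00200, 0.00206), 3))  # -0.218
```

The depletion is always negative because the product f1·f2 of two unequal fractions is smaller than the square of their mean, which is the combinatorial statistics the abstract refers to.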
The construction of combinatorial manifolds with prescribed sets of links of vertices
NASA Astrophysics Data System (ADS)
Gaifullin, A. A.
2008-10-01
To every oriented closed combinatorial manifold we assign the set (with repetitions) of isomorphism classes of links of its vertices. The resulting transformation ℒ is the main object of study in this paper. We pose an inversion problem for ℒ and show that this problem is closely related to Steenrod's problem on the realization of cycles and to the Rokhlin-Schwartz-Thom construction of combinatorial Pontryagin classes. We obtain a necessary condition for a set of isomorphism classes of combinatorial spheres to belong to the image of ℒ. (Sets satisfying this condition are said to be balanced.) We give an explicit construction showing that every balanced set of isomorphism classes of combinatorial spheres falls into the image of ℒ after passing to a multiple set and adding several pairs of the form (Z, -Z), where -Z is the sphere Z with the orientation reversed. Given any singular simplicial cycle ξ of a space X, this construction enables us to find explicitly a combinatorial manifold M and a map φ: M → X such that φ_*[M] = r[ξ] for some positive integer r. The construction is based on resolving singularities of ξ. We give applications of the main construction to cobordisms of manifolds with singularities and cobordisms of simple cells. In particular, we prove that every rational additive invariant of cobordisms of manifolds with singularities admits a local formula. Another application is the construction of explicit (though inefficient) local combinatorial formulae for polynomials in the rational Pontryagin classes of combinatorial manifolds.
ERIC Educational Resources Information Center
Barratt, Barnaby B.
1975-01-01
This study investigated the emergence of combinatorial competence in early adolescence and the effectiveness of a programmed discovery training procedure. Significant increases in combinatorial skill with age were shown; it was found that the expression of this skill was significantly facilitated if problems involved concrete material of low…
Invention as a combinatorial process: evidence from US patents
Youn, Hyejin; Strumsky, Deborah; Bettencourt, Luis M. A.; Lobo, José
2015-01-01
Invention has been commonly conceptualized as a search over a space of combinatorial possibilities. Despite the existence of a rich literature, spanning a variety of disciplines, elaborating on the recombinant nature of invention, we lack a formal and quantitative characterization of the combinatorial process underpinning inventive activity. Here, we use US patent records dating from 1790 to 2010 to formally characterize invention as a combinatorial process. To do this, we treat patented inventions as carriers of technologies and avail ourselves of the elaborate system of technology codes used by the United States Patent and Trademark Office to classify the technologies responsible for an invention's novelty. We find that the combinatorial inventive process exhibits an invariant rate of ‘exploitation’ (refinements of existing combinations of technologies) and ‘exploration’ (the development of new technological combinations). This combinatorial dynamic contrasts sharply with the creation of new technological capabilities—the building blocks to be combined—that has significantly slowed down. We also find that, notwithstanding the very reduced rate at which new technologies are introduced, the generation of novel technological combinations engenders a practically infinite space of technological configurations. PMID:25904530
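The exploitation/exploration dichotomy can be operationalized in a few lines. In this toy sketch (invented codes, not the USPTO classification data), a patent counts as 'exploration' if its combination of technology codes has not been seen before, and as 'exploitation' (a refinement) otherwise:

```python
# Each patent is represented by the set of technology codes assigned to it.
patents = [
    {"A01", "B21"},   # first appearance of this combination -> exploration
    {"A01", "B21"},   # repeat of a known combination        -> exploitation
    {"A01", "C07"},   # new combination                      -> exploration
    {"B21"},          # new (single-code) combination        -> exploration
    {"A01", "B21"},   # repeat                               -> exploitation
]

seen = set()
labels = []
for codes in patents:
    combo = frozenset(codes)
    labels.append("exploration" if combo not in seen else "exploitation")
    seen.add(combo)

print(labels.count("exploration"), labels.count("exploitation"))  # 3 2
```

Applied to the full patent record, the ratio of the two labels over time is what the study finds to be invariant, even as the introduction of genuinely new codes slows down.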
Seo, Hyung-Min; Jeon, Jong-Min; Lee, Ju Hee; Song, Hun-Suk; Joo, Han-Byul; Park, Sung-Hee; Choi, Kwon-Young; Kim, Yong Hyun; Park, Kyungmoon; Ahn, Jungoh; Lee, Hongweon; Yang, Yung-Hun
2016-01-01
Furfural is a toxic by-product formed during pretreatment of lignocellulosic biomass. In order to utilize lignocellulosic biomass for isobutanol production, the inhibitory effect of furfural on isobutanol production was investigated, and combinatorial application of two oxidoreductases, FucO and YqhD, was suggested as an alternative strategy. Furfural decreased cell growth and isobutanol production when only YqhD or FucO was employed as an isobutyraldehyde oxidoreductase. However, combinatorial overexpression of FucO and YqhD overcame the inhibitory effect of furfural, giving isobutanol production 110% higher than with overexpression of YqhD alone. The combinatorial oxidoreductases increased the furfural detoxification rate 2.1-fold and also accelerated glucose consumption 1.4-fold. Compared with another known system that increases furfural tolerance, the membrane-bound transhydrogenase (pntAB), the combinatorial aldehyde oxidoreductases performed better in terms of cell growth and production. Thus, controlling oxidoreductases is important for producing isobutanol from furfural-containing biomass, and the combinatorial overexpression of FucO and YqhD can be an alternative strategy.
Combinatorial Methods for Exploring Complex Materials
NASA Astrophysics Data System (ADS)
Amis, Eric J.
2004-03-01
Combinatorial and high-throughput methods have changed the paradigm of pharmaceutical synthesis and have begun to have a similar impact on materials science research. Already there are examples of combinatorial methods used for inorganic materials, catalysts, and polymer synthesis. For many investigations the primary goal has been discovery of new material compositions that optimize properties such as phosphorescence or catalytic activity. In the midst of the excitement generated to "make things", another opportunity arises for materials science to "understand things" by using the efficiency of combinatorial methods. We have shown that combinatorial methods hold potential for rapid and systematic generation of experimental data over the multi-parameter space typical of investigations in polymer physics. We have applied the combinatorial approach to studies of polymer thin films, biomaterials, polymer blends, filled polymers, and semicrystalline polymers. By combining library fabrication, high-throughput measurements, informatics, and modeling we can demonstrate validation of the methodology, new observations, and developments toward predictive models. This talk will present some of our latest work with applications to coating stability, multi-component formulations, and nanostructure assembly.
Tumor-targeting peptides from combinatorial libraries*
Liu, Ruiwu; Li, Xiaocen; Xiao, Wenwu; Lam, Kit S.
2018-01-01
Cancer is one of the major and leading causes of death worldwide. Two of the greatest challenges in fighting cancer are early detection and effective treatments with no or minimum side effects. Widespread use of targeted therapies and molecular imaging in clinics requires high affinity, tumor-specific agents as effective targeting vehicles to deliver therapeutics and imaging probes to the primary or metastatic tumor sites. Combinatorial libraries such as phage-display and one-bead one-compound (OBOC) peptide libraries are powerful approaches in discovering tumor-targeting peptides. This review gives an overview of different combinatorial library technologies that have been used for the discovery of tumor-targeting peptides. Examples of tumor-targeting peptides identified from each combinatorial library method will be discussed. Published tumor-targeting peptide ligands and their applications will also be summarized by the combinatorial library methods and their corresponding binding receptors. PMID:27210583
Identification of combinatorial drug regimens for treatment of Huntington's disease using Drosophila
NASA Astrophysics Data System (ADS)
Agrawal, Namita; Pallos, Judit; Slepko, Natalia; Apostol, Barbara L.; Bodai, Laszlo; Chang, Ling-Wen; Chiang, Ann-Shyn; Michels Thompson, Leslie; Marsh, J. Lawrence
2005-03-01
We explore the hypothesis that pathology of Huntington's disease involves multiple cellular mechanisms whose contributions to disease are incrementally additive or synergistic. We provide evidence that the photoreceptor neuron degeneration seen in flies expressing mutant human huntingtin correlates with widespread degenerative events in the Drosophila CNS. We use a Drosophila Huntington's disease model to establish dose regimens and protocols to assess the effectiveness of drug combinations used at low threshold concentrations. These proof of principle studies identify at least two potential combinatorial treatment options and illustrate a rapid and cost-effective paradigm for testing and optimizing combinatorial drug therapies while reducing side effects for patients with neurodegenerative disease. The potential for using prescreening in Drosophila to inform combinatorial therapies that are most likely to be effective for testing in mammals is discussed.
Nonparametric Combinatorial Sequence Models
NASA Astrophysics Data System (ADS)
Wauthier, Fabian L.; Jordan, Michael I.; Jojic, Nebojsa
This work considers biological sequences that exhibit combinatorial structures in their composition: groups of positions of the aligned sequences are "linked" and covary as one unit across sequences. If multiple such groups exist, complex interactions can emerge between them. Sequences of this kind arise frequently in biology but methodologies for analyzing them are still being developed. This paper presents a nonparametric prior on sequences which allows combinatorial structures to emerge and which induces a posterior distribution over factorized sequence representations. We carry out experiments on three sequence datasets which indicate that combinatorial structures are indeed present and that combinatorial sequence models can more succinctly describe them than simpler mixture models. We conclude with an application to MHC binding prediction which highlights the utility of the posterior distribution induced by the prior. By integrating out the posterior our method compares favorably to leading binding predictors.
Dynamic combinatorial libraries: from exploring molecular recognition to systems chemistry.
Li, Jianwei; Nowak, Piotr; Otto, Sijbren
2013-06-26
Dynamic combinatorial chemistry (DCC) is a subset of combinatorial chemistry where the library members interconvert continuously by exchanging building blocks with each other. Dynamic combinatorial libraries (DCLs) are powerful tools for discovering the unexpected and have given rise to many fascinating molecules, ranging from interlocked structures to self-replicators. Furthermore, dynamic combinatorial molecular networks can produce emergent properties at systems level, which provide exciting new opportunities in systems chemistry. In this perspective we will highlight some new methodologies in this field and analyze selected examples of DCLs that are under thermodynamic control, leading to synthetic receptors, catalytic systems, and complex self-assembled supramolecular architectures. Also reviewed are extensions of the principles of DCC to systems that are not at equilibrium and may therefore harbor richer functional behavior. Examples include self-replication and molecular machines.
Zhou, Qian-Mei; Chen, Qi-Long; Du, Jia; Wang, Xiu-Feng; Lu, Yi-Yu; Zhang, Hui; Su, Shi-Bing
2014-01-01
In order to explore the synergistic mechanisms of combinatorial treatment using curcumin and mitomycin C (MMC) for breast cancer, MCF-7 breast cancer xenograft experiments were conducted to observe the synergistic effect of combinatorial treatment using curcumin and MMC at various dosages. The synergistic mechanisms of combinatorial treatment using curcumin and MMC on the inhibition of tumor growth were explored by differential gene expression profiling, gene ontology (GO), ingenuity pathway analysis (IPA) and Signal–Net network analysis. The expression levels of selected genes identified by cDNA microarray expression profiling were validated by quantitative RT-PCR (qRT-PCR) and Western blot analysis. The effect of combinatorial treatment on the inhibition of cell growth was observed by MTT assay. Apoptosis was detected by flow cytometric analysis and Hoechst 33258 staining. The combinatorial treatment of 100 mg/kg curcumin and 1.5 mg/kg MMC revealed synergistic inhibition of tumor growth. Among 1501 differentially expressed genes, the expression of 25 genes exhibited an obvious change, and a significant difference in 27 signal pathways was observed (p < 0.05). In addition, Mapk1 (ERK) and Mapk14 (MAPK p38) had more cross-interactions with other genes and showed increases in expression of 8.14- and 11.84-fold, respectively, during the combinatorial treatment with curcumin and MMC when compared with the control. Moreover, curcumin can synergistically improve the tumoricidal effect of MMC in another human breast cancer cell line, MDA-MB-231. Apoptosis was significantly induced by the combinatorial treatment (p < 0.05) and significantly inhibited by the ERK inhibitor PD98059 in MCF-7 cells (p < 0.05). The synergistic effect of combinatorial treatment with curcumin and MMC on the induction of apoptosis in breast cancer cells may be via the ERK pathway. PMID:25226537
Combinatorial theory of Macdonald polynomials I: proof of Haglund's formula.
Haglund, J; Haiman, M; Loehr, N
2005-02-22
Haglund recently proposed a combinatorial interpretation of the modified Macdonald polynomials H(μ). We give a combinatorial proof of this conjecture, which establishes the existence and integrality of H(μ). As corollaries, we obtain the cocharge formula of Lascoux and Schützenberger for Hall-Littlewood polynomials, a formula of Sahi and Knop for Jack's symmetric functions, a generalization of this result to the integral Macdonald polynomials J(μ), a formula for H(μ) in terms of Lascoux-Leclerc-Thibon polynomials, and combinatorial expressions for the Kostka-Macdonald coefficients K(λ,μ) when μ is a two-column shape.
Combinatorial operad actions on cochains
NASA Astrophysics Data System (ADS)
Berger, Clemens; Fresse, Benoit
2004-07-01
A classical E-infinity operad is formed by the bar construction of the symmetric groups. Such an operad has been introduced by M. Barratt and P. Eccles in the context of simplicial sets in order to have an analogue of the Milnor FK-construction for infinite loop spaces. The purpose of this paper is to prove that the associative algebra structure on the normalized cochain complex of a simplicial set extends to the structure of an algebra over the Barratt-Eccles operad. We also prove that differential graded algebras over the Barratt-Eccles operad form a closed model category. Similar results hold for the normalized Hochschild cochain complex of an associative algebra. More precisely, the Hochschild cochain complex is acted on by a suboperad of the Barratt-Eccles operad which is equivalent to the classical little squares operad.
2014-06-01
between projecting an image of strength and convincing the world that it valued the city-state’s economic success and the laissez-faire sociopolitical...immediately. Poland had not existed as a country when World War I was fought, but its native leadership had a window of opportunity afterward to...reforms in the late 1980s removed from East Germany’s Communist leadership the support that they had long depended on, and the East German people took
Signal dimensionality and the emergence of combinatorial structure.
Little, Hannah; Eryılmaz, Kerem; de Boer, Bart
2017-11-01
In language, a small number of meaningless building blocks can be combined into an unlimited set of meaningful utterances. This is known as combinatorial structure. One hypothesis for the initial emergence of combinatorial structure in language is that recombining elements of signals solves the problem of overcrowding in a signal space. Another hypothesis is that iconicity may impede the emergence of combinatorial structure. However, how these two hypotheses relate to each other is not often discussed. In this paper, we explore how signal space dimensionality relates to both overcrowding in the signal space and iconicity. We use an artificial signalling experiment to test whether a signal space and a meaning space having similar topologies will generate an iconic system and whether, when the topologies differ, the emergence of combinatorially structured signals is facilitated. In our experiments, signals are created from participants' hand movements, which are measured using an infrared sensor. We found that participants take advantage of iconic signal-meaning mappings where possible. Further, we use trajectory predictability, measures of variance, and Hidden Markov Models to measure the use of structure within the signals produced and found that when topologies do not match, then there is more evidence of combinatorial structure. The results from these experiments are interpreted in the context of the differences between the emergence of combinatorial structure in different linguistic modalities (speech and sign). Copyright © 2017 Elsevier B.V. All rights reserved.
ERIC Educational Resources Information Center
Stevens, Victoria
2014-01-01
The author considers combinatory play as an intersection between creativity, play, and neuroaesthetics. She discusses combinatory play as vital to the creative process in art and science, particularly with regard to the incubation of new ideas. She reviews findings from current neurobiological research and outlines the way that the brain activates…
Kim, Hyo Jin; Turner, Timothy Lee; Jin, Yong-Su
2013-11-01
Recent advances in metabolic engineering have enabled microbial factories to compete with conventional processes for producing fuels and chemicals. Both rational and combinatorial approaches coupled with synthetic and systematic tools play central roles in metabolic engineering to create and improve a selected microbial phenotype. Compared to knowledge-based rational approaches, combinatorial approaches exploiting biological diversity and high-throughput screening have been demonstrated as more effective tools for improving various phenotypes of interest. In particular, identification of unprecedented targets to rewire metabolic circuits for maximizing yield and productivity of a target chemical has been made possible. This review highlights general principles and the features of the combinatorial approaches using various libraries to implement desired phenotypes for strain improvement. In addition, recent applications that harnessed the combinatorial approaches to produce biofuels and biochemicals will be discussed. Copyright © 2013 Elsevier Inc. All rights reserved.
Tumor-targeting peptides from combinatorial libraries.
Liu, Ruiwu; Li, Xiaocen; Xiao, Wenwu; Lam, Kit S
2017-02-01
Cancer is one of the major and leading causes of death worldwide. Two of the greatest challenges in fighting cancer are early detection and effective treatments with no or minimum side effects. Widespread use of targeted therapies and molecular imaging in clinics requires high affinity, tumor-specific agents as effective targeting vehicles to deliver therapeutics and imaging probes to the primary or metastatic tumor sites. Combinatorial libraries such as phage-display and one-bead one-compound (OBOC) peptide libraries are powerful approaches in discovering tumor-targeting peptides. This review gives an overview of different combinatorial library technologies that have been used for the discovery of tumor-targeting peptides. Examples of tumor-targeting peptides identified from each combinatorial library method will be discussed. Published tumor-targeting peptide ligands and their applications will also be summarized by the combinatorial library methods and their corresponding binding receptors. Copyright © 2017. Published by Elsevier B.V.
Investigating the properties of low-mass AGN and their connection to unification models
NASA Astrophysics Data System (ADS)
Hood, Carol Elizabeth
The most basic model of active galactic nuclei (AGN) suggests that the observational differences between Type 1 and Type 2 objects are due solely to the orientation angle of the object. Although some questions remain about the structures surrounding the central engines of AGN, such as whether the obscuring region is a dusty torus or an outflowing wind, observations (e.g. the detection of broad lines in the polarized light of some Type 2 objects) have proved consistent with predictions and continue to strengthen the case for unification. However, many are still searching for "true" Type 2 objects. These objects look optically like other Type 2 objects, but instead of having their broad line region blocked from the line of sight by the obscuring region, they are believed to lack a broad line region altogether. Others have predicted that at low luminosity or low accretion rate the broad line region will disappear, leaving all objects to look optically like Type 2 objects, regardless of their level of intrinsic absorption. Low-mass (< 10^6 solar masses) AGN provide interesting environments in which these unification models can be studied. We present an in-depth multi-wavelength study of one of the prototypical low-mass AGN, POX 52, investigating the properties of the central engine along with those of the host galaxy. In addition, we examine the X-ray properties of a sample of Type 2 objects observed with XMM-Newton and the IR properties of a sample of both Type 1 and Type 2 objects observed with the Spitzer Infrared Spectrograph, in order to study the absorption properties of these objects and test the validity of unification models in the low-mass regime. We find little to no evidence of any "true" Type 2 objects in any of our samples, and show that in all tests performed, low-mass AGN appear to be simply scaled-down versions of their more massive counterparts, keeping current unification models intact down to the lowest black hole masses probed to date.
Discovery of proton decay: A must for theory, a challenge for experiment
NASA Astrophysics Data System (ADS)
Pati, Jogesh C.
2000-08-01
It is noted that, but for one missing piece—proton decay—the evidence in support of grand unification is now strong. It includes: (i) the observed family-structure, (ii) the meeting of the gauge couplings, (iii) neutrino-oscillations, (iv) the intricate pattern of the masses and mixings of all fermions, including the neutrinos, and (v) the need for B-L as a generator, to implement baryogenesis. Taken together, these not only favor grand unification but in fact select out a particular route to such unification, based on the ideas of supersymmetry, SU(4)-color and left-right symmetry. Thus they point to the relevance of an effective string-unified G(224) or SO(10)-symmetry. A concrete proposal is presented, within a predictive SO(10)/G(224)-framework, that successfully describes the masses and mixings of all fermions, including the neutrinos—with eight predictions, all in agreement with observation. Within this framework, a systematic study of proton decay is carried out, which pays special attention to its dependence on the fermion masses, including the superheavy Majorana masses of the right-handed neutrinos. The study shows that a conservative upper limit on the proton lifetime is about (1/2-1)×10^34 yr, with ν̄K⁺ being the dominant decay mode, and, as a distinctive feature, μ⁺K⁰ being prominent. This in turn strongly suggests that an improvement in the current sensitivity by a factor of five to ten (compared to SuperK) ought to reveal proton decay. Otherwise some promising and remarkably successful ideas on unification would suffer a major setback.
NASA Astrophysics Data System (ADS)
Tamai, Toshiyuki; Teramoto, Shuntarou; Kimura, Makoto
Steel pipe piles with wings installed in soil cement columns form a composite pile foundation consisting of cement-improved soil and a winged steel pipe. This type of pile shows higher vertical bearing capacity than steel pipe piles installed without soil cement, and the wings are thought to contribute to this higher bearing capacity. The wings are also thought to play a role in the structural unification of the pile foundation and in load transfer. In this study, a model test and a 3D elastic finite element analysis were carried out to elucidate the effect of the wings on the structural unification of the pile foundation and on the load transfer mechanism. First, the model test was carried out to assess the influence of piles with and without wings, the shape of the wings, and the unconfined compression strength of the soil cement on the structural unification of the pile foundation. A numerical analysis of the model test was then carried out on the intermediate part of the winged pile foundation, and a mathematical model was developed. Finally, the load transfer mechanism was checked for the entire length of the pile through this mathematical model, and the load sharing ratio of the wings and the stress distribution occurring in the soil cement were clarified. In addition, the effect of the wing interval on the structural unification of the pile foundation and on load transfer was also checked and clarified.
NASA Astrophysics Data System (ADS)
Amjadiparvar, Babak; Sideris, Michael
2015-04-01
Precise gravimetric geoid heights are required when the unification of vertical datums is performed using the Geodetic Boundary Value Problem (GBVP) approach. Five generations of Global Geopotential Models (GGMs) derived from Gravity field and steady-state Ocean Circulation Explorer (GOCE) observations have been computed and released so far (available via IAG's International Centre for Global Earth Models, ICGEM, http://icgem.gfz-potsdam.de/ICGEM/). The performance of many of these models with respect to geoid determination has been studied in order to select the best performing model to be used in height datum unification in North America. More specifically, Release-3, 4 and 5 of the GOCE-based global geopotential models have been evaluated using GNSS-levelling data as independent control values. Comparisons against EGM2008 show that each successive release improves upon the previous one, with Release-5 models showing an improvement over EGM2008 in Canada and CONUS between spherical harmonic degrees 100 and 210. In Alaska and Mexico, a considerable improvement over EGM2008 was brought by the Release-5 models when used up to spherical harmonic degrees of 250 and 280, respectively. The positive impact of the Release-5 models was also felt when a gravimetric geoid was computed using the GOCE-based GGMs together with gravity and topography data in Canada. This geoid model, with appropriately modified Stokes kernel between spherical harmonic degrees 190 and 260, performed better than the official Canadian gravimetric geoid model CGG2013, thus illustrating the advantages of using the latest release GOCE-based models for vertical datum unification in North America.
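The GNSS-levelling evaluation described above can be sketched in a few lines. This is an illustration with invented numbers, not the study's data: at each benchmark, the GNSS ellipsoidal height h minus the levelled height H gives an observed geoid height, and a geopotential model's quality is judged by the bias and spread of its residuals against those observations.

```python
# Illustrative sketch (hypothetical values) of evaluating a global
# geopotential model against GNSS-levelling control data.
import statistics

# (h, H, N_model) per benchmark, in metres -- invented for illustration
benchmarks = [
    (102.314, 85.902, 16.430),
    ( 97.880, 81.445, 16.452),
    (110.205, 93.701, 16.521),
    ( 95.430, 78.990, 16.418),
]

# residual = observed geoid height (h - H) minus modelled geoid height N
residuals = [h - H - n for h, H, n in benchmarks]
bias = statistics.mean(residuals)      # constant datum-type offset
spread = statistics.stdev(residuals)   # model quality after removing bias

print(f"bias = {bias:.3f} m, stdev = {spread:.3f} m")
```

In practice the comparison is repeated per model release and per truncation degree (e.g. 100 to 280), and the release with the smallest spread is preferred.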
Morphological Constraints on Cerebellar Granule Cell Combinatorial Diversity.
Gilmer, Jesse I; Person, Abigail L
2017-12-13
Combinatorial expansion by the cerebellar granule cell layer (GCL) is fundamental to theories of cerebellar contributions to motor control and learning. Granule cells (GrCs) sample approximately four mossy fiber inputs and are thought to form a combinatorial code useful for pattern separation and learning. We constructed a spatially realistic model of the cerebellar GCL and examined how GCL architecture contributes to GrC combinatorial diversity. We found that GrC combinatorial diversity saturates quickly as mossy fiber input diversity increases, and that this saturation is in part a consequence of short dendrites, which limit access to diverse inputs and favor dense sampling of local inputs. This local sampling also produced GrCs that were combinatorially redundant, even when input diversity was extremely high. In addition, we found that mossy fiber clustering, which is a common anatomical pattern, also led to increased redundancy of GrC input combinations. We related this redundancy to hypothesized roles of temporal expansion of GrC information encoding in service of learned timing, and we show that GCL architecture produces GrC populations that support both temporal and combinatorial expansion. Finally, we used novel anatomical measurements from mice of either sex to inform modeling of sparse and filopodia-bearing mossy fibers, finding that these circuit features uniquely contribute to enhancing GrC diversification and redundancy. Our results complement information theoretic studies of granule layer structure and provide insight into the contributions of granule layer anatomical features to afferent mixing. SIGNIFICANCE STATEMENT Cerebellar granule cells are among the simplest neurons, with tiny somata and, on average, just four dendrites. These characteristics, along with their dense organization, inspired influential theoretical work on the granule cell layer as a combinatorial expander, where each granule cell represents a unique combination of inputs. 
Despite the centrality of these theories to cerebellar physiology, the degree of expansion supported by anatomically realistic patterns of inputs is unknown. Using modeling and anatomy, we show that realistic input patterns constrain combinatorial diversity by producing redundant combinations, which nevertheless could support temporal diversification of like combinations, suitable for learned timing. Our study suggests a neural substrate for producing high levels of both combinatorial and temporal diversity in the granule cell layer. Copyright © 2017 the authors.
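The saturation effect described in the abstract above can be shown with a toy simulation. This is an assumption-laden sketch, not the authors' spatially realistic model: each simulated granule cell samples four mossy-fiber inputs, and restricting sampling to a small local pool (mimicking short dendrites) caps the number of distinct input combinations a population can represent.

```python
# Toy model (invented parameters): granule cells sample 4 mossy-fiber
# inputs; a small local pool stands in for short-dendrite sampling.
import random

random.seed(0)

def unique_combinations(n_cells, pool_size, dendrites=4):
    """Count distinct 4-input combinations across a cell population."""
    combos = set()
    pool = range(pool_size)
    for _ in range(n_cells):
        combos.add(frozenset(random.sample(pool, dendrites)))
    return len(combos)

local = unique_combinations(5000, pool_size=10)   # short dendrites
wide = unique_combinations(5000, pool_size=100)   # unrestricted access

# With only C(10,4) = 210 possible combinations, the local population
# saturates and becomes redundant; the wide pool stays diverse.
print(local, wide)
```

The qualitative point matches the abstract: combinatorial diversity saturates quickly when dendrites can only reach a local input pool, producing many redundant granule cells.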
Bauer, J Edgar
2005-01-01
Two prominent representatives of the sexual emancipation movement in Germany, John Henry Mackay (1864-1933) and Magnus Hirschfeld (1868-1935), launched significant attacks on sexual binarism and its combinatories. Although Mackay defended the nameless love against seminal Christian and subsequent secularised misconstructions of its nature, he was unable to overcome the fundamental scheme of binomic sexuality. Hirschfeld, however, resolved the theoretical issue through his doctrine of sexual intermediaries (Zwischenstufenlehre), which purports that, without exception, all human beings are intersexual variants, i.e. unique composites of different proportions of masculinity and femininity. Since these proportions vary from one sexual layer of description to another in the same individual and can alter or be altered in time, it is sensu stricto not possible to postulate discrete sexual categories. Hirschfeld's doctrine implies a radical deconstruction of not only binomic sexuality but also its supplementation through a third sex. It offers a meta-theoretical framework for rethinking sexual difference beyond the fictional schemes and categorial closures of Western traditions of sexual identity. His assumption of potentially infinite sexualities anticipates some of the basic tenets forwarded by the philosophical and political agendas of queer studies.
An experiment to detect GUT monopoles
NASA Technical Reports Server (NTRS)
Macneill, G.; Fegan, D. J.
1985-01-01
Recent advances in the development of Grand Unification Theories have led to several interesting predictions. One of these states that Grand Unification Monopoles (GUMs) exist as solutions in many nonabelian gauge theories. Another consequence of unification is the possibility of baryon decay. The efficiency of the water tank detector in registering a Rubakov-type decay will vary with both the interaction length and the GUM's velocity, expressed in terms of beta (0.01). The efficiency decreases at large values of beta because of the limited resolving time of the detector (approx. 50 ns). At lower values of beta the time between interactions is such that the criterion of 4 events in 2 μs can no longer be satisfied. The Rubakov experiment has now been in operation for almost 2 years with an estimated live time of 80%. During this time no candidate events have been observed, leading to an estimated upper limit on the flux of 7.82 × 10^-5 m^-2 d^-1 sr^-1. The ionization loss detection system has only recently come on line and as yet no results are available from this experiment.
Reflections on the nature of the concepts of field in physics
NASA Astrophysics Data System (ADS)
Pombo, C.
2012-12-01
This paper is a short introduction to the analysis of the concepts of field in physics, showing their different natures. It comprises a study on the development of observers based on observational realism, a physical epistemology under development on the basis of analytical psychology. This epistemology incorporates and justifies R. Carnap's proposition of separating the observational and theoretical domains of a theory, and gives a criterion for this separation. The foundations of three theories in which concepts of field emerge are discussed. We discuss the different origins and meanings of these fields, from an epistemological point of view, in their respective theories. The aim of this paper is to form a basis of discussion to be applied in the analysis of other theories where concepts of field are present, in order to reach a better understanding of the contemporary programs of unification. We would like to clarify whether these programs are intended as the unification of fields as elements of physical reality, of fields as explanations for the observations, the unification of their theories, or other possible cases.
Low-luminosity Blazars in Wise: A Mid-infrared View of Unification
NASA Astrophysics Data System (ADS)
Plotkin, Richard M.; Anderson, S. F.; Brandt, W. N.; Markoff, S.; Shemmer, O.; Wu, J.
2012-01-01
We use the preliminary data release from the Wide-Field Infrared Survey Explorer (WISE) to perform the first statistical study of the mid-infrared (IR) properties of a large number (~102) of BL Lac objects -- low-luminosity Active Galactic Nuclei (AGN) with a jet beamed toward the Earth. As expected, many BL Lac objects are so highly beamed that their jet synchrotron emission dominates their IR spectral energy distributions (SEDs), and the shape of their SEDs in the IR correlates well with SED peak frequency. In other BL Lac objects the jet is not strong enough to completely dilute the rest of the AGN, yet we do not see observational signatures of the dusty torus from these weakly beamed BL Lac objects. While at odds with simple unification, the missing torus is consistent with recent suggestions that BL Lac objects are fed by radiatively inefficient accretion flows. We discuss implications for the "nature vs. nurture" debate for FR I and FR II galaxies, and also for the standard orientation-based AGN unification model.
NASA Astrophysics Data System (ADS)
Burello, E.; Bologa, C.; Frecer, V.; Miertus, S.
Combinatorial chemistry and technologies have been developed to a stage where synthetic schemes are available for generation of a large variety of organic molecules. The innovative concept of combinatorial design assumes that screening of a large and diverse library of compounds will increase the probability of finding an active analogue among the compounds tested. Since the rate at which libraries are screened for activity currently constitutes a limitation to the use of combinatorial technologies, it is important to be selective about the number of compounds to be synthesized. Early experience with combinatorial chemistry indicated that chemical diversity alone did not result in a significant increase in the number of generated lead compounds. Emphasis has therefore been increasingly put on the use of computer-assisted combinatorial chemical techniques. Computational methods are valuable in the design of virtual libraries of molecular models. Selection strategies based on computed physicochemical properties of the models or of a target compound are introduced to reduce the time and costs of library synthesis and screening. In addition, computational structure-based library focusing methods can be used to perform in silico screening of the activity of compounds against a target receptor by docking the ligands into the receptor model. Three case studies are discussed dealing with the design of targeted combinatorial libraries of inhibitors of HIV-1 protease, P. falciparum plasmepsin and human urokinase as potential antiviral, antimalarial and anticancer drugs. These illustrate library focusing strategies.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harnik, Roni
2004-10-27
Supersymmetric models have traditionally been assumed to be perturbative up to high scales due to the requirement of calculable unification. In this note I review the recently proposed `Fat Higgs' model which relaxes the requirement of perturbativity. In this framework, an NMSSM-like trilinear coupling becomes strong at some intermediate scale. The NMSSM Higgses are meson composites of an asymptotically-free gauge theory. This allows us to raise the mass of the Higgs, thus alleviating the MSSM of its fine tuning problem. Despite the strong coupling at an intermediate scale, the UV completion allows us to maintain gauge coupling unification.
Dibó, Gábor
2012-02-01
Combinatorial chemistry was introduced in the 1980s. It provided the possibility to produce new compounds in practically unlimited number. New strategies and technologies have also been developed that made it possible to screen very large number of compounds and to identify useful components in mixtures containing millions of different substances. This dramatically changed the drug discovery process and the way of thinking of synthetic chemists. In addition, combinatorial strategies became useful in areas such as pharmaceutical research, agrochemistry, catalyst design, and materials research. Prof. Árpád Furka is one of the pioneers of combinatorial chemistry.
Liao, Chenzhong; Liu, Bing; Shi, Leming; Zhou, Jiaju; Lu, Xian-Ping
2005-07-01
Based on the structural characteristics of PPAR modulators, a virtual combinatorial library containing 1,226,625 compounds was constructed using SMILES strings. Selected ADME filters were employed to expel compounds having poor drug-like properties from this library. The library was converted to sdf and mol2 files by CONCORD 4.0, and was then docked to PPARgamma by DOCK 4.0 to identify new chemical entities that may be potential drug leads against type 2 diabetes and other metabolic diseases. The method of constructing a virtual combinatorial library using SMILES strings was further visualized with Visual Basic .NET, which can facilitate the generation of other types of virtual combinatorial libraries.
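The string-based enumeration idea behind such libraries can be sketched briefly. This is a hedged illustration, not the paper's implementation: the scaffold and fragment SMILES below are invented placeholders, and a real workflow would validate each product with a cheminformatics toolkit before filtering and docking.

```python
# Hypothetical sketch: enumerating a virtual combinatorial library as
# SMILES strings by joining fragment SMILES at variable positions of a
# fixed scaffold. Scaffold and fragments are invented for illustration.
import itertools

scaffold = "c1ccc(cc1){R1}C(=O){R2}"          # hypothetical core
r1_groups = ["N", "O", "S"]                   # linker fragments
r2_groups = ["NC", "NCC", "N(C)C", "OCC"]     # substituent fragments

library = [
    scaffold.replace("{R1}", r1).replace("{R2}", r2)
    for r1, r2 in itertools.product(r1_groups, r2_groups)
]

print(len(library))    # 3 * 4 = 12 enumerated SMILES
print(library[0])
```

Library size grows multiplicatively with the number of variable positions and fragments per position, which is how libraries of over a million compounds arise from modest building-block sets.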
Systematic Identification of Combinatorial Drivers and Targets in Cancer Cell Lines
Tabchy, Adel; Eltonsy, Nevine; Housman, David E.; Mills, Gordon B.
2013-01-01
There is an urgent need to elicit and validate highly efficacious targets for combinatorial intervention from large scale ongoing molecular characterization efforts of tumors. We established an in silico bioinformatic platform in concert with a high throughput screening platform evaluating 37 novel targeted agents in 669 extensively characterized cancer cell lines reflecting the genomic and tissue-type diversity of human cancers, to systematically identify combinatorial biomarkers of response and co-actionable targets in cancer. Genomic biomarkers discovered in a 141 cell line training set were validated in an independent 359 cell line test set. We identified co-occurring and mutually exclusive genomic events that represent potential drivers and combinatorial targets in cancer. We demonstrate multiple cooperating genomic events that predict sensitivity to drug intervention independent of tumor lineage. The coupling of scalable in silico and biologic high throughput cancer cell line platforms for the identification of co-events in cancer delivers rational combinatorial targets for synthetic lethal approaches with a high potential to pre-empt the emergence of resistance. PMID:23577104
Hernando, Leticia; Mendiburu, Alexander; Lozano, Jose A
2013-01-01
The solution of many combinatorial optimization problems is carried out by metaheuristics, which generally make use of local search algorithms. These algorithms use some kind of neighborhood structure over the search space. The performance of the algorithms strongly depends on the properties that the neighborhood imposes on the search space. One of these properties is the number of local optima. Given an instance of a combinatorial optimization problem and a neighborhood, the estimation of the number of local optima can help not only to measure the complexity of the instance, but also to choose the most convenient neighborhood to solve it. In this paper we review and evaluate several methods to estimate the number of local optima in combinatorial optimization problems. The methods reviewed not only come from the combinatorial optimization literature, but also from the statistical literature. A thorough evaluation in synthetic as well as real problems is given. We conclude by providing recommendations of methods for several scenarios.
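One family of methods the review covers can be sketched concretely. This toy (invented landscape and parameters, not the paper's benchmarks) restarts hill-climbing from random solutions, records which local optimum each restart reaches, and applies a nonparametric richness estimator from the statistical literature (Chao1) to the observed counts.

```python
# Sketch: estimating the number of local optima of a random landscape
# over bit strings, with Hamming-1 neighborhoods. All choices here
# (N, restarts, the random landscape) are illustrative assumptions.
import random
from collections import Counter

random.seed(42)
N = 10
value = {}  # random fitness landscape over bit strings, built lazily

def f(x):
    if x not in value:
        value[x] = random.random()
    return value[x]

def hill_climb(x):
    while True:
        neighbors = [x ^ (1 << i) for i in range(N)]  # flip one bit
        best = max(neighbors, key=f)
        if f(best) <= f(x):
            return x          # x is a local optimum
        x = best

counts = Counter(hill_climb(random.randrange(1 << N)) for _ in range(300))
k = len(counts)                                   # optima actually seen
f1 = sum(1 for c in counts.values() if c == 1)    # seen exactly once
f2 = sum(1 for c in counts.values() if c == 2)    # seen exactly twice
chao1 = k + (f1 * f1) / (2 * f2) if f2 else k     # richness estimate
print(k, round(chao1, 1))
```

The estimate always lies at or above the number of optima actually observed; the review compares estimators of exactly this kind against each other on synthetic and real instances.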
NASA Astrophysics Data System (ADS)
Tong, Wei
2017-04-01
Combinatorial materials research offers fast and efficient routes to identifying promising advanced materials. It has revolutionized the pharmaceutical industry and is now being applied to accelerate the discovery of other new compounds, e.g. superconductors, luminescent materials, and catalysts. Differing from the traditional trial-and-error process, this approach allows the synthesis of a large number of compositionally diverse compounds by varying the combinations of the components and adjusting their ratios. It largely reduces the cost of single-sample synthesis and characterization, along with the turnaround time of the materials discovery process, and therefore could dramatically change the existing paradigm for discovering and commercializing new materials. This talk outlines the use of the combinatorial materials approach to materials discovery in the transportation sector. It covers a general introduction to the combinatorial materials concept and the state of the art of its application in energy-related research. At the end, LBNL capabilities in combinatorial materials synthesis and high-throughput characterization that are applicable to materials discovery research will be highlighted.
Discovery of the leinamycin family of natural products by mining actinobacterial genomes.
Pan, Guohui; Xu, Zhengren; Guo, Zhikai; Hindra; Ma, Ming; Yang, Dong; Zhou, Hao; Gansemans, Yannick; Zhu, Xiangcheng; Huang, Yong; Zhao, Li-Xing; Jiang, Yi; Cheng, Jinhua; Van Nieuwerburgh, Filip; Suh, Joo-Won; Duan, Yanwen; Shen, Ben
2017-12-26
Nature's ability to generate diverse natural products from simple building blocks has inspired combinatorial biosynthesis. The knowledge-based approach to combinatorial biosynthesis has allowed the production of designer analogs by rational metabolic pathway engineering. While successful, structural alterations are limited, with designer analogs often produced in compromised titers. The discovery-based approach to combinatorial biosynthesis complements the knowledge-based approach by exploring the vast combinatorial biosynthesis repertoire found in Nature. Here we showcase the discovery-based approach to combinatorial biosynthesis by targeting the domain of unknown function and cysteine lyase domain (DUF-SH) didomain, specific for sulfur incorporation from the leinamycin (LNM) biosynthetic machinery, to discover the LNM family of natural products. By mining bacterial genomes from public databases and the actinomycetes strain collection at The Scripps Research Institute, we discovered 49 potential producers that could be grouped into 18 distinct clades based on phylogenetic analysis of the DUF-SH didomains. Further analysis of the representative genomes from each of the clades identified 28 lnm-type gene clusters. Structural diversities encoded by the LNM-type biosynthetic machineries were predicted based on bioinformatics and confirmed by in vitro characterization of selected adenylation proteins and isolation and structural elucidation of the guangnanmycins and weishanmycins. These findings demonstrate the power of the discovery-based approach to combinatorial biosynthesis for natural product discovery and structural diversity and highlight Nature's rich biosynthetic repertoire. Comparative analysis of the LNM-type biosynthetic machineries provides outstanding opportunities to dissect Nature's biosynthetic strategies and apply these findings to combinatorial biosynthesis for natural product discovery and structural diversity.
Liu, Zhi-Hua; Xie, Shangxian; Lin, Furong; Jin, Mingjie; Yuan, Joshua S
2018-01-01
Lignin valorization has recently been considered to be an essential process for sustainable and cost-effective biorefineries. Lignin represents a potential new feedstock for value-added products. Oleaginous bacteria such as Rhodococcus opacus can produce intracellular lipids from biodegradation of aromatic substrates. These lipids can be used for biofuel production, which can potentially replace petroleum-derived chemicals. However, the low reactivity of lignin produced from pretreatment and the underdeveloped fermentation technology hindered lignin bioconversion to lipids. In this study, combinatorial pretreatment with an optimized fermentation strategy was evaluated to improve lignin valorization into lipids using R. opacus PD630. As opposed to single pretreatment, combinatorial pretreatment produced a 12.8-75.6% higher lipid concentration in fermentation using lignin as the carbon source. Gas chromatography-mass spectrometry analysis showed that combinatorial pretreatment released more aromatic monomers, which could be more readily utilized by lignin-degrading strains. Three detoxification strategies were used to remove potential inhibitors produced from pretreatment. After heating detoxification of the lignin stream, the lipid concentration further increased by 2.9-9.7%. Different fermentation strategies were evaluated in scale-up lipid fermentation using a 2.0-l fermenter. With laccase treatment of the lignin stream produced from combinatorial pretreatment, the highest cell dry weight and lipid concentration were 10.1 and 1.83 g/l, respectively, in fed-batch fermentation, with a total soluble substrate concentration of 40 g/l. The improvement of the lipid fermentation performance may have resulted from lignin depolymerization by the combinatorial pretreatment and laccase treatment, reduced inhibition effects by fed-batch fermentation, adequate oxygen supply, and an accurate pH control in the fermenter. 
Overall, these results demonstrate that combinatorial pretreatment, together with fermentation optimization, favorably improves lipid production using lignin as the carbon source. Combinatorial pretreatment integrated with fed-batch fermentation was an effective strategy to improve the bioconversion of lignin into lipids, thus facilitating lignin valorization in biorefineries.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carpenter, J.A.
This report is a sequel to ORNL/CSD-46: KWIC Index For Numerical Algebra (March 1980), and both of these are in turn sequels to Professor A.S. Householder's report of the same name, ORNL-4778 Revised (June 1975). The area covered remains essentially that of Professor Householder's guidelines, roughly delineated by the American Mathematical Society classifications 15, 65F and 65H together with their secondary categories, but with little coverage of infinite matrices, matrices over fields of positive characteristic, operator theory, optimization, and those facets of matrix theory primarily combinatorial in nature. The period covered by this report is roughly July 1979 through December 1980, as measured by the appearance of the articles in the American Mathematical Society's Contents of Mathematical Publications. The review citations are limited to Mathematical Reviews (MR) and Das Zentralblatt für Mathematik und ihre Grenzgebiete (ZBL).
NASA Astrophysics Data System (ADS)
Filmer, M. S.; Hughes, C. W.; Woodworth, P. L.; Featherstone, W. E.; Bingham, R. J.
2018-04-01
The direct method of vertical datum unification requires estimates of the ocean's mean dynamic topography (MDT) at tide gauges, which can be sourced from either geodetic or oceanographic approaches. To assess the suitability of different types of MDT for this purpose, we evaluate 13 physics-based numerical ocean models and six MDTs computed from observed geodetic and/or ocean data at 32 tide gauges around the Australian coast. We focus on the viability of numerical ocean models for vertical datum unification, classifying the 13 ocean models used as either independent (do not contain assimilated geodetic data) or non-independent (do contain assimilated geodetic data). We find that the independent and non-independent ocean models deliver similar results. Maximum differences among ocean models and geodetic MDTs reach >150 mm at several Australian tide gauges and are considered anomalous at the 99% confidence level. These differences appear to be of geodetic origin, but without additional independent information, or formal error estimates for each model, some of these errors remain inseparable. Our results imply that some ocean models have standard deviations of differences with other MDTs (using geodetic and/or ocean observations) at Australian tide gauges, and with levelling between some Australian tide gauges, of ˜ ± 50 mm . This indicates that they should be considered as an alternative to geodetic MDTs for the direct unification of vertical datums. They can also be used as diagnostics for errors in geodetic MDT in coastal zones, but the inseparability problem remains, where the error cannot be discriminated between the geoid model or altimeter-derived mean sea surface.
An Indexed Combinatorial Library: The Synthesis and Testing of Insect Repellents
NASA Astrophysics Data System (ADS)
Miles, William H.; Gelato, Kathy A.; Pompizzi, Kristen M.; Scarbinsky, Aislinn M.; Albrecht, Brian K.; Reynolds, Elaine R.
2001-04-01
An indexed combinatorial library of amides was prepared by the reaction of amines and acid chlorides. A simple test for insect repellency using fruit flies (Drosophila melanogaster) allowed the determination of the most repellent sublibraries. The student-generated data were collected and analyzed to determine the most active amide(s) in the library. This experiment illustrates the fundamentals of combinatorial chemistry, a field that has undergone explosive growth in the last decade.
Unification Principle and a Geometric Field Theory
NASA Astrophysics Data System (ADS)
Wanas, Mamdouh I.; Osman, Samah N.; El-Kholy, Reham I.
2015-08-01
In the context of the geometrization philosophy, a covariant field theory is constructed. The theory satisfies the unification principle. The field equations of the theory are derived from a general differential identity in the geometry used. The Lagrangian scalar used in the formalism is neither the curvature scalar nor the torsion scalar, but an alloy of both, the W-scalar. The physical contents of the theory are explored using different methods. The analysis shows that the theory is capable of dealing with gravity, electromagnetism and material distribution, with possible mutual interactions. The theory is shown to cover the domain of general relativity under certain conditions.
By design: James Clerk Maxwell and the evangelical unification of science.
Stanley, Matthew
2012-03-01
James Clerk Maxwell's electromagnetic theory famously unified many of the Victorian laws of physics. This essay argues that Maxwell saw a deep theological significance in the unification of physical laws. He postulated a variation on the design argument that focused on the unity of phenomena rather than Paley's emphasis on complexity. This argument of Maxwell's is shown to be connected to his particular evangelical religious views. His evangelical perspective provided encouragement for him to pursue a unified physics that supplemented his other philosophical, technical and social influences. Maxwell's version of the argument from design is also contrasted with modern 'intelligent-design' theory.
Magnetic conveyor belt for transporting and merging trapped atom clouds.
Hänsel, W; Reichel, J; Hommelhoff, P; Hänsch, T W
2001-01-22
We demonstrate an integrated magnetic device which transports cold atoms near a surface with very high positioning accuracy. Time-dependent currents in a lithographic conductor pattern create a moving chain of potential wells; atoms are transported in these wells while remaining confined in all three dimensions. We achieve mean fluxes up to 10(6) s(-1) with a negligible heating rate. An extension of this device allows merging of atom clouds by unification of two Ioffe-Pritchard potentials. The unification, which we demonstrate experimentally, can be performed without loss of phase space density. This novel, all-magnetic atom manipulation offers exciting perspectives, such as trapped-atom interferometry.
2008-01-01
A review of studies on the adaptation problems of North Korean defectors in South Korean society, and of studies of people's adaptation to political and cultural changes in other countries, suggests that similar adaptation problems may occur during and after unification. Defectors have various adaptation problems, and some of them have psychiatric disorders such as depression and post-traumatic stress disorder (PTSD). The reasons for this lie in the differences in culture and personality between South and North Korea, which have developed for the last 60 years without any communication with each other, despite their common racial and cultural heritage. Economic factors, including the lack of skills and knowledge needed to work in an industrialized and competitive society like South Korea, also aggravate the severity of such adaptation problems. Research on defectors' adaptation problems and on the differences in culture and mentality between North and South Korea can provide useful information on what kinds of problems may arise during and after unification and on what should be done to achieve mutual adaptation and a harmonious, peaceful unification. PMID:20046402
A Strategy Toward Reconstructing the Healthcare System of a Unified Korea
Lee, Yo Han; Kim, Seok Hyang; Shin, Hyun-Woung; Lee, Jin Yong; Kim, Beomsoo; Kim, Young Ae; Yoon, Jangho; Shin, Young Seok
2013-01-01
This road map aims to establish a stable and integrated healthcare system for the Korean Peninsula by improving health conditions and building a foundation for healthcare in North Korea through a series of effective healthcare programs. With a basic time frame extending in stages from the present towards unification, the roadmap is composed of four successive phases. The first and second phases, each expected to last five years, focus on disease treatment and nutritional treatment, respectively. These phases would safeguard the health of the most vulnerable populations in North Korea, while fulfilling the basic health needs of other groups by modernizing existing medical facilities. Building on the gains of the first two phases, the third phase, lasting ten years, would prepare for unification of the Koreas by promoting the health of all the North Korean people and improving basic infrastructural elements such as health workforce capacity and medical institutions. The fourth phase, assuming that unification has taken place, provides fundamental principles and directions for establishing an integrated healthcare system across the Korean Peninsula. With this roadmap, we hope to increase the consistency of the program and overcome several existing concerns of the current program. PMID:23766871
A strategy toward reconstructing the healthcare system of a unified Korea.
Lee, Yo Han; Yoon, Seok-Jun; Kim, Seok Hyang; Shin, Hyun-Woung; Lee, Jin Yong; Kim, Beomsoo; Kim, Young Ae; Yoon, Jangho; Shin, Young Seok
2013-05-01
This road map aims to establish a stable and integrated healthcare system for the Korean Peninsula by improving health conditions and building a foundation for healthcare in North Korea through a series of effective healthcare programs. With a basic time frame extending in stages from the present towards unification, the roadmap is composed of four successive phases. The first and second phases, each expected to last five years, focus on disease treatment and nutritional treatment, respectively. These phases would safeguard the health of the most vulnerable populations in North Korea, while fulfilling the basic health needs of other groups by modernizing existing medical facilities. Building on the gains of the first two phases, the third phase, lasting ten years, would prepare for unification of the Koreas by promoting the health of all the North Korean people and improving basic infrastructural elements such as health workforce capacity and medical institutions. The fourth phase, assuming that unification has taken place, provides fundamental principles and directions for establishing an integrated healthcare system across the Korean Peninsula. With this roadmap, we hope to increase the consistency of the program and overcome several existing concerns of the current program.
Geoid modeling in Mexico and the collaboration with Central America and the Caribbean.
NASA Astrophysics Data System (ADS)
Avalos, D.; Gomez, R.
2012-12-01
The model of geoidal heights for Mexico, named GGM10, is presented as a geodetic tool to support vertical positioning in the context of regional height system unification. It is a purely gravimetric solution computed by the Stokes-Helmert technique at a resolution of 2.5 arc minutes. This product of the Instituto Nacional de Estadistica y Geografia (INEGI) is released together with a series of 10 gravimetric models that add to the improvements in the description of the gravity field. In recent years, INEGI joined the initiative of the U.S. National Geodetic Survey and Canada's Geodetic Survey Division to promote regional height system unification. In an effort to further improve the compatibility among national geoid models in the region, INEGI has begun to champion a network of specialists that includes national representatives from Central America and the Caribbean. Through the opening of opportunities for training and more direct access to international agreements and discussions, the tropical region is gaining participation. A significantly increased number of countries is now pushing for a future North and Central American geoid-based vertical datum in support of height system unification. Figure: Geoidal heights in Mexico, mapped from the model GGM10.
Combinatorial fabrication and screening of organic light-emitting device arrays
NASA Astrophysics Data System (ADS)
Shinar, Joseph; Shinar, Ruth; Zhou, Zhaoqun
2007-11-01
The combinatorial fabrication and screening of two-dimensional (2-d) small-molecule UV-violet organic light-emitting device (OLED) arrays, 1-d blue-to-red arrays, 1-d intense white OLED libraries, 1-d arrays to study Förster energy transfer in guest-host OLEDs, and 2-d arrays to study exciplex emission from OLEDs are described. The results demonstrate the power of combinatorial approaches for screening OLED materials and configurations, and for studying their basic properties.
Combinatorial Dyson-Schwinger equations and inductive data types
NASA Astrophysics Data System (ADS)
Kock, Joachim
2016-06-01
The goal of this contribution is to explain the analogy between combinatorial Dyson-Schwinger equations and inductive data types to a readership of mathematical physicists. The connection relies on an interpretation of combinatorial Dyson-Schwinger equations as fixpoint equations for polynomial functors (established elsewhere by the author, and summarised here), combined with the now-classical fact that polynomial functors provide semantics for inductive types. The paper is expository and also comprises a brief introduction to type theory.
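The fixpoint interpretation can be made concrete with a minimal sketch (an illustrative example of the general idea, not code from the paper; the choice of functor F(X) = 1 + x·X², whose least fixpoint is the inductive type of binary trees, is mine). Iterating F on truncated counting series stabilises one coefficient per step and recovers the Catalan numbers.

```python
# A polynomial functor F(X) = 1 + x*X^2 acting on truncated counting series.
# Its least fixpoint is the inductive type of binary trees; the coefficient
# of x^n counts trees with n internal nodes (the Catalan numbers).

N = 8  # truncation order

def mul(a, b):
    """Multiply two truncated power series given as coefficient lists."""
    c = [0] * N
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            if i + j < N:
                c[i + j] += ai * bj
    return c

def F(t):
    """One application of T -> 1 + x*T^2 ('1 +' is the leaf, 'x*' the node)."""
    return [1] + mul(t, t)[: N - 1]

t = [0] * N                    # least fixpoint: iterate from the empty series
for _ in range(N):
    t = F(t)

print(t)  # [1, 1, 2, 5, 14, 42, 132, 429]
```

A different combinatorial Dyson-Schwinger equation would substitute a different polynomial functor for F; the fixpoint iteration itself is unchanged.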
Combinatorial chemistry on solid support in the search for central nervous system agents.
Zajdel, Paweł; Pawłowski, Maciej; Martinez, Jean; Subra, Gilles
2009-08-01
The advent of combinatorial chemistry was one of the most important developments that have significantly contributed to the drug discovery process. Within just a few years, its initial concept, aimed at the production of libraries containing huge numbers of compounds (thousands to millions), so-called screening libraries, shifted towards the preparation of small and medium-sized, rationally designed libraries. When applicable, the use of solid supports for the generation of libraries has been a real breakthrough in enhancing productivity. With a limited amount of resin and simple manual workups, the split/mix procedure can generate thousands of bead-tethered compounds. Beads can be chemically or physically encoded to facilitate the identification of a hit after the biological assay. Compartmentalization of solid supports using small reactors such as teabags or kans, or pellicular discrete supports such as Lanterns, has resulted in powerful sort-and-combine technologies relying on codes 'written' on the reactor, thus reducing the need for automation and increasing the number of compounds synthesized. These methods of solid-phase combinatorial chemistry have recently been supported by the introduction of solid-supported reagents and scavenger resins. The first part of this review discusses the general premises of combinatorial chemistry and some methods used in the design of primary and focused combinatorial libraries. The aim of the second part is to present combinatorial chemistry methodologies aimed at discovering bioactive compounds acting on the diverse GPCRs involved in central nervous system disorders.
Combinatorial stresses kill pathogenic Candida species
Kaloriti, Despoina; Tillmann, Anna; Cook, Emily; Jacobsen, Mette; You, Tao; Lenardon, Megan; Ames, Lauren; Barahona, Mauricio; Chandrasekaran, Komelapriya; Coghill, George; Goodman, Daniel; Gow, Neil A. R.; Grebogi, Celso; Ho, Hsueh-Lui; Ingram, Piers; McDonagh, Andrew; De Moura, Alessandro P. S.; Pang, Wei; Puttnam, Melanie; Radmaneshfar, Elahe; Romano, Maria Carmen; Silk, Daniel; Stark, Jaroslav; Stumpf, Michael; Thiel, Marco; Thorne, Thomas; Usher, Jane; Yin, Zhikang; Haynes, Ken; Brown, Alistair J. P.
2012-01-01
Pathogenic microbes exist in dynamic niches and have evolved robust adaptive responses to promote survival in their hosts. The major fungal pathogens of humans, Candida albicans and Candida glabrata, are exposed to a range of environmental stresses in their hosts, including osmotic, oxidative and nitrosative stresses. Significant efforts have been devoted to the characterization of the adaptive responses to each of these stresses. In the wild, cells are frequently exposed simultaneously to combinations of these stresses, and yet the effects of such combinatorial stresses have not been explored. We have developed a common experimental platform to facilitate the comparison of combinatorial stress responses in C. glabrata and C. albicans. This platform is based on the growth of cells in buffered rich medium at 30°C, and was used to define relatively low, medium and high doses of osmotic (NaCl), oxidative (H2O2) and nitrosative stresses (e.g., dipropylenetriamine (DPTA)-NONOate). The effects of combinatorial stresses were compared with the corresponding individual stresses under these growth conditions. We show for the first time that certain combinations of stresses are especially potent in their ability to kill C. albicans and C. glabrata and/or inhibit their growth. This was the case for combinations of osmotic plus oxidative stress and for oxidative plus nitrosative stress. We predict that combinatorial stresses may be highly significant in host defences against these pathogenic yeasts. PMID:22463109
Chen, Hong-Zhang; Liu, Zhi-Hua
2015-06-01
Pretreatment is a key unit operation affecting the refinery efficiency of plant biomass. However, the poor efficiency of pretreatment and the lack of basic theory are the main challenges to the industrial implementation of the plant biomass refinery. The purpose of this work is to review steam explosion and its combinatorial pretreatment as a means of overcoming the intrinsic characteristics of plant biomass, including recalcitrance, heterogeneity, multi-composition, and diversity. The main advantages of the selective use of steam explosion and other combinatorial pretreatments across the diversity of raw materials are introduced. Combinatorial pretreatment integrated with other unit operations is proposed as a means to exploit the high-efficiency production of bio-based products from plant biomass. Finally, several pilot- and demonstration-scale operations of the plant biomass refinery are described. Based on the principle of selective function and structure fractionation, and multi-level and directional composition conversion, an integrated process with the combinatorial pretreatments of steam explosion and other pretreatments as the core should be feasible and conform to the plant biomass refinery concept. Combinatorial pretreatments of steam explosion and other pretreatments should be further exploited based on the type and intrinsic characteristics of the plant biomass used, the bio-based products to be made, and the complementarity of the processes. Copyright © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Astrophysics Data System (ADS)
Rivera, Susana
Throughout the last century, from the final decades of the 19th century to the present day, there have been many attempts to unify the forces of Nature. The first unification was achieved by James Clerk Maxwell with his electromagnetic theory. Max Planck then developed quantum theory. In 1905, Albert Einstein introduced the special theory of relativity, and in 1916 he published the general theory of relativity. He noticed an evident parallelism between the gravitational force and the electromagnetic force, and he tried to unify these forces of Nature, but quantum theory stood in his way. In the 1940s, quantum electrodynamics (QED) was developed, and with it interest in a unified field theory was revived. In the 1960s and 1970s, quantum chromodynamics (QCD) was developed. Along with these theories came the discovery of the strong interaction force and the weak interaction force. Although there have been many attempts to unify all these forces of nature, only the unification of the strong interaction, the weak interaction and the electromagnetic force has been achieved. From the late 1980s and throughout the last two decades, groups of scientists working on theories such as superstring theory and M-theory, among others, have made great efforts and finally arrived at a unification of the forces of nature, the only limitation being the use of more than 11 dimensions. Using an ingenious mathematical tool known as supersymmetry, based on the work of Kaluza and Klein, they achieved this goal. The strings of these theories are on the order of 10^-33 m, which makes them undetectable. There are many other string theories. 
The GEUFT theory is based on the existence of concentrated energy lines, which vibrate, expand and contract, emitting and absorbing energy, matter and antimatter, and which yield a determined geometry. This geometry results in the formation of stars, galaxies, nebulae and clusters on the macrocosmic level, and allows the formation of fundamental particles on the microcosmic level. The strings are described by a function named symbiosis (σ), which depends on four energetic contributions: (1) radiation energy, (2) plasma energy, (3) conducted flux energy and (4) mass energy. There is an intimate relation between them, and the values they take at a given point and time determine the string dynamics and geometry; that is, the symbiosis describes the string's state at any point of the geometry-energy field. σ = F[Er(σ), Ep(σ), Ef(σ), Em(σ)] (1) This work is an attempt to achieve the unification of the forces of nature, based on the existence of a four-dimensional Universe.
Nursing Classification Systems
Henry, Suzanne Bakken; Mead, Charles N.
1997-01-01
Our premise is that, from the perspective of maximum flexibility of data usage by computer-based record (CPR) systems, existing nursing classification systems are necessary, but not sufficient, for representing important aspects of “what nurses do.” In particular, we have focused our attention on those classification systems that represent nurses' clinical activities through the abstraction of activities into categories of nursing interventions. In this theoretical paper, we argue that taxonomic, combinatorial vocabularies capable of coding atomic-level nursing activities are required to effectively capture, in a reproducible and reversible manner, the clinical decisions and actions of nurses, and that, without such vocabularies and associated grammars, potentially important clinical process data are lost during the encoding process. Existing nursing intervention classification systems do not fulfill these criteria. As background to our argument, we first present an overview of the content, methods, and evaluation criteria used in previous studies whose focus has been to evaluate the effectiveness of existing coding and classification systems. Next, using the Ingenerf typology of taxonomic vocabularies, we categorize the formal type and structure of three existing nursing intervention classification systems: the Nursing Interventions Classification, the Omaha System, and the Home Health Care Classification. Third, we use records from home care patients to show examples of lossy data transformation, the loss of potentially significant atomic data resulting from encoding using each of the three systems. Last, we provide an example of the application of a formal representation methodology (conceptual graphs) which we believe could be used as a model to build the required combinatorial, taxonomic vocabulary for representing nursing interventions. PMID:9147341
Evaluation of the Current Status of the Combinatorial Approach for the Study of Phase Diagrams
Wong-Ng, W.
2012-01-01
This paper provides an evaluation of the effectiveness of using the high-throughput combinatorial approach for preparing phase diagrams of thin film and bulk materials. Our evaluation is based primarily on examples of combinatorial phase diagrams that have been reported in the literature, as well as on our own laboratory experiments. Various factors that affect the construction of these phase diagrams are examined. Instrumentation and analytical approaches needed to improve data acquisition and data analysis are summarized. PMID:26900530
NASA Astrophysics Data System (ADS)
Lu, Hai-Bo; Liu, Wei-Qiang
2014-04-01
Validated against relevant experiments, nose-tip thermal protection systems (TPS) using a forward-facing cavity, an opposing jet, and the combinatorial configuration of forward-facing cavity and opposing jet are investigated numerically. The physical mechanism of each TPS is discussed, and their cooling efficiencies are compared. The combinatorial system is the most suitable TPS for high-speed vehicles that need to fly under various flow conditions over long ranges and for long times.
NASA Astrophysics Data System (ADS)
Jakubczyk, Dorota; Jakubczyk, Paweł
2018-02-01
We propose a combinatorial approach to the representation of Schur-Weyl duality in physical systems, using one-dimensional spin chains as an example. Exploiting the Robinson-Schensted-Knuth algorithm, we decompose the dual group representations into irreducible representations in a fully combinatorial way. As the representation space we choose the Hilbert space of the spin chains, but this approach can be easily generalized to an arbitrary physical system where the Schur-Weyl duality works.
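For readers unfamiliar with the Robinson-Schensted-Knuth algorithm, here is a minimal sketch of its core row-insertion step (a standard textbook construction, not the authors' code): inserting a word letter by letter builds a pair (P, Q) of same-shape tableaux, P by bumping and Q by recording the insertion order.

```python
# Robinson-Schensted row insertion: each letter bumps the leftmost entry
# strictly greater than itself down to the next row; the recording tableau Q
# tracks which cell was created at each step.
from bisect import bisect_right

def rsk(word):
    P, Q = [], []
    for step, x in enumerate(word, 1):
        row = 0
        while True:
            if row == len(P):              # fell off the bottom: new row
                P.append([x]); Q.append([step])
                break
            r = P[row]
            pos = bisect_right(r, x)       # first entry strictly greater than x
            if pos == len(r):              # x fits at the end of this row
                r.append(x); Q[row].append(step)
                break
            x, r[pos] = r[pos], x          # bump and continue in the next row
            row += 1
    return P, Q

P, Q = rsk([3, 1, 2])
print(P)  # [[1, 2], [3]]
print(Q)  # [[1, 3], [2]]
```

The shapes of P and Q coincide, which is the bookkeeping that makes the decomposition into irreducible representations fully combinatorial.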
Massively multiplex single-cell Hi-C
Ramani, Vijay; Deng, Xinxian; Qiu, Ruolan; Gunderson, Kevin L; Steemers, Frank J; Disteche, Christine M; Noble, William S; Duan, Zhijun; Shendure, Jay
2016-01-01
We present single-cell combinatorial indexed Hi-C (sciHi-C), which applies the concept of combinatorial cellular indexing to chromosome conformation capture. In this proof-of-concept, we generate and sequence six sciHi-C libraries comprising a total of 10,696 single cells. We use sciHi-C data to separate cells by karyotypic and cell-cycle state differences and to identify cell-to-cell heterogeneity in mammalian chromosomal conformation. Our results demonstrate that combinatorial indexing is a generalizable strategy for single-cell genomics. PMID:28135255
Combinatorial Interdependence in Lottery
ERIC Educational Resources Information Center
Helman, Danny
2005-01-01
This paper examines a real-life gambling question facing lottery players. Combinatorial dependence plays a central role in shaping the probabilistic structure of the game, but might not carry the weight it merits in punters' considerations.
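The combinatorial structure at issue can be made concrete with a short sketch (an illustrative 6/49 lottery format of my choosing, not taken from the paper): jackpot odds are one in C(49, 6) equally likely draws, and the chance of matching exactly k numbers is a hypergeometric count.

```python
# Odds in a 6/49 lottery: the player picks 6 of 49 numbers, and the
# probability of matching exactly k drawn numbers is hypergeometric.
from math import comb

total = comb(49, 6)            # number of possible draws
print(total)                   # 13983816

def p_match(k, picks=6, pool=49):
    """Probability that a single ticket matches exactly k drawn numbers."""
    return comb(picks, k) * comb(pool - picks, picks - k) / comb(pool, picks)

print(p_match(6))              # jackpot probability, about 7.15e-08
```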
A Systematic Study of Simple Combinatorial Configurations.
ERIC Educational Resources Information Center
Dubois, Jean-Guy
1984-01-01
A classification of the simple combinatorial configurations which correspond to various cases of distribution and ordering of objects into boxes is given (in French). Concrete descriptions, structured relations, translations, and formalizations are discussed. (MNS)
Combinatorial Mathematics: Research into Practice
ERIC Educational Resources Information Center
Sriraman, Bharath; English, Lyn D.
2004-01-01
Implications and suggestions for using combinatorial mathematics in the classroom through a survey and synthesis of numerous research studies are presented. The implications revolve around five major themes that emerge from analysis of these studies.
A Bioinformatics Approach for Detecting Repetitive Nested Motifs using Pattern Matching.
Romero, José R; Carballido, Jessica A; Garbus, Ingrid; Echenique, Viviana C; Ponzoni, Ignacio
2016-01-01
The identification of nested motifs in genomic sequences is a complex computational problem. The detection of these patterns is important because it allows the discovery of transposable element (TE) insertions, incomplete reverse transcripts, deletions, and/or mutations. In this study, a de novo strategy for detecting patterns that represent nested motifs was designed based on exhaustive searches for pairs of motifs and combinatorial pattern analysis. These patterns can be grouped into three categories: motifs within other motifs, motifs flanked by other motifs, and motifs of large size. The methodology used in this study, applied to genomic sequences from the plant species Aegilops tauschii and Oryza sativa, revealed that it is possible to identify putative nested TEs by detecting these three types of patterns. The results were validated through BLAST alignments, which revealed the efficacy and usefulness of the new method, called Mamushka.
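The first of the three pattern categories, a motif nested within another motif, can be illustrated with a small sketch (a hypothetical toy detector, not the Mamushka implementation; the motifs and the `min_flank` parameter are invented for the example):

```python
# Toy detector for "motif within another motif": an outer motif whose two
# halves flank a nested inner motif, the signature of a TE inserted into
# another TE.  Every split of the outer motif into left/right flanks of at
# least `min_flank` bases is tried against the sequence.
import re

def find_nested(seq, outer, inner, min_flank=3):
    """Return sorted (start, end) spans where `inner` interrupts `outer`."""
    hits = set()
    for cut in range(min_flank, len(outer) - min_flank + 1):
        left, right = outer[:cut], outer[cut:]
        pattern = re.escape(left) + re.escape(inner) + re.escape(right)
        for m in re.finditer(pattern, seq):
            hits.add((m.start(), m.end()))
    return sorted(hits)

seq = "TT" + "ACGT" + "GGGG" + "ACGT" + "AA"   # inner GGGG splits outer ACGTACGT
print(find_nested(seq, outer="ACGTACGT", inner="GGGG"))  # [(2, 14)]
```

A real implementation would allow mismatches and degenerate motifs; this exact-match version only conveys the combinatorial shape of the search.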
Guided molecular self-assembly: a review of recent efforts
NASA Astrophysics Data System (ADS)
Huie, Jiyun C.
2003-04-01
This paper serves as an introductory review of significant and novel successes in the field of nanotechnology, particularly in the formation of nanostructures using guided molecular self-assembly methods. Self-assembly is a spontaneous process by which molecules and nanophase entities may organize into aggregates or networks. Through its various interactive mechanisms, such as electrostatics, chemistry, surface properties, and other mediating agents, the technique has proven indispensable to recent realizations of functional materials and devices. The discussion extends to spontaneous and Langmuir-Blodgett formation of self-assembled monolayers on various substrates, and to a number of categories of self-assembly techniques based on the type of interaction exploited. Combinatorial soft-lithography techniques, such as micro-contact printing and dip-pen nanolithography, which can be used to scale nanostructured molecular assemblies up to submicrometer and micrometer patterns, are also mentioned.
Morphological priming by itself: a study of Portuguese conjugations.
Veríssimo, João; Clahsen, Harald
2009-07-01
Does the language processing system make use of abstract grammatical categories and representations that are not directly visible from the surface form of a linguistic expression? This study examines stem-formation processes and conjugation classes, a case of 'pure' morphology that provides insight into the role of grammatical structure in language processing. We report results from a cross-modal priming experiment examining 1st and 3rd conjugation verb forms in Portuguese. Although items were closely matched with respect to a range of non-morphological factors, distinct priming patterns were found for 1st and 3rd conjugation stems. We attribute the observed priming patterns to different representations of conjugational stems: combinatorial, morphologically structured ones for 1st conjugation stems and unanalyzed, morphologically unstructured ones for 3rd conjugation stems. Our findings underline the importance of morphology for language comprehension, indicating that morphological analysis goes beyond the identification of grammatical morphemes.
Park, Je Won; Nam, Sang-Jip; Yoon, Yeo Joon
2017-06-15
Nature has a talent for inventing a vast number of natural products, including hybrids generated by blending different scaffolds, resulting in a myriad of bioactive chemical entities. Herein, we review the highlights and recent trends (2010-2016) in the combinatorial biosynthesis of sugar-containing antibiotics where nature's structural diversification capabilities are exploited to enable the creation of new anti-infective and anti-proliferative drugs. In this review, we describe the modern combinatorial biosynthetic approaches for polyketide synthase-derived complex and aromatic polyketides, non-ribosomal peptide synthetase-directed lipo-/glycopeptides, aminoglycosides, nucleoside antibiotics, and alkaloids, along with their therapeutic potential. Finally, we present the feasible nexus between combinatorial biosynthesis, systems biology, and synthetic biology as a toolbox to provide new antibiotics that will be indispensable in the post-antibiotic era. Copyright © 2016 Elsevier Inc. All rights reserved.
Combinatorial vector fields and the valley structure of fitness landscapes.
Stadler, Bärbel M R; Stadler, Peter F
2010-12-01
Adaptive (downhill) walks are a computationally convenient way of analyzing the geometric structure of fitness landscapes. Their inherently stochastic nature has limited their mathematical analysis, however. Here we develop a framework that interprets adaptive walks as deterministic trajectories in combinatorial vector fields and in turn associate these combinatorial vector fields with weights that measure their steepness across the landscape. We show that the combinatorial vector fields and their weights have a product structure that is governed by the neutrality of the landscape. This product structure makes practical computations feasible. The framework presented here also provides an alternative, and mathematically more convenient, way of defining notions of valleys, saddle points, and barriers in a landscape. As an application, we propose a refined approximation for transition rates between macrostates that are associated with the valleys of the landscape.
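A minimal sketch (assuming a binary-string genotype space with single-bit-flip neighborhoods, not the paper's formalism) of how a stochastic downhill walk becomes a deterministic trajectory once each genotype is assigned its steepest-descent neighbor:

```python
from itertools import product

def neighbors(g):
    """One-bit mutants of a binary genotype (tuple of 0/1)."""
    for i in range(len(g)):
        yield g[:i] + (1 - g[i],) + g[i + 1:]

def combinatorial_flow(fitness, n):
    """Map each genotype to its steepest-downhill neighbor (itself at a local minimum)."""
    flow = {}
    for g in product((0, 1), repeat=n):
        best = min(neighbors(g), key=fitness)
        flow[g] = best if fitness(best) < fitness(g) else g  # fixed point = local optimum
    return flow

def walk(flow, g):
    """Deterministic adaptive (downhill) walk: follow the vector field to a fixed point."""
    path = [g]
    while flow[g] != g:
        g = flow[g]
        path.append(g)
    return path

# toy landscape: fitness = number of 1-bits (global minimum at all-zeros)
f = lambda g: sum(g)
flow = combinatorial_flow(f, 4)
print(walk(flow, (1, 1, 0, 1)))  # descends step by step to (0, 0, 0, 0)
```

Local minima are exactly the fixed points of the flow, and the basins of attraction of this map give one concrete reading of the valley structure described above.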
Measuring and Specifying Combinatorial Coverage of Test Input Configurations
Kuhn, D. Richard; Kacker, Raghu N.; Lei, Yu
2015-01-01
A key issue in testing is how many tests are needed for a required level of coverage or fault detection. Estimates are often based on error rates in initial testing, or on code coverage. For example, tests may be run until a desired level of statement or branch coverage is achieved. Combinatorial methods present an opportunity for a different approach to estimating required test set size, using characteristics of the test set. This paper describes methods for estimating the coverage of, and ability to detect, t-way interaction faults of a test set based on a covering array. We also develop a connection between (static) combinatorial coverage and (dynamic) code coverage, such that if a specific condition is satisfied, 100% branch coverage is assured. Using these results, we propose practical recommendations for using combinatorial coverage in specifying test requirements. PMID:28133442
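As an illustration, a small sketch of measuring t-way combinatorial coverage of a test set (a simplified variant of the metric discussed here; the parameter domains and tests are invented):

```python
from itertools import combinations, product

def t_way_coverage(tests, domains, t=2):
    """Fraction of all t-way parameter-value combinations covered by the test set."""
    k = len(domains)
    covered = set()
    for test in tests:
        for pos in combinations(range(k), t):
            covered.add((pos, tuple(test[p] for p in pos)))
    total = sum(
        len(list(product(*(domains[p] for p in pos))))
        for pos in combinations(range(k), t)
    )
    return len(covered) / total

# three binary parameters; these 4 tests form a covering array of strength 2
domains = [(0, 1)] * 3
tests = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0)]
print(t_way_coverage(tests, domains))  # 1.0 -- full pairwise coverage
```

Dropping tests lowers the metric, which is what makes it usable as a static measure of test-set quality.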
Combinatorial chemical bath deposition of CdS contacts for chalcogenide photovoltaics
Mokurala, Krishnaiah; Baranowski, Lauryn L.; de Souza Lucas, Francisco W.; ...
2016-08-01
Contact layers play an important role in thin film solar cells, but developing new materials and optimizing their thickness is usually a long and tedious process. A high-throughput experimental approach has been used to accelerate the rate of research in photovoltaic (PV) light absorbers and transparent conductive electrodes; however, combinatorial research on contact layers is less common. Here, we report on the chemical bath deposition (CBD) of CdS thin films by a combinatorial dip coating technique and apply these contact layers to Cu(In,Ga)Se2 (CIGSe) and Cu2ZnSnSe4 (CZTSe) light absorbers in PV devices. Combinatorial thickness steps of CdS thin films were achieved by removal of the substrate from the chemical bath, at regular intervals of time, and in equal distance increments. The trends in the photoconversion efficiency and in the spectral response of the PV devices as a function of thickness of CdS contacts were explained with the help of optical and morphological characterization of the CdS thin films. The maximum PV efficiency achieved for the combinatorial dip-coating CBD was similar to that for the PV devices processed using conventional CBD. Finally, the results of this study lead to the conclusion that combinatorial dip-coating can be used to accelerate the optimization of PV device performance of CdS and other candidate contact layers for a wide range of emerging absorbers.
Polynomial functors and combinatorial Dyson-Schwinger equations
NASA Astrophysics Data System (ADS)
Kock, Joachim
2017-04-01
We present a general abstract framework for combinatorial Dyson-Schwinger equations, in which combinatorial identities are lifted to explicit bijections of sets, and more generally equivalences of groupoids. Key features of combinatorial Dyson-Schwinger equations are revealed to follow from general categorical constructions and universal properties. Rather than beginning with an equation inside a given Hopf algebra and referring to given Hochschild 1-cocycles, our starting point is an abstract fixpoint equation in groupoids, shown canonically to generate all the algebraic structures. Precisely, for any finitary polynomial endofunctor P defined over groupoids, the system of combinatorial Dyson-Schwinger equations X = 1 + P(X) has a universal solution, namely the groupoid of P-trees. The isoclasses of P-trees generate naturally a Connes-Kreimer-like bialgebra, in which the abstract Dyson-Schwinger equation can be internalised in terms of canonical B+-operators. The solution to this equation is a series (the Green function), which always enjoys a Faà di Bruno formula, and hence generates a sub-bialgebra isomorphic to the Faà di Bruno bialgebra. Varying P yields different bialgebras, and cartesian natural transformations between various P yield bialgebra homomorphisms and sub-bialgebras, corresponding for example to truncation of Dyson-Schwinger equations. Finally, all constructions can be pushed inside the classical Connes-Kreimer Hopf algebra of trees by the operation of taking core of P-trees. A byproduct of the theory is an interpretation of combinatorial Green functions as inductive data types in the sense of Martin-Löf type theory (expounded elsewhere).
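As a decategorified illustration (generating-function coefficients rather than the groupoids used in the paper), the fixpoint equation X = 1 + P(X) can be iterated numerically for the size-graded functor P(X) = x·X²; its solution counts planar binary P-trees, giving the Catalan numbers:

```python
N = 8  # truncation order of the power series

def mul(a, b):
    """Truncated power-series product."""
    c = [0] * N
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            if i + j < N:
                c[i + j] += ai * bj
    return c

# Iterate X = 1 + x * X^2; coefficients stabilize degree by degree,
# so after N iterations the first N coefficients are exact.
X = [0] * N
for _ in range(N):
    X2 = mul(X, X)
    X = [1] + X2[: N - 1]  # multiplication by x is a shift; then add the constant 1

print(X)  # Catalan numbers: [1, 1, 2, 5, 14, 42, 132, 429]
```

The same degree-by-degree convergence is the shadow of the universal-property argument: the groupoid of P-trees is built as a colimit of finite approximations to the fixpoint.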
A proposal for unification of fatigue crack growth law
NASA Astrophysics Data System (ADS)
Kobelev, V.
2017-05-01
In the present paper, new fractional-differential expressions for the number of cycles to failure at a given initial crack length, as a function of the stress amplitude in the linear fracture approach, are proposed. The anticipated unified propagation function describes the infinitesimal crack length growth per increasing number of load cycles, supposing that the load ratio remains constant over the load history. Two unification fractional-differential functions with different numbers of fitting parameters are proposed. Alternative threshold formulations for the fractional-differential propagation functions are also suggested. The mean stress dependence follows immediately from the considered laws. The corresponding formulas for crack length over the number of cycles are derived in closed form.
Unification of the family of Garrison-Wright's phases.
Cui, Xiao-Dong; Zheng, Yujun
2014-07-24
Inspired by Garrison and Wright's seminal work on complex-valued geometric phases, we generalize the concept of Pancharatnam's "in-phase" in interferometry and further develop a theoretical framework for the unification of the Abelian geometric phases for a biorthogonal quantum system modeled by a parameterized or time-dependent non-Hermitian Hamiltonian with a finite and nondegenerate instantaneous spectrum, that is, the family of Garrison-Wright's phases, which are no longer confined to the adiabatic and nonadiabatic cyclic cases. We then employ a typical example, the Bethe-Lamb model, to illustrate how to apply our theory to obtain an explicit result for the Garrison-Wright's noncyclic geometric phase, and to present its potential applications in quantum computation and information.
Combinatorial invariants and covariants as tools for conical intersections.
Ryb, Itai; Baer, Roi
2004-12-01
The combinatorial invariant and covariant are introduced as practical tools for analysis of conical intersections in molecules. The combinatorial invariant is a quantity depending on adiabatic electronic states taken at discrete nuclear configuration points. It is invariant to the phase choice (gauge) of these states. In the limit that the points trace a loop in nuclear configuration space, the value of the invariant approaches the corresponding Berry phase factor. The Berry phase indicates the presence of an odd or even number of conical intersections on surfaces bounded by these loops. Based on the combinatorial invariant, we develop a computationally simple and efficient method for locating conical intersections. The method is robust owing to its gauge-invariant nature. It does not rely on the landscape of intersecting potential energy surfaces, nor does it require the computation of nonadiabatic couplings. We generalize the concept to open paths and combinatorial covariants for higher dimensions, obtaining a technique for the construction of the gauge-covariant adiabatic-diabatic transformation matrix. This too does not make use of nonadiabatic couplings. The importance of using gauge-covariant expressions is underlined throughout. These techniques can be readily implemented by standard quantum chemistry codes. (c) 2004 American Institute of Physics.
Gobin, Oliver C; Schüth, Ferdi
2008-01-01
Genetic algorithms are widely used to solve and optimize combinatorial problems and are increasingly applied to library design in combinatorial chemistry. Their flexibility, however, can make their implementation challenging. In this study, the influence of the representation of solid catalysts on the performance of genetic algorithms was systematically investigated on the basis of a new, constrained, multiobjective, combinatorial test problem with properties common to problems in combinatorial materials science. Constraints were satisfied by penalty functions, repair algorithms, or special representations. The tests were performed using three state-of-the-art evolutionary multiobjective algorithms by performing 100 optimization runs for each algorithm and test case. Experimental data obtained during the optimization of a noble metal-free solid catalyst system active in the selective catalytic reduction of nitric oxide with propene was used to build up a predictive model to validate the results of the theoretical test problem. A significant influence of the representation on the optimization performance was observed. Binary encodings were found to be the preferred encoding in most of the cases, and depending on the experimental test unit, repair algorithms or penalty functions performed best.
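A toy sketch of the binary-encoding-plus-penalty-function setup the study compares (the objective and constraint here are invented for illustration, not the catalyst test problem):

```python
import random
random.seed(0)

N, GEN, POP = 12, 60, 40

def fitness(bits):
    """Maximize a weighted sum subject to at most 4 bits set; violations penalized."""
    value = sum((i + 1) * b for i, b in enumerate(bits))
    penalty = 10 * max(0, sum(bits) - 4)  # penalty function handles the constraint
    return value - penalty

def evolve():
    pop = [[random.randint(0, 1) for _ in range(N)] for _ in range(POP)]
    for _ in range(GEN):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: POP // 2]            # truncation selection with elitism
        children = []
        while len(children) < POP - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, N)     # one-point crossover on the binary encoding
            child = a[:cut] + b[cut:]
            i = random.randrange(N)          # point mutation
            child[i] = 1 - child[i]
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print(best, fitness(best))
```

A repair algorithm, by contrast, would clear excess bits instead of penalizing them; the study's finding is that which of the two works better depends on the experimental setup.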
Kim, Kyung Lock; Park, Kyeng Min; Murray, James; Kim, Kimoon; Ryu, Sung Ho
2018-05-23
Combinatorial post-translational modifications (PTMs), which can serve as dynamic "molecular barcodes", have been proposed to regulate distinct protein functions. However, studies of combinatorial PTMs on single protein molecules have been hindered by a lack of suitable analytical methods. Here, we describe erasable single-molecule blotting (eSiMBlot) for combinatorial PTM profiling. This assay is performed in a highly multiplexed manner and leverages the benefits of covalent protein immobilization, cyclic probing with different antibodies, and single molecule fluorescence imaging. Especially, facile and efficient covalent immobilization on a surface using Cu-free click chemistry permits multiple rounds (>10) of antibody erasing/reprobing without loss of antigenicity. Moreover, cumulative detection of coregistered multiple data sets for immobilized single-epitope molecules, such as HA peptide, can be used to increase the antibody detection rate. Finally, eSiMBlot enables direct visualization and quantitative profiling of combinatorial PTM codes at the single-molecule level, as we demonstrate by revealing the novel phospho-codes of ligand-induced epidermal growth factor receptor. Thus, eSiMBlot provides an unprecedentedly simple, rapid, and versatile platform for analyzing the vast number of combinatorial PTMs in biological pathways.
Lexicographic goal programming and assessment tools for a combinatorial production problem.
DOT National Transportation Integrated Search
2008-01-01
NP-complete combinatorial problems often necessitate the use of near-optimal solution techniques, including heuristics and metaheuristics. The addition of multiple optimization criteria can further complicate comparison of these solution technique...
NASA Astrophysics Data System (ADS)
Yeung, L.
2015-12-01
I present a mode of isotopic ordering that has purely combinatorial origins. It can be important when identical rare isotopes are paired by coincidence (e.g., they are neighbors on the same molecule), or when extrinsic factors govern the isotopic composition of the two atoms that share a chemical bond. By itself, combinatorial isotope pairing yields products with isotopes either randomly distributed or with a deficit relative to a random distribution of isotopes. These systematics arise because of an unconventional coupling between the formation of singly- and multiply-substituted isotopic moieties. In a random distribution, rare isotopes are symmetrically distributed: Single isotopic substitutions (e.g., H‒D and D‒H in H2) occur with equal probability, and double isotopic substitutions (e.g., D2) occur according to random chance. The absence of symmetry in a bond-making complex can yield unequal numbers of singly-substituted molecules (e.g., more H‒D than D‒H in H2), which is recorded in the product molecule as a deficit in doubly-substituted moieties and an "anticlumped" isotope distribution (i.e., Δn < 0). Enzymatic isotope pairing reactions, which can have site-specific isotopic fractionation factors and atom reservoirs, should express this class of combinatorial isotope effect. Chemical-kinetic isotope effects, which are related to the bond-forming transition state, arise independently and express second-order combinatorial effects. In general, both combinatorial and chemical factors are important for calculating and interpreting clumped-isotope signatures of individual reactions. In many reactions relevant to geochemical oxygen, carbon, and nitrogen cycling, combinatorial isotope pairing likely plays a strong role in the clumped isotope distribution of the products. 
These isotopic signatures, manifest as either directly bound isotope clumps or as features of a molecule's isotopic anatomy, could be exploited as tracers of biogeochemistry that can relate molecular mechanisms to signals observable at environmentally relevant spatial scales.
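The purely combinatorial origin of anticlumping can be made concrete with a few lines of arithmetic: when the two bonded sites carry unequal rare-isotope fractions, the doubly substituted fraction falls below the stochastic expectation (by the AM-GM inequality), giving Δ ≤ 0. The numbers below are illustrative, not measured values:

```python
def clumping_anomaly(f1, f2):
    """Deviation of the doubly-substituted fraction from a random (stochastic)
    distribution when sites 1 and 2 carry rare-isotope fractions f1 and f2."""
    f = (f1 + f2) / 2         # bulk rare-isotope fraction of the product
    observed = f1 * f2        # probability of pairing two rare isotopes by coincidence
    stochastic = f * f        # expectation if isotopes were randomly distributed
    return observed / stochastic - 1  # Delta-style anomaly; <= 0 by AM-GM

# equal site fractions: a random distribution, no anomaly
print(clumping_anomaly(1e-4, 1e-4))            # 0.0
# unequal site fractions: combinatorial "anticlumping" (negative anomaly)
print(round(clumping_anomaly(2e-4, 1e-4), 3))  # -0.111
```

No chemical-kinetic isotope effect enters this calculation at all, which is the point: the deficit is a bookkeeping consequence of asymmetric pairing.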
Effects of grand unification interactions on weak symmetry breaking in supergravity theories
NASA Astrophysics Data System (ADS)
Moxhay, Peter; Yamamoto, Katsuji
Possible effects of grand unification interactions on SU(2) × U(1) breaking are investigated by explicitly considering a supersymmetric SU(5) model coupled to N = 1 supergravity. Some remarkable features concerning the effects of renormalization on the effective soft supersymmetry breaking terms of SU(5) in the GUT region M_P - M_G are clarified, which are relevant for determining the SU(3) × SU(2) × U(1) theory below M_G. In particular, the (mass)² of the Higgs doublets, m_H² and m_H̄², might become significantly small at M_G (m_H ≃ m_H̄ ≈ 0.1) through the effect of SU(5) couplings such as H̄φH. Then, m_H² can rather easily become negative below M_G, so as to realize SU(2) × U(1) breaking naturally even for the light top quark case (m_t ≈ 40 GeV). On the other hand, if m_H ≃ m_H̄ ≃ 1 at M_G by neglecting the grand unification interactions, some careful tuning of μ₃²/m_g² is required with an accuracy ⪅ 10⁻² to achieve SU(2) × U(1) breaking with a light top quark, though a mass term μ₃²(H̄H) may be present.
NASA Astrophysics Data System (ADS)
Osmaston, Miles F.
I trace the historical and scientific origin of Continuum Theory, from its observationally enforced beginning in 1959, in never-to-be-repeated military circumstances, and follow this by a discussion of some of its more recent developments. The presence of this and of several other CT-related contributions to this symposium volume on Unified Field Mechanics can be justified by a view that CT, as currently developing, could, in a very real sense, be given an alternative name, `Unified Aether Mechanics'. The substitution of `field' by `aether' reflects Newton's 1692 thesis that `fields' cannot exist per se, a view that persisted for over 200 years; they must have an agent or medium within which they exist and are communicated between objects. Hence the term `aether mechanics' would be appropriate. A principal aim in `unification', moreover, has always been the unification of gravitation into the family of forces. Einstein's response was the meanderings of space-time. CT achieves its unification into the electromagnetic family by its implementation of the Maxwell's equations aether, with insightful results, apparently regardless of scale. Particle-tied in nature, the existence of such an aether was effectively demonstrated experimentally by the Michelson-Morley finding of 1887.
[Laboratory unification: advantages and disadvantages for clinical microbiology].
Andreu, Antonia; Matas, Lurdes
2010-10-01
This article aims to reflect on which areas or tasks of microbiology laboratories could be unified with those of clinical biochemistry, hematology, immunology or pathology laboratories to benefit patients and the health system, as well as the areas that should remain independent since their amalgamation would not only fail to provide a benefit but could even jeopardize the quality of microbiological diagnosis, and consequently patient care. To do this, the distinct analytic phases of diagnosis are analyzed, and the advantages and disadvantages of amalgamation are evaluated in each phase. The pros and cons of the unification of certain areas such as the computer system, occupational risk units, customer service, purchasing logistics, and materials storage, etc, are also discussed. Lastly, the effect of unification on urgent microbiology diagnosis is analyzed. Microbiological diagnosis should be unique. The microbiologist should perform an overall evaluation of the distinct techniques used for a particular patient, both those that involve direct diagnosis (staining, culture, antigen detection techniques or molecular techniques) and indirect diagnosis (antibody detection). Moreover, the microbiology laboratory should be independent, with highly trained technicians and specialists in microbiology that provide added value as experts in infection and as key figures in the process of establishing a correct etiological diagnosis. Copyright © 2010 Elsevier España S.L. All rights reserved.
Neutralino dark matter and other LHC predictions from quasi Yukawa unification
Shafi, Qaisar; Tanyıldızı, Şükrü Hanif; Ün, Cem Salih
2015-10-01
We explore the dark matter and LHC implications of t-b-τ quasi Yukawa unification in the framework of supersymmetric models based on the gauge symmetry G = SU(4)_c × SU(2)_L × SU(2)_R. The deviation from exact Yukawa unification is quantified by a dimensionless parameter C (|C| ≲ 0.2), such that the Yukawa couplings at M_GUT are related by y_t : y_b : y_τ = |1+C| : |1-C| : |1+3C|. In contrast to earlier studies which focused on universal gaugino masses, we consider non-universal gaugino masses at M_GUT that are compatible with the gauge symmetry G. Our results reveal a variety of neutralino dark matter scenarios consistent with the observations. These include stau and chargino coannihilation scenarios, the A-resonance scenario, as well as Higgsino dark matter solutions which are more readily probed by direct detection searches. The gluino mass is found to be ≲ 4 TeV, the stop mass is ≳ 2 TeV, while the first two family squarks and sleptons are of order 4-5 TeV and 3 TeV respectively.
Balancing focused combinatorial libraries based on multiple GPCR ligands
NASA Astrophysics Data System (ADS)
Soltanshahi, Farhad; Mansley, Tamsin E.; Choi, Sun; Clark, Robert D.
2006-08-01
G-Protein coupled receptors (GPCRs) are important targets for drug discovery, and combinatorial chemistry is an important tool for pharmaceutical development. The absence of detailed structural information, however, limits the kinds of combinatorial design techniques that can be applied to GPCR targets. This is particularly problematic given the current emphasis on focused combinatorial libraries. By linking an incremental construction method (OptDesign) to the very fast shape-matching capability of ChemSpace, we have created an efficient method for designing targeted sublibraries that are topomerically similar to known actives. Multi-objective scoring allows consideration of multiple queries (actives) simultaneously. This can lead to a distribution of products skewed towards one particular query structure, however, particularly when the ligands of interest are quite dissimilar to one another. A novel pivoting technique is described which makes it possible to generate promising designs even under those circumstances. The approach is illustrated by application to some serotonergic agonists and chemokine antagonists.
NASA Astrophysics Data System (ADS)
Simonton, Dean Keith
2010-06-01
Campbell (1960) proposed that creative thought should be conceived as a blind-variation and selective-retention process (BVSR). This article reviews the developments that have taken place in the half century that has elapsed since his proposal, with special focus on the use of combinatorial models as formal representations of the general theory. After defining the key concepts of blind variants, creative thought, and disciplinary context, the combinatorial models are specified in terms of individual domain samples, variable field size, ideational combination, and disciplinary communication. Empirical implications are then derived with respect to individual, domain, and field systems. These abstract combinatorial models are next provided substantive reinforcement with respect to findings concerning the cognitive processes, personality traits, developmental factors, and social contexts that contribute to creativity. The review concludes with some suggestions regarding future efforts to explicate creativity according to BVSR theory.
Combinatorial Color Space Models for Skin Detection in Sub-continental Human Images
NASA Astrophysics Data System (ADS)
Khaled, Shah Mostafa; Saiful Islam, Md.; Rabbani, Md. Golam; Tabassum, Mirza Rehenuma; Gias, Alim Ul; Kamal, Md. Mostafa; Muctadir, Hossain Muhammad; Shakir, Asif Khan; Imran, Asif; Islam, Saiful
Among the different color models, HSV, HLS, YIQ, YCbCr, YUV, etc. have been the most popular for skin detection. Most research in the field of skin detection has been trained and tested on human images of African, Mongolian and Anglo-Saxon ethnic origins; skin colors of the Indian subcontinent have not received separate attention. Combinatorial algorithms, without affecting asymptotic complexity, can be developed using the skin detection concepts of these color models to boost detection performance. In this paper a comparative study of different combinatorial skin detection algorithms has been made. For training and testing, 200 images (skin and non-skin) containing pictures of subcontinental males and females have been used to measure the performance of the combinatorial approaches, and considerable improvement in success rate, with a True Positive rate of 99.5% and a True Negative rate of 93.3%, has been observed.
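A minimal sketch of a combinatorial (rule-conjunction) skin classifier over two color spaces; the threshold values are illustrative assumptions, not the values evaluated in this study:

```python
import colorsys

def skin_hsv(r, g, b):
    """Illustrative HSV rule: skin hues cluster at low hue with moderate saturation."""
    h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
    return h < 0.14 and 0.15 < s < 0.9 and v > 0.35

def skin_ycbcr(r, g, b):
    """Illustrative YCbCr rule (ITU-R BT.601 conversion from 8-bit RGB)."""
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return 77 <= cb <= 127 and 133 <= cr <= 173

def skin_combined(r, g, b):
    """Combinatorial rule: accept a pixel only when both color-space models agree,
    trading a little true-positive rate for fewer false positives. The per-pixel
    cost is still O(1), so asymptotic complexity is unaffected."""
    return skin_hsv(r, g, b) and skin_ycbcr(r, g, b)

print(skin_combined(220, 170, 140))  # a light-brown tone -> True
print(skin_combined(40, 90, 200))    # blue -> False
```

Other Boolean combinations (OR, majority vote over three or more models) trade off the true-positive and true-negative rates in the opposite direction.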
Optimized Reaction Conditions for Amide Bond Formation in DNA-Encoded Combinatorial Libraries.
Li, Yizhou; Gabriele, Elena; Samain, Florent; Favalli, Nicholas; Sladojevich, Filippo; Scheuermann, Jörg; Neri, Dario
2016-08-08
DNA-encoded combinatorial libraries are increasingly being used as tools for the discovery of small organic binding molecules to proteins of biological or pharmaceutical interest. In the majority of cases, synthetic procedures for the formation of DNA-encoded combinatorial libraries incorporate at least one step of amide bond formation between amino-modified DNA and a carboxylic acid. We investigated reaction conditions and established a methodology by using 1-ethyl-3-(3-(dimethylamino)propyl)carbodiimide, 1-hydroxy-7-azabenzotriazole and N,N'-diisopropylethylamine (EDC/HOAt/DIPEA) in combination, which provided conversions greater than 75% for 423/543 (78%) of the carboxylic acids tested. These reaction conditions were efficient with a variety of primary and secondary amines, as well as with various types of amino-modified oligonucleotides. The reaction conditions, which also worked efficiently over a broad range of DNA concentrations and reaction scales, should facilitate the synthesis of novel DNA-encoded combinatorial libraries.
Combinatorial games with a pass: a dynamical systems approach.
Morrison, Rebecca E; Friedman, Eric J; Landsberg, Adam S
2011-12-01
By treating combinatorial games as dynamical systems, we are able to address a longstanding open question in combinatorial game theory, namely, how the introduction of a "pass" move into a game affects its behavior. We consider two well-known combinatorial games: 3-pile Nim and 3-row Chomp. In the case of Nim, we observe that the introduction of the pass dramatically alters the game's underlying structure, rendering it considerably more complex, while for Chomp, the pass move is found to have relatively minimal impact. We show how these results can be understood by recasting these games as dynamical systems describable by dynamical recursion relations. From these recursion relations, we are able to identify underlying structural connections between these "games with passes" and a recently introduced class of "generic (perturbed) games." This connection, together with a (non-rigorous) numerical stability analysis, allows one to understand and predict the effect of a pass on a game.
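A small memoized solver illustrates the setup for Nim with a pass (assuming, as one common convention, that the single shared pass may be used once per game but not from the terminal position):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def wins(piles, pass_left):
    """True if the player to move wins under normal play (no move available = loss).
    The single pass may be spent once, but not from the terminal position."""
    piles = tuple(sorted(piles))
    options = []
    for i, p in enumerate(piles):
        for take in range(1, p + 1):  # remove any number of stones from one pile
            options.append((piles[:i] + (p - take,) + piles[i + 1:], pass_left))
    if pass_left and any(piles):
        options.append((piles, False))  # spend the pass instead of moving
    return any(not wins(s, pl) for s, pl in options)

# Without a pass, (1, 2, 3) has pile-xor 0: a loss for the player to move.
print(wins((1, 2, 3), False))  # False
# With the pass available, passing hands the opponent that same losing position.
print(wins((1, 2, 3), True))   # True
```

Exactly this kind of exhaustive recursion is what the dynamical-recursion viewpoint organizes: the pass couples two copies of the game's state space, which is what perturbs Nim's clean xor structure.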
Chang, Yi-Pin; Chu, Yen-Ho
2014-05-16
The design, synthesis and screening of diversity-oriented peptide libraries using a "libraries from libraries" strategy for the development of inhibitors of α1-antitrypsin deficiency are described. The main buttress of the biochemical approach presented here is the use of the well-established solid-phase split-and-mix method for the generation of mixture-based libraries. The combinatorial technique of iterative deconvolution was employed for library screening. While molecular diversity is the general consideration for combinatorial libraries, exquisite design through systematic screening of small individual libraries is a prerequisite for effective library screening and can avoid potential problems in some cases. This review will also illustrate how large peptide libraries were designed, as well as how a conformation-sensitive assay was developed based on the mechanism of the conformational disease. Finally, the combinatorially selected peptide inhibitor capable of blocking abnormal protein aggregation will be characterized by biophysical, cellular and computational methods.
Structure-based design of combinatorial mutagenesis libraries
Verma, Deeptak; Grigoryan, Gevorg; Bailey-Kellogg, Chris
2015-01-01
The development of protein variants with improved properties (thermostability, binding affinity, catalytic activity, etc.) has greatly benefited from the application of high-throughput screens evaluating large, diverse combinatorial libraries. At the same time, since only a very limited portion of sequence space can be experimentally constructed and tested, an attractive possibility is to use computational protein design to focus libraries on a productive portion of the space. We present a general-purpose method, called “Structure-based Optimization of Combinatorial Mutagenesis” (SOCoM), which can optimize arbitrarily large combinatorial mutagenesis libraries directly based on structural energies of their constituents. SOCoM chooses both positions and substitutions, employing a combinatorial optimization framework based on library-averaged energy potentials in order to avoid explicitly modeling every variant in every possible library. In case study applications to green fluorescent protein, β-lactamase, and lipase A, SOCoM optimizes relatively small, focused libraries whose variants achieve energies comparable to or better than previous library design efforts, as well as larger libraries (previously not designable by structure-based methods) whose variants cover greater diversity while still maintaining substantially better energies than would be achieved by representative random library approaches. By allowing the creation of large-scale combinatorial libraries based on structural calculations, SOCoM promises to increase the scope of applicability of computational protein design and improve the hit rate of discovering beneficial variants. While designs presented here focus on variant stability (predicted by total energy), SOCoM can readily incorporate other structure-based assessments, such as the energy gap between alternative conformational or bound states. PMID:25611189
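A greedy sketch of the library-averaged idea (hypothetical positions, substitutions, and additive energies; SOCoM's actual combinatorial optimization is more sophisticated): under an additive model the library-averaged energy decomposes into per-position means, so candidate libraries can be scored without enumerating every variant:

```python
from statistics import mean

# hypothetical additive model: energy contribution of each substitution at each position
energies = {
    10: {"A": 0.0, "V": -0.8, "I": 0.3},
    42: {"G": 0.0, "S": -0.2, "T": 0.5},
    77: {"L": 0.0, "F": -1.1},
}

def avg_library_energy(library):
    """Library-averaged energy: under additivity, the average over all combinatorial
    variants equals the sum of per-position means of the chosen substitutions."""
    return sum(mean(energies[p][s] for s in subs) for p, subs in library.items())

def design(target_size):
    """Greedy sketch: start from the best single variant, then repeatedly admit the
    substitution that least increases the library-averaged energy until the
    library (product of per-position choices) reaches the target size."""
    library = {p: [min(subs, key=subs.get)] for p, subs in energies.items()}
    def size(lib):
        n = 1
        for subs in lib.values():
            n *= len(subs)
        return n
    while size(library) < target_size:
        candidates = [(p, s) for p, subs in energies.items()
                      for s in subs if s not in library[p]]
        if not candidates:
            break
        p, s = min(candidates, key=lambda ps: energies[ps[0]][ps[1]])
        library[p].append(s)
    return library

lib = design(8)
print({p: sorted(subs) for p, subs in lib.items()}, round(avg_library_energy(lib), 3))
```

The decomposition is what makes "arbitrarily large" libraries tractable: scoring scales with the number of positions and substitutions, not with the exponential number of variants.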
Transport of calcium ions through a bulk membrane by use of a dynamic combinatorial library.
Saggiomo, Vittorio; Lüning, Ulrich
2009-07-07
In a bulk membrane transport experiment, a dynamic combinatorial library (DCL) has been used to transport calcium ions; the calcium ions amplify the formation of a macrocyclic carrier which results in transport.
Counting Pizza Pieces and Other Combinatorial Problems.
ERIC Educational Resources Information Center
Maier, Eugene
1988-01-01
The general combinatorial problem of counting the number of regions into which the interior of a circle is divided by a family of lines is considered. A general formula is developed and its use is illustrated in two situations. (PK)
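For concreteness, the classic closed form (stated here as background; the article develops its own formula and illustrations): n lines in general position, each crossing the circle, with no two parallel and no three concurrent, divide the interior into 1 + n + n(n-1)/2 regions. A minimal check:

```python
def circle_regions(n):
    # Maximum number of regions formed inside a circle by n lines
    # in general position: 1 + n + C(n, 2).
    return 1 + n + n * (n - 1) // 2

print([circle_regions(n) for n in range(6)])  # [1, 2, 4, 7, 11, 16]
```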
On the existence of binary simplex codes [using combinatorial construction]
NASA Technical Reports Server (NTRS)
Taylor, H.
1977-01-01
Using a simple combinatorial construction, the existence of a binary simplex code with m codewords is proved for all m ≥ 1. The problem of the shortest possible length is left open.
Application of combinatorial biocatalysis for a unique ring expansion of dihydroxymethylzearalenone
USDA-ARS?s Scientific Manuscript database
Combinatorial biocatalysis was applied to generate a diverse set of dihydroxymethylzearalenone derivatives with modified ring structure. In one chemoenzymatic reaction sequence, dihydroxymethylzearalenone was first subjected to a unique enzyme-catalyzed oxidative ring opening reaction that creates ...
Criticism of EFSA's scientific opinion on combinatorial effects of 'stacked' GM plants.
Bøhn, Thomas
2018-01-01
Recent genetically modified plants tend to include both insect resistance and herbicide tolerance traits. Some of these 'stacked' GM plants have multiple Cry-toxins expressed as well as tolerance to several herbicides. This means that non-target organisms in the environment (biodiversity) will be co-exposed to multiple stressors simultaneously. A similar co-exposure may happen to consumers through chemical residues in the food chain. EFSA, the unit responsible for minimizing the risk of harm in European food chains, has expressed its scientific interest in combinatorial effects. However, when new data showed how two Cry-toxins acted in combination (added toxicity), and that the same Cry-toxins showed combinatorial effects when co-exposed with Roundup (Bøhn et al., 2016), EFSA dismissed these new peer-reviewed results. In effect, EFSA claimed that combinatorial effects are not relevant for itself. EFSA justified this by referring to a policy question and by making invalid assumptions, which could have been checked directly with the lead author. With such an approach, EFSA may miss the opportunity to improve its environmental and health risk assessment of toxins and pesticides in the food chain. Failure to follow its own published requests for combinatorial effects research may also risk jeopardizing EFSA's scientific and public reputation. Copyright © 2017. Published by Elsevier Ltd.
Donato, David I.; Shapiro, Jason L.
2016-12-13
An effort to build a unified collection of geospatial data for use in land-change modeling (LCM) led to new insights into the requirements and challenges of building an LCM data infrastructure. A case study of data compilation and unification for the Richmond, Va., Metropolitan Statistical Area (MSA) delineated the problems of combining and unifying heterogeneous data from many independent localities such as counties and cities. The study also produced conclusions and recommendations for use by the national LCM community, emphasizing the critical need for simple, practical data standards and conventions for use by localities. This report contributes an uncopyrighted core glossary and a much needed operational definition of data unification.
Running Clubs--A Combinatorial Investigation.
ERIC Educational Resources Information Center
Nissen, Phillip; Taylor, John
1991-01-01
Presented is a combinatorial problem based on the Hash House Harriers rule which states that the route of the run should not have previously been traversed by the club. Discovered is how many weeks the club can meet before the rule has to be broken. (KR)
Quantum Resonance Approach to Combinatorial Optimization
NASA Technical Reports Server (NTRS)
Zak, Michail
1997-01-01
It is shown that quantum resonance can be used for combinatorial optimization. The advantage of the approach is that the computing time is independent of the dimensionality of the problem. As an example, the solution to a constraint satisfaction problem of exponential complexity is demonstrated.
Comprehensive human transcription factor binding site map for combinatory binding motifs discovery.
Müller-Molina, Arnoldo J; Schöler, Hans R; Araúzo-Bravo, Marcos J
2012-01-01
To know the map between transcription factors (TFs) and their binding sites is essential to reverse engineer the regulation process. Only about 10%-20% of the transcription factor binding motifs (TFBMs) have been reported. This lack of data hinders understanding gene regulation. To address this drawback, we propose a computational method that exploits never used TF properties to discover the missing TFBMs and their sites in all human gene promoters. The method starts by predicting a dictionary of regulatory "DNA words." From this dictionary, it distills 4098 novel predictions. To disclose the crosstalk between motifs, an additional algorithm extracts TF combinatorial binding patterns creating a collection of TF regulatory syntactic rules. Using these rules, we narrowed down a list of 504 novel motifs that appear frequently in syntax patterns. We tested the predictions against 509 known motifs confirming that our system can reliably predict ab initio motifs with an accuracy of 81%-far higher than previous approaches. We found that on average, 90% of the discovered combinatorial binding patterns target at least 10 genes, suggesting that to control in an independent manner smaller gene sets, supplementary regulatory mechanisms are required. Additionally, we discovered that the new TFBMs and their combinatorial patterns convey biological meaning, targeting TFs and genes related to developmental functions. Thus, among all the possible available targets in the genome, the TFs tend to regulate other TFs and genes involved in developmental functions. We provide a comprehensive resource for regulation analysis that includes a dictionary of "DNA words," newly predicted motifs and their corresponding combinatorial patterns. Combinatorial patterns are a useful filter to discover TFBMs that play a major role in orchestrating other factors and thus, are likely to lock/unlock cellular functional clusters.
Random vs. Combinatorial Methods for Discrete Event Simulation of a Grid Computer Network
NASA Technical Reports Server (NTRS)
Kuhn, D. Richard; Kacker, Raghu; Lei, Yu
2010-01-01
This study compared random and t-way combinatorial inputs to a network simulator, to determine if these two approaches produce significantly different deadlock detection for varying network configurations. Modeling deadlock detection is important for analyzing configuration changes that could inadvertently degrade network operations, or to determine modifications that could be made by attackers to deliberately induce deadlock. Discrete event simulation of a network may be conducted using random generation of inputs; in this study, we compare random with combinatorial generation. Combinatorial (or t-way) testing requires every combination of any t parameter values to be covered by at least one test. Combinatorial methods can be highly effective because empirical data suggest that nearly all failures involve the interaction of a small number of parameters (1 to 6). Thus, for example, if all deadlocks involve at most 5-way interactions between n parameters, then exhaustive testing of all n-way interactions adds no additional information that would not be obtained by testing all 5-way interactions. While the maximum degree of interaction between parameters involved in the deadlocks clearly cannot be known in advance, covering all t-way interactions may be more efficient than using random generation of inputs. In this study we tested this hypothesis for t = 2, 3, and 4 for deadlock detection in a network simulation. Achieving the same degree of coverage provided by 4-way tests would have required approximately 3.2 times as many random tests; thus combinatorial methods were more efficient for detecting deadlocks involving a higher degree of interactions. The paper reviews explanations for these results and implications for modeling and simulation.
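The coverage criterion described above, that every combination of values for any t of the parameters appears in at least one test, can be checked mechanically. A small sketch (with hypothetical binary parameters, not the simulator's actual configuration space):

```python
from itertools import combinations, product

def covers_t_way(tests, n_params, values, t):
    """True if every combination of values for every choice of t
    of the n_params parameters appears in at least one test."""
    for cols in combinations(range(n_params), t):
        needed = set(product(values, repeat=t))
        seen = {tuple(test[c] for c in cols) for test in tests}
        if not needed <= seen:
            return False
    return True

# 3 binary parameters: these 4 tests cover all 2-way combinations,
# versus 8 tests for exhaustive (3-way) coverage.
suite = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0)]
print(covers_t_way(suite, 3, (0, 1), 2))  # True
print(covers_t_way(suite, 3, (0, 1), 3))  # False
```

The gap between 4 tests for 2-way coverage and 8 for exhaustive coverage is the efficiency argument the study quantifies at larger scale.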
Formal Operations and Ego Identity in Adolescence.
ERIC Educational Resources Information Center
Wagner, Janis A.
1987-01-01
Investigated the relationship between the development of formal operations and the formation of ego identity in adolescence. Obtained significant positive correlations between combinatorial ability and degree of identity, suggesting that high identity may facilitate the application of combinatorial operations. Found some gender differences in task…
Manipulating Combinatorial Structures.
ERIC Educational Resources Information Center
Labelle, Gilbert
This set of transparencies shows how the manipulation of combinatorial structures in the context of modern combinatorics can easily lead to interesting teaching and learning activities at every level of education from elementary school to university. The transparencies describe: (1) the importance and relations of combinatorics to science and…
Gian-Carlos Rota and Combinatorial Math.
ERIC Educational Resources Information Center
Kolata, Gina Bari
1979-01-01
Presents the first of a series of occasional articles about mathematics as seen through the eyes of its prominent scholars. In an interview with Gian-Carlos Rota of the Massachusetts Institute of Technology he discusses how combinatorial mathematics began as a field and its future. (HM)
A Model of Students' Combinatorial Thinking
ERIC Educational Resources Information Center
Lockwood, Elise
2013-01-01
Combinatorial topics have become increasingly prevalent in K-12 and undergraduate curricula, yet research on combinatorics education indicates that students face difficulties when solving counting problems. The research community has not yet addressed students' ways of thinking at a level that facilitates deeper understanding of how students…
The LATL as locus of composition: MEG evidence from English and Arabic.
Westerlund, Masha; Kastner, Itamar; Al Kaabi, Meera; Pylkkänen, Liina
2015-02-01
Neurolinguistic investigations into the processing of structured sentences as well as simple adjective-noun phrases point to the left anterior temporal lobe (LATL) as a leading candidate for basic linguistic composition. Here, we characterized the combinatory profile of the LATL over a variety of syntactic and semantic environments, and across two languages, English and Arabic. The contribution of the LATL was investigated across two types of composition: the optional modification of a predicate (modification) and the satisfaction of a predicate's argument position (argument saturation). Target words were presented during MEG recordings, either in combinatory contexts (e.g. "eats meat") or in non-combinatory contexts (preceded by an unpronounceable consonant string, e.g. "xqkr meat"). Across both languages, the LATL showed increased responses to words in combinatory contexts, an effect that was robust to composition type and word order. Together with related findings, these results solidify the role of the LATL in basic semantic composition. Copyright © 2014 Elsevier Inc. All rights reserved.
DNA Assembly Techniques for Next Generation Combinatorial Biosynthesis of Natural Products
Cobb, Ryan E.; Ning, Jonathan C.; Zhao, Huimin
2013-01-01
Natural product scaffolds remain important leads for pharmaceutical development. However, transforming a natural product into a drug entity often requires derivatization to enhance the compound’s therapeutic properties. A powerful method by which to perform this derivatization is combinatorial biosynthesis, the manipulation of the genes in the corresponding pathway to divert synthesis towards novel derivatives. While these manipulations have traditionally been carried out via restriction digestion/ligation-based cloning, the shortcomings of such techniques limit their throughput and thus the scope of corresponding combinatorial biosynthesis experiments. In the burgeoning field of synthetic biology, the demand for facile DNA assembly techniques has promoted the development of a host of novel DNA assembly strategies. Here we describe the advantages of these recently-developed tools for rapid, efficient synthesis of large DNA constructs. We also discuss their potential to facilitate the simultaneous assembly of complete libraries of natural product biosynthetic pathways, ushering in the next generation of combinatorial biosynthesis. PMID:24127070
Wang, Yen-Ling
2014-01-01
Checkpoint kinase 2 (Chk2) plays an important role in the DNA-damage response, particularly to DNA double-strand breaks and related lesions. In this study we concentrate on Chk2, with the aim of finding potential inhibitors through pharmacophore hypotheses (PhModels), combinatorial fusion, and virtual screening techniques. Applying combinatorial fusion to PhModels and virtual screening is a novel strategy for drug design. We used combinatorial fusion to analyze the prediction results and obtained the best correlation coefficient for the testing set (r_test = 0.816) by combining the BesttrainBesttest and FasttrainFasttest prediction results. Potential inhibitors were selected from the NCI database by screening according to the BesttrainBesttest + FasttrainFasttest prediction results and by molecular docking with the CDOCKER docking program. The selected compounds show high interaction energy between ligand and receptor. Through these approaches, 23 potential inhibitors of Chk2 were retrieved for further study. PMID:24864236
A methodology to find the elementary landscape decomposition of combinatorial optimization problems.
Chicano, Francisco; Whitley, L Darrell; Alba, Enrique
2011-01-01
A small number of combinatorial optimization problems have search spaces that correspond to elementary landscapes, where the objective function f is an eigenfunction of the Laplacian that describes the neighborhood structure of the search space. Many problems are not elementary; however, the objective function of a combinatorial optimization problem can always be expressed as a superposition of multiple elementary landscapes if the underlying neighborhood used is symmetric. This paper presents theoretical results that provide the foundation for algebraic methods that can be used to decompose the objective function of an arbitrary combinatorial optimization problem into a sum of subfunctions, where each subfunction is an elementary landscape. Many steps of this process can be automated, and indeed a software tool could be developed that assists the researcher in finding a landscape decomposition. This methodology is then used to show that the subset sum problem is a superposition of two elementary landscapes, and to show that the quadratic assignment problem is a superposition of three elementary landscapes.
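The elementary-landscape condition referred to here is Grover's wave equation: the average of f over the d neighbors of any point x equals f(x) + (k/d)(f_bar - f(x)) for a constant k, where f_bar is the mean of f over the whole search space. A small numerical check, using ONEMAX under one-bit-flip moves as the example (a standard elementary landscape, chosen for illustration; it is not one of the problems analyzed in the paper):

```python
from itertools import product

n = 4                      # bit-string length
d = n                      # one-bit-flip neighborhood size
k = 2                      # eigenvalue constant for ONEMAX
f = lambda x: sum(x)       # ONEMAX objective: count of ones
points = list(product((0, 1), repeat=n))
f_bar = sum(f(x) for x in points) / len(points)

def neighbor_avg(x):
    # Average objective value over all one-bit-flip neighbors of x.
    return sum(f(x[:i] + (1 - x[i],) + x[i + 1:]) for i in range(n)) / n

# Grover's wave equation must hold at every point of an
# elementary landscape.
ok = all(abs(neighbor_avg(x) - (f(x) + (k / d) * (f_bar - f(x)))) < 1e-12
         for x in points)
print(ok)  # True
```

For a non-elementary objective the same check fails at some point, which is exactly what the paper's decomposition machinery addresses: writing such an f as a sum of subfunctions that each pass this test.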
Grand Unification as a Bridge Between String Theory and Phenomenology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pati, Jogesh C.
2006-06-09
In the first part of the talk, I explain what empirical evidence points to the need for having an effective grand unification-like symmetry possessing the symmetry SU(4)-color in 4D. If one assumes the premises of a future predictive theory including gravity--be it string/M theory or a reincarnation--this evidence then suggests that such a theory should lead to an effective grand unification-like symmetry as above in 4D, near the string-GUT-scale, rather than the standard model symmetry. Advantages of an effective supersymmetric G(224) = SU(2)_L x SU(2)_R x SU(4)^c or SO(10) symmetry in 4D in explaining (1) observed neutrino oscillations, (2) baryogenesis via leptogenesis, and (3) certain fermion mass-relations are noted. And certain distinguishing tests of a SUSY G(224) or SO(10)-framework involving CP and flavor violations (as in μ → eγ, τ → μγ, edm's of the neutron and the electron) as well as proton decay are briefly mentioned. Recalling some of the successes we have had in our understanding of nature so far, and the current difficulties of string/M theory as regards the large multiplicity of string vacua, some comments are made on the traditional goal of understanding vis-a-vis the recently evolved view of landscape and anthropism.
Unification of quantum information theory
NASA Astrophysics Data System (ADS)
Abeyesinghe, Anura
We present the unification of many previously disparate results in noisy quantum Shannon theory and the unification of all of noiseless quantum Shannon theory. More specifically we deal here with bipartite, unidirectional, and memoryless quantum Shannon theory. We find all the optimal protocols and quantify the relationship between the resources used, both for the one-shot and for the ensemble case, for what is arguably the most fundamental task in quantum information theory: sharing entangled states between a sender and a receiver. We find that all of these protocols are derived from our one-shot superdense coding protocol and relate nicely to each other. We then move on to noisy quantum information theory and give a simple, direct proof of the "mother" protocol, or rather her generalization to the Fully Quantum Slepian-Wolf protocol (FQSW). FQSW simultaneously accomplishes two goals: quantum communication-assisted entanglement distillation, and state transfer from the sender to the receiver. As a result, in addition to her other "children," the mother protocol generates the state merging primitive of Horodecki, Oppenheim, and Winter as well as a new class of distributed compression protocols for correlated quantum sources, which are optimal for sources described by separable density operators. Moreover, the mother protocol described here is easily transformed into the so-called "father" protocol, demonstrating that the division of single-sender/single-receiver protocols into two families was unnecessary: all protocols in the family are children of the mother.
Common origin of 3.55 keV x-ray line and gauge coupling unification with left-right dark matter
NASA Astrophysics Data System (ADS)
Borah, Debasish; Dasgupta, Arnab; Patra, Sudhanwa
2017-12-01
We present a minimal left-right dark matter framework that can simultaneously explain the recently observed 3.55 keV x-ray line from several galaxy clusters and gauge coupling unification at a high energy scale. Adopting a minimal dark matter strategy, we consider both left and right handed triplet fermionic dark matter candidates which are stable by virtue of a remnant Z_2 ≃ (-1)^(B-L) symmetry arising after the spontaneous symmetry breaking of left-right gauge symmetry to that of the standard model. A scalar bitriplet field is incorporated whose first role is to allow radiative decay of the right handed triplet dark matter into the left handed one and a photon with energy 3.55 keV. The other role this bitriplet field plays at the TeV scale is to assist in achieving gauge coupling unification at a high energy scale within a nonsupersymmetric SO(10) model while keeping the scale of left-right gauge symmetry around the TeV corner. Apart from solving the neutrino mass problem and giving verifiable new contributions to neutrinoless double beta decay and charged lepton flavor violation, the model with TeV scale gauge bosons can also give rise to interesting collider signatures like the diboson excess and the dilepton plus two jets excess reported recently in the large hadron collider data.
Spatiotemporal Coding of Individual Chemicals by the Gustatory System
Reiter, Sam; Campillo Rodriguez, Chelsey; Sun, Kui; Stopfer, Mark
2015-01-01
Four of the five major sensory systems (vision, olfaction, somatosensation, and audition) are thought to use different but partially overlapping sets of neurons to form unique representations of vast numbers of stimuli. The only exception is gustation, which is thought to represent only small numbers of basic taste categories. However, using new methods for delivering tastant chemicals and making electrophysiological recordings from the tractable gustatory system of the moth Manduca sexta, we found that chemical-specific information is: (1) initially encoded in the population of gustatory receptor neurons as broadly distributed spatiotemporal patterns of activity; (2) dramatically integrated and temporally transformed as it propagates to monosynaptically connected second-order neurons; and (3) observed in tastant-specific behavior. Our results are consistent with an emerging view of the gustatory system: rather than constructing basic taste categories, it uses a spatiotemporal population code to generate unique neural representations of individual tastant chemicals. SIGNIFICANCE STATEMENT Our results provide a new view of taste processing. Using a new, relatively simple model system and a new set of techniques to deliver taste stimuli and to examine gustatory receptor neurons and their immediate followers, we found no evidence for labeled line connectivity, or basic taste categories such as sweet, salty, bitter, and sour. Rather, individual tastant chemicals are represented as patterns of spiking activity distributed across populations of receptor neurons. These representations are transformed substantially as multiple types of receptor neurons converge upon follower neurons, leading to a combinatorial coding format that uniquely, rapidly, and efficiently represents individual taste chemicals. Finally, we found that the information content of these neurons can drive tastant-specific behavior. PMID:26338341
Houghten, Richard A; Dooley, Colette T; Appel, Jon R
2006-05-26
The use of combinatorial libraries for the identification of novel opiate and related ligands in opioid receptor assays is reviewed. Case studies involving opioid assays used to demonstrate the viability of combinatorial libraries are described. The identification of new opioid peptides composed of L-amino acids, D-amino acids, or L-, D-, and unnatural amino acids is reviewed. New opioid compounds have also been identified from peptidomimetic libraries, such as peptoids and alkylated dipeptides, and those identified from acyclic (e.g., polyamine, urea) and heterocyclic (e.g., bicyclic guanidine) libraries are reviewed.
Sentence Processing in an Artificial Language: Learning and Using Combinatorial Constraints
ERIC Educational Resources Information Center
Amato, Michael S.; MacDonald, Maryellen C.
2010-01-01
A study combining artificial grammar and sentence comprehension methods investigated the learning and online use of probabilistic, nonadjacent combinatorial constraints. Participants learned a small artificial language describing cartoon monsters acting on objects. Self-paced reading of sentences in the artificial language revealed comprehenders'…
Experimental Design for Combinatorial and High Throughput Materials Development
NASA Astrophysics Data System (ADS)
Cawse, James N.
2002-12-01
In the past decade, combinatorial and high throughput experimental methods have revolutionized the pharmaceutical industry, allowing researchers to conduct more experiments in a week than was previously possible in a year. Now high throughput experimentation is rapidly spreading from its origins in the pharmaceutical world to larger industrial research establishments such as GE and DuPont, and even to smaller companies and universities. Consequently, researchers need to know the kinds of problems, desired outcomes, and appropriate patterns for these new strategies. Editor James Cawse's far-reaching study identifies and applies, with specific examples, these important new principles and techniques. Experimental Design for Combinatorial and High Throughput Materials Development progresses from methods that are now standard, such as gradient arrays, to mathematical developments that are breaking new ground. The former will be particularly useful to researchers entering the field, while the latter should inspire and challenge advanced practitioners. The book's contents are contributed by leading researchers in their respective fields. Chapters include: High Throughput Synthetic Approaches for the Investigation of Inorganic Phase Space; Combinatorial Mapping of Polymer Blends Phase Behavior; Split-Plot Designs; Artificial Neural Networks in Catalyst Development; and The Monte Carlo Approach to Library Design and Redesign. This book also contains over 200 useful charts and drawings. Industrial chemists, chemical engineers, materials scientists, and physicists working in combinatorial and high throughput chemistry will find James Cawse's study to be an invaluable resource.
2014-01-01
All-oxide-based photovoltaics (PVs) encompass the potential for extremely low cost solar cells, provided they can obtain an order of magnitude improvement in their power conversion efficiencies. To achieve this goal, we perform a combinatorial materials study of metal oxide based light absorbers, charge transporters, junctions between them, and PV devices. Here we report the development of a combinatorial internal quantum efficiency (IQE) method. IQE measures the efficiency associated with the charge separation and collection processes, and thus is a proxy for PV activity of materials once placed into devices, discarding optical properties that cause uncontrolled light harvesting. The IQE is supported by high-throughput techniques for bandgap fitting, composition analysis, and thickness mapping, which are also crucial parameters for the combinatorial investigation cycle of photovoltaics. As a model system we use a library of 169 solar cells with a varying thickness of sprayed titanium dioxide (TiO2) as the window layer, and covarying thickness and composition of binary compounds of copper oxides (Cu–O) as the light absorber, fabricated by Pulsed Laser Deposition (PLD). The analysis on the combinatorial devices shows the correlation between compositions and bandgap, and their effect on PV activity within several device configurations. The analysis suggests that the presence of Cu4O3 plays a significant role in the PV activity of binary Cu–O compounds. PMID:24410367
Making Temporal Logic Calculational: A Tool for Unification and Discovery
NASA Astrophysics Data System (ADS)
Boute, Raymond
In temporal logic, calculational proofs beyond simple cases are often seen as challenging. The situation is reversed by making temporal logic calculational, yielding shorter and clearer proofs than traditional ones, and serving as a (mental) tool for unification and discovery. A side-effect of unifying theories is easier access by practitioners. The starting point is a simple generic (software tool independent) Functional Temporal Calculus (FTC). Specific temporal logics are then captured via endosemantic functions. This concept reflects tacit conventions throughout mathematics and, once identified, is general and useful. FTC also yields a reasoning style that helps in discovering theorems by calculation rather than just proving given facts. This is illustrated by deriving various theorems, most related to liveness issues in TLA+, and finding strengthenings of known results. Educational issues are addressed in passing.
USDA-ARS?s Scientific Manuscript database
Plant cell wall polysaccharides, which consist of polymeric backbones with various types of substitution, were studied using the concept of combinatorial enzyme technology for conversion of agricultural fibers to functional products. Using citrus pectin as the starting substrate, an active oligo spe...
Students' Verification Strategies for Combinatorial Problems
ERIC Educational Resources Information Center
Mashiach Eizenberg, Michal; Zaslavsky, Orit
2004-01-01
We focus on a major difficulty in solving combinatorial problems, namely, on the verification of a solution. Our study aimed at identifying undergraduate students' tendencies to verify their solutions, and the verification strategies that they employ when solving these problems. In addition, an attempt was made to evaluate the level of efficiency…
ERIC Educational Resources Information Center
Hubert, Lawrence J.; Baker, Frank B.
1978-01-01
The "Traveling Salesman" and similar combinatorial programming tasks encountered in operations research are discussed as possible data analysis models in psychology, for example, in developmental scaling, Guttman scaling, profile smoothing, and data array clustering. A short overview of various computational approaches from this area of…
NASA Astrophysics Data System (ADS)
Vergos, Georgios S.; Erol, Bihter; Natsiopoulos, Dimitrios A.; Grigoriadis, Vassilios N.; Serkan Işık, Mustafa; Tziavos, Ilias N.
2016-04-01
The unification of local vertical datums (LVDs) at a country-wide scale has gained significant attention lately, due to the availability of GOCE-based Global Geopotential Models (GGMs). The latter offer unprecedented geoid height accuracies at the 1-1.5 cm level for spherical harmonic expansions to d/o 225-230. Within a single country, several LVDs may be in use, especially in the case of island nations; their unification into a single nation-wide LVD is therefore of utmost importance. The same holds for neighboring countries, where the unification of vertical datums is a necessary tool for engineering, cross-border collaboration, and environmental and risk management projects. These considerations set the main scope of the present study, which used GOCE and GOCE/GRACE GGMs to unify the LVDs of Greece and Turkey. The two countries share common borders and lie on the path of large-scale engineering projects in the energy sector. Therefore, the availability of a common reference for orthometric heights in both countries, and/or the determination of the relative offset between their individual zero-level geopotential values, is an emerging issue. The geopotential value Wo(LVD) for the Greek and Turkish LVDs was first determined separately for each region, with separate estimates for the marine area of the Aegean Sea and the terrestrial border region along eastern Thrace. From that, possible biases of the Hellenic and Turkish LVDs themselves were identified and analyzed for spatial correlations. Then, the relative offset between the two LVDs was determined employing GPS/levelling data for both areas and the latest GO-DIR-R5, GO-TIM-R5 and GOCO05s models as well as EGM2008.
The estimated mean offset was also used to provide a direct link between the Greek and Turkish LVDs and the IAG conventional value recently proposed as a Wo for a global WHS.
Reutlinger, Michael; Rodrigues, Tiago; Schneider, Petra; Schneider, Gisbert
2014-01-07
Using the example of the Ugi three-component reaction we report a fast and efficient microfluidic-assisted entry into the imidazopyridine scaffold, where building block prioritization was coupled to a new computational method for predicting ligand-target associations. We identified an innovative GPCR-modulating combinatorial chemotype featuring ligand-efficient adenosine A1/2B and adrenergic α1A/B receptor antagonists. Our results suggest the tight integration of microfluidics-assisted synthesis with computer-based target prediction as a viable approach to rapidly generate bioactivity-focused combinatorial compound libraries with high success rates. Copyright © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Generation of Dynamic Combinatorial Libraries Using Hydrazone‐Functionalized Surface Mimetics
Hewitt, Sarah H.
2018-01-01
Dynamic combinatorial chemistry (DCC) represents an approach whereby traditional supramolecular scaffolds used for protein surface recognition might be exploited to achieve selective high-affinity target recognition. Synthesis, in situ screening and amplification under selection pressure allow the generation of ligands which bear different moieties capable of making multivalent non-covalent interactions with target proteins. Generic tetracarboxyphenyl porphyrin scaffolds bearing four hydrazide moieties have been used to form dynamic combinatorial libraries (DCLs) using aniline-catalyzed reversible hydrazone exchange reactions, in 10 % DMSO, 5 mM NH4OAc, at pH 6.75. High resolution mass spectrometry (HRMS) was used to monitor library composition and establish conditions under which equilibria were established.
Transcriptional Architecture of Synaptic Communication Delineates GABAergic Neuron Identity.
Paul, Anirban; Crow, Megan; Raudales, Ricardo; He, Miao; Gillis, Jesse; Huang, Z Josh
2017-10-19
Understanding the organizational logic of neural circuits requires deciphering the biological basis of neuronal diversity and identity, but there is no consensus on how neuron types should be defined. We analyzed single-cell transcriptomes of a set of anatomically and physiologically characterized cortical GABAergic neurons and conducted a computational genomic screen for transcriptional profiles that distinguish them from one another. We discovered that cardinal GABAergic neuron types are delineated by a transcriptional architecture that encodes their synaptic communication patterns. This architecture comprises 6 categories of ∼40 gene families, including cell-adhesion molecules, transmitter-modulator receptors, ion channels, signaling proteins, neuropeptides and vesicular release components, and transcription factors. Combinatorial expression of select members across families shapes a multi-layered molecular scaffold along the cell membrane that may customize synaptic connectivity patterns and input-output signaling properties. This molecular genetic framework of neuronal identity integrates cell phenotypes along multiple axes and provides a foundation for discovering and classifying neuron types. Copyright © 2017 Elsevier Inc. All rights reserved.
Automated Combinatorial Chemistry in the Organic Chemistry Majors Laboratory
ERIC Educational Resources Information Center
Nichols, Christopher J.; Hanne, Larry F.
2010-01-01
A multidisciplinary experiment has been developed in which students each synthesize a combinatorial library of 48 hydrazones with the aid of a liquid-handling robot. Each product is then subjected to a Kirby-Bauer disk diffusion assay to assess its antibacterial activity. Students gain experience working with automation and at the…
More Combinatorial Proofs via Flagpole Arrangements
ERIC Educational Resources Information Center
DeTemple, Duane; Reynolds, H. David, II
2006-01-01
Combinatorial identities are proved by counting the number of arrangements of a flagpole and guy wires on a row of blocks that satisfy a set of conditions. An identity is proved by first deriving and then equating two expressions that each count the number of permissible arrangements. Identities for binomial coefficients and recursion relations…
ERIC Educational Resources Information Center
Tsai, Yu-Ling; Chang, Ching-Kuch
2009-01-01
This article reports an alternative approach, called the combinatorial model, to learning multiplicative identities, and investigates the effects of implementing results for this alternative approach. Based on realistic mathematics education theory, the new instructional materials or modules of the new approach were developed by the authors. From…
Children's Strategies for Solving Two- and Three-Dimensional Combinatorial Problems.
ERIC Educational Resources Information Center
English, Lyn D.
1993-01-01
Investigated strategies that 7- to 12-year-old children (n=96) spontaneously applied in solving novel combinatorial problems. With experience in solving two-dimensional problems, children were able to refine their strategies and adapt them to three dimensions. Results on some problems indicated significant effects of age. (Contains 32 references.)…
Identities for Generalized Fibonacci Numbers: A Combinatorial Approach
ERIC Educational Resources Information Center
Plaza, A.; Falcon, S.
2008-01-01
This note shows a combinatorial approach to some identities for generalized Fibonacci numbers. While it is a straightforward task to prove these identities with induction, and also by arithmetical manipulations such as rearrangements, the approach used here is quite simple to follow and eventually reduces the proof to a counting problem. (Contains…
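As a concrete instance of the kind of identity such counting arguments establish, the sketch below numerically checks a classic Fibonacci convolution identity; the specific identities in the note may differ, and the convention F(0) = 0, F(1) = 1 is an assumption here:

```python
# Numerically check the classic identity
#   F(m+n) = F(m) * F(n+1) + F(m-1) * F(n),
# one of the identities a combinatorial (tiling) argument proves bijectively.
# Convention assumed: F(0) = 0, F(1) = 1.

from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    return n if n < 2 else fib(n - 1) + fib(n - 2)

for m in range(1, 15):
    for n in range(1, 15):
        assert fib(m + n) == fib(m) * fib(n + 1) + fib(m - 1) * fib(n)
print("identity holds on the tested range")
```

The combinatorial proof counts tilings of a strip of length m+n-1 by squares and dominoes, split according to whether a domino crosses position m; the numeric check above merely confirms the identity on a finite range.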
ERIC Educational Resources Information Center
Kittredge, Kevin W.; Marine, Susan S.; Taylor, Richard T.
2004-01-01
A molecule possessing other functional groups that could be hydrogenated is examined, with a variety of metal catalysts evaluated under similar reaction conditions. Optimizing organic reactions is both time- and labor-intensive, and the use of a combinatorial parallel synthesis reactor proved a great time-saving device.
Human Performance on the Traveling Salesman and Related Problems: A Review
ERIC Educational Resources Information Center
MacGregor, James N.; Chu, Yun
2011-01-01
The article provides a review of recent research on human performance on the traveling salesman problem (TSP) and related combinatorial optimization problems. We discuss what combinatorial optimization problems are, why they are important, and why they may be of interest to cognitive scientists. We next describe the main characteristics of human…
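One baseline often contrasted with human tours in this literature is the greedy nearest-neighbour construction. The sketch below uses made-up coordinates and is not taken from the review itself:

```python
# Minimal nearest-neighbour construction for the TSP: from the current city,
# always move to the closest unvisited city. Coordinates are illustrative.

import math

def nearest_neighbour_tour(points, start=0):
    unvisited = [i for i in range(len(points)) if i != start]
    tour = [start]
    while unvisited:
        last = points[tour[-1]]
        nxt = min(unvisited, key=lambda i: math.dist(last, points[i]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

cities = [(0, 0), (1, 0), (1, 1), (0, 1), (2, 0.5)]
print(nearest_neighbour_tour(cities))  # -> [0, 1, 2, 3, 4]
```

Human solvers typically outperform this heuristic on small planar instances, which is part of what makes the TSP interesting to cognitive scientists.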
ERIC Educational Resources Information Center
Brusco, Michael J.; Kohn, Hans-Friedrich; Stahl, Stephanie
2008-01-01
Dynamic programming methods for matrix permutation problems in combinatorial data analysis can produce globally-optimal solutions for matrices up to size 30x30, but are computationally infeasible for larger matrices because of enormous computer memory requirements. Branch-and-bound methods also guarantee globally-optimal solutions, but computation…
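The memory wall the abstract mentions is easy to make concrete: subset dynamic programming of the Held-Karp variety stores one value per subset of objects, i.e. 2**n states. The 8-bytes-per-state figure below is an assumption for illustration:

```python
# Why subset dynamic programming hits a memory wall near n = 30:
# one stored value per subset means 2**n states.
# Assumes 8 bytes per state value (illustrative).

def dp_states(n):
    return 2 ** n

for n in (20, 25, 30, 35):
    bytes_needed = dp_states(n) * 8
    print(f"n={n}: {dp_states(n):>12,} states ~ {bytes_needed / 2**30:.3f} GiB")
```

At n = 30 the table alone needs about 8 GiB, and each extra object doubles it, which is why branch-and-bound (trading memory for time) becomes the only exact option beyond that size.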
Iconicity and the Emergence of Combinatorial Structure in Language
ERIC Educational Resources Information Center
Verhoef, Tessa; Kirby, Simon; de Boer, Bart
2016-01-01
In language, recombination of a discrete set of meaningless building blocks forms an unlimited set of possible utterances. How such combinatorial structure emerged in the evolution of human language is increasingly being studied. It has been shown that it can emerge when languages culturally evolve and adapt to human cognitive biases. How the…
Osaba, E; Carballedo, R; Diaz, F; Onieva, E; de la Iglesia, I; Perallos, A
2014-01-01
Since their first formulation, genetic algorithms (GAs) have been one of the most widely used techniques to solve combinatorial optimization problems. The basic structure of GAs is well known to the scientific community, and thanks to their easy application and good performance, GAs are the focus of much research every year. Although many studies have analyzed various aspects of GAs throughout their history, few in the literature objectively analyze the influence of using blind crossover operators for combinatorial optimization problems. For this reason, this paper conducts an in-depth study of that influence. The study is based on a comparison of nine techniques applied to four well-known combinatorial optimization problems. Six of the techniques are GAs with different configurations, and the remaining three are evolutionary algorithms that focus exclusively on the mutation process. Finally, to compare the results reliably, a statistical analysis is performed using the normal-distribution z-test.
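A minimal sketch of the kind of comparison described: a GA with a blind (problem-agnostic) one-point crossover versus a mutation-only evolutionary loop, here on the toy OneMax problem rather than the paper's four benchmark problems. All parameters are invented for illustration:

```python
# Toy contrast: GA with blind one-point crossover vs. mutation-only EA,
# on OneMax (maximize the number of 1-bits). Illustrative parameters only.

import random

random.seed(42)
N, POP, GENS = 40, 30, 60
fitness = sum  # OneMax: count of 1-bits

def mutate(bits, rate=1.0):
    # flip each bit with probability rate/len(bits)
    return [b ^ (random.random() < rate / len(bits)) for b in bits]

def blind_crossover(a, b):
    cut = random.randrange(1, len(a))  # cut point ignores problem structure
    return a[:cut] + b[cut:]

def evolve(use_crossover):
    pop = [[random.randint(0, 1) for _ in range(N)] for _ in range(POP)]
    for _ in range(GENS):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:POP // 2]          # truncation selection
        children = []
        while len(children) < POP:
            a, b = random.sample(parents, 2)
            child = blind_crossover(a, b) if use_crossover else a[:]
            children.append(mutate(child))
        pop = children
    return max(map(fitness, pop))

print("GA with blind crossover :", evolve(True))
print("mutation-only EA        :", evolve(False))
```

On OneMax both variants converge; the paper's point is that on structured combinatorial problems the benefit of blind crossover is far less obvious, which is what the nine-technique comparison quantifies.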
Lin, Chun-Yuan; Wang, Yen-Ling
2014-01-01
Checkpoint kinase 2 (Chk2) plays an important role in the DNA-damage response, particularly to DNA double-strand breaks and related lesions. In this study we concentrate on Chk2, with the aim of finding potential inhibitors via pharmacophore hypotheses (PhModels), combinatorial fusion, and virtual screening techniques. Applying combinatorial fusion to PhModels and virtual screening is a novel strategy for drug design. We used combinatorial fusion to analyze the prediction results and obtained the best testing-set correlation coefficient (r test = 0.816) by combining the Best(train)Best(test) and Fast(train)Fast(test) prediction results. Potential inhibitors were selected from the NCI database by screening according to the Best(train)Best(test) + Fast(train)Fast(test) prediction results and by molecular docking with the CDOCKER docking program. The selected compounds exhibit high ligand-receptor interaction energies. Through these approaches, 23 potential inhibitors of Chk2 were retrieved for further study.
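The score-combination step can be sketched generically: merge two models' predictions and evaluate the fit against experiment with Pearson's r. The data values below are invented, and simple averaging stands in for whatever fusion rule the study actually used:

```python
# Generic sketch of score fusion: average two models' predicted activities
# and measure agreement with experiment via Pearson's r.
# All values are invented for illustration.

import statistics

def pearson_r(x, y):
    mx, my = statistics.fmean(x), statistics.fmean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

best = [5.1, 6.3, 4.8, 7.0, 5.9]   # hypothetical Best(train)Best(test) scores
fast = [5.4, 6.0, 5.1, 6.8, 6.2]   # hypothetical Fast(train)Fast(test) scores
experimental = [5.2, 6.4, 4.9, 7.1, 6.0]

fused = [(a + b) / 2 for a, b in zip(best, fast)]
print(round(pearson_r(fused, experimental), 3))
```

The attraction of fusion is that combined rankings can correlate with experiment better than either model alone, which is the effect the r test = 0.816 result reflects.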
A combinatorial filtering method for magnetotelluric time-series based on Hilbert-Huang transform
NASA Astrophysics Data System (ADS)
Cai, Jianhua
2014-11-01
Magnetotelluric (MT) time-series are often contaminated with noise from natural or man-made processes, and substantial improvement is possible when the time-series are made as clean as possible before further processing. A combinatorial method is described for filtering MT time-series based on the Hilbert-Huang transform that requires minimal human intervention and leaves good data sections unchanged. Good data sections are preserved because, after empirical mode decomposition, the data are analysed through hierarchies, morphological filtering, adaptive thresholding and multi-point smoothing, allowing separation of noise from signal. The combinatorial method can be carried out without any assumption about the data distribution. Simulated data and real measured MT time-series from three different regions, with noise caused by baseline drift, high-frequency noise and power-line contributions, are processed to demonstrate the application of the proposed method. The results highlight the ability of the combinatorial method to pick out useful signals; the noise is suppressed greatly, so that its deleterious influence on MT transfer function estimation is eliminated.
Yu, Jong-Sung; Kim, Min-Sik; Kim, Jung Ho
2010-12-14
Combinatorial synthesis and screening were used to identify methanol-tolerant non-platinum cathode electrocatalysts for use in direct methanol fuel cells (DMFCs). Oxygen reduction consumes protons at the surface of DMFC cathode catalysts. In combinatorial screening, this pH change allows one to differentiate active catalysts using fluorescent acid-base indicators. Combinatorial libraries of carbon-supported catalyst compositions containing Ru, Mo, W, Sn, and Se were screened. Ternary and quaternary compositions containing Ru, Sn, Mo, Se were more active than the "standard" Alonso-Vante catalyst, Ru(3)Mo(0.08)Se(2), when tested in liquid-feed DMFCs. Physical characterization of the most active catalysts by powder X-ray diffraction, gas adsorption, and X-ray photoelectron spectroscopy revealed that the predominant crystalline phase was hexagonal close-packed (hcp) ruthenium, and showed a surface mostly covered with oxide. The best new catalyst, Ru(7.0)Sn(1.0)Se(1.0), was significantly more active than Ru(3)Se(2)Mo(0.08), even though the latter contained smaller particles.
Combinatorial Characterization of TiO2 Chemical Vapor Deposition Utilizing Titanium Isopropoxide.
Reinke, Michael; Ponomarev, Evgeniy; Kuzminykh, Yury; Hoffmann, Patrik
2015-07-13
The combinatorial characterization of the growth kinetics in chemical vapor deposition processes is challenging because precise information about the local precursor flow is usually difficult to access. In consequence, combinatorial chemical vapor deposition techniques are used more to study functional properties of thin films as a function of chemical composition, growth rate or crystallinity than to study the growth process itself. We present an experimental procedure which allows the combinatorial study of precursor surface kinetics during film growth using high vacuum chemical vapor deposition. As a consequence of the high-vacuum environment, the precursor transport takes place in the molecular flow regime, which allows precursor impingement rates on the substrate to be predicted and modified with comparatively little experimental effort. In this contribution, we study the surface kinetics of titanium dioxide formation using titanium tetraisopropoxide as the precursor molecule over a large parameter range. We discuss precursor-flux- and temperature-dependent morphology, crystallinity, growth rates, and precursor deposition efficiency. We conclude that the surface reaction of the adsorbed precursor molecules comprises a higher-order reaction component with respect to precursor surface coverage.
Azimi, Sayyed M; Sheridan, Steven D; Ghannad-Rezaie, Mostafa; Eimon, Peter M; Yanik, Mehmet Fatih
2018-05-01
Identification of optimal transcription-factor expression patterns to direct cellular differentiation along a desired pathway presents significant challenges. We demonstrate massively combinatorial screening of temporally varying mRNA transcription factors to direct differentiation of neural progenitor cells, using a dynamically reconfigurable magnetically guided spotting technology for localizing mRNA that enables experiments on millimetre-sized spots. In addition, we present a time-interleaved delivery method that dramatically reduces fluctuations in the delivered transcription-factor copy numbers per cell. We screened combinatorial and temporal delivery of a pool of midbrain-specific transcription factors to augment the generation of dopaminergic neurons. We show that the combinatorial delivery of LMX1A, FOXA2 and PITX3 is highly effective in generating dopaminergic neurons from midbrain progenitors. We show that LMX1A significantly increases TH expression levels when delivered to neural progenitor cells either during proliferation or after induction of neural differentiation, while FOXA2 and PITX3 increase expression only when delivered prior to induction, demonstrating the temporal dependence of factor addition. © 2018, Azimi et al.
Particle-Based Microarrays of Oligonucleotides and Oligopeptides.
Nesterov-Mueller, Alexander; Maerkle, Frieder; Hahn, Lothar; Foertsch, Tobias; Schillo, Sebastian; Bykovskaya, Valentina; Sedlmayr, Martyna; Weber, Laura K; Ridder, Barbara; Soehindrijo, Miriam; Muenster, Bastian; Striffler, Jakob; Bischoff, F Ralf; Breitling, Frank; Loeffler, Felix F
2014-10-28
In this review, we describe different methods of microarray fabrication based on the use of micro-particles/-beads and point out future tendencies in the development of particle-based arrays. First, we consider oligonucleotide bead arrays, where each bead is a carrier of one specific sequence of oligonucleotides. This bead-based array approach, appearing in the late 1990s, enabled high-throughput oligonucleotide analysis and had a large impact on genome research. Furthermore, we consider particle-based peptide array fabrication using combinatorial chemistry. In this approach, particles can directly participate in both the synthesis and the transfer of synthesized combinatorial molecules to a substrate. Subsequently, we describe in more detail the synthesis of peptide arrays with amino acid polymer particles, which embed the amino acids inside their polymer matrix. By heating these particles, the polymer matrix is transformed into a highly viscous gel, and thereby the embedded monomers are allowed to participate in the coupling reaction. Finally, we focus on combinatorial laser fusing of particles for the synthesis of high-density peptide arrays. This method combines the advantages of particles and combinatorial lithographic approaches.
Minovski, Nikola; Perdih, Andrej; Solmajer, Tom
2012-05-01
The virtual combinatorial chemistry approach, a methodology for generating chemical libraries of structurally similar analogs in a virtual environment, was employed to build a general mixed virtual combinatorial library totalling 53,871 6-FQ structural analogs, based on the real synthetic pathways of three well-known 6-FQ inhibitors. The druggability of the generated combinatorial 6-FQs was assessed using an in-house drug-likeness filter integrating the Lipinski/Veber rule sets. The compounds recognized as drug-like were used as an external set for predicting biological activity values with a neural-network (NN) model based on an experimentally determined set of active 6-FQs. Furthermore, a subset of compounds with predicted biological activity was extracted from the pool of drug-like 6-FQs and subsequently used in a virtual screening (VS) campaign combining pharmacophore modeling and molecular docking studies. This scheme, a powerful combination of chemometric and molecular modeling approaches, provided novel QSAR guidelines that could aid the further lead development of 6-FQ agents.
Chuah, Yon Jin; Zhang, Ying; Wu, Yingnan; Menon, Nishanth V; Goh, Ghim Hian; Lee, Ann Charlene; Chan, Vincent; Zhang, Yilei; Kang, Yuejun
2015-09-01
Cell sheet engineering has been exploited as an alternative approach in tissue regeneration, and the use of stem cells to generate cell sheets has further shown its potential in stem cell-mediated tissue regeneration. There is vast interest in developing strategies to enhance the formation of stem cell sheets for downstream applications. It has been shown that stem cells are sensitive to the biophysical cues of the microenvironment. We therefore hypothesized that combinatorial substratum properties could be tailored to modulate cell sheet formation and further influence multipotency. For validation, polydimethylsiloxane (PDMS) substrates with different combinations of substratum properties (including stiffness, roughness and wettability) were created, on which human bone marrow-derived mesenchymal stem cells (BMSCs) were cultured to form cell sheets, with their multipotency evaluated after induced differentiation. The results showed that different combinatorial effects of these substratum properties influenced BMSC behavior such as adhesion, spreading and proliferation during cell sheet development. Collagen formation within the cell sheet was enhanced on substrates with lower stiffness, higher hydrophobicity and roughness, which further assisted the induced chondrogenesis and osteogenesis, respectively. These findings suggest that combinatorial substratum properties have profound effects on BMSC cell sheet integrity and multipotency, with significant implications for future biomaterial and scaffold designs in the field of BMSC-mediated tissue regeneration. Copyright © 2015 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Yan, Zongkai; Zhang, Xiaokun; Li, Guang; Cui, Yuxing; Jiang, Zhaolian; Liu, Wen; Peng, Zhi; Xiang, Yong
2018-01-01
Conventional methods for designing and preparing thin films based on wet processes remain a challenge because they are time-consuming and inefficient, which hinders the development of novel materials. Herein, we present a high-throughput combinatorial technique for continuous thin-film preparation based on chemical bath deposition (CBD). The method is ideally suited to preparing high-throughput combinatorial material libraries with low decomposition temperatures and high water or oxygen sensitivity at relatively high temperature. To validate this system, a Cu(In, Ga)Se (CIGS) thin-film library doped with 0-19.04 at.% of antimony (Sb) was taken as an example, to systematically evaluate the effect of varying Sb doping concentration on the grain growth, structure, morphology and electrical properties of CIGS thin films. Combined with Energy Dispersive Spectrometry (EDS), X-ray Photoelectron Spectroscopy (XPS), automated X-ray Diffraction (XRD) for rapid screening and Localized Electrochemical Impedance Spectroscopy (LEIS), it was confirmed that this combinatorial high-throughput system can systematically identify the composition with the optimal grain orientation growth, microstructure and electrical properties by accurately monitoring the doping content and material composition. Based on the characterization results, a Sb2Se3 quasi-liquid-phase-promoted CIGS film-growth model is put forward. Beyond the CIGS thin films reported here, combinatorial CBD could also be applied to the high-throughput screening of other sulfide thin-film material systems.
Bemis, Douglas K.; Pylkkänen, Liina
2013-01-01
Debates surrounding the evolution of language often hinge upon its relationship to cognition more generally, and many investigations have attempted to demarcate the boundary between the two. Though results from these studies suggest that language may recruit domain-general mechanisms during certain types of complex processing, the domain-generality of basic combinatorial mechanisms that lie at the core of linguistic processing is still unknown. Our previous work (Bemis and Pylkkänen, 2011, 2012) used magnetoencephalography to isolate neural activity associated with the simple composition of an adjective and a noun (“red boat”) and found increased activity during this processing localized to the left anterior temporal lobe (lATL), ventro-medial prefrontal cortex (vmPFC), and left angular gyrus (lAG). The present study explores the domain-generality of these effects and their associated combinatorial mechanisms through two parallel non-linguistic combinatorial tasks designed to be as minimal and natural as the linguistic paradigm. In the first task, we used pictures of colored shapes to elicit combinatorial conceptual processing similar to that evoked by the linguistic expressions, and again found increased activity localized to the vmPFC during combinatorial processing. This result suggests that a domain-general semantic combinatorial mechanism operates during basic linguistic composition, and that activity generated by its processing localizes to the vmPFC. In the second task, we recorded neural activity as subjects performed simple addition between two small numerals. Consistent with a wide array of recent results, we find no effects related to basic addition that coincide with our linguistic effects, and instead find increased activity localized to the intraparietal sulcus.
This result suggests that the scope of the previously identified linguistic effects is restricted to compositional operations and does not extend generally to all tasks that are merely similar in form. PMID:23293621
Roberts, Gareth; Lewandowski, Jirka; Galantucci, Bruno
2015-08-01
Communication systems are exposed to two different pressures: a pressure for transmission efficiency, such that messages are simple to produce and perceive, and a pressure for referential efficiency, such that messages are easy to understand with their intended meaning. A solution to the first pressure is combinatoriality--the recombination of a few basic meaningless forms to express an infinite number of meanings. A solution to the second is iconicity--the use of forms that resemble what they refer to. These two solutions appear to be incompatible with each other, as iconic forms are ill-suited for use as meaningless combinatorial units. Furthermore, in the early stages of a communication system, when basic referential forms are in the process of being established, the pressure for referential efficiency is likely to be particularly strong, which may lead it to trump the pressure for transmission efficiency. This means that, where iconicity is available as a strategy, it is likely to impede the emergence of combinatoriality. Although this hypothesis seems consistent with some observations of natural language, it was unclear until recently how it could be soundly tested. This has changed thanks to the development of a line of research, known as Experimental Semiotics, in which participants construct novel communication systems in the laboratory using an unfamiliar medium. We conducted an Experimental Semiotic study in which we manipulated the opportunity for iconicity by varying the kind of referents to be communicated, while keeping the communication medium constant. We then measured the combinatoriality and transmission efficiency of the communication systems. We found that, where iconicity was available, it provided scaffolding for the construction of communication systems and was overwhelmingly adopted. Where it was not available, however, the resulting communication systems were more combinatorial and their forms more efficient to produce. 
This study enriches our understanding of the fundamental design principles of human communication and contributes tools to enrich it further. Copyright © 2015 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Green, Martin L.; Takeuchi, Ichiro; Hattrick-Simpers, Jason R.
2013-06-01
High throughput (combinatorial) materials science methodology is a relatively new research paradigm that offers the promise of rapid and efficient materials screening, optimization, and discovery. The paradigm started in the pharmaceutical industry but was rapidly adopted to accelerate materials research in a wide variety of areas. High throughput experiments are characterized by synthesis of a "library" sample that contains the materials variation of interest (typically composition), and rapid and localized measurement schemes that result in massive data sets. Because the data are collected at the same time on the same "library" sample, they can be highly uniform with respect to fixed processing parameters. This article critically reviews the literature pertaining to applications of combinatorial materials science for electronic, magnetic, optical, and energy-related materials. It is expected that high throughput methodologies will facilitate commercialization of novel materials for these critically important applications. Despite the overwhelming evidence presented in this paper that high throughput studies can effectively inform commercial practice, in our perception, it remains an underutilized research and development tool. Part of this perception may be due to the inaccessibility of proprietary industrial research and development practices, but clearly the initial cost and availability of high throughput laboratory equipment plays a role. Combinatorial materials science has traditionally been focused on materials discovery, screening, and optimization to combat the extremely high cost and long development times for new materials and their introduction into commerce. 
Going forward, combinatorial materials science will also be driven by other needs such as materials substitution and experimental verification of materials properties predicted by modeling and simulation, which have recently received much attention with the advent of the Materials Genome Initiative. Thus, the challenge for combinatorial methodology will be the effective coupling of synthesis, characterization and theory, and the ability to rapidly manage large amounts of data in a variety of formats.
ERIC Educational Resources Information Center
Thomsen, Dietrick E.
1979-01-01
Supersymmetry is a newly developed principle with which theorists are attempting to continue the work of unification. This article examines the principle of supersymmetry at the subatomic level and relates it to the quest for a unity theory. (MA)
Unification with vector-like fermions and signals at LHC
NASA Astrophysics Data System (ADS)
Bhattacherjee, Biplob; Byakti, Pritibhajan; Kushwaha, Ashwani; Vempati, Sudhir K.
2018-05-01
We look for minimal extensions of the Standard Model with vector-like fermions leading to precision unification of gauge couplings. Constraints from proton decay, Higgs stability and perturbativity are considered. The simplest models contain several copies of vector-like fermions in two different (incomplete) representations. Some of these models encompass the Type III seesaw mechanism for neutrino masses, whereas some others have a dark matter candidate. In all the models, at least one of the candidates has a non-trivial representation under SU(3) color. In the limit of vanishing Yukawa couplings, new QCD bound states are formed, which can be probed at the LHC. The present limits, based on 13 TeV results, already probe these particles for masses around a TeV. Similar models can be constructed with three or four vector-like representations, examples of which are presented.
Requirements for energy based constitutive modeling in tire mechanics
NASA Technical Reports Server (NTRS)
Luchini, John R.; Peters, Jim M.; Mars, Will V.
1995-01-01
The history, requirements, and theoretical basis of a new energy based constitutive model for (rubber) material elasticity, hysteresis, and failure are presented. Energy based elasticity is handled by many constitutive models, both in one dimension and in three dimensions. Conversion of mechanical energy to heat can be modeled with viscoelasticity or as structural hysteresis. We are seeking unification of elasticity, hysteresis, and failure mechanisms such as fatigue and wear. An energy state characterization for failure criteria of (rubber) materials may provide this unification and also help explain the interaction of temperature effects with failure mechanisms, which are described as creation or growth of internal crack surface. Improved structural modeling of tires with FEM should result from such a unified constitutive theory. The theory will also guide experimental work and should enable better interpretation of the results of computational stress analyses.
The minimal SUSY B - L model: simultaneous Wilson lines and string thresholds
Deen, Rehan; Ovrut, Burt A.; Purves, Austin
2016-07-08
In previous work, we presented a statistical scan over the soft supersymmetry breaking parameters of the minimal SUSY B - L model. For specificity of calculation, unification of the gauge parameters was enforced by allowing the two Z_3 × Z_3 Wilson lines to have mass scales separated by approximately an order of magnitude. This introduced an additional "left-right" sector below the unification scale. In this paper, for three important reasons, we modify our previous analysis by demanding that the mass scales of the two Wilson lines be simultaneous and equal to an "average unification" mass ⟨M_U⟩. The present analysis is 1) more "natural" than the previous calculations, which were only valid in a very specific region of the Calabi-Yau moduli space, 2) conceptually simpler in that the left-right sector has been removed, and 3) such that the lack of gauge unification is due to threshold effects — particularly heavy string thresholds, which we calculate statistically in detail. As in our previous work, the theory is renormalization group evolved from ⟨M_U⟩ to the electroweak scale — being subjected, sequentially, to the requirement of radiative B - L and electroweak symmetry breaking, the present experimental lower bounds on the B - L vector boson and sparticle masses, as well as the lightest neutral Higgs mass of ~125 GeV. The subspace of soft supersymmetry breaking masses that satisfies all such constraints is presented and shown to be substantial.
Computational Unification: a Vision for Connecting Researchers
NASA Astrophysics Data System (ADS)
Troy, R. M.; Kingrey, O. J.
2002-12-01
Computational Unification of science, once only a vision, is becoming a reality. This technology is based upon a scientifically defensible, general solution for Earth Science data management and processing. The computational unification of science offers a real opportunity to foster inter- and intra-discipline cooperation, and an end to 're-inventing the wheel'. As we move forward using computers as tools, it is past time to move from computationally isolating, "one-off" or discipline-specific solutions into a unified framework where research can be more easily shared, especially with researchers in other disciplines. The author will discuss how distributed meta-data, distributed processing and distributed data objects are structured to constitute a working interdisciplinary system, including how these resources lead to scientific defensibility through known lineage of all data products. Illustration of how scientific processes are encapsulated and executed illuminates how previously written processes and functions are integrated into the system efficiently and with minimal effort. Meta-data basics will illustrate how intricate relationships may easily be represented and used to good advantage. Retrieval techniques will be discussed, including trade-offs of using meta-data versus embedded data, how the two may be integrated, and how simplifying assumptions may or may not help. This system is based upon the experience of the Sequoia 2000 and BigSur research projects at the University of California, Berkeley, whose goal was to find an alternative to the Hughes EOS-DIS system; it is presently offered by the Science Tools corporation, of which the author is a principal.
Split Dirac Supersymmetry: An Ultraviolet Completion of Higgsino Dark Matter
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fox, Patrick J.; Kribs, Graham D.; Martin, Adam
2014-10-07
Motivated by the observation that the Higgs quartic coupling runs to zero at an intermediate scale, we propose a new framework for models of split supersymmetry, in which gauginos acquire intermediate scale Dirac masses of $\sim 10^{8-11}$ GeV. Scalar masses arise from one-loop finite contributions as well as direct gravity-mediated contributions. Like split supersymmetry, one Higgs doublet is fine-tuned to be light. The scale at which the Dirac gauginos are introduced to make the Higgs quartic zero is the same as is necessary for gauge coupling unification. Thus, gauge coupling unification persists (nontrivially, due to adjoint multiplets), though with a somewhat higher unification scale $\gtrsim 10^{17}$ GeV. The $\mu$-term is naturally at the weak scale, and provides an opportunity for experimental verification. We present two manifestations of Split Dirac Supersymmetry. In the "Pure Dirac" model, the lightest Higgsino must decay through R-parity violating couplings, leading to an array of interesting signals in colliders. In the "Hypercharge Impure" model, the bino acquires a Majorana mass that is one-loop suppressed compared with the Dirac gluino and wino. This leads to weak scale Higgsino dark matter whose overall mass scale, as well as the mass splitting between the neutral components, is naturally generated from the same UV dynamics. We outline the challenges to discovering pseudo-Dirac Higgsino dark matter in collider and dark matter detection experiments.
Lie-Santilli isoapproach to the unification of gravity and electromagnetism
DOE Office of Scientific and Technical Information (OSTI.GOV)
Animalu, A.O.E.
1996-06-01
The author reviews the problem of Einstein's original proposal for the unification of gravity and electromagnetism in space-time differential geometry, along the lines of the recent contributions by A.A. Logunov, R.M. Santilli, D.F. Lopez and others. The author presents a new method of unification based on the Lie-Santilli isotopic theory, whereby the unified field tensor ĝ = (ĝ_{μν}) is constructed from the symmetric Riemannian gravitational tensor g = (g_{μν}) and the antisymmetric electromagnetic field tensor F = (F_{μν}) via an isotopic lifting g → ĝ = Fg of the type of Lax pairing, where det F ≠ 0; the unified field ĝ satisfies the Logunov-Santilli equations while g and F are treated as a Lax pair. Because of Santilli's isotopic equivalence between Minkowskian and Riemannian geometries, the author infers that in the Minkowskian limit F = f, g = η, the metric η satisfies Lax's equation of motion ∂η/∂t = fη − ηf, which ensures the conservation of the eigenvalues of g. The invariance of the electromagnetic group of transformations (F) in Minkowski space is determined by the eigenvalue equations det(F_{μν} − λη_{μν}) = 0, from which the author deduces a Lie-isotopic "extended" relativity principle. A wave equation for a spin-2 particle in the unified field is derived, and the experimental consequences of the theory are discussed.
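The eigenvalue-conservation property attributed to Lax's equation ∂η/∂t = fη − ηf can be made explicit with the standard one-line trace argument (my sketch, not from the reviewed paper):

```latex
\frac{d}{dt}\,\operatorname{tr}\eta^{k}
  = k\,\operatorname{tr}\!\left(\eta^{k-1}\,\frac{\partial\eta}{\partial t}\right)
  = k\,\operatorname{tr}\!\left(\eta^{k-1}(f\eta-\eta f)\right)
  = k\left[\operatorname{tr}(\eta^{k-1}f\eta)-\operatorname{tr}(\eta^{k}f)\right]
  = 0,
```

where the last step uses cyclicity of the trace. Since every power trace tr η^k is constant in time, the full spectrum of η is conserved.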
2008-12-01
Naval Postgraduate School, Monterey, California. Thesis: Integrating Monetary and Non-Monetary Reenlistment Incentives Utilizing the Combinatorial Retention Auction Mechanism (CRAM). Author: Brooke Zimmerman. Approved for public release; distribution is unlimited.
ERIC Educational Resources Information Center
Fuller, Amelia A.
2016-01-01
A five-week, research-based experiment suitable for second-semester introductory organic laboratory students is described. Each student designs, prepares, and analyzes a combinatorial array of six aromatic oligoamides. Molecules are prepared on solid phase via a six-step synthetic sequence, and purities and identities are determined by analysis of…
Two-dimensional combinatorial screening enables the bottom-up design of a microRNA-10b inhibitor.
Velagapudi, Sai Pradeep; Disney, Matthew D
2014-03-21
The RNA motifs that bind guanidinylated kanamycin A (G Kan A) and guanidinylated neomycin B (G Neo B) were identified via two-dimensional combinatorial screening (2DCS). The results of these studies enabled the "bottom-up" design of a small molecule inhibitor of oncogenic microRNA-10b.
ERIC Educational Resources Information Center
Prodromou, Theodosia
2012-01-01
This article seeks to address a pedagogical theory of introducing the classicist and the frequentist approach to probability, by investigating important elements in 9th grade students' learning process while working with a "TinkerPlots2" combinatorial problem. Results from this research study indicate that, after the students had seen…
An Onto-Semiotic Analysis of Combinatorial Problems and the Solving Processes by University Students
ERIC Educational Resources Information Center
Godino, Juan D.; Batanero, Carmen; Roa, Rafael
2005-01-01
In this paper we describe an ontological and semiotic model for mathematical knowledge, using elementary combinatorics as an example. We then apply this model to analyze the solving process of some combinatorial problems by students with high mathematical training, and show its utility in providing a semiotic explanation for the difficulty of…
Combinatorial synthesis of ceramic materials
Lauf, Robert J [Oak Ridge, TN; Walls, Claudia A [Oak Ridge, TN; Boatner, Lynn A [Oak Ridge, TN
2010-02-23
A combinatorial library includes a gelcast substrate defining a plurality of cavities in at least one surface thereof; and a plurality of gelcast test materials in the cavities, at least two of the test materials differing from the substrate in at least one compositional characteristic, the two test materials differing from each other in at least one compositional characteristic.
Combinatorial synthesis of ceramic materials
Lauf, Robert J.; Walls, Claudia A.; Boatner, Lynn A.
2006-11-14
A combinatorial library includes a gelcast substrate defining a plurality of cavities in at least one surface thereof; and a plurality of gelcast test materials in the cavities, at least two of the test materials differing from the substrate in at least one compositional characteristic, the two test materials differing from each other in at least one compositional characteristic.
ERIC Educational Resources Information Center
Abrahamson, Dor
2006-01-01
This snapshot introduces a computer-based representation and activity that enables students to simultaneously "see" the combinatorial space of a stochastic device (e.g., dice, spinner, coins) and its outcome distribution. The author argues that the "ambiguous" representation fosters student insight into probability. [Snapshots are subject to peer…
NASA Astrophysics Data System (ADS)
Hartmann, Alexander K.; Weigt, Martin
2005-10-01
A concise, comprehensive introduction to the topic of statistical physics of combinatorial optimization, bringing together theoretical concepts and algorithms from computer science with analytical methods from physics. The result bridges the gap between statistical physics and combinatorial optimization, investigating problems taken from theoretical computing, such as the vertex-cover problem, with the concepts and methods of theoretical physics. The authors cover rapid developments and analytical methods that are both extremely complex and spread by word-of-mouth, providing all the necessary basics in required detail. Throughout, the algorithms are shown with examples and calculations, while the proofs are given in a way suitable for graduate students, post-docs, and researchers. Ideal for newcomers to this young, multidisciplinary field.
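The vertex-cover problem named in the blurb can be made concrete; below is a minimal greedy 2-approximation sketch (my illustration of the problem class, not an algorithm taken from the book):

```python
def vertex_cover_2approx(edges):
    """Matching-based 2-approximation for minimum vertex cover:
    whenever an edge is uncovered, take both of its endpoints."""
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:
            cover.update((u, v))  # both endpoints; at least one is in any optimum
    return cover

# Small example graph: a 4-cycle with one chord.
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (1, 3)]
cover = vertex_cover_2approx(edges)
assert all(u in cover or v in cover for u, v in edges)  # every edge covered
```

Each taken edge forms a matching, and any optimal cover must contain at least one endpoint per matched edge, which gives the factor-2 guarantee.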
Combinatorial complexity of pathway analysis in metabolic networks.
Klamt, Steffen; Stelling, Jörg
2002-01-01
Elementary flux mode analysis is a promising approach for a pathway-oriented perspective of metabolic networks. However, in larger networks it is hampered by the combinatorial explosion of possible routes. In this work we give some estimations on the combinatorial complexity including theoretical upper bounds for the number of elementary flux modes in a network of a given size. In a case study, we computed the elementary modes in the central metabolism of Escherichia coli while utilizing four different substrates. Interestingly, although the number of modes occurring in this complex network can exceed half a million, it is still far below the upper bound. Hence, to a certain extent, pathway analysis of central catabolism is feasible to assess network properties such as flexibility and functionality.
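The scale of the gap between observed mode counts and a combinatorial ceiling can be illustrated with a binomial-type bound of the kind the abstract discusses (the specific form C(q, m+1) and the network sizes below are illustrative assumptions, not the paper's exact expressions or its E. coli numbers):

```python
from math import comb

# Hypothetical sketch: in a network with q reactions and m internal
# metabolites, if each elementary flux mode involves at most m + 1
# reactions, the number of distinct supports is at most C(q, m + 1).
q, m = 110, 30                 # assumed (invented) network size
upper_bound = comb(q, m + 1)   # combinatorial ceiling on the mode count
observed = 500_000             # "can exceed half a million" modes

# The observed count sits many orders of magnitude below the bound.
assert observed < upper_bound
```

This mirrors the abstract's observation: even a count exceeding half a million remains far below the theoretical upper bound.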
Building synthetic gene circuits from combinatorial libraries: screening and selection strategies.
Schaerli, Yolanda; Isalan, Mark
2013-07-01
The promise of wide-ranging biotechnology applications inspires synthetic biologists to design novel genetic circuits. However, building such circuits rationally is still not straightforward and often involves painstaking trial-and-error. Mimicking the process of natural selection can help us to bridge the gap between our incomplete understanding of nature's design rules and our desire to build functional networks. By adopting the powerful method of directed evolution, which is usually applied to protein engineering, functional networks can be obtained through screening or selecting from randomised combinatorial libraries. This review first highlights the practical options to introduce combinatorial diversity into gene circuits and then examines strategies for identifying the potentially rare library members with desired functions, either by screening or selection.
Supergravity and the Unification of the Laws of Physics
ERIC Educational Resources Information Center
Freedman, Daniel Z.; van Nieuwenhuizen, Peter
1978-01-01
In this new theory the gravitational force arises from a symmetry relating particles with vastly different properties. The ultimate result may be a unified theory of all the basic forces in nature. (Author/BB)
ERIC Educational Resources Information Center
Marshak, Marvin L.
1984-01-01
Provides the rationale for and examples of experiments designed to test the stability of protons and bound neutrons. Also considers the unification question, cosmological implications, current and future detectors, and current status of knowledge on proton decay. (JN)
Theoretical & Experimental Research in Weak, Electromagnetic & Strong Interactions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nandi, Satyanarayan; Babu, Kaladi; Rizatdinova, Flera
The conducted research spans a wide range of topics in the theoretical, experimental and phenomenological aspects of elementary particle interactions. Theory projects involve topics in both the energy frontier and the intensity frontier. The experimental research involves the energy frontier with the ATLAS Collaboration at the Large Hadron Collider (LHC). In theoretical research, novel ideas going beyond the Standard Model with strong theoretical motivations were proposed, and their experimental tests at the LHC and forthcoming neutrino facilities were outlined. These efforts fall into the following broad categories: (i) TeV scale new physics models for LHC Run 2, including left-right symmetry and trinification symmetry, (ii) unification of elementary particles and forces, including the unification of gauge and Yukawa interactions, (iii) supersymmetry and mechanisms of supersymmetry breaking, (iv) superworld without supersymmetry, (v) general models of extra dimensions, (vi) comparing signals of extra dimensions with those of supersymmetry, (vii) models with mirror quarks and mirror leptons at the TeV scale, (viii) models with singlet quarks and singlet Higgs and their implications for Higgs physics at the LHC, (ix) new models for the dark matter of the universe, (x) lepton flavor violation in Higgs decays, (xi) leptogenesis in radiative models of neutrino masses, (xii) light mediator models of non-standard neutrino interactions, (xiii) anomalous muon decay and short baseline neutrino anomalies, (xiv) baryogenesis linked to nucleon decay, and (xv) a new model for the recently observed diboson resonance at the LHC and its other phenomenological implications. The experimental High Energy Physics group has been, and continues to be, a successful and productive contributor to the ATLAS experiment at the LHC.
Members of the group performed searches for gluinos decaying to stop and top quarks, new heavy gauge bosons decaying to top and bottom quarks, and vector-like quarks produced in pairs and decaying to light quarks. Members of the OSU group played a leading role in the detailed optimization studies for the future ATLAS Inner Tracker (ITk), which will be installed during the Phase-II upgrade, replacing the current tracking system. The proposed studies aim to enhance the ATLAS discovery potential in the high-luminosity LHC era. The group members have contributed to the calibration of algorithms for identifying boosted vector bosons and b-jets, which will help expand the ATLAS reach in many searches for new physics.
Ligand design by a combinatorial approach based on modeling and experiment: application to HLA-DR4
NASA Astrophysics Data System (ADS)
Evensen, Erik; Joseph-McCarthy, Diane; Weiss, Gregory A.; Schreiber, Stuart L.; Karplus, Martin
2007-07-01
Combinatorial synthesis and large scale screening methods are being used increasingly in drug discovery, particularly for finding novel lead compounds. Although these "random" methods sample larger areas of chemical space than traditional synthetic approaches, only a relatively small percentage of all possible compounds is practically accessible. It is therefore helpful to select regions of chemical space that have greater likelihood of yielding useful leads. When three-dimensional structural data are available for the target molecule, this can be achieved by applying structure-based computational design methods to focus the combinatorial library. This is advantageous over the standard usage of computational methods to design a small number of specific novel ligands, because here computation is employed as part of the combinatorial design process and so is required only to determine a propensity for binding of certain chemical moieties in regions of the target molecule. This paper describes the application of the Multiple Copy Simultaneous Search (MCSS) method, an active site mapping and de novo structure-based design tool, to design a focused combinatorial library for the class II MHC protein HLA-DR4. Methods for synthesizing and screening the computationally designed library are presented; evidence is provided to show that binding was achieved. Although the structure of the protein-ligand complex could not be determined, experimental results include cross-exclusion of a known HLA-DR4 peptide ligand (HA) by a compound from the library, and computational model building suggests that at least one of the ligands designed and identified by the methods described binds in a mode similar to that of native peptides.
Zhou, Yikang; Li, Gang; Dong, Junkai; Xing, Xin-Hui; Dai, Junbiao; Zhang, Chong
2018-05-01
Facing a rapidly growing ability to construct combinatorial metabolic pathways, searching the metabolic sweet spot has become the rate-limiting step. We here report an efficient machine-learning workflow in conjunction with the YeastFab Assembly strategy (MiYA) for combinatorially optimizing the large biosynthetic genotypic space of heterologous metabolic pathways in Saccharomyces cerevisiae. Using the β-carotene biosynthetic pathway as an example, we first demonstrated that MiYA has the power to search only a small fraction (2-5%) of the combinatorial space to precisely tune the expression level of each gene, using a machine-learning algorithm based on an ensemble of artificial neural networks (ANNs) to avoid over-fitting when dealing with a small number of training samples. We then applied MiYA to improve the biosynthesis of violacein. Fed with initial data from a colorimetric plate-based, pre-screened pool of 24 strains producing violacein, MiYA successfully predicted, and we verified experimentally, the existence of a strain that showed a 2.42-fold titer improvement in violacein production among 3125 possible designs. Furthermore, MiYA was able to largely avoid the branch pathway of violacein biosynthesis that makes deoxyviolacein, and produces very pure violacein. Together, MiYA combines the advantages of standardized building blocks and machine learning to accelerate the Design-Build-Test-Learn (DBTL) cycle for combinatorial optimization of metabolic pathways, which could significantly accelerate the development of microbial cell factories. Copyright © 2018 International Metabolic Engineering Society. Published by Elsevier Inc. All rights reserved.
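The workflow's core loop — measure a small training fraction of a combinatorial genotype space, fit a surrogate-model ensemble, and pick the predicted best design — can be sketched in miniature. Everything below is a hedged toy: the 5-gene × 5-level space matches the abstract's 3125 designs, but the "measurement" function and the per-gene mean-effect ensemble (standing in for the paper's neural-network ensemble) are invented for illustration:

```python
import itertools
import random

random.seed(0)
LEVELS, GENES = 5, 5
space = list(itertools.product(range(LEVELS), repeat=GENES))  # 5**5 = 3125 designs

# Hypothetical ground-truth titer, standing in for a wet-lab assay.
WEIGHTS = (1.0, 0.5, 2.0, 0.2, 1.5)
def measure(design):
    return sum((level + 1) * w for level, w in zip(design, WEIGHTS))

# "Train" on ~3% of the space, as in the MiYA idea.
train = random.sample(space, 100)
data = [(d, measure(d)) for d in train]

# Toy surrogate: per-gene mean-effect models fit on bootstrap resamples
# (an ensemble, like the paper's ANNs, to smooth out small-sample noise).
def fit(sample):
    eff = [[0.0] * LEVELS for _ in range(GENES)]
    cnt = [[1e-9] * LEVELS for _ in range(GENES)]
    for d, y in sample:
        for g, level in enumerate(d):
            eff[g][level] += y
            cnt[g][level] += 1
    return [[e / c for e, c in zip(er, cr)] for er, cr in zip(eff, cnt)]

models = [fit(random.choices(data, k=len(data))) for _ in range(10)]

def predict(design):
    per_model = [sum(m[g][level] for g, level in enumerate(design)) / GENES
                 for m in models]
    return sum(per_model) / len(per_model)

# Exhaustively score the full space with the cheap surrogate, then the
# top candidate would go back to the bench for verification.
best = max(space, key=predict)
```

The design choice mirrored here is that only the surrogate, never the assay, is evaluated over all 3125 designs; the chosen `best` should score well above an average design.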
The Combinatorial Trace Method in Action
ERIC Educational Resources Information Center
Krebs, Mike; Martinez, Natalie C.
2013-01-01
On any finite graph, the number of closed walks of length k is equal to the sum of the kth powers of the eigenvalues of any adjacency matrix. This simple observation is the basis for the combinatorial trace method, wherein we attempt to count (or bound) the number of closed walks of a given length so as to obtain information about the graph's…
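The stated identity — closed walks of length k equal tr(A^k), the sum of k-th powers of the eigenvalues — is easy to check directly on a small graph. A minimal sketch (the 4-cycle is my example, not the article's; its eigenvalues are 2, 0, 0, -2, so tr(A^4) = 2**4 + (-2)**4 = 32):

```python
from itertools import product

# Adjacency matrix of the 4-cycle 0-1-2-3-0.
A = [[0, 1, 0, 1],
     [1, 0, 1, 0],
     [0, 1, 0, 1],
     [1, 0, 1, 0]]
n, k = len(A), 4

def matmul(X, Y):
    """Plain product of two n x n integer matrices."""
    return [[sum(X[i][t] * Y[t][j] for t in range(n)) for j in range(n)]
            for i in range(n)]

# Compute A^k and its trace.
Ak = A
for _ in range(k - 1):
    Ak = matmul(Ak, A)
trace = sum(Ak[i][i] for i in range(n))

# Count closed k-walks directly: tuples (v0, ..., v_{k-1}) where every
# consecutive pair, cyclically, is an edge.
walks = sum(1 for w in product(range(n), repeat=k)
            if all(A[w[i]][w[(i + 1) % k]] for i in range(k)))

assert trace == walks == 32
```

Bounding either side of this equality then yields information about the other, which is the essence of the trace method.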
Lin, En-Chiang; Cole, Jesse J; Jacobs, Heiko O
2010-11-10
This article reports and applies a recently discovered programmable multimaterial deposition process to the formation and combinatorial improvement of 3D nanostructured devices. The gas-phase deposition process produces charged <5 nm particles of silver, tungsten, and platinum and uses externally biased electrodes to control the material flux and to turn deposition ON/OFF in selected domains. Domains host nanostructured dielectrics to define arrays of electrodynamic 10 × nanolenses to further control the flux to form <100 nm resolution deposits. The unique feature of the process is that material type, amount, and sequence can be altered from one domain to the next leading to different types of nanostructures including multimaterial bridges, interconnects, or nanowire arrays with 20 nm positional accuracy. These features enable combinatorial nanostructured materials and device discovery. As a first demonstration, we produce and identify in a combinatorial way 3D nanostructured electrode designs that improve light scattering, absorption, and minority carrier extraction of bulk heterojunction photovoltaic cells. Photovoltaic cells from domains with long and dense nanowire arrays improve the relative power conversion efficiency by 47% when compared to flat domains on the same substrate.
Home - Deep Underground Neutrino Experiment
The Deep Underground Neutrino Experiment (DUNE) aims to advance our understanding of neutrinos and their role in the universe. DUNE prototype detectors are under construction.
The Tie That Binds:. A Fundamental Unit of `Change' in Space and Time
NASA Astrophysics Data System (ADS)
Beichler, James E.
2013-09-01
Why, despite all efforts to the contrary, have attempts at unification based on the supposedly more fundamental quantum theory failed miserably? The truth is that the essential idea or concept of the quantum itself has never been fully understood. What is the quantum, or rather, what is its ultimate nature? Science may be able to work adequately with the quantum; in a sense science is quite articulate in the language of the quantum, i.e., its mathematical interpretation of the quantum mechanics, but science has no idea of the true physical nature of the quantum. Scientists and philosophers have wasted energy and efforts on irrelevant issues such as the debate over determinism and indeterminism instead of carefully analyzing the physical source of the quantum. Only with a true understanding of the physical nature of the quantum will the unification of the quantum and relativity ever become a reality.
The problem of the Grand Unification Theory
NASA Astrophysics Data System (ADS)
Treder, H.-J.
The evolution and fundamental questions of physical theories unifying the gravitational, electromagnetic, and quantum-mechanical interactions are explored, taking Pauli's aphorism as a motto: 'Let no man join what God has cast asunder.' The contributions of Faraday and Riemann, Lorentz, Einstein, and others are discussed, and the criterion of Pauli is applied to Grand Unification Theories (GUT) in general and to those seeking to link gravitation and electromagnetism in particular. Formal mathematical symmetry principles must be shown to have real physical relevance by predicting measurable phenomena not explainable without a GUT; these phenomena must be macroscopic because gravitational effects are too weak to be measured on the microscopic level. It is shown that empirical and theoretical studies of 'gravomagnetism', 'gravoelectricity', or possible links between gravoelectricity and the cosmic baryon asymmetry eventually lead back to basic questions which appear philosophical or purely mathematical but actually challenge physics to seek verifiable answers.
Unification of force and substance.
Wilczek, Frank
2016-08-28
Maxwell's mature presentation of his equations emphasized the unity of electromagnetism and mechanics, subsuming both as 'dynamical systems'. That intuition of unity has proved both fruitful, as a source of pregnant concepts, and broadly inspiring. A deep aspect of Maxwell's work is its use of redundant potentials, and the associated requirement of gauge symmetry. Those concepts have become central to our present understanding of fundamental physics, but they can appear to be rather formal and esoteric. Here I discuss two things: the physical significance of gauge invariance, in broad terms; and some tantalizing prospects for further unification, building on that concept, that are visible on the horizon today. If those prospects are realized, Maxwell's vision of the unity of field and substance will be brought to a new level. This article is part of the themed issue 'Unifying physics and technology in light of Maxwell's equations'. © 2016 The Author(s).
Hyperquarks and bosonic preon bound states
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schmid, Michael L.; Buchmann, Alfons J.
2009-11-01
In a model in which leptons, quarks, and the recently introduced hyperquarks are built up from two fundamental spin-1/2 preons, the standard model weak gauge bosons emerge as preon bound states. In addition, the model predicts a host of new composite gauge bosons, in particular, those responsible for hyperquark and proton decay. Their presence entails a left-right symmetric extension of the standard model weak interactions and a scheme for a partial and grand unification of nongravitational interactions based on, respectively, the effective gauge groups SU(6)_P and SU(9)_G. This leads to a prediction of the Weinberg angle at low energies in good agreement with experiment. Furthermore, using evolution equations for the effective coupling strengths, we calculate the partial and grand unification scales, the hyperquark mass scale, as well as the mass and decay rate of the lightest hyperhadron.
SO(10) Yukawa unification: SUSY on the edge
NASA Astrophysics Data System (ADS)
Raby, Stuart
2016-06-01
In this talk we discuss SO(10) Yukawa unification and its ramifications for phenomenology. The initial constraints come from fitting the top, bottom and tau masses, requiring large tan β ~ 50 and particular values for soft SUSY breaking parameters. We perform a global χ2 analysis, fitting the recently observed `Higgs' with mass of order 125 GeV in addition to fermion masses and mixing angles and several flavor violating observables. We discuss two distinct GUT scale boundary conditions for soft SUSY breaking masses. In both cases we have a universal cubic scalar parameter, A0, non-universal Higgs masses and universal squark and slepton masses, m16. In the first case we consider universal gaugino masses, while in the latter case we have non-universal gaugino masses. We discuss the spectrum of SUSY particle masses, consequences for the LHC and the issue of fine-tuning.
Kuorikoski, Jaakko; Marchionni, Caterina
2014-12-01
We examine the diversity of strategies of modelling networks in (micro) economics and (analytical) sociology. Field-specific conceptions of what explaining (with) networks amounts to or systematic preference for certain kinds of explanatory factors are not sufficient to account for differences in modelling methodologies. We argue that network models in both sociology and economics are abstract models of network mechanisms and that differences in their modelling strategies derive to a large extent from field-specific conceptions of the way in which a good model should be a general one. Whereas the economics models aim at unification, the sociological models aim at a set of mechanism schemas that are extrapolatable to the extent that the underlying psychological mechanisms are general. These conceptions of generality induce specific biases in mechanistic explanation and are related to different views of when knowledge from different fields should be seen as relevant.
Particle physics in the very early universe
NASA Technical Reports Server (NTRS)
Schramm, D. N.
1981-01-01
Events in the very early big bang universe in which elementary particle physics effects may have been dominant are discussed, with attention to the generation of a net baryon number by way of grand unification theory, and emphasis on the possible role of massive neutrinos in increasing current understanding of various cosmological properties and of the constraints placed on neutrino properties by cosmology. It is noted that when grand unification theories are used to describe very early universe interactions, an initially baryon-symmetrical universe can evolve a net baryon excess of 10 to the -9th to 10 to the -11th per photon, given reasonable parameters. If neutrinos have mass, the bulk of the mass of the universe may be in the form of leptons, implying that the form of matter most familiar to physical science may not be the dominant form of matter in the universe.
Parsing with logical variables (logic-based programming systems)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Finin, T.W.; Stone Palmer, M.
1983-01-01
Logic-based programming systems have enjoyed an increasing popularity in applied AI work in the last few years. One of the contributions to computational linguistics made by the logic programming paradigm has been the definite clause grammar. In comparing DCGs with previous parsing mechanisms such as ATNs, certain clear advantages are seen. The authors feel that the most important of these advantages are due to the use of logical variables with unification as the fundamental operation on them. To illustrate the power of the logical variable, they have implemented an experimental ATN system which treats ATN registers as logical variables and provides a unification operation over them. They aim to simultaneously encourage the use of the powerful mechanisms available in DCGs and demonstrate that some of these techniques can be captured without reference to a resolution theorem prover. 14 references.
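The record above hinges on unification over logical variables as the fundamental parsing operation. A minimal stdlib sketch of first-order unification follows; the '?'-prefixed variable convention and the toy agreement terms are illustrative assumptions, not the paper's ATN implementation:

```python
# Minimal first-order unification sketch: variables are strings starting
# with '?', compound terms are tuples, atoms are plain strings.
def walk(term, subst):
    # Follow variable bindings until a non-variable or an unbound variable.
    while isinstance(term, str) and term.startswith('?') and term in subst:
        term = subst[term]
    return term

def unify(a, b, subst):
    """Return an extended substitution unifying a and b, or None on failure."""
    a, b = walk(a, subst), walk(b, subst)
    if a == b:
        return subst
    if isinstance(a, str) and a.startswith('?'):
        return {**subst, a: b}
    if isinstance(b, str) and b.startswith('?'):
        return {**subst, b: a}
    if isinstance(a, tuple) and isinstance(b, tuple) and len(a) == len(b):
        for x, y in zip(a, b):
            subst = unify(x, y, subst)
            if subst is None:
                return None
        return subst
    return None

# Hypothetical agreement example: a number register shared between
# constituents, treated as a logical variable rather than an ATN register.
s = unify(('np', '?Num'), ('np', 'singular'), {})
s = unify(('vp', '?Num'), ('vp', '?N2'), s)
```

The second call shows a register shared between two constituents picking up a binding transitively, which is the behavior the authors graft onto ATN registers.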
Construction of a scFv Library with Synthetic, Non-combinatorial CDR Diversity.
Bai, Xuelian; Shim, Hyunbo
2017-01-01
Many large synthetic antibody libraries have been designed and constructed, and have successfully generated high-quality antibodies suitable for various demanding applications. While synthetic antibody libraries have many advantages, such as optimized framework sequences and a broader sequence landscape than natural antibodies, their sequence diversity is typically generated by random combinatorial synthetic processes, which incorporate many undesired CDR sequences. Here, we describe the construction of a synthetic scFv library using oligonucleotide mixtures that contain predefined, non-combinatorially synthesized CDR sequences. Each CDR is first inserted into a master scFv framework sequence, and the resulting single-CDR libraries are subjected to a round of proofread panning. The proofread CDR sequences are then assembled to produce the final scFv library with six diversified CDRs.
Natural products and combinatorial chemistry: back to the future.
Ortholand, Jean-Yves; Ganesan, A
2004-06-01
The introduction of high-throughput synthesis and combinatorial chemistry has precipitated a global decline in the screening of natural products by the pharmaceutical industry. Some companies terminated their natural products program, despite the unproven success of the new technologies. This was a premature decision, as natural products have a long history of providing important medicinal agents. Furthermore, they occupy a complementary region of chemical space compared with the typical synthetic compound library. For these reasons, the interest in natural products has been rekindled. Various approaches have evolved that combine the power of natural products and organic chemistry, ranging from the combinatorial total synthesis of analogues to the exploration of natural product scaffolds and the design of completely unnatural molecules that resemble natural products in their molecular characteristics.
Two is better than one; toward a rational design of combinatorial therapy.
Chen, Sheng-Hong; Lahav, Galit
2016-12-01
Drug combination is an appealing strategy for combating the heterogeneity of tumors and the evolution of drug resistance. However, the rationale underlying combinatorial therapy is often not well established, due to a lack of understanding of the specific pathways responding to the drugs and of their temporal dynamics following each treatment. Here we present several emerging trends in harnessing properties of biological systems for the optimal design of drug combinations, including the type of drugs, specific concentrations, sequence of addition and the temporal schedule of treatments. We highlight recent studies showing different approaches for the efficient design of drug combinations, including single-cell signaling dynamics, adaptation and pathway crosstalk. Finally, we discuss novel and feasible approaches that can facilitate the optimal design of combinatorial therapy. Copyright © 2016 Elsevier Ltd. All rights reserved.
Romero, Jennifer V; Smith, Jock W H; Sullivan, Braden M; Croll, Lisa M; Dahn, J R
2012-01-09
Ternary libraries of 64 ZnO/CuO/CuCl(2) impregnated activated carbon samples were prepared on untreated or HNO(3)-treated carbon and evaluated for their SO(2) and NH(3) gas adsorption properties gravimetrically using a combinatorial method. CuCl(2) is shown to be a viable substitute for HNO(3) and some compositions of ternary ZnO/CuO/CuCl(2) impregnated carbon samples prepared on untreated carbon provided comparable SO(2) and NH(3) gas removal capacities to the materials prepared on HNO(3)-treated carbon. Through combinatorial methods, it was determined that the use of HNO(3) in this multigas adsorbent formulation can be avoided.
High-throughput screening for combinatorial thin-film library of thermoelectric materials.
Watanabe, Masaki; Kita, Takuji; Fukumura, Tomoteru; Ohtomo, Akira; Ueno, Kazunori; Kawasaki, Masashi
2008-01-01
A high-throughput method has been developed to evaluate the Seebeck coefficient and electrical resistivity of combinatorial thin-film libraries of thermoelectric materials from room temperature to 673 K. Thin-film samples several millimeters in size were deposited on an integrated Al2O3 substrate with embedded lead wires and local heaters for measurement of the thermopower under a controlled temperature gradient. An infrared camera was used for real-time observation of the temperature difference Delta T between two electrical contacts on the sample to obtain the Seebeck coefficient. The Seebeck coefficient and electrical resistivity of constantan thin films were shown to be almost identical to standard data for bulk constantan. High-throughput screening was demonstrated for a thermoelectric Mg-Si-Ge combinatorial library.
Library fingerprints: a novel approach to the screening of virtual libraries.
Klon, Anthony E; Diller, David J
2007-01-01
We propose a novel method to prioritize libraries for combinatorial synthesis and high-throughput screening that assesses the viability of a particular library on the basis of the aggregate physical-chemical properties of the compounds using a naïve Bayesian classifier. This approach prioritizes collections of related compounds according to the aggregate values of their physical-chemical parameters in contrast to single-compound screening. The method is also shown to be useful in screening existing noncombinatorial libraries when the compounds in these libraries have been previously clustered according to their molecular graphs. We show that the method used here is comparable or superior to the single-compound virtual screening of combinatorial libraries and noncombinatorial libraries and is superior to the pairwise Tanimoto similarity searching of a collection of combinatorial libraries.
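The aggregate-property scoring described above can be sketched as a naïve Bayesian log-odds sum over binned library descriptors. The feature names, training counts and smoothing below are invented for illustration and are not the authors' trained classifier:

```python
import math

# Invented feature counts: how many of 30 'good' vs 30 'bad' training
# libraries fell into the desirable bin for each aggregate property.
good = {'mw_bin_ok': 25, 'logp_bin_ok': 22}
bad  = {'mw_bin_ok': 8,  'logp_bin_ok': 15}

def library_score(features, n_good=30, n_bad=30):
    """Naive Bayes log-odds that a library is 'good': sum of independent
    per-feature log likelihood ratios, with Laplace smoothing."""
    score = 0.0
    for f in features:
        p_g = (good.get(f, 0) + 1) / (n_good + 2)
        p_b = (bad.get(f, 0) + 1) / (n_bad + 2)
        score += math.log(p_g / p_b)
    return score
```

Libraries (rather than single compounds) are then ranked by this score, mirroring the paper's shift from per-compound to per-library screening.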
Leptoquark mechanism of neutrino masses within the grand unification framework
NASA Astrophysics Data System (ADS)
Doršner, Ilja; Fajfer, Svjetlana; Košnik, Nejc
2017-06-01
We demonstrate the viability of the one-loop neutrino mass mechanism within the framework of grand unification when the loop particles comprise scalar leptoquarks (LQs) and quarks of the matching electric charge. This mechanism can be implemented in both supersymmetric and non-supersymmetric models and requires the presence of at least one LQ pair. The appropriate pairs for the neutrino mass generation via the up-type and down-type quark loops are S_3-R_2 and S_{1, 3}-\tilde{R}_2, respectively. We consider two distinct regimes for the LQ masses in our analysis. The first regime calls for very heavy LQs in the loop. It can be naturally realized with the S_{1, 3}-\tilde{R}_2 scenarios when the LQ masses are roughly between 10^{12} and 5 × 10^{13} GeV. These lower and upper bounds originate from experimental limits on partial proton decay lifetimes and perturbativity constraints, respectively. The second regime corresponds to collider-accessible LQs in the neutrino mass loop. That option is viable for the S_3-\tilde{R}_2 scenario in the models of unification that we discuss. If one furthermore assumes the presence of the type II see-saw mechanism, there is an additional contribution from the S_3-R_2 scenario that needs to be taken into account besides the type II see-saw contribution itself. We provide a complete list of renormalizable operators that yield the necessary mixing of all aforementioned LQ pairs using the language of SU(5). We furthermore discuss several possible embeddings of this mechanism in SU(5) and SO(10) gauge groups.
2009-03-01
homeport, geographic stability for two tours and compressed work week; homeport, lump sum SRB, and telecommuting ). The Monte Carlo simulation...Geographic stability 2 tours, and compressed work week). The Add 2 combination includes home port choice, lump sum SRB, and telecommuting ...VALUATION OF NON-MONETARY INCENTIVES: MOTIVATING AND IMPLEMENTING THE COMBINATORIAL RETENTION AUCTION MECHANISM by Jason Blake Ellis March 2009
2011-03-01
Carcinoma Cells and Tumor Associated Pericytes with Antibody-Based Immunotherapy and Metronomic Chemotherapy. PRINCIPAL INVESTIGATOR: Soldano...Combinatorial Targeting of Prostate Carcinoma Cells and Tumor Associated Pericytes with Antibody-Based Immunotherapy and Metronomic Chemotherapy. 5b. GRANT...SUPPLEMENTARY NOTES 14. ABSTRACT Seventy seven 10 week old TRAMP mice were enrolled in the study. Administration of metronomic chemotherapy with
Computer Description of Black Hawk Helicopter
1979-06-01
Model Combinatorial Geometry Models Black Hawk Helicopter Helicopter GIFT Computer Code Geometric Description of Targets 20. ABSTRACT...description was made using the technique of combinatorial geometry (COM-GEOM) and will be used as input to the GIFT computer code which generates ... The data used by the COVART computer code was generated by the Geometric Information for Targets (GIFT) computer code. This report documents
Designed Electroresponsive Biomaterials: Sequence-Controlled Behavior
2010-06-29
protein of the M13. Traditional phage and yeast display methodologies indicate that peptide sequences with high affinities for electrode materials...drug delivery. The original vision for this work was to employ combinatorial tools such as phage and yeast display under electrical selection pressure...
Nonlinear Multidimensional Assignment Problems Efficient Conic Optimization Methods and Applications
2015-06-24
WORK UNIT NUMBER 7. PERFORMING ORGANIZATION NAME(S) AND ADDRESS(ES) Arizona State University School of Mathematical & Statistical Sciences 901 S...SUPPLEMENTARY NOTES 14. ABSTRACT The major goals of this project were completed: the exact solution of previously unsolved challenging combinatorial optimization... combinatorial optimization problem, the Directional Sensor Problem, was solved in two ways. First, heuristically in an engineering fashion and second, exactly
TARCMO: Theory and Algorithms for Robust, Combinatorial, Multicriteria Optimization
2016-11-28
objective 9 4.6 On The Recoverable Robust Traveling Salesman Problem . . . . . 11 4.7 A Bicriteria Approach to Robust Optimization...be found. 4.6 On The Recoverable Robust Traveling Salesman Problem The traveling salesman problem (TSP) is a well-known combinatorial optimization...procedure for the robust traveling salesman problem. While this iterative algorithm results in an optimal solution to the robust TSP, computation
The fusion of MBB with VFW finally brought to completion
NASA Technical Reports Server (NTRS)
1981-01-01
Two newspaper-type articles describing the final, long-awaited unification of the two German air and space companies, MBB and VFW, are presented. Government participation in this "fusion" arrangement and the advantages expected to accrue are discussed.
76 FR 15209 - 150th Anniversary of the Unification of Italy, 2011
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-21
... millions of American women and men of Italian descent who strengthen and enrich our Nation. Italy and the..., and the universal human rights our countries both respect and uphold. As we mark this important...
Gravitational Self-Energy as the Litmus of Reality
NASA Astrophysics Data System (ADS)
Jones, K. R. W.
It is argued that the correct physical treatment of self-energy in Newtonian quantum gravity offers a constrained and predictive discriminator for the interpretation of ψ, and thus a clear point of departure for the unification of modern physics.
Gurevich-Messina, Juan M; Giudicessi, Silvana L; Martínez-Ceron, María C; Acosta, Gerardo; Erra-Balsells, Rosa; Cascone, Osvaldo; Albericio, Fernando; Camperi, Silvia A
2015-01-01
Short cyclic peptides are of great interest in therapeutic, diagnostic and affinity chromatography applications. The screening of 'one-bead-one-peptide' combinatorial libraries combined with mass spectrometry (MS) is an excellent tool to find peptides with affinity for any target protein. The fragmentation patterns of cyclic peptides are considerably more complex than those of their linear counterparts, and the elucidation of the resulting tandem mass spectra is rather more difficult. Here, we propose a simple protocol for combinatorial cyclic library synthesis and ring opening before MS analysis. In this strategy, 4-hydroxymethylbenzoic acid, which forms a benzyl ester with the first amino acid, was used as the linker. A glycolamidic ester group was incorporated after the combinatorial positions by adding glycolic acid. The library synthesis protocol consisted of the following: (i) incorporation of Fmoc-Asp[2-phenylisopropyl (OPp)]-OH to Ala-Gly-oxymethylbenzamide-ChemMatrix, (ii) synthesis of the combinatorial library, (iii) assembly of a glycolic acid, (iv) coupling of an Ala residue at the N-terminus, (v) removal of OPp, (vi) peptide cyclisation through the side chain of Asp and the N-terminal Ala amino group and (vii) removal of side chain protecting groups. In order to simultaneously open the ring and release each peptide, the benzyl and glycolamidic esters were cleaved with ammonia. Peptide sequences could be deduced from the tandem mass spectra of each single bead evaluated. The strategy proposed here is suitable for the preparation of one-bead-one-cyclic depsipeptide libraries that can be easily opened for sequencing by matrix-assisted laser desorption/ionisation MS. It employs techniques and reagents frequently used in a broad range of laboratories without special expertise in organic synthesis. Copyright © 2014 European Peptide Society and John Wiley & Sons, Ltd.
"One-sample concept" micro-combinatory for high throughput TEM of binary films.
Sáfrán, György
2018-04-01
Phases of thin films may differ remarkably from those of the bulk. Unlike the comprehensive data files of Binary Phase Diagrams [1] available for bulk materials, complete phase maps for thin binary layers do not exist. This is due both to the diverse metastable, non-equilibrium or unstable phases feasible in thin films and to the volume of characterization work required with analytical techniques such as TEM, SAED and EDS. The aim of the present work was to develop a method that remarkably facilitates the TEM study of the diverse binary phases of thin films, and the creation of phase maps. A micro-combinatorial method was worked out that enables both preparation and study of a gradient two-component film within a single TEM specimen. As a demonstration of the technique, thin MnxAl1-x binary samples with concentration evolving from x = 0 to x = 1 were prepared so that the transition from pure Mn to pure Al covers a 1.5 mm long track within the 3 mm diameter TEM grid. The proposed method enables the preparation and study of thin combinatorial samples including all feasible phases as a function of composition or other deposition parameters. Contrary to known "combinatorial chemistry", in which a series of different samples is deposited in one run and investigated one at a time, the present micro-combinatorial method produces a single specimen condensing a complete library of a binary system that can be studied efficiently within a single TEM session. This provides extremely high throughput for TEM characterization of composition-dependent phases, exploration of new materials, and the construction of phase diagrams of binary films. Copyright © 2018 Elsevier B.V. All rights reserved.
Anitha, A; Deepa, N; Chennazhi, K P; Lakshmanan, Vinoth-Kumar; Jayakumar, R
2014-09-01
Evaluation of the combinatorial anticancer effects of curcumin/5-fluorouracil loaded thiolated chitosan nanoparticles (CRC-TCS-NPs/5-FU-TCS-NPs) on colon cancer cells, and analysis of the pharmacokinetics and biodistribution of CRC-TCS-NPs/5-FU-TCS-NPs in a mouse model. CRC-TCS-NPs/5-FU-TCS-NPs were developed by ionic cross-linking. The in vitro combinatorial anticancer effect of the nanomedicine was proven by different assays. Further, the pharmacokinetics and biodistribution analyses were performed in Swiss Albino mice using HPLC. The 5-FU-TCS-NPs (size: 150±40 nm, zeta potential: +48.2±5 mV) and CRC-TCS-NPs (size: 150±20 nm, zeta potential: +35.7±3 mV) were proven to be compatible with blood. The in vitro drug release studies at pH 4.5 and 7.4 showed a sustained release profile over a period of 4 days, where both systems exhibited a higher release at acidic pH. The in vitro combinatorial anticancer effects in colon cancer (HT29) cells using MTT, live/dead, mitochondrial membrane potential and cell cycle analysis measurements confirmed the enhanced anticancer effects (2.5- to 3-fold). The pharmacokinetic studies confirmed the improved plasma concentrations of 5-FU and CRC up to 72 h, unlike bare CRC and 5-FU. To conclude, the combination of 5-FU-TCS-NPs and CRC-TCS-NPs showed enhanced anticancer effects on colon cancer cells in vitro and improved the bioavailability of the drugs in vivo. The enhanced anticancer effects of combinatorial nanomedicine are advantageous in terms of reducing the dosage of 5-FU, thereby improving the chemotherapeutic efficacy and patient compliance in colorectal cancer cases. Copyright © 2014 Elsevier B.V. All rights reserved.
Sahib, Mouayad A.; Gambardella, Luca M.; Afzal, Wasif; Zamli, Kamal Z.
2016-01-01
Combinatorial test design is a test-planning technique that systematically reduces the number of test cases by choosing a subset of the test cases based on combinations of input variables. The subset covers all possible combinations of a given strength and hence tries to match the effectiveness of the exhaustive set. This reduction mechanism has been used successfully in software testing research as t-way testing (where t indicates the interaction strength of combinations). Other systems may exhibit many similarities with this approach, so it could form an emerging application in different areas of research due to its usefulness. To this end, it has more recently been applied successfully in a few research areas. In this paper, we explore the applicability of the combinatorial test design technique to Fractional Order (FO), Proportional-Integral-Derivative (PID) controller parameter design, named FOPID, for an automatic voltage regulator (AVR) system. Throughout the paper, we justify this new application theoretically and practically through simulations. In addition, we report on first experiments indicating its practical use in this field. We design different algorithms and adapt other strategies to cover all the combinations with an optimal and effective test set. Our findings indicate that combinatorial test design can find the combinations that lead to an optimum design. We also found that by increasing the strength of combination we can approach the optimum design, such that with only a 4-way combinatorial set we can obtain the effectiveness of an exhaustive test set. This significantly reduces the number of tests needed and thus leads to an approach that optimizes parameter design quickly. PMID:27829025
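A greedy 2-way (pairwise) covering suite, the t = 2 case of the t-way testing described above, can be sketched in a few lines; the brute-force candidate scan below is only workable for toy parameter spaces like this hypothetical one:

```python
from itertools import combinations, product

def pairwise_suite(params):
    """Greedy 2-way covering suite: every value pair of every two
    parameters appears in at least one test. `params` is a list of
    value lists, one per input variable."""
    # All (param_i, value_i, param_j, value_j) pairs still to cover.
    uncovered = {(i, vi, j, vj)
                 for i, j in combinations(range(len(params)), 2)
                 for vi in params[i] for vj in params[j]}
    suite = []
    while uncovered:
        # Pick the candidate test covering the most uncovered pairs
        # (exhaustive scan; fine only for small parameter spaces).
        best, best_covered = None, set()
        for test in product(*params):
            covered = {(i, test[i], j, test[j])
                       for i, j in combinations(range(len(params)), 2)} & uncovered
            if len(covered) > len(best_covered):
                best, best_covered = test, covered
        suite.append(best)
        uncovered -= best_covered
    return suite

# Three two-valued parameters: exhaustive testing needs 8 tests.
tests = pairwise_suite([[0, 1], ['a', 'b'], ['x', 'y']])
```

For this example the greedy suite covers all pairs with 4 tests instead of the 8 exhaustive ones, and the saving grows quickly with more parameters.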
Bagheri, Neda; Shiina, Marisa; Lauffenburger, Douglas A; Korn, W Michael
2011-02-01
Oncolytic adenoviruses, such as ONYX-015, have been tested in clinical trials for currently untreatable tumors, but have yet to demonstrate adequate therapeutic efficacy. The extent to which viruses infect targeted cells determines the efficacy of this approach but many tumors down-regulate the Coxsackievirus and Adenovirus Receptor (CAR), rendering them less susceptible to infection. Disrupting MAPK pathway signaling by pharmacological inhibition of MEK up-regulates CAR expression, offering possible enhanced adenovirus infection. MEK inhibition, however, interferes with adenovirus replication due to resulting G1-phase cell cycle arrest. Therefore, enhanced efficacy will depend on treatment protocols that productively balance these competing effects. Predictive understanding of how to attain and enhance therapeutic efficacy of combinatorial treatment is difficult since the effects of MEK inhibitors, in conjunction with adenovirus/cell interactions, are complex nonlinear dynamic processes. We investigated combinatorial treatment strategies using a mathematical model that predicts the impact of MEK inhibition on tumor cell proliferation, ONYX-015 infection, and oncolysis. Specifically, we fit a nonlinear differential equation system to dedicated experimental data and analyzed the resulting simulations for favorable treatment strategies. Simulations predicted enhanced combinatorial therapy when both treatments were applied simultaneously; we successfully validated these predictions in an ensuing explicit test study. Further analysis revealed that a CAR-independent mechanism may be responsible for amplified virus production and cell death. We conclude that integrated computational and experimental analysis of combinatorial therapy provides a useful means to identify treatment/infection protocols that yield clinically significant oncolysis. 
Enhanced oncolytic therapy has the potential to dramatically improve non-surgical cancer treatment, especially in locally advanced or metastatic cases where treatment options remain limited.
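For a flavor of the kind of nonlinear ODE system such studies fit, here is a deliberately toy Euler-integrated infection model; the equations, parameter values and initial conditions are hypothetical stand-ins, not the authors' fitted model of MEK inhibition and ONYX-015 dynamics:

```python
# Toy oncolytic-virus dynamics, integrated with forward Euler (stdlib only).
# U: uninfected tumor cells, I: infected cells, V: free virus.
def simulate(days=10.0, dt=0.01, r=0.5, K=1e6, beta=1e-7,
             delta=0.8, p=50.0, c=2.0):
    U, I, V = 1e5, 0.0, 1e3  # hypothetical initial state
    for _ in range(int(days / dt)):
        dU = r * U * (1 - (U + I) / K) - beta * U * V  # logistic growth minus infection
        dI = beta * U * V - delta * I                  # infected cells lyse at rate delta
        dV = p * delta * I - c * V                     # burst size p, clearance c
        U, I, V = U + dU * dt, I + dI * dt, V + dV * dt
    return U, I, V

U, I, V = simulate()
```

Fitting such a system to data and simulating alternative dosing schedules is, in outline, how the combinatorial-treatment predictions described above are generated.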
Villagra, David; Goethe, John; Schwartz, Harold I; Szarek, Bonnie; Kocherla, Mohan; Gorowski, Krystyna; Windemuth, Andreas; Ruaño, Gualberto
2011-01-01
Aims: We aim to demonstrate the clinical relevance and utility of four novel drug-metabolism indices derived from a combinatory (multigene) approach to CYP2C9, CYP2C19 and CYP2D6 allele scoring. Each index considers all three genes as complementary components of a liver enzyme drug metabolism system and uniquely benchmarks innate hepatic drug metabolism reserve or alteration through CYP450 combinatory genotype scores. Methods: A total of 1199 psychiatric referrals were genotyped for polymorphisms in the CYP2C9, CYP2C19 and CYP2D6 gene loci and were scored on each of the four indices. The data were used to create distributions and rankings of innate drug metabolism capacity to which individuals can be compared. Drug-specific indices are a combination of the drug metabolism indices with substrate-specific coefficients. Results: The combinatory drug metabolism indices proved useful in positioning individuals relative to a population with regard to innate drug metabolism capacity prior to pharmacotherapy. Drug-specific indices generate pharmacogenetic guidance of immediate clinical relevance, and can be further modified to incorporate covariates in particular clinical cases. Conclusions: We believe that this combinatory approach represents an improvement over the current gene-by-gene reporting by providing greater scope while still allowing for the resolution of a single-gene index when needed. This method will result in novel clinical and research applications, facilitating the translation from pharmacogenomics to personalized medicine, particularly in psychiatry where many drugs are metabolized or activated by multiple CYP450 isoenzymes. PMID:21861665
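The drug-specific indices described above (combinatory gene scores weighted by substrate-specific coefficients) reduce, in outline, to a weighted sum. All numbers below are fabricated for illustration and carry no clinical meaning:

```python
# Fabricated example values: neither validated allele activity scores
# nor real clearance fractions for any drug.
activity = {'CYP2C9': 1.0, 'CYP2C19': 0.5, 'CYP2D6': 2.0}  # per-gene genotype score
weights  = {'CYP2C9': 0.2, 'CYP2C19': 0.1, 'CYP2D6': 0.7}  # drug-specific coefficients

def drug_specific_index(activity, weights):
    """Combine the three gene scores into one drug-specific metabolism index."""
    return sum(weights[gene] * activity[gene] for gene in weights)

index = drug_specific_index(activity, weights)
```

An individual's index can then be placed within the population distribution of the same index, which is the ranking step the abstract describes.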
Zhao, Zheng; Bai, Jing; Wu, Aiwei; Wang, Yuan; Zhang, Jinwen; Wang, Zishan; Li, Yongsheng; Xu, Juan; Li, Xia
2015-01-01
Long non-coding RNAs (lncRNAs) are emerging as key regulators of diverse biological processes and diseases. However, the combinatorial effects of these molecules in a specific biological function are poorly understood. Identifying co-expressed protein-coding genes of lncRNAs would provide ample insight into lncRNA functions. To facilitate such an effort, we have developed Co-LncRNA, which is a web-based computational tool that allows users to identify GO annotations and KEGG pathways that may be affected by co-expressed protein-coding genes of a single or multiple lncRNAs. LncRNA co-expressed protein-coding genes were first identified in publicly available human RNA-Seq datasets, including 241 datasets across 6560 total individuals representing 28 tissue types/cell lines. Then, the lncRNA combinatorial effects on given GO annotations or KEGG pathways are taken into account through the simultaneous analysis of multiple lncRNAs in user-selected individual or multiple datasets, which is realized by enrichment analysis. In addition, this software provides a graphical overview of pathways that are modulated by lncRNAs, as well as a specific tool to display the relevant networks between lncRNAs and their co-expressed protein-coding genes. Co-LncRNA also supports users in uploading their own lncRNA and protein-coding gene expression profiles to investigate the lncRNA combinatorial effects. It will be continuously updated with more human RNA-Seq datasets on an annual basis. Taken together, Co-LncRNA provides a web-based application for investigating lncRNA combinatorial effects, which could shed light on their biological roles and could be a valuable resource for this community. Database URL: http://www.bio-bigdata.com/Co-LncRNA/ PMID:26363020
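The enrichment analysis such tools perform on co-expressed gene sets is conventionally a hypergeometric test; a stdlib sketch follows (the gene counts in the example call are hypothetical, and the abstract does not state which test Co-LncRNA uses):

```python
from math import comb

def enrichment_pvalue(N, K, n, k):
    """Hypergeometric upper tail P(X >= k): the chance of seeing at least
    k pathway genes among n co-expressed genes drawn from N background
    genes, of which K belong to the pathway."""
    total = comb(N, n)
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / total

# Hypothetical numbers: 20000 background genes, a 150-gene pathway,
# 300 co-expressed genes of which 12 fall in the pathway.
p = enrichment_pvalue(20000, 150, 300, 12)
```

A small p suggests the lncRNA's co-expressed genes land in the pathway more often than chance sampling would explain.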
Hegde, Mahesh; Mantelingu, Kempegowda; Pandey, Monica; Pavankumar, Chottanahalli S; Rangappa, Kanchugarakoppal S; Raghavan, Sathees C
2016-10-01
Cancer is a multifactorial disease, which makes it difficult to cure. Since more than one defective cellular component is often involved during oncogenesis, combination therapy is gaining prominence in the field of cancer therapeutics. The purpose of this study was to investigate the combinatorial effects of a novel PARP inhibitor, P10, and an HDAC inhibitor, SAHA, in leukemic cells. Combinatorial effects of P10 and SAHA were tested using propidium iodide staining in different leukemic cells. Further, flow cytometry-based assays such as calcein-AM/ethidium homodimer staining, annexin-FITC/PI staining, and JC-1 staining were carried out to elucidate the mechanism of cell death. In addition, cell-cycle analysis, immunocytochemistry studies, and western blotting analysis were conducted to check the combinatorial effect in Nalm6 cells. Propidium iodide staining showed that P10 in combination with SAHA induced cell death in Nalm6 cells, in which PARP expression and activity are high, with a combination index of <0.2. Annexin-FITC/PI staining, JC-1 staining, and other biochemical assays revealed that P10 in combination with SAHA induced apoptosis by causing a change in mitochondrial membrane potential in >65 % of cells. Importantly, combinatorial treatment induced S phase arrest in 40-45 % of cells due to DNA damage and plausible replicative stress. Finally, we demonstrated that treatment with P10 led to DNA strand breaks, which were further potentiated by SAHA (p < 0.01), leading to activation of apoptosis and increased cell death in PARP-positive leukemic cells. Our study reveals that coadministration of a PARP inhibitor with SAHA could be used as a combination therapy against leukemic cells that possess high levels of intrinsic PARP activity.
Luo, Li; Luo, Le; Zhang, Xinli; He, Xiaoli
2017-07-10
Accurate forecasting of hospital outpatient visits is beneficial for the reasonable planning and allocation of healthcare resources to meet medical demand. Given the multiple attributes of daily outpatient visits, such as randomness, cyclicity and trend, time series methods such as ARIMA can be a good choice for outpatient visit forecasting. On the other hand, hospital outpatient visits are also affected by the doctors' scheduling, and these effects are not purely random. To account for this non-random component, this paper presents a new forecasting model that takes cyclicity and the day-of-the-week effect into consideration. We formulate a seasonal ARIMA (SARIMA) model on the daily time series and a single exponential smoothing (SES) model on the day-of-the-week time series, and finally establish a combinatorial model by modifying them. The models are applied to 1 year of daily visit data for urban outpatients in two internal medicine departments of a large hospital in Chengdu, to forecast the daily outpatient visits about 1 week ahead. The proposed model is applied to forecast the cross-sectional data for 7 consecutive days of daily outpatient visits over an 8-week period, based on 43 weeks of observation data during 1 year. The results show that the two single traditional models and the combinatorial model are simple to implement and computationally light, whilst being appropriate for short-term forecast horizons. Furthermore, the combinatorial model captures the comprehensive features of the time series data better. The combinatorial model achieves better prediction performance than either single model, with lower residual variance and a small mean residual error, and will be optimized further in the next research step.
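The day-of-the-week component can be sketched in stdlib Python as SES applied separately to each weekday subseries; this is a stand-in for the paper's full SARIMA plus SES combination, and the smoothing constant is an arbitrary illustrative choice:

```python
# Single exponential smoothing (SES) per day-of-week subseries.
def ses(series, alpha=0.3):
    """Return the smoothed level after the last observation."""
    level = series[0]
    for x in series[1:]:
        level = alpha * x + (1 - alpha) * level
    return level

def forecast_next_week(daily_visits):
    """daily_visits: daily counts, oldest first, length a multiple of 7.
    Returns one SES forecast per day of the coming week."""
    assert len(daily_visits) % 7 == 0
    return [ses(daily_visits[d::7]) for d in range(7)]
```

In the paper's scheme the daily SARIMA forecast would then be adjusted by this weekday signal; here the two pieces are kept separate for clarity.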
Discovery of Cationic Polymers for Non-viral Gene Delivery using Combinatorial Approaches
Barua, Sutapa; Ramos, James; Potta, Thrimoorthy; Taylor, David; Huang, Huang-Chiao; Montanez, Gabriela; Rege, Kaushal
2015-01-01
Gene therapy is an attractive treatment option for diseases of genetic origin, including several cancers and cardiovascular diseases. While viruses are effective vectors for delivering exogenous genes to cells, concerns related to insertional mutagenesis, immunogenicity, lack of tropism, decay and high production costs necessitate the discovery of non-viral methods. Significant efforts have been focused on cationic polymers as non-viral alternatives for gene delivery. Recent studies have employed combinatorial syntheses and parallel screening methods for enhancing the efficacy of gene delivery, biocompatibility of the delivery vehicle, and overcoming cellular level barriers as they relate to polymer-mediated transgene uptake, transport, transcription, and expression. This review summarizes and discusses recent advances in combinatorial syntheses and parallel screening of cationic polymer libraries for the discovery of efficient and safe gene delivery systems. PMID:21843141
Combinatorial Optimization in Project Selection Using Genetic Algorithm
NASA Astrophysics Data System (ADS)
Dewi, Sari; Sawaluddin
2018-01-01
This paper discusses the project selection problem in the presence of two objective functions, maximizing profit and minimizing cost, under limited resource availability and available time, so that resources must be allocated to each project. These resources are human resources, machine resources, and raw material resources, and the allocation must not exceed the predetermined budget. The problem can therefore be formulated mathematically as a multi-objective function with constraints to be satisfied. To assist the project selection process, a multi-objective combinatorial optimization approach is used to obtain an optimal solution for selecting the right projects. A multi-objective genetic algorithm is then described as one such combinatorial optimization method, to simplify the project selection process in a large scope.
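The selection scheme above can be sketched as a small genetic algorithm. This is an illustrative sketch only: the project data, budget, and weighted-sum scalarization of the two objectives (profit maximized, cost minimized) are hypothetical stand-ins for the paper's multi-objective formulation.

```python
import random

random.seed(0)

# Hypothetical project data: (profit, cost); the budget caps total cost.
PROJECTS = [(90, 40), (65, 30), (40, 15), (70, 45), (30, 10), (55, 25)]
BUDGET = 100

def fitness(bits, w=0.7):
    """Weighted-sum scalarization: maximize profit, minimize cost.
    Selections that exceed the budget are infeasible."""
    profit = sum(p for b, (p, c) in zip(bits, PROJECTS) if b)
    cost = sum(c for b, (p, c) in zip(bits, PROJECTS) if b)
    if cost > BUDGET:
        return float("-inf")
    return w * profit - (1 - w) * cost

def evolve(pop_size=30, generations=60, mut=0.1):
    n = len(PROJECTS)
    # Seed the population with the empty (always feasible) selection.
    pop = [[0] * n] + [[random.randint(0, 1) for _ in range(n)]
                       for _ in range(pop_size - 1)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]          # elitism keeps the best found
        children = []
        while len(children) < pop_size - len(elite):
            a, b = random.sample(elite, 2)
            cut = random.randrange(1, n)      # one-point crossover
            child = a[:cut] + b[cut:]
            child = [1 - g if random.random() < mut else g for g in child]
            children.append(child)
        pop = elite + children
    return max(pop, key=fitness)

best = evolve()
print(best, fitness(best))
```

A true multi-objective GA (e.g., Pareto ranking) would keep a front of non-dominated selections instead of collapsing the objectives into one weighted score.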
Programming gene expression with combinatorial promoters
Cox, Robert Sidney; Surette, Michael G; Elowitz, Michael B
2007-01-01
Promoters control the expression of genes in response to one or more transcription factors (TFs). The architecture of a promoter is the arrangement and type of binding sites within it. To understand natural genetic circuits and to design promoters for synthetic biology, it is essential to understand the relationship between promoter function and architecture. We constructed a combinatorial library of random promoter architectures. We characterized 288 promoters in Escherichia coli, each containing up to three inputs from four different TFs. The library design allowed for multiple −10 and −35 boxes, and we observed varied promoter strength over five decades. To further analyze the functional repertoire, we defined a representation of promoter function in terms of regulatory range, logic type, and symmetry. Using these results, we identified heuristic rules for programming gene expression with combinatorial promoters. PMID:18004278
Analytical validation of a psychiatric pharmacogenomic test.
Jablonski, Michael R; King, Nina; Wang, Yongbao; Winner, Joel G; Watterson, Lucas R; Gunselman, Sandra; Dechairo, Bryan M
2018-05-01
The aim of this study was to validate the analytical performance of a combinatorial pharmacogenomics test designed to aid in appropriate medication selection for neuropsychiatric conditions. Genomic DNA was isolated from buccal swabs. Twelve genes (65 variants/alleles) associated with psychotropic medication metabolism, side effects, and mechanisms of action were evaluated by bead array, MALDI-TOF mass spectrometry, and/or capillary electrophoresis methods (GeneSight Psychotropic, Assurex Health, Inc.). The combinatorial pharmacogenomics test has a dynamic range of 2.5-20 ng/μl of input genomic DNA, with comparable performance for all assays included in the test. Both the precision and accuracy of the test were >99.9%, with individual gene components between 99.4 and 100%. This study demonstrates that the combinatorial pharmacogenomics test is robust and reproducible, making it suitable for clinical use.
Combinatorial Histone Acetylation Patterns Are Generated by Motif-Specific Reactions.
Blasi, Thomas; Feller, Christian; Feigelman, Justin; Hasenauer, Jan; Imhof, Axel; Theis, Fabian J; Becker, Peter B; Marr, Carsten
2016-01-27
Post-translational modifications (PTMs) are pivotal to cellular information processing, but how combinatorial PTM patterns ("motifs") are set remains elusive. We develop a computational framework, which we provide as open source code, to investigate the design principles generating the combinatorial acetylation patterns on histone H4 in Drosophila melanogaster. We find that models assuming purely unspecific or lysine site-specific acetylation rates were insufficient to explain the experimentally determined motif abundances. Rather, these abundances were best described by an ensemble of models with acetylation rates that were specific to motifs. The model ensemble converged upon four acetylation pathways; we validated three of these using independent data from a systematic enzyme depletion study. Our findings suggest that histone acetylation patterns originate through specific pathways involving motif-specific acetylation activity. Copyright © 2016 Elsevier Inc. All rights reserved.
Simulating the component counts of combinatorial structures.
Arratia, Richard; Barbour, A D; Ewens, W J; Tavaré, Simon
2018-02-09
This article describes and compares methods for simulating the component counts of random logarithmic combinatorial structures such as permutations and mappings. We exploit the Feller coupling for simulating permutations to provide a very fast method for simulating logarithmic assemblies more generally. For logarithmic multisets and selections, this approach is replaced by an acceptance/rejection method based on a particular conditioning relationship that represents the distribution of the combinatorial structure as that of independent random variables conditioned on a weighted sum. We show how to improve its acceptance rate. We illustrate the method by estimating the probability that a random mapping has no repeated component sizes, and establish the asymptotic distribution of the difference between the number of components and the number of distinct component sizes for a very general class of logarithmic structures. Copyright © 2018. Published by Elsevier Inc.
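The Feller coupling for permutations mentioned above admits a very short implementation. The sketch below is one standard reading of the construction (indexing conventions vary), not the authors' code: independent Bernoulli indicators with success probability 1/i are generated for i = 1..n, a final 1 is appended, and the spacings between successive 1s give the cycle counts of a uniform random permutation.

```python
import random

def feller_cycle_counts(n, rng=random):
    """Simulate the cycle counts (C_1, ..., C_n) of a uniform random
    permutation of n elements via the Feller coupling."""
    # Independent indicators with P(xi_i = 1) = 1/i, i = 1..n.
    xi = [1 if rng.random() < 1.0 / i else 0 for i in range(1, n + 1)]
    xi.append(1)                    # sentinel 1 closes the last spacing
    counts = [0] * (n + 1)          # counts[j] = number of cycles of length j
    gap = 0
    for bit in xi[1:]:              # xi[0] is always 1 (probability 1/1)
        gap += 1
        if bit == 1:
            counts[gap] += 1
            gap = 0
    return counts[1:]

print(feller_cycle_counts(10, random.Random(0)))
```

Because the spacings partition the n positions, the simulated counts always satisfy the permutation constraint that cycle lengths sum to n, which makes the method both exact and fast.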
Single cell systems biology by super-resolution imaging and combinatorial labeling
Lubeck, Eric; Cai, Long
2012-01-01
Fluorescence microscopy is a powerful quantitative tool for exploring regulatory networks in single cells. However, the number of molecular species that can be measured simultaneously is limited by the spectral separability of fluorophores. Here we demonstrate a simple but general strategy to drastically increase the capacity for multiplex detection of molecules in single cells by using optical super-resolution microscopy (SRM) and combinatorial labeling. As a proof of principle, we labeled mRNAs with unique combinations of fluorophores using Fluorescence in situ Hybridization (FISH), and resolved the sequences and combinations of fluorophores with SRM. We measured the mRNA levels of 32 genes simultaneously in single S. cerevisiae cells. These experiments demonstrate that combinatorial labeling and super-resolution imaging of single cells provides a natural approach to bring systems biology into single cells. PMID:22660740
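The multiplexing arithmetic behind combinatorial labeling is simple: with m spectrally distinct fluorophores and barcodes that use k of them, the number of distinguishable unordered codes is C(m, k), and the space grows further if fluorophore positions along a transcript are resolvable. The numbers below are illustrative, not the exact labeling scheme of the study.

```python
from math import comb, perm

m = 4   # spectrally distinct fluorophores (illustrative)
k = 3   # fluorophores per barcode

print(comb(m, k))   # unordered combinations
print(perm(m, k))   # ordered sequences without repetition
print(m ** k)       # ordered sequences if repetition is allowed
```

Super-resolution readout effectively makes positions resolvable, which is why the ordered counts, rather than the bare combinations, set the practical multiplexing capacity.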
Maritime Security and the Strait of Malacca: A Strategic Analysis
2006-06-16
Combating Terrorism; MAA, Monitoring and Action Agencies; MCA, Malaysian Chinese Association; MFG, Manila Framework Group; MIC, Malaysian Indian...of racial tensions that inhibited successful unification. Malaysian Prime Minister Abdul Rahman and his next three successors were all politically...
Taoistic Psychology of Creativity.
ERIC Educational Resources Information Center
Kuo, You-Yuh
1996-01-01
This article reinterprets the philosophy of Taoism and applies it to creativity. Taoistic cognition is described as intuition or personal knowledge. Taoistic creativity is explained as involving incubation, synectic thinking, and unification through opposites. Dialectical thinking, Taoistic meditation and intuition, and symbolic thinking are…
Obfuscation Framework Based on Functionally Equivalent Combinatorial Logic Families
2008-03-01
of Defense, or the United States Government. AFIT/GCS/ENG/08-12 Obfuscation Framework Based on Functionally Equivalent Combinatorial Logic Families...time, United States policy strongly encourages the sale and transfer of some military equipment to foreign governments and makes it easier for...Proceedings of the International Conference on Availability, Reliability and Security, 2007. 14. McDonald, J. Todd and Alec Yasinsac. "Of unicorns and random
Combinatorial study of degree assortativity in networks.
Estrada, Ernesto
2011-10-01
Why are some networks degree-degree correlated (assortative), while most of the real-world ones are anticorrelated (disassortative)? Here, we prove, by combinatorial methods, that the assortativity of a network depends only on three structural factors: transitivity (clustering coefficient), intermodular connectivity, and branching. Then, a network is assortative if the contributions of the first two factors are larger than that of the third. Highly branched networks are likely to be disassortative.
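Degree assortativity, the quantity being decomposed here, is the Pearson correlation of the degrees at the two endpoints of an edge. The minimal sketch below computes it from an edge list; it illustrates the definition only, not the paper's combinatorial decomposition into transitivity, intermodular connectivity, and branching.

```python
def assortativity(edges):
    """Degree assortativity: Pearson correlation of endpoint degrees,
    with each undirected edge counted in both directions."""
    deg = {}
    for u, v in edges:
        deg[u] = deg.get(u, 0) + 1
        deg[v] = deg.get(v, 0) + 1
    xs, ys = [], []
    for u, v in edges:
        xs += [deg[u], deg[v]]
        ys += [deg[v], deg[u]]
    m = len(xs)
    mx = sum(xs) / m
    cov = sum((x - mx) * (y - mx) for x, y in zip(xs, ys)) / m
    var = sum((x - mx) ** 2 for x in xs) / m   # same marginal for xs and ys
    return cov / var

# A star is maximally disassortative: the hub attaches only to leaves.
star = [(0, i) for i in range(1, 6)]
print(assortativity(star))   # -1.0
```

The star's r = -1 matches the paper's intuition that highly branched networks are likely to be disassortative.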
Potyrailo, Radislav A; Chisholm, Bret J; Morris, William G; Cawse, James N; Flanagan, William P; Hassib, Lamyaa; Molaison, Chris A; Ezbiansky, Karin; Medford, George; Reitz, Hariklia
2003-01-01
Coupling combinatorial chemistry methods with high-throughput (HT) performance testing and measurements of the resulting properties has provided a powerful set of tools for the 10-fold accelerated discovery of new high-performance coating materials for automotive applications. Our approach replaces labor-intensive steps with automated systems for evaluating the adhesion of 8 x 6 arrays of coating elements discretely deposited on a single 9 x 12 cm plastic substrate. Performance of coatings is evaluated with respect to their resistance to adhesion loss, because this parameter is one of the primary considerations in end-use automotive applications. Our HT adhesion evaluation provides previously unavailable capabilities: high speed and reproducibility of testing through robotic automation, an expanded range of testable coating types through a coating tagging strategy, and improved quantitation through high signal-to-noise automatic imaging. Upon testing, the coatings undergo changes that are impossible to predict quantitatively using existing knowledge. Using our HT methodology, we have developed several coating leads. The HT screening results for the best coating compositions have been validated on the traditional scales of coating formulation and adhesion loss testing, confirming the superb performance of the combinatorially developed coatings over conventional coatings on the traditional scale.
Ma, Zhanjun
2017-01-01
Poor viability of engrafted bone marrow mesenchymal stem cells (BMSCs) often hinders their application for wound healing, and how to take full advantage of their angiogenic capacity within wounds remains unclear. Negative pressure wound therapy (NPWT) has been demonstrated to be effective for enhancing wound healing, especially for the promotion of angiogenesis within wounds. Here we used a combinatorial strategy, transplantation of BMSCs together with NPWT, to investigate whether this combined therapy could accelerate angiogenesis in wounds. In vitro, after 9 days of culture, BMSC proliferation significantly increased in the NPWT group. Furthermore, NPWT induced their differentiation into angiogenesis-related cells, which are indispensable for wound angiogenesis. In vivo, rat full-thickness cutaneous wounds treated with BMSCs combined with NPWT exhibited better cell viability and enhanced angiogenesis and maturation of functional blood vessels compared with local BMSC injection or NPWT alone. Expression of angiogenesis markers (NG2, VEGF, CD31, and α-SMA) was upregulated in wounds treated with BMSCs combined with NPWT. Our data suggest that NPWT may play an inductive role in enhancing the angiogenic capacity of BMSCs, and this combinatorial therapy may serve as a simple but efficient clinical solution for complex wounds with large defects. PMID:28243602
Damer, Bruce; Deamer, David
2015-01-01
Hydrothermal fields on the prebiotic Earth are candidate environments for biogenesis. We propose a model in which molecular systems driven by cycles of hydration and dehydration in such sites undergo chemical evolution in dehydrated films on mineral surfaces followed by encapsulation and combinatorial selection in a hydrated bulk phase. The dehydrated phase can consist of concentrated eutectic mixtures or multilamellar liquid crystalline matrices. Both conditions organize and concentrate potential monomers and thereby promote polymerization reactions that are driven by reduced water activity in the dehydrated phase. In the case of multilamellar lipid matrices, polymers that have been synthesized are captured in lipid vesicles upon rehydration to produce a variety of molecular systems. Each vesicle represents a protocell, an “experiment” in a natural version of combinatorial chemistry. Two kinds of selective processes can then occur. The first is a physical process in which relatively stable molecular systems will be preferentially selected. The second is a chemical process in which rare combinations of encapsulated polymers form systems capable of capturing energy and nutrients to undergo growth by catalyzed polymerization. Given continued cycling over extended time spans, such combinatorial processes will give rise to molecular systems having the fundamental properties of life. PMID:25780958
A combinatorial perspective of the protein inference problem.
Yang, Chao; He, Zengyou; Yu, Weichuan
2013-01-01
In a shotgun proteomics experiment, proteins are the most biologically meaningful output. The success of proteomics studies depends on the ability to accurately and efficiently identify proteins. Many methods have been proposed to facilitate the identification of proteins from peptide identification results. However, the relationship between protein identification and peptide identification has not been thoroughly explained before. In this paper, we devote ourselves to a combinatorial perspective of the protein inference problem. We employ combinatorial mathematics to calculate the conditional protein probabilities (protein probability means the probability that a protein is correctly identified) under three assumptions, which lead to a lower bound, an upper bound, and an empirical estimation of protein probabilities, respectively. The combinatorial perspective enables us to obtain an analytical expression for protein inference. Our method achieves results comparable to ProteinProphet in a more efficient manner in experiments on two data sets of standard protein mixtures and two data sets of real samples. Based on our model, we study the impact of unique peptides and degenerate peptides (degenerate peptides are peptides shared by at least two proteins) on protein probabilities. Meanwhile, we also study the relationship between our model and ProteinProphet. We name our program ProteinInfer. Its Java source code, our supplementary document, and experimental results are available at: http://bioinformatics.ust.hk/proteininfer.
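The flavor of such lower and upper bounds can be illustrated under the simplest independence assumptions. This sketch is not the authors' model: it considers only a protein's unique peptides, takes the single best peptide as a lower bound, and assumes peptide identifications are independent for the upper bound.

```python
def protein_probability_bounds(peptide_probs):
    """Illustrative bounds on P(protein correctly identified) from the
    probabilities of its unique peptide identifications.

    Lower bound: the protein is at least as likely correct as its single
    best peptide.  Upper bound: if peptides are independent, the protein
    is wrong only when every one of its peptides is wrong."""
    lower = max(peptide_probs)
    upper = 1.0
    for p in peptide_probs:
        upper *= (1.0 - p)
    upper = 1.0 - upper
    return lower, upper

print(protein_probability_bounds([0.9, 0.5]))   # lower ~0.9, upper ~0.95
```

Degenerate (shared) peptides break the independence assumption, which is exactly why the paper needs the combinatorial treatment rather than this naive product formula.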
Ye, Yusen; Gao, Lin; Zhang, Shihua
2017-01-01
Transcription factors play a key role in the transcriptional regulation of genes and in the determination of cellular identity through combinatorial interactions. However, current studies of combinatorial regulation are limited by the lack of experimental data from the same cellular environment and by extensive data noise. Here, we adopt a Bayesian CANDECOMP/PARAFAC (CP) factorization approach (BCPF) to integrate multiple datasets in a network paradigm for determining precise TF interaction landscapes. In our first application, we apply BCPF to integrate three networks, built from diverse datasets of multiple cell lines from ENCODE, to predict a global and precise TF interaction network. This network gives 38 novel TF interactions with distinct biological functions. In our second application, we apply BCPF to seven cell type TF regulatory networks and predict seven cell lineage TF interaction networks, respectively. By further exploring their dynamics and modularity, we find that cell lineage-specific hub TFs participate in cell type- or lineage-specific regulation by interacting with non-specific TFs. Furthermore, we illustrate the biological function of hub TFs by taking those of the cancer and blood lineages as examples. Taken together, our integrative analysis reveals a more precise and extensive description of human TF combinatorial interactions. PMID:29033978
A Key Pre-Distribution Scheme Based on µ-PBIBD for Enhancing Resilience in Wireless Sensor Networks.
Yuan, Qi; Ma, Chunguang; Yu, Haitao; Bian, Xuefen
2018-05-12
Many key pre-distribution (KPD) schemes based on combinatorial design have been proposed for secure communication in wireless sensor networks (WSNs). Due to the complexity of constructing the combinatorial design, it is infeasible to generate key rings from the corresponding combinatorial design in large-scale deployments of WSNs. In this paper, we present a definition of a new combinatorial design, termed "µ-partially balanced incomplete block design (µ-PBIBD)", which is a refinement of the partially balanced incomplete block design (PBIBD), and then describe a 2-D construction of µ-PBIBD which is mapped to KPD in WSNs. Our construction is simple and provides strong key connectivity but poor network resilience. To improve the network resilience of KPD based on 2-D µ-PBIBD, we propose a KPD scheme based on 3-D Ex-µ-PBIBD, which extends the µ-PBIBD construction from 2-D to 3-D space. The Ex-µ-PBIBD KPD scheme improves network scalability and resilience while retaining good key connectivity. Theoretical analysis and comparison with related schemes show that the key pre-distribution scheme based on Ex-µ-PBIBD provides high network resilience and better key scalability, while achieving a trade-off between network resilience and network connectivity.
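The general idea of design-based key pre-distribution can be sketched with a much simpler design than the µ-PBIBD of the paper. In the toy construction below (illustrative only), key rings are the lines of the affine plane over GF(p): any two nodes whose lines have different slopes share exactly one key, giving deterministic key connectivity without storing a pairwise key for every node pair.

```python
def affine_key_rings(p):
    """Key rings from the lines of the affine plane AG(2, p), p prime.
    Keys are points (x, y); node (a, b) holds the line y = a*x + b."""
    rings = {}
    for a in range(p):            # slope
        for b in range(p):        # intercept
            rings[(a, b)] = {(x, (a * x + b) % p) for x in range(p)}
    return rings

rings = affine_key_rings(7)
# Lines with different slopes intersect in exactly one point,
# so the two nodes share exactly one key.
print(len(rings[(1, 0)] & rings[(2, 3)]))   # 1
```

The trade-off the abstract describes shows up even here: a captured node exposes all p keys on its line, so strong connectivity comes at the cost of resilience.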
Xu, Yuquan; Zhou, Tong; Zhang, Shuwei; Espinosa-Artiles, Patricia; Wang, Luoyi; Zhang, Wei; Lin, Min; Gunatilaka, A A Leslie; Zhan, Jixun; Molnár, István
2014-08-26
Combinatorial biosynthesis aspires to exploit the promiscuity of microbial anabolic pathways to engineer the synthesis of new chemical entities. Fungal benzenediol lactone (BDL) polyketides are important pharmacophores with wide-ranging bioactivities, including heat shock response and immune system modulatory effects. Their biosynthesis on a pair of sequentially acting iterative polyketide synthases (iPKSs) offers a test case for the modularization of secondary metabolic pathways into "build-couple-pair" combinatorial synthetic schemes. Expression of random pairs of iPKS subunits from four BDL model systems in a yeast heterologous host created a diverse library of BDL congeners, including a polyketide with an unnatural skeleton and heat shock response-inducing activity. Pairwise heterocombinations of the iPKS subunits also helped to illuminate the innate, idiosyncratic programming of these enzymes. Even in combinatorial contexts, these biosynthetic programs remained largely unchanged, so that the iPKSs built their cognate biosynthons, coupled these building blocks into chimeric polyketide intermediates, and catalyzed intramolecular pairing to release macrocycles or α-pyrones. However, some heterocombinations also provoked stuttering, i.e., the relaxation of iPKSs chain length control to assemble larger homologous products. The success of such a plug and play approach to biosynthesize novel chemical diversity bodes well for bioprospecting unnatural polyketides for drug discovery.
Görlach, E; Richmond, R; Lewis, I
1998-08-01
For the last two years, the mass spectrometry section of the Novartis Pharma Research Core Technology group has analyzed tens of thousands of multiple parallel synthesis samples from the Novartis Pharma Combinatorial Chemistry program, using an in-house developed automated high-throughput flow injection analysis electrospray ionization mass spectrometry system. The electrospray spectra of these samples reflect the many structures present after the cleavage step from the solid support. The overall success of the sequential synthesis is mirrored in the purity of the expected end product, while the partial success of individual synthesis steps is evident in the impurities in the mass spectrum. However, this latter reaction information, which is of considerable utility to the combinatorial chemist, is effectively hidden from view by the very large number of analyzed samples. This information is now revealed at the workbench of the combinatorial chemist by a novel three-dimensional display of each rack's complete mass spectral ion current using the in-house RackViewer Visual Basic application. Colorization of "forbidden loss" and "forbidden gas-adduct" zones, normalization to the expected monoisotopic molecular weight, colorization of ionization intensity, and sorting by row or column are used in combination to highlight systematic patterns in the mass spectrometry data.
Song, Suk-yoon; Hur, Byung-ung; Lee, Kyung-woo; Choi, Hyo-jung; Kim, Sung-soo; Kang, Goo; Cha, Sang-hoon
2009-03-31
The dual-vector system-II (DVS-II), which allows efficient display of Fab antibodies on phage, has been reported previously, but its practical applicability in a phage-displayed antibody library has not been verified. To resolve this issue, we created two small combinatorial human Fab antibody libraries using the DVS-II and attempted isolation of target-specific antibodies. Biopanning of one antibody library, termed the DVFAB-1L library, which has a 1.3 x 10^7 combinatorial antibody complexity, against fluorescein-BSA resulted in successful isolation of human Fab clones specific for the antigen despite the presence of only a single light chain in the library. By using the unique feature of the DVS-II, an antibody library of larger size, named DVFAB-131L, which has a 1.5 x 10^9 combinatorial antibody complexity, was also generated rapidly by combining 1.3 x 10^7 heavy chains and 131 light chains, and more diverse anti-fluorescein-BSA Fab antibody clones were successfully obtained. Our results demonstrate that the DVS-II can be applied readily in creating phage-displayed antibody libraries with much less effort, and that target-specific antibody clones can be isolated reliably via light chain promiscuity of the antibody molecule.
Iconicity and the Emergence of Combinatorial Structure in Language.
Verhoef, Tessa; Kirby, Simon; de Boer, Bart
2016-11-01
In language, recombination of a discrete set of meaningless building blocks forms an unlimited set of possible utterances. How such combinatorial structure emerged in the evolution of human language is increasingly being studied. It has been shown that it can emerge when languages culturally evolve and adapt to human cognitive biases. How the emergence of combinatorial structure interacts with the existence of holistic iconic form-meaning mappings in a language is still unknown. The experiment presented in this paper studies the role of iconicity and human cognitive learning biases in the emergence of combinatorial structure in artificial whistled languages. Participants learned and reproduced whistled words for novel objects with the use of a slide whistle. Their reproductions were used as input for the next participant, to create transmission chains and simulate cultural transmission. Two conditions were studied: one in which the persistence of iconic form-meaning mappings was possible and one in which this was experimentally made impossible. In both conditions, cultural transmission caused the whistled languages to become more learnable and more structured, but this process was slightly delayed in the first condition. Our findings help to gain insight into when and how words may lose their iconic origins when they become part of an organized linguistic system. Copyright © 2015 Cognitive Science Society, Inc.
On the Unification of Psychology, Methodology, and Pedagogy.
ERIC Educational Resources Information Center
Wettersten, John
1987-01-01
The psychological and methodological bases of the Agassi teaching method are described to provide a context for evaluating the theory. A brief history of Selzian psychology and Popper's methodology is given. The Agassi method, which stresses learning through questioning, is detailed. (JL)
The geobiosphere emergy baseline: A synthesis.
The concept of emergy defined as the available energy (or exergy) of one form used up directly and indirectly to produce an item or action (Odum, Environmental Accounting Emergy and Environmental Decision Making, John Wiley & Sons, Inc., 1996) requires the specification of a unif...
Development and Plasticity of Cortical Processing Architectures
NASA Astrophysics Data System (ADS)
Singer, Wolf
1995-11-01
One of the basic functions of the cerebral cortex is the analysis and representation of relations among the components of sensory and motor patterns. It is proposed that the cortex applies two complementary strategies to cope with the combinatorial problem posed by the astronomical number of possible relations: (i) the analysis and representation of frequently occurring, behaviorally relevant relations by groups of cells with fixed but broadly tuned response properties; and (ii) the dynamic association of these cells into functionally coherent assemblies. Feedforward connections and reciprocal associative connections, respectively, are thought to underlie these two operations. The architectures of both types of connections are susceptible to experience-dependent modifications during development, but they become fixed in the adult. As development proceeds, feedforward connections also appear to lose much of their functional plasticity, whereas the synapses of the associative connections retain a high susceptibility to use-dependent modifications. The reduced plasticity of feedforward connections is probably responsible for the invariance of cognitive categories acquired early in development. The persistent adaptivity of reciprocal connections is a likely substrate for the ability to generate representations for new perceptual objects and motor patterns throughout life.
Inducible CRISPR genome-editing tool: classifications and future trends.
Dai, Xiaofeng; Chen, Xiao; Fang, Qiuwu; Li, Jia; Bai, Zhonghu
2018-06-01
The discovery of the CRISPR-Cas9/dCas9 system has greatly expanded our capabilities and revolutionized genome engineering. While Cas9 and dCas9 are programmed to modulate gene expression by introducing DNA breaks, blocking transcription factor recruitment, or bringing functional groups to the targeted sites, sgRNAs determine the genomic loci where the modulation occurs. The off-target problem, due to limited sgRNA specificity and the genome complexity of many species, has raised concerns about the wide application of this revolutionary technique. To solve this problem and, more importantly, to gain control over gene functionality and cell fate, inducible strategies have continuously evolved to offer tailored solutions to specific biological questions. By reviewing recent advances in inducible CRISPR system design and the critical elements that potentially add value to such systems, we classify current approaches in this domain into four mechanistically distinct categories, namely the "split system", "allosteric system", "combinatorial system", and "transient delivery system"; discuss the pros and cons of each system; and point out under-explored areas and future directions, with the aim of enriching our toolbox for the delicate engineering of life.
Computer Description of the Field Artillery Ammunition Supply Vehicle
1983-04-01
Combinatorial Geometry (COM-GEOM); GIFT computer code; computer target description. A COM-GEOM target description is input to the GIFT computer code to generate target vulnerability data. The target is represented as a Combinatorial Geometry (COM-GEOM) description; the "Geometric Information for Targets" (GIFT) computer code accepts the COM-GEOM description and...
Sin(x)**2 + cos(x)**2 = 1 [Programming Identities Using Comparative Combinatorial Substitutions]
NASA Technical Reports Server (NTRS)
Stoutemyer, D. R.
1977-01-01
Attempts to achieve tasteful automatic employment of the identities sin^2(x) + cos^2(x) = 1 and cosh^2(x) - sinh^2(x) = 1 in a manner which truly minimizes the complexity of the resulting expression are described. The disappointments of trigonometric reduction, trigonometric expansion, pattern matching, Poisson series, and De Moivre's theorem are related. The advantages of using the method of comparative combinatorial substitutions are illustrated.
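The comparative combinatorial substitution idea, trying every combination of identity rewrites on the terms of an expression and keeping the least complex result, can be sketched on a toy representation. (The original work used a full symbolic algebra system; the term encoding and the term-count complexity measure below are illustrative assumptions.)

```python
from itertools import product
from math import comb

# A term s^i * c^j (s = sin^2 x, c = cos^2 x, related by s + c = 1)
# is keyed as (i, j); an expression is a dict {(i, j): coefficient}.

def sub_s(i, j, coeff):
    # rewrite s^i as (1 - c)^i via binomial expansion
    return {(0, j + k): coeff * comb(i, k) * (-1) ** k for k in range(i + 1)}

def sub_c(i, j, coeff):
    # rewrite c^j as (1 - s)^j via binomial expansion
    return {(i + k, 0): coeff * comb(j, k) * (-1) ** k for k in range(j + 1)}

def add_terms(target, terms):
    for key, v in terms.items():
        target[key] = target.get(key, 0) + v

def simplify(expr):
    """Comparative combinatorial substitution: each term independently keeps
    its form, rewrites s -> 1-c, or rewrites c -> 1-s; return the candidate
    with the fewest surviving terms."""
    best = None
    terms = list(expr.items())
    for choice in product(range(3), repeat=len(terms)):
        cand = {}
        for ch, ((i, j), coeff) in zip(choice, terms):
            if ch == 0:
                add_terms(cand, {(i, j): coeff})
            elif ch == 1:
                add_terms(cand, sub_s(i, j, coeff))
            else:
                add_terms(cand, sub_c(i, j, coeff))
        cand = {k: v for k, v in cand.items() if v != 0}
        if best is None or len(cand) < len(best):
            best = cand
    return best

# sin^2 x + cos^2 x  ->  {(1,0): 1, (0,1): 1} reduces to the constant 1
print(simplify({(1, 0): 1, (0, 1): 1}))  # {(0, 0): 1}
```

The exhaustive search over substitution choices is exponential in the number of terms, which is exactly why the paper's contribution was making such a search "tasteful" rather than brute-force.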
Optimization of Highway Work Zone Decisions Considering Short-Term and Long-Term Impacts
2010-01-01
This study seeks a combination of lane closure and traffic control strategies that can minimize the one-time work zone cost. Considering the complex and combinatorial nature of this optimization problem, a heuristic approach is adopted. Notation includes: NV, the number of vehicle classes; NPV, net present value; p'(t), the adjusted traffic diversion rate at time t; and p(t), the natural diversion rate at time t.
Thermal analysis of combinatorial solid geometry models using SINDA
NASA Technical Reports Server (NTRS)
Gerencser, Diane; Radke, George; Introne, Rob; Klosterman, John; Miklosovic, Dave
1993-01-01
Algorithms have been developed using Monte Carlo techniques to determine the thermal network parameters necessary to perform a finite difference analysis on Combinatorial Solid Geometry (CSG) models. Orbital and laser fluxes as well as internal heat generation are modeled to facilitate satellite modeling. The results of the thermal calculations are used to model the infrared (IR) images of targets and assess target vulnerability. Sample analyses and validation are presented which demonstrate code products.
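As a rough illustration of how Monte Carlo sampling can extract quantities from a Combinatorial Solid Geometry model (the actual SINDA thermal-network parameter extraction is far more involved; the primitives and names below are hypothetical), a CSG solid can be represented as a point-membership predicate and sampled:

```python
import random

# Minimal CSG sketch: a solid is a predicate telling whether a point is inside.
def sphere(cx, cy, cz, r):
    return lambda p: (p[0]-cx)**2 + (p[1]-cy)**2 + (p[2]-cz)**2 <= r*r

def union(a, b):       return lambda p: a(p) or b(p)
def intersect(a, b):   return lambda p: a(p) and b(p)
def difference(a, b):  return lambda p: a(p) and not b(p)

def mc_volume(solid, bounds, n=200_000, seed=1):
    """Monte Carlo volume estimate: fraction of uniform random points that
    land inside the solid, scaled by the bounding-box volume."""
    random.seed(seed)
    (x0, x1), (y0, y1), (z0, z1) = bounds
    hits = sum(solid((random.uniform(x0, x1),
                      random.uniform(y0, y1),
                      random.uniform(z0, z1))) for _ in range(n))
    return hits / n * (x1 - x0) * (y1 - y0) * (z1 - z0)

ball = sphere(0, 0, 0, 1)                  # exact volume 4*pi/3 ~= 4.18879
box = ((-1, 1), (-1, 1), (-1, 1))
print(round(mc_volume(ball, box), 2))
```

The same point-sampling machinery, aimed along ray directions instead of uniform box points, is the basis for estimating radiative exchange factors between CSG surfaces.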
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chang, K.-S.; Green, M. L.; Suehle, J.
2006-10-02
The authors have fabricated combinatorial Ni-Ti-Pt ternary metal gate thin film libraries on HfO2 using magnetron co-sputtering to investigate flatband voltage shift (ΔV_fb), work function (φ_m), and leakage current density (J_L) variations. A more negative ΔV_fb is observed close to the Ti-rich corner than at the Ni- and Pt-rich corners, implying a smaller φ_m near the Ti-rich corner and higher φ_m near the Ni- and Pt-rich corners. In addition, measured J_L values can be explained consistently with the observed φ_m variations. Combinatorial methodologies prove to be useful in surveying the large compositional space of ternary alloy metal gate electrode systems.
DNA-Encoded Dynamic Combinatorial Chemical Libraries.
Reddavide, Francesco V; Lin, Weilin; Lehnert, Sarah; Zhang, Yixin
2015-06-26
Dynamic combinatorial chemistry (DCC) explores the thermodynamic equilibrium of reversible reactions. Its application in the discovery of protein binders is largely limited by difficulties in the analysis of complex reaction mixtures. DNA-encoded chemical library (DECL) technology allows the selection of binders from a mixture of up to billions of different compounds; however, experimental results often show a low signal-to-noise ratio and poor correlation between enrichment factor and binding affinity. Herein we describe the design and application of DNA-encoded dynamic combinatorial chemical libraries (EDCCLs). Our experiments have shown that the EDCCL approach can be used not only to convert monovalent binders into high-affinity bivalent binders, but also to cause remarkably enhanced enrichment of potent bivalent binders by driving their in situ synthesis. We also demonstrate the application of EDCCLs in DNA-templated chemical reactions. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Potta, Thrimoorthy; Zhen, Zhuo; Grandhi, Taraka Sai Pavan; Christensen, Matthew D.; Ramos, James; Breneman, Curt M.; Rege, Kaushal
2014-01-01
We describe the combinatorial synthesis and cheminformatics modeling of aminoglycoside antibiotics-derived polymers for transgene delivery and expression. Fifty-six polymers were synthesized by polymerizing aminoglycosides with diglycidyl ether cross-linkers. Parallel screening resulted in the identification of several lead polymers that produced high transgene expression levels in cells. The role of polymer physicochemical properties in determining efficacy of transgene expression was investigated using Quantitative Structure-Activity Relationship (QSAR) cheminformatics models based on Support Vector Regression (SVR) and ‘building block’ polymer structures. The QSAR model exhibited high predictive ability, and investigation of descriptors in the model, using molecular visualization and correlation plots, indicated that physicochemical attributes related to both aminoglycosides and diglycidyl ethers facilitated transgene expression. This work synergistically combines combinatorial synthesis and parallel screening with cheminformatics-based QSAR models for the discovery and physicochemical elucidation of effective antibiotics-derived polymers for transgene delivery in medicine and biotechnology. PMID:24331709
Optimal weighted combinatorial forecasting model of QT dispersion of ECGs in Chinese adults.
Wen, Zhang; Miao, Ge; Xinlei, Liu; Minyi, Cen
2016-07-01
This study aims to provide a scientific basis for unifying the reference value standard of QT dispersion of ECGs in Chinese adults. Three predictive models, a regression model, a principal component model, and an artificial neural network model, are combined to establish the optimal weighted combination model. The optimal weighted combination model and the single models are verified and compared. The optimal weighted combinatorial model reduces the prediction risk of any single model and improves prediction precision. The geographical distribution of the reference value of QT dispersion in Chinese adults was precisely mapped using kriging methods. Once the geographical factors of a particular area are obtained, the reference value of QT dispersion of Chinese adults in that area can be estimated using the optimal weighted combinatorial model, and the reference value anywhere in China can be read from the geographical distribution map.
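One common way to form such a weighted combination is to weight each component model inversely to its in-sample mean squared error. This is a sketch of the general idea only; the paper's exact weighting criterion is not reproduced here, and the numbers below are invented for illustration.

```python
def inverse_error_weights(preds, actual):
    """Weight each model inversely to its mean squared error, normalized so
    the weights sum to 1 (one common 'optimal weighting' scheme)."""
    mses = [sum((p - a) ** 2 for p, a in zip(model, actual)) / len(actual)
            for model in preds]
    inv = [1.0 / m for m in mses]
    s = sum(inv)
    return [w / s for w in inv]

def combine(preds, weights):
    # weighted average of the component models' predictions, point by point
    return [sum(w * model[i] for w, model in zip(weights, preds))
            for i in range(len(preds[0]))]

# three hypothetical models predicting QT dispersion (ms) for five regions
actual = [41.0, 43.5, 40.2, 44.8, 42.1]
preds = [
    [40.5, 43.0, 40.8, 45.5, 41.6],   # regression model
    [42.0, 44.5, 39.0, 44.0, 43.0],   # principal component model
    [41.2, 43.3, 40.5, 44.6, 42.4],   # neural network model
]
w = inverse_error_weights(preds, actual)
combined = combine(preds, w)
mse = sum((c - a) ** 2 for c, a in zip(combined, actual)) / len(actual)
print([round(x, 3) for x in w], round(mse, 4))
```

On this toy data the combined forecast has lower in-sample error than any single model, which is the motivation for combination forecasting.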
Development of New Sensing Materials Using Combinatorial and High-Throughput Experimentation
NASA Astrophysics Data System (ADS)
Potyrailo, Radislav A.; Mirsky, Vladimir M.
New sensors with improved performance characteristics are needed for applications as diverse as bedside continuous monitoring, tracking of environmental pollutants, monitoring of food and water quality, monitoring of chemical processes, and safety in industrial, consumer, and automotive settings. Typical targets for sensor improvement are selectivity, long-term stability, sensitivity, response time, reversibility, and reproducibility. Design of new sensing materials is an important cornerstone of the effort to develop new sensors. Often, sensing materials are too complex for their performance to be predicted quantitatively at the design stage. Thus, combinatorial and high-throughput experimentation methodologies provide an opportunity to generate the data required to discover new sensing materials and/or to optimize existing material compositions. The goal of this chapter is to provide an overview of the key concepts of experimental development of sensing materials using combinatorial and high-throughput experimentation tools, and to promote additional fruitful interactions between computational scientists and experimentalists.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boman, Erik G.; Catalyurek, Umit V.; Chevalier, Cedric
2015-01-16
This final progress report summarizes the work accomplished at the Combinatorial Scientific Computing and Petascale Simulations Institute. We developed Zoltan, a parallel mesh partitioning library that made use of accurate hypergraph models to provide load balancing in mesh-based computations. We developed several graph coloring algorithms for computing Jacobian and Hessian matrices and organized them into a software package called ColPack. We developed parallel algorithms for graph coloring and graph matching problems, and also designed multi-scale graph algorithms. Three PhD students graduated, six more are continuing their PhD studies, and four postdoctoral scholars were advised. Six of these students and Fellows have joined DOE Labs (Sandia, Berkeley) as staff scientists or as postdoctoral scientists. We also organized the SIAM Workshop on Combinatorial Scientific Computing (CSC) in 2007, 2009, and 2011 to continue to foster the CSC community.
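The graph coloring used for Jacobian compression groups structurally orthogonal columns (columns that never share a nonzero row) so that each color class can be evaluated in a single function-difference pass. A minimal greedy sketch of this idea (not ColPack's actual algorithms, which add smarter vertex orderings):

```python
def column_coloring(pattern):
    """Greedy distance-1 coloring of the column intersection graph of a
    sparsity pattern: columns that share a row with nonzeros receive
    different colors; same-colored columns are structurally orthogonal."""
    ncols = max(j for _, j in pattern) + 1
    rows_of = [set() for _ in range(ncols)]
    for i, j in pattern:          # pattern is a list of (row, col) nonzeros
        rows_of[j].add(i)
    colors = {}
    for j in range(ncols):
        used = {colors[k] for k in range(j) if rows_of[j] & rows_of[k]}
        c = 0
        while c in used:          # smallest color not used by a neighbor
            c += 1
        colors[j] = c
    return colors

# arrow-shaped 4x4 pattern: dense first column plus a diagonal
pattern = [(i, 0) for i in range(4)] + [(i, i) for i in range(1, 4)]
print(column_coloring(pattern))   # {0: 0, 1: 1, 2: 1, 3: 1}
```

Two colors suffice here, so this Jacobian can be recovered from two directional differences instead of four, which is the payoff the abstract alludes to.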
Besalú, Emili
2016-01-01
The Superposing Significant Interaction Rules (SSIR) method is described. It is a general combinatorial and symbolic procedure able to rank compounds belonging to combinatorial analogue series. The procedure generates structure-activity relationship (SAR) models and also serves as an inverse SAR tool. The method is fast and can deal with large databases. SSIR operates from statistical significances calculated from the available library of compounds and according to the previously attached molecular labels of interest or non-interest. The required symbolic codification allows dealing with almost any combinatorial data set, even in a confidential manner, if desired. The application example categorizes molecules as binding or non-binding, and consensus ranking SAR models are generated from training and two distinct cross-validation methods: leave-one-out and balanced leave-two-out (BL2O), the latter being suited for the treatment of binary properties. PMID:27240346
NASA Technical Reports Server (NTRS)
Lee, Jonathan A.
2005-01-01
High-throughput measurement techniques are reviewed for solid phase transformation in materials produced by combinatorial methods, which are highly efficient concepts for fabricating a large variety of material libraries with different compositional gradients on a single wafer. Combinatorial methods hold high potential for reducing the time and costs associated with the development of new materials, as compared to time-consuming and labor-intensive conventional methods that test large batches of material, one composition at a time. These high-throughput techniques can be automated to rapidly capture and analyze data using the entire material library on a single wafer, thereby accelerating the pace of materials discovery and knowledge generation for solid phase transformations. The review covers experimental techniques that are applicable to inorganic materials such as shape memory alloys, graded materials, metal hydrides, ferric materials, semiconductors and industrial alloys.
Exploiting Quantum Resonance to Solve Combinatorial Problems
NASA Technical Reports Server (NTRS)
Zak, Michail; Fijany, Amir
2006-01-01
Quantum resonance would be exploited in a proposed quantum-computing approach to the solution of combinatorial optimization problems. In quantum computing in general, one takes advantage of the fact that an algorithm cannot be decoupled from the physical effects available to implement it. Prior approaches to quantum computing have involved exploitation of only a subset of known quantum physical effects, notably including parallelism and entanglement, but not including resonance. In the proposed approach, one would utilize the combinatorial properties of tensor-product decomposability of unitary evolution of many-particle quantum systems for physically simulating solutions to NP-complete problems (a class of problems that are intractable with respect to classical methods of computation). In this approach, reinforcement and selection of a desired solution would be executed by means of quantum resonance. Classes of NP-complete problems that are important in practice and could be solved by the proposed approach include planning, scheduling, search, and optimal design.
Combinatorial studies of (1-x)Na0.5Bi0.5TiO3-xBaTiO3 thin-film chips
NASA Astrophysics Data System (ADS)
Cheng, Hong-Wei; Zhang, Xue-Jin; Zhang, Shan-Tao; Feng, Yan; Chen, Yan-Feng; Liu, Zhi-Guo; Cheng, Guang-Xi
2004-09-01
Applying a combinatorial methodology, (1-x)Na0.5Bi0.5TiO3-xBaTiO3 (NBT-BT) thin-film chips were fabricated on (001)-LaAlO3 substrates by pulsed laser deposition with a few quaternary masks. A series of NBT-BT libraries with BT compositions ranging from 0 to 44% was obtained with uniform composition and good crystallinity. The relation between the composition of NBT-BT and its structural and dielectric properties was investigated by x-ray diffraction (XRD), evanescent microwave probe, atomic force microscopy, and Raman spectroscopy. An obvious morphotropic phase boundary (MPB) was established at about 9% BT by XRD, Raman frequency shift, and dielectric anomaly, different from the well-known MPB of the materials. The results show the high efficiency of the combinatorial method in searching for new relaxor ferroelectrics.
A combinatorial code for pattern formation in Drosophila oogenesis.
Yakoby, Nir; Bristow, Christopher A; Gong, Danielle; Schafer, Xenia; Lembong, Jessica; Zartman, Jeremiah J; Halfon, Marc S; Schüpbach, Trudi; Shvartsman, Stanislav Y
2008-11-01
Two-dimensional patterning of the follicular epithelium in Drosophila oogenesis is required for the formation of three-dimensional eggshell structures. Our analysis of a large number of published gene expression patterns in the follicle cells suggests that they follow a simple combinatorial code based on six spatial building blocks and the operations of union, difference, intersection, and addition. The building blocks are related to the distribution of inductive signals, provided by the highly conserved epidermal growth factor receptor and bone morphogenetic protein signaling pathways. We demonstrate the validity of the code by testing it against a set of patterns obtained in a large-scale transcriptional profiling experiment. Using the proposed code, we distinguish 36 distinct patterns for 81 genes expressed in the follicular epithelium and characterize their joint dynamics over four stages of oogenesis. The proposed combinatorial framework allows systematic analysis of the diversity and dynamics of two-dimensional transcriptional patterns and guides future studies of gene regulation.
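The union/difference/intersection composition of spatial building blocks can be mimicked directly with set algebra over cell positions. The block geometries below are invented purely to illustrate the operations; they are not actual follicle-cell domains.

```python
# Sketch of a combinatorial patterning code: each spatial building block is
# a set of cell positions on a grid, and composite expression patterns are
# built with union, difference, and intersection (block names hypothetical).
cells = {(x, y) for x in range(10) for y in range(10)}
midline  = {(x, y) for x, y in cells if 4 <= x <= 5}
anterior = {(x, y) for x, y in cells if y >= 7}
roof     = {(x, y) for x, y in cells if 2 <= x <= 7 and 5 <= y <= 8}

pattern_a = anterior | midline   # union: expressed in either domain
pattern_b = roof - midline       # difference: roof except the midline stripe
pattern_c = anterior & roof      # intersection: only where both overlap

print(len(pattern_a), len(pattern_b), len(pattern_c))  # 44 16 12
```

Because the blocks are ordinary sets, testing whether an observed expression pattern fits the code reduces to checking set equality against all small compositions of the building blocks.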
Wang, Lipo; Li, Sa; Tian, Fuyu; Fu, Xiuju
2004-10-01
Recently Chen and Aihara have demonstrated both experimentally and mathematically that their chaotic simulated annealing (CSA) has better search ability for solving combinatorial optimization problems compared to both the Hopfield-Tank approach and stochastic simulated annealing (SSA). However, CSA may not find a globally optimal solution no matter how slowly annealing is carried out, because the chaotic dynamics are completely deterministic. In contrast, SSA tends to settle down to a global optimum if the temperature is reduced sufficiently slowly. Here we combine the best features of both SSA and CSA, thereby proposing a new approach for solving optimization problems, i.e., stochastic chaotic simulated annealing, by using a noisy chaotic neural network. We show the effectiveness of this new approach with two difficult combinatorial optimization problems, i.e., a traveling salesman problem and a channel assignment problem for cellular mobile communications.
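For reference, the stochastic simulated annealing baseline mentioned above can be sketched on a tiny traveling salesman instance. This is plain SSA with 2-opt moves, not the noisy chaotic neural network the paper proposes; parameters and the test instance are illustrative.

```python
import math
import random

def tour_length(tour, dist):
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]]
               for i in range(len(tour)))

def simulated_annealing_tsp(dist, t0=2.0, cooling=0.995, iters=20_000, seed=0):
    """Stochastic simulated annealing for the TSP: accept worse 2-opt moves
    with probability exp(-delta / T), cooling T geometrically."""
    rng = random.Random(seed)
    n = len(dist)
    tour = list(range(n))
    cur_len = tour_length(tour, dist)
    best, best_len = tour[:], cur_len
    t = t0
    for _ in range(iters):
        i, j = sorted(rng.sample(range(n), 2))
        cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]   # 2-opt move
        cand_len = tour_length(cand, dist)
        if cand_len < cur_len or rng.random() < math.exp((cur_len - cand_len) / t):
            tour, cur_len = cand, cand_len
            if cur_len < best_len:
                best, best_len = tour[:], cur_len
        t *= cooling
    return best, best_len

# five cities on a unit circle: the optimal tour follows the circle,
# with length 10 * sin(pi/5) ~= 5.878
pts = [(math.cos(2 * math.pi * k / 5), math.sin(2 * math.pi * k / 5))
       for k in range(5)]
dist = [[math.dist(p, q) for q in pts] for p in pts]
tour, length = simulated_annealing_tsp(dist)
print(round(length, 3))
```

The paper's point is that deterministic chaotic annealing searches efficiently but cannot guarantee convergence, while the noise term restores SSA-style asymptotic convergence; the baseline above is the stochastic half of that combination.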
Study of constrained minimal supersymmetry
NASA Astrophysics Data System (ADS)
Kane, G. L.; Kolda, Chris; Roszkowski, Leszek; Wells, James D.
1994-06-01
Taking seriously the phenomenological indications for supersymmetry we have made a detailed study of unified minimal SUSY, including many effects at the few percent level in a consistent fashion. We report here a general analysis of what can be studied without choosing a particular gauge group at the unification scale. Firstly, we find that the encouraging SUSY unification results of recent years do survive the challenge of a more complete and accurate analysis. Taking into account effects at the 5-10 % level leads to several improvements of previous results and allows us to sharpen our predictions for SUSY in the light of unification. We perform a thorough study of the parameter space and look for patterns to indicate SUSY predictions, so that they do not depend on arbitrary choices of some parameters or untested assumptions. Our results can be viewed as a fully constrained minimal SUSY standard model. The resulting model forms a well-defined basis for comparing the physics potential of different facilities. Very little of the acceptable parameter space has been excluded by CERN LEP or Fermilab so far, but a significant fraction can be covered when these accelerators are upgraded. A number of initial applications to the understanding of the values of mh and mt, the SUSY spectrum, detectability of SUSY at LEP II or Fermilab, B(b-->sγ), Γ(Z-->bb¯), dark matter, etc., are included in a separate section that might be of more interest to some readers than the technical aspects of model building. We formulate an approach to extracting SUSY parameters from data when superpartners are detected. For small tanβ or large mt both m1/2 and m0 are entirely bounded from above at ~1 TeV without having to use a fine-tuning constraint.
Superparticle phenomenology from the natural mini-landscape
NASA Astrophysics Data System (ADS)
Baer, Howard; Barger, Vernon; Savoy, Michael; Serce, Hasan; Tata, Xerxes
2017-06-01
The methodology of the heterotic mini-landscape attempts to zero in on phenomenologically viable corners of the string landscape where the effective low energy theory is the Minimal Supersymmetric Standard Model with localized grand unification. The gaugino mass pattern is that of mirage-mediation. The magnitudes of various SM Yukawa couplings point to a picture where scalar soft SUSY breaking terms are related to the geography of fields in the compactified dimensions. Higgs fields and third generation scalars extend to the bulk and occur in split multiplets with TeV scale soft masses. First and second generation scalars, localized at orbifold fixed points or tori with enhanced symmetry, occur in complete GUT multiplets and have much larger masses. This picture can be matched onto the parameter space of generalized mirage mediation. Naturalness considerations, the requirement of the observed electroweak symmetry breaking pattern, and LHC bounds on the gluino mass together limit the gravitino mass to the m_3/2 ~ 5-60 TeV range. The mirage unification scale is bounded from below, with the limit depending on the ratio of squark to gravitino masses. We show that while natural SUSY in this realization may escape detection even at the high luminosity LHC, the high energy LHC with √s = 33 TeV could unequivocally confirm or exclude this scenario. It should be possible to detect the expected light higgsinos at the ILC if these are kinematically accessible, and possibly also discriminate the expected compression of gaugino masses in the natural mini-landscape picture from the mass pattern expected in models with gaugino mass unification. The thermal WIMP signal should be accessible via direct detection searches at the multi-ton noble liquid detectors such as XENONnT or LZ.
Balkanization and Unification of Probabilistic Inferences
ERIC Educational Resources Information Center
Yu, Chong-Ho
2005-01-01
Many research-related classes in social sciences present probability as a unified approach based upon mathematical axioms, but neglect the diversity of various probability theories and their associated philosophical assumptions. Although currently the dominant statistical and probabilistic approach is the Fisherian tradition, the use of Fisherian…
An International Aerospace Information System: A Cooperative Opportunity.
ERIC Educational Resources Information Center
Blados, Walter R.; Cotter, Gladys A.
1992-01-01
Introduces and discusses ideas and issues relevant to the international unification of scientific and technical information (STI) through development of an international aerospace database (IAD). Specific recommendations for improving the National Aeronautics and Space Administration Aerospace Database (NAD) and for implementing IAD are given.…
A Programming Environment for Parallel Vision Algorithms
1990-04-11
...industrial arm on the market, while the unique head was designed by Rochester's Computer Science and Mechanical Engineering Departments. Reference cited: "Constraining-Unification and the Programming Language Unicorn", in Logic Programming: Functions, Relations, and Equations, DeGroot and Lindstrom (eds.).
Strategies for Reforming Workforce Preparation Programs in Europe
ERIC Educational Resources Information Center
Stenstrom, Marja-Leena; Lasonen, Johanna
2004-01-01
The Post-16 Strategies project coordinated by Dr. Johanna Lasonen from 1996 to 1998 was chiefly concerned with four post-16 education strategies: (1) "vocational enhancement"; (2) "mutual enrichment"; (3) "linkages;" and (4) "unification." These four strategies to promote parity of esteem between vocational…
Le Chatelier's Principle in Sensation and Perception: Fractal-Like Enfolding at Different Scales
Norwich, Kenneth H.
2010-01-01
Le Chatelier's principle asserts that a disturbance, when applied to a resting system may drive the system away from its equilibrium state, but will invoke a countervailing influence that will counteract the effect of the disturbance. When applied to the field of sensation and perception, a generalized stimulus will displace the system from equilibrium, and a generalized adaptation process will serve as the countervailing influence tending to reduce the impact of the stimulus. The principle applies at all levels, from the behavioral to the neural, the larger enfolding the smaller in fractal-like form. Le Chatelier's principle, so applied, leads to the unification of many concepts in sensory science. Ideas as diverse as sensory adaptation, reflex arcs, and simple deductive logic can be brought under the umbrella of a single orienting principle. Beyond unification, this principle allows us to approach many questions in pathophysiology from a different perspective. For example, we find new direction toward the reduction of phantom-limb pain and possibly of vertigo. PMID:21423359
Physics through the 1990s: Elementary-particle physics
NASA Astrophysics Data System (ADS)
The volume begins with a non-mathematical discussion of the motivation behind, and basic ideas of, elementary-particle physics theory and experiment. The progress over the past two decades with the quark model and unification of the electromagnetic and weak interactions is reviewed. Existing theoretical problems in the field, such as the origin of mass and the unification of the fundamental forces, are detailed, along with experimental programs to test the new theories. Accelerators, instrumentation, and detectors are described for both current and future facilities. Interactions with other areas of both theoretical and applied physics are presented. The sociology of the field is examined regarding the education of graduate students, the organization necessary in large-scale experiments, and the decision-making process involved in high-cost experiments. Finally, conclusions and recommendations for maintaining US excellence in theory and experiment are given. Appendices list both current and planned accelerators, and present statistical data on the US elementary-particle physics program. A glossary is included.
Physics through the 1990s: elementary-particle physics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1986-01-01
The volume begins with a non-mathematical discussion of the motivation behind, and basic ideas of, elementary-particle physics theory and experiment. The progress over the past two decades with the quark model and unification of the electromagnetic and weak interactions is reviewed. Existing theoretical problems in the field, such as the origin of mass and the unification of the fundamental forces, are detailed, along with experimental programs to test the new theories. Accelerators, instrumentation, and detectors are described for both current and future facilities. Interactions with other areas of both theoretical and applied physics are presented. The sociology of the field is examined regarding the education of graduate students, the organization necessary in large-scale experiments, and the decision-making process involved in high-cost experiments. Finally, conclusions and recommendations for maintaining US excellence in theory and experiment are given. Appendices list both current and planned accelerators, and present statistical data on the US elementary-particle physics program. A glossary is included.
On the unification of nuclear-structure theory: A response to Bortignon and Broglia
NASA Astrophysics Data System (ADS)
Cook, Norman D.
2016-09-01
Nuclear-structure theory is unusual among the diverse fields of quantum physics. Although it provides a coherent description of all known isotopes on the basis of a quantum-mechanical understanding of nucleon states, nevertheless, in the absence of a fundamental theory of the nuclear force acting between nucleons, the prediction of all ground-state and excited-state nuclear binding energies is inherently semi-empirical. I suggest that progress can be made by returning to the foundational work of Eugene Wigner from 1937, where the mathematical symmetries of nucleon states were first defined. Those symmetries were later successfully exploited in the development of the independent-particle model (IPM, essentially the shell model), but the geometrical implications noted by Wigner were neglected. Here I review how the quantum-mechanical, but remarkably easy-to-understand, geometrical interpretation of the IPM provides constraints on the parametrization of the nuclear force. The proposed "geometrical IPM" indicates a way forward toward the unification of nuclear-structure theory that Bortignon and Broglia have called for.
Evolution of Extragalactic Radio Sources and Quasar/Galaxy Unification
NASA Astrophysics Data System (ADS)
Onah, C. I.; Ubachukwu, A. A.; Odo, F. C.; Onuchukwu, C. C.
2018-04-01
We use a large sample of radio sources to investigate the effects of evolution, luminosity selection and radio source orientation in explaining the apparent deviation of the observed angular size - redshift (θ - z) relation of extragalactic radio sources (EGRSs) from the standard model. We have fitted the observed θ - z data with standard cosmological models based on a flat universe (Ω0 = 1). The size evolution of EGRSs is described as luminosity-, temporal- and orientation-dependent in the form D_{P,z,Φ} ∝ P^{±q} (1 + z)^{-m} sin Φ, with q = 0.3, Φ = 59°, m = -0.26 for radio galaxies and q = -0.5, Φ = 33°, m = 3.1 for radio quasars, respectively. Critical values of luminosity, log P_crit = 26.33 W Hz^{-1}, and of linear size, log D_c = 2.51 (D_c = 316.23 kpc), were also observed for the present sample of radio sources. All the results were found to be consistent with the popular quasar/galaxy unification scheme.
Hogan, Craig
2017-12-22
Classical spacetime and quantum mass-energy form the basis of all of physics. They become inconsistent at the Planck scale, 5.4 × 10^{-44} seconds, which may signify a need for reconciliation in a unified theory. Although proposals for unified theories exist, a direct experimental probe of this scale, 16 orders of magnitude above Tevatron energy, has seemed hopelessly out of reach. However, in a particular interpretation of holographic unified theories, derived from black hole evaporation physics, a world assembled out of Planck-scale waves displays effects of unification with a new kind of uncertainty in position at the Planck diffraction scale, the geometric mean of the Planck length and the apparatus size. In this case a new phenomenon may be measurable: an indeterminacy of spacetime position that appears as noise in interferometers. The colloquium will discuss the theory of the effect, and our plans to build a holographic interferometer at Fermilab to measure it.
Beyond description. Comment on "Approaching human language with complex networks" by Cong and Liu
NASA Astrophysics Data System (ADS)
Ferrer-i-Cancho, R.
2014-12-01
In their historical overview, Cong & Liu highlight Saussure as the father of modern linguistics [1]. They apparently miss G.K. Zipf as a pioneer of the view of language as a complex system. His idea of a balance between unification and diversification forces in the organization of natural systems, e.g., vocabularies [2], can be seen as a precursor of the view of complexity as a balance between order (unification) and disorder (diversification) near the edge of chaos [3]. Although not mentioned by Cong & Liu, trade-offs between hearer and speaker needs are also very important in Zipf's view, which has inspired research on optimal networks mapping words into meanings [4-6]. Quantitative linguists regard G.K. Zipf as the founder of modern quantitative linguistics [7], a discipline where statistics plays a central role, as in network science. Interestingly, that centrality of statistics is missing in Saussure's work and that of many of his successors.
Symmetry Breaking, Unification, and Theories Beyond the Standard Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nomura, Yasunori
2009-07-31
A model was constructed in which the supersymmetric fine-tuning problem is solved without extending the Higgs sector at the weak scale. We have demonstrated that the model can avoid all the phenomenological constraints while avoiding excessive fine-tuning. We have also studied implications of the model for dark matter physics and collider physics. I proposed an extremely simple construction for models of gauge mediation. We found that the μ problem can be simply and elegantly solved in a class of models where the Higgs fields couple directly to the supersymmetry breaking sector. We proposed a new way of addressing the flavor problem of supersymmetric theories. We proposed a new framework for constructing theories of grand unification. We constructed a simple and elegant model of dark matter which explains the excess flux of electrons/positrons. We constructed a model of dark energy in which evolving quintessence-type dark energy is naturally obtained. We studied whether we can find evidence of the multiverse.
Identifying the genes of unconventional high temperature superconductors.
Hu, Jiangping
We elucidate a recently emergent framework for unifying the two families of high temperature (high-Tc) superconductors, cuprates and iron-based superconductors. The unification suggests that the latter is simply the counterpart of the former, realizing robust extended s-wave pairing symmetries in a square lattice. The unification identifies that the key ingredient (gene) of high-Tc superconductors is a quasi-two-dimensional electronic environment in which the d-orbitals of cations that participate in strong in-plane couplings to the p-orbitals of anions are isolated near the Fermi energy. With this gene, the superexchange magnetic interactions mediated by anions can maximize their contribution to superconductivity. Creating the gene requires special arrangements between local electronic structures and crystal lattice structures. This special requirement explains why high-Tc superconductors are so rare. An explicit prediction is made to realize high-Tc superconductivity in Co/Ni-based materials with a quasi-two-dimensional hexagonal lattice structure formed by trigonal bipyramidal complexes.
Olsson, Lennart; Jerneck, Anne; Thoren, Henrik; Persson, Johannes; O’Byrne, David
2015-01-01
Resilience is often promoted as a boundary concept to integrate the social and natural dimensions of sustainability. However, it is a troubled dialogue from which social scientists may feel detached. To explain this, we first scrutinize the meanings, attributes, and uses of resilience in ecology and elsewhere to construct a typology of definitions. Second, we analyze core concepts and principles in resilience theory that cause disciplinary tensions between the social and natural sciences (system ontology, system boundary, equilibria and thresholds, feedback mechanisms, self-organization, and function). Third, we provide empirical evidence of the asymmetry in the use of resilience theory in ecology and environmental sciences compared to five relevant social science disciplines. Fourth, we contrast the unification ambition in resilience theory with methodological pluralism. Throughout, we develop the argument that incommensurability and unification constrain the interdisciplinary dialogue, whereas pluralism drawing on core social scientific concepts would better facilitate integrated sustainability research. PMID:26601176
NASA Astrophysics Data System (ADS)
Bobkov, S. G.; Serdin, O. V.; Arkhangelskiy, A. I.; Arkhangelskaja, I. V.; Suchkov, S. I.; Topchiev, N. P.
The problem of unifying electronic components at different levels (circuits, interfaces, hardware, and software) for use in the space industry is considered. The development of computer systems for space applications is discussed using the example of the scientific data acquisition system for the GAMMA-400 space project. The basic characteristics of the highly reliable, fault-tolerant chips developed by SRISA RAS for space-qualified computational systems are given. To reduce power consumption and enhance data reliability, the embedded system interconnect is made hierarchical: the upper level is Serial RapidIO 1x or 4x with a transfer rate of 1.25 Gbaud; the next level is SpaceWire with transfer rates up to 400 Mbaud; and the lower level is MIL-STD-1553B and RS232/RS485. Ethernet 10/100 serves as the technology interface and also provides connections to previously released modules. This system interconnection allows different redundancy schemes to be created. Designers can develop heterogeneous systems that combine the peer-to-peer networking performance of Serial RapidIO with multiprocessor clusters interconnected by SpaceWire.
Physics through the 1990s: Elementary-particle physics
NASA Technical Reports Server (NTRS)
1986-01-01
The volume begins with a non-mathematical discussion of the motivation behind, and basic ideas of, elementary-particle physics theory and experiment. The progress over the past two decades with the quark model and unification of the electromagnetic and weak interactions is reviewed. Existing theoretical problems in the field, such as the origin of mass and the unification of the fundamental forces, are detailed, along with experimental programs to test the new theories. Accelerators, instrumentation, and detectors are described for both current and future facilities. Interactions with other areas of both theoretical and applied physics are presented. The sociology of the field is examined regarding the education of graduate students, the organization necessary in large-scale experiments, and the decision-making process involved in high-cost experiments. Finally, conclusions and recommendations for maintaining US excellence in theory and experiment are given. Appendices list both current and planned accelerators, and present statistical data on the US elementary-particle physics program. A glossary is included.
Yukawa unification in an SO(10) SUSY GUT: SUSY on the edge
NASA Astrophysics Data System (ADS)
Poh, Zijie; Raby, Stuart
2015-07-01
In this paper we analyze Yukawa unification in a three-family SO(10) SUSY GUT. We perform a global χ2 analysis and show that supersymmetry (SUSY) effects do not decouple even though the universal scalar mass parameter at the grand unified theory (GUT) scale, m16, is found to lie between 15 and 30 TeV, with the best fit given for m16 ≈ 25 TeV. Note that SUSY effects do not decouple, since stops and bottoms have masses of order 5 TeV due to renormalization group running from MGUT. The model has many testable predictions. Gauginos are the lightest sparticles and the light Higgs boson is very much standard model-like. The model is consistent with flavor and CP observables, with BR(μ → eγ) close to the experimental upper bound. With such a large value of m16, the model clearly cannot be considered "natural" SUSY, nor is it "split" SUSY; it lies in the region in between, or "SUSY on the edge."
SO(10) Yukawa unification after the first run of the LHC
NASA Astrophysics Data System (ADS)
Raby, Stuart
2014-06-01
In this talk we discuss SO(10) Yukawa unification and its ramifications for phenomenology. The initial constraints come from fitting the top, bottom and tau masses, requiring large tan β ˜ 50 and particular values for soft SUSY breaking parameters. We perform a global χ2 analysis, fitting the recently observed `Higgs' with mass of order 125 GeV in addition to fermion masses and mixing angles and several flavor violating observables. We discuss two distinct GUT scale boundary conditions for soft SUSY breaking masses. In both cases we have a universal cubic scalar parameter, A0. In the first case we consider universal gaugino masses, and universal scalar masses, m16, for squarks and sleptons; while in the latter case we have non-universal gaugino masses and either universal scalar masses, m16, for squarks and sleptons or D-term splitting of scalar masses. We discuss the spectrum of SUSY particle masses and consequences for the LHC.
Proposal for a unified selection to medical residency programs.
Toffoli, Sônia Ferreira Lopes; Ferreira Filho, Olavo Franco; Andrade, Dalton Francisco de
2013-01-01
This paper proposes the unification of entrance exams for medical residency programs (MRPs) in Brazil. Problems related to MRPs and their interface with public health problems in Brazil are highlighted, along with how this proposal can help solve them. The proposal is to create a database of items to be used in unified MRP exams, and some advantages of applying Item Response Theory (IRT) to this database are highlighted. MRP entrance exams are currently developed and applied in a decentralized way, with each school responsible for its own examination, and the quality of these exams is questionable: reviews of item quality and of the validity and reliability of the instruments are not commonly disclosed. Evaluation is important in every education system, bringing about required changes and control of teaching and learning. Unifying MRP entrance exams, besides offering high-quality exams to participating institutions, could serve as an additional source for rating medical schools and driving improvements, provide studies with a database, and allow regional mobility. Copyright © 2013 Elsevier Editora Ltda. All rights reserved.
Abraham, Anna
2016-11-01
The astounding capacity for the human imagination to be engaged across a wide range of contexts is limitless and fundamental to our day-to-day experiences. Although processes of imagination are central to human psychological function, they rarely occupy center stage in academic discourse or empirical study within psychological and neuroscientific realms. The aim of this paper is to tackle this imbalance by drawing together the multitudinous facets of imagination within a common framework. The processes fall into one of five categories depending on whether they are characterized as involving perceptual/motor related mental imagery, intentionality or recollective processing, novel combinatorial or generative processing, exceptional phenomenology in the aesthetic response, or altered psychological states which range from commonplace to dysfunctional. These proposed categories are defined on the basis of theoretical ideas from philosophy as well as empirical evidence from neuroscience. By synthesizing the findings across these domains of imagination, this novel five-part or quinquepartite classification of the human imagination aids in systematizing, and thereby abets, our understanding of the workings and neural foundations of the human imagination. It would serve as a blueprint to direct further advances in the field of imagination while also promoting crosstalk with reference to stimulus-oriented facets of information processing. A biologically and ecologically valid psychology is one that seeks to explain fundamental aspects of human nature. Given the ubiquitous nature of the imaginative operations in our daily lives, there can be little doubt that these quintessential aspects of the mind should be central to the discussion. Hum Brain Mapp 37:4197-4211, 2016. © 2016 Wiley Periodicals, Inc.
Switched Systems and Motion Coordination: Combinatorial Challenges
NASA Technical Reports Server (NTRS)
Sadovsky, Alexander V.
2016-01-01
Problems of routing commercial air traffic in a terminal airspace encounter different constraints: separation assurance, aircraft performance limitations, regulations. The general setting of these problems is that of a switched control system. Such a system combines the differentiable motion of the aircraft with the combinatorial choices of choosing precedence when traffic routes merge and choosing branches when the routes diverge. This presentation gives an overview of the problem, the ATM context, related literature, and directions for future research.
Coelho, V N; Coelho, I M; Souza, M J F; Oliveira, T A; Cota, L P; Haddad, M N; Mladenovic, N; Silva, R C P; Guimarães, F G
2016-01-01
This article presents an Evolution Strategy (ES)--based algorithm, designed to self-adapt its mutation operators, guiding the search into the solution space using a Self-Adaptive Reduced Variable Neighborhood Search procedure. In view of the specific local search operators for each individual, the proposed population-based approach also fits into the context of the Memetic Algorithms. The proposed variant uses the Greedy Randomized Adaptive Search Procedure with different greedy parameters for generating its initial population, providing an interesting exploration-exploitation balance. To validate the proposal, this framework is applied to solve three different [Formula: see text]-Hard combinatorial optimization problems: an Open-Pit-Mining Operational Planning Problem with dynamic allocation of trucks, an Unrelated Parallel Machine Scheduling Problem with Setup Times, and the calibration of a hybrid fuzzy model for Short-Term Load Forecasting. Computational results point out the convergence of the proposed model and highlight its ability in combining the application of move operations from distinct neighborhood structures along the optimization. The results gathered and reported in this article represent a collective evidence of the performance of the method in challenging combinatorial optimization problems from different application domains. The proposed evolution strategy demonstrates an ability of adapting the strength of the mutation disturbance during the generations of its evolution process. The effectiveness of the proposal motivates the application of this novel evolutionary framework for solving other combinatorial optimization problems.
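The abstract describes an evolution strategy that adapts the strength of its mutation disturbance across generations. The paper's exact operators are not given here, so the following is only a minimal sketch of the standard log-normal self-adaptation mechanism that such ES variants build on; the function name and the learning-rate heuristic `tau = 1/sqrt(n)` are illustrative assumptions, not taken from the article.

```python
import math
import random

def self_adaptive_mutate(x, sigma, tau=None):
    """One ES mutation step with log-normal self-adaptation of step size.

    x     -- list of floats (candidate solution)
    sigma -- current mutation strength carried by this individual
    Returns (child, child_sigma).
    """
    n = len(x)
    if tau is None:
        tau = 1.0 / math.sqrt(n)  # common learning-rate heuristic
    # Mutate the strategy parameter first, then perturb the solution with it.
    child_sigma = sigma * math.exp(tau * random.gauss(0.0, 1.0))
    child = [xi + child_sigma * random.gauss(0.0, 1.0) for xi in x]
    return child, child_sigma

random.seed(0)
child, s = self_adaptive_mutate([0.0, 1.0, 2.0], sigma=0.5)
print(len(child), s > 0)
```

Because each individual carries its own sigma, selection implicitly favors step sizes that produce good offspring, which is the "self-adaptive" behavior the abstract refers to.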
Neural correlates of implicit and explicit combinatorial semantic processing
Graves, William W.; Binder, Jeffrey R.; Desai, Rutvik H.; Conant, Lisa L.; Seidenberg, Mark S.
2010-01-01
Language consists of sequences of words, but comprehending phrases involves more than concatenating meanings: A boat house is a shelter for boats, whereas a summer house is a house used during summer, and a ghost house is typically uninhabited. Little is known about the brain bases of combinatorial semantic processes. We performed two fMRI experiments using familiar, highly meaningful phrases (LAKE HOUSE) and unfamiliar phrases with minimal meaning created by reversing the word order of the familiar items (HOUSE LAKE). The first experiment used a 1-back matching task to assess implicit semantic processing, and the second used a classification task to engage explicit semantic processing. These conditions required processing of the same words, but with more effective combinatorial processing in the meaningful condition. The contrast of meaningful versus reversed phrases revealed activation primarily during the classification task, to a greater extent in the right hemisphere, including right angular gyrus, dorsomedial prefrontal cortex, and bilateral posterior cingulate/precuneus, areas previously implicated in semantic processing. Positive correlations of fMRI signal with lexical (word-level) frequency occurred exclusively with the 1-back task and to a greater spatial extent on the left, including left posterior middle temporal gyrus and bilateral parahippocampus. These results reveal strong effects of task demands on engagement of lexical versus combinatorial processing and suggest a hemispheric dissociation between these levels of semantic representation. PMID:20600969
Chang, Yuchao; Tang, Hongying; Cheng, Yongbo; Zhao, Qin; Li, Baoqing; Yuan, Xiaobing
2017-07-19
Routing protocols based on topology control are significantly important for improving network longevity in wireless sensor networks (WSNs). Traditional WSN routing protocols often distribute network traffic load unevenly across sensor nodes, which is not optimal for network longevity. Unlike conventional WSN routing protocols, we propose a dynamic hierarchical protocol based on combinatorial optimization (DHCO) to balance the energy consumption of sensor nodes and to improve WSN longevity. For each sensor node, the DHCO algorithm obtains the optimal route by establishing a feasible routing set instead of selecting the cluster head or the next-hop node; the process of obtaining the optimal route can be formulated as a combinatorial optimization problem. Specifically, the DHCO algorithm is carried out by the following procedures: it employs a hierarchy-based connection mechanism to construct a hierarchical network structure in which each sensor node is assigned to a specific hierarchical subset; it utilizes combinatorial optimization theory to establish the feasible routing set for each sensor node; and it takes advantage of the maximum-minimum criterion to obtain each node's optimal route to the base station. Simulation results show the effectiveness and superiority of the DHCO algorithm in comparison with state-of-the-art WSN routing algorithms, including low-energy adaptive clustering hierarchy (LEACH), hybrid energy-efficient distributed clustering (HEED), genetic protocol-based self-organizing network clustering (GASONeC), and double cost function-based routing (DCFR).
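The abstract names a maximum-minimum criterion applied over each node's feasible routing set. As a hedged illustration only (not the paper's implementation; the node ids and the residual-energy model are made up), the criterion can be read as: among the feasible routes, pick the one whose most energy-depleted node is best off.

```python
def max_min_route(feasible_routes, residual_energy):
    """Pick the route whose weakest node has the most energy (max-min criterion).

    feasible_routes  -- list of routes, each a list of node ids ending at the sink
    residual_energy  -- dict mapping node id -> remaining energy
    """
    def bottleneck(route):
        # The route is only as strong as its most depleted node.
        return min(residual_energy[n] for n in route)
    return max(feasible_routes, key=bottleneck)

energy = {"a": 5.0, "b": 1.0, "c": 3.0, "d": 4.0}
routes = [["a", "b"], ["a", "c", "d"]]
print(max_min_route(routes, energy))  # → ['a', 'c', 'd']
```

The longer route wins here because its bottleneck node ("c", 3.0) holds more energy than the depleted node "b" (1.0) on the shorter route, which is how such a criterion balances energy consumption.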
Emergent latent symbol systems in recurrent neural networks
NASA Astrophysics Data System (ADS)
Monner, Derek; Reggia, James A.
2012-12-01
Fodor and Pylyshyn [(1988). Connectionism and cognitive architecture: A critical analysis. Cognition, 28(1-2), 3-71] famously argued that neural networks cannot behave systematically short of implementing a combinatorial symbol system. A recent response from Frank et al. [(2009). Connectionist semantic systematicity. Cognition, 110(3), 358-379] claimed to have trained a neural network to behave systematically without implementing a symbol system and without any in-built predisposition towards combinatorial representations. We believe systems like theirs may in fact implement a symbol system on a deeper and more interesting level: one where the symbols are latent - not visible at the level of network structure. In order to illustrate this possibility, we demonstrate our own recurrent neural network that learns to understand sentence-level language in terms of a scene. We demonstrate our model's learned understanding by testing it on novel sentences and scenes. By paring down our model into an architecturally minimal version, we demonstrate how it supports combinatorial computation over distributed representations by using the associative memory operations of Vector Symbolic Architectures. Knowledge of the model's memory scheme gives us tools to explain its errors and construct superior future models. We show how the model designs and manipulates a latent symbol system in which the combinatorial symbols are patterns of activation distributed across the layers of a neural network, instantiating a hybrid of classical symbolic and connectionist representations that combines advantages of both.
Combinatorial Pharmacophore-Based 3D-QSAR Analysis and Virtual Screening of FGFR1 Inhibitors
Zhou, Nannan; Xu, Yuan; Liu, Xian; Wang, Yulan; Peng, Jianlong; Luo, Xiaomin; Zheng, Mingyue; Chen, Kaixian; Jiang, Hualiang
2015-01-01
The fibroblast growth factor/fibroblast growth factor receptor (FGF/FGFR) signaling pathway plays crucial roles in cell proliferation, angiogenesis, migration, and survival. Aberration in FGFRs correlates with several malignancies and disorders. FGFRs have proved to be attractive targets for therapeutic intervention in cancer, and it is of high interest to find FGFR inhibitors with novel scaffolds. In this study, a combinatorial three-dimensional quantitative structure-activity relationship (3D-QSAR) model was developed based on previously reported FGFR1 inhibitors with diverse structural skeletons. This model was evaluated for its prediction performance on a diverse test set containing 232 FGFR inhibitors, and it yielded a SD value of 0.75 pIC50 units from measured inhibition affinities and a Pearson’s correlation coefficient R2 of 0.53. This result suggests that the combinatorial 3D-QSAR model could be used to search for new FGFR1 hit structures and predict their potential activity. To further evaluate the performance of the model, a decoy set validation was used to measure the efficiency of the model by calculating EF (enrichment factor). Based on the combinatorial pharmacophore model, a virtual screening against SPECS database was performed. Nineteen novel active compounds were successfully identified, which provide new chemical starting points for further structural optimization of FGFR1 inhibitors. PMID:26110383
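The abstract evaluates its pharmacophore model with a decoy-set validation measured by the enrichment factor (EF). EF has a standard definition, sketched below with toy numbers; the dataset sizes and hit counts are illustrative, not figures from the study.

```python
def enrichment_factor(ranked_is_active, top_fraction=0.01):
    """EF = (hit rate in the top X% of the ranking) / (hit rate expected at random).

    ranked_is_active -- list of booleans ordered by predicted score, best first
    """
    n = len(ranked_is_active)
    n_top = max(1, int(n * top_fraction))
    hits_top = sum(ranked_is_active[:n_top])
    hits_all = sum(ranked_is_active)
    return (hits_top / n_top) / (hits_all / n)

# Toy screen: 1000 compounds, 10 actives, 5 of them recovered in the top 1%.
ranking = [True] * 5 + [False] * 5 + [False] * 985 + [True] * 5
print(enrichment_factor(ranking, 0.01))  # → 50.0
```

An EF of 1 means the model does no better than random selection; the higher the EF at a small top fraction, the more useful the model is for prioritizing compounds in a virtual screen.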
Combinatorial treatments enhance recovery following facial nerve crush.
Sharma, Nijee; Moeller, Carl W; Marzo, Sam J; Jones, Kathryn J; Foecking, Eileen M
2010-08-01
To investigate the effects of various combinatorial treatments, consisting of a tapering dose of prednisone (P), a brief period of electrical nerve stimulation (ES), and systemic testosterone propionate (TP), on improving functional recovery following an intratemporal facial nerve crush injury. Prospective, controlled animal study. After a right intratemporal facial nerve crush, adult male Sprague-Dawley rats were divided into the following eight treatment groups: 1) no treatment, 2) P only, 3) ES only, 4) ES + P, 5) TP only, 6) TP + P, 7) ES + TP, and 8) ES + TP + P (n = 4-8 per group). Recovery of the eyeblink reflex and of vibrissae orientation and movement was assessed. Changes in the peak amplitude and latency of the evoked response to facial nerve stimulation were also recorded weekly. Brief ES of the proximal nerve stump most effectively accelerated the initiation of functional recovery. ES or TP treatments also enhanced recovery of some functional parameters more than P treatment. When administered alone, none of the three treatments improved recovery of complete facial function. Only the combinatorial treatment of ES + TP, regardless of the presence of P, accelerated complete functional recovery and the return of normal motor nerve conduction. Our findings suggest that a combinatorial treatment strategy using brief ES and TP together promises to be an effective therapeutic intervention for promoting regeneration following facial nerve injury. Administration of P neither augments nor hinders recovery.
Attitudes of Korean adults towards human dignity: A Q methodology approach
Jo, Kae Hwa; An, Gyeong-Ju; Doorenbos, Ardith Z.
2013-01-01
Aim: The aim of this study was to identify the perceived attitudes of Korean adults towards human dignity in order to determine the relationship of human dignity to its social and cultural background. Methods: The Q methodology research technique was used to explore perceived attitude typology on the basis of the respondents’ ranking order for different statements. A convenience sampling method was used to select 40 Korean adults who were interested in human dignity to create statements. From the questionnaires, in-depth interviews, and a literature review, a total of 158 statements was obtained. The final 34 Q samples were selected from a review by two nursing professors and a Q methodology expert. Moreover, 38 respondents participated as P samples by sorting 34 Q statements on a nine-point normal distribution scale. The data were analyzed by using the QUANL software package. Results: The following four types of attitudes about human dignity were identified in Korea: a happiness-oriented–self-pursuit type, relationship-oriented–self-recognition type, reflection-oriented–self-unification type, and discrimination-oriented–self-maintenance type. Conclusions: The results indicate that approaches to developing human dignity education need to take this typology into account and the characteristics of the participants who fall into each category. These results provide general guidelines to understand Korean values for professional practice in various healthcare settings. PMID:22583944
Australian Higher Education Reforms--Unification or Diversification?
ERIC Educational Resources Information Center
Coombe, Leanne
2015-01-01
The higher education policy of the previous Australian government aimed to achieve an internationally competitive higher education sector while expanding access opportunities to all Australians. This policy agenda closely reflects global trends that focus on achieving both quality and equity objectives. In this paper, the formulation and…
Explanation and Prediction: Building a Unified Theory of Librarianship, Concept and Review.
ERIC Educational Resources Information Center
McGrath, William E.
2002-01-01
Develops a comprehensive, unified, explanatory theory of librarianship by first making an analogy to the unification of the fundamental forces of nature. Topics include dependent and independent variables; publishing; acquisitions; classification and organization of knowledge; storage, preservation, and collection management; collections; and…
How Is the Ideal Gas Law Explanatory?
ERIC Educational Resources Information Center
Woody, Andrea I.
2013-01-01
Using the ideal gas law as a comparative example, this essay reviews contemporary research in philosophy of science concerning scientific explanation. It outlines the inferential, causal, unification, and erotetic conceptions of explanation and discusses an alternative project, the functional perspective. In each case, the aim is to highlight…
A Re-Unification of Two Competing Models for Document Retrieval.
ERIC Educational Resources Information Center
Bodoff, David
1999-01-01
Examines query-oriented versus document-oriented information retrieval and feedback learning. Highlights include a reunification of the two approaches for probabilistic document retrieval and for vector space model (VSM) retrieval; learning in VSM and in probabilistic models; multi-dimensional scaling; and ongoing field studies. (LRW)
Chromosome structure inside the nucleus.
Swedlow, J R; Agard, D A; Sedat, J W
1993-06-01
Recent in situ three-dimensional structural studies have provided a new model for the 30 nm chromatin fiber. In addition, research during the past year has revealed some of the molecular complexity of non-histone chromosomal proteins. Still to come is the unification of molecular insights with chromosomal architecture.
NASA Astrophysics Data System (ADS)
Kawashima, Kazuhiro; Okamoto, Yuji; Annayev, Orazmuhammet; Toyokura, Nobuo; Takahashi, Ryota; Lippmaa, Mikk; Itaka, Kenji; Suzuki, Yoshikazu; Matsuki, Nobuyuki; Koinuma, Hideomi
2017-12-01
As an extension of combinatorial molecular layer epitaxy via ablation of perovskite oxides by a pulsed excimer laser, we have developed a laser molecular beam epitaxy (MBE) system for parallel integration of nano-scaled thin films of organic-inorganic hybrid materials. A pulsed infrared (IR) semiconductor laser was adopted for thermal evaporation of organic halide (A-site: CH3NH3I) and inorganic halide (B-site: PbI2) powder targets to deposit repeated A/B bilayer films where the thickness of each layer was controlled on molecular layer scale by programming the evaporation IR laser pulse number, length, or power. The layer thickness was monitored with an in situ quartz crystal microbalance and calibrated against ex situ stylus profilometer measurements. A computer-controlled movable mask system enabled the deposition of combinatorial thin film libraries, where each library contains a vertically homogeneous film with spatially programmable A- and B-layer thicknesses. On the composition gradient film, a hole transport Spiro-OMeTAD layer was spin-coated and dried followed by the vacuum evaporation of Ag electrodes to form the solar cell. The preliminary cell performance was evaluated by measuring I-V characteristics at seven different positions on the 12.5 mm × 12.5 mm combinatorial library sample with seven 2 mm × 4 mm slits under a solar simulator irradiation. The combinatorial solar cell library clearly demonstrated that the energy conversion efficiency sharply changes from nearly zero to 10.2% as a function of the illumination area in the library. The exploration of deposition parameters for obtaining optimum performance could thus be greatly accelerated. Since the thickness ratio of PbI2 and CH3NH3I can be freely chosen along the shadow mask movement, these experiments show the potential of this system for high-throughput screening of optimum chemical composition in the binary film library and application to halide perovskite solar cell.
The combinatorial control of alternative splicing in C. elegans
2017-01-01
Normal development requires the right splice variants to be made in the right tissues at the right time. The core splicing machinery is engaged in all splicing events, but which precise splice variant is made requires the choice between alternative splice sites—for this to occur, a set of splicing factors (SFs) must recognize and bind to short RNA motifs in the pre-mRNA. In C. elegans, there is known to be extensive variation in splicing patterns across development, but little is known about the targets of each SF or how multiple SFs combine to regulate splicing. Here we combine RNA-seq with in vitro binding assays to study how 4 different C. elegans SFs, ASD-1, FOX-1, MEC-8, and EXC-7, regulate splicing. The 4 SFs chosen all have well-characterised biology and well-studied loss-of-function genetic alleles, and all contain RRM domains. Intriguingly, while the SFs we examined have varied roles in C. elegans development, they show an unexpectedly high overlap in their targets. We also find that binding sites for these SFs occur on the same pre-mRNAs more frequently than expected suggesting extensive combinatorial control of splicing. We confirm that regulation of splicing by multiple SFs is often combinatorial and show that this is functionally significant. We also find that SFs appear to combine to affect splicing in two modes—they either bind in close proximity within the same intron or they appear to bind to separate regions of the intron in a conserved order. Finally, we find that the genes whose splicing are regulated by multiple SFs are highly enriched for genes involved in the cytoskeleton and in ion channels that are key for neurotransmission. Together, this shows that specific classes of genes have complex combinatorial regulation of splicing and that this combinatorial regulation is critical for normal development to occur. PMID:29121637
ChIP-less analysis of chromatin states.
Su, Zhangli; Boersma, Melissa D; Lee, Jin-Hee; Oliver, Samuel S; Liu, Shichong; Garcia, Benjamin A; Denu, John M
2014-01-01
Histone post-translational modifications (PTMs) are key epigenetic regulators in chromatin-based processes. Increasing evidence suggests that vast combinations of PTMs exist within chromatin histones. These complex patterns, rather than individual PTMs, are thought to define functional chromatin states. However, the ability to interrogate combinatorial histone PTM patterns at the nucleosome level has been limited by the lack of direct molecular tools. Here we demonstrate an efficient, quantitative, antibody-free, chromatin immunoprecipitation-less (ChIP-less) method for interrogating diverse epigenetic states. At the heart of the workflow are recombinant chromatin reader domains, which target distinct chromatin states with combinatorial PTM patterns. Utilizing a newly designed combinatorial histone peptide microarray, we showed that three reader domains (ATRX-ADD, ING2-PHD and AIRE-PHD) displayed greater specificity towards combinatorial PTM patterns than corresponding commercial histone antibodies. Such specific recognitions were employed to develop a chromatin reader-based affinity enrichment platform (matrix-assisted reader chromatin capture, or MARCC). We successfully applied the reader-based platform to capture unique chromatin states, which were quantitatively profiled by mass spectrometry to reveal interconnections between nucleosomal histone PTMs. Specifically, a highly enriched signature that harbored H3K4me0, H3K9me2/3, H3K79me0 and H4K20me2/3 within the same nucleosome was identified from chromatin enriched by ATRX-ADD. This newly reported PTM combination was enriched in heterochromatin, as revealed by the associated DNA. Our results suggest the broad utility of recombinant reader domains as an enrichment tool specific to combinatorial PTM patterns, which are difficult to probe directly by antibody-based approaches. The reader affinity platform is compatible with several downstream analyses to investigate the physical coexistence of nucleosomal PTM states associated with specific genomic loci. Collectively, the reader-based workflow will greatly facilitate our understanding of how distinct chromatin states and reader domains function in gene regulatory mechanisms.
Novel Modeling of Combinatorial miRNA Targeting Identifies SNP with Potential Role in Bone Density
Coronnello, Claudia; Hartmaier, Ryan; Arora, Arshi; Huleihel, Luai; Pandit, Kusum V.; Bais, Abha S.; Butterworth, Michael; Kaminski, Naftali; Stormo, Gary D.; Oesterreich, Steffi; Benos, Panayiotis V.
2012-01-01
MicroRNAs (miRNAs) are post-transcriptional regulators that bind to their target mRNAs through base complementarity. Predicting miRNA targets is a challenging task, and various studies have shown that existing algorithms suffer from a high number of false predictions and low-to-moderate overlap in their predictions. Until recently, very few algorithms considered the dynamic nature of the interactions, including the effect of less specific interactions, the miRNA expression level, and the effect of combinatorial miRNA binding. Addressing these issues can result in more accurate miRNA:mRNA modeling with many applications, including efficient miRNA-related SNP evaluation. We present a novel thermodynamic model based on the Fermi-Dirac equation that incorporates miRNA expression in the prediction of target occupancy, and we show that it improves the performance of two popular single-miRNA target finders. Modeling combinatorial miRNA targeting is a natural extension of this model. Two other algorithms showed improved prediction efficiency when combinatorial binding models were considered. ComiR (Combinatorial miRNA targeting), a novel algorithm we developed, incorporates the improved predictions of the four target finders into a single probabilistic score using ensemble learning. Combining target scores of multiple miRNAs using ComiR improves predictions over the naïve method for target combination. The ComiR scoring scheme can be used for identification of SNPs affecting miRNA binding. As proof of principle, ComiR identified rs17737058 as disruptive to the miR-488-5p:NCOA1 interaction, which we confirmed in vitro. We also found rs17737058 to be significantly associated with decreased bone mineral density (BMD) in two independent cohorts, indicating that the miR-488-5p/NCOA1 regulatory axis is likely critical in maintaining BMD in women. With the increasing availability of comprehensive high-throughput datasets from patients, ComiR is expected to become an essential tool for miRNA-related studies. PMID:23284279
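The occupancy model above can be sketched as a Fermi-Dirac form. A minimal sketch: the energy scale, the expression-dependent chemical potential, and all parameter values below are illustrative assumptions, not ComiR's actual parameterization.

```python
import math

def site_occupancy(delta_g, mu, kT=0.593):
    """Fermi-Dirac-style probability that a miRNA occupies a target site.

    delta_g: duplex binding free energy in kcal/mol (more negative = tighter)
    mu:      effective chemical potential, assumed to rise with the miRNA's
             expression level (hypothetical parameterization)
    kT:      thermal energy near 37 C, roughly 0.593 kcal/mol
    """
    return 1.0 / (1.0 + math.exp((delta_g - mu) / kT))

# Occupancy of the same site at low vs. high miRNA expression
low = site_occupancy(-8.0, mu=-12.0)
high = site_occupancy(-8.0, mu=-4.0)
```

When the binding energy equals the chemical potential, the occupancy is exactly 1/2, the hallmark of the Fermi-Dirac form; raising expression (mu) pushes the site toward saturation.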
A Combinatorial Platform for the Optimization of Peptidomimetic Methyl-Lysine Reader Antagonists
NASA Astrophysics Data System (ADS)
Barnash, Kimberly D.
Post-translational modification of histone N-terminal tails mediates chromatin compaction and, consequently, DNA replication, transcription, and repair. While numerous post-translational modifications decorate histone tails, lysine methylation is an abundant mark important for both gene activation and repression. Methyl-lysine (Kme) readers function through binding mono-, di-, or trimethyl-lysine. Chemical intervention of Kme readers faces numerous challenges due to the broad surface-groove interactions between readers and their cognate histone peptides; yet, the increasing interest in understanding chromatin-modifying complexes suggests tractable lead compounds for Kme readers are critical for elucidating the mechanisms of chromatin dysregulation in disease states and validating the druggability of these domains and complexes. The successful discovery of a peptide-derived chemical probe, UNC3866, for the Polycomb repressive complex 1 (PRC1) chromodomain Kme readers has proven the potential for selective peptidomimetic inhibition of reader function. Unfortunately, the systematic modification of peptides-to-peptidomimetics is a costly and inefficient strategy for target-class hit discovery against Kme readers. Through the exploration of biased chemical space via combinatorial on-bead libraries, we have developed two concurrent methodologies for Kme reader chemical probe discovery. We employ biased peptide combinatorial libraries as a hit discovery strategy with subsequent optimization via iterative targeted libraries. Peptide-to-peptidomimetic optimization through targeted library design was applied based on structure-guided library design around the interaction of the endogenous peptide ligand with three target Kme readers. 
Efforts targeting the WD40 reader EED led to the discovery of the 3-mer peptidomimetic ligand UNC5115 while combinatorial repurposing of UNC3866 for off-target chromodomains resulted in the discovery of UNC4991, a CDYL/2-selective ligand, and UNC4848, a MPP8 and CDYL/2 ligand. Ultimately, our efforts demonstrate the generalizability of a peptidomimetic combinatorial platform for the optimization of Kme reader ligands in a target class manner.
Kawashima, Kazuhiro; Okamoto, Yuji; Annayev, Orazmuhammet; Toyokura, Nobuo; Takahashi, Ryota; Lippmaa, Mikk; Itaka, Kenji; Suzuki, Yoshikazu; Matsuki, Nobuyuki; Koinuma, Hideomi
2017-01-01
As an extension of combinatorial molecular layer epitaxy via ablation of perovskite oxides by a pulsed excimer laser, we have developed a laser molecular beam epitaxy (MBE) system for parallel integration of nano-scaled thin films of organic-inorganic hybrid materials. A pulsed infrared (IR) semiconductor laser was adopted for thermal evaporation of organic halide (A-site: CH 3 NH 3 I) and inorganic halide (B-site: PbI 2 ) powder targets to deposit repeated A/B bilayer films where the thickness of each layer was controlled on molecular layer scale by programming the evaporation IR laser pulse number, length, or power. The layer thickness was monitored with an in situ quartz crystal microbalance and calibrated against ex situ stylus profilometer measurements. A computer-controlled movable mask system enabled the deposition of combinatorial thin film libraries, where each library contains a vertically homogeneous film with spatially programmable A- and B-layer thicknesses. On the composition gradient film, a hole transport Spiro-OMeTAD layer was spin-coated and dried followed by the vacuum evaporation of Ag electrodes to form the solar cell. The preliminary cell performance was evaluated by measuring I - V characteristics at seven different positions on the 12.5 mm × 12.5 mm combinatorial library sample with seven 2 mm × 4 mm slits under a solar simulator irradiation. The combinatorial solar cell library clearly demonstrated that the energy conversion efficiency sharply changes from nearly zero to 10.2% as a function of the illumination area in the library. The exploration of deposition parameters for obtaining optimum performance could thus be greatly accelerated. Since the thickness ratio of PbI 2 and CH 3 NH 3 I can be freely chosen along the shadow mask movement, these experiments show the potential of this system for high-throughput screening of optimum chemical composition in the binary film library and application to halide perovskite solar cell.
NASA Astrophysics Data System (ADS)
Evans, Garrett Nolan
In this work, I present two projects that both contribute to the aim of discovering how intelligence manifests in the brain. The first project is a method for analyzing recorded neural signals, which takes the form of a convolution-based metric on neural membrane potential recordings. Relying only on integral and algebraic operations, the metric compares the timing and number of spikes within recordings as well as the recordings' subthreshold features: summarizing differences in these with a single "distance" between the recordings. Like van Rossum's (2001) metric for spike trains, the metric is based on a convolution operation that it performs on the input data. The kernel used for the convolution is carefully chosen such that it produces a desirable frequency space response and, unlike van Rossum's kernel, causes the metric to be first order both in differences between nearby spike times and in differences between same-time membrane potential values: an important trait. The second project is a combinatorial syntax method for connectionist semantic network encoding. Combinatorial syntax has been a point on which those who support a symbol-processing view of intelligent processing and those who favor a connectionist view have had difficulty seeing eye-to-eye. Symbol-processing theorists have persuasively argued that combinatorial syntax is necessary for certain intelligent mental operations, such as reasoning by analogy. Connectionists have focused on the versatility and adaptability offered by self-organizing networks of simple processing units. With this project, I show that there is a way to reconcile the two perspectives and to ascribe a combinatorial syntax to a connectionist network. The critical principle is to interpret nodes, or units, in the connectionist network as bound integrations of the interpretations for nodes that they share links with. 
Nodes need not correspond exactly to neurons and may correspond instead to distributed sets, or assemblies, of neurons.
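A minimal sketch of such a convolution-based distance, using a causal exponential kernel in the spirit of van Rossum (2001); the kernel shape, time constant, and discretization here are illustrative choices, not the author's exact metric.

```python
import math

def smooth(trace, tau=5.0, dt=1.0):
    """Discrete causal convolution of a sampled membrane-potential trace
    with the decaying kernel exp(-t/tau)."""
    decay = math.exp(-dt / tau)
    out, acc = [], 0.0
    for v in trace:
        acc = acc * decay + v * dt
        out.append(acc)
    return out

def vr_distance(a, b, tau=5.0, dt=1.0):
    """L2 distance between the smoothed traces: sensitive both to spike
    timing and count and to subthreshold waveform differences."""
    sa, sb = smooth(a, tau, dt), smooth(b, tau, dt)
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(sa, sb)) * dt)

# Two traces with a single unit 'spike' at different times
a = [0.0] * 3 + [1.0] + [0.0] * 16
b = [0.0] * 8 + [1.0] + [0.0] * 11
```

Identical recordings are at distance zero; shifting a spike, adding one, or changing a subthreshold value all produce a positive distance.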
Zhao, Zheng; Bai, Jing; Wu, Aiwei; Wang, Yuan; Zhang, Jinwen; Wang, Zishan; Li, Yongsheng; Xu, Juan; Li, Xia
2015-01-01
Long non-coding RNAs (lncRNAs) are emerging as key regulators of diverse biological processes and diseases. However, the combinatorial effects of these molecules in a specific biological function are poorly understood. Identifying co-expressed protein-coding genes of lncRNAs would provide ample insight into lncRNA functions. To facilitate such an effort, we have developed Co-LncRNA, a web-based computational tool that allows users to identify GO annotations and KEGG pathways that may be affected by co-expressed protein-coding genes of a single lncRNA or of multiple lncRNAs. LncRNA co-expressed protein-coding genes were first identified in publicly available human RNA-Seq datasets, including 241 datasets across 6560 total individuals representing 28 tissue types/cell lines. The lncRNA combinatorial effects in given GO annotations or KEGG pathways are then taken into account by the simultaneous analysis of multiple lncRNAs in user-selected individual or multiple datasets, realized by enrichment analysis. In addition, the software provides a graphical overview of pathways that are modulated by lncRNAs, as well as a specific tool to display the relevant networks between lncRNAs and their co-expressed protein-coding genes. Co-LncRNA also supports users in uploading their own lncRNA and protein-coding gene expression profiles to investigate lncRNA combinatorial effects. It will be continuously updated with more human RNA-Seq datasets on an annual basis. Taken together, Co-LncRNA provides a web-based application for investigating lncRNA combinatorial effects, which could shed light on their biological roles and could be a valuable resource for this community. Database URL: http://www.bio-bigdata.com/Co-LncRNA/. © The Author(s) 2015. Published by Oxford University Press.
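The enrichment-analysis step can be sketched with a one-sided hypergeometric test; whether Co-LncRNA uses exactly this null model is an assumption here.

```python
from math import comb

def enrichment_p(k, n, K, N):
    """P(X >= k): probability of seeing at least k pathway genes among the
    n co-expressed genes of an lncRNA, when K of the N annotated genes
    belong to the pathway (hypergeometric null)."""
    upper = min(n, K)
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, upper + 1)) / comb(N, n)
```

With no required overlap the tail probability is 1; a perfect overlap of 5 co-expressed genes with a 10-gene pathway out of 100 annotated genes is vanishingly unlikely by chance.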
1985-10-01
Key words: regions, Com-Geom, region identification, GIFT, materials. Targets are described using the technique of combinatorial geometry (Com-Geom); the Com-Geom data are used as input to the Geometric Information for Targets (GIFT) computer code. This report documents the combinatorial geometry (Com-Geom) target description data that serve as the input data for the GIFT code.
Combinatorial interpretation of Haldane-Wu fractional exclusion statistics.
Aringazin, A K; Mazhitov, M I
2002-08-01
Assuming that the maximal allowed number of identical particles in a state is an integer parameter, q, we derive the statistical weight and analyze the associated equation that defines the statistical distribution. The derived distribution covers the Fermi-Dirac and Bose-Einstein distributions in the particular cases q = 1 and q → ∞ (n_i/q → 1), respectively. We show that the derived statistical weight provides a natural combinatorial interpretation of Haldane-Wu fractional exclusion statistics, and present exact solutions of the distribution equation.
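The two limits can be checked numerically. The closed form below is the standard Gentile-statistics mean occupation for a maximal occupancy q, used here as an assumed stand-in for the paper's distribution:

```python
import math

def occupation(x, q):
    """Mean occupation number when at most q identical particles may share
    a state (Gentile-type statistics); x = (epsilon - mu)/kT, x > 0 assumed."""
    return 1.0 / (math.exp(x) - 1.0) - (q + 1) / (math.exp((q + 1) * x) - 1.0)

fermi = 1.0 / (math.exp(1.0) + 1.0)  # q = 1 limit (Fermi-Dirac)
bose = 1.0 / (math.exp(1.0) - 1.0)   # q -> infinity limit (Bose-Einstein)
```

At q = 1 the expression collapses algebraically to the Fermi-Dirac distribution, and for large q the correction term vanishes, recovering Bose-Einstein, exactly the limits stated in the abstract.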
NASA Astrophysics Data System (ADS)
Xue, Wei; Wang, Qi; Wang, Tianyu
2018-04-01
This paper presents an improved parallel combinatory spread spectrum (PC/SS) communication system based on a double information matching (DIM) method. Compared with the conventional PC/SS system, the new model retains the advantages of high transmission speed, large information capacity, and high security. The traditional system, however, suffers from a high bit error rate (BER) because of its data-sequence mapping algorithm; by optimizing that mapping algorithm, the model presented here achieves a lower BER and higher efficiency.
Mapping protein-protein interactions with phage-displayed combinatorial peptide libraries.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kay, B. K.; Castagnoli, L.; Biosciences Division
This unit describes the process and analysis of affinity selecting bacteriophage M13 from libraries displaying combinatorial peptides fused to either a minor or major capsid protein. Direct affinity selection uses target protein bound to a microtiter plate followed by purification of selected phage by ELISA. Alternatively, there is a bead-based affinity selection method. These methods allow one to readily isolate peptide ligands that bind to a protein target of interest and use the consensus sequence to search proteomic databases for putative interacting proteins.
Distributed Combinatorial Optimization Using Privacy on Mobile Phones
NASA Astrophysics Data System (ADS)
Ono, Satoshi; Katayama, Kimihiro; Nakayama, Shigeru
This paper proposes a method for distributed combinatorial optimization that uses mobile phones as computers. In the proposed method, an ordinary computer generates solution candidates and mobile phones evaluate them by consulting private information and preferences stored locally. Users therefore do not have to send private data to any other computer and need not refrain from entering their preferences, so they can obtain satisfactory solutions. Experimental results showed that the proposed method solved room assignment problems without sending users' private information to a server.
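The division of labor can be sketched as follows: the server proposes candidate assignments without seeing any preferences, each "phone" scores candidates against data that never leaves the device, and only scalar scores are aggregated. Function names and the scoring rule are hypothetical.

```python
import random

def propose(n_rooms, n_users, n_candidates, rng):
    """Server side: generate random room assignments using no private data."""
    return [[rng.randrange(n_rooms) for _ in range(n_users)]
            for _ in range(n_candidates)]

def local_score(assignment, user, preferences):
    """Phone side: one user scores a candidate against preferences that
    stay on the device (hypothetical scoring rule: 1.0 per matched wish)."""
    return preferences[user].get(assignment[user], 0.0)

def select_best(candidates, all_prefs):
    """Aggregate only the scalar scores returned by each phone."""
    def total(a):
        return sum(local_score(a, u, all_prefs) for u in range(len(a)))
    return max(candidates, key=total)
```

In a real deployment `all_prefs` would be distributed across devices; here it is a single structure only so the sketch runs in one process.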
Combinatorial FSK modulation for power-efficient high-rate communications
NASA Technical Reports Server (NTRS)
Wagner, Paul K.; Budinger, James M.; Vanderaar, Mark J.
1991-01-01
Deep-space and satellite communications systems must be capable of conveying high-rate data accurately with low transmitter power, often through dispersive channels. A class of noncoherent Combinatorial Frequency Shift Keying (CFSK) modulation schemes is investigated which address these needs. The bit error rate performance of this class of modulation formats is analyzed and compared to the more traditional modulation types. Candidate modulator, demodulator, and digital signal processing (DSP) hardware structures are examined in detail. System-level issues are also discussed.
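The information capacity of choosing k of M tones per symbol, together with one hypothetical index-to-tone-set mapper (lexicographic "combinadic" unranking), can be sketched as below; the actual CFSK mapping in the paper may differ.

```python
from math import comb, floor, log2

def bits_per_symbol(M, k):
    """Bits conveyed by selecting an unordered set of k active tones
    out of M available frequencies."""
    return floor(log2(comb(M, k)))

def index_to_tones(idx, M, k):
    """Unrank idx (0 <= idx < C(M,k)) to a k-subset of tone indices in
    lexicographic order (one possible modulator mapping)."""
    tones, remaining, kk = [], idx, k
    for tone in range(M):
        if kk == 0:
            break
        c = comb(M - tone - 1, kk - 1)  # subsets that include this tone next
        if remaining < c:
            tones.append(tone)
            kk -= 1
        else:
            remaining -= c
    return tones
```

For M = 8 tones with k = 3 active, C(8,3) = 56 tone sets carry floor(log2 56) = 5 bits per symbol, more than binary FSK at the same tone count.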
Statistical Mechanics of Combinatorial Auctions
NASA Astrophysics Data System (ADS)
Galla, Tobias; Leone, Michele; Marsili, Matteo; Sellitto, Mauro; Weigt, Martin; Zecchina, Riccardo
2006-09-01
Combinatorial auctions are formulated as frustrated lattice gases on sparse random graphs, allowing the determination of the optimal revenue by methods of statistical physics. Transitions between computationally easy and hard regimes are found and interpreted in terms of the geometric structure of the space of solutions. We introduce an iterative algorithm to solve intermediate and large instances, and discuss competing states of optimal revenue and maximal number of satisfied bidders. The algorithm can be generalized to the hard phase and to more sophisticated auction protocols.
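Winner determination, the optimization problem underlying the revenue analysis, can be solved exactly for small instances by brute force; the bid data below are invented for illustration.

```python
from itertools import combinations

def best_revenue(bids):
    """Exact winner determination: choose a set of bids on pairwise
    disjoint item bundles that maximizes total revenue.
    Each bid is (set_of_item_ids, price)."""
    best = 0.0
    for r in range(len(bids) + 1):
        for chosen in combinations(bids, r):
            items = [i for bundle, _ in chosen for i in bundle]
            if len(items) == len(set(items)):  # no item sold twice
                best = max(best, sum(price for _, price in chosen))
    return best

# Four bids over three items
bids = [({0, 1}, 3.0), ({1, 2}, 4.0), ({0}, 1.5), ({2}, 2.0)]
```

The exhaustive search is exponential in the number of bids, which is precisely why the easy/hard regimes found by the statistical-physics treatment matter.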
Aerospace applications of integer and combinatorial optimization
NASA Technical Reports Server (NTRS)
Padula, S. L.; Kincaid, R. K.
1995-01-01
Research supported by NASA Langley Research Center includes many applications of aerospace design optimization and is conducted by teams of applied mathematicians and aerospace engineers. This paper investigates the benefits from this combined expertise in solving combinatorial optimization problems. Applications range from the design of large space antennas to interior noise control. A typical problem, for example, seeks the optimal locations for vibration-damping devices on a large space structure and is expressed as a mixed/integer linear programming problem with more than 1500 design variables.
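The damper-placement problem can be caricatured as choosing k of n candidate locations; the real formulation is a mixed-integer linear program with over 1500 variables and coupled structural dynamics, so the independent per-location benefits assumed below are a deliberate simplification.

```python
from itertools import combinations

def best_placement(benefit, k):
    """Exhaustive search for the k damper locations with the largest
    combined benefit; a toy stand-in for the mixed-integer LP.
    `benefit[i]` is an assumed independent damping gain at location i."""
    return max(combinations(range(len(benefit)), k),
               key=lambda locs: sum(benefit[i] for i in locs))
```

Even this toy version grows as C(n, k), which motivates the integer-programming and heuristic machinery the paper describes.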
NASA Astrophysics Data System (ADS)
Kang, Angray S.; Barbas, Carlos F.; Janda, Kim D.; Benkovic, Stephen J.; Lerner, Richard A.
1991-05-01
We describe a method based on a phagemid vector with helper phage rescue for the construction and rapid analysis of combinatorial antibody Fab libraries. This approach should allow the generation and selection of many monoclonal antibodies. Antibody genes are expressed in concert with phage morphogenesis, thereby allowing incorporation of functional Fab molecules along the surface of filamentous phage. The power of the method depends upon the linkage of recognition and replication functions and is not limited to antibody molecules.
NASA Astrophysics Data System (ADS)
Godovsky, D.; Chen, L.; Petterson, L.; Inganäs, O.
2000-11-01
The influence of a zinc phthalocyanine (ZnPc) admixture to fullerene layers on top of PTOPT on photovoltaic cell performance was studied. To investigate all possible combinations of ZnPc and C60, a combinatorial technique was developed, consisting of thermal co-evaporation of ZnPc and C60 from two different boats. A significant increase in solar-cell photocurrent was observed, originating from the ZnPc absorbance bands, especially for layers containing a 1:1 molar ratio of the components.
Structural Equation Model Trees
ERIC Educational Resources Information Center
Brandmaier, Andreas M.; von Oertzen, Timo; McArdle, John J.; Lindenberger, Ulman
2013-01-01
In the behavioral and social sciences, structural equation models (SEMs) have become widely accepted as a modeling tool for the relation between latent and observed variables. SEMs can be seen as a unification of several multivariate analysis techniques. SEM Trees combine the strengths of SEMs and the decision tree paradigm by building tree…
New Approaches to a Subject of Anthropocentric Linguistics
ERIC Educational Resources Information Center
Lee, Valentine S.; Tumanova, Ainakul B.; Salkhanova, Zhanat H.
2016-01-01
The article studies theoretical issues of modern anthropocentric paradigm of scientific knowledge from the history of anthropocentric linguistics development as a special field of language science. The purpose of this study is to answer the question about human influence on the semiotic system. The material result is the unification of specific…
1980-10-31
Approved for public release; distribution unlimited. … as well as by uncertainties in nonlinear devices, unknown nonlinearities, nonuniform materials (soil and concrete), and many other factors.
Nationwide, there is a strong need to streamline methods for assessing impairment of surface waters (305b listings), diagnosing cause of biological impairment (303d listings), estimating total maximum daily loads (TMDLs), and/or prioritizing watershed restoration activities (Unif...
Unification of the macro- and microbiome in trophic ecology
USDA-ARS?s Scientific Manuscript database
Biological control is a key part of virtually any IPM program, and microbial bio-control agents represent particularly effective agents because of their capacity to be applied via conventional spray application methods. We are showing that fungi function just as arthropods do in the food web—the fun...
Avionics System Architecture for NASA Orion Vehicle
NASA Technical Reports Server (NTRS)
Baggerman, Clint
2010-01-01
This viewgraph presentation reviews the Orion Crew Exploration Vehicle avionics architecture. The contents include: 1) What is Orion?; 2) Orion Concept of Operations; 3) Orion Subsystems; 4) Orion Avionics Architecture; 5) Orion Avionics-Network; 6) Orion Network Unification; 7) Orion Avionics-Integrity; 8) Orion Avionics-Partitioning; and 9) Orion Avionics-Redundancy.
Interface Architecture for Testing in Foreign Language Education
ERIC Educational Resources Information Center
Laborda, Jesus Garcia
2009-01-01
The implications of new learning environments have been far-reaching and pervasive (Plass, 1998), at least in the field of interface design both in traditional computer and mobile devices (Fallahkhair, Pemberton, & Griffiths, 2007). Given the current status of efficient models, educators need the unproven unification of interfaces and working…
The American Work Force, 1992-2005. Historical Trends, 1950-92, and Current Uncertainties.
ERIC Educational Resources Information Center
Kutscher, Ronald E.
1993-01-01
Reviews the trends of the last four decades in terms of the labor force, economics, employment by industry, and employment by occupation. Considers uncertainties surrounding projections to 2005: end of the cold war, European unification, and the North American Free Trade Agreement. (SK)
29 CFR 779.218 - Methods to accomplish “unified operation.”
Code of Federal Regulations, 2010 CFR
2010-07-01
..., join together to perform some or all of their activities as a unified business or business system. They may accomplish such unification through agreements, franchises, grants, leases, or other arrangements... others so that they constitute a single business or unified business system. Whether in any particular...
Unification of Speaker and Meaning in Language Comprehension: An fMRI Study
ERIC Educational Resources Information Center
Tesink, Cathelijne M. J. Y.; Petersson, Karl Magnus; van Berkum, Jos J. A.; van den Brink, Danielle; Buitelaar, Jan K.; Hagoort, Peter
2009-01-01
When interpreting a message, a listener takes into account several sources of linguistic and extralinguistic information. Here we focused on one particular form of extralinguistic information, certain speaker characteristics as conveyed by the voice. Using functional magnetic resonance imaging, we examined the neural structures involved in the…
Towards Automatic Threat Recognition
2006-12-01
York: Bantam. Forschungsinstitut für Kommunikation, Informationsverarbeitung und Ergonomie (FGAN), Informationstechnik und Führungssysteme (KIE): Towards Automatic Threat Recognition. Contents: preliminaries about information fusion; the system ontology; unification as processing principle; back to the example; conclusion and outlook.
2014-04-01
absolute dynamic height (ADH; in meters) from the Archiving, Validation, and Interpretation of Satellite Oceanographic data (AVISO) product [this … altimeter product was produced by the Segment Sol multimissions d'Altimétrie, d'Orbitographie et de localisation précise (Ssalto)/Data Unification and
ERIC Educational Resources Information Center
Connors, Lyndsay
An analysis of Australia's two conflicting trends in school governance and their effectiveness in meeting two major educational challenges is the purpose of this paper. Nationalization, which refers to greater centralization and increased national regulation; and privatization, which refers to decentralization, deregulation, and increased local…
Esteve, Clara; D'Amato, Alfonsina; Marina, María Luisa; García, María Concepción; Righetti, Pier Giorgio
2012-09-01
Avocado (Persea americana) proteins have scarcely been studied despite their importance, especially in food-related allergies. The proteome of avocado pulp was explored in depth by extracting proteins with capture by combinatorial peptide ligand libraries at pH 7.4 and under conditions mimicking reverse-phase capture at pH 2.2. The total number of unique gene products identified amounts to 1012 proteins, of which 174 are in common with the control, untreated sample, 190 are present only in the control, and 648 represent the new species detected via combinatorial peptide ligand libraries of all combined eluates, likely representing low-abundance proteins. Among the 1012 proteins, it was possible to identify the already known avocado allergen Pers a 1 and different proteins that may be allergens, such as a profilin, a polygalacturonase, a thaumatin-like protein, a glucanase, and an isoflavone reductase-like protein. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
A path-oriented knowledge representation system: Defusing the combinatorial system
NASA Technical Reports Server (NTRS)
Karamouzis, Stamos T.; Barry, John S.; Smith, Steven L.; Feyock, Stefan
1995-01-01
LIMAP is a programming system oriented toward efficient information manipulation over fixed finite domains, and quantification over paths and predicates. A generalization of Warshall's Algorithm to precompute paths in a sparse matrix representation of semantic nets is employed to allow questions involving paths between components to be posed and answered easily. LIMAP's ability to cache all paths between two components in a matrix cell proved to be a computational obstacle, however, when the semantic net grew to realistic size. The present paper describes a means of mitigating this combinatorial explosion to an extent that makes the use of the LIMAP representation feasible for problems of significant size. The technique we describe radically reduces the size of the search space in which LIMAP must operate; semantic nets of more than 500 nodes have been attacked successfully. Furthermore, it appears that the procedure described is applicable not only to LIMAP, but to a number of other combinatorially explosive search space problems found in AI as well.
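The path precomputation that LIMAP generalizes can be illustrated with the Boolean core of Warshall's algorithm (reachability only; LIMAP's variant also caches the paths themselves in each matrix cell):

```python
def transitive_closure(adj):
    """Warshall's algorithm on a Boolean adjacency matrix:
    adj[i][j] is True iff an edge i -> j exists; the result marks
    every pair connected by some directed path."""
    n = len(adj)
    reach = [row[:] for row in adj]
    for k in range(n):
        for i in range(n):
            if reach[i][k]:
                for j in range(n):
                    if reach[k][j]:
                        reach[i][j] = True
    return reach
```

The O(n^3) closure is what makes path queries over a semantic net answerable in constant time afterward; storing all paths rather than a single bit is what produced the combinatorial blow-up the paper mitigates.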
Dubey, Ritesh; Desiraju, Gautam R.
2015-01-01
The crystallization of 28 binary and ternary cocrystals of quercetin with dibasic coformers is analyzed in terms of a combinatorial selection from a solution of preferred molecular conformations and supramolecular synthons. The crystal structures are characterized by distinctive O—H⋯N and O—H⋯O based synthons and are classified as nonporous, porous and helical. Variability in molecular conformation and synthon structure led to an increase in the energetic and structural space around the crystallization event. This space is the crystal structure landscape of the compound and is explored by fine-tuning the experimental conditions of crystallization. In the landscape context, we develop a strategy for the isolation of ternary cocrystals with the use of auxiliary template molecules to reduce the molecular and supramolecular ‘confusion’ that is inherent in a molecule like quercetin. The absence of concomitant polymorphism in this study highlights the selectivity in conformation and synthon choice from the virtual combinatorial library in solution. PMID:26175900
Bhat, Venugopal T.; Caniard, Anne M.; Luksch, Torsten; Brenk, Ruth; Campopiano, Dominic J.; Greaney, Michael F.
2010-01-01
Dynamic covalent chemistry uses reversible chemical reactions to set up an equilibrating network of molecules at thermodynamic equilibrium, which can adjust its composition in response to any agent capable of altering the free energy of the system. When the target is a biological macromolecule, such as a protein, the process corresponds to the protein directing the synthesis of its own best ligand. Here, we demonstrate that reversible acylhydrazone formation is an effective chemistry for biological dynamic combinatorial library formation. In the presence of aniline as a nucleophilic catalyst, dynamic combinatorial libraries equilibrate rapidly at pH 6.2, are fully reversible, and may be switched on or off by means of a change in pH. We have interfaced these hydrazone dynamic combinatorial libraries with two isozymes from the glutathione S-transferase class of enzyme, and observed divergent amplification effects, where each protein selects the best-fitting hydrazone for the hydrophobic region of its active site. PMID:20489719
Turkett, Jeremy A; Bicker, Kevin L
2017-04-10
Growing prevalence of antibiotic resistant bacterial infections necessitates novel antimicrobials, which could be rapidly identified from combinatorial libraries. We report the use of the peptoid library agar diffusion (PLAD) assay to screen peptoid libraries against the ESKAPE pathogens, including the optimization of assay conditions for each pathogen. Work presented here focuses on the tailoring of combinatorial peptoid library design through a detailed study of how peptoid lipophilicity relates to antibacterial potency and mammalian cell toxicity. The information gleaned from this optimization was then applied using the aforementioned screening method to examine the relative potency of peptoid libraries against Staphylococcus aureus, Acinetobacter baumannii, and Enterococcus faecalis prior to and following functionalization with long alkyl tails. The data indicate that overall peptoid hydrophobicity and not simply alkyl tail length is strongly correlated with mammalian cell toxicity. Furthermore, this work demonstrates the utility of the PLAD assay in rapidly evaluating the effect of molecular property changes in similar libraries.
Castanotto, Daniela; Sakurai, Kumi; Lingeman, Robert; Li, Haitang; Shively, Louise; Aagaard, Lars; Soifer, Harris; Gatignol, Anne; Riggs, Arthur; Rossi, John J.
2007-01-01
Despite the great potential of RNAi, ectopic expression of shRNA or siRNAs holds the inherent risk of competition for critical RNAi components, thus altering the regulatory functions of some cellular microRNAs. In addition, specific siRNA sequences can potentially hinder incorporation of other siRNAs when used in a combinatorial approach. We show that both synthetic siRNAs and expressed shRNAs compete against each other and with the endogenous microRNAs for transport and for incorporation into the RNA induced silencing complex (RISC). The same siRNA sequences do not display competition when expressed from a microRNA backbone. We also show that TAR RNA binding protein (TRBP) is one of the sensors for selection and incorporation of the guide sequence of interfering RNAs. These findings reveal that combinatorial siRNA approaches can be problematic and have important implications for the methodology of expression and use of therapeutic interfering RNAs. PMID:17660190
Boehm, Markus; Wu, Tong-Ying; Claussen, Holger; Lemmen, Christian
2008-04-24
Large collections of combinatorial libraries are an integral element in today's pharmaceutical industry. It is of great interest to perform similarity searches against all virtual compounds that are synthetically accessible by any such library. Here we describe the successful application of a new software tool, CoLibri, on 358 combinatorial libraries based on validated reaction protocols to create a single chemistry space containing over 10^12 possible products. Similarity searching with FTrees-FS allows the systematic exploration of this space without the need to enumerate all product structures. The search result is a set of virtual hits which are synthetically accessible by one or more of the existing reaction protocols. Grouping these virtual hits by their synthetic protocols allows the rapid design and synthesis of multiple follow-up libraries. Such library ideas support hit-to-lead design efforts for tasks like follow-up from high-throughput screening hits or scaffold hopping from one hit to another attractive series.
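The size of such a chemistry space grows multiplicatively within each library and additively across libraries, which is why a few hundred protocols can reach 10^12 products that no one could enumerate. A toy calculation (pool sizes invented):

```python
def virtual_space_size(libraries):
    """Total virtual product count: for each combinatorial library,
    multiply its reagent pool sizes; then sum over all libraries."""
    total = 0
    for pools in libraries:
        n = 1
        for size in pools:
            n *= size
        total += n
    return total

# Three hypothetical two- and three-component reaction protocols
size = virtual_space_size([[1000, 2000], [500, 500, 400], [10000, 10000]])
```

Three modest libraries already span over 2 x 10^8 products, illustrating why fragment-space searching (as with FTrees-FS) beats enumeration.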
Latimer, Luke N; Dueber, John E
2017-06-01
A common challenge in metabolic engineering is rapidly identifying rate-controlling enzymes in heterologous pathways for subsequent production improvement. We demonstrate a workflow to address this challenge and apply it to improving xylose utilization in Saccharomyces cerevisiae. For eight reactions required for conversion of xylose to ethanol, we screened enzymes for functional expression in S. cerevisiae, followed by a combinatorial expression analysis to achieve pathway flux balancing and identification of limiting enzymatic activities. In the next round of strain engineering, we increased the copy number of these limiting enzymes and again tested the eight-enzyme combinatorial expression library in this new background. This workflow yielded a strain that has a ∼70% increase in biomass yield and ∼240% increase in xylose utilization. Finally, we chromosomally integrated the expression library. This library enriched for strains with multiple integrations of the pathway, which likely were the result of tandem integrations mediated by promoter homology. Biotechnol. Bioeng. 2017;114: 1301-1309. © 2017 Wiley Periodicals, Inc.
Device for preparing combinatorial libraries in powder metallurgy.
Yang, Shoufeng; Evans, Julian R G
2004-01-01
This paper describes a powder-metering, -mixing, and -dispensing mechanism that can be used as a method for producing large numbers of samples for metallurgical evaluation or electrical or mechanical testing from multicomponent metal and cermet powder systems. It is designed to make use of the same commercial powders that are used in powder metallurgy and, therefore, to produce samples that are faithful to the microstructure of finished products. The particle assemblies produced by the device could be consolidated by die pressing, isostatic pressing, laser sintering, or direct melting. The powder metering valve provides both on/off and flow rate control of dry powders in open capillaries using acoustic vibration. The valve is simple and involves no relative movement, avoiding seizure with fine powders. An orchestra of such valves can be arranged on a building platform to prepare multicomponent combinatorial libraries. As with many combinatorial devices, identification and evaluation of sources of mixing error as a function of sample size is mandatory. Such an analysis is presented.
Ashbaugh, Alyssa G.; Jiang, Xuesong; Zheng, Jesse; Tsai, Andrew S.; Kim, Woo-Shin; Thompson, John M.; Miller, Robert J.; Shahbazian, Jonathan H.; Wang, Yu; Dillen, Carly A.; Ordonez, Alvaro A.; Chang, Yong S.; Jain, Sanjay K.; Jones, Lynne C.; Sterling, Robert S.; Mao, Hai-Quan; Miller, Lloyd S.
2016-01-01
Bacterial biofilm formation is a major complication of implantable medical devices that results in therapeutically challenging chronic infections, especially in cases involving antibiotic-resistant bacteria. As an approach to prevent these infections, an electrospun composite coating comprised of poly(lactic-co-glycolic acid) (PLGA) nanofibers embedded in a poly(ε-caprolactone) (PCL) film was developed to locally codeliver combinatorial antibiotics from the implant surface. The release of each antibiotic could be adjusted by loading each drug into the different polymers or by varying PLGA:PCL polymer ratios. In a mouse model of biofilm-associated orthopedic-implant infection, three different combinations of antibiotic-loaded coatings were highly effective in preventing infection of the bone/joint tissue and implant biofilm formation and were biocompatible with enhanced osseointegration. This nanofiber composite-coating technology could be used to tailor the delivery of combinatorial antimicrobial agents from various metallic implantable devices or prostheses to effectively decrease biofilm-associated infections in patients. PMID:27791154
Ito, Yoichiro; Yamanishi, Mamoru; Ikeuchi, Akinori; Imamura, Chie; Matsuyama, Takashi
2015-01-01
Combinatorial screening used together with a broad library of gene expression cassettes is expected to produce a powerful tool for the optimization of the simultaneous expression of multiple enzymes. Recently, we proposed a highly tunable protein expression system that utilized multiple genome-integrated target genes to fine-tune enzyme expression in yeast cells. This tunable system included a library of expression cassettes each composed of three gene-expression control elements that in different combinations produced a wide range of protein expression levels. In this study, four gene expression cassettes with graded protein expression levels were applied to the expression of three cellulases: cellobiohydrolase 1, cellobiohydrolase 2, and endoglucanase 2. After combinatorial screening for transgenic yeasts simultaneously secreting these three cellulases, we obtained strains with higher cellulase expressions than a strain harboring three cellulase-expression constructs within one high-performance gene expression cassette. These results show that our method will be of broad use throughout the field of metabolic engineering. PMID:26692026
Combinatorial Strategies for the Development of Bulk Metallic Glasses
NASA Astrophysics Data System (ADS)
Ding, Shiyan
The systematic identification of multi-component alloys out of the vast composition space is still a daunting task, especially in the development of bulk metallic glasses, which are typically based on three or more elements. In order to address this challenge, combinatorial approaches have been proposed. However, previous attempts have not successfully coupled the synthesis of combinatorial libraries with high-throughput characterization methods. The goal of my dissertation is to develop efficient high-throughput characterization methods, optimized to identify glass formers systematically. Here, two innovative approaches were developed. One is to measure the nucleation temperature in parallel for up to 800 compositions. The composition with the lowest nucleation temperature is in reasonable agreement with the best-known glass-forming composition. In addition, the thermoplastic formability of a metallic glass-forming system is determined through blow molding a compositional library. Our results reveal that the composition with the largest thermoplastic deformation correlates well with the best-known formability composition. I have demonstrated both methods as powerful tools to develop new bulk metallic glasses.
Loeffler, Felix F; Foertsch, Tobias C; Popov, Roman; Mattes, Daniela S; Schlageter, Martin; Sedlmayr, Martyna; Ridder, Barbara; Dang, Florian-Xuan; von Bojničić-Kninski, Clemens; Weber, Laura K; Fischer, Andrea; Greifenstein, Juliane; Bykovskaya, Valentina; Buliev, Ivan; Bischoff, F Ralf; Hahn, Lothar; Meier, Michael A R; Bräse, Stefan; Powell, Annie K; Balaban, Teodor Silviu; Breitling, Frank; Nesterov-Mueller, Alexander
2016-06-14
Laser writing is used to structure surfaces in many different ways in materials and life sciences. However, combinatorial patterning applications are still limited. Here we present a method for cost-efficient combinatorial synthesis of very-high-density peptide arrays with natural and synthetic monomers. A laser automatically transfers nanometre-thin solid material spots from different donor slides to an acceptor. Each donor bears a thin polymer film, embedding one type of monomer. Coupling occurs in a separate heating step, where the matrix becomes viscous and building blocks diffuse and couple to the acceptor surface. Furthermore, we can consecutively deposit two material layers of activation reagents and amino acids. Subsequent heat-induced mixing facilitates an in situ activation and coupling of the monomers. This allows us to incorporate building blocks with click chemistry compatibility or a large variety of commercially available non-activated, for example, posttranslationally modified building blocks into the array's peptides with >17,000 spots per cm(2).
Thermoelectric properties of the LaCoO3-LaCrO3 system using a high-throughput combinatorial approach
NASA Astrophysics Data System (ADS)
Talley, K. R.; Barron, S. C.; Nguyen, N.; Wong-Ng, W.; Martin, J.; Zhang, Y. L.; Song, X.
2017-02-01
A combinatorial film of the LaCo1-xCrxO3 system was fabricated using the LaCoO3 and LaCrO3 targets at the NIST Pulsed Laser Deposition (PLD) facility. As the ionic size of Cr3+ is greater than that of Co3+, the unit cell volume of the series increases with increasing x. Using a custom screening tool, the Seebeck coefficient of LaCo1-xCrxO3 approaches a measured maximum of 286 μV/K, near to the cobalt-rich end of the film library (with x ≈ 0.49). The resistivity value increases continuously with increasing x. The measured power factor, PF, of this series, which is related to the efficiency of energy conversion, also exhibits a maximum at the composition of x ≈ 0.49, which corresponds to the maximum value of the Seebeck coefficient. Our results illustrate the efficiency of applying the high-throughput combinatorial technique to study thermoelectric materials.
BioPartsBuilder: a synthetic biology tool for combinatorial assembly of biological parts.
Yang, Kun; Stracquadanio, Giovanni; Luo, Jingchuan; Boeke, Jef D; Bader, Joel S
2016-03-15
Combinatorial assembly of DNA elements is an efficient method for building large-scale synthetic pathways from standardized, reusable components. These methods are particularly useful because they enable assembly of multiple DNA fragments in one reaction, at the cost of requiring that each fragment satisfies design constraints. We developed BioPartsBuilder as a biologist-friendly web tool to design biological parts that are compatible with DNA combinatorial assembly methods, such as Golden Gate and related methods. It retrieves biological sequences, enforces compliance with assembly design standards and provides a fabrication plan for each fragment. BioPartsBuilder is accessible at http://public.biopartsbuilder.org and an Amazon Web Services image is available from the AWS Market Place (AMI ID: ami-508acf38). Source code is released under the MIT license, and available for download at https://github.com/baderzone/biopartsbuilder. Contact: joel.bader@jhu.edu. Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press.
Bioengineering Strategies for Designing Targeted Cancer Therapies
Wen, Xuejun
2014-01-01
The goals of bioengineering strategies for targeted cancer therapies are (1) to deliver a high dose of an anticancer drug directly to a cancer tumor, (2) to enhance drug uptake by malignant cells, and (3) to minimize drug uptake by nonmalignant cells. Effective cancer-targeting therapies will require both passive- and active targeting strategies and a thorough understanding of physiologic barriers to targeted drug delivery. Designing a targeted therapy includes the selection and optimization of a nanoparticle delivery vehicle for passive accumulation in tumors, a targeting moiety for active receptor-mediated uptake, and stimuli-responsive polymers for control of drug release. The future direction of cancer targeting is a combinatorial approach, in which targeting therapies are designed to use multiple targeting strategies. The combinatorial approach will enable combination therapy for delivery of multiple drugs and dual ligand targeting to improve targeting specificity. Targeted cancer treatments in development and the new combinatorial approaches show promise for improving targeted anticancer drug delivery and improving treatment outcomes. PMID:23768509
Siol, Sebastian; Holder, Aaron; Ortiz, Brenden R.; ...
2017-05-09
Here, the controlled decomposition of metastable alloys is an attractive route to form nanostructured thermoelectric materials with reduced thermal conductivity. The ternary SnTe–MnTe and SnTe–SnSe heterostructural alloys have been demonstrated as promising materials for thermoelectric applications. In this work, the quaternary Sn1–yMnyTe1–xSex phase space serves as a relevant model system to explore how a combination of computational and combinatorial-growth methods can be used to study equilibrium and non-equilibrium solubility limits. Results from first-principles calculations indicate low equilibrium solubility for x,y < 0.05, in good agreement with results obtained from bulk equilibrium synthesis experiments, and predict significantly higher spinodal limits. An experimental screening using sputtered combinatorial thin-film sample libraries showed a remarkable increase in non-equilibrium solubility for x,y > 0.2. These theoretical and experimental results were used to guide the bulk synthesis of metastable alloys. The ability to reproduce the non-equilibrium solubility levels in bulk materials indicates that such theoretical calculations and combinatorial growth can inform bulk synthetic routes. Further, the large difference between equilibrium and non-equilibrium solubility limits in Sn1–yMnyTe1–xSex indicates these metastable alloys are attractive in terms of nano-precipitate formation for potential thermoelectric applications.
Tang, Hongying; Cheng, Yongbo; Zhao, Qin; Li, Baoqing; Yuan, Xiaobing
2017-01-01
Routing protocols based on topology control are significantly important for improving network longevity in wireless sensor networks (WSNs). Traditionally, some WSN routing protocols distribute uneven network traffic load to sensor nodes, which is not optimal for improving network longevity. Unlike conventional WSN routing protocols, we propose a dynamic hierarchical protocol based on combinatorial optimization (DHCO) to balance the energy consumption of sensor nodes and to improve WSN longevity. For each sensor node, the DHCO algorithm obtains the optimal route by establishing a feasible routing set instead of selecting the cluster head or the next-hop node. The process of obtaining the optimal route can be formulated as a combinatorial optimization problem. Specifically, the DHCO algorithm is carried out by the following procedures: it employs a hierarchy-based connection mechanism to construct a hierarchical network structure in which each sensor node is assigned to a special hierarchical subset; it utilizes combinatorial optimization theory to establish the feasible routing set for each sensor node; and it takes advantage of the maximum–minimum criterion to obtain each node's optimal route to the base station. Simulation results show the effectiveness and superiority of the DHCO algorithm in comparison with state-of-the-art WSN routing algorithms, including low-energy adaptive clustering hierarchy (LEACH), hybrid energy-efficient distributed clustering (HEED), genetic protocol-based self-organizing network clustering (GASONeC), and double cost function-based routing (DCFR) algorithms. PMID:28753962
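The maximum–minimum route criterion mentioned in the abstract can be sketched in a few lines (node names, residual energies, and routes below are hypothetical illustration data, not the authors' implementation): among a node's feasible routes to the base station, select the route whose weakest node has the largest residual energy.

```python
# Sketch of a maximum-minimum route criterion over a feasible routing set.
# All node names, energies, and routes are hypothetical illustration data.

residual_energy = {"A": 0.9, "B": 0.4, "C": 0.7, "D": 0.8, "BS": float("inf")}

# Feasible routing set for source node A; each route ends at the base station.
feasible_routes = [
    ["A", "B", "BS"],       # bottleneck is B (0.4)
    ["A", "C", "D", "BS"],  # bottleneck is C (0.7)
]

def bottleneck(route):
    """Residual energy of the most depleted node on the route."""
    return min(residual_energy[n] for n in route)

# Maximum-minimum criterion: pick the route whose bottleneck is largest,
# so the weakest node on the chosen path is as strong as possible.
best = max(feasible_routes, key=bottleneck)
print(best)  # ['A', 'C', 'D', 'BS']
```

Choosing routes by bottleneck energy, rather than by total path cost, spreads load away from nearly depleted nodes, which is the longevity-balancing intuition behind the protocol.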
Spread the word: MMN brain response reveals whole-form access of discontinuous particle verbs.
Hanna, Jeff; Cappelle, Bert; Pulvermüller, Friedemann
2017-12-01
The status of particle verbs such as rise (…) up as either lexically stored or combinatorially assembled is an issue which so far has not been settled decisively. In this study, we use the mismatch negativity (MMN) brain response to observe neurophysiological responses to discontinuous particle verbs. The MMN can be used to distinguish between whole-form storage and combinatorial processes, as it is enhanced to stored words compared to unknown pseudowords, whereas combinatorially legal strings elicit a reduced MMN relative to ungrammatical ones. Earlier work had found larger MMNs to congruent than to incongruent verb-particle combinations when particle and verb appeared as adjacent elements, thus suggesting whole-form storage at least in this case. However, it is still possible that particle verbs discontinuously spread out across a sentence would elicit the combinatorial, grammar-violation response pattern instead. Here, we tested the brain signatures of discontinuous verb-particle combinations, orthogonally varying congruence and semantic transparency. The results show for the first time brain indices of whole-form storage for discontinuous constituents, thus arguing in favour of access to whole-form-stored lexical elements in the processing of particle verbs, irrespective of their semantic opacity. Results are discussed in the context of linguistic debates about the status of particle verbs as words, lexical elements or syntactically generated combinations. The explanation of the pattern of results within a neurobiological language model is highlighted. Copyright © 2017 Elsevier Inc. All rights reserved.
Brown, Colby R; McCalla, Eric; Watson, Cody; Dahn, J R
2015-06-08
Combinatorial synthesis has proven extremely effective in screening for new battery materials for Li-ion battery electrodes. Here, a study in the Li-Ni-Mn-Co-O system is presented, wherein samples with nearly 800 distinct compositions were prepared using a combinatorial and high-throughput method to screen for single-phase materials of high interest as next generation positive electrode materials. X-ray diffraction is used to determine the crystal structure of each sample. The Gibbs' pyramid representing the pseudoquaternary system was studied by making samples within three distinct pseudoternary planes defined at fractional cobalt metal contents of 10%, 20%, and 30% within the Li-Ni-Mn-Co-O system. Two large single-phase regions were observed in the system: the layered region (ordered rocksalt) and cubic spinel region; both of which are of interest for next-generation positive electrodes in lithium-ion batteries. These regions were each found to stretch over a wide range of compositions within the Li-Ni-Mn-Co-O pseudoquaternary system and had complex coexistence regions existing between them. The sample cooling rate was found to have a significant effect on the position of the phase boundaries of the single-phase regions. The results of this work are intended to guide further research by narrowing the composition ranges worthy of study and to illustrate the broad range of applications where solution-based combinatorial synthesis can have significant impact.
Estrin, Michael A; Hussein, Islam T M; Puryear, Wendy B; Kuan, Anne C; Artim, Stephen C; Runstadler, Jonathan A
2018-01-01
Influenza A virus infections are important causes of morbidity and mortality worldwide, and currently available prevention and treatment methods are suboptimal. In recent years, genome-wide investigations have revealed numerous host factors that are required for influenza to successfully complete its life cycle. However, only a select, small number of influenza strains were evaluated using this platform, and there was considerable variation in the genes identified across different investigations. In an effort to develop a universally efficacious therapeutic strategy with limited potential for the emergence of resistance, this study was performed to investigate the effect of combinatorial RNA interference (RNAi) on inhibiting the replication of diverse influenza A virus subtypes and strains. Candidate genes were selected for targeting based on the results of multiple previous independent genome-wide studies. The effect of single and combinatorial RNAi on the replication of 12 diverse influenza A viruses, including three strains isolated from birds and one strain isolated from seals, was then evaluated in primary normal human bronchial epithelial cells. After excluding overly toxic siRNA, two siRNA combinations were identified that reduced mean viral replication by greater than 79 percent in all mammalian strains, and greater than 68 percent in all avian strains. Host-directed combinatorial RNAi effectively prevents growth of a broad range of influenza virus strains in vitro, and is a potential therapeutic candidate for further development and future in vivo studies.
Gene-network inference by message passing
NASA Astrophysics Data System (ADS)
Braunstein, A.; Pagnani, A.; Weigt, M.; Zecchina, R.
2008-01-01
The inference of gene-regulatory processes from gene-expression data belongs to the major challenges of computational systems biology. Here we address the problem from a statistical-physics perspective and develop a message-passing algorithm which is able to infer sparse, directed and combinatorial regulatory mechanisms. Using the replica technique, the algorithmic performance can be characterized analytically for artificially generated data. The algorithm is applied to genome-wide expression data of baker's yeast under various environmental conditions. We find clear cases of combinatorial control, and enrichment in common functional annotations of regulated genes and their regulators.
Fu, Junjie; Lee, Timothy; Qi, Xin
2014-01-01
G protein-coupled receptors (GPCRs), which are involved in virtually every biological process, constitute the largest family of transmembrane receptors. Many top-selling and newly approved drugs target GPCRs. In this review, we aim to recapitulate efforts and progress in combinatorial library-assisted GPCR ligand discovery, particularly focusing on one-bead-one-compound library synthesis and quantum dot-labeled cell-based assays, which both effectively enhance the rapid identification of GPCR ligands with higher affinity and specificity. PMID:24941874
Design of diversity and focused combinatorial libraries in drug discovery.
Young, S Stanley; Ge, Nanxiang
2004-05-01
Using well-characterized chemical reactions and readily available monomers, chemists are able to create sets of compounds, termed libraries, which are useful in drug discovery processes. The design of combinatorial chemical libraries can be complex and there has been much information recently published offering suggestions on how the design process can be carried out. This review focuses on literature with the goal of organizing current thinking. At this point in time, it is clear that benchmarking of current suggested methods is required as opposed to further new methods.
Jiménez-Moreno, Ester; Gómez, Ana M; Bastida, Agatha; Corzana, Francisco; Jiménez-Oses, Gonzalo; Jiménez-Barbero, Jesús; Asensio, Juan Luis
2015-03-27
Electrostatic and charge-transfer contributions to CH-π complexes can be modulated by attaching electron-withdrawing substituents to the carbon atom. While clearly stabilizing in the gas phase, the outcome of this chemical modification in water is more difficult to predict. Herein we provide a definitive and quantitative answer to this question employing a simple strategy based on dynamic combinatorial chemistry. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Nishimoto, Ryu; Tani, Jun
2004-09-01
This study shows how sensory-action sequences of imitating finite state machines (FSMs) can be learned by utilizing the deterministic dynamics of recurrent neural networks (RNNs). Our experiments indicated that each possible combinatorial sequence can be recalled by specifying its respective initial state value and also that fractal structures appear in this initial state mapping after the learning converges. We also observed that the sequences of mimicking FSMs are encoded utilizing the transient regions rather than the invariant sets of the evolved dynamical systems of the RNNs.
Extremal problems for topological indices in combinatorial chemistry.
Tichy, Robert F; Wagner, Stephan
2005-09-01
Topological indices of molecular graphs are related to several physicochemical characteristics; recently, the inverse problem for some of these indices has been studied, and it has some applications in the design of combinatorial libraries for drug discovery. It is thus very natural to study also extremal problems for these indices, i.e., finding graphs having minimal or maximal index. In this paper, these questions will be discussed for three different indices, namely the sigma-index, the c-index and the Z-index, with emphasis on the sigma-index.
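The Z-index named above is commonly identified with the Hosoya index: the number of matchings of the molecular graph, including the empty matching. A minimal brute-force sketch, assuming that identification (the 4-vertex path used here is just an example graph):

```python
# Brute-force Hosoya index (Z-index): count subsets of edges that are
# pairwise disjoint (matchings), including the empty matching.
from itertools import combinations

def hosoya_index(edges):
    count = 0
    for k in range(len(edges) + 1):
        for subset in combinations(edges, k):
            endpoints = [v for e in subset for v in e]
            if len(endpoints) == len(set(endpoints)):  # edges share no vertex
                count += 1
    return count

path4 = [(1, 2), (2, 3), (3, 4)]  # path graph on 4 vertices
print(hosoya_index(path4))  # 5: empty, three single edges, {(1,2),(3,4)}
```

For path graphs this reproduces the known Fibonacci pattern (the Hosoya index of a path on n vertices is the (n+1)-th Fibonacci number), a classic sanity check for extremal-index computations.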
Charleston, M A
1995-01-01
This article introduces a coherent language base for describing and working with characteristics of combinatorial optimization problems, which is at once general enough to be used in all such problems and precise enough to allow subtle concepts in this field to be discussed unambiguously. An example is provided of how this nomenclature is applied to an instance of the phylogeny problem. Also noted is the beneficial effect, on the landscape of the solution space, of transforming the observed data to account for multiple changes of character state.
Combinatorial structure of genome rearrangements scenarios.
Ouangraoua, Aïda; Bergeron, Anne
2010-09-01
In genome rearrangement theory, one of the elusive questions raised in recent years is the enumeration of rearrangement scenarios between two genomes. This problem is related to the uniform generation of rearrangement scenarios and the derivation of tests of statistical significance of the properties of these scenarios. Here we give an exact formula for the number of double-cut-and-join (DCJ) rearrangement scenarios between two genomes. We also construct effective bijections between the set of scenarios that sort a component and well-studied combinatorial objects such as parking functions, labeled trees, and Prüfer codes.
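Parking functions, one of the combinatorial families the abstract relates to sorting scenarios, are easy to enumerate directly for small n; their count (n+1)^(n-1) also equals the number of labeled trees on n+1 vertices (Cayley's formula), which is the kind of correspondence such bijections exploit. A small brute-force check:

```python
# Count parking functions of length n by exhaustive enumeration.
# A sequence (a_1..a_n) with 1 <= a_i <= n is a parking function iff its
# sorted version b satisfies b_i <= i for every position i.
from itertools import product

def is_parking_function(seq):
    return all(b <= i + 1 for i, b in enumerate(sorted(seq)))

n = 3
count = sum(is_parking_function(p) for p in product(range(1, n + 1), repeat=n))
print(count, (n + 1) ** (n - 1))  # 16 16 -- matches the closed form
```

The agreement with (n+1)^(n-1) is what makes parking functions a convenient target for bijective enumeration arguments like the one the abstract describes.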
NASA Astrophysics Data System (ADS)
Zheng, Genrang; Lin, ZhengChun
The problem of winner determination in combinatorial auctions is a hot topic in electronic business and an NP-hard problem. A Hybrid Artificial Fish Swarm Algorithm (HAFSA), which combines the First Suite Heuristic Algorithm (FSHA) with the Artificial Fish Swarm Algorithm (AFSA), is proposed to solve the problem, building on the theory of AFSA. Experimental results show that the HAFSA is a rapid and efficient algorithm for winner determination. Compared with the Ant Colony Optimization Algorithm, it performs well and has broad application prospects.
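The underlying winner-determination problem can be illustrated with an exhaustive search over a toy instance (the bids below are hypothetical; heuristics like HAFSA exist precisely because this enumeration grows exponentially with the number of bids):

```python
# Brute-force winner determination for a tiny combinatorial auction:
# choose the set of item-disjoint bids with maximum total price.
from itertools import combinations

bids = [
    ({"a", "b"}, 5),  # (bundle of items, offered price) -- hypothetical data
    ({"b", "c"}, 7),
    ({"c"}, 3),
    ({"a"}, 4),
]

best_value, best_set = 0, ()
for k in range(len(bids) + 1):
    for subset in combinations(bids, k):
        items = [i for bundle, _ in subset for i in bundle]
        if len(items) == len(set(items)):  # awarded bundles must not overlap
            value = sum(price for _, price in subset)
            if value > best_value:
                best_value, best_set = value, subset

print(best_value)  # 11: award {"b","c"} for 7 and {"a"} for 4
```

Each bid covers a bundle, so the revenue-maximizing allocation is a weighted set-packing problem; the 2^n subsets checked here are what swarm heuristics avoid exploring exhaustively.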
Limpoco, F Ted; Bailey, Ryan C
2011-09-28
We directly monitor in parallel and in real time the temporal profiles of polymer brushes simultaneously grown via multiple ATRP reaction conditions on a single substrate using arrays of silicon photonic microring resonators. In addition to probing relative polymerization rates, we show the ability to evaluate the dynamic properties of the in situ grown polymers. This presents a powerful new platform for studying modified interfaces that may allow for the combinatorial optimization of surface-initiated polymerization conditions.
Aerospace Applications of Integer and Combinatorial Optimization
NASA Technical Reports Server (NTRS)
Padula, S. L.; Kincaid, R. K.
1995-01-01
Research supported by NASA Langley Research Center includes many applications of aerospace design optimization and is conducted by teams of applied mathematicians and aerospace engineers. This paper investigates the benefits from this combined expertise in formulating and solving integer and combinatorial optimization problems. Applications range from the design of large space antennas to interior noise control. A typical problem, for example, seeks the optimal locations for vibration-damping devices on an orbiting platform and is expressed as a mixed/integer linear programming problem with more than 1500 design variables.
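The device-placement problem mentioned above can be caricatured as a tiny exhaustive 0/1 selection (benefits below are hypothetical; the actual formulation is a mixed-integer linear program with more than 1500 design variables and coupled constraints, solved with MILP techniques rather than enumeration):

```python
# Toy 0/1 placement: pick at most k candidate locations to maximize total
# damping benefit, by exhaustive enumeration. Benefit values are hypothetical.
from itertools import combinations

benefit = {0: 3.0, 1: 1.5, 2: 2.2, 3: 0.9}  # damping benefit per location
k = 2  # at most k devices may be installed

best = max(
    (subset for r in range(k + 1) for subset in combinations(benefit, r)),
    key=lambda s: sum(benefit[i] for i in s),
)
print(sorted(best))  # [0, 2] -> the two locations with the largest benefits
```

With independent benefits the optimum is just the top-k locations; the real problem is harder because device interactions and structural constraints couple the binary variables, which is what motivates the integer-programming formulation.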
Combinatorial synthesis of bimetallic complexes with three halogeno bridges.
Gauthier, Sébastien; Quebatte, Laurent; Scopelliti, Rosario; Severin, Kay
2004-06-07
Methods for the synthesis of bimetallic complexes in which two different metal fragments are connected by three chloro or bromo bridges are reported. The reactions are general, fast, and give rise to structurally defined products in quantitative yields. Therefore, they are ideally suited for generating a library of homo- and heterobimetallic complexes in a combinatorial fashion. This is of special interest for applications in homogeneous catalysis. Selected members of this library were synthesized and comprehensively characterized; single-crystal X-ray analyses were performed for 15 new bimetallic compounds.
Development of the concept of emergy established a medium for accounting that made it possible to express economic and environmental work of all kinds on a common basis as solar emjoules. Environmental accounting using emdollars, a combined emergy-monetary unit, can be used to pr...
Reform of the College Entrance Examination: Ideology, Principles, and Policy Recommendations
ERIC Educational Resources Information Center
Liu, Haifeng
2013-01-01
Reform of the College Entrance Examination is trending toward simultaneous unification and diversification. The objective of reforming the entrance exam is to establish a college enrollment examination system that is primarily based on a unified test, which would assess students' abilities, appraise them on multiple levels, and classify them.…
A Federal Perspective on Improving Practices, Programs, and Policies in Special Education.
ERIC Educational Resources Information Center
Kaufman, Martin; And Others
1993-01-01
Unification of objective and subjective models is suggested as a means for contributing to a professional knowledge base for special education. The knowledge production, access, and use activities of the Office of Special Education programs are user oriented, while incorporating the empirical rigor of the initial research and evaluation. (SLD)
Combat Post-Traumatic Stress Disorder, Alcoholism, and the Police Officer.
ERIC Educational Resources Information Center
Machell, David F.
This report describes the psychological profile of a police officer who suffers from three dimensions of emotional complication: combat post-traumatic stress disorder (CPTSD), alcoholism, and role immersion. Each of the three dimensions is discussed separately, followed by a discussion of their interaction and unification. It is noted that alcohol…
Sen. Enzi, Michael B. [R-WY
2011-02-16
Senate - 02/16/2011: Read twice and referred to the Committee on Banking, Housing, and Urban Affairs. This bill has the status Introduced.