EHW Approach to Temperature Compensation of Electronics
NASA Technical Reports Server (NTRS)
Stoica, Adrian
2004-01-01
Efforts are under way to apply the concept of evolvable hardware (EHW) to compensate for variations, with temperature, in the operational characteristics of electronic circuits. To maintain the required functionality of a given circuit at a temperature above or below the nominal operating temperature for which the circuit was originally designed, a new circuit would be evolved; moreover, to obtain the required functionality over a very wide temperature range, a number of circuits would be evolved, each of which would satisfy the performance requirements over a small part of the total temperature range. The basic concepts and some specific implementations of EHW were described in a number of previous NASA Tech Briefs articles, namely, "Reconfigurable Arrays of Transistors for Evolvable Hardware" (NPO-20078), Vol. 25, No. 2 (February 2001), page 36; "Evolutionary Automated Synthesis of Electronic Circuits" (NPO-20535), Vol. 26, No. 7 (July 2002), page 37; "Designing Reconfigurable Antennas Through Hardware Evolution" (NPO-20666), Vol. 26, No. 7 (July 2002), page 38; "Morphing in Evolutionary Synthesis of Electronic Circuits" (NPO-20837), Vol. 26, No. 8 (August 2002), page 31; "Mixtrinsic Evolutionary Synthesis of Electronic Circuits" (NPO-20773), Vol. 26, No. 8 (August 2002), page 32; and "Synthesis of Fuzzy-Logic Circuits in Evolvable Hardware" (NPO-21095), Vol. 26, No. 11 (November 2002), page 38. To recapitulate from the cited prior articles: EHW is characterized as evolutionary in a quasi-genetic sense. The essence of EHW is to construct and test a sequence of populations of circuits that function as incrementally better solutions of a given design problem through the selective, repetitive connection and/or disconnection of capacitors, transistors, amplifiers, inverters, and/or other circuit building blocks. The connection and disconnection can be effected by use of field-programmable transistor arrays (FPTAs).
The evolution is guided by a search-and-optimization algorithm (in particular, a genetic algorithm) that operates in the space of possible circuits to find a circuit that exhibits an acceptably close approximation of the desired functionality. The evolved circuits can be tested by mathematical modeling (that is, computational simulation) alone, in real hardware, or in a combination of the two.
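The articles describe this loop in prose only; a minimal sketch of it is given below, with a bit string standing in for the open/closed states of FPTA switches and a toy evaluate_circuit() standing in for the SPICE-style simulation or hardware test that would score a candidate's response at the target temperature. All names, parameters, and the objective are illustrative assumptions, not details from the articles.

```python
import random

# Hypothetical stand-in for circuit evaluation: in the real system this would
# be a simulation or hardware measurement of the configured FPTA; here it just
# rewards genomes that match an arbitrary target switch pattern.
def evaluate_circuit(genome):
    target = [1, 0, 1, 1, 0, 0, 1, 0] * 4
    return sum(1 for g, t in zip(genome, target) if g == t)

def evolve(genome_len=32, pop_size=20, generations=50, mut_rate=0.05, seed=0):
    rng = random.Random(seed)
    # Each genome is one candidate switch configuration.
    pop = [[rng.randint(0, 1) for _ in range(genome_len)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=evaluate_circuit, reverse=True)
        survivors = pop[: pop_size // 2]            # truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, genome_len)      # one-point crossover
            child = a[:cut] + b[cut:]
            # Bit-flip mutation: toggle a switch with small probability.
            child = [1 - g if rng.random() < mut_rate else g for g in child]
            children.append(child)
        pop = survivors + children
    return max(pop, key=evaluate_circuit)

best = evolve()
```

Because the survivors are carried over unchanged, the best fitness never decreases from generation to generation; per the articles, one such search would be run per temperature sub-range.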
Considerations for Deep Maneuver: Lessons from North Africa, 1941-1942.
1985-01-01
need arises from the evolutionary changes inherent in the concepts of AirLand Battle doctrine. Among these changes are the reintroduction of the...General Franz Halder, The Halder Diaries: The Private War Journals of Colonel General Franz Halder, 2 vols. (Boulder, Colorado: T.N. Dupuy Associates and Westview Press, Inc...Franz Halder. 2 vols. Boulder, Colorado: T.N. Dupuy Associates and Westview Press, Inc., 1976. Reprint of an 8 vol. work originally published by the
1978-10-11
REQUIREMENTS OF COMPUTER USERS Warsaw INFORMATYKA in Polish Vol 12 No 8, 1977 pp 12-14 CHELCHOWSKI, JERZY, Academy of Economics, Wroclaw [Abstract...Western. 11 E. Hardware POLAND SQUARE-LOOP FERRITE CORES IN THE WORKING STORAGE OF MODERN COMPUTERS Warsaw INFORMATYKA in Polish Vol 12 No 5...INDUSTRY PLANT Warsaw INFORMATYKA in Polish Vol 12 No 10, 1977 pp 20-22 BERNATOWICZ, KRYSTYN [Text] Next to mines, steelworks and shipyards, the H
Estimating Performance of Single Bus, Shared Memory Multiprocessors
1987-05-01
[Chandy78] K.M. Chandy, C.M. Sauer, "Approximate methods for analyzing queuing network models of computing systems," Computing Surveys, vol. 10, no. 3...[Denning78] P. Denning, J. Buzen, "The operational analysis of queueing network models," Computing Surveys, vol. 10, no. 3, September 1978, pp 225-261
Architectures and Applications for Scalable Quantum Information Systems
2007-01-01
quantum computation models, such as adiabatic quantum computing, can be converted to quantum circuits. Therefore, in our design flow’s first phase...vol. 26, no. 5, pp. 1484–1509, 1997. [19] A. Childs, E. Farhi, and J. Preskill, “Robustness of adiabatic quantum computation,” Phys. Rev. A, vol. 65...magnetic resonance computer with three quantum bits that simulates an adiabatic quantum optimization algorithm. Adiabatic
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-13
... Brazil Index Fund (``EWZ''), the Market Vectors Gold Miners ETF (``GDX''), and the Energy Select Sector... volatility. CBOE will be the reporting authority for any Vol Index. CBOE will compute values for Vol... price or the calculated forward value of the respective Vol Index. Transactions in Vol Index options may...
1975-05-01
Conference on Earthquake Engineering, Santiago de Chile, 13-18 January 1969, Vol. I, Session B2, Chilean Association of Seismology and Earthquake...Nuclear Agency May 1975 DISTRIBUTED BY: National Technical Information Service, U.S. DEPARTMENT OF COMMERCE...AFWL-TR-74-228, Vol. I...AFWL-TR-74-228 Vol. I SINGER: A COMPUTER CODE FOR GENERAL ANALYSIS OF TWO-DIMENSIONAL CONCRETE STRUCTURES Volume I
Applied Computational Electromagnetics Society Journal. Volume 13, No. 1
1998-03-01
APPLIED COMPUTATIONAL ELECTROMAGNETICS SOCIETY JOURNAL March 1998 Vol. 13 No. 1 ISSN 1054-4887 DISTRIBUTION STATEMENT: Approved for public release; distribution unlimited...GENERAL PURPOSE AND SCOPE. The Applied Computational Electromagnetics Society Journal...SOCIETY Journal March 1998 Vol. 13 No. 1 ISSN 1054-4887 The ACES Journal is abstracted in INSPEC, in Engineering Index, and in DTIC. The second
Cooperative combinatorial optimization: evolutionary computation case study.
Burgin, Mark; Eberbach, Eugene
2008-01-01
This paper presents a formalization of the notion of cooperation and competition of multiple systems that work toward a common optimization goal of the population using evolutionary computation techniques. It is proved that evolutionary algorithms are more expressive than conventional recursive algorithms, such as Turing machines. Three classes of evolutionary computations are introduced and studied: bounded finite, unbounded finite, and infinite computations. Universal evolutionary algorithms are constructed. Such properties of evolutionary algorithms as completeness, optimality, and search decidability are examined. A natural extension of the evolutionary Turing machine (ETM) model is proposed to properly reflect the phenomena of cooperation and competition in the whole population.
Algorithmic Mechanism Design of Evolutionary Computation.
Pei, Yan
2015-01-01
We consider algorithmic design, enhancement, and improvement of evolutionary computation as a mechanism design problem. All individuals or several groups of individuals can be considered as self-interested agents. The individuals in evolutionary computation can manipulate parameter settings and operations by satisfying their own preferences, which are defined by an evolutionary computation algorithm designer, rather than by following a fixed algorithm rule. Evolutionary computation algorithm designers or self-adaptive methods should construct proper rules and mechanisms for all agents (individuals) to conduct their evolution behaviour correctly in order to reliably achieve the desired, preset objective(s). As a case study, we propose a formal framework for parameter setting, strategy selection, and algorithmic design of evolutionary computation by considering the Nash strategy equilibrium of a mechanism design in the search process. The evaluation results demonstrate the efficiency of the framework. This principle can be implemented in any evolutionary computation algorithm that needs to consider strategy selection issues in its optimization process. The final objective of our work is to treat evolutionary computation design as an algorithmic mechanism design problem and establish its fundamental aspects from this perspective. This paper is a first step towards achieving this objective by implementing a strategy equilibrium solution (such as Nash equilibrium) in an evolutionary computation algorithm.
Implications of Complexity and Chaos Theories for Organizations that Learn
ERIC Educational Resources Information Center
Smith, Peter A. C.
2003-01-01
In 1996 Hubert Saint-Onge and Smith published an article ("The evolutionary organization: avoiding a Titanic fate", in The Learning Organization, Vol. 3 No. 4), based on their experience at the Canadian Imperial Bank of Commerce (CIBC). It was established at CIBC that change could be successfully facilitated through blended application…
PBTE (Performance-Based Teacher Education); Vol 2, No. 2, May 1973.
ERIC Educational Resources Information Center
Andrews, Theodore E., Ed.
This issue of the Multi-State Consortium for Performance-Based Teacher Education (PBTE) newsletter discusses (1) the evolutionary approach adopted by the state of Minnesota toward the implementation of PBTE, which includes discussion of what is known about PBTE, whether Minnesota ought (or wishes) to adopt the program, and state participation in…
Earth Observing Satellite Orbit Design Via Particle Swarm Optimization
2014-08-01
28.6 77.2 3 Indonesia Jakarta -6.174444 106.829444 3 Japan Tokyo 35.685 139.751389 2 Mexico Ciudad de Mexico 19.434167 -99.138611 3 Morocco Rabat...99. Proceedings of the 1999 Congress on, Vol. 3, 1999. Ozcan, E. and Mohan, C., "Particle swarm optimization: surfing the waves," Evolutionary
Application of evolutionary computation in ECAD problems
NASA Astrophysics Data System (ADS)
Lee, Dae-Hyun; Hwang, Seung H.
1998-10-01
The design of modern electronic systems is a complicated task that demands the use of computer-aided design (CAD) tools. Since many problems in ECAD are combinatorial optimization problems, evolutionary computations such as genetic algorithms and evolutionary programming have been widely employed to solve them. We have applied evolutionary computation techniques to ECAD problems such as technology mapping, microcode-bit optimization, data path ordering, and peak power estimation, where their benefits are clearly observed. This paper presents our experiences and discusses issues arising in those applications.
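The paper gives no code; as an illustration of how one such combinatorial ECAD task might be encoded, the sketch below evolves a permutation for a data-path-ordering-style problem. The random distance matrix stands in for real netlist data, and all names and parameters are hypothetical.

```python
import random

# Cost of an ordering: total "wiring distance" between consecutive modules.
def cost(order, dist):
    return sum(dist[a][b] for a, b in zip(order, order[1:]))

def evolve_order(n=8, pop_size=30, generations=200, seed=1):
    rng = random.Random(seed)
    # Invented distance matrix: roughly |i - j| plus noise, standing in for
    # real interconnect costs extracted from a netlist.
    dist = [[abs(i - j) + rng.random() for j in range(n)] for i in range(n)]
    # Each individual is a permutation of the n modules.
    pop = [rng.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda p: cost(p, dist))
        pop = pop[: pop_size // 2]                 # keep the cheapest half
        while len(pop) < pop_size:
            parent = rng.choice(pop[:5])           # bias toward the best
            child = parent[:]
            i, j = rng.sample(range(n), 2)         # swap mutation keeps it a
            child[i], child[j] = child[j], child[i]  # valid permutation
            pop.append(child)
    return min(pop, key=lambda p: cost(p, dist)), dist

best, dist = evolve_order()
```

Swap mutation is used instead of bit-flip mutation so every offspring remains a valid permutation, which is the usual encoding choice for ordering problems like these.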
Acupuncture-Based Biophysical Frontiers of Complementary Medicine
2001-10-28
cf. Fig. 1, an evolutionarily older type of intercell communication, transporting ionic electrical signals between excitable cells, whose conductivity...traditional psychology: Biophysical bases of psychosomatic disorders and transpersonal stress reprogramming", in Basic and Clinical Aspects of the Theory...biophysical basis of transpersonal transcendental phenomena", Int. J. Appl. Sci. & Comput., vol. 7, pp. 174-187, 2000 [also presented at Int. Conf
Development of X-TOOLSS: Preliminary Design of Space Systems Using Evolutionary Computation
NASA Technical Reports Server (NTRS)
Schnell, Andrew R.; Hull, Patrick V.; Turner, Mike L.; Dozier, Gerry; Alverson, Lauren; Garrett, Aaron; Reneau, Jarred
2008-01-01
Evolutionary computation (EC) techniques such as genetic algorithms (GAs) have been identified as promising methods to explore the design space of mechanical and electrical systems at the earliest stages of design. In this paper the authors summarize their research in the use of evolutionary computation to develop preliminary designs for various space systems. An evolutionary computational solver developed over the course of the research, X-TOOLSS (Exploration Toolset for the Optimization of Launch and Space Systems), is discussed. With the success of early, low-fidelity example problems, an outline of work involving more computationally complex models is presented.
From evolutionary computation to the evolution of things.
Eiben, Agoston E; Smith, Jim
2015-05-28
Evolution has provided a source of inspiration for algorithm designers since the birth of computers. The resulting field, evolutionary computation, has been successful in solving engineering tasks ranging in outlook from the molecular to the astronomical. Today, the field is entering a new phase as evolutionary algorithms that take place in hardware are developed, opening up new avenues towards autonomous machines that can adapt to their environment. We discuss how evolutionary computation compares with natural evolution and what its benefits are relative to other computing approaches, and we introduce the emerging area of artificial evolution in physical systems.
Practical advantages of evolutionary computation
NASA Astrophysics Data System (ADS)
Fogel, David B.
1997-10-01
Evolutionary computation is becoming a common technique for solving difficult, real-world problems in industry, medicine, and defense. This paper reviews some of the practical advantages to using evolutionary algorithms as compared with classic methods of optimization or artificial intelligence. Specific advantages include the flexibility of the procedures, as well as their ability to self-adapt the search for optimum solutions on the fly. As desktop computers increase in speed, the application of evolutionary algorithms will become routine.
1977-08-24
exceeded a million rubles. POLAND SOME METHODOLOGICAL REMARKS RELATING TO THE FORECASTING MODEL OF COMPUTER DEVELOPMENT Warsaw INFORMATYKA in...PROCESSING SYSTEMS Warsaw INFORMATYKA in Polish Vol 11 No 10, Oct 76 pp 19-20 SEKULA, ZOFIA, Wroclaw [Abstract] The author presents critical remarks...TO ODRA 1300 SYSTEM Warsaw INFORMATYKA in Polish Vol 11 No 9, Sep 76 pp 1-4 BZDULA, CZESLAW, Research and Development Center of MERA-ELWRO Digital
Effects of Computer Architecture on FFT (Fast Fourier Transform) Algorithm Performance.
1983-12-01
Criteria for Efficient Implementation of FFT Algorithms," IEEE Transactions on Acoustics, Speech, and Signal Processing, Vol. ASSP-30, pp. 107-109, Feb...1982. Burrus, C. S. and P. W. Eschenbacher. "An In-Place, In-Order Prime Factor FFT Algorithm," IEEE Transactions on Acoustics, Speech, and Signal... Transactions on Acoustics, Speech, and Signal Processing, Vol. ASSP-30, pp. 217-226, Apr. 1982. Control Data Corporation. CDC Cyber 170 Computer Systems
Metaheuristic Optimization and its Applications in Earth Sciences
NASA Astrophysics Data System (ADS)
Yang, Xin-She
2010-05-01
A common but challenging task in modelling geophysical and geological processes is to handle massive data and to minimize certain objectives. This can essentially be considered as an optimization problem, and thus many new efficient metaheuristic optimization algorithms can be used. In this paper, we will introduce some modern metaheuristic optimization algorithms such as genetic algorithms, harmony search, firefly algorithm, particle swarm optimization and simulated annealing. We will also discuss how these algorithms can be applied to various applications in earth sciences, including nonlinear least-squares, support vector machine, Kriging, inverse finite element analysis, and data-mining. We will present a few examples to show how different problems can be reformulated as optimization. Finally, we will make some recommendations for choosing various algorithms to suit various problems. References 1) D. H. Wolpert and W. G. Macready, No free lunch theorems for optimization, IEEE Trans. Evolutionary Computation, Vol. 1, 67-82 (1997). 2) X. S. Yang, Nature-Inspired Metaheuristic Algorithms, Luniver Press, (2008). 3) X. S. Yang, Mathematical Modelling for Earth Sciences, Dunedin Academic Press, (2008).
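As a concrete instance of the kind of metaheuristic surveyed in this abstract, here is a minimal simulated-annealing sketch minimizing a toy one-dimensional objective; the linear cooling schedule, proposal step size, and objective are illustrative choices, not recommendations from the paper.

```python
import math
import random

def simulated_annealing(f, x0, steps=5000, t0=1.0, seed=2):
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    best_x, best_f = x, fx
    for k in range(steps):
        t = t0 * (1 - k / steps) + 1e-9            # linear cooling schedule
        x_new = x + rng.gauss(0, 0.5)              # Gaussian proposal move
        f_new = f(x_new)
        # Always accept downhill moves; accept uphill moves with
        # Boltzmann probability exp(-delta/t), which shrinks as t cools.
        if f_new < fx or rng.random() < math.exp((fx - f_new) / t):
            x, fx = x_new, f_new
            if fx < best_f:
                best_x, best_f = x, fx
    return best_x, best_f

# Toy multimodal objective: x^2 + 2 sin(5x) + 2 has several local minima,
# with its global minimum near x = -0.3.
x, fx = simulated_annealing(lambda x: x * x + 2 * math.sin(5 * x) + 2, 4.0)
```

The uphill-acceptance rule is what lets the search escape local minima early on, which is the property that distinguishes annealing (and the other metaheuristics listed above) from plain gradient-style descent on multimodal geophysical objectives.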
Precision Machining Application and Technology: An Overview and Perspective.
1983-08-24
diamond turning lathes are being used to produce computer discs. Bryant Symons, an English firm, has reported diamond turning an aluminum computer disk at..." Precision Engineering, Vol. 5(2), Guildford, England, July 1983. Watt, G., "Lathe for Generation of Spherical Surfaces of Revolution," given to Optical...Precision CNC Diamond Turning Machine," Annals of the CIRP, Vol. 31/1, p 409, 1982. 8. Bryant Symons Product Brochure, "Ultra Precision Diamond Turning
Applied Computational Electromagnetics Society Journal, Volume 9, Number 1, March 1994
1994-03-01
APPLIED COMPUTATIONAL ELECTROMAGNETICS SOCIETY Journal...March 1994 Vol. 9 No. 1...ISSN 1054-4887...25.00. REMIT BY: (1) BANK DRAFTS (MUST BE DRAWN ON U.S. BANK), (2) INTERNATIONAL MONEY ORDER, (3) TRAVELER'S CHECKS IN U.S. DOLLARS, (4) ELECTRONIC...COMPUTATIONAL ELECTROMAGNETICS SOCIETY Journal...March 1994...Vol. 9 No. 1
Monaural Speech Segregation by Integrating Primitive and Schema-Based Analysis
2008-02-03
vol. 19, pp. 475-492. Wang D.L. and Chang P.S. (2008): An oscillatory correlation model of auditory streaming. Cognitive Neurodynamics, vol. 2, pp...Subcontracts DeLiang Wang (Principal Investigator) March 2008 Department of Computer Science & Engineering and Center for Cognitive Science The
Shuttle mission simulator. Volume 2: Requirement report, volume 2, revision C
NASA Technical Reports Server (NTRS)
Burke, J. F.
1973-01-01
The requirements for space shuttle simulation which are discussed include: general requirements, program management, system engineering, design and development, crew stations, on-board computers, and systems integration. For Vol. 1, revision A, see N73-22203; for Vol. 2, revision A, see N73-22204.
Autonomous Sonar Classification Using Expert Systems
1992-06-01
"Multisensor Integration and Fusion in Intelligent Systems," IEEE Transactions on Systems, Man and Cybernetics, vol. 19 no. 5, September/October..." University of California Santa Barbara Department of Computer Science Technical Report TRCS89-06, February 1989. IEEE, vol. 71 no. 7, July 1983, pp. 872..."Autonomous Underwater Vehicles", Proceedings of the IEEE Oceanic Engineering Society Conference AUV 92, Washington DC, June 1992. Corkill, Daniel, "Blackboard Systems," AI Expert, vol. 6 no. 9, September 1991, pp. 40-47. 559
Accuracy of Time Integration Approaches for Stiff Magnetohydrodynamics Problems
NASA Astrophysics Data System (ADS)
Knoll, D. A.; Chacon, L.
2003-10-01
The simulation of complex physical processes with multiple time scales presents a continuing challenge to the computational plasma physicist due to the coexistence of fast and slow time scales. Within computational plasma physics, practitioners have developed and used linearized methods, semi-implicit methods, and time splitting in an attempt to tackle such problems. All of these methods are understood to generate numerical error. We are currently developing algorithms which remove such error for MHD problems [1,2]. These methods do not rely on linearization or time splitting. We are also attempting to analyze the errors introduced by existing ``implicit'' methods using modified equation analysis (MEA) [3]. In this presentation we will briefly cover the major findings in [3]. We will then extend this work further into MHD. This analysis will be augmented with numerical experiments with the hope of gaining insight, particularly into how these errors accumulate over many time steps. [1] L. Chacon, D.A. Knoll, J.M. Finn, J. Comput. Phys., vol. 178, pp. 15-36 (2002) [2] L. Chacon and D.A. Knoll, J. Comput. Phys., vol. 188, pp. 573-592 (2003) [3] D.A. Knoll, L. Chacon, L.G. Margolin, V.A. Mousseau, J. Comput. Phys., vol. 185, pp. 583-611 (2003)
Stereo, Shading, and Surfaces: Curvature Constraints Couple Neural Computations
2014-04-23
Bullier, and J. S. Lund, ‘‘Circuits for local and global signal integration in primary visual cortex,’’ J. Neurosci., vol. 22, no. 19, pp. 8633–8646...cortex,’’ J. Neurosci., vol. 17, no. 6, pp. 2112–2127, Mar. 15, 1997. [16] Y. Boykov, O. Veksler, and R. Zabih, ‘‘Fast approximate energy minimization...plasticity: A Hebbian learning rule,’’ Annu. Rev. Neurosci., vol. 31, pp. 25–46, 2008. [19] V. A. Casagrande and J. H. Kaas, ‘‘The afferent, intrinsic
NOSC/ONR Robotics Bibliography (1961-1981).
1982-09-01
28, 6 Dec., 1979, p4 "DEFENSE EQUIPMENT FIRM TRAINS ROBOT TO PERFORM CRAFTSMAN-SKILLED TASK", Industrial Engineering, vol 13, no 5, May 1981, p90...1974, pCI-I-8 Gupton, J. A. Jr., "BUILD THIS UNICORN-1 ROBOT PART I", Radio-Electronics, vol 51, no 8, 1980, p37, 41, 76 Gupton, J. A. Jr..."BUILD THIS UNICORN-1 ROBOT PART II", Radio-Electronics, vol 51, no 9, Sept. 1980, p55-8 Gupton, J. A., Jr., "TALK TO A TURTLE; BUILD A COMPUTER
Arenas, Miguel
2015-04-01
NGS technologies enable fast and inexpensive generation of genomic data. Nevertheless, ancestral genome inference is not so straightforward, owing to complex evolutionary processes acting on this material, such as inversions, translocations, and other genome rearrangements that, in addition to their implicit complexity, can co-occur and confound ancestral inferences. Recently, models of genome evolution that accommodate such complex genomic events have been emerging. This letter explores these novel evolutionary models and proposes their incorporation into robust statistical approaches based on computer simulations, such as approximate Bayesian computation, which may produce a more realistic evolutionary analysis of genomic data. Advantages and pitfalls in using these analytical methods are discussed. Potential applications of these ancestral genomic inferences are also pointed out.
Mechanical Properties of Structural Polymers. Computer Simulations and Key Experiments
1992-09-30
Technology", edited by R.W. Cahn, P. Haasen, and E.J. Kramer (VCH: Weinheim, Germany) (vol. editor H. Mughrabi), Vol. 6, in the press (1992). 14. A.S. Argon...", Acta Metall. et Mater., submitted for publication. 5 32. A. Galeski, Z. Bartczak, A.S. Argon and R.E. Cohen, "Morphological Alterations during
Evolutionary computation in zoology and ecology.
Boone, Randall B
2017-12-01
Evolutionary computational methods have adopted attributes of natural selection and evolution to solve problems in computer science, engineering, and other fields. The method is growing in use in zoology and ecology. Evolutionary principles may be merged with an agent-based modeling perspective to have individual animals or other agents compete. Four main categories are discussed: genetic algorithms, evolutionary programming, genetic programming, and evolutionary strategies. In evolutionary computation, a population is represented in a way that allows an objective function relevant to the problem of interest to be assessed. The poorest-performing members are removed from the population, and the remaining members reproduce and may be mutated. The fitness of the members is again assessed, and the cycle continues until a stopping condition is met. Case studies include optimizing egg shape for different clutch sizes, mate selection, the migration of wildebeest, birds, and elk, vulture foraging behavior, algal bloom prediction, and species richness under energy constraints. Other case studies simulate the evolution of species and a means to project shifts in species ranges in response to a changing climate that includes competition and phenotypic plasticity. This introduction concludes by citing other uses of evolutionary computation and a review of the flexibility of the methods. For example, representing species' niche spaces subject to selective pressure allows studies of cladistics, the taxon cycle, neutral versus niche paradigms, fundamental versus realized niches, community structure and order of colonization, invasiveness, and responses to a changing climate.
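The cycle this abstract describes (assess the population, remove the poorest members, let survivors reproduce with mutation, repeat until a stopping condition) can be sketched in a few lines. The objective here, matching a target value, is an invented stand-in, not one of the paper's case studies.

```python
import random

# Toy fitness: higher is better the closer an individual is to the target.
def fitness(x, target=3.7):
    return -abs(x - target)

def evolutionary_cycle(pop_size=20, cull=10, max_gens=100, tol=0.01, seed=3):
    rng = random.Random(seed)
    pop = [rng.uniform(-10, 10) for _ in range(pop_size)]
    for _ in range(max_gens):
        pop.sort(key=fitness, reverse=True)        # assess the population
        if -fitness(pop[0]) < tol:                 # stopping condition met
            break
        pop = pop[: pop_size - cull]               # remove poorest members
        while len(pop) < pop_size:                 # survivors reproduce...
            child = rng.choice(pop[: pop_size - cull]) + rng.gauss(0, 0.5)
            pop.append(child)                      # ...with Gaussian mutation
    return max(pop, key=fitness)

best = evolutionary_cycle()
```

In an agent-based setting of the kind the abstract mentions, each float would be replaced by an agent's behavioral parameters and fitness() by the outcome of simulated competition.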
From computers to cultivation: reconceptualizing evolutionary psychology.
Barrett, Louise; Pollet, Thomas V; Stulp, Gert
2014-01-01
Does evolutionary theorizing have a role in psychology? This is a more contentious issue than one might imagine, given that, as evolved creatures, the answer must surely be yes. The contested nature of evolutionary psychology lies not in our status as evolved beings, but in the extent to which evolutionary ideas add value to studies of human behavior, and the rigor with which these ideas are tested. This, in turn, is linked to the framework in which particular evolutionary ideas are situated. While the framing of the current research topic places the brain-as-computer metaphor in opposition to evolutionary psychology, the most prominent school of thought in this field (born out of cognitive psychology, and often known as the Santa Barbara school) is entirely wedded to the computational theory of mind as an explanatory framework. Its unique aspect is to argue that the mind consists of a large number of functionally specialized (i.e., domain-specific) computational mechanisms, or modules (the massive modularity hypothesis). Far from offering an alternative to, or an improvement on, the current perspective, we argue that evolutionary psychology is a mainstream computational theory, and that its arguments for domain-specificity often rest on shaky premises. We then go on to suggest that the various forms of e-cognition (i.e., embodied, embedded, enactive) represent a true alternative to standard computational approaches, with an emphasis on "cognitive integration" or the "extended mind hypothesis" in particular. We feel this offers the most promise for human psychology because it incorporates the social and historical processes that are crucial to human "mind-making" within an evolutionarily informed framework. In addition to linking to other research areas in psychology, this approach is more likely to form productive links to other disciplines within the social sciences, not least by encouraging a healthy pluralism in approach.
NASA Astrophysics Data System (ADS)
Ma, Yanlu
2013-04-01
Although most research nowadays focuses on the lateral heterogeneity of the 3D Earth, a spherically multi-layered model whose parameters depend only on depth still represents a good first-order approximation of the real Earth. Such 1D models can be used as starting models for seismic tomographic inversion or as background models in which source mechanisms are inverted. The problem of wave propagation in a spherically layered model was solved theoretically long ago (Takeuchi and Saito, 1972). Existing computer programs such as Mineos (developed by G. Masters, J. Woodhouse and F. Gilbert), Gemini (Friederich and Dalkolmo 1995), DSM (Kawai et al. 2006) and QSSP (Wang 1999) tackled the computational aspects of the problem. A new simple and fast program for computing the Green's function of a stack of spherical dissipative layers is presented here. The analytical solutions within each homogeneous spherical layer are joined through the continuous boundary conditions and propagated from the center of the model up to the level of the source depth. Another solution is built by propagating downward from the free surface of the model to the source level. The final solution is then constructed in the frequency domain from the previous two solutions so as to satisfy the discontinuities of displacements and stresses at the source level required by the focal mechanism. The numerical instability in the propagator approach is resolved by complementing the matrix propagation with an orthonormalization procedure (Wang 1999). Another instability, due to the high attenuation in the upper-mantle low-velocity zone, is overcome by switching the bases of solutions from the spherical Bessel functions to the spherical Hankel functions when necessary. We compared the synthetic seismograms obtained from the new program YASEIS with those computed by Gemini and QSSP.
At near distances, the synthetics from a reflectivity code for horizontally layered models are also compared with those from YASEIS. Finally, the static displacements in the source region are computed by choosing a very small frequency value in YASEIS, which is designed for computing the dynamic response, and compared with the results for a homogeneous half-space model (Okada 1992). [1] Friederich, W. and J. Dalkolmo (1995). Complete synthetic seismograms for a spherically symmetric Earth by a numerical computation of the Green's function in the frequency domain, Geophys. J. Int., vol. 122, 537-550. [2] Kawai, K., N. Takeuchi, and R.J. Geller (2006). Complete synthetic seismograms up to 2 Hz for transversely isotropic spherically symmetric media, Geophys. J. Int., vol. 164, 411-424. [3] Okada, Y. (1992). Internal deformation due to shear and tensile faults in a half-space, Bull. Seismol. Soc. Am., vol. 82, no. 2, 1018-1040. [4] Takeuchi, H. and M. Saito (1972). Seismic surface waves, Methods in Computational Physics, vol. 11, 217-295. [5] Wang, R. (1999). A simple orthonormalization method for stable and efficient computation of Green's functions, Bull. Seismol. Soc. Am., vol. 89, no. 3, 733-741.
Establishing Tools for Computing Hybrids
2006-10-01
moorelaw.html. September 26. pp. 1-28. 47. Sharma, Vijay. 2004. Is it Possible to Build Computers from Living Cells? BioTeach Journal, 2, 53-60. 48...by Vijay K. Varadan, Proceedings of SPIE Vol. 5389, SPIE, Bellingham, WA. pp. 298-305. 58. Warren, Paul. 2002. The Future of Computing: New
ERIC Educational Resources Information Center
Darrah, Charles N.
This book explores how people look at workplaces and the consequences for one's understanding of work. Chapter 1 discusses the rhetoric of skill requirements. Chapter 2 follows the attempt of Kramden Computers to provide training to reduce problems in workmanship and the program's failure due to the inadequacy of the skill concept. Chapter 3…
Lithium Niobate Arithmetic Logic Unit
1991-03-01
[Boot51] A.D. Booth, "A Signed Binary Multiplication Technique," Quarterly Journal of Mechanics and Applied Mathematics, Vol. IV, Part 2, 1951. [ChWi79...Trans. Computers, Vol. C-26, No. 7, July 1977, pp. 681-687. [Wake81] John F. Wakerly, "Microcomputer Architecture and Programming," John Wiley and...different division methods and discusses their applicability to simple bit-serial implementation. Several different designs are then presented and
Simulating Nonequilibrium Radiation via Orthogonal Polynomial Refinement
2015-01-07
measured by the preprocessing time, computer memory space, and average query time. In many search procedures for the number of points np of a data set, a...analytic expression for the radiative flux density is possible by the commonly accepted local thermal equilibrium (LTE) approximation. A semi...Vol. 227, pp. 9463-9476, 2008. 10. Galvez, M., Ray-tracing model for radiation transport in three-dimensional LTE system, App. Physics, Vol. 38
From computers to cultivation: reconceptualizing evolutionary psychology
Barrett, Louise; Pollet, Thomas V.; Stulp, Gert
2014-01-01
Does evolutionary theorizing have a role in psychology? This is a more contentious issue than one might imagine, given that, as evolved creatures, the answer must surely be yes. The contested nature of evolutionary psychology lies not in our status as evolved beings, but in the extent to which evolutionary ideas add value to studies of human behavior, and the rigor with which these ideas are tested. This, in turn, is linked to the framework in which particular evolutionary ideas are situated. While the framing of the current research topic places the brain-as-computer metaphor in opposition to evolutionary psychology, the most prominent school of thought in this field (born out of cognitive psychology, and often known as the Santa Barbara school) is entirely wedded to the computational theory of mind as an explanatory framework. Its unique aspect is to argue that the mind consists of a large number of functionally specialized (i.e., domain-specific) computational mechanisms, or modules (the massive modularity hypothesis). Far from offering an alternative to, or an improvement on, the current perspective, we argue that evolutionary psychology is a mainstream computational theory, and that its arguments for domain-specificity often rest on shaky premises. We then go on to suggest that the various forms of e-cognition (i.e., embodied, embedded, enactive) represent a true alternative to standard computational approaches, with an emphasis on “cognitive integration” or the “extended mind hypothesis” in particular. We feel this offers the most promise for human psychology because it incorporates the social and historical processes that are crucial to human “mind-making” within an evolutionarily informed framework. In addition to linking to other research areas in psychology, this approach is more likely to form productive links to other disciplines within the social sciences, not least by encouraging a healthy pluralism in approach. PMID:25161633
Update of aircraft profile data for the Integrated Noise Model computer program, vol 1: final report
DOT National Transportation Integrated Search
1992-03-01
This report provides aircraft takeoff and landing profiles, aircraft aerodynamic performance coefficients and engine performance coefficients for the aircraft data base (Database 9) in the Integrated Noise Model (INM) computer program. Flight profile...
Reliability of Computer Systems ODRA 1305 and R-32,
1983-03-25
RELIABILITY OF COMPUTER SYSTEMS ODRA 1305 AND R-32. By: Wit Drewniak. English pages: 12. Source: Informatyka, Vol. 14, Nr. 7, 1979, pp. 5-8. Country of...JS EMC computers installed in ZETO, Katowice", Informatyka, No. 7-8/78, deals with various reliability classes within the family of the machines of
Soviet Cybernetics Review, Vol. 3, No. 9, September 1969.
ERIC Educational Resources Information Center
Holland, Wade B., Ed.
The issue features articles and photographs of computers displayed at the Automation-69 Exhibition in Moscow, especially the Mir-1 and Ruta-110. Also discussed are the Doza analog computer for radiological dosage; 'on-the-fly' output printers; other ways to increase computer speed and productivity; and the planned ultra-high-energy 1000-Bev…
1991-02-10
transmitted into a VAX-11 computer. Fringe spacing...W.H. Peters, W.F. Ranson and S.R. McNeill, "Determination of displacements using an improved digital correlation method," Image and Vision Computing...experimental mechanics," Experimental Mechanics, Vol. 25, No. 3, pp. 232-244 (1985). A. Rosenfeld and A.C. Kak, Digital Picture Processing, Vol. 1
Bio-inspired algorithms applied to molecular docking simulations.
Heberlé, G; de Azevedo, W F
2011-01-01
Nature as a source of inspiration has been shown to have a great beneficial impact on the development of new computational methodologies. In this scenario, analyses of the interactions between a protein target and a ligand can be simulated by biologically inspired algorithms (BIAs). These algorithms mimic biological systems to create new paradigms for computation, such as neural networks, evolutionary computing, and swarm intelligence. This review provides a description of the main concepts behind BIAs applied to molecular docking simulations. Special attention is devoted to evolutionary algorithms, guided-directed evolutionary algorithms, and Lamarckian genetic algorithms. Recent applications of these methodologies to protein targets identified in the Mycobacterium tuberculosis genome are described.
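The evolutionary algorithms the review surveys share a common skeleton, which can be sketched with a toy example; the scoring function below is a hypothetical stand-in for a real docking score (lower is better), and all names and parameters are illustrative, not taken from any docking package.

```python
import random

def toy_score(pose):
    """Hypothetical stand-in for a docking scoring function: quadratic
    bowl with its minimum at an arbitrary reference pose."""
    ref = [1.0, -2.0, 0.5]
    return sum((p - r) ** 2 for p, r in zip(pose, ref))

def genetic_search(score, dim=3, pop_size=40, gens=200, seed=1):
    """Minimal genetic algorithm: rank selection, one-point crossover,
    Gaussian point mutation, with the surviving half kept as elites."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=score)
        survivors = pop[: pop_size // 2]          # selection (elitist)
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, dim)           # one-point crossover
            child = a[:cut] + b[cut:]
            child[rng.randrange(dim)] += rng.gauss(0, 0.1)  # mutation
            children.append(child)
        pop = survivors + children
    return min(pop, key=score)
```

A Lamarckian variant, as used in some docking programs, would additionally run a local search on each child and write the improved pose back into its genotype before the next generation.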
Establishing national diagnostic reference levels (DRLs) for computed tomography in Egypt.
Salama, Dina Husseiny; Vassileva, Jenia; Mahdaly, Gamal; Shawki, Mona; Salama, Ahmad; Gilley, Debbie; Rehani, Madan Mohan
2017-07-01
To establish national diagnostic reference levels (DRLs) in Egypt for computed tomography (CT) examinations of adults and to identify the potential for optimization. Data from 3762 individual patients undergoing CT scans of the head, chest (high resolution), abdomen, abdomen-pelvis, chest-abdomen-pelvis and CT angiography (aorta and both lower limbs) in 50 CT facilities were collected. This represents 20% of facilities in the country and all of the 27 Governorates. Results were compared with the DRLs of the UK, USA, Canada, Japan, Australia and France. The Egyptian DRLs for CTDIvol in mGy are: head, 30; chest (high resolution), 22; abdomen (liver metastasis), 31; abdomen-pelvis, 31; chest-abdomen-pelvis, 33; and CT angiography (aorta and lower limbs), 37. The corresponding DRLs for DLP in mGy.cm are 1360, 420, 1425, 1325, 1320 and 1320. For head CT, the Egyptian DRL for CTDIvol is 2-3 times lower than the DRLs from other countries; however, the DRL in terms of DLP is in the same range as, or higher than, the others. The Egyptian DRL for chest CT (high resolution) is similar to others for DLP but higher for CTDIvol. For abdomen and abdomen-pelvis, the DRLs for CTDIvol are higher than others. For DLP, the DRL for abdomen is higher than that of the UK and lower than that of Japan, while for abdomen-pelvis it is higher than in the other countries. Despite lower DRLs for CTDIvol, an important consistent problem appears to be a higher scan range, as the DRLs for DLP are higher. Copyright © 2017 Associazione Italiana di Fisica Medica. All rights reserved.
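National DRLs of the kind reported above are conventionally set at the 75th percentile of the distribution of per-facility median dose values (the ICRP Publication 135 convention); a minimal sketch, with purely illustrative survey numbers rather than the Egyptian data:

```python
import statistics

def national_drl(facility_median_doses):
    """Return the 75th percentile (third quartile) of the per-facility
    median doses -- the conventional choice for a national DRL."""
    q = statistics.quantiles(facility_median_doses, n=4)
    return q[2]  # third quartile
```

Facilities whose typical dose exceeds the DRL are then flagged for protocol review, which is the "potential for optimization" the abstract refers to.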
Menshutkin, V V; Kazanskiĭ, A B; Levchenko, V F
2010-01-01
The history of the rise and development of evolutionary methods in the Saint Petersburg school of biological modelling is traced and analyzed. Some pioneering works in the simulation of ecological and evolutionary processes performed in the St. Petersburg school became exemplars for many followers in Russia and abroad. The individual-based approach became the crucial point in the history of the school, as an adequate instrument for constructing models of biological evolution. This approach is natural for simulating the evolution of life-history parameters and adaptive processes in populations and communities. In some cases the simulated evolutionary process was used for solving an inverse problem, i.e., for estimating uncertain life-history parameters of a population. Evolutionary computation is one more aspect of applying this approach in a great many fields. The problems and vistas of ecological and evolutionary modelling in general are discussed.
Cooperation Helps Power Saving
2009-04-07
the destination node hears the poll, the link between the two nodes is activated. In the original STEM, two radios working on two separate channels are used: one radio is...Computer and Communications Societies. Proceedings. IEEE, vol. 3, pp. 1548-1557 vol. 3, 2001. [2] R. Kravets and P. Krishnan, "Application-driven power
Generating Textures for Arbitrary Surfaces Using Reaction-Diffusion
1990-01-01
Review and Classification," Computer Aided Design, Vol. 20, No. 1, pp. 27-38 (January/February 1988). [Hubel and Wiesel 79] Hubel, David H. and...columns found in mammals [Hubel and Wiesel 79]. Complex Patterns: This section shows how we can generate more complex patterns using reaction-diffusion by...Torsten N. Wiesel, "Brain Mechanisms of Vision," Scientific American, Vol. 241, No. 3, pp. 150-162 (September 1979). [Hunding 90] Hunding, Axel, Stuart A
Knowledge Guided Evolutionary Algorithms in Financial Investing
ERIC Educational Resources Information Center
Wimmer, Hayden
2013-01-01
A large body of literature exists on evolutionary computing, genetic algorithms, decision trees, codified knowledge, and knowledge management systems; however, the intersection of these computing topics has not been widely researched. Moving through the set of all possible solutions--or traversing the search space--at random exhibits no control…
Automated design of spacecraft systems power subsystems
NASA Technical Reports Server (NTRS)
Terrile, Richard J.; Kordon, Mark; Mandutianu, Dan; Salcedo, Jose; Wood, Eric; Hashemi, Mona
2006-01-01
This paper discusses the application of evolutionary computing to a dynamic space vehicle power subsystem resource and performance simulation in a parallel processing environment. Our objective is to demonstrate the feasibility, application and advantage of using evolutionary computation techniques for the early design search and optimization of space systems.
States Move toward Computer Science Standards. Policy Update. Vol. 23, No. 17
ERIC Educational Resources Information Center
Tilley-Coulson, Eve
2016-01-01
While educators and parents recognize computer science as a key skill for career readiness, only five states have adopted learning standards in this area. Tides are changing, however, as the Every Student Succeeds Act (ESSA) recognizes with its call on states to provide a "well-rounded education" for students, to include computer science…
Elliptic Length Scales in Laminar, Two-Dimensional Supersonic Flows
2015-06-01
sophisticated computational fluid dynamics (CFD) methods. Additionally, for 3D interactions, the length scales would require determination in spanwise as well...Manna, M., "Experimental, Analytical, and Computational Methods Applied to Hypersonic Compression Ramp Flows," AIAA Journal, Vol. 32, No. 2, Feb. 1994
Using modified fruit fly optimisation algorithm to perform the function test and case studies
NASA Astrophysics Data System (ADS)
Pan, Wen-Tsao
2013-06-01
Evolutionary computation is a computing paradigm that simulates natural evolutionary processes based on Darwinian theory, and it is a widely used research method. The main contribution of this paper is to reinforce the fruit fly optimization algorithm's (FOA) ability to search for the optimal solution, in order to avoid being trapped in local extrema. Evolutionary computation has grown to include the concepts of animal foraging behaviour and group behaviour. This study discussed three common evolutionary computation methods and compared them with the modified fruit fly optimization algorithm (MFOA). It further investigated their ability to compute the extreme values of three mathematical functions, their execution speed, and the forecast ability of the forecasting model built using the optimised general regression neural network (GRNN) parameters. The findings indicated no obvious difference between particle swarm optimization and the MFOA in computing extreme values; however, both were better than the artificial fish swarm algorithm and the FOA. In addition, the MFOA executed faster than particle swarm optimization, and the forecast ability of the forecasting model built using the MFOA's GRNN parameters was better than that of the other three forecasting models.
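The classic FOA that the paper modifies can be sketched in a few lines (this is the original algorithm's osphresis/vision loop, not Pan's modified MFOA, and the parameters below are illustrative): flies scatter randomly around the swarm centre, the "smell concentration" S = 1/distance is fed to the fitness function, and the swarm flies to the best location found.

```python
import math
import random

def foa_minimize(f, iters=200, flies=30, seed=2):
    """Minimal classic fruit fly optimization algorithm (FOA) sketch.
    Returns the best smell concentration S and its fitness value."""
    rng = random.Random(seed)
    x, y = rng.uniform(0, 1), rng.uniform(0, 1)    # initial swarm location
    best_val = float("inf")
    best_s = None
    for _ in range(iters):
        for _ in range(flies):
            # each fly searches a random direction/distance by osphresis
            xi, yi = x + rng.uniform(-1, 1), y + rng.uniform(-1, 1)
            d = math.hypot(xi, yi) or 1e-12
            s = 1.0 / d                            # smell concentration
            v = f(s)
            if v < best_val:                       # vision phase: keep the best
                best_s, best_val, bx, by = s, v, xi, yi
        x, y = bx, by                              # swarm flies to the best spot
    return best_s, best_val
```

Because every candidate S is derived from 1/distance, the plain FOA explores a restricted, non-negative range; the MFOA's changes are aimed precisely at this kind of limitation and at escaping local extrema.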
Social Media: Menagerie of Metrics
2010-01-27
intelligence, an evolutionary algorithm (EA) is a subset of evolutionary computation, a generic population-based metaheuristic optimization algorithm. An EA...Cloning - 22 animals were cloned to date; genetic algorithms can help prediction (e.g. "elitism" - attempts to ensure selection by including performers...28, 2010. Evolutionary Algorithm. Evolutionary algorithm. From Wikipedia, the free encyclopedia. Artificial intelligence portal. In artificial
NASA Astrophysics Data System (ADS)
Jones, R.
Today the consensus view is that thought and mind are a combination of processes like memory, generalization, comparison, deduction, organization, analogy, etc., performed by classical computational machinery (R. Jones, Trans. Kansas Acad. Sci., vol. 109, #3/4, 2006). But I believe quantum mechanics is a more plausible dualist theory of reality (R. Jones, Bull. Am. Phys. Soc., vol. 5, 2011). In a quantum computer the processing (thinking) takes place either in computers in Everett's many worlds or else in the many-dimensional Hilbert space (depending upon your interpretation of QM). If our brains were quantum computers then there might be a world of mind which is distinct from the physical world that our bodies occupy (4-space). This is much like the spirit-body dualism of Descartes and others. My own view is that thought and mind are classical phenomena (see www.robert-w-jones.com, philosopher, theory of thought and mind), but it would be interesting to run an artificial intelligence like my A.S.A. H. on a quantum computer. Might this produce, for the first time, a hypermind in its own universe?
Velocity and Structure Estimation of a Moving Object Using a Moving Monocular Camera
2006-01-01
map the Euclidean position of static landmarks or visual features in the environment. Recent applications of this technique include aerial...From Motion in a Piecewise Planar Environment," International Journal of Pattern Recognition and Artificial Intelligence, Vol. 2, No. 3, pp. 485-508, 1988. [9] J. M. Ferryman, S. J. Maybank, and A. D. Worrall, "Visual Surveillance for Moving Vehicles," Intl. Journal of Computer Vision, Vol. 37, No
Numerical Simulation of the Interaction of a Vortex with Stationary Airfoil in Transonic Flow,
1984-01-12
Goorjian, P. M., "Implicit Finite-Difference Computations of Unsteady Transonic Vortex Wakes," AIAA Journal, Vol. 15, No. 4, April 1977, pp. 581-590...Difference Simulations of Three-Dimensional Flow," AIAA Journal, Vol. 18, No. 2...tion of Wing-Vortex Interaction in Transonic Flow Using Implicit...assumptions are made in modeling the nonlinear vortex wake structure (p = density, p_ = free-stream density). Numerical algorithms based on the Euler equations
The Bulletin of Military Operations Research, PHALANX, Vol. 31, No. 2.
1998-06-01
introduction of the Pentium II processor, the writeable CD, and the Digital Video Disc (DVD). Just around the corner, around the turn of the century...broader audience. Presentations that use special visual aids (videos, computers, etc.), short presentations best depicted with color charts...Throughout the treatment of data, another weapon we should take is Tukey's Torpedo (John W. Tukey, "Sunset Salvo," The American Statistician, vol
Statistical and Variational Methods for Problems in Visual Control
2009-03-02
plane curves to round points," J. Differential Geometry 26 (1987), pp. 285-314. [7] S. Haker, G. Sapiro, and A. Tannenbaum, "Knowledge-based...segmentation of SAR data with learned priors," IEEE Trans. Image Processing, vol. 9, pp. 298-302, 2000. [8] S. Haker, L. Zhu, S. Angenent, and A. Tannenbaum, "Optimal mass transport for registration and warping," Int. Journal Computer Vision, vol. 60, pp. 225-240, 2004. [9] S. Haker, G. Sapiro, A
Chemistry and Physics of Analyte Identification in Integrated Nanosensors
2009-02-05
points," J. Differential Geometry 26 (1987), pp. 285-314. [7] S. Haker, G. Sapiro, and A. Tannenbaum, "Knowledge-based segmentation of SAR data with...learned priors," IEEE Trans. Image Processing, vol. 9, pp. 298-302, 2000. [8] S. Haker, L. Zhu, S. Angenent, and A. Tannenbaum, "Optimal mass...transport for registration and warping," Int. Journal Computer Vision, vol. 60, pp. 225-240, 2004. [9] S. Haker, G. Sapiro, A. Tannenbaum, and D. Washburn
1985-11-18
Greenberg and K. Sakallah at Digital Equipment Corporation, and C-F. Chen, L. Nagel, and P. Subrahmanyam at AT&T Bell Laboratories, both for providing...Circuit Theory, McGraw-Hill, 1969. [37] R. Courant and D. Hilbert, Partial Differential Equations, Vol. 2 of Methods of Mathematical Physics...McGraw-Hill, N.Y., 1965. [44] R. Courant and D. Hilbert, Partial Differential Equations, Vol. 2 of Methods of Mathematical Physics
Lee, Wei-Po; Hsiao, Yu-Ting; Hwang, Wei-Che
2014-01-16
To improve on the tedious task of reconstructing gene networks by experimentally testing the possible interactions between genes, it has become a trend to adopt automated reverse-engineering procedures instead. Some evolutionary algorithms have been suggested for deriving network parameters. However, to infer large networks with an evolutionary algorithm, two important issues must be addressed: premature convergence and high computational cost. To tackle the former and enhance the performance of traditional evolutionary algorithms, it is advisable to use parallel-model evolutionary algorithms. To overcome the latter and speed up the computation, cloud computing is advocated as a promising solution; most popular is the MapReduce programming model, a fault-tolerant framework for implementing parallel algorithms to infer large gene networks. This work presents a practical framework to infer large gene networks by developing and parallelizing a hybrid GA-PSO optimization method. Our parallel method is extended to work with the Hadoop MapReduce programming model and is executed in different cloud computing environments. To evaluate the proposed approach, we use the well-known open-source software GeneNetWeaver to create several yeast S. cerevisiae sub-networks and use them to produce gene profiles. Experiments have been conducted and the results analyzed. They show that our parallel approach can successfully infer networks with the desired behaviors and that the computation time can be largely reduced. Parallel population-based algorithms can effectively determine network parameters, and they perform better than the widely used sequential algorithms in gene network inference. These parallel algorithms can be distributed to the cloud computing environment to speed up the computation.
By coupling the parallel-model population-based optimization method with the parallel computational framework, high-quality solutions can be obtained within a relatively short time. This integrated approach is a promising way to infer large networks.
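One way to hybridize a GA with PSO, in the spirit of the method described (the paper's actual operators and its MapReduce parallelization are not reproduced here; every parameter below is an illustrative assumption), is to refine the better half of the population with PSO velocity updates while rebuilding the worse half by GA crossover and mutation of the elites:

```python
import random

def hybrid_ga_pso(f, dim=2, pop=30, iters=150, seed=3):
    """Toy serial sketch of a hybrid GA-PSO minimizer for fitness f."""
    rng = random.Random(seed)
    xs = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop)]
    vs = [[0.0] * dim for _ in range(pop)]
    pbest = [x[:] for x in xs]                    # personal bests
    gbest = min(xs, key=f)[:]                     # global best
    for _ in range(iters):
        order = sorted(range(pop), key=lambda i: f(xs[i]))
        elite = order[: pop // 2]
        for i in elite:                           # PSO refinement of the elites
            for d in range(dim):
                vs[i][d] = (0.7 * vs[i][d]
                            + 1.5 * rng.random() * (pbest[i][d] - xs[i][d])
                            + 1.5 * rng.random() * (gbest[d] - xs[i][d]))
                xs[i][d] += vs[i][d]
            if f(xs[i]) < f(pbest[i]):
                pbest[i] = xs[i][:]
        for i in order[pop // 2:]:                # GA rebuild of the worse half
            a, b = rng.sample(elite, 2)
            cut = rng.randrange(1, dim)           # one-point crossover
            xs[i] = xs[a][:cut] + xs[b][cut:]
            xs[i][rng.randrange(dim)] += rng.gauss(0, 0.3)  # mutation
            if f(xs[i]) < f(pbest[i]):
                pbest[i] = xs[i][:]
        gbest = min(pbest, key=f)[:]
    return gbest
```

In the paper's MapReduce setting, the fitness evaluations (here the calls to f, which would each require simulating a candidate gene network) are the expensive step farmed out to mappers, while the selection and update logic plays the role of the reducer.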
Evolutionary computing for the design search and optimization of space vehicle power subsystems
NASA Technical Reports Server (NTRS)
Kordon, M.; Klimeck, G.; Hanks, D.
2004-01-01
Evolutionary computing has proven to be a straightforward and robust approach for optimizing a wide range of difficult analysis and design problems. This paper discusses the application of these techniques to an existing space vehicle power subsystem resource and performance analysis simulation in a parallel processing environment.
NASA Astrophysics Data System (ADS)
Konstantaras, Anthony; Katsifarakis, Emmanouil; Artzouxaltzis, Xristos; Makris, John; Vallianatos, Filippos; Varley, Martin
2010-05-01
This paper is a preliminary investigation of the possible correlation between temporal and energy-release patterns of seismic activity during the preparation processes of consecutive sizeable seismic events [1,2]. The background idea is that during periods of low-level seismic activity, stress processes in the crust accumulate energy at the seismogenic area, whilst larger seismic events act as a decongesting mechanism releasing considerable energy [3,4]. A dynamic algorithm is being developed aiming to identify and cluster pre- and post-seismic events with the main earthquake, following on research carried out by Zubkov [5] and Dobrovolsky [6,7]. This clustering technique, along with energy-release equations dependent on Richter's scale [8,9], allows an estimate to be drawn of the amount of energy released by the seismic sequence. The above approach is being implemented as a monitoring tool to investigate the behaviour of the underlying energy management system by introducing this information to various neural [10,11] and soft computing models [1,12,13,14]. The incorporation of intelligent systems aims at detecting and simulating the possible relationship between energy-release patterns and the time intervals between consecutive sizeable earthquakes [1,15]. Successful training of the imported intelligent systems may result in a real-time, on-line processing methodology [1,16] capable of dynamically approximating the time interval between the latest and the next forthcoming sizeable seismic event by monitoring the energy-release process in a specific seismogenic area. Indexing terms: pattern recognition, long-term earthquake precursors, neural networks, soft computing, earthquake occurrence intervals. References: [1] Konstantaras A., Vallianatos F., Varley M.R. and Makris J.P.: 'Soft computing modelling of seismicity in the southern Hellenic arc', IEEE Geoscience and Remote Sensing Letters, vol. 5 (3), pp. 323-327, 2008. [2] Eneva M.
and Ben-Zion Y.: ‘Techniques and parameters to analyze seismicity patterns associated with large earthquakes', Geophysics Res., vol. 102, pp. 17785-17795, 1997a [3] Habermann R. E.: ‘Precursory seismic quiescence: past, present and future', Pure Applied Geophysics, vol. 126, pp. 279-318, 1988 [4] Matthews M. V. and Reasenberg P. A.: ‘Statistical methods for investigating quiescence and other temporal seismicity patterns', Pure Applied Geophysics, vol. 126, pp. 357-372, 1988 [5] Zubkov S. I.: ‘The appearance times of earthquake precursors', Izv. Akad. Nauk SSSR Fiz. Zemli (Solid Earth), No. 5, pp. 87-91, 1987 [6] Dobrovolsky I. P., Zubkov S. I. and Miachkin V. I.: ‘Estimation of the size of earthquake preparation zones', Pageoph, vol. 117, pp. 1025-1044, 1979 [7] Dobrovolsky I. P., Gershenzon N. I. And Gokhberg M. B.: ‘Theory of electrokinetic effects occurring at the final stage in the preparation of a tectonic earthquake', Physics of the Earth and Planetary Interiors, vol. 57, pp. 144-156, 1989 [8] Richter C. F.: ‘Elementary Seismology', W.H.Freeman and Co., San Francisco, 1958 [9] Choy G. L. and Boatwright J. L.: ‘Global patterns of radiated seismic energy and apparent stress', Journal of Geophysical Research, vol. 84 (B5), pp. 2348-2350, 1995 [10] Haykin S.: ‘Neural Networks', 2nd Edition, Prentice Hall, 1999 [11] Jang J., Sun T. and Mizutany E.: ‘Neuro-fuzzy and soft computing', Prentice Hall, Upper Saddle River, NJ, 1997 [12] Konstantaras A., Varley M.R., Vallianatos F., Collins G. and Holifield P.: ‘Detection of weak seismo-electric signals upon the recordings of the electrotelluric field by means of neuron-fuzzy technology', IEEE Geoscience and Remote Sensing Letters, vol. 4 (1), 2007 [13] Konstantaras A., Varley M.R., Vallianatos F., Collins G. and Holifield P.: ‘Neuro-fuzzy prediction-based adaptive filtering applied to severely distorted magnetic field recordings', IEEE Geoscience and Remote Sensing Letters, vol. 
3 (4), 2006 [14] Maravelakis E., Bilalis N., Keith J. and Antoniadis A.: ‘Measuring and Benchmarking the Innovativeness of SME's: a three dimensional Fuzzy Logic Approach', Production Planning and Control Journal, vol. 17 (3), pp. 283-292, 2006 [15] Bodri B.: ‘A neural-network model for earthquake occurrence', Geodynamics, vol. 32, pp. 289-310, 2001 [16] Skounakis E., Karagiannis V. and Vlissidis A.: ‘A Versatile System for Real-time Analyzing and Testing Objects Quality', Proceedings-CD of the 4th International Conference on "New Horizons in Industry, Business and Education" (NHIBE 2005), Corfu, Greece, pp. 701-708, 2005
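The "energy-release equations dependent on Richter's scale" [8,9] that the abstract relies on are, in their standard form, the Gutenberg-Richter energy relation; assuming that choice (the paper may use a variant), the energy budget of a sequence can be sketched as:

```python
def seismic_energy_joules(ms):
    """Gutenberg-Richter energy relation log10 E[erg] = 11.8 + 1.5*Ms,
    converted to joules (1 J = 1e7 erg)."""
    return 10.0 ** (11.8 + 1.5 * ms) / 1e7

# Each unit of magnitude corresponds to a factor of 10**1.5 ~ 31.6 in energy,
# which is why a single sizeable event dominates the release budget of a
# sequence of smaller pre- and post-seismic events.
ratio = seismic_energy_joules(6.0) / seismic_energy_joules(5.0)
```

Summing this quantity over the clustered pre- and post-seismic events gives the released-energy estimate that is then fed to the neural and soft-computing models.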
Reconstructing evolutionary trees in parallel for massive sequences.
Zou, Quan; Wan, Shixiang; Zeng, Xiangxiang; Ma, Zhanshan Sam
2017-12-14
Building evolutionary trees for massive unaligned DNA sequences is crucial but challenging, and reconstructing an evolutionary tree for ultra-large sequence sets is hard. Massive multiple sequence alignment is likewise challenging and time- and space-consuming. Hadoop and Spark, developed recently, shed new light on these classical computational biology problems. In this paper, we try to solve multiple sequence alignment and evolutionary reconstruction in parallel. HPTree, developed in this paper, can deal with big DNA sequence files quickly. It works well on files larger than 1 GB and achieves better performance than other evolutionary reconstruction tools. Users can run HPTree to reconstruct evolutionary trees on computer clusters or cloud platforms (e.g., Amazon Cloud). HPTree can help with population evolution research and metagenomics analysis. We employ the Hadoop and Spark platforms and design an evolutionary tree reconstruction software tool for unaligned massive DNA sequences. Clustering and multiple sequence alignment are done in parallel, and the neighbour-joining model is employed for building the evolutionary tree. We have released our software together with source code via http://lab.malab.cn/soft/HPtree/ .
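The neighbour-joining model at the core of the tree-building step can be sketched serially (HPTree's actual pipeline adds clustering, alignment, and Hadoop/Spark parallelism; this toy version returns only the topology, omitting branch lengths): repeatedly join the pair (i, j) minimizing Q(i, j) = (n-2)*d(i, j) - r(i) - r(j), where r is the row sum of the distance matrix.

```python
def neighbor_joining(names, d):
    """Minimal neighbour-joining sketch. `d` is a symmetric distance
    matrix (list of lists) over `names`; returns a nested-tuple tree."""
    trees = list(names)
    while len(trees) > 2:
        n = len(trees)
        r = [sum(row) for row in d]
        # pick the pair minimizing the Q criterion
        i, j = min(((a, b) for a in range(n) for b in range(a + 1, n)),
                   key=lambda p: (n - 2) * d[p[0]][p[1]] - r[p[0]] - r[p[1]])
        # distances from every remaining node to the new joined node
        new_row = [0.5 * (d[i][k] + d[j][k] - d[i][j])
                   for k in range(n) if k not in (i, j)]
        # rebuild the matrix without rows/cols i, j, then append the new node
        keep = [k for k in range(n) if k not in (i, j)]
        d = [[d[a][b] for b in keep] for a in keep]
        for row, extra in zip(d, new_row):
            row.append(extra)
        d.append(new_row + [0.0])
        trees = [trees[k] for k in keep] + [(trees[i], trees[j])]
    return (trees[0], trees[1])
```

Each join is O(n^2) over the current matrix, giving O(n^3) overall, which is exactly why distributing the distance computations and joins matters at the >1 GB scale the abstract targets.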
Nonlinear Real-Time Optical Signal Processing
1990-09-01
pattern recognition. Additional work concerns the relationship of parallel computation paradigms to optical computing and halftone screen techniques for implementing general nonlinear functions. Research Progress: This section...Vol. 23, No. 8, pp. 34-57, 1986. 2.4 Nonlinear Optical Processing with Halftones: Degradation and Compensation Models. This paper is concerned with
SSC San Diego Biennial Review 2003. Vol 2: Communication and Information Systems
2003-01-01
University, Department of Electrical and Computer Engineering) Michael Jablecki (Science and Technology Corporation) Stochastic Unified Multiple...wearable computers and cellular phones. The technology-transfer process involved a coalition of government and industrial partners, each providing...the design and fabrication of the coupler. SSC San Diego developed a computer-controlled fused-fiber fabrication station to achieve the required
MEGA X: Molecular Evolutionary Genetics Analysis across Computing Platforms.
Kumar, Sudhir; Stecher, Glen; Li, Michael; Knyaz, Christina; Tamura, Koichiro
2018-06-01
The Molecular Evolutionary Genetics Analysis (Mega) software implements many analytical methods and tools for phylogenomics and phylomedicine. Here, we report a transformation of Mega to enable cross-platform use on Microsoft Windows and Linux operating systems. Mega X does not require virtualization or emulation software and provides a uniform user experience across platforms. Mega X has additionally been upgraded to use multiple computing cores for many molecular evolutionary analyses. Mega X is available in two interfaces (graphical and command line) and can be downloaded from www.megasoftware.net free of charge.
Eco-Evo PVAs: Incorporating Eco-Evolutionary Processes into Population Viability Models
We synthesize how advances in computational methods and population genomics can be combined within an Ecological-Evolutionary (Eco-Evo) PVA model. Eco-Evo PVA models are powerful new tools for understanding the influence of evolutionary processes on plant and animal population pe...
Enhancing Army S&T Vol. 2: The Future
2012-03-01
Numerical Integrator and Computer (ENIAC), was commissioned by the Army’s Ballistic Research Laboratory in 1943 and operated for several years at the Army’s...Aberdeen Proving Ground? The ENIAC is considered to be the genesis of modern digital computing. It is often the case that the Army’s laboratories have
Soldier Decision-Making for Allocation of Intelligence, Surveillance, and Reconnaissance Assets
2014-06-01
Judgments; also called Algorithmic or Statistical Judgments Computer Science, Psychology, and Statistics Actuarial or algorithmic...Jan. 2011. [17] R. M. Dawes, D. Faust, and P. E. Meehl, “Clinical versus Actuarial Judgment,” Science, vol. 243, no. 4899, pp. 1668–1674, 1989. [18...School of Computer Science
Dynamic Human-Computer Collaboration in Real-time Unmanned Vehicle Scheduling
2010-06-01
Rarely play games Play games once a month Weekly gamer A few times a week gamer Daily gamer Types of games played: 9. Rate...Algorithm, Alchemy, or Apostasy?," International Journal of Human-Computer Studies, vol. 52, pp. 203-216, 2000. [52] J.-M. Hoc, "From Human
Ventilation-Perfusion Relationships Following Experimental Pulmonary Contusion
2007-06-14
696.7 ± 6.1 to 565.0 ± 24.3 Hounsfield units), as did VOL (4.3 ± 0.5 to 33.5 ± 3.2%). Multivariate linear regression of MGSD, VOL, VD/VT, and QS vs. PaO2...parenchyma was separated into four regions based on the Hounsfield unit (HU) ranges reported by Gattinoni et al. (23) via a segmentation process executed...determined by repeated measures ANOVA. CT, computed tomography; MGSD, mean gray-scale density of the entire lung by CT scan; HU, Hounsfield units
Generalizing the Nonlocal-Means to Super-Resolution Reconstruction
2008-12-12
Image Process., vol. 5, no. 6, pp. 996–1011, Jun. 1996. [7] A. J. Patti, M. I. Sezan, and M. A. Tekalp, “Superresolution video reconstruction with...computationally efficient image superresolution algorithm,” IEEE Trans. Image Process., vol. 10, no. 4, pp. 573–583, Apr. 2001. [13] M. Elad and Y...pp. 21–36, May 2003. [18] S. Farsiu, D. Robinson, M. Elad, and P. Milanfar, “Robust shift and add approach to superresolution,” in Proc. SPIE Conf
Clustering Similarity Digest Bloom Filters in Self-Organizing Maps
2012-12-01
www.sciencedirect.com/science/article/pii/S1742287610000368 [4] M. Rogers, J. Goldman, R. Mislan, T. Wedge, and S. Debrota, “Computer forensics field triage...1990. [9] T. Kohonen, S. Kaski, K. Lagus, J. Salojarvi, J. Honkela, V. Paatero, and A. Saarela, “Self organization of a massive document collection...the IEEE-INNS-ENNS International Joint Conference on, vol. 6, 2000, pp. 15–19. [12] G. Salton, A. Wong, and C. Yang, “A vector space model for
Efficient Numerical Methods for Nonequilibrium Re-Entry Flows
2014-01-14
right-hand side is the only quadratic operation). The number of sub-iterations, kmax, used in this update needs to be chosen for optimal convergence and...Upper Symmetric Gauss-Seidel Method for the Euler and Navier-Stokes Equations,” AIAA Journal, Vol. 26, No. 9, pp. 1025-1026, Sept. 1988. 11Edwards, J.R...Candler, “The Solution of the Navier-Stokes Equations Using Gauss-Seidel Line Relaxation,” Computers and Fluids, Vol. 17, No. 1, pp. 135-150, 1989
1991-05-01
aspects of planning air interdiction capability other than reviewing the available maps, photographic missions (e.g., computing fuel and mission timelines)...photographs, FLIR or radar pictures of the waypoints and targets, communications that allow the mission to be rehearsed. In-flight circumstances are...Planning Aircraft In Flight MPS Geographical & Meteorological Terrain and Cultural Features Image Data (e.g., Photographic) Weather Data and Update Data
Surface electrical properties experiment, Part 3
NASA Technical Reports Server (NTRS)
1974-01-01
A complete unified discussion of the electromagnetic response of a plane stratified structure is reported. A detailed and comprehensive analysis of the theoretical parts of the electromagnetic problem is given. The numerical problem of computing values of the electromagnetic field strengths is discussed. It is shown that the analysis of the conductive media is not very far removed from the theoretical analysis, and the numerical difficulties are not as acute as for the low-loss problem. For Vol. 1, see N75-15570; for Vol. 2, see N75-15571.
1992-01-01
[table fragment: 2-layer algebraic eddy viscosity (Baldwin and Lomax, 1978); implicit (MacCormack, 1982); Lawrence et al. (1987); corner flow at M = 14.1; NASA]...for Turbulence Research NASA Ames/Stanford Summer Programme," Journal of Fluid Mechanics, Vol. 190, pp. 375-392. Hussain, A.K.M.F., (1986): "Coherent...the development of a Reynolds-stress turbulence closure," Journal of Fluid Mechanics, Vol. 68, pp. 537-566. Lawrence, S. L., and A. Balakrishnan (1988
Assessing the Need for Supercomputing Resources Within the Pacific Area of Responsibility
2015-05-26
portion of today’s research and development dollars are going toward developing machines that will be better suited for addressing big data applications...2009; Radu Sion, “To Cloud or Not to? Musings on Clouds, Security and Big Data,” in Secure Data Management, Vol. 8425, May 2014, pp. 3–5; Yao Chen...Applied Parallel and Scientific Computing, Vol. 7134, 2010. Sion, Radu, “To Cloud or Not to? Musings on Clouds, Security and Big Data,” in Secure Data
A Collaborative 20 Questions Model for Target Search with Human-Machine Interaction
2013-05-01
optimal policies for entropy loss,” Journal of Applied Probability, vol. 49, pp. 114–136, 2012. [2] R. Castro and R. Nowak, “Active learning and...vol. 10, pp. 223–231, 1974. [8] R. Castro, Active Learning and Adaptive Sampling for Nonparametric Inference, Ph.D. thesis, Rice University, August...2007. [9] R. Castro and R. D. Nowak, “Upper and lower bounds for active learning,” in 44th Annual Allerton Conference on Communication, Control and Computing, 2006.
NASA Astrophysics Data System (ADS)
Okanoya, Kazuo
2014-09-01
The comparative computational approach of Fitch [1] attempts to renew the classical David Marr paradigm of computation, algorithm, and implementation by introducing an evolutionary view of the relationship between neural architecture and cognition. This comparative evolutionary view provides constraints useful in narrowing down the problem space for both cognition and neural mechanisms. I will provide two examples from our own studies that reinforce and extend Fitch's proposal.
Learning Evolution and the Nature of Science Using Evolutionary Computing and Artificial Life
ERIC Educational Resources Information Center
Pennock, Robert T.
2007-01-01
Because evolution in natural systems happens so slowly, it is difficult to design inquiry-based labs where students can experiment and observe evolution in the way they can when studying other phenomena. New research in evolutionary computation and artificial life provides a solution to this problem. This paper describes a new A-Life software…
A Novel Handwritten Letter Recognizer Using Enhanced Evolutionary Neural Network
NASA Astrophysics Data System (ADS)
Mahmoudi, Fariborz; Mirzashaeri, Mohsen; Shahamatnia, Ehsan; Faridnia, Saed
This paper introduces a novel design for handwritten letter recognition that combines a back-propagation neural network with an enhanced evolutionary algorithm. The input to the neural network is produced by a new feature-extraction approach that is invariant to translation, rotation, and scaling of the input letters. The evolutionary algorithm performs the global search of the search space, while the back-propagation algorithm performs the local search. Results were computed by applying this approach to recognize the 26 English capital letters in the handwriting of different people. The computational results show that the neural network achieves very satisfying accuracy with relatively scarce input data, and the hybrid evolutionary back-propagation algorithm exhibits a promising improvement in convergence.
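The global-plus-local division of labour described above can be illustrated with a toy memetic loop. Here plain gradient steps stand in for back-propagation, and the loss is a stand-in for the network error; all names and settings are illustrative assumptions, not the paper's implementation:

```python
import random

def loss(w):
    # toy stand-in for the network error surface (minimum at w = (3, -1))
    return (w[0] - 3.0) ** 2 + (w[1] + 1.0) ** 2

def grad(w):
    return [2 * (w[0] - 3.0), 2 * (w[1] + 1.0)]

def hybrid_search(pop_size=20, generations=30, local_steps=10, lr=0.1, seed=1):
    rng = random.Random(seed)
    pop = [[rng.uniform(-10, 10), rng.uniform(-10, 10)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=loss)
        elite = pop[:pop_size // 2]
        # local search: a few gradient steps refine each elite individual
        for w in elite:
            for _ in range(local_steps):
                g = grad(w)
                w[0] -= lr * g[0]
                w[1] -= lr * g[1]
        # global search: Gaussian mutations of the elites refill the population
        pop = elite + [[w[0] + rng.gauss(0, 1), w[1] + rng.gauss(0, 1)]
                       for w in elite]
    return min(pop, key=loss)
```

The evolutionary layer keeps the search from stalling in poor regions, while the gradient layer sharpens the best candidates each generation.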
Using concepts from biology to improve problem-solving methods
NASA Astrophysics Data System (ADS)
Goodman, Erik D.; Rothwell, Edward J.; Averill, Ronald C.
2011-06-01
Observing nature has been a cornerstone of engineering design. Today, engineers look not only at finished products, but imitate the evolutionary process by which highly optimized artifacts have appeared in nature. Evolutionary computation began by capturing only the simplest ideas of evolution, but today, researchers study natural evolution and incorporate an increasing number of concepts in order to evolve solutions to complex engineering problems. At the new BEACON Center for the Study of Evolution in Action, studies in the lab and field and in silico are laying the groundwork for new tools for evolutionary engineering design. This paper, which accompanies a keynote address, describes various steps in development and application of evolutionary computation, particularly as regards sensor design, and sets the stage for future advances.
Evolutionary Computation for the Identification of Emergent Behavior in Autonomous Systems
NASA Technical Reports Server (NTRS)
Terrile, Richard J.; Guillaume, Alexandre
2009-01-01
Over the past several years the Center for Evolutionary Computation and Automated Design at the Jet Propulsion Laboratory has developed a technique based on Evolutionary Computational Methods (ECM) that allows for the automated optimization of complex computationally modeled systems. An important application of this technique is for the identification of emergent behaviors in autonomous systems. Mobility platforms such as rovers or airborne vehicles are now being designed with autonomous mission controllers that can find trajectories over a solution space that is larger than can reasonably be tested. It is critical to identify control behaviors that are not predicted and can have surprising results (both good and bad). These emergent behaviors need to be identified, characterized and either incorporated into or isolated from the acceptable range of control characteristics. We use cluster analysis of automatically retrieved solutions to identify isolated populations of solutions with divergent behaviors.
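The clustering step can be sketched with a plain k-means loop over behaviour vectors extracted from evolved solutions; this is generic k-means, not JPL's actual analysis, and the data and parameters are invented for illustration:

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Group behaviour vectors (tuples) into k clusters; returns (centers, groups)."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)                      # random initial centers
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            # assign each point to its nearest center (squared Euclidean distance)
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            groups[i].append(p)
        # recompute each center as the mean of its group (keep old center if empty)
        centers = [tuple(sum(xs) / len(g) for xs in zip(*g)) if g else centers[gi]
                   for gi, g in enumerate(groups)]
    return centers, groups
```

Small, isolated groups in the output are the candidates for divergent (possibly emergent) behaviours worth manual inspection.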
Exploring Evolutionary Patterns in Genetic Sequence: A Computer Exercise
ERIC Educational Resources Information Center
Shumate, Alice M.; Windsor, Aaron J.
2010-01-01
The increase in publications presenting molecular evolutionary analyses and the availability of comparative sequence data through resources such as NCBI's GenBank underscore the necessity of providing undergraduates with hands-on sequence analysis skills in an evolutionary context. This need is particularly acute given that students have been…
Evaluation of Chapter I Take-Home Computer Program. Report No. 7, Vol. 25.
ERIC Educational Resources Information Center
Fraser, Lowrie A.
The Chapter I Take-Home Computer (THC) program was established in nine elementary and eight middle schools in Atlanta (Georgia) in the 1989-90 school year. One hundred and eighty computers were sent home with 422 students, whose parents were willing to work with the students, for 6-week periods. Log sheets were kept by each child regarding the…
ERIC Educational Resources Information Center
Mitchell, Eugene E., Ed.
In certain boundary layer or natural convection work, where a similarity transformation is valid, the equations can be reduced to a set of nonlinear ordinary differential equations. They are therefore well-suited to a fast solution on an analog/hybrid computer. This paper illustrates such usage of the analog/hybrid computer by a set of…
2013-11-12
Dr. Paramsothy Jayakumar (586) 282-4896 Computational Dynamics Inc. Name of Contractor: Computational Dynamics Inc. (CDI) 1809...Dr. Paramsothy Jayakumar TARDEC Computational Dynamics Inc. 1 Project Summary This project aims at addressing and remedying the serious...Shabana, A.A., Jayakumar, P., and Letherwood, M., “Soil Models and Vehicle System Dynamics,” Applied Mechanics Reviews, Vol. 65(4), 2013, doi
Spirov, Alexander; Holloway, David
2013-07-15
This paper surveys modeling approaches for studying the evolution of gene regulatory networks (GRNs). Modeling of the design or 'wiring' of GRNs has become increasingly common in developmental and medical biology, as a means of quantifying gene-gene interactions, the response to perturbations, and the overall dynamic motifs of networks. Drawing from developments in GRN 'design' modeling, a number of groups are now using simulations to study how GRNs evolve, both for comparative genomics and to uncover general principles of evolutionary processes. Such work can generally be termed evolution in silico. Complementary to these biologically-focused approaches, a now well-established field of computer science is Evolutionary Computations (ECs), in which highly efficient optimization techniques are inspired from evolutionary principles. In surveying biological simulation approaches, we discuss the considerations that must be taken with respect to: (a) the precision and completeness of the data (e.g. are the simulations for very close matches to anatomical data, or are they for more general exploration of evolutionary principles); (b) the level of detail to model (we proceed from 'coarse-grained' evolution of simple gene-gene interactions to 'fine-grained' evolution at the DNA sequence level); (c) to what degree is it important to include the genome's cellular context; and (d) the efficiency of computation. With respect to the latter, we argue that developments in computer science EC offer the means to perform more complete simulation searches, and will lead to more comprehensive biological predictions. Copyright © 2013 Elsevier Inc. All rights reserved.
Architecture and Programming Models for High Performance Intensive Computation
2016-06-29
Applications Systems and Large-Scale-Big-Data & Large-Scale-Big-Computing (DDDAS-LS). ICCS 2015, June 2015, Reykjavík, Iceland. 2. Bo YT, Wang P, Guo ZL...The Mahali project,” Communications Magazine, vol. 52, pp. 111–133, Aug 2014.
Campus-Wide Computing: Early Results Using Legion at the University of Virginia
2006-01-01
Bernard et al., “Primitives for Distributed Computing in a Heterogeneous Local Area Network Environment,” IEEE Trans. on Soft. Eng., vol. 15, no. 12...1994. [16] F. Ferstl, “CODINE Technical Overview,” Genias, April 1993. [17] R. F. Freund and D. S. Cornwell, “Superconcurrency: A form of distributed
A Cellular Automata Approach to Computer Vision and Image Processing.
1980-09-01
the ACM, vol. 15, no. 9, pp. 827-837. [Duda and Hart] R. O. Duda and P. E. Hart, Pattern Classification and Scene Analysis, Wiley, New York, 1973...Center TR-738, 1979. [Farley] Arthur M. Farley and Andrzej Proskurowski, "Gossiping in Grid Graphs", University of Oregon Computer Science Department CS-TR
2012-03-22
with performance profiles, Math. Program., 91 (2002), pp. 201–213. [6] P. DRINEAS, R. KANNAN, AND M. W. MAHONEY, Fast Monte Carlo algorithms for matrices...computing invariant subspaces of non-Hermitian matrices, Numer. Math., 25 (1975/76), pp. 123–136. [25] Matrix algorithms, Vol. II: Eigensystems
Range Image Flow using High-Order Polynomial Expansion
2013-09-01
included as a default algorithm in the OpenCV library [2]. The research of estimating the motion between range images, or range flow, is much more...Journal of Computer Vision, vol. 92, no. 1, pp. 1‒31. 2. G. Bradski and A. Kaehler. 2008. Learning OpenCV: Computer Vision with the OpenCV Library
An Investigation of Memory Latency Reduction Using an Address Prediction Buffer
1992-12-01
McGraw-Hill Inc., London, England, 1991. [GAJSKI87] Gajski, D.D. et al., Computer Architecture, IEEE Computer Society Press, Washington, D.C., 1987...California, (vol. 19, no. 3), 1991. [NOWICK92] Nowicki, G., "Design and Implementation of a Read Prediction Buffer", Master’s Thesis, Naval Postgraduate School
Inference of Evolutionary Jumps in Large Phylogenies using Lévy Processes
Duchen, Pablo; Leuenberger, Christoph; Szilágyi, Sándor M.; Harmon, Luke; Eastman, Jonathan; Schweizer, Manuel
2017-01-01
Although it is now widely accepted that the rate of phenotypic evolution may not necessarily be constant across large phylogenies, the frequency and phylogenetic position of periods of rapid evolution remain unclear. In his highly influential view of evolution, G. G. Simpson supposed that such evolutionary jumps occur when organisms transition into so-called new adaptive zones, for instance after dispersal into a new geographic area, after rapid climatic changes, or following the appearance of an evolutionary novelty. Only recently, large, accurate and well calibrated phylogenies have become available that allow testing this hypothesis directly, yet inferring evolutionary jumps remains computationally very challenging. Here, we develop a computationally highly efficient algorithm to accurately infer the rate and strength of evolutionary jumps as well as their phylogenetic location. Following previous work we model evolutionary jumps as a compound process, but introduce a novel approach to sample jump configurations that does not require matrix inversions and thus naturally scales to large trees. We then make use of this development to infer evolutionary jumps in Anolis lizards and Loriinii parrots where we find strong signal for such jumps at the basis of clades that transitioned into new adaptive zones, just as postulated by Simpson’s hypothesis. [evolutionary jump; Lévy process; phenotypic evolution; punctuated equilibrium; quantitative traits.] PMID:28204787
Open Reading Frame Phylogenetic Analysis on the Cloud
2013-01-01
Phylogenetic analysis has become essential in researching the evolutionary relationships between viruses. These relationships are depicted on phylogenetic trees, in which viruses are grouped based on sequence similarity. Viral evolutionary relationships are identified from open reading frames rather than from complete sequences. Recently, cloud computing has become popular for developing internet-based bioinformatics tools. Biocloud is an efficient, scalable, and robust bioinformatics computing service. In this paper, we propose a cloud-based open reading frame phylogenetic analysis service. The proposed service integrates the Hadoop framework, virtualization technology, and phylogenetic analysis methods to provide a high-availability, large-scale bioservice. In a case study, we analyze the phylogenetic relationships among Norovirus. Evolutionary relationships are elucidated by aligning different open reading frame sequences. The proposed platform correctly identifies the evolutionary relationships between members of Norovirus. PMID:23671843
NASA Astrophysics Data System (ADS)
Gen, Mitsuo; Kawakami, Hiroshi; Tsujimura, Yasuhiro; Handa, Hisashi; Lin, Lin; Okamoto, Azuma
As efficient utilization of computational resources increases, evolutionary technology based on the Genetic Algorithm (GA), Genetic Programming (GP), Evolution Strategy (ES), and other Evolutionary Computations (ECs) is making rapid progress, and both its social recognition and the need for it as an applied technology are growing. This is because EC offers greater robustness than approaches based on conventional theories for knowledge-information processing systems, intelligent production and logistics systems, advanced production scheduling, and various other real-world problems, and because EC remains flexibly applicable and useful in unknown system environments, even where accurate mathematical modeling fails. In this paper, we provide a comprehensive survey of the current state of the art in the fundamentals and applications of evolutionary technologies.
Computationally mapping sequence space to understand evolutionary protein engineering.
Armstrong, Kathryn A; Tidor, Bruce
2008-01-01
Evolutionary protein engineering has been dramatically successful, producing a wide variety of new proteins with altered stability, binding affinity, and enzymatic activity. However, the success of such procedures is often unreliable, and the impact of the choice of protein, engineering goal, and evolutionary procedure is not well understood. We have created a framework for understanding aspects of the protein engineering process by computationally mapping regions of feasible sequence space for three small proteins using structure-based design protocols. We then tested the ability of different evolutionary search strategies to explore these sequence spaces. The results point to a non-intuitive relationship between the error-prone PCR mutation rate and the number of rounds of replication. The evolutionary relationships among feasible sequences reveal hub-like sequences that serve as particularly fruitful starting sequences for evolutionary search. Moreover, genetic recombination procedures were examined, and tradeoffs relating sequence diversity and search efficiency were identified. This framework allows us to consider the impact of protein structure on the allowed sequence space and therefore on the challenges that each protein presents to error-prone PCR and genetic recombination procedures.
Optimizing Engineering Tools Using Modern Ground Architectures
2017-12-01
Considerations,” International Journal of Computer Science & Engineering Survey, vol. 5, no. 4, 2014. [10] R. Bell. (n.d.). A beginner’s guide to big O notation...scientific community. Traditional computing architectures were not capable of processing the data efficiently, or in some cases, could not process the...thesis investigates how these modern computing architectures could be leveraged by industry and academia to improve the performance and capabilities of
ERIC Educational Resources Information Center
Marcovitz, Alan B., Ed.
Presented are two papers on computer applications in engineering education coursework. The first paper suggests that since most engineering graduates use only "canned programs" and rarely write their own programs, educational emphasis should include model building and the use of existing software as well as program writing. The second paper deals…
Performance Benchmark for a Prismatic Flow Solver
2007-03-26
Gauss-Seidel (LU-SGS) implicit method is used for time integration to reduce the computational time. A one-equation turbulence model by Goldberg and...numerical flux computations. The Lower-Upper Symmetric Gauss-Seidel (LU-SGS) implicit method [1] is used for time integration to reduce the...Sharov, D. and Nakahashi, K., “Reordering of Hybrid Unstructured Grids for Lower-Upper Symmetric Gauss-Seidel Computations,” AIAA Journal, Vol. 36
2007-03-01
Chains," Mathematics of Control, Signals, and Systems, vol. 3(1), pp. 1-29, 1990. [4] A . Arnold, J . A . Carrillo, and I. Gamba, "Low and High Field...Aronson, C. L. A ., and J . L. Vázquez, "Interfaces with a corner point in one- dimensional porous medium flow," Comm. Pure Appl. Math, vol. 38(4), pp. 375...K. Levin, "Damage analysis of fiber composites," Computer Methods in Applied Mechanics and Engineering. [10] K. S. Barber, A . Goel, T. J . Graser, T
Prospective estimation of organ dose in CT under tube current modulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tian, Xiaoyu, E-mail: xt3@duke.edu; Li, Xiang; Segars, W. Paul
Purpose: Computed tomography (CT) has been widely used worldwide as a tool for medical diagnosis and imaging. However, despite its significant clinical benefits, CT radiation dose at the population level has become a subject of public attention and concern. In this light, optimizing radiation dose has become a core responsibility for the CT community. As a fundamental step to manage and optimize dose, it may be beneficial to have accurate and prospective knowledge about the radiation dose for an individual patient. In this study, the authors developed a framework to prospectively estimate organ dose for chest and abdominopelvic CT exams under tube current modulation (TCM). Methods: The organ dose is mainly dependent on two key factors: patient anatomy and irradiation field. A prediction process was developed to accurately model both factors. To model the anatomical diversity and complexity in the patient population, the authors used a previously developed library of computational phantoms with broad distributions of sizes, ages, and genders. A selected clinical patient, represented by a computational phantom in the study, was optimally matched with another computational phantom in the library to obtain a representation of the patient’s anatomy. To model the irradiation field, a previously validated Monte Carlo program was used to model CT scanner systems. The tube current profiles were modeled using a ray-tracing program as previously reported that theoretically emulated the variability of modulation profiles from major CT machine manufacturers [Li et al., Phys. Med. Biol. 59, 4525–4548 (2014)].
The prediction of organ dose was achieved using the following process: (1) CTDI{sub vol}-normalized organ dose coefficients (h{sub organ}) for fixed tube current were first estimated as the prediction basis for the computational phantoms; (2) each computational phantom, regarded as a clinical patient, was optimally matched with one computational phantom in the library; (3) to account for the effect of the TCM scheme, a weighted organ-specific CTDI{sub vol} [denoted as (CTDI{sub vol}){sub organ,weighted}] was computed for each organ based on the TCM profile and the anatomy of the “matched” phantom; (4) the organ dose was predicted by multiplying the weighted organ-specific CTDI{sub vol} with the organ dose coefficients (h{sub organ}). To quantify the prediction accuracy, each predicted organ dose was compared with the corresponding organ dose simulated from the Monte Carlo program with the TCM profile explicitly modeled. Results: The predicted organ dose showed good agreement with the simulated organ dose across all organs and modulation profiles. The average percentage error in organ dose estimation was generally within 20% across all organs and modulation profiles, except for organs located in the pelvic and shoulder regions. For an average CTDI{sub vol} of a CT exam of 10 mGy, the average error at full modulation strength (α = 1) across all organs was 0.91 mGy for chest exams, and 0.82 mGy for abdominopelvic exams. Conclusions: This study developed a quantitative model to predict organ dose for clinical chest and abdominopelvic scans. Such information may aid in the design of optimized CT protocols in relation to a targeted level of image quality.
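Steps (3) and (4) of the prediction process amount to: dose = (CTDI{sub vol}){sub organ,weighted} × h{sub organ}. A minimal numerical sketch follows, assuming a simple mean-tube-current weighting over the slices that cover the organ; the function names, the per-mA normalization, and the weighting scheme are our illustrative assumptions, not the authors' code:

```python
def organ_dose(tcm_profile, organ_slices, ctdi_vol_per_mA, h_organ):
    """Predict organ dose under tube current modulation (illustrative sketch).

    tcm_profile      : tube current (mA) for each slice position
    organ_slices     : indices of the slices that irradiate the organ
    ctdi_vol_per_mA  : scanner CTDIvol per unit tube current (mGy/mA)
    h_organ          : CTDIvol-normalized organ dose coefficient
    """
    # step (3): organ-weighted mean tube current over the organ's extent
    mean_mA = sum(tcm_profile[i] for i in organ_slices) / len(organ_slices)
    ctdi_vol_weighted = mean_mA * ctdi_vol_per_mA   # (CTDIvol)_organ,weighted
    # step (4): scale by the fixed-current organ dose coefficient
    return h_organ * ctdi_vol_weighted              # predicted organ dose, mGy
```

For a TCM profile of [100, 200, 300, 200] mA with the organ spanning slices 1-2, 0.05 mGy/mA, and h_organ = 1.2, this yields 15.0 mGy.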
ERIC Educational Resources Information Center
Marcovitz, Alan B., Ed.
A computer program for numeric and symbolic manipulation and the methodology underlying its development are presented. Some features of the program are: an option for implied multiplication; computation of higher-order derivatives; differentiation of 26 different trigonometric, hyperbolic, inverse trigonometric, and inverse hyperbolic functions;…
Three-dimensional analysis of third molar development to estimate age of majority.
Márquez-Ruiz, Ana Belén; Treviño-Tijerina, María Concepción; González-Herrera, Lucas; Sánchez, Belén; González-Ramírez, Amanda Rocío; Valenzuela, Aurora
2017-09-01
Third molars are one of the few biological markers available for age estimation in undocumented juveniles close to the legal age of majority, assuming an age of 18 years as the most frequent legal demarcation between child and adult status. To obtain more accurate visualization and evaluation of third molar mineralization patterns from computed tomography images, a new software application, DentaVol©, was developed. Third molar mineralization according to qualitative (Demirjian's maturational stage) and quantitative parameters (third molar volume) of dental development was assessed in multi-slice helical computed tomography images of both maxillary arches displayed by DentaVol© from 135 individuals (62 females and 73 males) aged between 14 and 23 years. Intra- and inter-observer agreement values were remarkably high for both evaluation procedures and for all third molars. A linear correlation between third molar mineralization and chronological age was found, with third molar maturity occurring earlier in males than in females. Assessment of dental development with both procedures, by using DentaVol© software, can be considered a good indicator of age of majority (18 years or older) in all third molars. Our results indicated that virtual computed tomography imaging can be considered a valid alternative to orthopantomography for evaluations of third molar mineralization, and therefore a complementary tool for determining the age of majority. Copyright © 2017 The Chartered Society of Forensic Sciences. Published by Elsevier B.V. All rights reserved.
Stochastic Evolutionary Algorithms for Planning Robot Paths
NASA Technical Reports Server (NTRS)
Fink, Wolfgang; Aghazarian, Hrand; Huntsberger, Terrance; Terrile, Richard
2006-01-01
A computer program implements stochastic evolutionary algorithms for planning and optimizing collision-free paths for robots and their jointed limbs. Stochastic evolutionary algorithms can be made to produce acceptably close approximations to exact, optimal solutions for path-planning problems while often demanding much less computation than do exhaustive-search and deterministic inverse-kinematics algorithms that have been used previously for this purpose. Hence, the present software is better suited for application aboard robots having limited computing capabilities (see figure). The stochastic aspect lies in the use of simulated annealing to (1) prevent trapping of an optimization algorithm in local minima of an energy-like error measure by which the fitness of a trial solution is evaluated while (2) ensuring that the entire multidimensional configuration and parameter space of the path-planning problem is sampled efficiently with respect to both robot joint angles and computation time. Simulated annealing is an established technique for avoiding local minima in multidimensional optimization problems, but has not, until now, been applied to planning collision-free robot paths by use of low-power computers.
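The simulated-annealing core described above can be sketched generically: a trial configuration is perturbed, and worse configurations are accepted with probability exp(-Δ/T) so the search can escape local minima of the energy-like error measure. The energy function below is a toy stand-in for the path-fitness measure, and the cooling schedule and step size are illustrative assumptions:

```python
import math
import random

def energy(x):
    # toy error measure with two equal minima at x = -1 and x = +1
    return (x * x - 1.0) ** 2

def simulated_annealing(steps=5000, seed=0):
    rng = random.Random(seed)
    x = rng.uniform(-3, 3)
    e = energy(x)
    for k in range(steps):
        T = 1.0 * (0.999 ** k)                 # geometric cooling schedule
        x_new = x + rng.gauss(0, 0.1)          # perturb the current configuration
        e_new = energy(x_new)
        # Metropolis criterion: always accept improvements,
        # accept worse moves with probability exp(-delta / T)
        if e_new <= e or rng.random() < math.exp(-(e_new - e) / max(T, 1e-12)):
            x, e = x_new, e_new
    return x
```

At high temperature the chain explores broadly; as T falls it settles into one of the minima, which is the behaviour the abstract relies on to sample the configuration space efficiently.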
Biomimetic design processes in architecture: morphogenetic and evolutionary computational design.
Menges, Achim
2012-03-01
Design computation has profound impact on architectural design methods. This paper explains how computational design enables the development of biomimetic design processes specific to architecture, and how they need to be significantly different from established biomimetic processes in engineering disciplines. The paper first explains the fundamental difference between computer-aided and computational design in architecture, as the understanding of this distinction is of critical importance for the research presented. Thereafter, the conceptual relation and possible transfer of principles from natural morphogenesis to design computation are introduced and the related developments of generative, feature-based, constraint-based, process-based and feedback-based computational design methods are presented. This morphogenetic design research is then related to exploratory evolutionary computation, followed by the presentation of two case studies focusing on the exemplary development of spatial envelope morphologies and urban block morphologies.
Inference of Evolutionary Jumps in Large Phylogenies using Lévy Processes.
Duchen, Pablo; Leuenberger, Christoph; Szilágyi, Sándor M; Harmon, Luke; Eastman, Jonathan; Schweizer, Manuel; Wegmann, Daniel
2017-11-01
Although it is now widely accepted that the rate of phenotypic evolution may not necessarily be constant across large phylogenies, the frequency and phylogenetic position of periods of rapid evolution remain unclear. In his highly influential view of evolution, G. G. Simpson supposed that such evolutionary jumps occur when organisms transition into so-called new adaptive zones, for instance after dispersal into a new geographic area, after rapid climatic changes, or following the appearance of an evolutionary novelty. Only recently have large, accurate, and well-calibrated phylogenies become available that allow this hypothesis to be tested directly, yet inferring evolutionary jumps remains computationally very challenging. Here, we develop a computationally highly efficient algorithm to accurately infer the rate and strength of evolutionary jumps as well as their phylogenetic location. Following previous work, we model evolutionary jumps as a compound process, but introduce a novel approach to sample jump configurations that does not require matrix inversions and thus naturally scales to large trees. We then make use of this development to infer evolutionary jumps in Anolis lizards and Loriini parrots, where we find strong signal for such jumps at the base of clades that transitioned into new adaptive zones, just as postulated by Simpson's hypothesis. [evolutionary jump; Lévy process; phenotypic evolution; punctuated equilibrium; quantitative traits]
Big cat phylogenies, consensus trees, and computational thinking.
Sul, Seung-Jin; Williams, Tiffani L
2011-07-01
Phylogenetics seeks to deduce the pattern of relatedness between organisms by using a phylogeny or evolutionary tree. For a given set of organisms or taxa, there may be many evolutionary trees depicting how these organisms evolved from a common ancestor. As a result, consensus trees are a popular approach for summarizing the shared evolutionary relationships in a group of trees. We examine these consensus techniques by studying how the pantherine lineage of cats (clouded leopard, jaguar, leopard, lion, snow leopard, and tiger) evolved, which is hotly debated. While there are many phylogenetic resources that describe consensus trees, there is very little information, written for biologists, regarding the underlying computational techniques for building them. The pantherine cats provide us with a small, relevant example to explore the computational techniques (such as sorting numbers, hashing functions, and traversing trees) for constructing consensus trees. Our hope is that life scientists enjoy peeking under the computational hood of consensus tree construction and share their positive experiences with others in their community.
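The computational techniques this record alludes to (hashing and counting shared groupings across trees) can be sketched briefly. Assuming each candidate tree is reduced to its set of clades, with each clade a frozenset of taxa, a majority-rule consensus keeps exactly the clades found in more than half the input trees; the three small cat phylogenies below are invented for illustration, not the paper's data:

```python
from collections import Counter

def majority_rule_clades(trees):
    """trees: list of sets of clades, each clade a frozenset of taxa.
    A hash table (Counter) tallies how often each clade occurs across
    the input trees; clades present in a strict majority survive."""
    counts = Counter(clade for tree in trees for clade in tree)
    cutoff = len(trees) / 2
    return {clade for clade, n in counts.items() if n > cutoff}

# Three invented candidate phylogenies over a few pantherine cats.
t1 = {frozenset({"lion", "leopard"}), frozenset({"lion", "leopard", "jaguar"})}
t2 = {frozenset({"lion", "leopard"}), frozenset({"leopard", "jaguar"})}
t3 = {frozenset({"lion", "leopard"}), frozenset({"lion", "leopard", "jaguar"})}

consensus = majority_rule_clades([t1, t2, t3])
```

Here the lion-leopard clade (3 of 3 trees) and the lion-leopard-jaguar clade (2 of 3) make the consensus, while the leopard-jaguar grouping (1 of 3) is discarded. Real implementations hash bipartitions of the full taxon set rather than clades, but the counting logic is the same.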
MEGA5: Molecular Evolutionary Genetics Analysis using maximum likelihood, evolutionary distance, and maximum parsimony methods.
Tamura, Koichiro; Peterson, Daniel; Peterson, Nicholas; Stecher, Glen; Nei, Masatoshi; Kumar, Sudhir
2011-01-01
Comparative analysis of molecular sequence data is essential for reconstructing the evolutionary histories of species and inferring the nature and extent of selective forces shaping the evolution of genes and species. Here, we announce the release of Molecular Evolutionary Genetics Analysis version 5 (MEGA5), user-friendly software for mining online databases, building sequence alignments and phylogenetic trees, and using methods of evolutionary bioinformatics in basic biology, biomedicine, and evolution. The newest addition in MEGA5 is a collection of maximum likelihood (ML) analyses for inferring evolutionary trees, selecting best-fit substitution models (nucleotide or amino acid), inferring ancestral states and sequences (along with probabilities), and estimating evolutionary rates site-by-site. In computer simulation analyses, ML tree inference algorithms in MEGA5 compared favorably with other software packages in terms of computational efficiency and the accuracy of the estimates of phylogenetic trees, substitution parameters, and rate variation among sites. The MEGA user interface has now been enhanced to be activity driven, making it easier to use for both beginners and experienced scientists. This version of MEGA is intended for the Windows platform, and it has been configured for effective use on Mac OS X and Linux desktops. It is available free of charge from http://www.megasoftware.net. PMID:21546353
Computational Investigation of Shock-Mitigation Efficacy of Polyurea When Used in a Combat Helmet
2012-01-01
Published in Multidiscipline Modeling in Materials and Structures, Vol. 8: "Computational investigation of shock-mitigation efficacy of polyurea when used in a combat helmet: A core sample analysis."
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ma, Brigette; King, Ann; Lo, Y.M. Dennis
Purpose: Plasma Epstein-Barr virus DNA (pEBV DNA) is an important prognostic marker in nasopharyngeal carcinoma (NPC). This study tested the hypotheses that pEBV DNA reflects tumor burden and metabolic activity by evaluating its relationship with tumor volume and 18F-fluorodeoxyglucose (18F-FDG) uptake in NPC. Methods and Materials: Pre-treatment pEBV DNA analysis, 18F-FDG positron emission tomography-computed tomography (PET-CT), and magnetic resonance imaging (MRI) of the head and neck were performed in 57 patients. Net volumes (cm^3) of the primary tumor (T_vol) and regional nodes (N_vol) were quantified on MRI. 18F-FDG uptake was expressed as the maximum standardized uptake value (SUV_max) at the primary tumor (T_suv) and regional nodes (N_suv). Lesions with SUV_max >= 2.5 were considered malignant. The relationship between SUV_max, the natural logarithm (log) of pEBV DNA, and the square root (sq) of the MRI volumes was analyzed using the Wilcoxon test. A linear regression model was constructed to test for any interaction between variables and disease stage. Results: Log-pEBV DNA showed significant correlation with sq-T_vol (r = 0.393), sq-N_vol (r = 0.452), total tumor volume (sq-Total_vol = T_vol + N_vol, r = 0.554), T_suv (r = 0.276), N_suv (r = 0.434), and total SUV_max (Total_suv = T_suv + N_suv, r = 0.457). Likewise, sq-T_vol was correlated with T_suv (r = 0.426), and sq-N_vol with N_suv (r = 0.651). Regression analysis showed that only log-pEBV DNA was significantly associated with sq-Total_vol (p < 0.001; parameter estimate = 8.844; 95% confidence interval = 3.986-13.703), whereas sq-T_vol was significantly associated with T_suv (p = 0.002; parameter estimate = 3.923; 95% confidence interval = 1.498-6.348).
Conclusion: This study supports the hypothesis that cell-free plasma EBV DNA is a marker of tumor burden in EBV-related NPC.
ERIC Educational Resources Information Center
Mitchell, Eugene E., Ed.
In context of an instrumentation course, four ocean engineering students set out to design and construct a micro-computer based data acquisition system that would be compatible with the University's CYBER host computer. The project included hardware design in the area of sampling, analog-to-digital conversion and timing coordination. It also…
Artificial Intelligence: A 'User Friendly' Introduction
1985-03-01
computer systems. They are to not only "magnify" human mental abilities, but perform tasks with an unerring tirelessness while serving as "intelligent"... "...Can't See (Yet)," Abacus, Vol. I, 1983, pp. 17-26. 16. Kevin McKean, "Computers That See," Discover, September 1984, pp. 1-74. 17. Takeo Kanade and Raj
ERIC Educational Resources Information Center
Mitchell, Eugene E., Ed.
The simulation of a sampled-data system is described that uses a full parallel hybrid computer. The sampled data system simulated illustrates the proportional-integral-derivative (PID) discrete control of a continuous second-order process representing a stirred-tank. The stirred-tank is simulated using continuous analog components, while PID…
Information management system study results. Volume 2: IMS study results appendixes
NASA Technical Reports Server (NTRS)
1971-01-01
Computer systems program specifications are presented for the modular space station information management system. These are the computer program contract end item, data bus system, data bus breadboard, and display interface adapter specifications. The performance, design, tests, and qualification requirements are established for the implementation of the information management system. For Vol. 1, see N72-19972.
Regulatory RNA design through evolutionary computation and strand displacement.
Rostain, William; Landrain, Thomas E; Rodrigo, Guillermo; Jaramillo, Alfonso
2015-01-01
The discovery and study of a vast number of regulatory RNAs in all kingdoms of life over the past decades has allowed the design of new synthetic RNAs that can regulate gene expression in vivo. Riboregulators, in particular, have been used to activate or repress gene expression. However, to accelerate and scale up the design process, synthetic biologists require computer-assisted design tools, without which riboregulator engineering will remain a case-by-case design process requiring expert attention. Recently, the design of RNA circuits by evolutionary computation and adapting strand displacement techniques from nanotechnology has proven to be suited to the automated generation of DNA sequences implementing regulatory RNA systems in bacteria. Herein, we present our method to carry out such evolutionary design and how to use it to create various types of riboregulators, allowing the systematic de novo design of genetic control systems in synthetic biology.
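The authors' actual design software is not shown in this record. As a generic, hedged sketch of what "evolutionary computation" means when applied to sequence design, the toy below evolves an RNA string toward an arbitrary target motif; the target, fitness function, and parameters are placeholders for illustration only, not the authors' method:

```python
import random

def evolve_sequence(target, pop_size=60, mut_rate=0.05, generations=300, seed=1):
    """Toy evolutionary search over RNA strings. Fitness is simply the
    number of positions matching a target motif, a stand-in for a real
    sequence/structure objective such as a strand-displacement score."""
    rng = random.Random(seed)
    alphabet = "ACGU"
    fitness = lambda s: sum(a == b for a, b in zip(s, target))
    # Random initial population of candidate sequences.
    pop = ["".join(rng.choice(alphabet) for _ in target) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        if fitness(pop[0]) == len(target):
            break
        parents = pop[: pop_size // 2]   # truncation selection
        children = pop[:2]               # elitism: keep the two best unchanged
        while len(children) < pop_size:
            p, q = rng.sample(parents, 2)
            cut = rng.randrange(1, len(target))
            # One-point crossover followed by per-site point mutation.
            child = [c if rng.random() >= mut_rate else rng.choice(alphabet)
                     for c in p[:cut] + q[cut:]]
            children.append("".join(child))
        pop = children
    return max(pop, key=fitness)

best = evolve_sequence("ACGGAUGCACGU")
```

Replacing the toy fitness with a thermodynamic or structural objective is what turns a sketch like this into a practical riboregulator design loop.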
Genomicus 2018: karyotype evolutionary trees and on-the-fly synteny computing
Nguyen, Nga Thi Thuy; Vincens, Pierre
2018-01-01
Since 2010, the Genomicus web server is available online at http://genomicus.biologie.ens.fr/genomicus. This graphical browser provides access to comparative genomic analyses in four different phyla (Vertebrate, Plants, Fungi, and non vertebrate Metazoans). Users can analyse genomic information from extant species, as well as ancestral gene content and gene order for vertebrates and flowering plants, in an integrated evolutionary context. New analyses and visualization tools have recently been implemented in Genomicus Vertebrate. Karyotype structures from several genomes can now be compared along an evolutionary pathway (Multi-KaryotypeView), and synteny blocks can be computed and visualized between any two genomes (PhylDiagView). PMID:29087490
Magnetic Control of Hypersonic Flow
NASA Astrophysics Data System (ADS)
Poggie, Jonathan; Gaitonde, Datta
2000-11-01
Electromagnetic control is an appealing possibility for mitigating the thermal loads that occur in hypersonic flight, in particular for the case of atmospheric entry. There was extensive research on this problem between about 1955 and 1970 (M. F. Romig, "The Influence of Electric and Magnetic Fields on Heat Transfer to Electrically Conducting Fluids," Advances in Heat Transfer, Vol. 1, Academic Press, NY, 1964), and renewed interest has arisen due to developments in the technology of superconducting magnets and the understanding of the physics of weakly ionized, non-equilibrium plasmas. In order to examine the physics of this problem, and to evaluate the practicality of electromagnetic control in hypersonic flight, we have developed a computer code to solve the three-dimensional, non-ideal magnetogasdynamics equations. We have applied the code to the problem of magnetically decelerated hypersonic flow over a sphere, and observed a reduction, with an applied dipole field, in heat flux and skin friction near the nose of the body, as well as an increase in shock standoff distance. The computational results compare favorably with the analytical predictions of Bush (W. B. Bush, "Magnetohydrodynamic-Hypersonic Flow Past a Blunt Body," Journal of the Aero/Space Sciences, Vol. 25, No. 11, 1958; "The Stagnation-Point Boundary Layer in the Presence of an Applied Magnetic Field," Vol. 28, No. 8, 1961).
1993-12-01
where q = -1 denotes the negative charge state. The local symmetries of the Ge(I) and Ge(II) centers are C1 and C2, respectively (see also Fig. 1).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wayne F. Boyer; Gurdeep S. Hura
2005-09-01
The problem of obtaining an optimal matching and scheduling of interdependent tasks in distributed heterogeneous computing (DHC) environments is well known to be NP-hard. In a DHC system, task execution time is dependent on the machine to which a task is assigned, and task precedence constraints are represented by a directed acyclic graph. Recent research in evolutionary techniques has shown that genetic algorithms usually obtain more efficient schedules than other known algorithms. We propose a non-evolutionary random scheduling (RS) algorithm for efficient matching and scheduling of interdependent tasks in a DHC system. RS is a succession of randomized task orderings and a heuristic mapping from task order to schedule. Randomized task ordering is effectively a topological sort where the outcome may be any possible task order for which the task precedence constraints are maintained. A detailed comparison to existing evolutionary techniques (GA and PSGA) shows the proposed algorithm is less complex than evolutionary techniques, computes schedules in less time, and requires less memory and fewer tuning parameters. Simulation results show that the average schedules produced by RS are approximately as efficient as PSGA schedules for all cases studied and clearly more efficient than PSGA for certain cases. The standard formulation for the scheduling problem addressed in this paper is Rm|prec|Cmax.
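The randomized task ordering described above is "effectively a topological sort where the outcome may be any possible task order for which the task precedence constraints are maintained"; a minimal sketch, using a hypothetical four-task DAG, follows:

```python
import random

def random_topological_order(deps, rng):
    """deps maps each task to the set of tasks that must precede it
    (a directed acyclic graph). At every step one of the currently
    'ready' tasks (all prerequisites already scheduled) is chosen
    uniformly at random, so any precedence-respecting order can occur."""
    remaining = {task: set(pre) for task, pre in deps.items()}
    order = []
    while remaining:
        ready = sorted(t for t, pre in remaining.items() if not pre)
        task = rng.choice(ready)
        order.append(task)
        del remaining[task]
        for pre in remaining.values():
            pre.discard(task)   # task is now scheduled; release its dependents
    return order

# Hypothetical DAG: b and c depend on a; d depends on both b and c.
dag = {"a": set(), "b": {"a"}, "c": {"a"}, "d": {"b", "c"}}
order = random_topological_order(dag, random.Random(7))
```

Every draw yields "a" first and "d" last, with "b" and "c" in either order; in the RS algorithm each such ordering is then mapped heuristically to a machine schedule and the best schedule found is kept.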
Cognitive Coordination on the Network Centric Battlefield
2009-03-06
"...access in spoken language comprehension: Evaluating a linking hypothesis between fixations and linguistic processing," Journal of Psycholinguistic Research, Vol. 29, pp. 557-580. Trueswell, J. & Tanenhaus, M. (eds.) (2004). World-situated language use: Psycholinguistic, linguistic, and computational
NASA Technical Reports Server (NTRS)
1995-01-01
Proceedings from symposia of the Technology 2004 Conference, November 8-10, 1994, Washington, DC. Volume 2 features papers on computers and software, virtual reality simulation, environmental technology, video and imaging, medical technology and life sciences, robotics and artificial intelligence, and electronics.
Avoiding Local Optima with Interactive Evolutionary Robotics
2012-07-09
...the top of a flight of stairs selects for climbing; suspending the robot and the target object above the ground and creating rungs between the two will... The main bottleneck in evolutionary robotics has traditionally been the time required to evolve robot controllers. However, with the continued acceleration in computational resources, the
Optimality and stability of symmetric evolutionary games with applications in genetic selection.
Huang, Yuanyuan; Hao, Yiping; Wang, Min; Zhou, Wen; Wu, Zhijun
2015-06-01
Symmetric evolutionary games, i.e., evolutionary games with symmetric fitness matrices, have important applications in population genetics, where they can be used to model for example the selection and evolution of the genotypes of a given population. In this paper, we review the theory for obtaining optimal and stable strategies for symmetric evolutionary games, and provide some new proofs and computational methods. In particular, we review the relationship between the symmetric evolutionary game and the generalized knapsack problem, and discuss the first and second order necessary and sufficient conditions that can be derived from this relationship for testing the optimality and stability of the strategies. Some of the conditions are given in different forms from those in previous work and can be verified more efficiently. We also derive more efficient computational methods for the evaluation of the conditions than conventional approaches. We demonstrate how these conditions can be applied to justifying the strategies and their stabilities for a special class of genetic selection games including some in the study of genetic disorders.
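As a hedged illustration of the kind of optimality condition this paper reviews, the snippet below checks the standard first-order condition for a mixed strategy of a symmetric fitness matrix: the payoff to every strategy in the support must equal the mean payoff, and no strategy outside the support may do better. The 2x2 matrix is an invented example, not one of the paper's genetic-selection games:

```python
def is_equilibrium(A, x, tol=1e-9):
    """First-order condition for a symmetric evolutionary game with
    fitness matrix A and mixed strategy x: (Ax)_i must equal the mean
    payoff x'Ax for every i in the support of x, and must not exceed
    it for any i outside the support."""
    n = len(x)
    Ax = [sum(A[i][j] * x[j] for j in range(n)) for i in range(n)]
    mean = sum(x[i] * Ax[i] for i in range(n))   # mean payoff x'Ax
    for i in range(n):
        if x[i] > tol and abs(Ax[i] - mean) > tol:
            return False                          # support payoff off the mean
        if x[i] <= tol and Ax[i] > mean + tol:
            return False                          # profitable unused strategy
    return True

# Invented symmetric 2x2 fitness matrix (off-diagonal exceeds diagonal,
# i.e., heterozygote advantage in the genetic-selection reading).
A = [[1.0, 2.0],
     [2.0, 1.0]]
```

For this matrix the mixed strategy (1/2, 1/2) passes the check, while the pure strategy (1, 0) fails it, since the unused second strategy earns payoff 2 against a mean of 1.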
Evolutionary inference via the Poisson Indel Process.
Bouchard-Côté, Alexandre; Jordan, Michael I
2013-01-22
We address the problem of the joint statistical inference of phylogenetic trees and multiple sequence alignments from unaligned molecular sequences. This problem is generally formulated in terms of string-valued evolutionary processes along the branches of a phylogenetic tree. The classic evolutionary process, the TKF91 model [Thorne JL, Kishino H, Felsenstein J (1991) J Mol Evol 33(2):114-124] is a continuous-time Markov chain model composed of insertion, deletion, and substitution events. Unfortunately, this model gives rise to an intractable computational problem: The computation of the marginal likelihood under the TKF91 model is exponential in the number of taxa. In this work, we present a stochastic process, the Poisson Indel Process (PIP), in which the complexity of this computation is reduced to linear. The Poisson Indel Process is closely related to the TKF91 model, differing only in its treatment of insertions, but it has a global characterization as a Poisson process on the phylogeny. Standard results for Poisson processes allow key computations to be decoupled, which yields the favorable computational profile of inference under the PIP model. We present illustrative experiments in which Bayesian inference under the PIP model is compared with separate inference of phylogenies and alignments.
Evolutionary inference via the Poisson Indel Process
Bouchard-Côté, Alexandre; Jordan, Michael I.
2013-01-01
We address the problem of the joint statistical inference of phylogenetic trees and multiple sequence alignments from unaligned molecular sequences. This problem is generally formulated in terms of string-valued evolutionary processes along the branches of a phylogenetic tree. The classic evolutionary process, the TKF91 model [Thorne JL, Kishino H, Felsenstein J (1991) J Mol Evol 33(2):114–124] is a continuous-time Markov chain model composed of insertion, deletion, and substitution events. Unfortunately, this model gives rise to an intractable computational problem: The computation of the marginal likelihood under the TKF91 model is exponential in the number of taxa. In this work, we present a stochastic process, the Poisson Indel Process (PIP), in which the complexity of this computation is reduced to linear. The Poisson Indel Process is closely related to the TKF91 model, differing only in its treatment of insertions, but it has a global characterization as a Poisson process on the phylogeny. Standard results for Poisson processes allow key computations to be decoupled, which yields the favorable computational profile of inference under the PIP model. We present illustrative experiments in which Bayesian inference under the PIP model is compared with separate inference of phylogenies and alignments. PMID:23275296
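A hedged sketch of the PIP's global characterization, insertions falling as a Poisson process on the phylogeny with intensity proportional to branch length, might look like the following; the branch names and rate are invented, and real PIP inference also handles deletions and substitutions, which are omitted here:

```python
import random

def sample_insertions(branch_lengths, rate, rng):
    """Sample insertion events as a homogeneous Poisson process over the
    total branch length of a phylogeny; each event is then attached to a
    branch with probability proportional to that branch's length."""
    total = sum(branch_lengths.values())
    # Number of events on the whole tree: Poisson(rate * total),
    # generated by summing exponential waiting times over a unit interval.
    n, t = 0, rng.expovariate(rate * total)
    while t < 1.0:
        n += 1
        t += rng.expovariate(rate * total)
    branches = list(branch_lengths)
    weights = [branch_lengths[b] for b in branches]
    return [rng.choices(branches, weights=weights)[0] for _ in range(n)]

# Invented two-branch tree with branch lengths 2.0 and 1.0.
rng = random.Random(0)
events = sample_insertions({"root-A": 2.0, "root-B": 1.0}, rate=1.0, rng=rng)
```

Because the process is Poisson, counts on disjoint branches are independent, which is exactly the decoupling that gives the PIP its linear-time marginal-likelihood computation.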
VizieR Online Data Catalog: Comparison of evolutionary tracks (Martins+, 2013)
NASA Astrophysics Data System (ADS)
Martins, F.; Palacios, A.
2013-11-01
Tables of evolutionary models for massive stars. The files m*_stol.dat correspond to models computed with the code STAREVOL. The files m*_mesa.dat correspond to models computed with the code MESA. For each code, models with initial masses equal to 7, 9, 15, 20, 25, 40 and 60M⊙ are provided. No rotation is included. The overshooting parameter f is equal to 0.01. The metallicity is solar. (14 data files).
1987-05-01
Advances in Cryptology: Proceedings of CRYPTO 84, ed. by G. R. Blakley and D. Chaum. [Wagn84b] Wagner, Neal R., "...in Distributed Computer Systems," IEEE Trans. on Computers, Vol. C-35, No. 7, Jul. 1986, pp. 583-590. Gifford, David K., "Cryptographic Sealing for
ERIC Educational Resources Information Center
Marcovitz, Alan B., Ed.
Described is the use of an analog/hybrid computer installation to study those physical phenomena that can be described through the evaluation of an algebraic function of a complex variable. This is an alternative way to study such phenomena on an interactive graphics terminal. The typical problem used, involving complex variables, is that of…
Unified Method for Delay Analysis of Random Multiple Access Algorithms.
1985-08-01
packets in the first cell of the stack. The rules of the algorithm yield a recurrence relation for the w_i's. ... "...for computer communications," in Proc. 1970 Fall Joint Computer Conf., AFIPS Press, vol. 37, 1970, pp. 281-285. [15] N. D. Vvedenskaya and B. S.
Designing Secure Systems on Reconfigurable Hardware
2008-07-01
Nick Callegari, Valamehr, and Jeff White, Department of Electrical and Computer Engineering, University of California, Santa Barbara, Santa Barbara, CA 93106; Ryan Kastner, Department of Computer Science and Engineering, University of California, San Diego, La Jolla. Published in ACM Transactions on Design Automation of Electronic Systems (TODAES), Vol. 13, No. 3, July 2008, pp. 1-24.
ERIC Educational Resources Information Center
Jolls, Kenneth R.; And Others
A technique is described for the generation of perspective views of three-dimensional models using computer graphics. The technique is applied to models of familiar thermodynamic phase diagrams and the results are presented for the ideal gas and van der Waals equations of state as well as the properties of liquid water and steam from the Steam…
Wireless Communications in Reverberant Environments
2015-01-01
...the Secure Wireless Agent Testbed (SWAT), the Protocol Engineering Advanced Networking (PROTEAN) Research Group, the Data Fusion Laboratory (DFL), and the... constraints of their application. Bibliography: [1] V. Gungor and G. Hancke, "Industrial wireless sensor networks: Challenges, design principles, and..." ... Bhattacharya, "Path loss estimation for a wireless sensor network for application in ship," Int. J. of Comput. Sci. and Mobile Computing, vol. 2, no. 6, pp.
NexGen PVAs: Incorporating Eco-Evolutionary Processes into Population Viability Models
We examine how the integration of evolutionary and ecological processes in population dynamics – an emerging framework in ecology – could be incorporated into population viability analysis (PVA). Driven by parallel, complementary advances in population genomics and computational ...
Training in software used by practising engineers should be included in university curricula
NASA Astrophysics Data System (ADS)
Silveira, A.; Perdigones, A.; García, J. L.
2009-04-01
Ideally, an engineering education should prepare students, i.e., emerging engineers, to use problem-solving processes that synergistically combine creativity and imagination with rigour and discipline. Recently, pressures on curricula have resulted in the development of software-specific courses, often to the detriment of the understanding of theory [1]. However, it is also true that there is a demand for information technology courses by students other than computer science majors [2]. The emphasis on training engineers may be best placed on answering the needs of industry; indeed, many proposals are now being made to try to reduce the gap between the educational and industrial communities [3]. Training in the use of certain computer programs may be one way of better preparing engineering undergraduates for eventual employment in industry. However, industry's needs in this respect must first be known. The aim of this work was to determine which computer programs are used by practising agricultural engineers, with the goal of incorporating training in their use into our department's teaching curriculum. The results showed that 72% of their working hours involved the use of computer programs. The software packages most commonly used were Microsoft Office (used by 79% of respondents) and CAD (56%), as well as budgeting (27%), statistical (21%), engineering (15%) and GIS (13%) programs. As a result of this survey our university department opened an additional computer suite in order to provide students with practical experience in the use of Microsoft Excel, budgeting and engineering software. The results of this survey underline the importance of computer software training in this and perhaps other fields of engineering. [1] D. J. Moore and D. R. Voltmer, "Curriculum for an engineering renaissance," IEEE Trans. Educ., vol. 46, pp. 452-455, Nov. 2003. [2] N. Kock, R. Aiken, and C.
Sandas, "Using complex IT in specific domains: developing and assessing a course for nonmajors," IEEE Trans. Educ., vol. 45, pp. 50- 56, Feb. 2002. [3] I. Vélez, and J. F. Sevillano, "A course to train digital hardware designers for industry," IEEE Trans. Educ., vol. 50, pp. 236-243, Aug. 2007. Acknowledgement: This work was supported in part by the Universidad Politécnica de Madrid, Spain.
Observational constraints for C-rich AGB stars
NASA Astrophysics Data System (ADS)
Rau, G.; Hron, J.; Paladini, C.; Aringer, B.; Marigo, P.; Eriksson, K.
We modeled the atmospheres of six carbon-rich Asymptotic Giant Branch stars (R Lep, R Vol, Y Pav, AQ Sgr, U Hya, and X TrA) using VLTI/MIDI interferometric observations together with spectro-photometric data, and compared them with self-consistent, dynamic model atmospheres. The results show that the models reproduce the Spectral Energy Distribution (SED) data well at wavelengths longer than 1 μm, and the interferometric observations between 8 μm and 10 μm. We found differences at wavelengths shorter than 1 μm in the SED, and longer than 10 μm in the visibilities. The discrepancies observed can be explained in terms of a combination of data- and model-related reasons. We derived some stellar parameters, and our findings agree well with literature values within the uncertainties. Also, when comparing the location of the stars in the H-R diagram with evolutionary tracks, the results show that the main derived properties (L, Teff, C/O ratios, and stellar masses) from the model fitting are in good agreement with TP-AGB evolutionary calculations.
A Characterization of t/s-Diagnosability and Sequential t-Diagnosability in Designs
1990-10-01
REFERENCES: K.-Y. Chwa and S. L. Hakimi, "On fault identification in diagnosable systems," IEEE Trans. Comput., 1975, pp. 167-170. S. L. Hakimi and A. T. Amin, "Characterization of the connection assignment problem of diagnosable systems," IEEE Trans. Comput. S. Karunanithi and A. D. Friedman, "Analysis of digital systems using a new measure of system diagnosis," IEEE Trans. Comput., vol. C-
Serohijos, Adrian W.R.; Shakhnovich, Eugene I.
2014-01-01
The variation among sequences and structures in nature is both determined by physical laws and by evolutionary history. However, these two factors are traditionally investigated by disciplines with different emphasis and philosophy—molecular biophysics on one hand and evolutionary population genetics in another. Here, we review recent theoretical and computational approaches that address the critical need to integrate these two disciplines. We first articulate the elements of these integrated approaches. Then, we survey their contribution to our mechanistic understanding of molecular evolution, the polymorphisms in coding region, the distribution of fitness effects (DFE) of mutations, the observed folding stability of proteins in nature, and the distribution of protein folds in genomes. PMID:24952216
Serohijos, Adrian W R; Shakhnovich, Eugene I
2014-06-01
The variation among sequences and structures in nature is both determined by physical laws and by evolutionary history. However, these two factors are traditionally investigated by disciplines with different emphasis and philosophy: molecular biophysics on one hand and evolutionary population genetics in another. Here, we review recent theoretical and computational approaches that address the crucial need to integrate these two disciplines. We first articulate the elements of these approaches. Then, we survey their contribution to our mechanistic understanding of molecular evolution, the polymorphisms in coding region, the distribution of fitness effects (DFE) of mutations, the observed folding stability of proteins in nature, and the distribution of protein folds in genomes.
Evolutionary trends in directional hearing
Carr, Catherine E.; Christensen-Dalsgaard, Jakob
2016-01-01
Tympanic hearing is a true evolutionary novelty that arose in parallel within early tetrapods. We propose that in these tetrapods, selection for sound localization in air acted upon pre-existing directionally sensitive brainstem circuits, similar to those in fishes. Auditory circuits in birds and lizards resemble this ancestral, directionally sensitive framework. Despite this anatomical similarity, coding of sound source location differs between birds and lizards. In birds, brainstem circuits compute sound location from interaural cues. Lizards, however, have coupled ears, and do not need to compute source location in the brain. Thus, their neural processing of sound direction differs, although all show mechanisms for enhancing sound source directionality. Comparisons with mammals reveal similarly complex interactions between coding strategies and evolutionary history. PMID:27448850
Genomicus 2018: karyotype evolutionary trees and on-the-fly synteny computing.
Nguyen, Nga Thi Thuy; Vincens, Pierre; Roest Crollius, Hugues; Louis, Alexandra
2018-01-04
Since 2010, the Genomicus web server is available online at http://genomicus.biologie.ens.fr/genomicus. This graphical browser provides access to comparative genomic analyses in four different phyla (Vertebrate, Plants, Fungi, and non vertebrate Metazoans). Users can analyse genomic information from extant species, as well as ancestral gene content and gene order for vertebrates and flowering plants, in an integrated evolutionary context. New analyses and visualization tools have recently been implemented in Genomicus Vertebrate. Karyotype structures from several genomes can now be compared along an evolutionary pathway (Multi-KaryotypeView), and synteny blocks can be computed and visualized between any two genomes (PhylDiagView).
Phylogenetic tree and community structure from a Tangled Nature model.
Canko, Osman; Taşkın, Ferhat; Argın, Kamil
2015-10-07
In evolutionary biology, the taxonomy and origination of species are widely studied subjects. An estimation of the evolutionary tree can be made from available DNA sequence data. The tree is calculated by well-known and frequently used methods such as maximum likelihood and neighbor-joining. In order to examine the results of these methods, an evolutionary tree is generated computationally with a mathematical model called Tangled Nature. A relatively small genome space is investigated due to the computational burden, and it is found that the actual and predicted trees are in reasonably good agreement in terms of shape. Moreover, the speciation and the resulting community structure of the food web are investigated by modularity.
Scalable computing for evolutionary genomics.
Prins, Pjotr; Belhachemi, Dominique; Möller, Steffen; Smant, Geert
2012-01-01
Genomic data analysis in evolutionary biology is becoming so computationally intensive that analysis of multiple hypotheses and scenarios takes too long on a single desktop computer. In this chapter, we discuss techniques for scaling computations through parallelization of calculations, after giving a quick overview of advanced programming techniques. Unfortunately, parallel programming is difficult and requires special software design. The alternative, especially attractive for legacy software, is to introduce poor man's parallelization by running whole programs in parallel as separate processes, using job schedulers. Such pipelines are often deployed on bioinformatics computer clusters. Recent advances in PC virtualization have made it possible to run a full computer operating system, with all of its installed software, on top of another operating system, inside a "box," or virtual machine (VM). Such a VM can flexibly be deployed on multiple computers, in a local network, e.g., on existing desktop PCs, and even in the Cloud, to create a "virtual" computer cluster. Many bioinformatics applications in evolutionary biology can be run in parallel, running processes in one or more VMs. Here, we show how a ready-made bioinformatics VM image, named BioNode, effectively creates a computing cluster, and pipeline, in a few steps. This allows researchers to scale up computations from their desktop, using available hardware, anytime it is required. BioNode is based on Debian Linux and can run on networked PCs and in the Cloud. Over 200 bioinformatics and statistical software packages, of interest to evolutionary biology, are included, such as PAML, Muscle, MAFFT, MrBayes, and BLAST. Most of these software packages are maintained through the Debian Med project. In addition, BioNode contains convenient configuration scripts for parallelizing bioinformatics software.
Where Debian Med encourages packaging free and open source bioinformatics software through one central project, BioNode encourages creating free and open source VM images, for multiple targets, through one central project. BioNode can be deployed on Windows, OSX, Linux, and in the Cloud. Next to the downloadable BioNode images, we provide tutorials online, which empower bioinformaticians to install and run BioNode in different environments, as well as information for future initiatives, on creating and building such images.
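The "poor man's parallelization" described above, running whole programs in parallel as separate processes, can be sketched with Python's standard library; the command lines here are placeholders (plain `echo` calls), not actual BioNode jobs:

```python
import subprocess
from concurrent.futures import ThreadPoolExecutor

def run_job(cmd):
    """Launch one whole program as a separate OS process; return its exit code."""
    result = subprocess.run(cmd, capture_output=True, text=True)
    return cmd, result.returncode

def run_all(jobs, workers=4):
    """Threads only dispatch jobs; the real work runs in independent processes."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(run_job, jobs))

# Placeholder jobs standing in for, e.g., one alignment run per input file:
jobs = [["echo", f"processing sample {i}"] for i in range(4)]
for cmd, code in run_all(jobs):
    print(" ".join(cmd), "-> exit", code)
```

Because each job is a separate process, no shared-memory coordination is needed, which is exactly what makes this approach attractive for legacy software; a cluster job scheduler generalizes the same idea across machines.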
Design of Tactile Sensor Using Dynamic Wafer Technology Based on VLSI Technique
2001-10-25
Charles Noback and Robert Carola, "Human Anatomy and Physiology," third edition, 1995. [5] M.H. Raibert and John E. Tanner, "Design and Implementation of a VLSI Tactile Sensing Computer," Robotics Research, vol. 1, 1983.
2011-11-01
[Front-matter and figure-list residue from the source report: Section 6.5 Conclusions, 6.6 Acknowledgments, 6.7 References; Chapter 7, "Experimental Investigation of the Supersonic Wake of a Re-entry" vehicle; figures on the axial location of transition for the HIFiRE-1 cone at zero angle of attack and Mach 6, and on LORE computations of sting/blade effects at Mach 2.]
Mudford, Oliver C; Taylor, Sarah Ann; Martin, Neil T
2009-01-01
We reviewed all research articles in 10 recent volumes of the Journal of Applied Behavior Analysis (JABA): Vol. 28(3), 1995, through Vol. 38(2), 2005. Continuous recording was used in the majority (55%) of the 168 articles reporting data on free-operant human behaviors. Three methods for reporting interobserver agreement (exact agreement, block-by-block agreement, and time-window analysis) were employed in more than 10 of the articles that reported continuous recording. Having identified these currently popular agreement computation algorithms, we explain them to assist researchers, software writers, and other consumers of JABA articles.
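As an illustration of the first two agreement algorithms the abstract names, here is a minimal sketch under one common reading of them (per-interval response counts; the data are invented, not from the study):

```python
def exact_agreement(obs1, obs2):
    """% of intervals in which both observers recorded the identical count."""
    same = sum(a == b for a, b in zip(obs1, obs2))
    return 100.0 * same / len(obs1)

def block_by_block(obs1, obs2):
    """Mean per-interval smaller/larger count ratio (1.0 when both zero), x 100."""
    ratios = [1.0 if a == b == 0 else min(a, b) / max(a, b)
              for a, b in zip(obs1, obs2)]
    return 100.0 * sum(ratios) / len(ratios)

# Invented per-interval response counts from two independent observers:
o1, o2 = [2, 0, 1, 3], [2, 1, 1, 2]
print(exact_agreement(o1, o2))           # 50.0
print(round(block_by_block(o1, o2), 1))  # 66.7
```

Note how block-by-block gives partial credit for near-matches (2 vs. 3 scores 2/3), while exact agreement scores such intervals as zero, which is why the two statistics can diverge on the same records.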
Masterless Distributed Computing Over Mobile Devices
2012-09-01
Matrix Computations," Handbooks in OR & MS, vol. 3, pp. 247–321, 1990. [18] R. Barrett et al., Templates for the Solution of Linear Systems: Building... the truncated SVD of a matrix. The algorithm used in this thesis was developed by Halko, Martinsson, and Tropp in their journal article, Finding... "experiments," dodbuzz.com, 2011. [Online]. Available: http://www.dodbuzz.com/2011/06/06/army-begins-mobile-phone-experiments/. [Accessed: 15-Feb
ERIC Educational Resources Information Center
ERIC Clearinghouse on Reading and Communication Skills, Urbana, IL.
This collection of abstracts is part of a continuing series providing information on recent doctoral dissertations. The 16 titles deal with the following topics: (1) the response of the law to visual journalism from l839 to l978; (2) woman's image in authoritative Mormon discourse; (3) the depiction of computers and computer-related subjects in…
Kumar, Sudhir; Stecher, Glen; Peterson, Daniel; Tamura, Koichiro
2012-10-15
There is a growing need in the research community to apply the molecular evolutionary genetics analysis (MEGA) software tool for batch processing a large number of datasets and to integrate it into analysis workflows. Therefore, we now make available the computing core of the MEGA software as a stand-alone executable (MEGA-CC), along with an analysis prototyper (MEGA-Proto). MEGA-CC provides users with access to all the computational analyses available through MEGA's graphical user interface version. This includes methods for multiple sequence alignment, substitution model selection, evolutionary distance estimation, phylogeny inference, substitution rate and pattern estimation, tests of natural selection and ancestral sequence inference. Additionally, we have upgraded the source code for phylogenetic analysis using the maximum likelihood methods for parallel execution on multiple processors and cores. Here, we describe MEGA-CC and outline the steps for using MEGA-CC in tandem with MEGA-Proto for iterative and automated data analysis. http://www.megasoftware.net/.
Hybrid VLSI/QCA Architecture for Computing FFTs
NASA Technical Reports Server (NTRS)
Fijany, Amir; Toomarian, Nikzad; Modarres, Katayoon; Spotnitz, Matthew
2003-01-01
A data-processor architecture that would incorporate elements of both conventional very-large-scale integrated (VLSI) circuitry and quantum-dot cellular automata (QCA) has been proposed to enable the highly parallel and systolic computation of fast Fourier transforms (FFTs). The proposed circuit would complement the QCA-based circuits described in several prior NASA Tech Briefs articles, namely, "Implementing Permutation Matrices by Use of Quantum Dots" (NPO-20801), Vol. 25, No. 10 (October 2001), page 42; "Compact Interconnection Networks Based on Quantum Dots" (NPO-20855), Vol. 27, No. 1 (January 2003), page 32; and "Bit-Serial Adder Based on Quantum Dots" (NPO-20869), Vol. 27, No. 1 (January 2003), page 35. The cited prior articles described the limitations of VLSI circuitry and the major potential advantage afforded by QCA. To recapitulate: In a VLSI circuit, signal paths that are required not to interact with each other must not cross in the same plane. In contrast, for reasons too complex to describe in the limited space available for this article, suitably designed and operated QCA-based signal paths that are required not to interact with each other can nevertheless be allowed to cross each other in the same plane without adverse effect. In principle, this characteristic could be exploited to design compact, coplanar, simple (relative to VLSI) QCA-based networks to implement complex, advanced interconnection schemes.
On joint subtree distributions under two evolutionary models.
Wu, Taoyang; Choi, Kwok Pui
2016-04-01
In population and evolutionary biology, hypotheses about micro-evolutionary and macro-evolutionary processes are commonly tested by comparing the shape indices of empirical evolutionary trees with those predicted by neutral models. A key ingredient in this approach is the ability to compute and quantify distributions of various tree shape indices under random models of interest. As a step to meet this challenge, in this paper we investigate the joint distribution of cherries and pitchforks (that is, subtrees with two and three leaves) under two widely used null models: the Yule-Harding-Kingman (YHK) model and the proportional to distinguishable arrangements (PDA) model. Based on two novel recursive formulae, we propose a dynamic approach to numerically compute the exact joint distribution (and hence the marginal distributions) for trees of any size. We also obtained insights into the statistical properties of trees generated under these two models, including a constant correlation between the cherry and the pitchfork distributions under the YHK model, and the log-concavity and unimodality of the cherry distributions under both models. In addition, we show that there exists a unique change point for the cherry distributions between these two models. Copyright © 2015 Elsevier Inc. All rights reserved.
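The tree shapes the paper studies are easy to simulate; a minimal sketch of the YHK (Yule-Harding-Kingman) growth process and a cherry/pitchfork counter (a Monte Carlo companion to the paper's exact recursions, not the authors' method):

```python
import random

def yule_tree(n, rng):
    """Grow a Yule-Harding tree on n leaves by repeatedly splitting a
    uniformly chosen leaf; nodes are "leaf" markers or [left, right] lists."""
    root = ["leaf"]            # wrapper so the root leaf has a parent slot
    leaves = [(root, 0)]
    while len(leaves) < n:
        parent, i = leaves.pop(rng.randrange(len(leaves)))
        node = ["leaf", "leaf"]
        parent[i] = node
        leaves += [(node, 0), (node, 1)]
    return root[0]

def count_shapes(t):
    """Return (leaves, cherries, pitchforks): subtrees with 2 or 3 leaves."""
    if t == "leaf":
        return 1, 0, 0
    ln, lc, lp = count_shapes(t[0])
    rn, rc, rp = count_shapes(t[1])
    n = ln + rn
    return n, lc + rc + (n == 2), lp + rp + (n == 3)

n, cherries, pitchforks = count_shapes(yule_tree(8, random.Random(1)))
print(n, cherries, pitchforks)
```

Averaging the counts over many simulated trees gives empirical marginal and joint distributions that can be checked against the exact recursive formulae; replacing the uniform-leaf split with a uniform choice over tree shapes would give the PDA null model instead.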
Gill, Ritu R; Naidich, David P; Mitchell, Alan; Ginsberg, Michelle; Erasmus, Jeremy; Armato, Samuel G; Straus, Christopher; Katz, Sharyn; Patios, Demetrois; Richards, William G; Rusch, Valerie W
2016-08-01
Clinical tumor (T), node, and metastasis staging is based on a qualitative assessment of features defining T descriptors and has been found to be suboptimal for predicting the prognosis of patients with malignant pleural mesothelioma (MPM). Previous work suggests that volumetric computed tomography (VolCT) is prognostic and, if found practical and reproducible, could improve clinical MPM classification. Six North American institutions electronically submitted clinical, pathologic, and imaging data on patients with stages I to IV MPM to an established multicenter database and biostatistical center. Two reference radiologists blinded to clinical data independently reviewed the scans; calculated clinical T, node, and metastasis stage by standard criteria; performed semiautomated tumor volume calculations using commercially available software; and submitted the findings to the biostatistical center. Study end points included the feasibility of a multi-institutional VolCT network, concordance of independent VolCT assessments, and association of VolCT with pathological T classification. Of 164 submitted cases, 129 were evaluated by both reference radiologists. Discordant clinical staging of most cases confirmed the inadequacy of current criteria. The overall correlation between VolCT estimates was good (Spearman correlation 0.822), but some were significantly discordant. Root cause analysis of the most discordant estimates identified four common sources of variability. Despite these limitations, median tumor volume estimates were similar within subgroups of cases representing each pathological T descriptor and increased monotonically for each reference radiologist with increasing pathological T status. The good correlation between VolCT estimates obtained for most cases reviewed by two independent radiologists and qualitative association of VolCT with pathological T status combine to encourage further study. 
The identified sources of user error will inform design of a follow-up prospective trial to more formally assess interobserver variability of VolCT and its potential contribution to clinical MPM staging. Copyright © 2016 International Association for the Study of Lung Cancer. Published by Elsevier Inc. All rights reserved.
Evolutionary Computing Methods for Spectral Retrieval
NASA Technical Reports Server (NTRS)
Terrile, Richard; Fink, Wolfgang; Huntsberger, Terrance; Lee, Seungwon; Tisdale, Edwin; VonAllmen, Paul; Tinetti, Giovanna
2009-01-01
A methodology for processing spectral images to retrieve information on underlying physical, chemical, and/or biological phenomena is based on evolutionary and related computational methods implemented in software. In a typical case, the solution (the information that one seeks to retrieve) consists of parameters of a mathematical model that represents one or more of the phenomena of interest. The methodology was developed for the initial purpose of retrieving the desired information from spectral image data acquired by remote-sensing instruments aimed at planets (including the Earth). Examples of information desired in such applications include trace gas concentrations, temperature profiles, surface types, day/night fractions, cloud/aerosol fractions, seasons, and viewing angles. The methodology is also potentially useful for retrieving information on chemical and/or biological hazards in terrestrial settings. In this methodology, one utilizes an iterative process that minimizes a fitness function indicative of the degree of dissimilarity between observed and synthetic spectral and angular data. The evolutionary computing methods that lie at the heart of this process yield a population of solutions (sets of the desired parameters) within an accuracy represented by a fitness-function value specified by the user. The evolutionary computing methods (ECM) used in this methodology are Genetic Algorithms and Simulated Annealing, both of which are well-established optimization techniques and have also been described in previous NASA Tech Briefs articles. These are embedded in a conceptual framework, represented in the architecture of the implementing software, that enables automatic retrieval of spectral and angular data and analysis of the retrieved solutions for uniqueness.
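The simulated-annealing half of the ECM pair can be illustrated with a toy retrieval: fitting two parameters of an invented Gaussian "spectrum" by minimizing a sum-of-squares fitness. The model, grid, and parameters below are illustrative stand-ins; the real fitness compares observed and synthetic spectral and angular data:

```python
import math
import random

def synth(params, grid):
    """Toy synthetic 'spectrum': a unit-width Gaussian line."""
    amp, mu = params
    return [amp * math.exp(-(x - mu) ** 2) for x in grid]

def fitness(params, grid, observed):
    """Dissimilarity between synthetic and observed spectra (sum of squares)."""
    return sum((s - o) ** 2 for s, o in zip(synth(params, grid), observed))

def anneal(observed, grid, rng, steps=5000, t0=1.0):
    cur = [rng.uniform(0.0, 2.0), rng.uniform(-2.0, 2.0)]
    f_cur = fitness(cur, grid, observed)
    for k in range(steps):
        t = t0 * (1.0 - k / steps) + 1e-9          # linear cooling schedule
        cand = [cur[0] + rng.gauss(0, 0.1), cur[1] + rng.gauss(0, 0.1)]
        f_cand = fitness(cand, grid, observed)
        # Metropolis rule: always accept improvements, sometimes accept worse
        if f_cand < f_cur or rng.random() < math.exp((f_cur - f_cand) / t):
            cur, f_cur = cand, f_cand
    return cur, f_cur

grid = [i / 10 for i in range(-20, 21)]
observed = synth([1.0, 0.5], grid)                  # 'truth' to be recovered
best, residual = anneal(observed, grid, random.Random(42))
print([round(p, 2) for p in best], round(residual, 4))
```

The same fitness function could be handed to a genetic algorithm, the other ECM named above; the framework only requires that the fitness measure dissimilarity between observed and synthetic data.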
PhyloDet: a scalable visualization tool for mapping multiple traits to large evolutionary trees
Lee, Bongshin; Nachmanson, Lev; Robertson, George; Carlson, Jonathan M.; Heckerman, David
2009-01-01
Summary: Evolutionary biologists are often interested in finding correlations among biological traits across a number of species, as such correlations may lead to testable hypotheses about the underlying function. Because some species are more closely related than others, computing and visualizing these correlations must be done in the context of the evolutionary tree that relates species. In this note, we introduce PhyloDet (short for PhyloDetective), an evolutionary tree visualization tool that enables biologists to visualize multiple traits mapped to the tree. Availability: http://research.microsoft.com/cue/phylodet/ Contact: bongshin@microsoft.com. PMID:19633096
Santa Clara Computer And High Technology Law Journal; Vol. 11, No. 1
DOT National Transportation Integrated Search
1995-03-01
Since surface transportation moves people and goods along public roads and transit systems, it may seem odd to be concerned about privacy in such a highly public context. And yet, respecting privacy will be important, as advanced technologies improve hig...
Investigating Digital Optical Computing with Spatial Light Rebroadcasters
1991-10-31
3303, 1991. [66] F. Rosenblatt, Principles of Neurodynamics: Perceptrons and the Theory of Brain Mechanisms, Spartan, Washington, 1961. [67] D.E. ..., eds., Parallel Distributed Processing: Explorations in the Microstructure of Cognition, Vol. I: Foundations, MIT Press, 1986. [68] A. Ayyalusamy
Network-level architecture and the evolutionary potential of underground metabolism.
Notebaart, Richard A; Szappanos, Balázs; Kintses, Bálint; Pál, Ferenc; Györkei, Ádám; Bogos, Balázs; Lázár, Viktória; Spohn, Réka; Csörgő, Bálint; Wagner, Allon; Ruppin, Eytan; Pál, Csaba; Papp, Balázs
2014-08-12
A central unresolved issue in evolutionary biology is how metabolic innovations emerge. Low-level enzymatic side activities are frequent and can potentially be recruited for new biochemical functions. However, the role of such underground reactions in adaptation toward novel environments has remained largely unknown and out of reach of computational predictions, not least because these issues demand analyses at the level of the entire metabolic network. Here, we provide a comprehensive computational model of the underground metabolism in Escherichia coli. Most underground reactions are not isolated and 45% of them can be fully wired into the existing network and form novel pathways that produce key precursors for cell growth. This observation allowed us to conduct an integrated genome-wide in silico and experimental survey to characterize the evolutionary potential of E. coli to adapt to hundreds of nutrient conditions. We revealed that underground reactions allow growth in new environments when their activity is increased. We estimate that at least ∼20% of the underground reactions that can be connected to the existing network confer a fitness advantage under specific environments. Moreover, our results demonstrate that the genetic basis of evolutionary adaptations via underground metabolism is computationally predictable. The approach used here has potential for various application areas from bioengineering to medical genetics.
A comparison of sequential and spiral scanning techniques in brain CT.
Pace, Ivana; Zarb, Francis
2015-01-01
To evaluate and compare image quality and radiation dose of sequential computed tomography (CT) examinations of the brain and spiral CT examinations of the brain imaged on a GE HiSpeed NX/I Dual Slice 2CT scanner. A random sample of 40 patients referred for CT examination of the brain was selected and divided into 2 groups. Half of the patients were scanned using the sequential technique; the other half were scanned using the spiral technique. Radiation dose data—both the computed tomography dose index (CTDI) and the dose length product (DLP)—were recorded on a checklist at the end of each examination. Using the European Guidelines on Quality Criteria for Computed Tomography, 4 radiologists conducted a visual grading analysis and rated the level of visibility of 6 anatomical structures considered necessary to produce images of high quality. The mean CTDI(vol) and DLP values were statistically significantly higher (P <.05) with the sequential scans (CTDI(vol): 22.06 mGy; DLP: 304.60 mGy • cm) than with the spiral scans (CTDI(vol): 14.94 mGy; DLP: 229.10 mGy • cm). The mean image quality rating scores for all criteria of the sequential scanning technique were statistically significantly higher (P <.05) in the visual grading analysis than those of the spiral scanning technique. In this local study, the sequential technique was preferred over the spiral technique for both overall image quality and differentiation between gray and white matter in brain CT scans. Other similar studies counter this finding. The radiation dose seen with the sequential CT scanning technique was significantly higher than that seen with the spiral CT scanning technique. However, image quality with the sequential technique was statistically significantly superior (P <.05).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Khatonabadi, Maryam; Kim, Hyun J.; Lu, Peiyun
Purpose: In AAPM Task Group 204, the size-specific dose estimate (SSDE) was developed by providing size adjustment factors which are applied to the computed tomography (CT) standardized dose metric, CTDIvol. However, that work focused on fixed tube current scans and did not specifically address tube current modulation (TCM) scans, which currently constitute the majority of clinical scans performed. The purpose of this study was to extend the SSDE concept to account for TCM by investigating the feasibility of using anatomic and organ-specific regions of scanner output to improve the accuracy of dose estimates. Methods: Thirty-nine adult abdomen/pelvis and 32 chest scans from clinically indicated CT exams acquired on a multidetector CT using TCM were obtained with Institutional Review Board approval for generating voxelized models. Along with image data, raw projection data were obtained to extract TCM functions for use in Monte Carlo simulations. Patient size was calculated using the effective diameter described in TG 204. In addition, the scanner-reported CTDIvol (CTDIvol,global) was obtained for each patient, which is based on the average tube current across the entire scan. For the abdomen/pelvis scans, the liver, spleen, and kidneys were manually segmented from the patient datasets; for the chest scans, the lungs and, for female models only, glandular breast tissue were segmented. For each patient, organ doses were estimated using Monte Carlo methods. To investigate the utility of regional measures of scanner output, regional and organ anatomic boundaries were identified from image data and used to calculate regional and organ-specific average tube current values. From these regional and organ-specific averages, CTDIvol values, referred to as regional and organ-specific CTDIvol, were calculated for each patient.
Using an approach similar to TG 204, all CTDIvol values were used to normalize simulated organ doses, and the ability of each normalized dose to correlate with patient size was investigated. Results: For all five organs, the correlations with patient size increased when organ doses were normalized by regional and organ-specific CTDIvol values. For example, when estimating dose to the liver, CTDIvol,global yielded an R² value of 0.26, which improved to 0.77 and 0.86 when using the regional and organ-specific CTDIvol for the abdomen and liver, respectively. For breast dose, the global CTDIvol yielded an R² value of 0.08, which improved to 0.58 and 0.83 when using the regional and organ-specific CTDIvol for the chest and breasts, respectively. The R² values also increased once the thoracic models were separated into females and males for the analysis, indicating differences between genders in this region that are not explained by a simple measure of effective diameter. Conclusions: This work demonstrated the utility of regional and organ-specific CTDIvol as normalization factors when using TCM. It was demonstrated that CTDIvol,global is not an effective normalization factor in TCM exams in which attenuation (and therefore tube current) varies considerably throughout the scan, such as abdomen/pelvis and even thorax exams. These exams can be more accurately assessed for dose using regional CTDIvol descriptors that account for local variations in scanner output present when TCM is employed.
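The normalization-improves-correlation effect reported above is easy to demonstrate on synthetic data (the numbers below are invented for illustration, not the study's data): when organ dose scales with per-exam scanner output, dividing by a CTDIvol-like quantity removes that source of scatter and raises R² against patient size.

```python
import math
import random

def r_squared(x, y):
    """Coefficient of determination for a simple linear fit of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy * sxy / (sxx * syy)

rng = random.Random(0)
size = [20 + 15 * rng.random() for _ in range(40)]   # effective diameter, cm
ctdi = [5 + 20 * rng.random() for _ in size]         # per-exam scanner output, mGy
# Toy model: organ dose scales with scanner output, falls off with patient size:
dose = [c * 1.8 * math.exp(-0.04 * d) for d, c in zip(size, ctdi)]

r2_raw = r_squared(size, dose)                                   # unnormalized
r2_norm = r_squared(size, [u / c for u, c in zip(dose, ctdi)])   # normalized
print(round(r2_raw, 2), round(r2_norm, 2))
```

The regional and organ-specific CTDIvol values of the study refine this further: with TCM, scanner output varies along the scan, so the normalizer itself must be computed over the anatomy that actually irradiates the organ.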
CRITTERS! A Realistic Simulation for Teaching Evolutionary Biology
ERIC Educational Resources Information Center
Latham, Luke G., II; Scully, Erik P.
2008-01-01
Evolutionary processes can be studied in nature and in the laboratory, but time and financial constraints result in few opportunities for undergraduate and high school students to explore the agents of genetic change in populations. One alternative to time consuming and expensive teaching laboratories is the use of computer simulations. We…
Database Translator (DATALATOR) for Integrated Exploitation
2010-10-31
Modelling in Information Systems Engineering. 2007, Berlin: Springer, pp. 39-58. [2] Arnon Rosenthal, Len Seligman. Pragmatics and Open Problems for Inter... 2004, Vol. 2938. [21] Ahuja, S., N. Carriero and D. Gelernter. Linda and friends. IEEE Computer. August 1986, pp. 26-32. Next Generation Software
User Interface Design for Military AR Applications
2010-12-12
virtual objects with the real world: seeing ultrasound imagery within the patient. In: Computer Graphics (SIGGRAPH '92 Proceedings), vol. 26, pp. 203–210... airborne reconnaissance and weapon delivery. In: Proceedings of Symposium for Image Display and Recording, US Air Force Avionics Laboratory, Wright
Quantitative Robust Control Engineering: Theory and Applications
2006-09-01
[30] Gutman, P.O., Baril, C., Neuman, L. (1994), An algorithm for computing value sets of uncertain transfer functions in factored real form... linear compensation design for saturating unstable uncertain plants. Int. J. Control, Vol. 44, pp. 1137-1146. [90] Oldak, S., Baril, C. and Gutman
Computational Vision Based on Neurobiology
1993-07-09
of Personality and Social Psychology, Vol. 37, pp. 2049-2058, 1979. [71] M. Seibert and A.M. Waxman, "Learning and recognizing 3D objects from multiple... 414, 1992. [18] Petter, G. Nuove ricerche sperimentali sulla totalizzazione percettiva [New experimental research on perceptual totalization]. Rivista di Psicologia, 50: 213-227, 1956. [19] Vallortigara, G
Legendre modified moments for Euler's constant
NASA Astrophysics Data System (ADS)
Prévost, Marc
2008-10-01
Polynomial moments are often used for the computation of Gauss quadrature to stabilize the numerical calculation of the orthogonal polynomials, see [W. Gautschi, Computational aspects of orthogonal polynomials, in: P. Nevai (Ed.), Orthogonal Polynomials-Theory and Practice, NATO ASI Series, Series C: Mathematical and Physical Sciences, vol. 294. Kluwer, Dordrecht, 1990, pp. 181-216 [6]; W. Gautschi, On the sensitivity of orthogonal polynomials to perturbations in the moments, Numer. Math. 48(4) (1986) 369-382 [5]; W. Gautschi, On generating orthogonal polynomials, SIAM J. Sci. Statist. Comput. 3(3) (1982) 289-317 [4
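For context, a sketch of the standard definitions behind the cited Gautschi references (not Prévost's specific derivation): ordinary power moments of a measure are numerically ill-conditioned inputs for constructing Gauss quadrature, whereas moments taken against (shifted) Legendre polynomials behave far better.

```latex
% Ordinary vs. Legendre-modified moments of a measure $d\mu$ on $[a,b]$,
% with $\widetilde{P}_k$ the Legendre polynomials shifted to $[a,b]$:
m_k = \int_a^b x^k \, d\mu(x),
\qquad
\widetilde{m}_k = \int_a^b \widetilde{P}_k(x) \, d\mu(x), \qquad k = 0,1,2,\dots
% The modified Chebyshev algorithm maps $\{\widetilde{m}_k\}$ to the
% three-term recurrence coefficients $\alpha_k, \beta_k$ of the polynomials
% orthogonal with respect to $d\mu$, in a much better-conditioned way
% than the map from the ordinary moments $\{m_k\}$.
```

This conditioning gap is the reason modified moments are the preferred input for stable computation of the orthogonal-polynomial recurrence coefficients underlying Gauss quadrature rules.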
Visser, Marco D.; McMahon, Sean M.; Merow, Cory; Dixon, Philip M.; Record, Sydne; Jongejans, Eelke
2015-01-01
Computation has become a critical component of research in biology. A risk has emerged that computational and programming challenges may limit research scope, depth, and quality. We review various solutions to common computational efficiency problems in ecological and evolutionary research. Our review pulls together material that is currently scattered across many sources and emphasizes those techniques that are especially effective for typical ecological and environmental problems. We demonstrate how straightforward it can be to write efficient code and implement techniques such as profiling or parallel computing. We supply a newly developed R package (aprof) that helps to identify computational bottlenecks in R code and determine whether optimization can be effective. Our review is complemented by a practical set of examples and detailed Supporting Information material (S1–S3 Texts) that demonstrate large improvements in computational speed (ranging from 10.5 times to 14,000 times faster). By improving computational efficiency, biologists can feasibly solve more complex tasks, ask more ambitious questions, and include more sophisticated analyses in their research. PMID:25811842
An Adaptive Evolutionary Algorithm for Traveling Salesman Problem with Precedence Constraints
Sung, Jinmo; Jeong, Bongju
2014-01-01
The traveling salesman problem with precedence constraints is one of the most notorious problems in terms of the efficiency of its solution approach, even though it has a very wide range of industrial applications. We propose a new evolutionary algorithm to efficiently obtain good solutions by improving the search process. Our genetic operators guarantee the feasibility of solutions over the generations of the population, which significantly improves the computational efficiency even when combined with our flexible adaptive searching strategy. The efficiency of the algorithm is investigated by computational experiments. PMID:24701158
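The abstract does not give the authors' operators, but the key idea, genetic operators that can only produce feasible offspring, can be sketched generically. The construction and names below are illustrative, not the paper's:

```python
import random

def feasible(tour, prec):
    """prec maps a city to the set of cities that must precede it."""
    pos = {c: i for i, c in enumerate(tour)}
    return all(pos[p] < pos[c] for c, ps in prec.items() for p in ps)

def precedence_crossover(p1, p2, prec, rng):
    """Build a child city by city, choosing only among cities whose
    predecessors are already placed, so every child is feasible by
    construction and no repair step is needed."""
    child, placed = [], set()
    remaining = list(p1)
    while remaining:
        ready = [c for c in remaining if prec.get(c, set()) <= placed]
        order = p1 if rng.random() < 0.5 else p2  # inherit order from a parent
        nxt = min(ready, key=order.index)
        child.append(nxt)
        placed.add(nxt)
        remaining.remove(nxt)
    return child

prec = {3: {1}, 4: {2}}                  # city 1 before 3, city 2 before 4
p1, p2 = [0, 1, 2, 3, 4, 5], [2, 0, 4, 1, 3, 5]
child = precedence_crossover(p1, p2, prec, random.Random(0))
print(child, feasible(child, prec))
```

Because infeasible offspring are never generated, no fitness evaluations are wasted on invalid tours, which is the efficiency argument the abstract makes for feasibility-preserving operators.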
Cancer Evolution: Mathematical Models and Computational Inference
Beerenwinkel, Niko; Schwarz, Roland F.; Gerstung, Moritz; Markowetz, Florian
2015-01-01
Cancer is a somatic evolutionary process characterized by the accumulation of mutations, which contribute to tumor growth, clinical progression, immune escape, and drug resistance development. Evolutionary theory can be used to analyze the dynamics of tumor cell populations and to make inference about the evolutionary history of a tumor from molecular data. We review recent approaches to modeling the evolution of cancer, including population dynamics models of tumor initiation and progression, phylogenetic methods to model the evolutionary relationship between tumor subclones, and probabilistic graphical models to describe dependencies among mutations. Evolutionary modeling helps to understand how tumors arise and will also play an increasingly important prognostic role in predicting disease progression and the outcome of medical interventions, such as targeted therapy. PMID:25293804
Mean-Potential Law in Evolutionary Games
NASA Astrophysics Data System (ADS)
Nałecz-Jawecki, Paweł; Miekisz, Jacek
2018-01-01
The Letter presents a novel way to connect random walks, stochastic differential equations, and evolutionary game theory. We introduce a new concept of a potential function for discrete-space stochastic systems. It is based on a correspondence between one-dimensional stochastic differential equations and random walks, which may be exact not only in the continuous limit but also in finite-state spaces. Our method is useful for the computation of fixation probabilities in discrete stochastic dynamical systems with two absorbing states. We apply it to evolutionary games, formulating two simple and intuitive criteria for the evolutionary stability of pure Nash equilibria in finite populations. In particular, we show that the 1/3 law of evolutionary games, introduced by Nowak et al. [Nature, 2004], follows from a more general mean-potential law.
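The fixation probabilities the Letter computes via its potential method can be checked, for a simple birth-death chain, against the classical closed form; the sketch below implements that standard textbook formula, not the Letter's construction:

```python
def fixation_prob(N, t_plus, t_minus, i=1):
    """Fixation probability of i mutants in a birth-death chain on
    {0, ..., N} with absorbing states 0 and N; t_plus/t_minus give the
    transition rates T+(j), T-(j) for 0 < j < N."""
    gamma = [t_minus(j) / t_plus(j) for j in range(1, N)]
    prods = [1.0]                   # prods[k] = gamma_1 * ... * gamma_k
    for g in gamma:
        prods.append(prods[-1] * g)
    return sum(prods[:i]) / sum(prods)

# Neutral drift: a single mutant fixes with probability 1/N.
print(fixation_prob(10, lambda j: 1.0, lambda j: 1.0))  # 0.1
# Constant relative fitness r = 2 reproduces (1 - 1/r) / (1 - 1/r^N):
print(fixation_prob(5, lambda j: 2.0, lambda j: 1.0))
```

Frequency-dependent rates derived from a game's payoff matrix can be plugged in as `t_plus`/`t_minus` in the same way, which is the setting in which the 1/3 law applies.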
Open Issues in Evolutionary Robotics.
Silva, Fernando; Duarte, Miguel; Correia, Luís; Oliveira, Sancho Moura; Christensen, Anders Lyhne
2016-01-01
One of the long-term goals in evolutionary robotics is to be able to automatically synthesize controllers for real autonomous robots based only on a task specification. While a number of studies have shown the applicability of evolutionary robotics techniques for the synthesis of behavioral control, researchers have consistently been faced with a number of issues preventing the widespread adoption of evolutionary robotics for engineering purposes. In this article, we review and discuss the open issues in evolutionary robotics. First, we analyze the benefits and challenges of simulation-based evolution and subsequent deployment of controllers versus evolution on real robotic hardware. Second, we discuss specific evolutionary computation issues that have plagued evolutionary robotics: (1) the bootstrap problem, (2) deception, and (3) the role of genomic encoding and genotype-phenotype mapping in the evolution of controllers for complex tasks. Finally, we address the absence of standard research practices in the field. We also discuss promising avenues of research. Our underlying motivation is the reduction of the current gap between evolutionary robotics and mainstream robotics, and the establishment of evolutionary robotics as a canonical approach for the engineering of autonomous robots.
Toward a unifying framework for evolutionary processes.
Paixão, Tiago; Badkobeh, Golnaz; Barton, Nick; Çörüş, Doğan; Dang, Duc-Cuong; Friedrich, Tobias; Lehre, Per Kristian; Sudholt, Dirk; Sutton, Andrew M; Trubenová, Barbora
2015-10-21
The theories of population genetics and evolutionary computation have been developing separately for nearly 30 years. Many results have been obtained independently in both fields, and many others are unique to their respective field. We aim to bridge this gap by developing a unifying framework for evolutionary processes that allows both evolutionary algorithms and population genetics models to be cast in the same formal framework. The framework we present here decomposes the evolutionary process into its several components in order to facilitate the identification of similarities between different models. In particular, we propose a classification of evolutionary operators based on the defining properties of the different components. We cast several commonly used operators from both fields into this common framework. Using this, we map different evolutionary and genetic algorithms to different evolutionary regimes and identify candidates with the most potential for the translation of results between the fields. This provides a unified description of evolutionary processes and represents a stepping stone towards new tools and results for both fields. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.
Normalized Implicit Radial Models for Scattered Point Cloud Data without Normal Vectors
2009-03-23
…points by shrinking a discrete membrane, Computer Graphics Forum, Vol. 24, No. 4, 2005, pp. 791-808. [8] Floater, M. S., Reimers, M.: Meshless Parameterization and Surface Reconstruction, Computer Aided Geometric Design 18, 2001, pp. 77-92. [9] Floater, M. S.: Parameterization of Triangulations and Unorganized Points, In: Tutorials on Multiresolution in Geometric Modelling, A. Iske, E. Quak and M. S. Floater (eds.), Springer, 2002, pp. 287-316. [10…
Advances and Challenges in Super-Resolution
2004-03-15
…resolution in video. In: Proc. European Conf. on Computer Vision (ECCV), May 2002, pp. 331-336. N. Sochen, R. Kimmel, R. Malladi. 1998. A general… (2004a). Vol. 14, 47-57 (2004) …distinguish between a generic down-sampling operation (or CCD decimation by a factor r) and the sampling… factor r often depends on the number of available low-resolution frames, the computational limitations (exponential in r), and the accuracy of motion
2014-01-01
…computational and empirical dosimetric tools [31]. For the computational dosimetry, we employed finite-difference time-domain (FDTD) modeling techniques to… temperature-time data collected for a well exposed to THz radiation using finite-difference time-domain (FDTD) modeling techniques and thermocouples… Alteration in the expression of such genes underscores the signif… IEEE TRANSACTIONS ON TERAHERTZ SCIENCE AND TECHNOLOGY, VOL. 6, NO. 1
JPRS Report, Science & Technology. China.
1989-03-29
…Commun., Vol. COM-29, No. 6, pp. 895-901, June 1981. [4] R.C. Titsworth, "A Boolean-Function-Multiplexed Telemetry System," IEEE Trans. on SET, pp. 42… Reagents 39; Gene-Engineered Human Epithelium Growth Factor (hEGF) 39; Superfine Snake Venom 39; COMPUTERS: Ai Computer System LISP-MI [Zheng Shouqi, et… XUEBAO, No. 3, Jun 88] 134; Coordinated Development of Microwave, Optical Communications [Zhang Xu; DIANXIN KUAIBAO, No. 11, Nov 88] 143; Error
NASA Technical Reports Server (NTRS)
1974-01-01
A collection of blank worksheets for use on each BRAVO problem to be analyzed is supplied, for the purposes of recording the inputs for the BRAVO analysis, working out the definition of mission equipment, recording inputs to the satellite synthesis computer program, estimating satellite earth station costs, costing terrestrial systems, and performing cost-effectiveness calculations. The group of analysts working BRAVO will normally use a set of worksheets on each problem; however, the workbook pages are of sufficiently good quality that the user can duplicate them if more worksheet blanks are required than supplied. For Vol. 1, see N74-12493; for Vol. 2, see N74-14530.
Benard, Emmanuel; Michel, Christian J
2009-08-01
We present here the SEGM web server (Stochastic Evolution of Genetic Motifs) for studying the evolution of genetic motifs both in the direct evolutionary sense (past-present) and in the inverse evolutionary sense (present-past). The genetic motifs studied can be nucleotides, dinucleotides and trinucleotides. As an example of an application of SEGM, and to illustrate its functionalities, we give an analysis of inverse mutations of splice sites of human genome introns. SEGM is freely accessible at http://lsiit-bioinfo.u-strasbg.fr:8080/webMathematica/SEGM/SEGM.html directly or via the web site http://dpt-info.u-strasbg.fr/~michel/. To our knowledge, the SEGM web server is to date the only computational biology software taking this evolutionary approach.
Biology Needs Evolutionary Software Tools: Let’s Build Them Right
Team, Galaxy; Goecks, Jeremy; Taylor, James
2018-01-01
Abstract: Research in population genetics and evolutionary biology has always provided a computational backbone for the life sciences as a whole. Today, evolutionary and population biology reasoning is essential for the interpretation of the large, complex datasets that are characteristic of all domains of today's life sciences, ranging from cancer biology to microbial ecology. This situation makes the algorithms and software tools developed by our community more important than ever before, and it means that we, the developers of software tools for molecular evolutionary analyses, have a shared responsibility to make these tools accessible using modern technological developments, as well as to provide adequate documentation and training. PMID:29688462
TESOL Newsletter, Vol. 18, 1984.
ERIC Educational Resources Information Center
TESOL Newsletter, 1984
1984-01-01
The 1984 volume of the Teachers of English to Speakers of Other Languages (TESOL) newsletter includes articles on language competence and cultural awareness in the United States; interest in English in Peru; employment trends; the case method in adult English as a second language (ESL); evaluating computer assisted instruction; the…
Problems in Decentralized Decision making and Computation.
1984-12-01
…synthesis being referred to. Findeisen [1982] clarifies this distinction by talking about the "programming" and "execution" phases.) 5. The lower and higher… Findeisen, W. (1982), "Decentralized and Hierarchical Control Under Consistency or Disagreement of Interest," Automatica, Vol. 18
DOT National Transportation Integrated Search
1981-10-01
This report presents an updated description of a vehicle simulation program, VEHSIM, which can determine the fuel economy and performance of a specified vehicle over a defined route as it executes a given driving schedule. Vehicle input accommodated ...
Computational Vision Based on Neurobiology
1994-08-10
…Journal of Personality and Social Psychology, Vol. 37, pp. 2049-2058, 1979. 71. M. Seibert and A.M. Waxman, "Learning and recognizing 3D objects from… coherence. Nature, 358: 412-414, 1992. 18. Petter, G. Nuove ricerche sperimentali sulla totalizzazione percettiva. Rivista di psicologia, 50: 213-227
Evolving Better Cars: Teaching Evolution by Natural Selection with a Digital Inquiry Activity
ERIC Educational Resources Information Center
Royer, Anne M.; Schultheis, Elizabeth H.
2014-01-01
Evolutionary experiments are usually difficult to perform in the classroom because of the large sizes and long timescales of experiments testing evolutionary hypotheses. Computer applications give students a window to observe evolution in action, allowing them to gain comfort with the process of natural selection and facilitating inquiry…
Memetic Algorithms, Domain Knowledge, and Financial Investing
ERIC Educational Resources Information Center
Du, Jie
2012-01-01
While the question of how to use human knowledge to guide evolutionary search is long-recognized, much remains to be done to answer this question adequately. This dissertation aims to further answer this question by exploring the role of domain knowledge in evolutionary computation as applied to real-world, complex problems, such as financial…
Bipartite graphs as models of population structures in evolutionary multiplayer games.
Peña, Jorge; Rochat, Yannick
2012-01-01
By combining evolutionary game theory and graph theory, "games on graphs" study the evolutionary dynamics of frequency-dependent selection in population structures modeled as geographical or social networks. Networks are usually represented by means of unipartite graphs, and social interactions by two-person games such as the famous prisoner's dilemma. Unipartite graphs have also been used for modeling interactions going beyond pairwise interactions. In this paper, we argue that bipartite graphs are a better alternative to unipartite graphs for describing population structures in evolutionary multiplayer games. To illustrate this point, we make use of bipartite graphs to investigate, by means of computer simulations, the evolution of cooperation under the conventional and the distributed N-person prisoner's dilemma. We show that several implicit assumptions arising from the standard approach based on unipartite graphs (such as the definition of replacement neighborhoods, the intertwining of individual and group diversity, and the large overlap of interaction neighborhoods) can have a large impact on the resulting evolutionary dynamics. Our work provides a clear example of the importance of construction procedures in games on graphs, of the suitability of bigraphs and hypergraphs for computational modeling, and of the importance of concepts from social network analysis such as centrality, centralization and bipartite clustering for the understanding of dynamical processes occurring on networked population structures.
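The bipartite representation the authors argue for can be sketched with a toy public-goods (N-person prisoner's dilemma) payoff computation; the group structure, payoff constants, and function names below are illustrative, not the paper's code:

```python
def group_payoffs(members, strategies, b=5.0, c=1.0):
    """Public-goods payoff in one group: cooperators pay cost c,
    the benefit b scaled by the cooperator fraction is shared by all."""
    n = len(members)
    n_coop = sum(strategies[p] for p in members)
    share = b * n_coop / n
    return {p: share - (c if strategies[p] else 0.0) for p in members}

def total_payoffs(groups, strategies):
    """Bipartite view: players on one side, groups on the other.
    A player's payoff accumulates over every group it is linked to."""
    totals = {p: 0.0 for p in strategies}
    for members in groups.values():
        for p, pay in group_payoffs(members, strategies).items():
            totals[p] += pay
    return totals

if __name__ == "__main__":
    # Groups g1 and g2 share player 1, so interaction neighborhoods overlap
    # only partially -- exactly the structure a unipartite projection obscures.
    groups = {"g1": [0, 1, 2], "g2": [1, 3, 4]}
    strategies = {0: 1, 1: 1, 2: 0, 3: 0, 4: 0}  # 1 = cooperate, 0 = defect
    print(total_payoffs(groups, strategies))
```

Keeping groups as first-class nodes makes quantities such as group size, membership overlap, and bipartite clustering directly available, rather than implicit in a projected unipartite graph.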
Resistance and relatedness on an evolutionary graph
Maciejewski, Wes
2012-01-01
When investigating evolution in structured populations, it is often convenient to consider the population as an evolutionary graph—individuals as nodes, and whom they may act with as edges. There has, in recent years, been a surge of interest in evolutionary graphs, especially in the study of the evolution of social behaviours. An inclusive fitness framework is best suited for this type of study. A central requirement for an inclusive fitness analysis is an expression for the genetic similarity between individuals residing on the graph. This has been a major hindrance for work in this area as highly technical mathematics are often required. Here, I derive a result that links genetic relatedness between haploid individuals on an evolutionary graph to the resistance between vertices on a corresponding electrical network. An example that demonstrates the potential computational advantage of this result over contemporary approaches is provided. This result offers more, however, to the study of population genetics than strictly computationally efficient methods. By establishing a link between gene transfer and electric circuit theory, conceptualizations of the latter can enhance understanding of the former. PMID:21849384
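The electrical side of the correspondence described above can be computed directly: treating each edge of the graph as a unit resistor, the effective resistance between vertices follows from the Moore-Penrose pseudoinverse of the graph Laplacian, R(u, v) = L+[u,u] - 2 L+[u,v] + L+[v,v]. A minimal numerical sketch (an assumed setup, not the paper's derivation):

```python
import numpy as np

def effective_resistance(adj):
    """All pairwise effective resistances of a graph given as a
    0/1 adjacency matrix, with every edge a unit resistor."""
    adj = np.asarray(adj, dtype=float)
    lap = np.diag(adj.sum(axis=1)) - adj   # graph Laplacian L = D - A
    lp = np.linalg.pinv(lap)               # Moore-Penrose pseudoinverse
    d = np.diag(lp)
    return d[:, None] + d[None, :] - 2 * lp

if __name__ == "__main__":
    triangle = [[0, 1, 1], [1, 0, 1], [1, 1, 0]]
    # Adjacent vertices of a triangle: a 1-ohm path in parallel with a
    # 2-ohm path, so the effective resistance is 2/3.
    print(effective_resistance(triangle)[0, 1])
```

The computational appeal is that one pseudoinverse yields all pairwise resistances at once, rather than solving a separate recursion for each pair of individuals.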
NASA Astrophysics Data System (ADS)
Fischer, Peter; Schuegraf, Philipp; Merkle, Nina; Storch, Tobias
2018-04-01
This paper presents a hybrid evolutionary algorithm for fast intensity-based matching between satellite imagery from SAR and very high-resolution (VHR) optical sensor systems. The precise and accurate co-registration of image time series and images of different sensors is a key task in multi-sensor image processing scenarios. The necessary preprocessing step of image matching and tie-point detection is divided into a search problem and a similarity measurement. Within this paper we evaluate the use of an evolutionary search strategy for establishing the spatial correspondence between satellite imagery of optical and radar sensors. The aim of the proposed algorithm is to decrease the computational costs during the search process by formulating the search as an optimization problem. Based upon the canonical evolutionary algorithm, the proposed algorithm is adapted for intensity-based matching of SAR/optical imagery. Extensions are drawn from techniques such as hybridization (e.g. local search) and others to lower the number of objective function calls and refine the result. The algorithm significantly decreases the computational costs while finding the optimal solution in a reliable way.
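The "search as optimization" formulation can be illustrated with a toy (mu + lambda) evolutionary strategy over integer (dy, dx) template offsets, scored by normalized cross-correlation; this is an illustrative sketch of the search mechanics, not the authors' implementation:

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation between two equal-shape patches."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float((a * b).sum() / denom) if denom else 0.0

def es_match(image, template, pop=20, gens=40, seed=0):
    """(mu + lambda) evolutionary search for the template offset that
    maximizes NCC, instead of exhaustively scoring every position."""
    rng = np.random.default_rng(seed)
    h, w = template.shape
    H, W = image.shape
    def fit(p):
        y, x = p
        return ncc(image[y:y + h, x:x + w], template)
    parents = [(int(rng.integers(0, H - h + 1)), int(rng.integers(0, W - w + 1)))
               for _ in range(pop)]
    for _ in range(gens):
        children = []
        for y, x in parents:                      # mutate each parent
            ny = int(np.clip(y + rng.integers(-3, 4), 0, H - h))
            nx = int(np.clip(x + rng.integers(-3, 4), 0, W - w))
            children.append((ny, nx))
        pool = parents + children                 # elitist (mu + lambda) survival
        pool.sort(key=fit, reverse=True)
        parents = pool[:pop]
    return parents[0]

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    image = rng.random((64, 64))
    template = image[20:30, 33:43]   # ground-truth offset (20, 33)
    print(es_match(image, template))
```

On pure noise the similarity surface is flat away from the true offset, so this toy mainly shows the mechanics and cost accounting (objective calls versus exhaustive search); the paper's hybrid adds local search and operates on real image intensity surfaces, where the similarity measure varies smoothly.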
An evolutionary firefly algorithm for the estimation of nonlinear biological model parameters.
Abdullah, Afnizanfaizal; Deris, Safaai; Anwar, Sohail; Arjunan, Satya N V
2013-01-01
The development of accurate computational models of biological processes is fundamental to computational systems biology. These models are usually represented by mathematical expressions that rely heavily on the system parameters. The measurement of these parameters is often difficult. Therefore, they are commonly estimated by fitting the predicted model to the experimental data using optimization methods. The complexity and nonlinearity of the biological processes pose a significant challenge, however, to the development of accurate and fast optimization methods. We introduce a new hybrid optimization method incorporating the Firefly Algorithm and the evolutionary operation of the Differential Evolution method. The proposed method improves solutions by neighbourhood search using evolutionary procedures. Testing our method on models for the arginine catabolism and the negative feedback loop of the p53 signalling pathway, we found that it estimated the parameters with high accuracy and within a reasonable computation time compared to well-known approaches, including Particle Swarm Optimization, Nelder-Mead, and Firefly Algorithm. We have also verified the reliability of the parameters estimated by the method using an a posteriori practical identifiability test.
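The baseline component of the method above can be sketched as a compact Firefly Algorithm applied to least-squares parameter estimation for a toy exponential-decay model; the paper's hybrid additionally interleaves Differential Evolution moves, which are omitted here, and all constants below are illustrative defaults:

```python
import numpy as np

def firefly_minimize(obj, bounds, n=15, iters=100, alpha=0.1, beta0=1.0, gamma=1.0, seed=0):
    """Basic Firefly Algorithm: dimmer fireflies move toward brighter
    (lower-cost) ones, with attractiveness decaying with squared distance."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    x = rng.uniform(lo, hi, size=(n, len(lo)))
    f = np.array([obj(p) for p in x])
    for _ in range(iters):
        for i in range(n):
            for j in range(n):
                if f[j] < f[i]:                       # j is brighter: attract i
                    r2 = float(np.sum((x[i] - x[j]) ** 2))
                    step = beta0 * np.exp(-gamma * r2) * (x[j] - x[i])
                    x[i] = np.clip(x[i] + step + alpha * rng.normal(size=len(lo)), lo, hi)
                    f[i] = obj(x[i])
        alpha *= 0.97                                 # damp the random walk over time
    best = int(np.argmin(f))
    return x[best], float(f[best])

if __name__ == "__main__":
    t = np.linspace(0, 4, 40)
    data = 2.0 * np.exp(-0.8 * t)                     # "measurements" from known params
    def sse(p):                                       # sum-of-squares misfit to fit
        return float(np.sum((p[0] * np.exp(-p[1] * t) - data) ** 2))
    params, err = firefly_minimize(sse, [(0.0, 5.0), (0.0, 5.0)])
    print(params, err)
```

Note that the best firefly never moves in this variant, so the best cost is non-increasing; the evolutionary neighbourhood-search refinement described in the abstract replaces this plain attraction step.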
NASA Astrophysics Data System (ADS)
Titov, A.; Gordov, E.; Okladnikov, I.
2009-04-01
This report presents the results of work devoted to the development of a working model of a software system for the storage, semantically-enabled search, and retrieval, along with processing and visualization, of environmental datasets containing results of meteorological and air-pollution observations and mathematical climate modeling. A specially designed metadata standard for machine-readable description of datasets related to the meteorology, climate, and atmospheric pollution transport domains is introduced as one of the key system components. To provide semantic interoperability, the Resource Description Framework (RDF, http://www.w3.org/RDF/) technology has been chosen for the realization of the metadata description model in the form of an RDF Schema. The final version of the RDF Schema is implemented on the basis of widely used standards such as the Dublin Core Metadata Element Set (http://dublincore.org/), the Directory Interchange Format (DIF, http://gcmd.gsfc.nasa.gov/User/difguide/difman.html), ISO 19139, etc. At present the system is available as a Web server (http://climate.risks.scert.ru/metadatabase/) based on the web-portal ATMOS engine [1] and implements dataset management functionality, including SeRQL-based semantic search as well as statistical analysis and visualization of selected data archives [2,3]. The core of the system is the Apache web server in conjunction with the Tomcat Java Servlet Container (http://jakarta.apache.org/tomcat/) and the Sesame Server (http://www.openrdf.org/), used as a database for RDF and RDF Schema. At present, statistical analysis of meteorological and climatic data with subsequent visualization of results is implemented for such datasets as the NCEP/NCAR Reanalysis, Reanalysis NCEP/DOE AMIP II, JMA/CRIEPI JRA-25, ECMWF ERA-40, and local measurements obtained from meteorological stations on the territory of Russia. This functionality is aimed primarily at finding the main characteristics of regional climate dynamics.
The proposed system represents a step in the process of development of a distributed collaborative information-computational environment to support multidisciplinary investigations of the Earth's regional environment [4]. Partial support of this work by SB RAS Integration Project 34, SB RAS Basic Program Project 4.5.2.2, APN Project CBA2007-08NSY, and FP6 Enviro-RISKS project (INCO-CT-2004-013427) is acknowledged. References: 1. E.P. Gordov, V.N. Lykosov, and A.Z. Fazliev. Web portal on environmental sciences "ATMOS" // Advances in Geosciences, 2006, Vol. 8, pp. 33-38. 2. Gordov E.P., Okladnikov I.G., Titov A.G. Development of elements of web based information-computational system supporting regional environment processes investigations // Journal of Computational Technologies, Vol. 12, Special Issue #3, 2007, pp. 20-28. 3. Okladnikov I.G., Titov A.G., Melnikova V.N., Shulgina T.M. Web-system for processing and visualization of meteorological and climatic data // Journal of Computational Technologies, Vol. 13, Special Issue #3, 2008, pp. 64-69. 4. Gordov E.P., Lykosov V.N. Development of information-computational infrastructure for integrated study of Siberia environment // Journal of Computational Technologies, Vol. 12, Special Issue #2, 2007, pp. 19-30.
Domain Decomposition: A Bridge between Nature and Parallel Computers
1992-09-01
…B., "Domain Decomposition Algorithms for Indefinite Elliptic Problems," SIAM Journal of Scientific and Statistical Computing, Vol. 13, 1992, pp.… AD-A256 575, NASA Contractor Report 189709, ICASE Report No. 92-44, DOMAIN DECOMPOSITION: A BRIDGE BETWEEN NATURE AND PARALLEL COMPUTERS …effectively implemented on distributed memory multiprocessors. In 1990 (as reported in Ref. 38 using the tile algorithm), a 103,201-unknown 2D elliptic
A Computer-Based Visual Analog Scale,
1992-06-01
…keys on the computer keyboard or other input device. The initial position of the arrow is always in the center of the scale to prevent biasing the… REFERENCES 1. Gift, A.G., "Visual Analogue Scales: Measurement of Subjective Phenomena," Nursing Research, Vol. 38, pp. 286-288, 1989. 2. Lundberg… 3. Menkes, D.B., Howard, R.C., Spears, G.F., and Cairns, E.R., "Salivary THC Following Cannabis Smoking Correlates With Subjective Intoxication and
Computational Fluid Dynamics for Atmospheric Entry
2009-09-01
…equations. This method is a parallelizable variant of the Gauss-Seidel line-relaxation method of MacCormack (Refs. 33, 35), and is at the core of the… G.V. Candler, "The Solution of the Navier-Stokes Equations by Gauss-Seidel Line Relaxation," Computers and Fluids, Vol. 17, No. 1, 1989, pp. 135-150. 35.… solution differs by 5% from the results obtained using the direct simulation Monte Carlo method. Some authors advocate the use of higher-order continuum
A Combinatorial Geometry Computer Description of the XR311 Vehicle
1978-04-01
…cards or magnetic tape. The shot line output of the GRID subroutine of the GIFT code is also stored on magnetic tape for future vulnerability… descriptions as processed by the Geometric Information For Targets (GIFT) computer code. This report documents the COM-GEOM target description for all… 72, March 1974. L.W. Bains and M.J. Reisinger, "The GIFT Code User Manual, Vol. 1, Introduction and Input Requirements," Ballistic Research
Preliminary Development of a Computational Model of a Dielectric Barrier Discharge
2004-12-01
…Gerhard Pietsch, "Microdischarges in Air-Fed Ozonizers," Journal of Physics D: Applied Physics, Vol. 24, 1991, pp. 564-572. 14. Baldur Eliasson, "Modeling… Gibalov and Gerhard Pietsch, "Two-dimensional Modeling of the Dielectric Barrier Discharge in Air," Plasma Sources Science Technology, 1 (1992), pp. 166… Computer Modeling," IEEE Transactions on Plasma Science, 27 (1), February 1999, pp. 36-37. 19. Valentin I. Gibalov and Gerhard J. Pietsch, "The
EvolQG - An R package for evolutionary quantitative genetics
Melo, Diogo; Garcia, Guilherme; Hubbe, Alex; Assis, Ana Paula; Marroig, Gabriel
2016-01-01
We present an open source package for performing evolutionary quantitative genetics analyses in the R environment for statistical computing. Evolutionary theory shows that evolution depends critically on the available variation in a given population. When dealing with many quantitative traits this variation is expressed in the form of a covariance matrix, particularly the additive genetic covariance matrix or sometimes the phenotypic matrix, when the genetic matrix is unavailable and there is evidence the phenotypic matrix is sufficiently similar to the genetic matrix. Given this mathematical representation of available variation, the EvolQG package provides functions for calculation of relevant evolutionary statistics; estimation of sampling error; corrections for this error; matrix comparison via correlations, distances and matrix decomposition; analysis of modularity patterns; and functions for testing evolutionary hypotheses on taxa diversification. PMID:27785352
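One of the matrix comparisons this kind of package provides is the random-skewers method, sketched below in Python with illustrative names and defaults (not the EvolQG API): random unit-length selection gradients are applied to two covariance matrices and the resulting response vectors (delta z = G beta, the multivariate breeder's equation) are compared by vector correlation.

```python
import numpy as np

def random_skewers(g1, g2, n=1000, seed=0):
    """Mean vector correlation of the responses of two covariance
    matrices to the same random selection gradients."""
    rng = np.random.default_rng(seed)
    betas = rng.normal(size=(n, g1.shape[0]))
    betas /= np.linalg.norm(betas, axis=1, keepdims=True)   # unit-length gradients
    r1, r2 = betas @ g1.T, betas @ g2.T                     # response vectors
    cos = np.sum(r1 * r2, axis=1) / (
        np.linalg.norm(r1, axis=1) * np.linalg.norm(r2, axis=1))
    return float(cos.mean())

if __name__ == "__main__":
    g = np.array([[1.0, 0.5], [0.5, 1.0]])
    print(random_skewers(g, g))        # identical matrices respond identically
```

Two matrices that deflect selection gradients in the same directions score near 1, regardless of overall scale, which is why the method is used to compare the structure of variation across populations.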
Mean-Potential Law in Evolutionary Games.
Nałęcz-Jawecki, Paweł; Miękisz, Jacek
2018-01-12
The Letter presents a novel way to connect random walks, stochastic differential equations, and evolutionary game theory. We introduce a new concept of a potential function for discrete-space stochastic systems. It is based on a correspondence between one-dimensional stochastic differential equations and random walks, which may be exact not only in the continuous limit but also in finite-state spaces. Our method is useful for computation of fixation probabilities in discrete stochastic dynamical systems with two absorbing states. We apply it to evolutionary games, formulating two simple and intuitive criteria for evolutionary stability of pure Nash equilibria in finite populations. In particular, we show that the 1/3 law of evolutionary games, introduced by Nowak et al. [Nature, 2004], follows from a more general mean-potential law.
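The quantity the Letter targets, the fixation probability of a single mutant in a two-absorbing-state birth-death chain, has a standard closed form in terms of the ratios of backward to forward transition rates, gamma_j = T-(j)/T+(j): rho = 1 / (1 + sum over k of the products of gamma_1..gamma_k). A short sketch of that standard formula (not the Letter's potential construction):

```python
def fixation_probability(gammas):
    """Fixation probability of one mutant in a birth-death chain with
    absorbing states 0 and N; gammas[j-1] = T-(j)/T+(j) for j = 1..N-1."""
    total, prod = 1.0, 1.0
    for g in gammas:
        prod *= g
        total += prod          # accumulate prod_{j<=k} gamma_j for each k
    return 1.0 / total

if __name__ == "__main__":
    N = 10
    neutral = [1.0] * (N - 1)  # neutral drift: rho = 1/N
    print(fixation_probability(neutral))  # prints 0.1
```

Criteria like the 1/3 law compare this rho against the neutral benchmark 1/N; a strategy's game payoffs enter only through the gamma_j ratios.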
Evolutionary versatility of eukaryotic protein domains revealed by their bigram networks
2011-01-01
Background: Protein domains are globular structures of independently folded polypeptides that exert catalytic or binding activities. Their sequences are recognized as evolutionary units that, through genome recombination, constitute protein repertoires of linkage patterns. Via mutations, domains acquire modified functions that contribute to the fitness of cells and organisms. Recent studies have addressed the evolutionary selection that may have shaped the functions of individual domains and the emergence of particular domain combinations, which led to new cellular functions in multi-cellular animals. This study focuses on modeling domain linkage globally and investigates evolutionary implications that may be revealed by novel computational analysis. Results: A survey of 77 completely sequenced eukaryotic genomes implies a potential hierarchical and modular organization of biological functions in most living organisms. Domains in a genome or multiple genomes are modeled as a network of hetero-duplex covalent linkages, termed bigrams. A novel computational technique is introduced to decompose such networks, whereby the notion of domain "networking versatility" is derived and measured. The most and least "versatile" domains (termed "core domains" and "peripheral domains" respectively) are examined both computationally via sequence conservation measures and experimentally using selected domains. Our study suggests that such a versatility measure extracted from the bigram networks correlates with the adaptivity of domains during evolution, where the network core domains are highly adaptive, significantly contrasting the network peripheral domains. Conclusions: Domain recombination has played a major part in the evolution of eukaryotes, contributing to genome complexity. From a system point of view, as the result of selection and constant refinement, networks of domain linkage are structured in a hierarchical modular fashion.
Domains with a high degree of networking versatility appear to be evolutionarily adaptive, potentially through functional innovations. Domain bigram networks are informative as a model of biological functions. The networking versatility indices extracted from such networks for individual domains reflect the strength of the evolutionary selection that the domains have experienced. PMID:21849086
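The bigram construction described above can be sketched on toy data: adjacent domain pairs within each protein architecture define the edges of a network, and a domain's number of distinct partners serves here as a crude stand-in for the networking versatility that the study measures by network decomposition (the data and the proxy are illustrative, not the paper's):

```python
from collections import defaultdict

def bigram_network(architectures):
    """Build a domain bigram network: every pair of adjacent domains in a
    protein architecture contributes an (undirected) edge. Returns the set
    of linkage partners per domain."""
    partners = defaultdict(set)
    for arch in architectures:
        for a, b in zip(arch, arch[1:]):   # adjacent pairs = bigrams
            partners[a].add(b)
            partners[b].add(a)
    return partners

if __name__ == "__main__":
    proteins = [
        ["SH3", "SH2", "Kinase"],
        ["SH2", "Kinase"],
        ["PH", "Kinase"],
    ]
    net = bigram_network(proteins)
    versatility = {d: len(p) for d, p in net.items()}  # crude versatility proxy
    print(versatility)
```

In this toy, Kinase and SH2 link to two distinct partners each (core-like), while SH3 and PH link to one (peripheral-like), mirroring the core/peripheral distinction the study derives from full network decomposition.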
Mastery Learning: A Comprehensive Bibliography. Fall 1982; Vol. 1, No. 2.
ERIC Educational Resources Information Center
Hymel, Glenn M.
A clearinghouse on mastery learning (CML) has been established at Loyola University (New Orleans, Louisiana) to accommodate the informational needs of mastery learning researchers, practitioners, and policymakers. Manual and computer searches on this topic have been conducted in the following databases: ERIC, Education Index, Psychological…
Determining Asset Criticality for Cyber Defense
2011-09-23
…sciences area that may be applied to our situation. In particular, the Analytic Hierarchy Process (AHP) [20] and Hierarchical TOPSIS [21][22] are some examples… Mathematical and Computer Modeling, vol. 45, no. 7-8, pp. 801-813, 2007. [22] Jia-Wen Wang, Ching-Hsue Cheng, and Kun-Cheng Huang, "Fuzzy
Data Visualization for ESM and ELINT: Visualizing 3D and Hyper Dimensional Data
2011-06-01
…technique to present multiple 2D views was devised by D. Asimov. He assembled multiple two-dimensional scatter plot views of the hyper-dimensional… [1] "Viewing Multidimensional Data," D. Asimov, SIAM Journal on Scientific and Statistical Computing, vol. 6, no. 1, pp. 128-143, 1985. [2] "High-Dimensional
Adult Literacy and Technology Newsletter. Vol. 3, Nos. 1-4.
ERIC Educational Resources Information Center
Gueble, Ed, Ed.
1989-01-01
This document consists of four issues of a newsletter focused on the spectrum of technology use in literacy instruction. The first issue contains the following articles: "Five 'Big' Systems and One 'Little' Option" (Weisberg); "Computer Use Patterns at Blackfeet Community College" (Hill); "Software Review: Educational Activities' Science Series"…
MODELS-3 (CMAQ). NARSTO NEWS (VOL. 3, NO. 2, SUMMER/FALL 1999)
A revised version of the U.S. EPA's Models-3/CMAQ system was released on June 30, 1999. Models-3 consists of a sophisticated computational framework for environmental models allowing for much flexibility in the communications between component parts of the system, in updating or ...
Agile Port and High Speed Ship Technologies, Vol 1: FY05 Projects 3-6 and 8-10
2008-07-02
Computational Fluid Dynamics DTMB - David Taylor Model Basin JVR - Jet Velocity Ratio NSWCCD - Naval Surface Warfare Center, Carderock Division SDD - Systems...immature current state of the technology employed for the reactor system (multiple closed Brayton Cycle, Helium Cooled Gas reactors); (iii) several
Real-Time Communication Systems: Design, Analysis and Implementation
1984-07-31
…sively [14]-[19]. A two-hop configuration involving a ring of repeaters around a station has been analyzed by Gitman [20]; STATION …network capacity… control of the packet-switching broadcast channels," J. Ass. Comput. Mach., vol. 24, pp. 375-386, July 1977. [20] I. Gitman, "On the capacity of
Investigation of Superdetonative Ram Accelerator Drive Modes
1989-12-15
…137. 18. Dwoyer, D.L., Kutler, P., and Povinelli, L.A., "Retooling CFD for Hypersonic Aircraft," Aerospace America, Vol. 25, Oct. 1987, pp. 32-35. 19.… Povinelli, L.A., "Advanced Computational Techniques for Hypersonic Propulsion," NASA Technical Memorandum No. 102005, NASA Lewis Research Center, Sept.
Recreation and Natural Area Needs Assessment (GREAT III)
1982-01-01
…1970 The Pennsylvania State University: Research and Computer Technician for Dr. E. L. Bergman, Department of Horticulture. Education: B.S., The… Publications: 1974 Becker, R. H. and R. O. Ray, "Accessibility: An Application of the New Technology," Therapeutic Recreation Journal, Vol. 8, No. 4. 1976 Becker
Surplus Value in Organizational Communication
1992-03-01
…according to Farace et al., you get what you pay for. 3. Efficiency and Value: The relationship of cost to effectiveness is the efficiency of a communication… Structures and Computer Support: A Field Experiment," ACM Transactions on Office Information Systems, Vol. 6, No. 4, pp. 354-379, October 1988. Farace, R., J.
Geo-Based Inter-Domain Routing (GIDR) Protocol for MANETS
2009-10-01
…routing, and support for node mobility. Crowcroft et al. proposed Plutarch as an architecture to translate address spaces and transport protocols among… Warfield, "Plutarch: An Argument for Network Pluralism," ACM Computer Communication Review, vol. 33, no. 4, pp. 258-266, 2003. [6] S. Schmid, L.
TESL Reporter, Vol. 10, No. 3.
ERIC Educational Resources Information Center
Pack, Alice C., Ed.
This issue of a publication devoted to providing ideas and guidance for teachers of English as a second language includes the following articles and features: (1) "Toward Interactive Modes in Guided Composition," (2) "Computer Compatibility in the Classroom," (3) "Discourse Structure in Reading," (4) "Terminal Behavior and Language," (5) "Sector…
DOT National Transportation Integrated Search
1981-10-01
This report presents an updated description of a vehicle simulation program, VEHSIM, which can determine the fuel economy and performance of a specified vehicle over a defined route as it executes a given driving schedule. Vehicle input accommodated ...
Developing Software to Use Parallel Processing Effectively
1988-10-01
Experience, Vol. 15(6), June 1985, p. 53. [Gajski85] Gajski, Daniel D. and Jih-Kwon Peir, "Essential Issues in Multiprocessor Systems", IEEE Computer, June...Treleaven (eds.), Springer-Verlag, pp. 213-225 (June 1987). [Kuck83] David Kuck, Duncan Lawrie, Ron Cytron, Ahmed Sameh and Daniel Gajski, The Architecture and
Evolutionary computing for the design search and optimization of space vehicle power subsystems
NASA Technical Reports Server (NTRS)
Kordon, Mark; Klimeck, Gerhard; Hanks, David; Hua, Hook
2004-01-01
Evolutionary computing has proven to be a straightforward and robust approach for optimizing a wide range of difficult analysis and design problems. This paper discusses the application of these techniques to an existing space vehicle power subsystem resource and performance analysis simulation in a parallel processing environment. Our preliminary results demonstrate that this approach has the potential to improve the space system trade study process by allowing engineers to statistically weight subsystem goals of mass, cost and performance, then automatically size power elements based on anticipated performance of the subsystem rather than on worst-case estimates.
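The weighted trade-study idea described above can be sketched as a small genetic algorithm whose fitness statistically weights mass, cost and performance goals. Everything below (the power-element models, weights, parameter ranges and the 2,000 W demand cap) is invented for illustration and is not the simulation described in the abstract.

```python
import random

# Hypothetical goal weights an engineer might assign in a trade study.
WEIGHTS = {"mass": 0.3, "cost": 0.2, "performance": 0.5}

def fitness(design):
    area, battery = design                            # solar-array m^2, battery Ah (toy units)
    mass = 2.5 * area + 0.8 * battery                 # kg, toy model
    cost = 10.0 * area + 4.0 * battery                # k$, toy model
    performance = min(area * 300.0, battery * 50.0, 2000.0)  # W, capped at assumed demand
    # Higher performance is rewarded; mass and cost are penalized.
    return (WEIGHTS["performance"] * performance
            - WEIGHTS["mass"] * mass
            - WEIGHTS["cost"] * cost)

def mutate(design, sigma=0.5):
    # Gaussian perturbation, keeping sizes physically positive.
    return tuple(max(0.1, g + random.gauss(0, sigma)) for g in design)

def evolve(pop_size=40, generations=60):
    pop = [(random.uniform(1, 20), random.uniform(1, 100)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]                 # elitist truncation selection
        pop = parents + [mutate(random.choice(parents)) for _ in parents]
    return max(pop, key=fitness)

best = evolve()
```

A run selects array and battery sizes that meet the assumed demand at low mass and cost; shifting the weights shifts the resulting trade-off, which is the statistical-weighting idea in miniature.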
The Evolution of Biological Complexity in Digital Organisms
NASA Astrophysics Data System (ADS)
Ofria, Charles
2013-03-01
When Darwin first proposed his theory of evolution by natural selection, he realized that it had a problem explaining the origins of traits of ``extreme perfection and complication'' such as the vertebrate eye. Critics of Darwin's theory have latched onto this perceived flaw as a proof that Darwinian evolution is impossible. In anticipation of this issue, Darwin described the perfect data needed to understand this process, but lamented that such data are ``scarcely ever possible'' to obtain. In this talk, I will discuss research where we use populations of digital organisms (self-replicating and evolving computer programs) to elucidate the genetic and evolutionary processes by which new, highly-complex traits arise, drawing inspiration directly from Darwin's wistful thinking and hypotheses. During the process of evolution in these fully-transparent computational environments we can measure the incorporation of new information into the genome, a process akin to a natural Maxwell's Demon, and identify the original source of any such information. We show that, as Darwin predicted, much of the information used to encode a complex trait was already in the genome as part of simpler evolved traits, and that many routes must be possible for a new complex trait to have a high probability of successfully evolving. In even more extreme examples of the evolution of complexity, we are now using these same principles to examine the evolutionary dynamics that drive major transitions in evolution; that is, transitions to higher levels of organization, which are some of the most complex evolutionary events to occur in nature. Finally, I will explore some of the implications of this research for other aspects of evolutionary biology, as well as ways that these evolutionary principles can be applied toward solving computational and engineering problems.
Artificial intelligence in peer review: How can evolutionary computation support journal editors?
Mrowinski, Maciej J; Fronczak, Piotr; Fronczak, Agata; Ausloos, Marcel; Nedic, Olgica
2017-01-01
With the volume of manuscripts submitted for publication growing every year, the deficiencies of peer review (e.g. long review times) are becoming more apparent. Editorial strategies, sets of guidelines designed to speed up the process and reduce editors' workloads, are treated as trade secrets by publishing houses and are not shared publicly. To improve the effectiveness of their strategies, editors in small publishing groups are faced with undertaking an iterative trial-and-error approach. We show that Cartesian Genetic Programming, a nature-inspired evolutionary algorithm, can dramatically improve editorial strategies. The artificially evolved strategy reduced the duration of the peer review process by 30%, without increasing the pool of reviewers (in comparison to a typical human-developed strategy). Evolutionary computation has typically been used in technological processes or biological ecosystems. Our results demonstrate that genetic programs can improve real-world social systems that are usually much harder to understand and control than physical systems.
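For readers unfamiliar with Cartesian Genetic Programming, the sketch below shows its genome-and-mutate machinery on a toy symbolic-regression task. This is generic CGP, not the editorial-strategy representation used in the study; the function set, genome size and mutation rate are assumptions.

```python
import random
import operator

# Node functions; the last one simply passes its first input through.
FUNCS = [operator.add, operator.sub, operator.mul, lambda a, b: a]

def random_genome(n_inputs=2, n_nodes=8):
    genome = []
    for i in range(n_nodes):
        limit = n_inputs + i                      # feed-forward connections only
        genome.append((random.randrange(len(FUNCS)),
                       random.randrange(limit),
                       random.randrange(limit)))
    return genome

def evaluate(genome, inputs):
    values = list(inputs)
    for f, a, b in genome:                        # each node reads earlier values
        values.append(FUNCS[f](values[a], values[b]))
    return values[-1]                             # last node is the output

def error(genome, target=lambda x, y: x * x + y):
    pts = [(x, y) for x in range(-2, 3) for y in range(-2, 3)]
    return sum((evaluate(genome, p) - target(*p)) ** 2 for p in pts)

def mutate(genome, rate=0.2, n_inputs=2):
    out = []
    for i, (f, a, b) in enumerate(genome):
        if random.random() < rate:                # re-draw the whole node gene
            limit = n_inputs + i
            f, a, b = (random.randrange(len(FUNCS)),
                       random.randrange(limit),
                       random.randrange(limit))
        out.append((f, a, b))
    return out

def evolve(generations=200):
    parent = random_genome()
    for _ in range(generations):                  # (1+4) evolution strategy
        children = [mutate(parent) for _ in range(4)]
        parent = min(children + [parent], key=error)
    return parent
```

The same (1+4) loop can optimize any genome-to-fitness mapping; in the paper's setting the fitness would be a simulated review duration rather than a regression error.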
Evolutionary fuzzy modeling human diagnostic decisions.
Peña-Reyes, Carlos Andrés
2004-05-01
Fuzzy CoCo is a methodology, combining fuzzy logic and evolutionary computation, for constructing systems able to accurately predict the outcome of a human decision-making process, while providing an understandable explanation of the underlying reasoning. Fuzzy logic provides a formal framework for constructing systems exhibiting both good numeric performance (accuracy) and linguistic representation (interpretability). However, fuzzy modeling--meaning the construction of fuzzy systems--is an arduous task, demanding the identification of many parameters. To solve it, we use evolutionary computation techniques (specifically cooperative coevolution), which are widely used to search for adequate solutions in complex spaces. We have successfully applied the algorithm to model the decision processes involved in two breast cancer diagnostic problems, the WBCD problem and the Catalonia mammography interpretation problem, obtaining systems both of high performance and high interpretability. For the Catalonia problem, an evolved system was embedded within a Web-based tool-called COBRA-for aiding radiologists in mammography interpretation.
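To give a flavor of the interpretable fuzzy models Fuzzy CoCo evolves, the sketch below evaluates one diagnostic rule. The variable names, membership-function parameters and the rule itself are hypothetical, not taken from the WBCD or Catalonia systems.

```python
def triangular(x, a, b, c):
    """Membership degree of x in the triangular fuzzy set (a, b, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def diagnose(cell_size, cell_shape):
    # Linguistic terms; an evolutionary search would tune these triangles.
    size_large = triangular(cell_size, 4.0, 10.0, 16.0)
    shape_irregular = triangular(cell_shape, 3.0, 10.0, 17.0)
    # Rule: IF size IS large AND shape IS irregular THEN malignancy is high
    # (AND realized as min, a common fuzzy conjunction).
    return min(size_large, shape_irregular)
```

In Fuzzy CoCo proper, a cooperative coevolutionary algorithm tunes the membership functions and the rule base in separate, cooperating populations, which is what makes the many-parameter identification tractable.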
Understanding Evolutionary Potential in Virtual CPU Instruction Set Architectures
Bryson, David M.; Ofria, Charles
2013-01-01
We investigate fundamental decisions in the design of instruction set architectures for linear genetic programs that are used as both model systems in evolutionary biology and underlying solution representations in evolutionary computation. We subjected digital organisms with each tested architecture to seven different computational environments designed to present a range of evolutionary challenges. Our goal was to engineer a general purpose architecture that would be effective under a broad range of evolutionary conditions. We evaluated six different types of architectural features for the virtual CPUs: (1) genetic flexibility: we allowed digital organisms to more precisely modify the function of genetic instructions, (2) memory: we provided an increased number of registers in the virtual CPUs, (3) decoupled sensors and actuators: we separated input and output operations to enable greater control over data flow. We also tested a variety of methods to regulate expression: (4) explicit labels that allow programs to dynamically refer to specific genome positions, (5) position-relative search instructions, and (6) multiple new flow control instructions, including conditionals and jumps. Each of these features also adds complication to the instruction set and risks slowing evolution due to epistatic interactions. Two features (multiple argument specification and separated I/O) demonstrated substantial improvements in the majority of test environments, along with versions of each of the remaining architecture modifications that show significant improvements in multiple environments. However, some tested modifications were detrimental, though most exhibit no systematic effects on evolutionary potential, highlighting the robustness of digital evolution. 
Combined, these observations enhance our understanding of how instruction architecture impacts evolutionary potential, enabling the creation of architectures that support more rapid evolution of complex solutions to a broad range of challenges. PMID:24376669
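A toy linear-genome virtual CPU illustrates the kind of architecture under study; the four-instruction set below is invented for illustration and is far smaller than the virtual CPUs actually compared (e.g. Avida's).

```python
def run(genome, inputs, steps=100):
    regs = {"AX": 0, "BX": 0}                  # a small register file
    inbuf = list(inputs)
    output = []
    for step in range(steps):
        op, arg = genome[step % len(genome)]   # the genome executes circularly
        if op == "inc":
            regs[arg] += 1
        elif op == "add":                      # AX <- AX + BX
            regs["AX"] += regs["BX"]
        elif op == "input" and inbuf:          # separated sensor...
            regs[arg] = inbuf.pop(0)
        elif op == "output":                   # ...and actuator instructions
            output.append(regs[arg])
    return output

# A hand-written genome that reads two numbers and outputs their sum.
genome = [("input", "AX"), ("input", "BX"), ("add", None), ("output", "AX")]
```

Design choices like the number of registers, or keeping `input`/`output` as separate instructions, are exactly the kinds of architectural features whose evolutionary consequences the paper measures.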
A framework for evolutionary systems biology
Loewe, Laurence
2009-01-01
Background Many difficult problems in evolutionary genomics are related to mutations that have weak effects on fitness, as the consequences of mutations with large effects are often simple to predict. Current systems biology has accumulated much data on mutations with large effects and can predict the properties of knockout mutants in some systems. However experimental methods are too insensitive to observe small effects. Results Here I propose a novel framework that brings together evolutionary theory and current systems biology approaches in order to quantify small effects of mutations and their epistatic interactions in silico. Central to this approach is the definition of fitness correlates that can be computed in some current systems biology models employing the rigorous algorithms that are at the core of much work in computational systems biology. The framework exploits synergies between the realism of such models and the need to understand real systems in evolutionary theory. This framework can address many longstanding topics in evolutionary biology by defining various 'levels' of the adaptive landscape. Addressed topics include the distribution of mutational effects on fitness, as well as the nature of advantageous mutations, epistasis and robustness. Combining corresponding parameter estimates with population genetics models raises the possibility of testing evolutionary hypotheses at a new level of realism. Conclusion EvoSysBio is expected to lead to a more detailed understanding of the fundamental principles of life by combining knowledge about well-known biological systems from several disciplines. This will benefit both evolutionary theory and current systems biology. Understanding robustness by analysing distributions of mutational effects and epistasis is pivotal for drug design, cancer research, responsible genetic engineering in synthetic biology and many other practical applications. PMID:19239699
A Bright Future for Evolutionary Methods in Drug Design.
Le, Tu C; Winkler, David A
2015-08-01
Most medicinal chemists understand that chemical space is extremely large, essentially infinite. Although high-throughput experimental methods allow exploration of drug-like space more rapidly, they are still insufficient to fully exploit the opportunities that such large chemical space offers. Evolutionary methods can synergistically blend automated synthesis and characterization methods with computational design to identify promising regions of chemical space more efficiently. We describe how evolutionary methods are implemented, and provide examples of published drug development research in which these methods have generated molecules with increased efficacy. We anticipate that evolutionary methods will play an important role in future drug discovery. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
An evolutionary algorithm that constructs recurrent neural networks.
Angeline, P J; Saunders, G M; Pollack, J B
1994-01-01
Standard methods for simultaneously inducing the structure and weights of recurrent neural networks limit every task to an assumed class of architectures. Such a simplification is necessary since the interactions between network structure and function are not well understood. Evolutionary computations, which include genetic algorithms and evolutionary programming, are population-based search methods that have shown promise in many similarly complex tasks. This paper argues that genetic algorithms are inappropriate for network acquisition and describes an evolutionary program, called GNARL, that simultaneously acquires both the structure and weights for recurrent networks. GNARL's empirical acquisition method allows for the emergence of complex behaviors and topologies that are potentially excluded by the artificial architectural constraints imposed in standard network induction methods.
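The following sketch shows GNARL-style mutation in miniature: a network is a node set plus weighted (possibly recurrent) links, and a mutation is either parametric (perturb weights) or structural (add a hidden node or a link). The probabilities and the use of the temperature parameter here are simplified assumptions, not GNARL's exact operators.

```python
import random

def new_network(n_in=2, n_out=1):
    # Nodes are integer ids; links map (src, dst) pairs to weights.
    return {"nodes": list(range(n_in + n_out)), "links": {}, "next_id": n_in + n_out}

def mutate(net, temperature=0.5):
    # Copy so the parent network is left untouched.
    net = {"nodes": list(net["nodes"]), "links": dict(net["links"]),
           "next_id": net["next_id"]}
    if random.random() < temperature:                    # structural mutation
        if random.random() < 0.5:
            net["nodes"].append(net["next_id"])          # add a hidden node
            net["next_id"] += 1
        else:
            src = random.choice(net["nodes"])
            dst = random.choice(net["nodes"])            # self/recurrent links allowed
            net["links"][(src, dst)] = random.gauss(0.0, 1.0)
    else:                                                # parametric mutation
        for k in net["links"]:
            net["links"][k] += random.gauss(0.0, temperature)
    return net
```

Because structure and weights mutate together, no fixed architecture class is assumed in advance, which is the paper's central argument against standard network-induction methods.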
Crew/computer communications study. Volume 2: Appendixes
NASA Technical Reports Server (NTRS)
Johannes, J. D.
1974-01-01
The software routines developed during the crew/computer communications study are described to provide the user with an understanding of each routine, any restrictions in use, the required input data, and expected results after executing the routines. The combination of routines to generate a crew/computer communications application is also explained. The programmable keyboard and display used by the program is described, and an experiment scenario is provided to illustrate the relationship between the program frames when they are grouped into activity phases. Program descriptions and a user's guide are also presented. For Vol. 1, see N74-18843.
EvoluZion: A Computer Simulator for Teaching Genetic and Evolutionary Concepts
ERIC Educational Resources Information Center
Zurita, Adolfo R.
2017-01-01
EvoluZion is a forward-in-time genetic simulator developed in Java and designed to perform real time simulations on the evolutionary history of virtual organisms. These model organisms harbour a set of 13 genes that codify an equal number of phenotypic features. These genes change randomly during replication, and mutant genes can have null,…
Generative Representations for Computer-Automated Evolutionary Design
NASA Technical Reports Server (NTRS)
Hornby, Gregory S.
2006-01-01
With the increasing computational power of computers, software design systems are progressing from being tools for architects and designers to express their ideas to tools capable of creating designs under human guidance. One of the main limitations for these computer-automated design systems is the representation with which they encode designs. If the representation cannot encode a certain design, then the design system cannot produce it. To be able to produce new types of designs, and not just optimize pre-defined parameterizations, evolutionary design systems must use generative representations. Generative representations are assembly procedures, or algorithms, for constructing a design thereby allowing for truly novel design solutions to be encoded. In addition, by enabling modularity, regularity and hierarchy, the level of sophistication that can be evolved is increased. We demonstrate the advantages of generative representations on two different design domains: the evolution of spacecraft antennas and the evolution of 3D objects.
In-vivo Diagnosis of Breast Cancer Using Gamma Stimulated Emission Computed Tomography
2011-04-01
2006. [9] Floyd CE, Howell CR, Harrawood BP, Crowell AS, Kapadia AJ, Macri R, Xia JQ, Pedroni R, Bowsher J, Kiser MR, Tourassi GD, Tornow W, and...spin-sequence 0-1-2), with emitted gamma-ray energies 3448 keV, 2601 keV and 2657.562 keV. In our simulation, we take two major de-excitation parts into... Walter R, "Neutron Stimulated Emission Computed Tomography of Stable Isotopes," Proceedings of SPIE Medical Imaging 2004, vol. 5368, pp. 248-254. 17
2001-10-25
a CT image, each voxel contains an integer number which is the CT value, in Hounsfield units (HU), of the voxel. Therefore, the standard method of...Task Number Work Unit Number Performing Organization Name(s) and Address(es) Department of Electrical and Computer Engineering, University of...", Journal of Pediatric Surgery, vol. 24(7), pp. 708-711, 1989. [4] I. N. Bankman, editor, Handbook of Medical Image Analysis, Academic Press, London, UK
Hybrid Systems: Computation and Control.
1999-02-17
computer science; Vol. 1386) ISBN 3-540-64358-3 CR Subject Classification (1991): C.1.m, C.3, D.2.1, F.3.1, F.1.2, J.2 ISSN 0302-9743 ISBN 3-540-64358-3 Springer-Verlag Berlin Heidelberg New York This work is subject to copyright. All rights are reserved, whether the whole or part of the material...10632061 06/3142 - 5 4 3 2 1 0 Printed on acid-free paper Preface This volume contains the proceedings of the First International Workshop on Hybrid Systems
Computer-Aided Design Package for Designers of Digital Optical Computers
1993-07-01
Saul Levy, Chun Liew, Masoud Majidi, Donald Smith, and Thomas Stone Final Report for Grant #N00014-90-J-4018 Period Covered: 5/1/90 - 4/30/93 Miles...Logic Arrays," Applied Optics, 27, pp. 1651-1660, (May 1, 1988). [5] Murdocca, M. J., V. Gupta, and M. Majidi, "New Approaches to Digital Optical...Lanzl, F., H.-J. Preuss and G. Wiegelt, eds., Proc. SPIE, vol. 319, Garmisch, Bavaria, pp. 126-127, (1990). Murdocca, M. J., V. Gupta, and M. Majidi
1983-05-01
Parallel Computation that Assign Canonical Object-Based Frames of Reference," Proc. 7th Int. Joint Conf. on Artificial Intelligence (IJCAI-81), Vol. 2...Perception of Linear Structure in Imaged Data," TN 276, Artificial Intelligence Center, SRI International, Feb. 1983. [Fram75] J.P. Fram and E.S...1983 May 1983 By: Martin A. Fischler, Program Director, Principal Investigator, (415)859-5106, Artificial Intelligence Center
Parametric Estimation of Load for Air Force Data Centers
2015-03-27
R. Nelson, L. Orsenigo and S. Winter, "'History-friendly' models of industry evolution: the computer industry," Industrial and Corporate Change, vol. 8, no. 1, pp. 3-40, 1999. [7] VMWare...NAME(S) AND ADDRESS(ES) Vinh Phung, 38ES/ENOC 5813 Arnold St, Building 4064 Tinker AFB OK 73145-8120 COM: 405-734-7461, vinh.phung@us.af.mil 10
2015-05-28
recognition is simpler and requires less computational resources compared to other inputs such as facial expressions. The Berlin database of Emotional ...Processing Magazine, IEEE, vol. 18, no. 1, pp. 32–80, 2001. [15] K. R. Scherer, T. Johnstone, and G. Klasmeyer, “Vocal expression of emotion ...Network for Real-Time Speech-Emotion Recognition 5a. CONTRACT NUMBER IN-HOUSE 5b. GRANT NUMBER 5c. PROGRAM ELEMENT NUMBER 62788F 6. AUTHOR(S) Q
Process Defects in Composites.
1995-01-30
mean velocity, U, a high kinematic viscosity, ν, and a small diameter of the fibers, D, lead to a very small Reynolds number Re = UD/ν << 1 (1) where ρ is...partial credit to ARO). 9. D. Krajcinovic and S. Mastilovic, "Damage Evolution and Failure Modes", in: Proc. of the Int. Conf. on Computational..."Computer Simulation of a Model for Irreversible Gelation", Journal of Physics A, Vol. 16, pp. 1221-1239. Kuksenko, V. S. and Tamuzs, V. P., 1981
Cancer evolution: mathematical models and computational inference.
Beerenwinkel, Niko; Schwarz, Roland F; Gerstung, Moritz; Markowetz, Florian
2015-01-01
Cancer is a somatic evolutionary process characterized by the accumulation of mutations, which contribute to tumor growth, clinical progression, immune escape, and drug resistance development. Evolutionary theory can be used to analyze the dynamics of tumor cell populations and to make inference about the evolutionary history of a tumor from molecular data. We review recent approaches to modeling the evolution of cancer, including population dynamics models of tumor initiation and progression, phylogenetic methods to model the evolutionary relationship between tumor subclones, and probabilistic graphical models to describe dependencies among mutations. Evolutionary modeling helps to understand how tumors arise and will also play an increasingly important prognostic role in predicting disease progression and the outcome of medical interventions, such as targeted therapy. © The Author(s) 2014. Published by Oxford University Press on behalf of the Society of Systematic Biologists.
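A population-dynamics model of tumor growth of the sort reviewed here can be sketched as a discrete branching process with mutation; the rates below are illustrative only and are not fitted to any data.

```python
import random

def simulate(generations=20, birth=0.55, death=0.35, mut_rate=0.01, seed_size=10):
    # Each clone is identified by the tuple of mutation ids it has accumulated.
    clones = {(): seed_size}
    next_mut = 0
    for _ in range(generations):
        new = {}
        for clone, n in clones.items():
            for _ in range(n):
                r = random.random()
                if r < birth:                 # division; child may carry a new mutation
                    child = clone
                    if random.random() < mut_rate:
                        child = clone + (next_mut,)
                        next_mut += 1
                    new[child] = new.get(child, 0) + 1
                    new[clone] = new.get(clone, 0) + 1
                elif r < birth + death:
                    pass                      # cell death
                else:
                    new[clone] = new.get(clone, 0) + 1   # quiescent survival
        clones = {c: n for c, n in new.items() if n > 0}
        if not clones:
            break                             # tumor extinct
    return clones
```

Because each surviving clone records its mutation history, the final dictionary is a crude subclone decomposition from which the kind of phylogenetic relationships discussed in the review could be reconstructed.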
Effect of Shielding Gas on the Properties of AW 5083 Aluminum Alloy Laser Weld Joints
NASA Astrophysics Data System (ADS)
Vyskoč, Maroš; Sahul, Miroslav; Sahul, Martin
2018-04-01
The paper deals with the evaluation of the shielding gas influence on the properties of AW 5083 aluminum alloy weld joints produced with disk laser. Butt weld joints were produced under different shielding gas types, namely Ar, He, Ar + 5 vol.% He, Ar + 30 vol.% He and without shielding weld pool. Light and electron microscopy, computed tomography, microhardness measurements and tensile testing were used for evaluation of weld joint properties. He-shielded weld joints were the narrowest ones. On the other hand, Ar-shielded weld joints exhibited largest weld width. The choice of shielding gas had significant influence on the porosity level of welds. The lowest porosity was observed in weld joint produced in Ar with the addition of 5 vol.% He shielding atmosphere (only 0.03%), while the highest level of porosity was detected in weld joint produced in pure He (0.24%). Except unshielded aluminum alloy weld joint, the lowest tensile strength was recorded in He-shielded weld joints. On the contrary, the highest average microhardness was measured in He-shielded weld joints.
Computational evolution: taking liberties.
Correia, Luís
2010-09-01
Evolution has, for a long time, inspired computer scientists to produce computer models mimicking its behavior. Evolutionary algorithm (EA) is one of the areas where this approach has flourished. EAs have been used to model and study evolution, but they have been especially developed for their aptitude as optimization tools for engineering. Developed models are quite simple in comparison with their natural sources of inspiration. However, since EAs run on computers, we have the freedom, especially in optimization models, to test approaches both realistic and outright speculative, from the biological point of view. In this article, we discuss different common evolutionary algorithm models, and then present some alternatives of interest. These include biologically inspired models, such as co-evolution and, in particular, symbiogenetics and outright artificial operators and representations. In each case, the advantages of the modifications to the standard model are identified. The other area of computational evolution, which has allowed us to study basic principles of evolution and ecology dynamics, is the development of artificial life platforms for open-ended evolution of artificial organisms. With these platforms, biologists can test theories by directly manipulating individuals and operators, observing the resulting effects in a realistic way. An overview of the most prominent of such environments is also presented. If instead of artificial platforms we use the real world for evolving artificial life, then we are dealing with evolutionary robotics (ERs). A brief description of this area is presented, analyzing its relations to biology. Finally, we present the conclusions and identify future research avenues in the frontier of computation and biology. Hopefully, this will help to draw the attention of more biologists and computer scientists to the benefits of such interdisciplinary research.
Doss, C George Priya; Chakrabarty, Chiranjib; Debajyoti, C; Debottam, S
2014-11-01
Certain mysteries surrounding recruitment pathways, cell cycle regulation mechanisms, spindle checkpoint assembly, and the chromosome segregation process are a centre of attention in cancer research. In modern times, with established databases, a range of computational platforms has made it possible to examine almost all the physiological and biochemical evidence in disease-associated phenotypes. Using existing computational methods, we have utilized the amino acid residues to understand the similarity within the evolutionary variance of different associated centromere proteins. This study of sequence similarity, protein-protein networking, co-expression analysis, and the evolutionary trajectory of centromere proteins will speed up the understanding of centromere biology and will create a road map for upcoming researchers initiating their work of clinical sequencing using centromere proteins.
Bipartite Graphs as Models of Population Structures in Evolutionary Multiplayer Games
Peña, Jorge; Rochat, Yannick
2012-01-01
By combining evolutionary game theory and graph theory, “games on graphs” study the evolutionary dynamics of frequency-dependent selection in population structures modeled as geographical or social networks. Networks are usually represented by means of unipartite graphs, and social interactions by two-person games such as the famous prisoner’s dilemma. Unipartite graphs have also been used for modeling interactions going beyond pairwise interactions. In this paper, we argue that bipartite graphs are a better alternative to unipartite graphs for describing population structures in evolutionary multiplayer games. To illustrate this point, we make use of bipartite graphs to investigate, by means of computer simulations, the evolution of cooperation under the conventional and the distributed N-person prisoner’s dilemma. We show that several implicit assumptions arising from the standard approach based on unipartite graphs (such as the definition of replacement neighborhoods, the intertwining of individual and group diversity, and the large overlap of interaction neighborhoods) can have a large impact on the resulting evolutionary dynamics. Our work provides a clear example of the importance of construction procedures in games on graphs, of the suitability of bigraphs and hypergraphs for computational modeling, and of the importance of concepts from social network analysis such as centrality, centralization and bipartite clustering for the understanding of dynamical processes occurring on networked population structures. PMID:22970237
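The bipartite structure the authors advocate can be captured directly in code: one side holds individuals, the other interaction groups, and the edges are group memberships. The public-goods payoff below is a standard N-person prisoner's dilemma form; the enhancement factor, cost and update scheme are illustrative.

```python
def payoff(strategy, n_cooperators, group_size, r=3.0, cost=1.0):
    """Each cooperator pays `cost` into a pot multiplied by r and shared equally."""
    share = r * cost * n_cooperators / group_size
    return share - (cost if strategy == 1 else 0.0)

def play_round(membership, strategies):
    """membership: dict group -> member list (the bipartite edges);
    strategies: dict individual -> 1 (cooperate) or 0 (defect)."""
    scores = {i: 0.0 for i in strategies}
    for members in membership.values():
        n_coop = sum(strategies[i] for i in members)
        for i in members:
            scores[i] += payoff(strategies[i], n_coop, len(members))
    return scores

# Two overlapping groups: individuals 1 and 2 interact in both.
membership = {"g1": [0, 1, 2], "g2": [1, 2, 3]}
strategies = {0: 1, 1: 1, 2: 0, 3: 0}
scores = play_round(membership, strategies)
```

Because groups are explicit nodes rather than cliques projected onto a unipartite graph, interaction neighborhoods and replacement neighborhoods stay decoupled, which is the modeling advantage the paper argues for.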
The Handicap Principle for Trust in Computer Security, the Semantic Web and Social Networking
NASA Astrophysics Data System (ADS)
Ma, Zhanshan (Sam); Krings, Axel W.; Hung, Chih-Cheng
Communication is a fundamental function of life, and it exists in almost all living things: from single-cell bacteria to human beings. Communication, competition and cooperation are three fundamental processes in nature. Computer scientists are familiar with the study of competition or 'struggle for life' through Darwin's evolutionary theory, or even evolutionary computing. They may be equally familiar with the study of cooperation or altruism through the Prisoner's Dilemma (PD) game. However, they are likely to be less familiar with the theory of animal communication. The objective of this article is three-fold: (i) To suggest that the study of animal communication, especially the honesty (reliability) of animal communication, in which some significant advances in behavioral biology have been achieved in the last three decades, is poised to spawn important cross-disciplinary research similar to that generated by the study of cooperation with the PD game. One of the far-reaching advances in the field is marked by the publication of "The Handicap Principle: a Missing Piece of Darwin's Puzzle" by Zahavi (1997). The 'Handicap' principle [34][35], which states that communication signals must be costly in some proper way to be reliable (honest), is best elucidated with evolutionary games, e.g., the Sir Philip Sidney (SPS) game [23]. Accordingly, we suggest that the Handicap principle may serve as a fundamental paradigm for trust research in computer science. (ii) To suggest to computer scientists that their expertise in modeling computer networks may help behavioral biologists in their study of the reliability of animal communication networks. This is largely due to the historical reason that, until the last decade, animal communication was studied with the dyadic paradigm (sender-receiver) rather than with the network paradigm.
(iii) To pose several open questions, the answers to which may bear some refreshing insights to trust research in computer science, especially secure and resilient computing, the semantic web, and social networking. One important thread unifying the three aspects is the evolutionary game theory modeling or its extensions with survival analysis and agreement algorithms [19][20], which offer powerful game models for describing time-, space-, and covariate-dependent frailty (uncertainty and vulnerability) and deception (honesty).
The report gives results of the collection of emissions tests data at two triethylene glycol units to provide data for comparison to GRI-GLYCalc, a computer program developed to estimate emissions from glycol dehydrators. (NOTE: Glycol dehydrators are used in the natural gas indu...
Current Trends in English Language Testing. Conference Proceedings for CTELT 1997 and 1998, Vol. 1.
ERIC Educational Resources Information Center
Coombe, Christine A., Ed.
Papers from the 1997 and 1998 Current Trends in English Language Testing (CTELT) conferences include: "Computer-Based Language Testing: The Call of the Internet" (G. Fulcher); "Uses of the PET (Preliminary English Test) at Sultan Qaboos University" (R. Taylor); "Issues in Foreign and Second Language Academic Listening…
ERIC Educational Resources Information Center
Ellis, Nick C.
2009-01-01
This article presents an analysis of interactions in the usage, structure, cognition, coadaptation of conversational partners, and emergence of linguistic constructions. It focuses on second language development of English verb-argument constructions (VACs: VL, verb locative; VOL, verb object locative; VOO, ditransitive) with particular reference…
Issues in the Convergence of Control with Communication and Computation
2004-10-04
Library/Upload/116/Cal1.doc. [42] M. H. Shwehdi and A. Z. Khan, “A power line data communication interface using spread spectrum technology in home automation,” IEEE Transactions on Power Delivery, vol. 11, pp. 1232–1237, July 1996. ISSN: 0885-8977. [43] R. G. Olsen, “Technical considerations for
Characteristics, Control and Treatment of Leachate at Military Installations.
1981-02-01
points in the area adjacent to the landfill. Background data can be obtained 61 C. W. Thornthwaite and J. R. Mather, "Instructions and Tables for...Development," Public Works, Vol. 102, No. 2 (March 1971), pp. 77-79. Thornthwaite, C. W., and J. R. Mather, "Instructions and Tables for Computing Potential
Notes on Search, Detection and Localization Modeling. Revision 4
1990-10-01
2, ..., k, where the subregions are relabeled so that the following order relation holds: p1/δ1 > p2/δ2 > ... > pk/δk, and where k is chosen so... "Interfaces," Operations Research, Vol. 19, No. 3, pp. 559-586, 1971. 15. Coggins, P.B., "Detection Probability Computations for Random Search of an
Reengineering Aircraft Structural Life Prediction Using a Digital Twin
2011-01-01
that exaflop-per-second computers will become available: “extrapolation of current hardware trends suggests that exascale systems could be available in...vol. 28, no. 5, pp. 339–350, 2002. [4] H. Simon, T. Zacharia, and R. Stevens, Modeling and Simulation at the Exascale for Energy and the Environment
Library Micro-Computing, Vol. 2. Reprints from the Best of "ONLINE" [and]"DATABASE."
ERIC Educational Resources Information Center
Online, Inc., Weston, CT.
Reprints of 19 articles pertaining to library microcomputing appear in this collection, the second of two volumes on this topic in a series of volumes of reprints from "ONLINE" and "DATABASE" magazines. Edited for information professionals who use electronically distributed databases, these articles address such topics as: (1)…
Data Stream Mining Based Dynamic Link Anomaly Analysis Using Paired Sliding Time Window Data
2014-11-01
Conference on Knowledge Discovery and Data Mining, PAKDD'10, Hyderabad, India (2010). [2] Almansoori, W., Gao, S., Jarada, T. N., Elsheikh, A. M… F., Greif, C., and Lakshmanan, L. V., "Fast Matrix Computations for Pairwise and Columnwise Commute Times and Katz Scores," Internet Mathematics, Vol…
NASA Astrophysics Data System (ADS)
Acir, Adem; Baysal, Eşref
2018-07-01
In this paper, a neutronic analysis of a laser inertial confinement fusion fission energy (LIFE) engine fuelled with plutonium and minor actinides was performed using the MCNP code. The LIFE engine fuel zone contained a mixture of 10 vol% TRISO particles and 90 vol% natural lithium coolant. The TRISO fuel compositions were Mod①: reactor grade plutonium (RG-Pu), Mod②: weapon grade plutonium (WG-Pu) and Mod③: minor actinides (MAs). Tritium breeding ratios (TBR) were computed as 1.52, 1.62 and 1.46 for Mod①, Mod② and Mod③, respectively. The operation period was computed as ∼21 years while maintaining TBR > 1.05, the reference value for a self-sustained reactor, for all investigated cases. Blanket energy multiplication values (M) were calculated as 4.18, 4.95 and 3.75 for Mod①, Mod② and Mod③, respectively. The burnup (BU) values were obtained as ∼1230, ∼1550 and ∼1060 GWd tM⁻¹, respectively. As a result, higher BU values were achieved by using TRISO particles in all cases in the LIFE engine.
Sanibel Symposium in the Petascale-Exascale Computational Era
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cheng, Hai-Ping
The 56th Sanibel Symposium was held February 14-19, 2016 at the King and Prince Hotel, St. Simons Island, GA. It successfully brought quantum chemists and chemical and condensed matter physicists together in presentations, posters, and informal discussions bridging those two communities. The Symposium has had a significant role in preparing generations of quantum theorists. As computational potency and algorithmic sophistication have grown, the Symposium has evolved to emphasize more heavily computationally oriented method development in chemistry and materials physics, including nanoscience, complex molecular phenomena, and even bio-molecular methods and problems. Given this context, the 56th Sanibel meeting systematically and deliberately had sessions focused on exascale computation. A selection of outstanding theoretical problems that need serious attention was included. Five invited sessions, two contributed sessions (hot topics), and a poster session were organized with the exascale theme. This was a historic milestone in the evolution of the Symposia. Just as years ago linear algebra, perturbation theory, density matrices, and band-structure methods dominated early Sanibel Symposia, the exascale sessions of the 56th meeting contributed a transformative influence to add structure and strength to the computational physical science community in an unprecedented way. A copy of the full program of the 56th Symposium is attached. The exascale sessions were Linear Scaling, Non-Adiabatic Dynamics, Interpretive Theory and Models, Computation, Software, and Algorithms, and Quantum Monte Carlo. The Symposium Proceedings will be published in Molecular Physics (2017). Note that the Sanibel proceedings from 2015 and 2014 were published as Molecular Physics vol. 114, issue 3-4 (2016) and vol. 113, issue 3-4 (2015), respectively.
Pediatric chest and abdominopelvic CT: organ dose estimation based on 42 patient models.
Tian, Xiaoyu; Li, Xiang; Segars, W Paul; Paulson, Erik K; Frush, Donald P; Samei, Ehsan
2014-02-01
To estimate organ dose from pediatric chest and abdominopelvic computed tomography (CT) examinations and evaluate the dependency of organ dose coefficients on patient size and CT scanner models. The institutional review board approved this HIPAA-compliant study and did not require informed patient consent. A validated Monte Carlo program was used to perform simulations in 42 pediatric patient models (age range, 0-16 years; weight range, 2-80 kg; 24 boys, 18 girls). Multidetector CT scanners were modeled on those from two commercial manufacturers (LightSpeed VCT, GE Healthcare, Waukesha, Wis; SOMATOM Definition Flash, Siemens Healthcare, Forchheim, Germany). Organ doses were estimated for each patient model for routine chest and abdominopelvic examinations and were normalized by volume CT dose index (CTDIvol). The relationships between CTDIvol-normalized organ dose coefficients and average patient diameters were evaluated across scanner models. For organs within the image coverage, CTDIvol-normalized organ dose coefficients largely showed a strong exponential relationship with the average patient diameter (R^2 > 0.9). The average percentage differences between the two scanner models were generally within 10%. For distributed organs and organs on the periphery of or outside the image coverage, the differences were generally larger (average, 3%-32%), mainly because of the effect of overranging. It is feasible to estimate patient-specific organ dose for a given examination with the knowledge of patient size and the CTDIvol. These CTDIvol-normalized organ dose coefficients enable one to readily estimate patient-specific organ dose for pediatric patients in clinical settings. This dose information, and, as appropriate, attendant risk estimations, can provide more substantive information for the individual patient for both clinical and research applications and can yield more expansive information on dose profiles across patient populations within a practice.
© RSNA, 2013.
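The strong exponential relationship reported above suggests a simple recipe: fit the CTDIvol-normalized organ dose coefficient against average patient diameter once, then estimate a patient-specific organ dose from size and the reported CTDIvol. A minimal sketch of that workflow, using hypothetical calibration numbers rather than the study's data:

```python
import math

# Hypothetical calibration data: average patient diameter d (cm) vs.
# CTDIvol-normalized organ dose coefficient h = organ dose / CTDIvol.
diameters = [12.0, 16.0, 20.0, 24.0, 28.0]
coefficients = [2.2, 1.8, 1.5, 1.25, 1.04]

# Fit h(d) = A * exp(-alpha * d) by linear least squares on ln(h).
n = len(diameters)
sx = sum(diameters)
sy = sum(math.log(h) for h in coefficients)
sxx = sum(d * d for d in diameters)
sxy = sum(d * math.log(h) for d, h in zip(diameters, coefficients))
slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
intercept = (sy - slope * sx) / n
A, alpha = math.exp(intercept), -slope

def organ_dose(diameter_cm, ctdi_vol_mgy):
    """Patient-specific organ dose estimate (mGy) from size and CTDIvol."""
    return A * math.exp(-alpha * diameter_cm) * ctdi_vol_mgy
```

With a coefficient of about 1.5 at a 20 cm diameter, a scan reported at CTDIvol = 5 mGy yields an organ dose estimate near 7.5 mGy.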
Computer-Automated Evolution of Spacecraft X-Band Antennas
NASA Technical Reports Server (NTRS)
Lohn, Jason D.; Hornby, Gregory S.; Linden, Derek S.
2010-01-01
A document discusses the use of computer-aided evolution in arriving at a design for X-band communication antennas for NASA's three Space Technology 5 (ST5) satellites, which were launched on March 22, 2006. Two evolutionary algorithms, incorporating different representations of the antenna design and different fitness functions, were used to automatically design and optimize an X-band antenna. A set of antenna designs satisfying initial ST5 mission requirements was evolved by use of these algorithms. The two best antennas - one from each evolutionary algorithm - were built. During flight-qualification testing of these antennas, the mission requirements were changed. After minimal changes in the evolutionary algorithms - mostly in the fitness functions - new antenna designs satisfying the changed mission requirements were evolved, and within one month of this change, two new antennas were designed, prototyped, and tested. One of these newly evolved antennas was approved for deployment on the ST5 mission, and flight-qualified versions of this design were built and installed on the spacecraft. At the time of writing the document, these antennas were the first computer-evolved hardware in outer space.
Optimizing a reconfigurable material via evolutionary computation
NASA Astrophysics Data System (ADS)
Wilken, Sam; Miskin, Marc Z.; Jaeger, Heinrich M.
2015-08-01
Rapid prototyping by combining evolutionary computation with simulations is becoming a powerful tool for solving complex design problems in materials science. This method of optimization operates in a virtual design space that simulates potential material behaviors and after completion needs to be validated by experiment. However, in principle an evolutionary optimizer can also operate on an actual physical structure or laboratory experiment directly, provided the relevant material parameters can be accessed by the optimizer and information about the material's performance can be updated by direct measurements. Here we provide a proof of concept of such direct, physical optimization by showing how a reconfigurable, highly nonlinear material can be tuned to respond to impact. We report on an entirely computer-controlled laboratory experiment in which a 6×6 grid of electromagnets creates a magnetic field pattern that tunes the local rigidity of a concentrated suspension of ferrofluid and iron filings. A genetic algorithm is implemented and tasked to find field patterns that minimize the force transmitted through the suspension. Searching within a space of roughly 10^10 possible configurations, after testing only 1500 independent trials the algorithm identifies an optimized configuration of layered rigid and compliant regions.
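The search loop described above can be sketched as a plain genetic algorithm over the 36 binary magnet states (6×6 grid, 2^36 ≈ 10^10 configurations). The measured transmitted force is replaced here by a toy surrogate cost that rewards alternating rigid/compliant layers, since the real objective requires the laboratory apparatus; all parameters are illustrative:

```python
import random

random.seed(0)

GRID = 36  # 6x6 electromagnet grid; each gene switches one magnet on/off

def transmitted_force(pattern):
    # Stand-in for the laboratory measurement: a toy cost counting adjacent
    # equal states, so layered (alternating) patterns score lowest.
    return sum(1 for i in range(GRID - 1) if pattern[i] == pattern[i + 1])

def evolve(pop_size=30, generations=40, mutation_rate=0.05):
    pop = [[random.randint(0, 1) for _ in range(GRID)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=transmitted_force)        # lower force = fitter
        survivors = pop[:pop_size // 2]        # truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, GRID)    # one-point crossover
            child = a[:cut] + b[cut:]
            for i in range(GRID):              # bit-flip mutation
                if random.random() < mutation_rate:
                    child[i] ^= 1
            children.append(child)
        pop = survivors + children
    return min(pop, key=transmitted_force)

best = evolve()
```

In the actual experiment the call to transmitted_force would trigger a physical impact measurement, which is what makes the evaluation budget (1500 trials) the binding constraint.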
Evaluation of Generation Alternation Models in Evolutionary Robotics
NASA Astrophysics Data System (ADS)
Oiso, Masashi; Matsumura, Yoshiyuki; Yasuda, Toshiyuki; Ohkura, Kazuhiro
For efficient implementation of Evolutionary Algorithms (EAs) in a desktop grid computing environment, we propose a new generation alternation model called Grid-Oriented-Deletion (GOD), based on comparison with conventional techniques. In previous research, generation alternation models have generally been evaluated using test functions; their exploration performance on real problems such as Evolutionary Robotics (ER) has not yet been made very clear. We therefore investigate the relationship between the exploration performance of an EA on an ER problem and its generation alternation model. We applied four generation alternation models to Evolutionary Multi-Robotics (EMR), a package-pushing problem, to investigate their exploration performance. The results show that GOD is more effective than the other conventional models.
Scheduling Earth Observing Fleets Using Evolutionary Algorithms: Problem Description and Approach
NASA Technical Reports Server (NTRS)
Globus, Al; Crawford, James; Lohn, Jason; Morris, Robert; Clancy, Daniel (Technical Monitor)
2002-01-01
We describe work in progress concerning multi-instrument, multi-satellite scheduling. Most, although not all, Earth observing instruments currently in orbit are unique. In the relatively near future, however, we expect to see fleets of Earth observing spacecraft, many carrying nearly identical instruments. This presents a substantially new scheduling challenge. Inspired by successful commercial applications of evolutionary algorithms in scheduling domains, this paper presents work in progress regarding the use of evolutionary algorithms to solve a set of Earth observing related model problems. Both the model problems and the software are described. Since the larger problems will require substantial computation and evolutionary algorithms are embarrassingly parallel, we discuss our parallelization techniques using dedicated and cycle-scavenged workstations.
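Because each candidate schedule's fitness can be evaluated independently, the evaluation step fans out naturally across workers, which is what makes evolutionary algorithms embarrassingly parallel. A minimal sketch with a toy scheduling fitness (the problem encoding and scoring are illustrative, not the paper's model; worker threads stand in for the dedicated and cycle-scavenged workstations):

```python
from concurrent.futures import ThreadPoolExecutor
import random

random.seed(1)

# Toy model problem: a "schedule" assigns observation slots to targets;
# fitness counts distinct targets observed (purely illustrative).
TARGETS, SLOTS = 8, 5

def fitness(schedule):
    return len(set(schedule))

def random_schedule():
    return [random.randrange(TARGETS) for _ in range(SLOTS)]

population = [random_schedule() for _ in range(50)]

# Fan the independent fitness evaluations out across workers; in a real
# deployment each evaluation would run on a separate machine.
with ThreadPoolExecutor(max_workers=4) as pool:
    scores = list(pool.map(fitness, population))

best_score, best_schedule = max(zip(scores, population))
```

The same map-over-population pattern scales to process pools or networked workers without changing the evolutionary loop itself.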
Nanotube Heterojunctions and Endo-Fullerenes for Nanoelectronics
NASA Technical Reports Server (NTRS)
Srivastava, Deepak; Menon, M.; Andriotis, Antonis; Cho, K.; Park, Jun; Biegel, Bryan A. (Technical Monitor)
2002-01-01
Topics discussed include: (1) Light-Weight Multi-Functional Materials: Nanomechanics; Nanotubes and Composites; Thermal/Chemical/Electrical Characterization; (2) Biomimetic/Revolutionary Concepts: Evolutionary Computing and Sensing; Self-Heating Materials; (3) Central Computing System: Molecular Electronics; Materials for Quantum Bits; and (4) Molecular Machines.
Precht, H; Kitslaar, P H; Broersen, A; Gerke, O; Dijkstra, J; Thygesen, J; Egstrup, K; Lambrechtsen, J
2017-02-01
Investigate the influence of adaptive statistical iterative reconstruction (ASIR) and the model-based IR (Veo) reconstruction algorithm in coronary computed tomography angiography (CCTA) images on quantitative measurements in coronary arteries for plaque volumes and intensities. Three patients each had three independent dose-reduced CCTA examinations performed and reconstructed with 30% ASIR (CTDIvol 6.7 mGy), 60% ASIR (CTDIvol 4.3 mGy) and Veo (CTDIvol 1.9 mGy). Coronary plaque analysis was performed for each CCTA to measure volumes, plaque burden and intensities. Plaque volume and plaque burden show a decreasing tendency from ASIR to Veo: the median plaque volume is 314 mm³ and 337 mm³ for ASIR versus 252 mm³ for Veo, and the plaque burden is 42% and 44% for ASIR versus 39% for Veo. The lumen and vessel volumes decrease slightly from 30% ASIR to 60% ASIR, from 498 mm³ to 391 mm³ for lumen volume and from 939 mm³ to 830 mm³ for vessel volume. The intensities did not change overall between the different reconstructions for either lumen or plaque. We found a tendency of decreasing plaque volumes and plaque burden but no change in intensities with the use of low dose Veo CCTA (1.9 mGy) compared to dose-reduced ASIR CCTA (6.7 mGy and 4.3 mGy), although more studies are warranted. Copyright © 2016 The College of Radiographers. Published by Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Flores-M, E.; Gamboa de Buen, I.; Buenfil, A. E.
Computed Tomography (CT) is a high dose X-ray imaging procedure and its use has rapidly increased in the last two decades, fueled by the development of helical CT. The aim of this study is to present values of the dosimetric quantities for CT paediatric examinations of thoracic and abdominal regions. The protocols studied were those of chest, lung-mediastine, chest-abdomen, pulmonary high resolution and mediastine-abdomen, which are the more common examinations performed at "Hospital Infantil de Mexico Federico Gomez" in the thoracic-abdominal region. The measurements were performed on a Siemens SOMATOM Sensation 16 CT scanner and the equipment used was a CT pencil ionization chamber connected to an electrometer. This system was calibrated for the RQT9 CT beam quality. A PMMA head phantom with a diameter of 16 cm and a length of 15 cm was also used. The dosimetric quantities measured were the weighted air kerma index (C{sub w}), the volumetric dose index (C{sub vol}) and the CT air kerma-length product. It was found that the pulmonary high resolution examination presented the highest values for C{sub w} (31.1 mGy) and C{sub vol} (11.1 mGy). The examination with the lowest values of these two quantities was the chest-abdomen protocol, with 10.5 mGy for C{sub w} and 5.5 mGy for C{sub vol}. However, this protocol presented the highest value for P{sub KL,CT} (282.2 mGy cm) when considering the average clinical length of the examinations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Turner, Adam C.; Zankl, Maria; DeMarco, John J.
2010-04-15
Purpose: Monte Carlo radiation transport techniques have made it possible to accurately estimate the radiation dose to radiosensitive organs in patient models from scans performed with modern multidetector row computed tomography (MDCT) scanners. However, there is considerable variation in organ doses across scanners, even when similar acquisition conditions are used. The purpose of this study was to investigate the feasibility of a technique to estimate organ doses that would be scanner independent. This was accomplished by assessing the ability of CTDI{sub vol} measurements to account for differences in MDCT scanners that lead to organ dose differences. Methods: Monte Carlo simulations of 64-slice MDCT scanners from each of the four major manufacturers were performed. An adult female patient model from the GSF family of voxelized phantoms was used in which all ICRP Publication 103 radiosensitive organs were identified. A 120 kVp, full-body helical scan with a pitch of 1 was simulated for each scanner using similar scan protocols across scanners. From each simulated scan, the radiation dose to each organ was obtained on a per mA s basis (mGy/mA s). In addition, CTDI{sub vol} values were obtained from each scanner for the selected scan parameters. Then, to demonstrate the feasibility of generating organ dose estimates from scanner-independent coefficients, the simulated organ dose values resulting from each scanner were normalized by the CTDI{sub vol} value for those acquisition conditions. Results: CTDI{sub vol} values across scanners showed considerable variation, as the coefficient of variation (CoV) across scanners was 34.1%. The simulated patient scans also demonstrated considerable differences in organ dose values, which varied by up to a factor of approximately 2 between some of the scanners. The CoV across scanners for the simulated organ doses ranged from 26.7% (for the adrenals) to 37.7% (for the thyroid), with a mean CoV of 31.5% across all organs.
However, when organ doses are normalized by CTDI{sub vol} values, the differences across scanners become very small. For the CTDI{sub vol}-normalized dose values, the CoVs across scanners for different organs ranged from a minimum of 2.4% (for skin tissue) to a maximum of 8.5% (for the adrenals), with a mean of 5.2%. Conclusions: This work has revealed that there is considerable variation among modern MDCT scanners in both CTDI{sub vol} and organ dose values. Because these variations are similar, CTDI{sub vol} can be used as a normalization factor with excellent results. This demonstrates the feasibility of establishing scanner-independent organ dose estimates by using CTDI{sub vol} to account for the differences between scanners.
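The normalization step the study describes is simple arithmetic. The sketch below uses made-up per-scanner numbers (not the study's data) to show how dividing organ dose by CTDIvol collapses the between-scanner spread, mirroring the drop in coefficient of variation reported above:

```python
import statistics

# Hypothetical values for four scanners at a fixed technique: per-mAs organ
# dose (mGy/mAs) and CTDIvol (mGy) vary together from scanner to scanner.
organ_dose = {"A": 0.060, "B": 0.045, "C": 0.080, "D": 0.052}
ctdi_vol   = {"A": 11.8,  "B": 9.1,   "C": 15.5,  "D": 10.6}

def cov(values):
    """Coefficient of variation (%): sample stdev relative to the mean."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

raw_cov = cov(list(organ_dose.values()))                  # large spread
normalized = [organ_dose[s] / ctdi_vol[s] for s in organ_dose]
norm_cov = cov(normalized)                                # small spread
```

With these illustrative numbers the raw spread is on the order of 25% while the CTDIvol-normalized spread falls to a few percent, which is why a single normalized coefficient can serve all scanners.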
Vision 2010: The Future of Higher Education Business and Learning Applications
ERIC Educational Resources Information Center
Carey, Patrick; Gleason, Bernard
2006-01-01
The global software industry is in the midst of a major evolutionary shift--one based on open computing--and this trend, like many transformative trends in technology, is being led by the IT staffs and academic computing faculty of the higher education industry. The elements of this open computing approach are open source, open standards, open…
Multi-Objective UAV Mission Planning Using Evolutionary Computation
2008-03-01
on a Solution Space … 41. Figure 4.3: Crowding distance calculation. Dark points are non-dominated solutions [14]. … SPEA2 was developed by Zitzler [64] as an improvement to the original SPEA algorithm [65]. … thesis, Los Angeles, CA, USA, 2003. Adviser: Maja J. Mataric. 21. Homberger, Joerg and Hermann Gehring. "Two Evolutionary Metaheuristics for the…
An Evolutionary Algorithm to Generate Ellipsoid Detectors for Negative Selection
2005-03-21
of Congress on Evolutionary Computation. Honolulu. 58. Lamont, Gary B., Robert E. Marmelstein, and David A. Van Veldhuizen. A Distributed Architecture… antibody and an antigen is a function of several processes including electrostatic interactions, hydrogen bonding, van der Waals interactions, and others [20]. … Kelly, Patrick M., Don R. Hush, and James M. White. "An Adaptive Algorithm for Modifying Hyperellipsoidal Decision Surfaces". Journal of Artificial…
2016-05-01
reduction achieved is small due to the starting shape being near optimal. The general arrangement and x-y coordinate system are shown in Figure 23. … Optimization, Vol. 28, pp. 55–68, 2004. [3] M. Heller, J. Calero, S. Barter, R. J. Wescott, J. Choi. Fatigue life extension program for LAU-7 missile launcher…
Evaluation of Available Software for Reconstruction of a Structure from its Imagery
2017-04-01
Math. 2, 164–168. Lowe, D. G. (1999) Object recognition from local scale-invariant features, in Proc. Int. Conf. Computer Vision, Vol. 2, pp. 1150–1157. Marquardt, D. (1963) An algorithm for least-squares estimation of nonlinear parameters, SIAM J. Appl. Math. 11(2), 431–441.
Fast Electromagnetic Solvers for Large-Scale Naval Scattering Problems
2008-09-27
IEEE Trans. Antennas Propag., vol. 52, no. 8, pp. 2141–2146, 2004. [12] R. J. Burkholder and J. F. Lee, "Fast dual-MGS block-factorization algorithm… Golub and C. F. Van Loan, Matrix Computations. Baltimore: The Johns Hopkins University Press, 1996. [20] W. D. Li, W. Hong, and H. X. Zhou, "Integral…
ERIC Educational Resources Information Center
Silvern, Steven B.
1990-01-01
Reviews articles in the Journal of Research in Childhood Education, Volume 5, Number 1, 1990. Topics include young children's oral language; effects of realistic versus nonrealistic play; textbook choice; written response to stereotypical and nonstereotypical story starters; and computers and kindergarten language development. (DG)
Optical Computing, 1991, Technical Digest Series, Vol. 6
1992-05-22
lasers). Compound semiconductors may satisfy these requirements. For example, optical signal amplification by two-beam coupling and amplified phase… compound semiconductors can provide this type of implementation. This paper presents results from a detailed investigation on the potential of the… conductivity to achieve high multichannel cell performance. We describe several high performance Gallium Phosphide multichannel Bragg cells which employ these…
Library Micro-Computing, Vol. 1. Reprints from the Best of "ONLINE" [and]"DATABASE."
ERIC Educational Resources Information Center
Online, Inc., Weston, CT.
Reprints of 18 articles pertaining to library microcomputing appear in this collection, the first of two volumes on this topic in a series of volumes of reprints from "ONLINE" and "DATABASE" magazines. Edited for information professionals who use electronically distributed databases, these articles address such topics as: (1) an integrated library…
Profile and Instrumentation Driven Methods for Embedded Signal Processing
2015-01-01
applications," Computers, IEEE Transactions on, vol. 37, no. 9, pp. 1088–1098, Sep 1988. [7] Massimo Ravasi and Marco Mattavelli, "High-level algorithmic… profiling," in Digital Media and its Application in Museum Heritages, Second Workshop on, Dec 2007, pp. 353–358. [15] H. Hubert, B. Stabernack, and K.-I. Wels…
NASA Technical Reports Server (NTRS)
1975-01-01
A revised user's manual for the computer program MAPSEP is presented. Major changes from the interplanetary version of MAPSEP are summarized. The changes are intended to provide a basic capability to analyze anticipated solar electric missions, and a foundation for future, more complex modifications. For Vol. III, see N75-16589.
Evolutionary Models for Simple Biosystems
NASA Astrophysics Data System (ADS)
Bagnoli, Franco
The concept of evolutionary development of structures constituted a real revolution in biology: it made it possible to understand how the very complex structures of life can arise in an out-of-equilibrium system. The investigation of such systems has shown that, indeed, systems under a flux of energy or matter can self-organize into complex patterns; think, for instance, of Rayleigh-Bénard convection, Liesegang rings, or the patterns formed by granular systems under shear. Following this line, one could characterize life as a state of matter distinguished by the slow, continuous process that we call evolution. In this paper we try to identify the organizational levels of life, which span several orders of magnitude from the elementary constituents to whole ecosystems. Although similar structures can be found in other contexts, like ideas (memes) in neural systems and self-replicating elements (computer viruses, worms, etc.) in computer systems, we shall concentrate on biological evolutionary structures and try to put into evidence the role and the emergence of network structure in such systems.
Artificial intelligence in peer review: How can evolutionary computation support journal editors?
Fronczak, Piotr; Fronczak, Agata; Ausloos, Marcel; Nedic, Olgica
2017-01-01
With the volume of manuscripts submitted for publication growing every year, the deficiencies of peer review (e.g. long review times) are becoming more apparent. Editorial strategies, sets of guidelines designed to speed up the process and reduce editors’ workloads, are treated as trade secrets by publishing houses and are not shared publicly. To improve the effectiveness of their strategies, editors in small publishing groups are faced with undertaking an iterative trial-and-error approach. We show that Cartesian Genetic Programming, a nature-inspired evolutionary algorithm, can dramatically improve editorial strategies. The artificially evolved strategy reduced the duration of the peer review process by 30%, without increasing the pool of reviewers (in comparison to a typical human-developed strategy). Evolutionary computation has typically been used in technological processes or biological ecosystems. Our results demonstrate that genetic programs can improve real-world social systems that are usually much harder to understand and control than physical systems. PMID:28931033
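Cartesian Genetic Programming, used above to evolve editorial strategies, encodes a program as a fixed-length list of node genes evaluated feed-forward. The paper's actual encoding of editorial strategies is not public, so the function set and genome below are purely illustrative; the sketch shows only the genotype-to-output evaluation at the core of CGP:

```python
import operator

# Function set for the nodes (illustrative choice).
FUNCTIONS = [operator.add, operator.sub, operator.mul, min]

def evaluate(genome, inputs, n_nodes):
    """genome: one (func_index, src_a, src_b) gene per node, then a single
    output gene. Sources index the program inputs first, then earlier
    nodes, so the graph is feed-forward by construction."""
    values = list(inputs)
    for f, a, b in genome[:n_nodes]:
        values.append(FUNCTIONS[f](values[a], values[b]))
    return values[genome[n_nodes]]  # output gene selects a node or input

# Two inputs x, y and three nodes: node2 = x + y, node3 = x * y,
# node4 = min(node2, node3); the output gene (4) returns node4.
genome = [(0, 0, 1), (2, 0, 1), (3, 2, 3), 4]
```

Evolution then proceeds by mutating the integer genes (function choices, connections, and the output gene), with unreferenced nodes acting as neutral genetic material.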
1980-01-01
is identified in the flow chart simply as "Compute VECT's (predictor solution)" and "Compute V's (corrector solution)." A significant portion of the… PRINCIPAL SUBROUTINES: WALLPOINT (ITER, DT); ITER - iteration index for the MacCormack algorithm (ITER=1 for predictor)… WEILERSTEIN, R. RAY, G. MILLER. F33615-7-C-3016. UNCLASSIFIED. GASL-TR-254-VOL-2; AFFDL-TR-79-3162-VOL-2.
1998-07-01
author's responsibility to obtain written permission to reproduce such material. … It is interesting to compare papers in the issue with previous special issues of other journals and monographs, for example [1, 2]. HPC issues first attracted… environment, in particular the Kendall Square Research KSR-1. Fast algorithms have attracted considerable attention in the CEM community, since they…
NASA Astrophysics Data System (ADS)
Rivers, M. L.; Gualda, G. A.
2009-05-01
One of the challenges in tomography is the availability of suitable software for image processing and analysis in 3D. We present here 'tomo_display' and 'vol_tools', two packages created in IDL that enable reconstruction, processing, and visualization of tomographic data. They complement in many ways the capabilities offered by Blob3D (Ketcham 2005 - Geosphere, 1: 32-41, DOI: 10.1130/GES00001.1) and, in combination, allow users without programming knowledge to perform all steps necessary to obtain qualitative and quantitative information using tomographic data. The package 'tomo_display' was created and is maintained by Mark Rivers. It allows the user to: (1) preprocess and reconstruct parallel beam tomographic data, including removal of anomalous pixels, ring artifact reduction, and automated determination of the rotation center, and (2) visualize both raw and reconstructed data, either as individual frames or as a series of sequential frames. The package 'vol_tools' consists of a series of small programs created and maintained by Guilherme Gualda to perform specific tasks not included in other packages. Existing modules include simple tools for cropping volumes, generating histograms of intensity, sample volume measurement (useful for porous samples like pumice), and computation of volume differences (for differential absorption tomography). The module 'vol_animate' can be used to generate 3D animations using rendered isosurfaces around objects. Both packages use the same NetCDF format '.volume' files created using code written by Mark Rivers. Currently, only 16-bit integer volumes are created and read by the packages, but floating point and 8-bit data can easily be stored in the NetCDF format as well. A simple GUI to convert sequences of tiffs into '.volume' files is available within 'vol_tools'.
Both 'tomo_display' and 'vol_tools' include options to (1) generate onscreen output that allows for dynamic visualization in 3D, (2) save sequences of tiffs to disk, and (3) generate MPEG movies for inclusion in presentations, publications, websites, etc. Both are freely available as run-time ('.sav') versions that can be run using the free IDL Virtual Machine TM, available from ITT Visual Information Solutions: http://www.ittvis.com/ProductServices/IDL/VirtualMachine.aspx The run-time versions of 'tomo_display' and 'vol_tools' can be downloaded from: http://cars.uchicago.edu/software/idl/tomography.html http://sites.google.com/site/voltools/
Paqué, Frank; Rechenberg, Dan-Krister; Zehnder, Matthias
2012-05-01
Hard-tissue debris is accumulated during rotary instrumentation. This study investigated to what extent a calcium-complexing agent that has good short-term compatibility with sodium hypochlorite (NaOCl) could reduce debris accumulation when applied in an all-in-one irrigant during root canal instrumentation. Sixty extracted mandibular molars with isthmuses in the mesial root canal system were selected based on prescans using a micro-computed tomography system. Thirty teeth each were randomly assigned to be instrumented with a rotary system and irrigated with either 2.5% NaOCl or 2.5% NaOCl containing 9% (wt/vol) etidronic acid (HEBP). Using a side-vented irrigating tip, 2 mL of irrigant was applied by 1 blinded investigator to the mesial canals after each instrument. Five milliliters of irrigant was applied per canal as the final rinse. Mesial root canal systems were scanned at high resolution before and after treatment, and accumulated hard-tissue debris was calculated as vol% of the original canal anatomy. Values between groups were compared using the Student's t test (α < .05). Irrigation with 2.5% NaOCl resulted in 5.5 ± 3.6 vol% accumulated hard-tissue debris compared with 3.8 ± 1.8 vol% when HEBP was contained in the irrigant (P < .05). A hypochlorite-compatible chelator can reduce but not completely prevent hard-tissue debris accumulation during rotary root canal instrumentation. Copyright © 2012 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.
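The vol% figures above come from comparing registered pre- and post-instrumentation micro-CT scans: the debris fraction is the ratio of segmented debris voxels to original canal voxels. A sketch of that computation, with voxel counts invented to reproduce the reported magnitudes (not the study's raw data):

```python
def debris_vol_percent(canal_voxels_pre, debris_voxels_post):
    """Accumulated hard-tissue debris as vol% of the original canal volume,
    from voxel counts in registered pre/post instrumentation scans."""
    return 100.0 * debris_voxels_post / canal_voxels_pre

# Illustrative counts reproducing the reported group means:
naocl = debris_vol_percent(200_000, 11_000)  # NaOCl group, ~5.5 vol%
hebp = debris_vol_percent(200_000, 7_600)    # NaOCl + HEBP group, ~3.8 vol%
```

Because the measure is a ratio against the pre-instrumentation canal volume, it is independent of absolute voxel size as long as both scans share the same resolution and registration.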
Marr's levels and the minimalist program.
Johnson, Mark
2017-02-01
A simple change to a cognitive system at Marr's computational level may entail complex changes at the other levels of description of the system. The implementational level complexity of a change, rather than its computational level complexity, may be more closely related to the plausibility of a discrete evolutionary event causing that change. Thus the formal complexity of a change at the computational level may not be a good guide to the plausibility of an evolutionary event introducing that change. For example, while the Minimalist Program's Merge is a simple formal operation (Berwick & Chomsky, 2016), the computational mechanisms required to implement the language it generates (e.g., to parse the language) may be considerably more complex. This has implications for the theory of grammar: theories of grammar which involve several kinds of syntactic operations may be no less evolutionarily plausible than a theory of grammar that involves only one. A deeper understanding of human language at the algorithmic and implementational levels could strengthen Minimalist Program's account of the evolution of language.
Soft computing approach to 3D lung nodule segmentation in CT.
Badura, P; Pietka, E
2014-10-01
This paper presents a novel, multilevel approach to the segmentation of various types of pulmonary nodules in computed tomography studies. It is based on two branches of computational intelligence: fuzzy connectedness (FC) and evolutionary computation. First, the image and auxiliary data are prepared for the 3D FC analysis during the first stage of the algorithm - mask generation. Its main goal is to process some specific types of nodules connected to the pleura or vessels. It consists of basic image processing operations as well as dedicated routines for the specific cases of nodules. The evolutionary computation is performed on the image and seed points in order to shorten the FC analysis and improve its accuracy. After the FC application, the remaining vessels are removed during the postprocessing stage. The method has been validated using the first dataset of studies acquired and described by the Lung Image Database Consortium (LIDC) and by its latest release - the LIDC-IDRI (Image Database Resource Initiative) database. Copyright © 2014 Elsevier Ltd. All rights reserved.
Applications of genetic programming in cancer research.
Worzel, William P; Yu, Jianjun; Almal, Arpit A; Chinnaiyan, Arul M
2009-02-01
The theory of Darwinian evolution is a fundamental keystone of modern biology. Late in the last century, computer scientists began adapting its principles, in particular natural selection, to complex computational challenges, leading to the emergence of evolutionary algorithms. The conceptual model of selective pressure and recombination in evolutionary algorithms allows scientists to efficiently search high-dimensional spaces for solutions to complex problems. In the last decade, genetic programming has been developed and extensively applied to the analysis of molecular data to classify cancer subtypes and to characterize the mechanisms of cancer pathogenesis and development. This article reviews current successes using genetic programming and discusses its potential impact on cancer research and treatment in the near future.
Luo, Xiongbiao; Wan, Ying; He, Xiangjian
2015-04-01
Electromagnetically guided endoscopy, which aims at accurately and robustly localizing the endoscope, involves multimodal sensory information during interventions. However, integrating this information for precise and stable endoscopic guidance remains challenging. To tackle this challenge, this paper proposes a new framework, based on an enhanced particle swarm optimization method, to effectively fuse this information for accurate and continuous endoscope localization. The authors use particle swarm optimization, a stochastic evolutionary computation algorithm, to fuse the multimodal information, including preoperative information (i.e., computed tomography images) as a frame of reference, endoscopic camera videos, and positional sensor measurements (i.e., electromagnetic sensor outputs). Because evolutionary computation methods are prone to premature convergence and are limited by fixed evolutionary factors, the authors introduce the current (endoscopic camera and electromagnetic sensor) observation to boost the particle swarm optimization and also adaptively update the evolutionary parameters in accordance with spatial constraints and the current observation, resulting in advantageous performance of the enhanced algorithm. The experimental results demonstrate that the proposed method provides a more accurate and robust endoscopic guidance framework than state-of-the-art methods. The average guidance accuracy of the authors' framework was about 3.0 mm and 5.6°, while the previous methods showed at least 3.9 mm and 7.0°. The average position and orientation smoothness of the authors' method was 1.0 mm and 1.6°, significantly better than the other methods (at least 2.0 mm and 2.6°). Additionally, the average visual quality of the endoscopic guidance was improved to 0.29.
A robust electromagnetically guided endoscopy framework was proposed on the basis of an enhanced particle swarm optimization method using the current observation information and adaptive evolutionary factors. Compared with state-of-the-art methods, the authors' proposed framework greatly reduced the guidance errors, from (4.3 mm, 7.8°) to (3.0 mm, 5.6°).
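As background, the core particle swarm update that such frameworks build on can be sketched as follows. This is a minimal, generic PSO for minimization; the paper's observation-boosting and adaptive evolutionary factors are not reproduced, and all parameter values are illustrative:

```python
import random

def pso(objective, dim, bounds, n_particles=30, iters=200,
        w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimization (minimization).

    Each particle is pulled toward its personal best and the swarm's
    global best; velocities are damped by the inertia weight w.
    """
    rng = random.Random(seed)
    lo, hi = bounds
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Example: minimize the sphere function in 3-D.
best, best_val = pso(lambda x: sum(v * v for v in x), dim=3, bounds=(-5.0, 5.0))
```

In a localization setting, the objective would instead score a candidate endoscope pose against the camera images and electromagnetic sensor readings.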
Simplified jet-A kinetic mechanism for combustor application
NASA Technical Reports Server (NTRS)
Lee, Chi-Ming; Kundu, Krishna; Ghorashi, Bahman
1993-01-01
Successful modeling of combustion and emissions in gas turbine engine combustors requires an adequate description of the reaction mechanism. For hydrocarbon oxidation, detailed mechanisms are only available for the simplest types of hydrocarbons such as methane, ethane, acetylene, and propane. These detailed mechanisms contain a large number of chemical species participating simultaneously in many elementary kinetic steps. Current computational fluid dynamic (CFD) models must include fuel vaporization, fuel-air mixing, chemical reactions, and complicated boundary geometries. Simulating these conditions requires a very sophisticated computer model, with large memory capacity and long run times. Therefore, gas turbine combustion modeling has frequently been simplified by using global reaction mechanisms, which predict only the quantities of interest: heat release rates, flame temperature, and emissions. Jet fuels are wide-boiling-range hydrocarbons with ranges extending through those of gasoline and kerosene. These fuels are chemically complex, often containing more than 300 components. Jet fuel typically can be characterized as containing 70 vol pct paraffin compounds and 25 vol pct aromatic compounds. A five-step Jet-A fuel mechanism, involving pyrolysis and subsequent oxidation of the paraffin and aromatic compounds, is presented here. The mechanism is verified by comparison with experimental Jet-A ignition delay times and with species concentrations obtained from flametube experiments. This five-step mechanism appears to perform better than the current one- and two-step mechanisms.
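Global mechanisms of the kind discussed above are commonly expressed as a small number of Arrhenius rate laws with empirical concentration exponents. The sketch below shows only the generic one-step form; every constant is a placeholder for illustration, not a fitted value from the article's five-step Jet-A mechanism:

```python
import math

def global_rate(T, fuel, o2, A=1.0e11, Ea=1.25e5, a=0.25, b=1.5):
    """Illustrative one-step global rate for hydrocarbon oxidation:

        rate = A * exp(-Ea / (R*T)) * [Fuel]**a * [O2]**b

    T in kelvin, concentrations in arbitrary consistent units; A, Ea,
    a, b are empirical placeholders, not the article's fitted values.
    """
    R = 8.314  # universal gas constant, J/(mol K)
    return A * math.exp(-Ea / (R * T)) * fuel**a * o2**b
```

A multi-step global mechanism, like the article's five-step scheme, chains several such rate laws (pyrolysis steps feeding oxidation steps), which is what allows it to reproduce ignition delay as well as heat release.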
Generalizing roughness: experiments with flow-oriented roughness
NASA Astrophysics Data System (ADS)
Trevisani, Sebastiano
2015-04-01
Surface texture analysis applied to High Resolution Digital Terrain Models (HRDTMs) improves the capability to characterize fine-scale morphology and permits the derivation of useful morphometric indexes. An important indicator to be taken into account in surface texture analysis is surface roughness, which can have a discriminant role in the detection of different geomorphic processes and factors. The evaluation of surface roughness is generally performed by considering it an isotropic surface parameter (e.g., Cavalli, 2008; Grohmann, 2011). However, surface texture often has an anisotropic character, which means that surface roughness can change according to the considered direction. In some applications, for example those involving surface flow processes, the anisotropy of roughness should be taken into account (e.g., Trevisani, 2012; Smith, 2014). Accordingly, we test the application of a flow-oriented directional measure of roughness, computed considering surface gravity-driven flow. For the calculation of flow-oriented roughness we use both classical variogram-based roughness (e.g., Herzfeld, 1996; Atkinson, 2000) and an ad-hoc developed robust modification of the variogram (i.e., MAD; Trevisani, 2014). The presented approach, based on a D8 algorithm, shows the potential impact of considering directionality in the calculation of roughness indexes. The use of flow-oriented roughness could improve the definition of effective proxies of impedance to flow. Preliminary results on the integration of directional roughness operators with morphometric-based models are promising and can be extended to more complex approaches. Atkinson, P.M., Lewis, P., 2000. Geostatistical classification for remote sensing: an introduction. Computers & Geosciences 26, 361-371. Cavalli, M., Marchi, L., 2008. Characterization of the surface morphology of an alpine alluvial fan using airborne LiDAR. Natural Hazards and Earth System Science 8 (2), 323-333.
Grohmann, C.H., Smith, M.J., Riccomini, C., 2011. Multiscale analysis of topographic surface roughness in the Midland Valley, Scotland. IEEE Transactions on Geoscience and Remote Sensing 49, 1200-1213. Herzfeld, U.C., Higginson, C.A., 1996. Automated geostatistical seafloor classification - principles, parameters, feature vectors, and discrimination criteria. Computers and Geosciences 22 (1), 35-52. Smith, M.W., 2014. Roughness in the Earth sciences. Earth-Science Reviews 136, 202-225. Trevisani, S., Cavalli, M., Marchi, L., 2012. Surface texture analysis of a high-resolution DTM: interpreting an alpine basin. Geomorphology 161-162, 26-39. Trevisani, S., Rocca, M., 2014. Geomorphometric analysis of fine-scale morphology for extensive areas: a new surface-texture operator. Geophysical Research Abstracts, Vol. 16, EGU2014-5612, EGU General Assembly 2014.
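The variogram-based directional roughness mentioned above can be illustrated with a minimal sketch: the empirical semivariogram of elevation differences along a chosen grid direction. A flow-oriented version would take the direction from a D8 flow grid, and the robust MAD variant is omitted; this generic form is an illustration, not the authors' operator:

```python
import numpy as np

def directional_semivariance(dtm, dy, dx):
    """Empirical semivariogram of a gridded DTM along one direction.

    Pairs each cell z(r, c) with z(r + dy, c + dx) inside the grid and
    returns gamma = 0.5 * mean((z1 - z2)**2); larger values indicate
    rougher terrain along the (dy, dx) direction.
    """
    h, w = dtm.shape
    r0, r1 = max(0, -dy), min(h, h - dy)
    c0, c1 = max(0, -dx), min(w, w - dx)
    a = dtm[r0:r1, c0:c1]
    b = dtm[r0 + dy:r1 + dy, c0 + dx:c1 + dx]
    return 0.5 * float(np.mean((a - b) ** 2))

# A plane tilted along x: rough east-west, perfectly smooth north-south.
dtm = np.tile(np.arange(4.0), (4, 1))
```

On this tilted plane the east-west semivariance is positive while the north-south one is zero, which is exactly the anisotropy of roughness the abstract describes.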
Day, Troy
2012-01-01
The process of evolutionary diversification unfolds in a vast genotypic space of potential outcomes. During the past century, there have been remarkable advances in the development of theory for this diversification, and the theory's success rests, in part, on the scope of its applicability. A great deal of this theory focuses on a relatively small subset of the space of potential genotypes, chosen largely based on historical or contemporary patterns, and then predicts the evolutionary dynamics within this pre-defined set. To what extent can such an approach be pushed to a broader perspective that accounts for the potential open-endedness of evolutionary diversification? There have been a number of significant theoretical developments along these lines but the question of how far such theory can be pushed has not been addressed. Here a theorem is proven demonstrating that, because of the digital nature of inheritance, there are inherent limits on the kinds of questions that can be answered using such an approach. In particular, even in extremely simple evolutionary systems, a complete theory accounting for the potential open-endedness of evolution is unattainable unless evolution is progressive. The theorem is closely related to Gödel's incompleteness theorem, and to the halting problem from computability theory. PMID:21849390
Framework for computationally efficient optimal irrigation scheduling using ant colony optimization
USDA-ARS?s Scientific Manuscript database
A general optimization framework is introduced with the overall goal of reducing search space size and increasing the computational efficiency of evolutionary algorithm application for optimal irrigation scheduling. The framework achieves this goal by representing the problem in the form of a decisi...
Pervasive Computing and Communication Technologies for U-Learning
ERIC Educational Resources Information Center
Park, Young C.
2014-01-01
The development of digital information transfer, storage and communication methods influences a significant effect on education. The assimilation of pervasive computing and communication technologies marks another great step forward, with Ubiquitous Learning (U-learning) emerging for next generation learners. In the evolutionary view the 5G (or…
Langley's CSI evolutionary model: Phase O
NASA Technical Reports Server (NTRS)
Belvin, W. Keith; Elliott, Kenny B.; Horta, Lucas G.; Bailey, Jim P.; Bruner, Anne M.; Sulla, Jeffrey L.; Won, John; Ugoletti, Roberto M.
1991-01-01
A testbed for the development of Controls Structures Interaction (CSI) technology to improve space science platform pointing is described. The evolutionary nature of the testbed will permit the study of global line-of-sight pointing in phases 0 and 1, whereas multipayload pointing systems will be studied beginning with phase 2. The design, capabilities, and typical dynamic behavior of the phase 0 version of the CSI evolutionary model (CEM) are documented for investigators both internal and external to NASA. The model description includes line-of-sight pointing measurement, testbed structure, actuators, sensors, and real-time computers, as well as finite element and state space models of major components.
Kumar, S; Gadagkar, S R
2000-12-01
The neighbor-joining (NJ) method is widely used in reconstructing large phylogenies because of its computational speed and its high accuracy in phylogenetic inference, as revealed in computer simulation studies. However, most computer simulation studies have quantified the overall performance of the NJ method in terms of the percentage of branches inferred correctly or the percentage of replications in which the correct tree is recovered. We have examined other aspects of its performance, such as the relative efficiency in correctly reconstructing shallow (close to the external branches of the tree) and deep branches in large phylogenies; the contribution of zero-length branches to topological errors in the inferred trees; and the influence of increasing the tree size (number of sequences), evolutionary rate, and sequence length on the efficiency of the NJ method. Results show that the correct reconstruction of deep branches is no more difficult than that of shallower branches. The presence of zero-length branches in realized trees contributes significantly to the overall error observed in the NJ tree, especially in large phylogenies or slowly evolving genes. Furthermore, tree size does not influence the efficiency of NJ in reconstructing shallow and deep branches in our simulation study, in which the evolutionary process is assumed to be homogeneous in all lineages.
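For reference, the classic NJ joining criterion (Saitou and Nei's Q-matrix) can be sketched compactly. This minimal version returns only the join topology as nested tuples and omits branch-length estimation, which the full algorithm also provides:

```python
def neighbor_joining(labels, dist):
    """Classic neighbor-joining on a symmetric distance matrix.

    dist is a dict {(i, j): d} with one entry per unordered pair of
    labels (it is extended in place with distances to internal nodes).
    Returns the unrooted join topology as nested tuples.
    """
    nodes = list(labels)

    def d(a, b):
        if a == b:
            return 0.0
        return dist[(a, b)] if (a, b) in dist else dist[(b, a)]

    while len(nodes) > 2:
        n = len(nodes)
        r = {a: sum(d(a, b) for b in nodes) for a in nodes}
        pairs = [(nodes[x], nodes[y]) for x in range(n) for y in range(x + 1, n)]
        # Q-criterion: join the pair minimizing (n-2)*d(i,j) - r_i - r_j.
        i, j = min(pairs, key=lambda p: (n - 2) * d(*p) - r[p[0]] - r[p[1]])
        u = (i, j)  # new internal node replacing i and j
        for k in nodes:
            if k not in (i, j):
                dist[(u, k)] = 0.5 * (d(i, k) + d(j, k) - d(i, j))
        nodes = [k for k in nodes if k not in (i, j)] + [u]
    return (nodes[0], nodes[1])
```

On an additive four-taxon matrix where A,B and C,D are true neighbors, the sketch recovers the ((A,B),(C,D)) topology.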
Spore: Spawning Evolutionary Misconceptions?
NASA Astrophysics Data System (ADS)
Bean, Thomas E.; Sinatra, Gale M.; Schrader, P. G.
2010-10-01
The use of computer simulations as educational tools may afford the means to develop understanding of evolution as a natural, emergent, and decentralized process. However, special consideration of developmental constraints on learning may be necessary when using these technologies. Specifically, the essentialist (biological forms possess an immutable essence), teleological (assignment of purpose to living things and/or parts of living things that may not be purposeful), and intentionality (assumption that events are caused by an intelligent agent) biases may be reinforced through the use of computer simulations, rather than addressed with instruction. We examine the video game Spore for its depiction of evolutionary content and its potential to reinforce these cognitive biases. In particular, we discuss three pedagogical strategies to mitigate weaknesses of Spore and other computer simulations: directly targeting misconceptions through refutational approaches, targeting specific principles of scientific inquiry, and directly addressing issues related to models as cognitive tools.
Parallel Evolutionary Optimization for Neuromorphic Network Training
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schuman, Catherine D; Disney, Adam; Singh, Susheela
One of the key impediments to the success of current neuromorphic computing architectures is the issue of how best to program them. Evolutionary optimization (EO) is one promising programming technique; in particular, its wide applicability makes it especially attractive for neuromorphic architectures, which can have many different characteristics. In this paper, we explore different facets of EO on a spiking neuromorphic computing model called DANNA. We focus on the performance of EO in the design of our DANNA simulator, and on how to structure EO on both multicore and massively parallel computing systems. We evaluate how our parallel methods impact the performance of EO on Titan, the U.S.'s largest open science supercomputer, and BOB, a Beowulf-style cluster of Raspberry Pi's. We also focus on how to improve the EO by evaluating commonality in higher performing neural networks, and present the result of a study that evaluates the EO performed by Titan.
Eirín-López, José M
2013-01-01
The study of chromatin constitutes one of the most active research fields in life sciences, being subject to constant revisions that continuously redefine the state of the art in its knowledge. As every other rapidly changing field, chromatin biology requires clear and straightforward educational strategies able to efficiently translate such a vast body of knowledge to the classroom. With this aim, the present work describes a multidisciplinary computer lab designed to introduce undergraduate students to the dynamic nature of chromatin, within the context of the one semester course "Chromatin: Structure, Function and Evolution." This exercise is organized in three parts including (a) molecular evolutionary biology of histone families (using the H1 family as example), (b) histone structure and variation across different animal groups, and (c) effect of histone diversity on nucleosome structure and chromatin dynamics. By using freely available bioinformatic tools that can be run on common computers, the concept of chromatin dynamics is interactively illustrated from a comparative/evolutionary perspective. At the end of this computer lab, students are able to translate the bioinformatic information into a biochemical context in which the relevance of histone primary structure on chromatin dynamics is exposed. During the last 8 years this exercise has proven to be a powerful approach for teaching chromatin structure and dynamics, allowing students a higher degree of independence during the processes of learning and self-assessment. Copyright © 2013 International Union of Biochemistry and Molecular Biology, Inc.
A Multicenter Study of Volumetric Computed Tomography for Staging Malignant Pleural Mesothelioma
Rusch, Valerie W.; Gill, Ritu; Mitchell, Alan; Naidich, David; Rice, David C.; Pass, Harvey I.; Kindler, Hedy; De Perrot, Marc; Friedberg, Joseph
2016-01-01
Background: Standard imaging modalities are inaccurate in staging malignant pleural mesothelioma (MPM). Single-institution studies suggest that volumetric computed tomography (VolCT) is more accurate but labor intensive. We established a multicenter network to test interobserver variability, accuracy (relative to pathologic stage), and prognostic significance of semi-automated VolCT. Methods: Six institutions electronically submitted clinical and pathologic data to an established multicenter database on patients with MPM who had surgery. Institutional radiologists reviewed preoperative CT scans for quality and then submitted them via an electronic network (AG Mednet) to the biostatistical center (BC). Two reference radiologists, blinded to clinical data, performed semi-automated tumor volume calculations using commercially available software (Vitrea Enterprise 6.0), then submitted readings to the BC. Study endpoints included feasibility of the network; interobserver variability for VolCT; correlation of tumor volume with pTN stages; and overall survival (OS). Results: Of 164 cases, 129 were analyzable and read by the reference radiologists. Most tumors were <500 cm3. A small bias was observed between readers, as one provided consistently larger measurements than the other (mean difference = 47.9, p = .0027), but for 80% of cases the absolute difference was ≤200 cm3. The Spearman correlation between readers was 0.822. Volume correlated with pTN stages and OS, best defined by three groups with average volumes of 91.2, 245.3, and 511.3 cm3, associated with median OS of 37, 18, and 8 months, respectively. Conclusions: For the first time, a multicenter network was established and initial correlations of tumor volume with pTN stages and OS were shown. A larger multicenter international study is planned to confirm the results and refine the correlations. PMID:27596916
1988-01-01
A Generator for Natural Language Interfaces," Computational Linguistics, Vol. 11, Number 4, October-December, 1985, pp. 219-242. de Joia, A., and...employ in order to communicate to their intended audience. Production, therefore, encompasses issues of deciding what is pertinent as well as de...rhetorical predicates; design of a system motivated by the desire for domain and language independence, semantic connection of the generation system
Recent Naval Postgraduate School Publications
1988-09-30
Disciplines, Dallas, TX, Mar., 1986. Suchan, J. Businesspeople's resistance to the plain language movement. The Assoc. for Business Communication West...Suchan, J; Scott, C. Plain talk across the bargaining table: unclear contract language and its effect on corporate culture. Business Horizons, vol. 29...Database Symp., Tokyo, Japan, Aug., 1986. Berzins, V. The design of software interfaces in spec. International Conference on Computer Languages, Miami
ERIC Educational Resources Information Center
ERIC Clearinghouse on Reading and Communication Skills, Urbana, IL.
This collection of abstracts is part of a continuing series providing information on recent doctoral dissertations. The 22 titles deal with the following topics: computer-assisted instruction; school characteristics and reading achievement; the process of reading acquisition; on-task behavior, teacher involvement, and reading achievement; the…
ERIC Educational Resources Information Center
Fjallbrant, Nancy, Ed.
1985-01-01
Papers presented at an August 1984 international seminar on online user education include "Library Policies and Strategies in The Netherlands" (Chris J. van Wijk, The Netherlands); "Promotion and Marketing of Library Services" (Nancy Fjallbrant, Sweden); "Library Promotion by Computer" (Ian Malley, United Kingdom); "Library User Education and…
Risley, Casey A.L.; Zydlewski, Joseph D.
2011-01-01
Assessing the Effects of Catch-and-Release Regulations on a Brook Trout Population Using an Age-Structured Model. North American Journal of Fisheries Management, Vol 30, No 6
System. A Newsletter for Educational Technology and Language Learning Systems. Vol. 2, No. 3.
ERIC Educational Resources Information Center
Davies, Norman F., Ed.; Allen, John R., Ed.
This issue begins with an editorial comment on the journal's areas of interest. The articles are concerned with the following topics: (1) English composition and the use of the computer (Peter Zoller); (2) the teacher and the language laboratory (L. Ross and B. D. Sadler); (3) language aptitude tests in the language laboratory (in German, Peter…
Temporal Reasoning and Default Logics.
1985-10-01
Aritificial Intelligence ", Computer Science Research Report, Yale University, forthcoming (1985). . 74 .-, A Axioms for Describing Persistences and Clipping...34Circumscription - A Form of Non-Monotonic Reasoning", Artificial Intelligence , vol. 13 (1980), pp. 27-39. [13] McCarthy, John, "Applications of...and P. J. Hayes, "Some philosophical problems from the standpoint of artificial intelligence ", in: B. Meltzer and D. Michie (eds.), Machine
Inquiry-Based Learning of Molecular Phylogenetics
ERIC Educational Resources Information Center
Campo, Daniel; Garcia-Vazquez, Eva
2008-01-01
Reconstructing phylogenies from nucleotide sequences is a challenge for students because it strongly depends on evolutionary models and computer tools that are frequently updated. We present here an inquiry-based course aimed at learning how to trace a phylogeny based on sequences existing in public databases. Computer tools are freely available…
2002-03-07
Michalewicz, Eds., Evolutionary Computation 1: Basic Algorithms and Operators, Institute of Physics, Bristol (UK), 2000. [3] David A. Van Veldhuizen...2000. [4] Carlos A. Coello Coello, David A. Van Veldhuizen, and Gary B. Lamont, Evolutionary Algorithms for Solving Multi-Objective Problems, Kluwer...Academic Publishers, 233 Spring St., New York, NY 10013, 2002. [5] David A. Van Veldhuizen, Multiobjective Evolutionary Algorithms: Classifications
Wright, Cameron H G; Barrett, Steven F; Pack, Daniel J
2005-01-01
We describe a new approach to attacking the problem of robust computer vision for mobile robots. The overall strategy is to mimic the biological evolution of animal vision systems. Our basic imaging sensor is based upon the eye of the common house fly, Musca domestica. The computational algorithms are a mix of traditional image processing, subspace techniques, and multilayer neural networks.
Automated Antenna Design with Evolutionary Algorithms
NASA Technical Reports Server (NTRS)
Hornby, Gregory S.; Globus, Al; Linden, Derek S.; Lohn, Jason D.
2006-01-01
Current methods of designing and optimizing antennas by hand are time and labor intensive, and limit complexity. Evolutionary design techniques can overcome these limitations by searching the design space and automatically finding effective solutions. In recent years, evolutionary algorithms have shown great promise in finding practical solutions in large, poorly understood design spaces. In particular, spacecraft antenna design has proven tractable to evolutionary design techniques. Researchers have been investigating evolutionary antenna design and optimization since the early 1990s, and the field has grown in recent years as computer speed has increased and electromagnetic simulators have improved. Two requirements-compliant antennas, one for ST5 and another for TDRS-C, have been automatically designed by evolutionary algorithms. The ST5 antenna is slated to fly this year, and a TDRS-C phased array element has been fabricated and tested. Such automated evolutionary design is enabled by medium-to-high quality simulators and fast modern computers to evaluate computer-generated designs. Evolutionary algorithms automate cut-and-try engineering, substituting automated search through millions of potential designs for intelligent search by engineers through a much smaller number of designs. For evolutionary design, the engineer chooses the evolutionary technique, parameters and the basic form of the antenna, e.g., single wire for ST5 and crossed-element Yagi for TDRS-C. Evolutionary algorithms then search for optimal configurations in the space defined by the engineer. NASA's Space Technology 5 (ST5) mission will launch three small spacecraft to test innovative concepts and technologies. Advanced evolutionary algorithms were used to automatically design antennas for ST5. The combination of wide beamwidth for a circularly-polarized wave and wide impedance bandwidth made for a challenging antenna design problem.
From past experience in designing wire antennas, we chose to constrain the evolutionary design to a monopole wire antenna. The results of the runs produced requirements-compliant antennas that were subsequently fabricated and tested. The evolved antenna has a number of advantages with regard to power consumption, fabrication time and complexity, and performance. Lower power requirements result from achieving high gain across a wider range of elevation angles, thus allowing a broader range of angles over which maximum data throughput can be achieved. Since the evolved antenna does not require a phasing circuit, less design and fabrication work is required. In terms of overall work, the evolved antenna required approximately three person-months to design and fabricate whereas the conventional antenna required about five. Furthermore, when the mission was modified and new orbital parameters selected, a redesign of the antenna to new requirements was required. The evolutionary system was rapidly modified and a new antenna evolved in a few weeks. The evolved antenna was shown to be compliant to the ST5 mission requirements. It has an unusual organic looking structure, one that expert antenna designers would not likely produce. This antenna has been tested, baselined and is scheduled to fly this year. In addition to the ST5 antenna, our laboratory has evolved an S-band phased array antenna element design that meets the requirements for NASA's TDRS-C communications satellite scheduled for launch early next decade. A combination of fairly broad bandwidth, high efficiency and circular polarization at high gain made for another challenging design problem. We chose to constrain the evolutionary design to a crossed-element Yagi antenna. The specification called for two types of elements, one for receive only and one for transmit/receive. 
We were able to evolve a single element design that meets both specifications, thereby simplifying the antenna and reducing testing and integration costs. The highest performance antenna found using a genetic algorithm and stochastic hill-climbing has been fabricated and tested. Laboratory results correspond well with simulation. Aerospace component design is an expensive and important step in space development. Evolutionary design can make a significant contribution wherever sufficiently fast, accurate and capable software simulators are available. We have demonstrated successful real-world design in the spacecraft antenna domain; and there is good reason to believe that these results could be replicated in other design spaces.
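The automated cut-and-try loop described above can be sketched with a minimal real-coded genetic algorithm. The toy fitness function stands in for an electromagnetic simulator scoring candidate antenna parameters; all parameter values are illustrative, not those of the ST5 or TDRS-C runs:

```python
import random

def evolve(fitness, dim, pop_size=40, gens=100, mut_rate=0.2,
           mut_scale=0.3, bounds=(-1.0, 1.0), seed=1):
    """Minimal real-coded genetic algorithm (maximization).

    Tournament selection picks parents, uniform crossover mixes their
    genes, Gaussian mutation perturbs them, and elitism keeps the two
    best designs from each generation.
    """
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(gens):
        scored = sorted(pop, key=fitness, reverse=True)
        next_pop = scored[:2]  # elitism
        while len(next_pop) < pop_size:
            p1 = max(rng.sample(pop, 3), key=fitness)  # tournament selection
            p2 = max(rng.sample(pop, 3), key=fitness)
            child = [rng.choice(pair) for pair in zip(p1, p2)]  # uniform crossover
            for d in range(dim):
                if rng.random() < mut_rate:  # Gaussian mutation, clipped to bounds
                    child[d] = min(hi, max(lo, child[d] + rng.gauss(0, mut_scale)))
            next_pop.append(child)
        pop = next_pop
    return max(pop, key=fitness)

# Toy stand-in for a simulator: peak fitness at parameters (0.5, ..., 0.5).
best = evolve(lambda x: -sum((v - 0.5) ** 2 for v in x), dim=4)
```

In a real run each fitness evaluation would invoke the electromagnetic simulator on the candidate geometry, which is why fast, accurate simulators are the enabling ingredient the article emphasizes.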
NASA Astrophysics Data System (ADS)
Vasant, Pandian; Barsoum, Nader
2008-10-01
Many engineering, science, information technology and management optimization problems can be considered as nonlinear programming real-world problems in which all or some of the parameters and variables involved are uncertain in nature. These can only be quantified using intelligent computational techniques such as evolutionary computation and fuzzy logic. The main objective of this research paper is to solve a nonlinear fuzzy optimization problem, in which the technological coefficients in the constraints are fuzzy numbers represented by logistic membership functions, using a hybrid evolutionary optimization approach. To explore the applicability of the present study, a numerical example is considered that determines the production plan for the decision variables and the profit of the company.
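An S-shaped (logistic) membership function of the kind the abstract mentions can be sketched as follows. The parameterization here (acceptable limit b, unacceptable limit c, vagueness gamma) is one illustrative choice, not necessarily the authors' exact formulation:

```python
import math

def logistic_membership(x, b, c, gamma=6.0):
    """S-shaped logistic membership degree for a fuzzy coefficient.

    The degree is near 1 for x below b, near 0 for x above c, and
    exactly 0.5 at the midpoint; gamma sets the steepness (i.e., the
    vagueness) of the transition between the two limits.
    """
    mid = 0.5 * (b + c)
    alpha = gamma / (c - b)
    return 1.0 / (1.0 + math.exp(alpha * (x - mid)))
```

In a fuzzy nonlinear program, each uncertain technological coefficient carries such a membership function, and the solver trades off the objective value against the membership degrees of the coefficients it assumes.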
NASA Technical Reports Server (NTRS)
Keymeulen, Didier; Ferguson, Michael I.; Fink, Wolfgang; Oks, Boris; Peay, Chris; Terrile, Richard; Cheng, Yen; Kim, Dennis; MacDonald, Eric; Foor, David
2005-01-01
We propose an evolutionary-computation-based tuning method that efficiently increases the sensitivity of MEMS gyroscopes. The tuning method was tested for the second-generation JPL/Boeing post-resonator MEMS gyroscope using the measurement of the frequency response of the MEMS device in open-loop operation. We also report on the development of a hardware platform for integrated tuning and closed-loop operation of MEMS gyroscopes. The control of this device is implemented through a digital design on a Field Programmable Gate Array (FPGA). The hardware platform easily transitions to an embedded solution that allows for the miniaturization of the system to a single chip.
Discontinuous Galerkin Method with Numerical Roe Flux for Spherical Shallow Water Equations
NASA Astrophysics Data System (ADS)
Yi, T.; Choi, S.; Kang, S.
2013-12-01
In developing the dynamic core of a numerical weather prediction model with the discontinuous Galerkin method, the numerical flux at the boundaries of grid elements plays a vital role, since it preserves the local conservation properties and has a significant impact on the accuracy and stability of numerical solutions. For these reasons, we developed the numerical Roe flux, based on an approximate Riemann problem, for the spherical shallow water equations in Cartesian coordinates [1] to find out its stability and accuracy. To compare its performance with a counterpart flux, we used the Lax-Friedrichs flux, which has been used in many dynamic cores such as NUMA [1], CAM-DG [2] and MCore [3] because of its simplicity. The Lax-Friedrichs flux is implemented as a flux difference between left and right states plus the maximum characteristic wave speed across the boundaries of elements. It has been shown that the Lax-Friedrichs flux with the finite volume method is more dissipative and less stable than other numerical fluxes such as HLLC, AUSM+ and Roe. The Roe flux implemented in this study is based on the decomposition of the flux difference over the element boundaries, where the nonlinear equations are linearized. It is rarely used in dynamic cores due to its complexity and thus its computational expense. To compare the stability and accuracy of the Roe flux with the Lax-Friedrichs flux, two- and three-dimensional test cases are performed on a plane and a cubed sphere, respectively, with various numbers of elements and polynomial orders. For the two-dimensional case, a Gaussian bell is simulated on the plane with two different numbers of elements at fixed polynomial orders. In the three-dimensional cases on the cubed sphere, we performed the test cases of a zonal flow over an isolated mountain and a Rossby-Haurwitz wave, whose initial conditions are the same as those of Williamson [4].
This study showed that the Roe flux with the discontinuous Galerkin method is less dissipative and has stronger numerical stability than the Lax-Friedrichs flux. References: 1. Giraldo, F.X., Hesthaven, J.S. and Warburton, T., "Nodal High-Order Discontinuous Galerkin Methods for the Spherical Shallow Water Equations," Journal of Computational Physics, Vol. 181, pp. 499-525, 2002. 2. Nair, R.D., Thomas, S.J. and Loft, R.D., "A Discontinuous Galerkin Transport Scheme on the Cubed Sphere," Monthly Weather Review, Vol. 133, pp. 814-828, 2005. 3. Ullrich, P.A., Jablonowski, C. and van Leer, B., "High-Order Finite-Volume Methods for the Shallow-Water Equations on the Sphere," Journal of Computational Physics, Vol. 229, pp. 6104-6134, 2010. 4. Williamson, D.L., Drake, J.B., Hack, J., Jacob, R. and Swartztrauber, P.N., "A Standard Test Set for Numerical Approximations to the Shallow Water Equations in Spherical Geometry," Journal of Computational Physics, Vol. 102, pp. 211-224, 1992.
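The Lax-Friedrichs flux described above (the average of the left and right physical fluxes plus a dissipation term scaled by the maximum characteristic wave speed) can be sketched for the 1-D shallow water equations as follows. This is an illustrative Python sketch, not the authors' code; the function names and the test state are assumptions.

```python
import math

G = 9.81  # gravitational acceleration (m/s^2)

def swe_flux(h, hu):
    """Physical flux of the 1-D shallow water equations for state (h, hu)."""
    u = hu / h
    return (hu, hu * u + 0.5 * G * h * h)

def lax_friedrichs_flux(left, right):
    """Lax-Friedrichs numerical flux: the average of the left/right physical
    fluxes minus half the maximum characteristic speed times the state jump."""
    fl, fr = swe_flux(*left), swe_flux(*right)
    # maximum wave speed |u| + sqrt(g*h) over the two interface states
    lam = max(abs(left[1] / left[0]) + math.sqrt(G * left[0]),
              abs(right[1] / right[0]) + math.sqrt(G * right[0]))
    return tuple(0.5 * (a + b) - 0.5 * lam * (qr - ql)
                 for a, b, ql, qr in zip(fl, fr, left, right))

# With identical left/right states the dissipation term vanishes and the
# numerical flux reduces to the physical flux.
state = (2.0, 1.0)  # depth h = 2 m, discharge hu = 1 m^2/s
print(lax_friedrichs_flux(state, state))
```

With distinct interface states the extra term adds dissipation proportional to the jump, which is the source of the excess dissipation the abstract attributes to this flux.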
NASA Astrophysics Data System (ADS)
Karp, Matthew Eugene
Lithium-ion (rechargeable) and lithium-metal (non-rechargeable) battery cells put aircraft at risk of igniting and fueling fires. Lithium batteries can be packed in bulk and shipped in the cargo holds of freighter aircraft; currently lithium batteries are banned from bulk shipment on passenger aircraft [1]. The federally regulated Class C cargo compartment extinguishing system's use of a 5 %vol Halon 1301 knockdown concentration and a sustained 3 %vol Halon 1301 concentration may not be sufficient to inert lithium-ion battery vent gas and air mixtures [2]. At 5 %vol Halon 1301, the flammability limits of lithium-ion premixed battery vent gas (Li-Ion pBVG) in air range from 13.80 %vol to 26.07 %vol Li-Ion pBVG. Testing suggests that 8.59 %vol Halon 1301 is required to render all ratios of Li-Ion pBVG in air inert. The lower flammability limit (LFL) and upper flammability limit (UFL) of hydrogen and air mixtures are 4.95 %vol and 76.52 %vol hydrogen, respectively. With the addition of 10 %vol and 20 %vol Halon 1301, the LFL is 9.02 %vol and 11.55 %vol hydrogen, respectively, and the UFL is 45.70 %vol and 28.39 %vol hydrogen, respectively. The minimum inerting concentration (MIC) of Halon 1301 in hydrogen and air mixtures is 26.72 %vol Halon 1301 at 16.2 %vol hydrogen. The LFL and UFL of Li-Ion pBVG and air mixtures are 7.88 %vol and 37.14 %vol Li-Ion pBVG, respectively. With the addition of 5 %vol, 7 %vol, and 8 %vol Halon 1301, the LFL is 13.80 %vol, 16.15 %vol, and 17.62 %vol Li-Ion pBVG, respectively, and the UFL is 26.07 %vol, 23.31 %vol, and 21.84 %vol Li-Ion pBVG, respectively. The MIC of Halon 1301 in Li-Ion pBVG and air mixtures is 8.59 %vol Halon 1301 at 19.52 %vol Li-Ion pBVG. Le Chatelier's mixing rule has been shown to be effective for estimating the flammability limits of Li-Ion pBVGs: the estimated LFL differs from measurement by 1.79 %, and the UFL by 4.53 %. 
The state of charge (SOC) affects the flammability limits in an apparent parabolic manner, where the widest flammability limits are at or near 100 % SOC. [1] IATA. Lithium Battery Guidance Document. 7 Jan. 2016. Guidance for complying with provisions applicable to the transport by air of lithium batteries as set out in the 57th Edition of the IATA Dangerous Goods Regulations (DGR). [2] Webster, Harry. Flammability assessment of bulk-packed, rechargeable lithium-ion cells in transport category aircraft. Office of Aviation Research, Federal Aviation Administration, 2006.
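Le Chatelier's mixing rule mentioned above estimates a fuel mixture's flammability limit from the limits of its components; a minimal sketch follows. The blend composition and the component LFL values below are illustrative assumptions, not the thesis' measured vent-gas data.

```python
def le_chatelier_lfl(components):
    """Estimate the LFL (%vol) of a fuel blend via Le Chatelier's rule:
    LFL_mix = 100 / sum(y_i / LFL_i), where y_i is the %vol share of
    fuel i within the blend (shares normalized to sum to 100)."""
    total = sum(y for y, _ in components)
    return 100.0 / sum((100.0 * y / total) / lfl for y, lfl in components)

# Hypothetical vent-gas blend: 30 %vol H2 (LFL ~4.0), 50 %vol CO (LFL ~12.5),
# 20 %vol CH4 (LFL ~5.0); each tuple is (share %vol, component LFL %vol).
blend = [(30.0, 4.0), (50.0, 12.5), (20.0, 5.0)]
print(round(le_chatelier_lfl(blend), 2))
```

For a single-component "blend" the rule reduces to that component's own LFL, a convenient sanity check.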
Evolution of Collective Behaviour in an Artificial World Using Linguistic Fuzzy Rule-Based Systems
Demšar, Jure; Lebar Bajec, Iztok
2017-01-01
Collective behaviour is a fascinating and easily observable phenomenon, attractive to a wide range of researchers. In biology, computational models have been extensively used to investigate various properties of collective behaviour, such as transfer of information across the group, benefits of grouping (defence against predation, foraging), the group decision-making process, and group behaviour types. The question of 'why', however, remains largely unanswered. Here the interest lies in which pressures led to the evolution of such behaviour, and evolutionary computational models have already been used to test various biological hypotheses. Most of these models use genetic algorithms to tune the parameters of previously presented non-evolutionary models, but very few attempt to evolve collective behaviour from scratch. Of these last, the successful attempts display clumping or swarming behaviour. Empirical evidence suggests that in fish schools there exist three classes of behaviour: swarming, milling, and polarized. In this paper we present a novel, artificial-life-like evolutionary model, in which individual agents are governed by linguistic fuzzy rule-based systems, that is capable of evolving all three classes of behaviour. PMID:28045964
NASA Astrophysics Data System (ADS)
Dash, Rajashree
2017-11-01
Forecasting the purchasing power of one currency with respect to another is always an interesting topic in the field of financial time series prediction. Despite the existence of several traditional and computational models for currency exchange rate forecasting, there is always a need for a simpler and more efficient model with better prediction capability. In this paper, an evolutionary framework is proposed, using an improved shuffled frog leaping (ISFL) algorithm with a computationally efficient functional link artificial neural network (CEFLANN) for prediction of currency exchange rates. The model is validated by observing the monthly prediction measures obtained for three currency exchange data sets (USD/CAD, USD/CHF, and USD/JPY) accumulated over the same period of time. The model's performance is also compared with two other evolutionary learning techniques, the shuffled frog leaping algorithm and particle swarm optimization. Analysis of the results suggests that the proposed model, developed using the ISFL algorithm with the CEFLANN network, is a promising predictor for currency exchange rate prediction compared to the other models included in the study.
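The shuffled frog leaping algorithm underlying the proposed ISFL can be sketched as follows. This is a minimal 1-D Python sketch of plain SFLA on a toy objective; the parameter values are assumptions, and neither the paper's improved variant nor the CEFLANN coupling is reproduced.

```python
import random

def sfla_minimize(f, bounds, frogs=20, memeplexes=4, iters=50, seed=1):
    """Minimal shuffled frog leaping sketch: sort the population, deal frogs
    into memeplexes round-robin, and let each memeplex's worst frog leap
    toward the local best, then the global best, then a random position."""
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [rng.uniform(lo, hi) for _ in range(frogs)]
    for _ in range(iters):
        pop.sort(key=f)                        # best frog first
        best = pop[0]
        for m in range(memeplexes):
            idx = list(range(m, frogs, memeplexes))
            worst_i = max(idx, key=lambda i: f(pop[i]))
            local_best = pop[min(idx, key=lambda i: f(pop[i]))]
            cand = pop[worst_i] + rng.random() * (local_best - pop[worst_i])
            if f(cand) >= f(pop[worst_i]):     # no improvement: leap to global best
                cand = pop[worst_i] + rng.random() * (best - pop[worst_i])
            if f(cand) >= f(pop[worst_i]):     # still none: random reset
                cand = rng.uniform(lo, hi)
            pop[worst_i] = cand
    return min(pop, key=f)

x = sfla_minimize(lambda v: v * v, (-10.0, 10.0))
print(x)  # best frog found for the toy objective v^2
```

Because only the worst frog of each memeplex is replaced, the best solution found so far is never lost between shuffles.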
Jaeger, Johannes; Crombach, Anton
2012-01-01
We propose an approach to evolutionary systems biology which is based on reverse engineering of gene regulatory networks and in silico evolutionary simulations. We infer regulatory parameters for gene networks by fitting computational models to quantitative expression data. This allows us to characterize the regulatory structure and dynamical repertoire of evolving gene regulatory networks with a reasonable amount of experimental and computational effort. We use the resulting network models to identify those regulatory interactions that are conserved, and those that have diverged between different species. Moreover, we use the models obtained by data fitting as starting points for simulations of evolutionary transitions between species. These simulations enable us to investigate whether such transitions are random, or whether they show stereotypical series of regulatory changes which depend on the structure and dynamical repertoire of an evolving network. Finally, we present a case study, the gap gene network in dipterans (flies, midges, and mosquitoes), to illustrate the practical application of the proposed methodology, and to highlight the kind of biological insights that can be gained by this approach.
Wang, Xue; Wang, Sheng; Ma, Jun-Jie
2007-01-01
The effectiveness of wireless sensor networks (WSNs) depends on the coverage and target detection probability provided by dynamic deployment, which is usually supported by the virtual force (VF) algorithm. However, in the VF algorithm, the virtual force exerted by stationary sensor nodes hinders the movement of mobile sensor nodes. Particle swarm optimization (PSO) has been introduced as another dynamic deployment algorithm, but in this case the required computation time is the main bottleneck. This paper proposes a dynamic deployment algorithm named "virtual force directed co-evolutionary particle swarm optimization" (VFCPSO), which combines co-evolutionary particle swarm optimization (CPSO) with the VF algorithm: the CPSO uses multiple swarms to cooperatively optimize different components of the solution vectors for dynamic deployment, and the velocity of each particle is updated according not only to the historical local and global optimal solutions, but also to the virtual forces on sensor nodes. Simulation results demonstrate that the proposed VFCPSO is competent for dynamic deployment in WSNs and has better performance with respect to computation time and effectiveness than the VF, PSO and VFPSO algorithms.
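The velocity update described above, the standard PSO terms plus a virtual-force term, can be sketched as follows. This is an illustrative Python sketch; the coefficient values and the force model are assumptions, not the paper's.

```python
import random

def vfcpso_velocity(v, x, pbest, gbest, force,
                    w=0.7, c1=1.5, c2=1.5, c3=0.5, rng=random.random):
    """Per-dimension velocity update in the spirit of VFCPSO: inertia,
    cognitive and social pulls as in plain PSO, plus a term proportional
    to the virtual force acting on the sensor node (weight c3 assumed)."""
    return [w * vi
            + c1 * rng() * (pb - xi)      # cognitive: pull toward personal best
            + c2 * rng() * (gb - xi)      # social: pull toward global best
            + c3 * fi                     # virtual force exerted on this node
            for vi, xi, pb, gb, fi in zip(v, x, pbest, gbest, force)]

# One node in 2-D: at rest at its personal and global best, with no virtual
# force, every term vanishes and the update reduces to plain PSO behaviour.
v_new = vfcpso_velocity([0.0, 0.0], [1.0, 1.0], [1.0, 1.0], [1.0, 1.0], [0.0, 0.0])
print(v_new)
```

A nonzero `force` vector then biases movement away from covered regions even when the swarm's best positions would leave the node stationary, which is the point of adding the VF term.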
The tangled bank of amino acids
Pollock, David D.
2016-01-01
The use of amino acid substitution matrices to model protein evolution has yielded important insights into both the evolutionary process and the properties of specific protein families. In order to make these models tractable, standard substitution matrices represent the average results of the evolutionary process rather than the underlying molecular biophysics and population genetics, treating proteins as a set of independently evolving sites rather than as an integrated biomolecular entity. With advances in computing and the increasing availability of sequence data, we now have an opportunity to move beyond current substitution matrices to more interpretable mechanistic models with greater fidelity to the evolutionary process of mutation and selection and to the holistic nature of the selective constraints. As part of this endeavour, we consider how epistatic interactions induce spatial and temporal rate heterogeneity, and demonstrate how these generally ignored factors can reconcile standard substitution rate matrices with the underlying biology, allowing us to better understand the meaning of these substitution rates. Using computational simulations of protein evolution, we demonstrate the importance of both spatial and temporal heterogeneity in modelling protein evolution. PMID:27028523
Evolutionary Optimization of a Geometrically Refined Truss
NASA Technical Reports Server (NTRS)
Hull, P. V.; Tinker, M. L.; Dozier, G. V.
2007-01-01
Structural optimization is a field of research that has experienced noteworthy growth for many years. Researchers in this area have developed optimization tools to successfully design and model structures, typically minimizing mass while maintaining deflection and stress constraints. Numerous optimization studies have been performed to minimize mass, deflection, and stress on a benchmark cantilever truss problem, predominantly by applying traditional optimization theory, in which the cross-sectional area of each member is optimized to minimize the aforementioned objectives. This Technical Publication (TP) presents a structural optimization technique previously applied to compliant mechanism design. The technique combines topology optimization, geometric refinement, finite element analysis, and two forms of evolutionary computation (genetic algorithms and differential evolution) to optimize a benchmark structural optimization problem. A nontraditional solution to the benchmark problem is presented in this TP: a geometrically refined topological solution. The design process begins with an alternate control mesh formulation, a multilevel geometric smoothing operation, and an elastostatic structural analysis, and is wrapped in an evolutionary computing optimization toolset.
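Differential evolution, one of the two forms of evolutionary computation used in the TP, can be sketched in its basic DE/rand/1/bin form. This is a minimal Python sketch on a toy sphere function; the parameter values are assumptions, and the TP's coupling with topology optimization and finite element analysis is not reproduced.

```python
import random

def differential_evolution(f, bounds, np_=15, F=0.8, CR=0.9, gens=60, seed=3):
    """Minimal DE/rand/1/bin sketch for minimizing f over box bounds."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(np_)]
    for _ in range(gens):
        for i in range(np_):
            # mutation: three distinct donors other than the target
            a, b, c = rng.sample([p for j, p in enumerate(pop) if j != i], 3)
            jr = rng.randrange(dim)  # ensure at least one mutated gene
            trial = [a[j] + F * (b[j] - c[j])
                     if (rng.random() < CR or j == jr) else pop[i][j]
                     for j in range(dim)]
            trial = [min(max(t, lo), hi) for t, (lo, hi) in zip(trial, bounds)]
            if f(trial) <= f(pop[i]):   # greedy one-to-one selection
                pop[i] = trial
    return min(pop, key=f)

best = differential_evolution(lambda x: sum(v * v for v in x), [(-5.0, 5.0)] * 2)
print(best)  # near the sphere minimum at the origin
```

Because selection is greedy and one-to-one, no individual ever worsens, so the best solution found can only improve from generation to generation.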
On ``Overestimation-free Computational Version of Interval Analysis''
NASA Astrophysics Data System (ADS)
Popova, Evgenija D.
2013-10-01
The transformation of interval parameters into trigonometric functions, proposed in Int. J. Comput. Meth. Eng. Sci. Mech., vol. 13, pp. 319-328 (2012), is not motivated in comparison to the infinitely many equivalent algebraic transformations. The conclusions about the efficacy of the methodology are based on incorrect comparisons between solutions of different problems. We show, theoretically and in the examples considered in the commented article, that changing the number of parameters in a system of linear algebraic equations may change the initial problem and, consequently, its solution set. We also correct various misunderstandings and bugs that appear in the article noted above.
On the numerical treatment of selected oscillatory evolutionary problems
NASA Astrophysics Data System (ADS)
Cardone, Angelamaria; Conte, Dajana; D'Ambrosio, Raffaele; Paternoster, Beatrice
2017-07-01
We focus on evolutionary problems whose qualitative behaviour is known a priori and can be exploited to provide efficient and accurate numerical schemes. For classical numerical methods, which depend on constant coefficients, the required computational effort can be quite heavy, owing to the very small stepsizes needed to accurately reproduce the qualitative behaviour of the solution. In these situations, it may be convenient to use special purpose formulae, i.e. non-polynomially fitted formulae on basis functions adapted to the problem (see [16, 17] and references therein). We show examples of special purpose strategies for solving two families of evolutionary problems exhibiting periodic solutions: partial differential equations and Volterra integral equations.
Human evolutionary genomics: ethical and interpretive issues.
Vitti, Joseph J; Cho, Mildred K; Tishkoff, Sarah A; Sabeti, Pardis C
2012-03-01
Genome-wide computational studies can now identify targets of natural selection. The unique information about humans these studies reveal, and the media attention they attract, indicate the need for caution and precision in communicating results. This need is exacerbated by ways in which evolutionary and genetic considerations have been misapplied to support discriminatory policies, by persistent misconceptions of these fields and by the social sensitivity surrounding discussions of racial ancestry. We discuss the foundations, accomplishments and future directions of human evolutionary genomics, attending to ways in which the interpretation of good science can go awry, and offer suggestions for researchers to prevent misapplication of their work. Copyright © 2011 Elsevier Ltd. All rights reserved.
Cornuet, Jean-Marie; Santos, Filipe; Beaumont, Mark A; Robert, Christian P; Marin, Jean-Michel; Balding, David J; Guillemaud, Thomas; Estoup, Arnaud
2008-12-01
Genetic data obtained on population samples convey information about their evolutionary history. Inference methods can extract part of this information, but they require sophisticated statistical techniques that have been made available to the biologist community (through computer programs) only for simple and standard situations, typically involving a small number of samples. We propose here a computer program (DIY ABC) for inference based on approximate Bayesian computation (ABC), in which scenarios can be customized by the user to fit many complex situations involving any number of populations and samples. Such scenarios involve any combination of population divergences, admixtures and population size changes. DIY ABC can be used to compare competing scenarios, estimate parameters for one or more scenarios, and compute bias and precision measures for a given scenario and known values of parameters (the current version applies to unlinked microsatellite data). This article describes key methods used in the program and outlines its main features. The analysis of one simulated and one real dataset, both with complex evolutionary scenarios, illustrates the main possibilities of DIY ABC. The software is freely available at http://www.montpellier.inra.fr/CBGP/diyabc.
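The basic ABC rejection scheme underlying programs such as DIY ABC can be sketched as follows. This is a toy Python sketch with an assumed one-parameter model and summary statistic; DIY ABC itself implements far richer scenarios, summary statistics and post-processing.

```python
import random

def abc_rejection(observed_stat, simulate, prior_sample,
                  n_draws=2000, eps=0.5, seed=7):
    """Basic ABC rejection: draw a parameter from the prior, simulate data,
    and keep the parameter whenever the simulated summary statistic falls
    within eps of the observed one."""
    rng = random.Random(seed)
    accepted = []
    for _ in range(n_draws):
        theta = prior_sample(rng)
        if abs(simulate(theta, rng) - observed_stat) < eps:
            accepted.append(theta)
    return accepted

# Toy model: the summary statistic is Normal(theta, 1); observed value 3.0,
# uniform prior on [0, 6]. Accepted draws approximate the posterior.
post = abc_rejection(
    observed_stat=3.0,
    simulate=lambda th, rng: rng.gauss(th, 1.0),
    prior_sample=lambda rng: rng.uniform(0.0, 6.0),
)
print(len(post), sum(post) / len(post))
```

Shrinking `eps` tightens the approximation to the true posterior at the cost of a lower acceptance rate, the basic trade-off of rejection ABC.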
Evolving Non-Dominated Parameter Sets for Computational Models from Multiple Experiments
NASA Astrophysics Data System (ADS)
Lane, Peter C. R.; Gobet, Fernand
2013-03-01
Creating robust, reproducible and optimal computational models is a key challenge for theorists in many sciences. Psychology and cognitive science face particular challenges as large amounts of data are collected and many models are not amenable to analytical techniques for calculating parameter sets. Particular problems are to locate the full range of acceptable model parameters for a given dataset, and to confirm the consistency of model parameters across different datasets. Resolving these problems will provide a better understanding of the behaviour of computational models, and so support the development of general and robust models. In this article, we address these problems using evolutionary algorithms to develop parameters for computational models against multiple sets of experimental data; in particular, we propose the `speciated non-dominated sorting genetic algorithm' for evolving models in several theories. We discuss the problem of developing a model of categorisation using twenty-nine sets of data and models drawn from four different theories. We find that the evolutionary algorithms generate high quality models, adapted to provide a good fit to all available data.
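The non-dominated sorting step at the core of NSGA-style algorithms such as the one proposed in this article can be sketched as follows. This is a plain Python sketch of Pareto ranking for minimization; the speciation mechanism of the proposed algorithm is not reproduced.

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization):
    a is no worse in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated_sort(points):
    """Partition objective vectors into successive non-dominated fronts:
    front 1 holds points dominated by nothing, front 2 points dominated
    only by front 1, and so on."""
    remaining = list(points)
    fronts = []
    while remaining:
        front = [p for p in remaining
                 if not any(dominates(q, p) for q in remaining)]
        fronts.append(front)
        remaining = [p for p in remaining if p not in front]
    return fronts

# Two objectives to minimize; (3, 3) is dominated by (2, 2), and (5, 5) by both.
pts = [(1, 5), (2, 2), (4, 1), (3, 3), (5, 5)]
print(non_dominated_sort(pts))
```

The first front returned is the current Pareto-front approximation; candidate model parameterizations on it trade off fit across the different datasets without being strictly beaten on all of them.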
A program to compute the soft Robinson-Foulds distance between phylogenetic networks.
Lu, Bingxin; Zhang, Louxin; Leong, Hon Wai
2017-03-14
Over the past two decades, phylogenetic networks have been studied to model reticulate evolutionary events. The relationships among phylogenetic networks, phylogenetic trees and clusters serve as the basis for reconstruction and comparison of phylogenetic networks. To understand these relationships, two problems are raised: the tree containment problem, which asks whether a phylogenetic tree is displayed in a phylogenetic network, and the cluster containment problem, which asks whether a cluster is represented at a node in a phylogenetic network. Both problems are NP-complete. A fast exponential-time algorithm for the cluster containment problem on arbitrary networks is developed and implemented in C. The resulting program is further extended into a program for fast computation of the soft Robinson-Foulds distance between phylogenetic networks. Two computer programs are thus provided to facilitate reconstruction and validation of phylogenetic network models in evolutionary and comparative genomics. Our simulation tests indicate that they are fast enough for use in practice. Additionally, our simulation data show that the distribution of the soft Robinson-Foulds distance between phylogenetic networks is unlikely to be normal.
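For reference, the classical Robinson-Foulds distance on trees, which the soft variant generalizes to networks, reduces to a symmetric difference of the leaf clusters induced by internal edges. The Python sketch below covers only this tree case with assumed cluster-set inputs; the network case handled by the authors' program is not covered.

```python
def rf_distance(clusters_a, clusters_b):
    """Classical Robinson-Foulds distance between two trees, each given as
    the set of leaf clusters induced by its internal edges: the size of
    the symmetric difference of the two cluster sets."""
    return len(clusters_a ^ clusters_b)

# Trees on leaves {a, b, c, d}: ((a,b),(c,d)) vs ((a,c),(b,d)),
# with each nontrivial cluster encoded as a frozenset of leaf labels.
t1 = {frozenset("ab"), frozenset("cd")}
t2 = {frozenset("ac"), frozenset("bd")}
print(rf_distance(t1, t2))  # 4: the two trees share no nontrivial cluster
```

Identical trees have distance zero, and the distance grows with the number of clusters present in one tree but not the other.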
Survey of Advanced Technologies in Japan, Vol. 3: Database Reports
1990-05-01
Organizations covered include NEC Corporation (CSC Information Technology Research Laboratories), the University of Electro-Communications, the Tokyo Institute..., the Communication Research Laboratory of the Ministry of Posts and Telecommunications, and Matsushita Electronics Corp. (Electronics Research Laboratory). One entry notes work by the National Bureau of Standards, an outgrowth of research performed by IBM based on information theory, using computer...
Affine invariants of convex polygons.
Flusser, Jan
2002-01-01
In this correspondence, we prove that the affine invariants, for image registration and object recognition, proposed recently by Yang and Cohen (see ibid., vol.8, no.7, p.934-46, July 1999) are algebraically dependent. We show how to select an independent and complete set of the invariants. The use of this new set leads to a significant reduction of the computing complexity without decreasing the discrimination power.
Automatic Multimodal Cognitive Load Measurement (AMCLM)
2011-06-01
Design and procedure: a computer-based training application, running on a tablet monitor, was designed for basketball players to learn playing strategies. Functional MRI and near-infrared (NIR) neuroimaging have also been employed to detect changes in cognitive workload (Callicott et al., 1999; He et al., 2007).
ERIC Educational Resources Information Center
Bremmer, Dale; Childs, Bart
This document discusses the importance of computing knowledge and experience in the techniques of fast data retrieval for today's engineer. It describes a course designed to teach the engineer the COBOL Language structure. One of the projects of the course, a report generator (REGE) written in COBOL which is used to alter, sort and print selected…
Cloud Offload in Hostile Environments
2011-12-01
FACE: a Windows XP C++ application based on the OpenCV library [45]. It returns the coordinates and identities of recognized objects in an input image. Cited sources include "Energy-Efficient Technologies for the Dismounted Soldier," National Research Council, 1997; Lecture Notes in Computer Science, vol. 4658, Springer Berlin / Heidelberg, 2007; and the OpenCV wiki, http://opencv.willowgarage.com/wiki/.
Area-Efficient Graph Layouts (for VLSI).
1980-08-13
A cited reference is Sutherland and Donald Oestreicher, "How big should a printed circuit board be?," IEEE Transactions on Computers, Vol. C-22, May 1973, pp. 537-542.
Theory, Computation and Experiment on Criticality and Stability of Vortices Separating from Edges
2016-08-15
These areas of aerospace engineering research include dynamic stall in wind turbines and helicopter rotors, and flapping-wing vehicle (micro-air vehicle) design. A cited reference: ...and Robinson, M., "Blade Three-Dimensional Dynamic Stall Response to Wind Turbine Operating Condition," Journal of Solar Energy Engineering. Figure captions in the report include snapshots of TEV shedding in vortex ring representation and a schematic description of the separated tip flow model.
Recent Naval Postgraduate School Publications.
1980-04-01
Sample entries: Haney, R., "Numerical models of ocean circulation and climate interaction," Revs. of Geophys. and Space Phys., vol. 17, no. 7, pp. 1494-1507 (1979). From the Department of Computer Science, Naval Postgraduate School, Monterey, California, conference presentations include Bradley, G.H., "Energy modelling with network optimization"; and Bradley, G.H. and Brown, G.G., "Network optimization and defense modeling," Center for Naval Analyses, Arlington, Va., Aug. 1976.
1990-06-01
Cited reference: Darjalainen, A., and Jarvensivu, P., "Radioimmunoassay of Detomidine, A New Benzylimidazole Drug With Analgesic Sedation Properties," Life Sciences, Vol. 40.
A Visual Analytic for Improving Human Terrain Understanding
2013-06-01
Cited references include Kim, S., Minotra, D., Strater, L., Cuevas, and Colombo, D., "Knowledge Visualization to Enhance Human-Agent Situation Awareness within a Computational..."; (1971), "A General Coefficient of Similarity and Some of Its Properties," Biometrics, Vol. 27, No. 4, pp. 857-871; and [14] Coppock, S. & Mazlack, L. The HDPT component overview comprises a PostgreSQL database, an Apache Tomcat web server, and a global graph web application intended to allow human interpretation.
ERIC Educational Resources Information Center
Marcovitz, Alan B., Ed.
Four computer programs written in FORTRAN and BASIC develop theoretical predictions and data reduction for a junior-senior level heat exchanger experiment. Programs may be used at the terminal in the laboratory to check progress of the experiment or may be used in the batch mode for interpretation of final information for a formal report. Several…
Statistical Memristor Modeling and Case Study in Neuromorphic Computing
2012-06-01
Cited reference: Sundareswaran, R. Panda, and D. Pan, "Electrical impact of line-edge roughness on sub-45nm node standard cell," in Proc. SPIE, vol. 7275, 2009, pp. 727518-1 to 727518-10.
Cortical Substrate of Haptic Representation
1993-08-24
Based on experience and data from primates, we have developed computational models of short-term active memory. Such models may have technological interest. Our current theoretical efforts are founded on empirical neurobiological work on primate memory, on which our future physiological research will also build. Cited references: Annals of the New York Academy of Sciences, New York, vol. 608, pp. 318-329, 1990; J.M. Fuster, "Behavioral electrophysiology of the prefrontal cortex of the primate," Progress...
Multiobjective Multifactorial Optimization in Evolutionary Multitasking.
Gupta, Abhishek; Ong, Yew-Soon; Feng, Liang; Tan, Kay Chen
2016-05-03
In recent decades, the field of multiobjective optimization has attracted considerable interest among evolutionary computation researchers. One of the main features that makes evolutionary methods particularly appealing for multiobjective problems is the implicit parallelism offered by a population, which enables simultaneous convergence toward the entire Pareto front. While a plethora of related algorithms have been proposed to date, a common attribute among them is that they focus on efficiently solving only a single optimization problem at a time. Despite the known power of implicit parallelism, seldom has an attempt been made to multitask, i.e., to solve multiple optimization problems simultaneously. It is contended that the notion of evolutionary multitasking leads to the possibility of automated transfer of information across different optimization exercises that may share underlying similarities, thereby facilitating improved convergence characteristics. In particular, the potential for automated transfer is deemed invaluable from the standpoint of engineering design exercises where manual knowledge adaptation and reuse are routine. Accordingly, in this paper, we present a realization of the evolutionary multitasking paradigm within the domain of multiobjective optimization. The efficacy of the associated evolutionary algorithm is demonstrated on some benchmark test functions as well as on a real-world manufacturing process design problem from the composites industry.
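The skill-factor bookkeeping at the heart of multifactorial optimization can be sketched as follows. This is an illustrative Python sketch; the toy tasks and population are assumptions, and a full multifactorial evolutionary algorithm adds assortative mating and selection steps omitted here.

```python
def skill_factors(population, tasks):
    """Assign each individual its skill factor: the index of the task on
    which its factorial rank (1 = best) is lowest, which is the core
    bookkeeping step of multifactorial optimization."""
    ranks = []
    for f in tasks:
        # rank every individual on this task, 1-based, best first
        order = sorted(range(len(population)), key=lambda i: f(population[i]))
        r = [0] * len(population)
        for pos, i in enumerate(order):
            r[i] = pos + 1
        ranks.append(r)
    return [min(range(len(tasks)), key=lambda t: ranks[t][i])
            for i in range(len(population))]

# Two 1-D minimization tasks sharing one unified search space:
# task 0 has its optimum at 0, task 1 at 2.
pop = [0.1, 1.9, 1.0]
print(skill_factors(pop, [lambda x: x * x, lambda x: (x - 2) ** 2]))
```

Each individual is subsequently evaluated mainly on its skill-factor task, which is how one population can serve several optimization problems at once.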
Replaying evolutionary transitions from the dental fossil record
Harjunmaa, Enni; Seidel, Kerstin; Häkkinen, Teemu; Renvoisé, Elodie; Corfe, Ian J.; Kallonen, Aki; Zhang, Zhao-Qun; Evans, Alistair R.; Mikkola, Marja L.; Salazar-Ciudad, Isaac; Klein, Ophir D.; Jernvall, Jukka
2014-01-01
The evolutionary relationships of extinct species are ascertained primarily through the analysis of morphological characters. Character inter-dependencies can have a substantial effect on evolutionary interpretations, but the developmental underpinnings of character inter-dependence remain obscure because experiments frequently do not provide detailed resolution of morphological characters. Here we show experimentally and computationally how gradual modification of development differentially affects characters in the mouse dentition. We found that intermediate phenotypes could be produced by gradually adding ectodysplasin A (EDA) protein in culture to tooth explants carrying a null mutation in the tooth-patterning gene Eda. By identifying development-based character interdependencies, we show how to predict morphological patterns of teeth among mammalian species. Finally, in vivo inhibition of sonic hedgehog signalling in Eda null teeth enabled us to reproduce characters deep in the rodent ancestry. Taken together, evolutionarily informative transitions can be experimentally reproduced, thereby providing development-based expectations for character state transitions used in evolutionary studies. PMID:25079326
Selection on Network Dynamics Drives Differential Rates of Protein Domain Evolution
Mannakee, Brian K.; Gutenkunst, Ryan N.
2016-01-01
The long-held principle that functionally important proteins evolve slowly has recently been challenged by studies in mice and yeast showing that the severity of a protein knockout only weakly predicts that protein’s rate of evolution. However, the relevance of these studies to evolutionary changes within proteins is unknown, because amino acid substitutions, unlike knockouts, often only slightly perturb protein activity. To quantify the phenotypic effect of small biochemical perturbations, we developed an approach to use computational systems biology models to measure the influence of individual reaction rate constants on network dynamics. We show that this dynamical influence is predictive of protein domain evolutionary rate within networks in vertebrates and yeast, even after controlling for expression level and breadth, network topology, and knockout effect. Thus, our results not only demonstrate the importance of protein domain function in determining evolutionary rate, but also the power of systems biology modeling to uncover unanticipated evolutionary forces. PMID:27380265
Incorporating evolutionary processes into population viability models.
Pierson, Jennifer C; Beissinger, Steven R; Bragg, Jason G; Coates, David J; Oostermeijer, J Gerard B; Sunnucks, Paul; Schumaker, Nathan H; Trotter, Meredith V; Young, Andrew G
2015-06-01
We examined how ecological and evolutionary (eco-evo) processes in population dynamics could be better integrated into population viability analysis (PVA). Complementary advances in computation and population genomics can be combined into an eco-evo PVA to offer powerful new approaches to understand the influence of evolutionary processes on population persistence. We developed the mechanistic basis of an eco-evo PVA using individual-based models with individual-level genotype tracking and dynamic genotype-phenotype mapping to model emergent population-level effects, such as local adaptation and genetic rescue. We then outline how genomics can allow or improve parameter estimation for PVA models by providing genotypic information at large numbers of loci for neutral and functional genome regions. As climate change and other threatening processes increase in rate and scale, eco-evo PVAs will become essential research tools to evaluate the effects of adaptive potential, evolutionary rescue, and locally adapted traits on persistence. © 2014 Society for Conservation Biology.
Historical Contingency in Controlled Evolution
NASA Astrophysics Data System (ADS)
Schuster, Peter
2014-12-01
A basic question in evolution concerns the nature of evolutionary memory. At thermodynamic equilibrium, at stable stationary states or other stable attractors, the memory of the path leading to the long-time solution is erased, at least in part. Similar arguments hold for unique optima. Optimality in biology is discussed on the basis of microbial metabolism. Biology, on the other hand, is characterized by historical contingency, which has recently become accessible to experimental test in bacterial populations evolving under controlled conditions. Computer simulations give additional insight into the nature of the evolutionary memory, which is ultimately caused by an enormous space of possibilities, so large that it escapes all attempts at visualization. In essence, this contribution deals with two questions of current evolutionary theory: (i) Are organisms operating at optimal performance? and (ii) How is the evolutionary memory built up in populations?
Evidence for a strong sulfur-aromatic interaction derived from crystallographic data.
Zauhar, R J; Colbert, C L; Morgan, R S; Welsh, W J
2000-03-01
We have uncovered new evidence for a significant interaction between divalent sulfur atoms and aromatic rings. Our study involves a statistical analysis of interatomic distances and other geometric descriptors derived from entries in the Cambridge Crystallographic Database (F. H. Allen and O. Kennard, Chem. Design Auto. News, 1993, Vol. 8, pp. 1 and 31-37). A set of descriptors was defined sufficient in number and type so as to elucidate completely the preferred geometry of interaction between six-membered aromatic carbon rings and divalent sulfurs for all crystal structures of nonmetal-bearing organic compounds present in the database. In order to test statistical significance, analogous probability distributions for the interaction of the moiety X-CH(2)-X with aromatic rings were computed, and taken a priori to correspond to the null hypothesis of no significant interaction. Tests of significance were carried out pairwise between probability distributions of sulfur-aromatic interaction descriptors and their CH(2)-aromatic analogues using the Smirnov-Kolmogorov nonparametric test (W. W. Daniel, Applied Nonparametric Statistics, Houghton-Mifflin: Boston, New York, 1978, pp. 276-286), and in all cases significance at the 99% confidence level or better was observed. Local maxima of the probability distributions were used to define a preferred geometry of interaction between the divalent sulfur moiety and the aromatic ring. Molecular mechanics studies were performed in an effort to better understand the physical basis of the interaction. This study confirms observations based on statistics of interaction of amino acids in protein crystal structures (R. S. Morgan, C. E. Tatsch, R. H. Gushard, J. M. McAdon, and P. K. Warme, International Journal of Peptide Protein Research, 1978, Vol. 11, pp. 209-217; R. S. Morgan and J. M. McAdon, International Journal of Peptide Protein Research, 1980, Vol. 15, pp. 177-180; K. S. C. Reid, P. F. Lindley, and J. M. 
Thornton, FEBS Letters, 1985, Vol. 190, pp. 209-213), as well as studies involving molecular mechanics (G. Nemethy and H. A. Scheraga, Biochemistry and Biophysics Research Communications, 1981, Vol. 98, pp. 482-487) and quantum chemical calculations (B. V. Cheney, M. W. Schulz, and J. Cheney, Biochimica Biophysica Acta, 1989, Vol. 996, pp.116-124; J. Pranata, Bioorganic Chemistry, 1997, Vol. 25, pp. 213-219)-all of which point to the possible importance of the sulfur-aromatic interaction. However, the preferred geometry of the interaction, as determined from our analysis of the small-molecule crystal data, differs significantly from that found by other approaches. Copyright 2000 John Wiley & Sons, Inc.
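The Smirnov-Kolmogorov (two-sample Kolmogorov-Smirnov) test used in the study above compares two empirical distributions through the maximum gap between their cumulative distribution functions. A minimal stdlib-only sketch of the statistic (the sample values here are invented for illustration, not the crystallographic descriptors):

```python
def ks_statistic(a, b):
    """Two-sample Kolmogorov-Smirnov statistic: the maximum absolute
    difference between the empirical CDFs of samples a and b."""
    a, b = sorted(a), sorted(b)
    d = 0.0
    for v in sorted(set(a) | set(b)):
        # empirical CDF values at v
        cdf_a = sum(x <= v for x in a) / len(a)
        cdf_b = sum(x <= v for x in b) / len(b)
        d = max(d, abs(cdf_a - cdf_b))
    return d

# Identical samples give D = 0; fully separated samples give D = 1.
assert ks_statistic([1, 2, 3], [1, 2, 3]) == 0.0
assert ks_statistic([1, 2, 3], [10, 11, 12]) == 1.0
```

In practice one would compare D against the Smirnov critical value for the two sample sizes (or use `scipy.stats.ks_2samp`) to obtain the significance level quoted in the abstract.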
NASA Astrophysics Data System (ADS)
Nanihar, Nadiarulah; Khalid, Amir; Mustaffa, Norrizal; Jaat, Norrizam; Sapit, Azwan; Razali, Azahari; Sunar, Norshuhaila Mohamed
2017-10-01
Biodiesel based on vegetable oil is an alternative fuel with various advantages in terms of sustainability and environmental attractiveness compared with conventional diesel. Biodiesel is the product of any fat or oil derived from an organic source through a refinery process called transesterification. This research investigates the effects of storage duration and varying ambient conditions on biodiesel properties and characteristics. In this study, three blend ratios were prepared: 5 vol% (5 vol% plant oil and 95 vol% diesel), 10 vol% (10 vol% plant oil and 90 vol% diesel), and 15 vol% (15 vol% plant oil and 85 vol% diesel), designated CPO5, CPO10, and CPO15 (crude palm oil at 5, 10, and 15 vol%) and JO5, JO10, and JO15 (jatropha oil at 5, 10, and 15 vol%), respectively. Biodiesel samples were stored under indoor and outdoor conditions for a period of 3 months. Fuel properties such as acid value, viscosity, density, water content, and flash point were measured with laboratory instruments. Flash point and water content increased under both indoor and outdoor conditions, while viscosity and density remained steady. The acid value remained nearly constant indoors but increased markedly outdoors over time.
NASA Astrophysics Data System (ADS)
Nehm, Ross H.; Haertig, Hendrik
2012-02-01
Our study examines the efficacy of Computer Assisted Scoring (CAS) of open-response text relative to expert human scoring within the complex domain of evolutionary biology. Specifically, we explored whether CAS can diagnose the explanatory elements (or Key Concepts) that comprise undergraduate students' explanatory models of natural selection with the same fidelity as expert human scorers in a sample of >1,000 essays. We used SPSS Text Analysis 3.0 to perform our CAS and measure Kappa values (inter-rater reliability) of KC detection (i.e., computer-human rating correspondence). Our first analysis indicated that the text analysis functions (or extraction rules) developed and deployed in SPSSTA to extract individual Key Concepts (KCs) from three different items differing in several surface features (e.g., taxon, trait, type of evolutionary change) produced "substantial" (Kappa 0.61-0.80) or "almost perfect" (0.81-1.00) agreement. The second analysis explored the measurement of human-computer correspondence for KC diversity (the number of different accurate knowledge elements) in the combined sample of all 827 essays. Here we found outstanding correspondence; extraction rules generated using one prompt type are broadly applicable to other evolutionary scenarios (e.g., bacterial resistance, cheetah running speed, etc.). This result is encouraging, as it suggests that the development of new item sets may not necessitate the development of new text analysis rules. Overall, our findings suggest that CAS tools such as SPSS Text Analysis may compensate for some of the intrinsic limitations of currently used multiple-choice Concept Inventories designed to measure student knowledge of natural selection.
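The agreement bands quoted above ("substantial" 0.61-0.80, "almost perfect" 0.81-1.00) refer to Cohen's kappa, which corrects raw rater agreement for chance. A minimal sketch of the computation (the agreement table here is invented for illustration):

```python
def cohens_kappa(table):
    """Cohen's kappa from a square rater-agreement table:
    kappa = (p_o - p_e) / (1 - p_e), where p_o is observed agreement
    and p_e is the agreement expected by chance from the marginals."""
    total = sum(sum(row) for row in table)
    p_o = sum(table[i][i] for i in range(len(table))) / total
    p_e = sum((sum(table[i]) / total) * (sum(row[i] for row in table) / total)
              for i in range(len(table)))
    return (p_o - p_e) / (1 - p_e)

# Perfect computer-human agreement gives kappa = 1.
assert cohens_kappa([[10, 0], [0, 10]]) == 1.0
```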
Caetano-Anollés, Gustavo; Caetano-Anollés, Derek
2015-01-01
Accretion occurs pervasively in nature at widely different timeframes. The process also manifests in the evolution of macromolecules. Here we review recent computational and structural biology studies of evolutionary accretion that make use of the ideographic (historical, retrodictive) and nomothetic (universal, predictive) scientific frameworks. Computational studies uncover explicit timelines of accretion of structural parts in molecular repertoires and molecules. Phylogenetic trees of protein structural domains and proteomes and their molecular functions were built from a genomic census of millions of encoded proteins and associated terminal Gene Ontology terms. Trees reveal a ‘metabolic-first’ origin of proteins, the late development of translation, and a patchwork distribution of proteins in biological networks mediated by molecular recruitment. Similarly, the natural history of ancient RNA molecules inferred from trees of molecular substructures built from a census of molecular features shows patchwork-like accretion patterns. Ideographic analyses of ribosomal history uncover the early appearance of structures supporting mRNA decoding and tRNA translocation, the coevolution of ribosomal proteins and RNA, and a first evolutionary transition that brings ribosomal subunits together into a processive protein biosynthetic complex. Nomothetic structural biology studies of tertiary interactions and ancient insertions in rRNA complement these findings, once concentric layering assumptions are removed. Patterns of coaxial helical stacking reveal a frustrated dynamics of outward and inward ribosomal growth possibly mediated by structural grafting. The early rise of the ribosomal ‘turnstile’ suggests an evolutionary transition in natural biological computation. Results make explicit the need to understand processes of molecular growth and information transfer of macromolecules. PMID:27096056
Computer-automated evolution of an X-band antenna for NASA's Space Technology 5 mission.
Hornby, Gregory S; Lohn, Jason D; Linden, Derek S
2011-01-01
Whereas the current practice of designing antennas by hand is severely limited because it is both time and labor intensive and requires a significant amount of domain knowledge, evolutionary algorithms can be used to search the design space and automatically find novel antenna designs that are more effective than would otherwise be developed. Here we present our work in using evolutionary algorithms to automatically design an X-band antenna for NASA's Space Technology 5 (ST5) spacecraft. Two evolutionary algorithms were used: the first uses a vector of real-valued parameters and the second uses a tree-structured generative representation for constructing the antenna. The highest-performance antennas from both algorithms were fabricated and tested and both outperformed a hand-designed antenna produced by the antenna contractor for the mission. Subsequent changes to the spacecraft orbit resulted in a change in requirements for the spacecraft antenna. By adjusting our fitness function we were able to rapidly evolve a new set of antennas for this mission in less than a month. One of these new antenna designs was built, tested, and approved for deployment on the three ST5 spacecraft, which were successfully launched into space on March 22, 2006. This evolved antenna design is the first computer-evolved antenna to be deployed for any application and is the first computer-evolved hardware in space.
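The first of the two algorithms described above evolves a vector of real-valued parameters against a fitness function. A toy sketch of that loop, with a simple quadratic standing in for the antenna-simulation fitness (population size, mutation scheme, annealing schedule, and the fitness itself are illustrative assumptions, not the authors' actual configuration):

```python
import random

def evolve(fitness, dim=3, pop_size=20, generations=100, sigma=0.3, seed=1):
    """Minimal generational EA over real-valued parameter vectors.
    `fitness` is minimized; here a quadratic stands in for the
    antenna-simulation score used in the actual work."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-1, 1) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        parents = pop[: pop_size // 2]                 # truncation selection
        children = [[g + rng.gauss(0, sigma) for g in p] for p in parents]
        pop = parents + children                       # elitist replacement
        sigma *= 0.97                                  # anneal mutation size
    return min(pop, key=fitness)

best = evolve(lambda v: sum(x * x for x in v))
assert sum(x * x for x in best) < 0.2
```

Swapping the lambda for a call into an electromagnetic simulator is, conceptually, how the fitness function was "adjusted" to re-evolve antennas after the orbit change.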
Improving Search Properties in Genetic Programming
NASA Technical Reports Server (NTRS)
Janikow, Cezary Z.; DeWeese, Scott
1997-01-01
With advancing computer processing capabilities, practical computer applications are mostly limited by the amount of human programming required to accomplish a specific task. This necessary human participation creates many problems, such as dramatically increased cost. To alleviate the problem, computers must become more autonomous. In other words, computers must be capable of programming/reprogramming themselves to adapt to changing environments/tasks/demands/domains. Evolutionary computation offers potential means, but it must be advanced beyond its current practical limitations. Evolutionary algorithms model nature. They maintain a population of structures representing potential solutions to the problem at hand. These structures undergo a simulated evolution by means of mutation, crossover, and a Darwinian selective pressure. Genetic programming (GP) is the most promising example of an evolutionary algorithm. In GP, the structures that evolve are trees, which is a dramatic departure from previously used representations such as strings in genetic algorithms. The space of potential trees is defined by means of their elements: functions, which label internal nodes, and terminals, which label leaves. By attaching semantic interpretation to those elements, trees can be interpreted as computer programs (given an interpreter), evolved architectures, etc. JSC has begun exploring GP as a potential tool for its long-term project on evolving dextrous robotic capabilities. Last year we identified representation redundancies as the primary source of inefficiency in GP. Subsequently, we proposed a method to use problem constraints to reduce those redundancies, effectively reducing GP complexity. This method was implemented afterwards at the University of Missouri. This summer, we have evaluated the payoff from using problem constraints to reduce search complexity on two classes of problems: learning boolean functions and solving the forward kinematics problem. 
We have also developed and implemented methods to use additional problem heuristics to fine-tune the searchable space, and to use typing information to further reduce the search space. Additional improvements have been proposed, but they are yet to be explored and implemented.
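In GP as described above, a tree only becomes a program once an interpreter assigns meaning to its functions and terminals. A minimal sketch of such an interpreter over nested-tuple trees (the function set and encoding are illustrative assumptions):

```python
import operator

# Internal nodes are functions; leaves are terminals (variables or constants).
FUNCS = {'add': operator.add, 'mul': operator.mul}

def evaluate(tree, env):
    """Interpret a nested-tuple GP tree, e.g. ('add', 'x', ('mul', 'x', 'x'))."""
    if isinstance(tree, str):
        return env[tree]          # terminal: a variable looked up in env
    if isinstance(tree, (int, float)):
        return tree               # terminal: a constant
    op, left, right = tree
    return FUNCS[op](evaluate(left, env), evaluate(right, env))

# x + x*x evaluated at x = 3 gives 12.
assert evaluate(('add', 'x', ('mul', 'x', 'x')), {'x': 3}) == 12
```

Typing information of the kind mentioned above would constrain which functions may appear beneath which others, shrinking the space of legal trees the search must cover.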
NASA Technical Reports Server (NTRS)
Omori, S.
1973-01-01
As described in Vol. 1, the eddy viscosity is calculated through the turbulent kinetic energy, in order to include the history of the flow and the effect of chemical reaction on boundary layer characteristics. Calculations can be performed for two different cooling concepts; that is, transpiration and regeneratively cooled wall cases. For the regenerative cooling option, coolant and gas side wall temperature and coolant bulk temperature in a rocket engine can be computed along the nozzle axis. Thus, this computer program is useful in designing coolant flow rate and cooling tube geometry, including the tube wall thickness as well as in predicting the effects of boundary layers along the gas side wall on thrust performances.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Luo, Xiongbiao, E-mail: xluo@robarts.ca; Wan, Ying, E-mail: Ying.Wan@student.uts.edu.au; He, Xiangjian
Purpose: Electromagnetically guided endoscopic procedures, which aim at accurately and robustly localizing the endoscope, involve multimodal sensory information during interventions. However, it remains challenging to integrate this information for precise and stable endoscopic guidance. To tackle this challenge, this paper proposes a new framework, on the basis of an enhanced particle swarm optimization method, to effectively fuse this information for accurate and continuous endoscope localization. Methods: The authors use the particle swarm optimization method, one of the stochastic evolutionary computation algorithms, to effectively fuse the multimodal information, including preoperative information (i.e., computed tomography images) as a frame of reference, endoscopic camera videos, and positional sensor measurements (i.e., electromagnetic sensor outputs). Since the evolutionary computation method is usually limited by possible premature convergence and fixed evolutionary factors, the authors introduce the current (endoscopic camera and electromagnetic sensor) observation to boost the particle swarm optimization, and also adaptively update the evolutionary parameters in accordance with spatial constraints and the current observation, resulting in advantageous performance of the enhanced algorithm. Results: The experimental results demonstrate that the authors' proposed method provides a more accurate and robust endoscopic guidance framework than state-of-the-art methods. The average guidance accuracy of the authors' framework was about 3.0 mm and 5.6°, while the previous methods showed at least 3.9 mm and 7.0°. The average position and orientation smoothness of their method was 1.0 mm and 1.6°, significantly better than the other methods, which were at least 2.0 mm and 2.6°. Additionally, the average visual quality of the endoscopic guidance was improved to 0.29. 
Conclusions: A robust electromagnetically guided endoscopy framework was proposed on the basis of an enhanced particle swarm optimization method using the current observation information and adaptive evolutionary factors. The authors' proposed framework greatly reduced the guidance errors, from (4.3 mm, 7.8°) to (3.0 mm, 5.6°), compared to state-of-the-art methods.
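The base optimizer that the authors enhance is standard particle swarm optimization, in which each particle is pulled toward its personal best and the swarm's global best position. A minimal sketch on a toy quadratic (the inertia and acceleration weights are conventional defaults, not the paper's tuned values, and the quadratic stands in for the multimodal localization cost):

```python
import random

def pso(f, dim=2, n_particles=15, iters=80, seed=2):
    """Minimal particle swarm optimization minimizing f."""
    rng = random.Random(seed)
    w, c1, c2 = 0.7, 1.5, 1.5          # inertia and acceleration weights
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]        # each particle's best-seen position
    gbest = min(pbest, key=f)[:]       # swarm's best-seen position
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if f(pos[i]) < f(pbest[i]):
                pbest[i] = pos[i][:]
                if f(pbest[i]) < f(gbest):
                    gbest = pbest[i][:]
    return gbest

best = pso(lambda v: sum(x * x for x in v))
assert sum(x * x for x in best) < 1e-2
```

The paper's enhancement, roughly, replaces the fixed `w`, `c1`, `c2` with values updated adaptively from the current sensor observation and spatial constraints.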
Breen, Gerald-Mark; Matusitz, Jonathan
2009-01-01
Telemedicine, the use of advanced communication technologies in the healthcare context, has a rich history and a clear evolutionary course. In this paper, the authors identify telemedicine as operationally defined, the services and technologies it comprises, the direction telemedicine has taken, along with its increased acceptance in the healthcare communities. The authors also describe some of the key pitfalls contended with by researchers and activists in advancing telemedicine to its full potential and enabling an unobstructed community of technicians to identify telemedicine's diverse utilities. A discussion and future directions section is included to provide fresh ideas to health communication and computer-mediated scholars wishing to delve into this area and make a difference to enhance public understanding of this field. PMID:20300559
The Chomsky—Place correspondence 1993–1994
Chomsky, Noam; Place, Ullin T.
2000-01-01
Edited correspondence between Ullin T. Place and Noam Chomsky, which occurred in 1993–1994, is presented. The principal topics are (a) deep versus surface structure; (b) computer modeling of the brain; (c) the evolutionary origins of language; (d) behaviorism; and (e) a dispositional account of language. This correspondence includes Chomsky's denial that he ever characterized deep structure as innate; Chomsky's critique of computer modeling (both traditional and connectionist) of the brain; Place's critique of Chomsky's alleged failure to provide an adequate account of the evolutionary origins of language, and Chomsky's response that such accounts are “pop-Darwinian fairy tales”; and Place's arguments for, and Chomsky's against, the relevance of behaviorism to linguistic theory, especially the relevance of a behavioral approach to language that is buttressed by a dispositional account of sentence construction. PMID:22477211
The development of the red giant branch. I - Theoretical evolutionary sequences
NASA Technical Reports Server (NTRS)
Sweigart, Allen V.; Greggio, Laura; Renzini, Alvio
1989-01-01
A grid of 100 evolutionary sequences extending from the zero-age main sequence to the onset of helium burning has been computed for stellar masses between 1.4 and 3.4 solar masses, helium abundances of 0.20 and 0.30, and heavy-element abundances of 0.004, 0.01, and 0.04. Using these computations the transition in the morphology of the red giant branch (RGB) between low-mass stars, which have an extended and luminous first RGB phase prior to helium ignition, and intermediate-mass stars, which do not, is investigated. Extensive tabulations of the numerical results are provided to aid in applying these sequences. The effects of the first dredge-up on the surface helium and CNO abundances of the sequences are discussed.
Squires, R Burke; Pickett, Brett E; Das, Sajal; Scheuermann, Richard H
2014-12-01
In 2009 a novel pandemic H1N1 influenza virus (H1N1pdm09) emerged as the first official influenza pandemic of the 21st century. Early genomic sequence analysis pointed to the swine origin of the virus. Here we report a novel computational approach to determine the evolutionary trajectory of viral sequences that uses data-driven estimations of nucleotide substitution rates to track the gradual accumulation of observed sequence alterations over time. Phylogenetic analysis and multiple sequence alignments show that sequences belonging to the resulting evolutionary trajectory of the H1N1pdm09 lineage exhibit a gradual accumulation of sequence variations and tight temporal correlations in the topological structure of the phylogenetic trees. These results suggest that our evolutionary trajectory analysis (ETA) can more effectively pinpoint the evolutionary history of viruses, including the host and geographical location traversed by each segment, when compared against either BLAST or traditional phylogenetic analysis alone. Copyright © 2014 Elsevier B.V. All rights reserved.
An Orthogonal Evolutionary Algorithm With Learning Automata for Multiobjective Optimization.
Dai, Cai; Wang, Yuping; Ye, Miao; Xue, Xingsi; Liu, Hailin
2016-12-01
Research on multiobjective optimization problems has become one of the hottest topics in intelligent computation. In order to improve the search efficiency of an evolutionary algorithm and maintain the diversity of solutions, in this paper the learning automata (LA) is first used for quantization orthogonal crossover (QOX), and a new fitness function based on decomposition is proposed to achieve these two purposes. Based on these, an orthogonal evolutionary algorithm with LA for complex multiobjective optimization problems with continuous variables is proposed. The experimental results show that in continuous states, the proposed algorithm is able to achieve accurate Pareto-optimal sets and wide Pareto-optimal fronts efficiently. Moreover, a comparison with several existing well-known algorithms (nondominated sorting genetic algorithm II, decomposition-based multiobjective evolutionary algorithm, decomposition-based multiobjective evolutionary algorithm with an ensemble of neighborhood sizes, multiobjective optimization by LA, and multiobjective immune algorithm with nondominated neighbor-based selection) on 15 multiobjective benchmark problems shows that the proposed algorithm is able to find more accurate and evenly distributed Pareto-optimal fronts than the compared ones.
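Decomposition-based fitness functions like the one mentioned above typically scalarize the objective vector once per weight vector; the standard Tchebycheff form used in MOEA/D-style algorithms is one common choice (the paper's own fitness function may differ in its details):

```python
def tchebycheff(objectives, weights, ideal):
    """Standard Tchebycheff scalarization used by decomposition-based
    multiobjective EAs: g(x | w, z*) = max_i w_i * |f_i(x) - z*_i|,
    where z* is the ideal (best-seen) point per objective."""
    return max(w * abs(f - z) for f, w, z in zip(objectives, weights, ideal))

# Under the same weight vector, a point nearer the ideal point scores
# lower than a dominated one.
assert tchebycheff([0.2, 0.8], [0.5, 0.5], [0.0, 0.0]) < \
       tchebycheff([0.6, 0.9], [0.5, 0.5], [0.0, 0.0])
```

Sweeping many weight vectors and minimizing each scalarized problem yields points spread along the Pareto-optimal front.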
Development of an Evolutionary Algorithm for the ab Initio Discovery of Two-Dimensional Materials
NASA Astrophysics Data System (ADS)
Revard, Benjamin Charles
Crystal structure prediction is an important first step on the path toward computational materials design. Increasingly robust methods have become available in recent years for computing many materials properties, but because properties are largely a function of crystal structure, the structure must be known before these methods can be brought to bear. In addition, structure prediction is particularly useful for identifying low-energy structures of subperiodic materials, such as two-dimensional (2D) materials, which may adopt unexpected structures that differ from those of the corresponding bulk phases. Evolutionary algorithms, which are heuristics for global optimization inspired by biological evolution, have proven to be a fruitful approach for tackling the problem of crystal structure prediction. This thesis describes the development of an improved evolutionary algorithm for structure prediction and several applications of the algorithm to predict the structures of novel low-energy 2D materials. The first part of this thesis contains an overview of evolutionary algorithms for crystal structure prediction and presents our implementation, including details of extending the algorithm to search for clusters, wires, and 2D materials, improvements to efficiency when running in parallel, improved composition space sampling, and the ability to search for partial phase diagrams. We then present several applications of the evolutionary algorithm to 2D systems, including InP, the C-Si and Sn-S phase diagrams, and several group-IV dioxides. This thesis makes use of the Cornell graduate school's "papers" option. Chapters 1 and 3 correspond to the first-author publications of Refs. [131] and [132], respectively, and chapter 2 will soon be submitted as a first-author publication. The material in chapter 4 is taken from Ref. [144], in which I share joint first-authorship. In this case I have included only my own contributions.
Jacobs, Christopher; Lambourne, Luke; Xia, Yu; Segrè, Daniel
2017-01-01
System-level metabolic network models enable the computation of growth and metabolic phenotypes from an organism's genome. In particular, flux balance approaches have been used to estimate the contribution of individual metabolic genes to organismal fitness, offering the opportunity to test whether such contributions carry information about the evolutionary pressure on the corresponding genes. Previous failure to identify the expected negative correlation between such computed gene-loss cost and sequence-derived evolutionary rates in Saccharomyces cerevisiae has been ascribed to a real biological gap between a gene's fitness contribution to an organism "here and now" and the same gene's historical importance as evidenced by its accumulated mutations over millions of years of evolution. Here we show that this negative correlation does exist, and can be exposed by revisiting a broadly employed assumption of flux balance models. In particular, we introduce a new metric that we call "function-loss cost", which estimates the cost of a gene loss event as the total potential functional impairment caused by that loss. This new metric displays significant negative correlation with evolutionary rate, across several thousand minimal environments. We demonstrate that the improvement gained using function-loss cost over gene-loss cost is explained by replacing the base assumption that isoenzymes provide unlimited capacity for backup with the assumption that isoenzymes are completely non-redundant. We further show that this change of the assumption regarding isoenzymes increases the recall of epistatic interactions predicted by the flux balance model at the cost of a reduction in the precision of the predictions. In addition to suggesting that the gene-to-reaction mapping in genome-scale flux balance models should be used with caution, our analysis provides new evidence that evolutionary gene importance captures much more than strict essentiality.
Clavel, Julien; Aristide, Leandro; Morlon, Hélène
2018-06-19
Working with high-dimensional phylogenetic comparative datasets is challenging because likelihood-based multivariate methods suffer from low statistical performance as the number of traits p approaches the number of species n, and because computational complications occur when p exceeds n. Alternative phylogenetic comparative methods have recently been proposed to deal with the large-p, small-n scenario, but their use and performance are limited. Here we develop a penalized likelihood framework to deal with high-dimensional comparative datasets. We propose various penalizations and methods for selecting the intensity of the penalties. We apply this general framework to the estimation of parameters (the evolutionary trait covariance matrix and parameters of the evolutionary model) and model comparison for the high-dimensional multivariate Brownian (BM), Early-burst (EB), Ornstein-Uhlenbeck (OU), and Pagel's lambda models. We show using simulations that our penalized likelihood approach dramatically improves the estimation of evolutionary trait covariance matrices and model parameters when p approaches n, and allows for their accurate estimation when p equals or exceeds n. In addition, we show that penalized likelihood models can be efficiently compared using the Generalized Information Criterion (GIC). We implement these methods, as well as the related estimation of ancestral states and the computation of phylogenetic PCA, in the R packages RPANDA and mvMORPH. Finally, we illustrate the utility of the new framework by evaluating evolutionary model fit, analyzing integration patterns, and reconstructing evolutionary trajectories for a high-dimensional 3-D dataset of brain shape in New World monkeys. We find clear support for an Early-burst model, suggesting an early diversification of brain morphology during the ecological radiation of the clade. Penalized likelihood offers an efficient way to deal with high-dimensional multivariate comparative data.
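One generic form of the penalization idea, stripped of the phylogenetic likelihood itself, is linear shrinkage of the trait covariance matrix toward the identity, which keeps the estimate well-conditioned when p approaches or exceeds n. The sketch below is a simplified stand-in for the paper's penalties, not their actual estimator:

```python
def shrink_covariance(X, lam=0.2):
    """Linear shrinkage of a sample covariance toward the identity:
    S_pen = (1 - lam) * S + lam * I. The added ridge on the diagonal
    keeps the estimate positive definite even when p >= n."""
    n, p = len(X), len(X[0])
    means = [sum(col) / n for col in zip(*X)]
    S = [[sum((X[k][i] - means[i]) * (X[k][j] - means[j]) for k in range(n)) / n
          for j in range(p)] for i in range(p)]
    return [[(1 - lam) * S[i][j] + (lam if i == j else 0.0)
             for j in range(p)] for i in range(p)]

# With p = 3 traits but only n = 2 samples the raw covariance is singular,
# yet every shrunk diagonal entry stays strictly positive.
C = shrink_covariance([[1.0, 2.0, 0.0], [2.0, 0.0, 1.0]])
assert all(C[i][i] > 0 for i in range(3))
```

The penalty intensity (here `lam`) is exactly the kind of tuning parameter the paper selects via criteria such as GIC.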
Energy and time determine scaling in biological and computer designs
Moses, Melanie; Bezerra, George; Edwards, Benjamin; Brown, James; Forrest, Stephanie
2016-01-01
Metabolic rate in animals and power consumption in computers are analogous quantities that scale similarly with size. We analyse vascular systems of mammals and on-chip networks of microprocessors, where natural selection and human engineering, respectively, have produced systems that minimize both energy dissipation and delivery times. Using a simple network model that simultaneously minimizes energy and time, our analysis explains empirically observed trends in the scaling of metabolic rate in mammals and power consumption and performance in microprocessors across several orders of magnitude in size. Just as the evolutionary transitions from unicellular to multicellular animals in biology are associated with shifts in metabolic scaling, our model suggests that the scaling of power and performance will change as computer designs transition to decentralized multi-core and distributed cyber-physical systems. More generally, a single energy–time minimization principle may govern the design of many complex systems that process energy, materials and information. This article is part of the themed issue ‘The major synthetic evolutionary transitions’. PMID:27431524
Nemo: an evolutionary and population genetics programming framework.
Guillaume, Frédéric; Rougemont, Jacques
2006-10-15
Nemo is an individual-based, genetically explicit and stochastic population computer program for the simulation of population genetics and life-history trait evolution in a metapopulation context. It comes as both a C++ programming framework and an executable program file. Its object-oriented programming design gives it the flexibility and extensibility needed to implement a large variety of forward-time evolutionary models. It provides developers with abstract models allowing them to implement their own life-history traits and life-cycle events. Nemo offers a large panel of population models, from the Island model to lattice models with demographic or environmental stochasticity and a variety of already implemented traits (deleterious mutations, neutral markers and more), life-cycle events (mating, dispersal, aging, selection, etc.) and output operators for saving data and statistics. It runs on all major computer platforms including parallel computing environments. The source code, binaries and documentation are available under the GNU General Public License at http://nemo2.sourceforge.net.
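At a vastly smaller scale than Nemo, the core of any forward-time population-genetic simulation is resampling allele frequencies each generation under selection and drift. A toy haploid, single-locus sketch (the parameter values are arbitrary, and Nemo's individual-based machinery tracks far more than one frequency):

```python
import random

def wright_fisher(n=200, p0=0.1, s=0.05, generations=300, seed=3):
    """Toy forward-time haploid simulation: one biallelic locus with
    selection coefficient s, binomial resampling each generation."""
    rng = random.Random(seed)
    p = p0
    for _ in range(generations):
        # Selection shifts the expected frequency of the favoured allele.
        p_sel = p * (1 + s) / (p * (1 + s) + (1 - p))
        # Drift: binomial sampling of n offspring for the next generation.
        p = sum(rng.random() < p_sel for _ in range(n)) / n
        if p in (0.0, 1.0):       # allele lost or fixed
            break
    return p

freq = wright_fisher()
assert 0.0 <= freq <= 1.0
```

Frameworks like Nemo wrap this kind of life-cycle loop in explicit individuals, genotypes, traits, and metapopulation structure, which is why they are organized as extensible life-cycle-event components.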
Guerra, Concettina
2015-01-01
Protein complexes are key molecular entities that perform a variety of essential cellular functions. The connectivity of proteins within a complex has been widely investigated with both experimental and computational techniques. We developed a computational approach to identify and characterise proteins that play a role in interconnecting complexes. We computed a measure of inter-complex centrality, the crossroad index, based on disjoint paths connecting proteins in distinct complexes and identified inter-complex hubs as proteins with a high value of the crossroad index. We applied the approach to a set of stable complexes in Saccharomyces cerevisiae and in Homo sapiens. Just as done for hubs, we evaluated the topological and biological properties of inter-complex hubs addressing the following questions. Do inter-complex hubs tend to be evolutionary conserved? What is the relation between crossroad index and essentiality? We found a good correlation between inter-complex hubs and both evolutionary conservation and essentiality.
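The crossroad index itself is defined via disjoint paths between complexes; as a rough illustration of inter-complex centrality, the following uses a much simpler neighbour-count proxy (explicitly not the published index, and all protein and complex names are invented):

```python
def crossroad_proxy(adjacency, complex_of):
    """Crude proxy for an inter-complex centrality: for each protein,
    count neighbours assigned to a different complex. (The published
    crossroad index uses disjoint inter-complex paths; this neighbour
    count is a deliberately simplified stand-in.)"""
    scores = {}
    for protein, neighbours in adjacency.items():
        scores[protein] = sum(
            1 for n in neighbours
            if complex_of.get(n) != complex_of.get(protein)
        )
    return scores

# Toy network: proteins "a","b" in complex 1; "x","y" in complex 2.
adj = {"a": {"b", "x"}, "b": {"a"}, "x": {"a", "y"}, "y": {"x"}}
cplx = {"a": 1, "b": 1, "x": 2, "y": 2}
scores = crossroad_proxy(adj, cplx)  # "a" and "x" bridge the complexes
```

Proteins with high scores under either definition are candidates for inter-complex hubs.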
Lashin, Sergey A; Suslov, Valentin V; Matushkin, Yuri G
2010-06-01
We propose an original program, "Evolutionary Constructor," that is capable of computationally efficient modeling of both population-genetic and ecological problems, combining the two directions in a single model of the required level of detail. We also present results of comparative modeling of stability, adaptability and biodiversity dynamics in populations of unicellular haploid organisms that form symbiotic ecosystems. The advantages and disadvantages of two evolutionary strategies of biota formation, one based on a few generalist taxa and the other on high biodiversity, are discussed.
2009-06-01
Fragmentary full-text excerpt (acronym list and acknowledgments); the recoverable research question is: can the USMC apply the essential principles of rapid, value-based, evolutionary acquisition to the development and procurement of a TSOA?
Laboratory evolution of protein conformational dynamics.
Campbell, Eleanor C; Correy, Galen J; Mabbitt, Peter D; Buckle, Ashley M; Tokuriki, Nobuhiko; Jackson, Colin J
2017-11-08
This review focuses on recent work that has begun to establish specific functional roles for protein conformational dynamics, in particular how the conformational landscapes that proteins can sample evolve under laboratory-based evolutionary selection. We discuss recent technical advances in computational and biophysical chemistry, which have provided us with new ways to dissect evolutionary processes. Finally, we offer some perspectives on the emerging view of conformational dynamics and evolution, and the challenges that we face in rationally engineering conformational dynamics. Copyright © 2017 Elsevier Ltd. All rights reserved.
From micro-scale 3D simulations to macro-scale model of periodic porous media
NASA Astrophysics Data System (ADS)
Crevacore, Eleonora; Tosco, Tiziana; Marchisio, Daniele; Sethi, Rajandrea; Messina, Francesca
2015-04-01
In environmental engineering, the transport of colloidal suspensions in porous media is studied to understand the fate of potentially harmful nano-particles and to design new remediation technologies. In this perspective, averaging techniques applied to micro-scale numerical simulations are a powerful tool for extrapolating accurate macro-scale models. Choosing two simplified packing configurations of soil grains and starting from a single elementary cell (module), it is possible to take advantage of the periodicity of the structures to reduce the computational cost of full 3D simulations. Steady-state flow simulations for an incompressible fluid in the laminar regime are implemented. Transport simulations are based on the pore-scale advection-diffusion equation, which can be enriched by introducing the Stokes (settling) velocity, to account for gravity, and the interception mechanism. Simulations are carried out on a domain composed of several elementary modules, which serve as control volumes in a finite-volume method at the macro-scale. The periodicity of the medium implies the periodicity of the flow field, which is of great importance during the up-scaling procedure, allowing relevant simplifications. Micro-scale numerical data are processed to compute mean concentrations (volume and area averages) and fluxes on each module. The simulation results are used to compare the micro-scale averaged equation with the integral form of the macroscopic one, distinguishing between terms that can be computed exactly and those for which a closure is needed. Of particular interest is the investigation of the origin of macro-scale terms such as dispersion and tortuosity, which we seek to describe using known micro-scale quantities. Traditionally, the study of colloidal transport introduces many simplifications, such as ultra-simplified geometries that account for only a single collector.
Gradual removal of such hypotheses leads to a detailed description of colloidal transport mechanisms. Starting from nearly realistic 3D geometries, the ultimate purpose of this work is to develop an improved understanding of the fate of colloidal particles through, for example, an accurate description of the deposition efficiency, in order to design efficient remediation techniques. References: G. Boccardo, D.L. Marchisio, R. Sethi, Journal of Colloid and Interface Science, Vol. 417, pp. 227-237, 2014; M. Icardi, G. Boccardo, D.L. Marchisio, T. Tosco, R. Sethi, Physical Review E - Statistical, Nonlinear, and Soft Matter Physics, 2014; S. Torkzaban, S.S. Tazehkand, S.L. Walker, S.A. Bradford, Water Resources Research, Vol. 44, 2008; S.M. Hassanizadeh, Advances in Water Resources, Vol. 2, pp. 131-144, 1979; S. Whitaker, AIChE Journal, Vol. 13, No. 3, pp. 420-428, May 1967.
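The pore-scale and volume-averaged transport equations referred to in this abstract can be written, in one standard form, as follows. This is a reconstruction under stated assumptions (Fickian diffusion with uniform coefficient D, a settling velocity v_s added for gravity); the authors' exact notation is not reproduced here:

```latex
% Pore-scale advection-diffusion, with an optional settling (Stokes)
% velocity v_s added to the fluid velocity to account for gravity:
\frac{\partial c}{\partial t}
  + \nabla \cdot \left[ (\mathbf{v} + \mathbf{v}_s)\, c \right]
  = \nabla \cdot \left( D \, \nabla c \right)

% Volume averaging over a periodic module introduces a dispersion
% tensor D* (carrying dispersion and tortuosity effects) that closes
% the macro-scale equation:
\frac{\partial \langle c \rangle}{\partial t}
  + \nabla \cdot \left( \langle \mathbf{v} \rangle \langle c \rangle \right)
  = \nabla \cdot \left( \mathbf{D}^{*} \, \nabla \langle c \rangle \right)
```

The closure problem mentioned in the abstract is precisely the determination of D* from micro-scale quantities.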
ERIC Educational Resources Information Center
Nehm, Ross H.; Haertig, Hendrik
2012-01-01
Our study examines the efficacy of Computer Assisted Scoring (CAS) of open-response text relative to expert human scoring within the complex domain of evolutionary biology. Specifically, we explored whether CAS can diagnose the explanatory elements (or Key Concepts) that comprise undergraduate students' explanatory models of natural selection with…
Automated Design of a High-Velocity Channel
2006-05-01
using Newton’s method. 2.2.2 Groundwater Applications Optimization methods are also very useful for solving groundwater problems. Townley et al... Townley 85] apply present computational algorithms to steady and transient models for groundwater °ow. The aquifer storage coe±cients, transmissivities...Reliability Analysis", Water Resources Research, Vol. 28, No. 12, December 1992, pp. 3269-3280. [ Townley 85] Townley , L. R. and Wilson, J. L
ERIC Educational Resources Information Center
Marcovitz, Alan B., Ed.
A particularly difficult area for many engineering students is the approximate nature of the relation between models and physical systems. This is notably true when the models consist of differential equations. An approach applied to this problem has been to use analog computers to assist in portraying the output of a model as it is progressively…
Eddy Resolving Global Ocean Prediction including Tides
2013-09-30
atlantic meridional overturning circulation in the subpolar North Atlantic. Journal of Geophysical Research vol 118, doi:10.1002/jgrc.20065. [published, refereed] ...global ocean circulation model was examined using results from years 2005-2009 of a seven and a half year 1/12.5° global simulation that resolves...internal tides, along with barotropic tides and the eddying general circulation. We examined tidal amplitudes computed using 18 183-day windows that
ERIC Educational Resources Information Center
ERIC Clearinghouse on Reading and Communication Skills, Urbana, IL.
This collection of abstracts is part of a continuing series providing information on recent doctoral dissertations. The 16 titles deal with a variety of topics, including the following: (1) a study of the meanings of experience of ten published feminist women writers; (2) the composing activities of computer literate writers; (3) the informational…
Modeling Laser Damage Thresholds Using the Thompson-Gerstman Model
2014-10-01
Gerstman model was intended to be a modular tool fit for integration into other computational models. This adds usability to the standalone code...Advanced Study Institute, Series A – Life Sciences, Vol. 34, pp. 77-97. New York: Plenum Press . 4. Birngruber, R., V.-P. Gabel and F. Hillenkamp...Random granule placement - varies with melnum. ; ii. Depth averaging or shadowing - varies with melnum. ; iii. T(r,t) single granule calc
Enhanced Lesion Visualization in Image-Guided Noninvasive Surgery With Ultrasound Phased Arrays
2001-10-25
81, 1995. [4] N. Sanghvi et al., “Noninvasive surgery of prostate tissue by high-intensity focused ultrasound ,” IEEE Trans. UFFC, vol. 43, no. 6, pp...ENHANCED LESION VISUALIZATION IN IMAGE-GUIDED NONINVASIVE SURGERY WITH ULTRASOUND PHASED ARRAYS Hui Yao, Pornchai Phukpattaranont and Emad S. Ebbini...Department of Electrical and Computer Engineering University of Minnesota Minneapolis, MN 55455 Abstract- We describe dual-mode ultrasound phased
Optimal Repair And Replacement Policy For A System With Multiple Components
2016-06-17
Numerical Demonstration To implement the linear program, we use the Python Programming Language (PSF 2016) with the Pyomo optimization modeling language...opre.1040.0133. Hart, W.E., C. Laird, J. Watson, D.L. Woodruff. 2012. Pyomo–optimization modeling in python , vol. 67. Springer Science & Business...Media. Hart, W.E., J. Watson, D.L. Woodruff. 2011. Pyomo: modeling and solving mathematical programs in python . Mathematical Programming Computation 3(3
Cumulative Reports and Publications through December 31, 1990.
1991-02-01
visiting scientists from universities and industry who have resident appointments for limited periods of time , and by consultants. Members of NASA’s...David M.: The cost of conservative synchronization in parallel discrete event simula- tions. ICASE Report No. 90-20, May 9, 1990, 31 pages. Submitted...Computing Conference, Charleston, South Carolina, Vol. II, pp. 1028-1037, April 1990. Saltz, Joel H., Ravi Mirchandaney and Kay Crowley: Run- time
Morphing Aircraft Structures: Research in AFRL/RB
2008-09-01
various iterative steps in the process, etc. The solver also internally controls the step size for integration, as this is independent of the step...Coupling of Substructures for Dynamic Analyses,” AIAA Journal , Vol. 6, No. 7, 1968, pp. 1313-1319. 2“Using the State-Dependent Modal Force (MFORCE),” AFL...an actuation system consisting of multiple internal actuators, centrally computer controlled to implement any commanded morphing configuration; and
Autonomous Robot Control via Autonomy Levels (ARCAL)
2015-08-21
same simulated objects. VRF includes a detailed graphical user interface (GUI) front end that subscribes to objects over HLA and renders them, along...forces.html 8. Gao, H., LI, Z., and Zhao, X., "The User -defined and Func- tion-strengthened for CGF of VR -Forces [J]." Computer Simulation, vol. 6...info Scout vehicle commands Scout vehicle Sensor measurements Mission vehicle Mission goals Operator interface Scout belief update Logistics
Autonomous Robot Control via Autonomy Levels (ARCAL)
2015-06-25
simulated objects. VRF includes a detailed graphical user interface (GUI) front end that subscribes to objects over HLA and renders them, along...forces.html 8. Gao, H., LI, Z., and Zhao, X., "The User -defined and Func- tion-strengthened for CGF of VR -Forces [J]." Computer Simulation, vol. 6, 2007...info Scout vehicle commands Scout vehicle Sensor measurements Mission vehicle Mission goals Operator interface Scout belief update Logistics executive
Humanoid Robots: A New Kind of Tool
2000-01-01
Breazeal (Ferrell), R. Irie, C. C. Kemp, M. J. Marjanovic , B. Scassellati, M. M. Williamson, Alternate Essences of Intelligence, AAAI 1998. 2 R. A. Brooks, C...Breazeal, M. J. Marjanovic , B. Scassellati, M. M. Williamson, The Cog Project: Building a Humanoid Robot, Computation for Metaphors, Analogy and...Functions, Vol. 608, 1990, New York Academy of Sciences, pp. 637-676. 7 M. J. Marjanovic , B. Scassellati, M. M. Williamson, Self-Taught Visually-Guided
Building the Joint Battlespace Infosphere. Volume 2: Interactive Information Technologies
1999-12-17
G. A . Vouros, “ A Knowledge- Based Methodology for Supporting Multilingual and User -Tailored Interfaces ,” Interacting With Computers, Vol. 9 (1998), p...project is to develop a two-handed user interface to the stereoscopic field analyzer, an interactive 3-D scientific visualization system. The...62 See http://www.hitl.washington.edu/research/vrd/. 63 R. Baumann and R. Clavel, “Haptic Interface for Virtual Reality Based
ERIC Educational Resources Information Center
Marcovitz, Alan B., Ed.
The method of phase-plane presentation as an educational tool in the study of the dynamic behavior of systems is discussed. In the treatment of nonlinear or piecewise-linear systems, the phase-plane portrait is used to exhibit the nature of singular points, regions of stability, and switching lines to aid comprehension. A technique is described by…
Three-Dimensional Shallow Water Acoustics
2015-09-30
converts the Helmholtz wave equation of elliptic type to a one-way wave equation of parabolic type. The conversion allows efficient marching solution ...algorithms for solving the boundary value problem posed by the Helmholtz equation. This can reduce significantly the requirement for computational...Fourier parabolic- equation sound propagation solution scheme," J. Acoust. Soc. Am, vol. 132, pp. EL61-EL67 (2012). [6] Y.-T. Lin, J.M. Collis and T.F
Analysis of SSEM Sensor Data Using BEAM
NASA Technical Reports Server (NTRS)
Zak, Michail; Park, Han; James, Mark
2004-01-01
A report describes analysis of space shuttle main engine (SSME) sensor data using Beacon-based Exception Analysis for Multimissions (BEAM) [NASA Tech Briefs articles, the two most relevant being "Beacon-Based Exception Analysis for Multimissions" (NPO-20827), Vol. 26, No. 9 (September 2002), page 32, and "Integrated Formulation of Beacon-Based Exception Analysis for Multimissions" (NPO-21126), Vol. 27, No. 3 (March 2003), page 74] for automated detection of anomalies. A specific implementation of BEAM, using the Dynamical Invariant Anomaly Detector (DIAD), is used to find anomalies commonly encountered during SSME ground test firings. The DIAD detects anomalies by computing coefficients of an autoregressive model and comparing them to expected values extracted from previous training data. The DIAD was trained using nominal SSME test-firing data. DIAD detected all the major anomalies including blade failures, frozen sense lines, and deactivated sensors. The DIAD was particularly sensitive to anomalies caused by faulty sensors and unexpected transients. The system offers a way to reduce SSME analysis time and cost by automatically indicating specific time periods, signals, and features contributing to each anomaly. The software described here executes on a standard workstation and delivers analyses in seconds, a computing time comparable to or faster than the test duration itself, offering potential for real-time analysis.
Hybrid evolutionary computing model for mobile agents of wireless Internet multimedia
NASA Astrophysics Data System (ADS)
Hortos, William S.
2001-03-01
The ecosystem is used as an evolutionary paradigm of natural laws for distributed information retrieval via mobile agents, allowing computational load to be shifted to server nodes of wireless networks while reducing traffic on communication links. Based on the Food Web model, a set of computational rules of natural balance forms the outer stage that controls the evolution of mobile agents providing multimedia services under a wireless Internet protocol (WIP). The evolutionary model shows how mobile agents should behave with the WIP; in particular, how mobile agents can cooperate, compete and learn from each other, based on an underlying competition for radio network resources to establish the wireless connections that support the quality of service (QoS) of user requests. Mobile agents are also allowed to clone themselves, propagate and communicate with other agents. A two-layer model is proposed for agent evolution: the outer layer is based on the law of natural balancing, while the inner layer is based on a discrete version of a Kohonen self-organizing feature map (SOFM) that distributes network resources to meet QoS requirements. The former is embedded in the higher OSI layers of the WIP, while the latter is used in the resource-management procedures of Layers 2 and 3 of the protocol. Algorithms for the distributed computation of mobile-agent evolutionary behavior are developed by adding a learning state to the agent evolution state diagram. When an agent is in an indeterminate state, it can communicate with other agents, and computing models can be replicated from other agents. The agent then transitions to the mutating state to wait for a new information-retrieval goal. When a wireless terminal or station lacks a network resource, an agent in the suspending state can change its policy to submit to the environment before it transitions to the searching state. The agents learn from agent state information entered into an external database.
In the cloning process, two agents on a host station sharing a common goal can be merged, or "married," to compose a new agent. The two-layer set of algorithms for mobile-agent evolution, performed in a distributed processing environment, is applied to the QoS management functions of the IP multimedia (IM) sub-network of the third-generation (3G) Wideband Code-Division Multiple Access (W-CDMA) wireless network.
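The inner-layer SOFM can be illustrated with a deliberately minimal one-dimensional Kohonen map (scalar inputs, nearest-neighbour updates). The paper's actual formulation over radio resources is not shown, so everything here is a generic sketch with invented parameters:

```python
import random

def train_sofm(samples, n_units, epochs=20, seed=0):
    """Minimal 1-D Kohonen self-organizing feature map for scalar
    inputs. Each unit holds one weight; the best-matching unit and
    its immediate neighbours move toward each presented input, with
    a learning rate that decays over epochs."""
    rng = random.Random(seed)
    weights = [rng.random() for _ in range(n_units)]
    for epoch in range(epochs):
        lr = 0.5 * (1.0 - epoch / epochs)   # decaying learning rate
        for x in samples:
            # Best-matching unit: the unit whose weight is closest.
            bmu = min(range(n_units), key=lambda i: abs(weights[i] - x))
            for i in range(max(0, bmu - 1), min(n_units, bmu + 2)):
                weights[i] += lr * (x - weights[i])
    return weights
```

Trained on demands clustered around two values, the map's units settle near those clusters, which is the behaviour a resource-allocation layer would exploit.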
Johansen, S; Reinertsen, K V; Knutstad, K; Olsen, D R; Fosså, S D
2011-06-09
To relate the development of post-treatment hypothyroidism to the dose distribution within the thyroid gland in breast cancer (BC) patients treated with loco-regional radiotherapy (RT). In two groups of BC patients postoperatively irradiated with computer tomography (CT)-based RT, the individual dose distributions in the thyroid gland were compared with each other: Cases developed post-treatment hypothyroidism after multimodal treatment including a 4-field RT technique, while matched patients in Controls remained free of hypothyroidism. Based on each patient's dose-volume histogram (DVH), the volume percentages of the thyroid absorbing 20, 30, 40 and 50 Gy, respectively, were estimated (V20, V30, V40 and V50), together with the individual mean thyroid dose over the whole gland (MeanTotGy). The mean and median thyroid dose for the included patients was about 30 Gy; subsequently, the total volume of the thyroid gland (VolTotGy) and the absolute volumes (cm3) receiving <30 Gy and ≥30 Gy, respectively, were calculated (Vol<30 and Vol≥30) and analyzed. No statistically significant inter-group differences were found between V20, V30, V40 and V50 or the median of MeanTotGy. The median VolTotGy in Controls was 2.3 times above VolTotGy in Cases (p = 0.003), with large inter-individual variations in both groups. The volume of the thyroid gland receiving <30 Gy in Controls was almost 2.5 times greater than the comparable figure in Cases. We conclude that in patients with small thyroid glands after loco-regional radiotherapy of BC, the risk of post-treatment hypothyroidism depends on the volume of the thyroid gland.
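The DVH-derived quantities named here (V20-V50, MeanTotGy, VolTotGy, Vol<30 and Vol≥30) are straightforward to compute from per-voxel doses. The sketch below assumes a uniform voxel volume; clinical treatment-planning systems compute these quantities with considerably more care:

```python
def dvh_metrics(voxel_doses_gy, voxel_volume_cm3,
                thresholds_gy=(20, 30, 40, 50)):
    """Per-patient dose metrics from per-voxel doses: VXX is the
    percentage of the organ receiving at least XX Gy; also returns
    the mean dose, total organ volume, and the absolute volumes
    below / at-or-above 30 Gy. Assumes one uniform voxel volume."""
    n = len(voxel_doses_gy)
    total_vol = n * voxel_volume_cm3
    metrics = {
        "MeanTotGy": sum(voxel_doses_gy) / n,
        "VolTotGy_cm3": total_vol,
    }
    for thr in thresholds_gy:
        hit = sum(1 for d in voxel_doses_gy if d >= thr)
        metrics[f"V{thr}"] = 100.0 * hit / n      # percent of organ
    above30 = sum(1 for d in voxel_doses_gy if d >= 30) * voxel_volume_cm3
    metrics["Vol>=30_cm3"] = above30
    metrics["Vol<30_cm3"] = total_vol - above30
    return metrics
```

For instance, four voxels of 0.5 cm3 with doses 10, 25, 35 and 50 Gy give V20 = 75%, V30 = 50%, a mean dose of 30 Gy, and 1.0 cm3 on each side of the 30 Gy cut.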
NASA Technical Reports Server (NTRS)
Noor, Ahmed K. (Editor); Venneri, Samuel L. (Editor)
1993-01-01
Various papers on flight vehicle materials, structures, and dynamics are presented. Individual topics addressed include: general modeling methods, component modeling techniques, time-domain computational techniques, dynamics of articulated structures, structural dynamics in rotating systems, structural dynamics in rotorcraft, damping in structures, structural acoustics, structural design for control, structural modeling for control, control strategies for structures, system identification, overall assessment of needs and benefits in structural dynamics and controlled structures. Also discussed are: experimental aeroelasticity in wind tunnels, aeroservoelasticity, nonlinear aeroelasticity, aeroelasticity problems in turbomachines, rotary-wing aeroelasticity with application to VTOL vehicles, computational aeroelasticity, structural dynamic testing and instrumentation.
From Conception to Birth: The Forces Responsible for AFCyber’s Evolution
2014-06-01
matter how good or bad my days were – and I experienced a fair number of both during the 11-month course – she provided a shoulder to cry on, a...Robert J. Lamb , “Joint Task Force for Computer Network Defense,” IA Newsletter, Winter 98/99, Vol 2, No. 3, http://www.iwar.org.uk/infocon/dtic‐ia...Future of Warfare.” Real Clear Defense, 24 February 2014. Lamb , Robert J. “Joint Task Force for Computer Network Defense.” IA Newsletter, Winter 98
Consistent Parameter and Transfer Function Estimation using Context Free Grammars
NASA Astrophysics Data System (ADS)
Klotz, Daniel; Herrnegger, Mathew; Schulz, Karsten
2017-04-01
This contribution presents a method for the inference of transfer functions for rainfall-runoff models. Here, transfer functions are defined as parametrized (functional) relationships between a set of spatial predictors (e.g. elevation, slope or soil texture) and model parameters. They are ultimately used to estimate consistent, spatially distributed model parameters from a limited number of lumped global parameters. Additionally, they provide a straightforward method for parameter extrapolation from one set of basins to another and can even be used to derive parameterizations for multi-scale models [see: Samaniego et al., 2010]. Yet current practice often implicitly assumes that the transfer functions are known, when in fact these hypothesized transfer functions can rarely be measured and mostly remain unknown. Therefore, this contribution presents a general method for the concurrent estimation of the structure of transfer functions and their respective (global) parameters; note that, as a consequence, the distributed parameters of the rainfall-runoff model are estimated as well. The method combines two steps to achieve this: the first generates different possible transfer functions; the second then estimates the respective global transfer-function parameters. The structural estimation of the transfer functions is based on the context-free grammar concept. Chomsky first introduced context-free grammars in linguistics [Chomsky, 1956]. Since then, they have been widely applied in computer science but, to the knowledge of the authors, they have so far not been used in hydrology. Therefore, the contribution gives an introduction to context-free grammars and shows how they can be constructed and used for the structural inference of transfer functions.
This is enabled by new methods from evolutionary computation, such as grammatical evolution [O'Neill, 2001], which make it possible to exploit the constructed grammar as a search space for equations. The parametrization of the transfer functions is then achieved through a second optimization routine. The contribution explores different aspects of the described procedure through a set of experiments, which can be divided into three categories: (1) the inference of transfer functions from directly measurable parameters; (2) the estimation of global parameters for given transfer functions from runoff data; and (3) the estimation of sets of completely unknown transfer functions from runoff data. The conducted tests reveal different potentials and limits of the procedure. In particular, it is shown that cases (1) and (2) work remarkably well, whereas case (3) is much more dependent on the setup; in general, much more data is needed in that case to derive transfer-function estimates, even for simple models and setups. References: Chomsky, N. (1956): Three Models for the Description of Language. IRE Transactions on Information Theory, 2(3), pp. 113-124. O'Neill, M. (2001): Grammatical Evolution. IEEE Transactions on Evolutionary Computation, Vol. 5, No. 4. Samaniego, L., Kumar, R., Attinger, S. (2010): Multiscale parameter regionalization of a grid-based hydrologic model at the mesoscale. Water Resources Research, Vol. 46, W05523, doi:10.1029/2008WR007327.
Huang, Lei; Liao, Li; Wu, Cathy H.
2016-01-01
Revealing the underlying evolutionary mechanism plays an important role in understanding protein interaction networks in the cell. While many evolutionary models have been proposed, applying these models to real network data, and especially determining which model better describes the evolutionary process behind an observed network, remains a challenge. The traditional approach is to use a model with presumed parameters to generate a network and then evaluate the fit by summary statistics, which, however, cannot capture complete network-structure information or estimate parameter distributions. In this work we developed a novel method based on Approximate Bayesian Computation and modified Differential Evolution (ABC-DEP) that is capable of conducting model selection and parameter estimation simultaneously and of detecting the underlying evolutionary mechanisms more accurately. We tested our method's power to differentiate models and estimate parameters on simulated data and found significant improvement in performance benchmarks, as compared with a previous method. We further applied our method to real protein interaction networks in human and yeast. Our results show the Duplication-Attachment model as the predominant evolutionary mechanism for the human PPI network and the Scale-Free model as the predominant mechanism for the yeast PPI network. PMID:26357273
Cornuet, Jean-Marie; Santos, Filipe; Beaumont, Mark A.; Robert, Christian P.; Marin, Jean-Michel; Balding, David J.; Guillemaud, Thomas; Estoup, Arnaud
2008-01-01
Summary: Genetic data obtained on population samples convey information about their evolutionary history. Inference methods can extract part of this information but they require sophisticated statistical techniques that have been made available to the biologist community (through computer programs) only for simple and standard situations typically involving a small number of samples. We propose here a computer program (DIY ABC) for inference based on approximate Bayesian computation (ABC), in which scenarios can be customized by the user to fit many complex situations involving any number of populations and samples. Such scenarios involve any combination of population divergences, admixtures and population size changes. DIY ABC can be used to compare competing scenarios, estimate parameters for one or more scenarios and compute bias and precision measures for a given scenario and known values of parameters (the current version applies to unlinked microsatellite data). This article describes key methods used in the program and provides its main features. The analysis of one simulated and one real dataset, both with complex evolutionary scenarios, illustrates the main possibilities of DIY ABC. Availability: The software DIY ABC is freely available at http://www.montpellier.inra.fr/CBGP/diyabc. Contact: j.cornuet@imperial.ac.uk Supplementary information: Supplementary data are also available at http://www.montpellier.inra.fr/CBGP/diyabc PMID:18842597
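DIY ABC wraps this idea in a full scenario-building program; stripped to its core, approximate Bayesian computation by rejection looks like the sketch below. The coin-flip example is a toy stand-in for a population-genetic simulator, and all names are illustrative, not DIY ABC's interface:

```python
import random

def abc_rejection(observed_stat, simulate, prior_sample, n_draws, tol, seed=0):
    """Plain ABC rejection sampling: draw a parameter from the prior,
    simulate data, and keep the parameter whenever the simulated
    summary statistic lands within `tol` of the observed one."""
    rng = random.Random(seed)
    accepted = []
    for _ in range(n_draws):
        theta = prior_sample(rng)
        if abs(simulate(theta, rng) - observed_stat) <= tol:
            accepted.append(theta)
    return accepted

# Toy example: infer the success probability behind 100 trials,
# using the number of successes as the summary statistic.
def simulate(p, rng):
    return sum(rng.random() < p for _ in range(100))

posterior = abc_rejection(
    observed_stat=70,
    simulate=simulate,
    prior_sample=lambda rng: rng.random(),   # Uniform(0, 1) prior
    n_draws=2000,
    tol=3,
)
```

The accepted draws approximate the posterior; in a real ABC analysis the simulator encodes the demographic scenario and the summary statistics are multivariate.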
Evolutionary Study of Interethnic Cooperation
NASA Astrophysics Data System (ADS)
Kvasnicka, Vladimir; Pospichal, Jiri
The purpose of this communication is to present an evolutionary study of cooperation between two ethnic groups. The model used is inspired by the seminal paper of J. D. Fearon and D. D. Laitin (Explaining Interethnic Cooperation, American Political Science Review, 90 (1996), pp. 715-735), in which the iterated prisoner's dilemma was used to model intra- and interethnic interactions. We reformulated their approach as an evolutionary prisoner's dilemma method, in which a population of strategies is evolved by a simple reproduction process with a Darwinian metaphor of natural selection (the probability of selection for reproduction is proportional to fitness). Our computer simulations show that applying a principle of collective guilt does not lead to the emergence of interethnic cooperation. When an administrator is introduced, an emergence of interethnic cooperation may be observed. Furthermore, if the ethnic groups are of very different sizes, the principle of collective guilt may be devastating for the smaller group, to the point that intraethnic cooperation is destroyed. The second strategy of cooperation is called personal responsibility, whereby agents that defected in interethnic interactions are punished inside their own ethnic groups. This means that, unlike the principle of collective guilt, there exists only one type of punishment; loosely speaking, agents are punished "personally." All the substantial computational results were checked and interpreted analytically within the theory of evolutionarily stable strategies. Moreover, this theoretical approach offers mechanisms of simple scenarios explaining why some particular strategies are stable or not.
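The evolutionary prisoner's-dilemma machinery described here can be sketched in a few lines. The payoffs are the standard ones; the two strategies and all parameters are illustrative and do not reproduce the paper's ethnic-interaction model:

```python
import random

# Payoffs for the row player: (my_move, their_move) -> score.
PAYOFF = {("C", "C"): 3, ("C", "D"): 0, ("D", "C"): 5, ("D", "D"): 1}

STRATEGIES = {
    "always_defect": lambda history: "D",
    "tit_for_tat": lambda history: history[-1] if history else "C",
}

def play(name_a, name_b, rounds=10):
    """Iterated prisoner's dilemma between two named strategies;
    each strategy sees the opponent's past moves."""
    hist_a, hist_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = STRATEGIES[name_a](hist_a)
        move_b = STRATEGIES[name_b](hist_b)
        score_a += PAYOFF[(move_a, move_b)]
        score_b += PAYOFF[(move_b, move_a)]
        hist_a.append(move_b)
        hist_b.append(move_a)
    return score_a, score_b

def evolve(population, generations=30, seed=0):
    """Darwinian reproduction: selection probability proportional to
    total score earned against the current population (self included)."""
    rng = random.Random(seed)
    for _ in range(generations):
        fitness = [sum(play(me, other)[0] for other in population)
                   for me in population]
        population = rng.choices(population, weights=fitness,
                                 k=len(population))
    return population
```

Starting from a mostly reciprocating population, fitness-proportional reproduction tends to drive unconditional defectors out, the kind of dynamics the abstract's simulations examine at larger scale.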
Pareto-optimal phylogenetic tree reconciliation
Libeskind-Hadas, Ran; Wu, Yi-Chieh; Bansal, Mukul S.; Kellis, Manolis
2014-01-01
Motivation: Phylogenetic tree reconciliation is a widely used method for reconstructing the evolutionary histories of gene families and species, hosts and parasites and other dependent pairs of entities. Reconciliation is typically performed using maximum parsimony, in which each evolutionary event type is assigned a cost and the objective is to find a reconciliation of minimum total cost. It is generally understood that reconciliations are sensitive to event costs, but little is understood about the relationship between event costs and solutions. Moreover, choosing appropriate event costs is a notoriously difficult problem. Results: We address this problem by giving an efficient algorithm for computing Pareto-optimal sets of reconciliations, thus providing the first systematic method for understanding the relationship between event costs and reconciliations. This, in turn, results in new techniques for computing event support values and, for cophylogenetic analyses, performing robust statistical tests. We provide new software tools and demonstrate their use on a number of datasets from evolutionary genomic and cophylogenetic studies. Availability and implementation: Our Python tools are freely available at www.cs.hmc.edu/∼hadas/xscape. Contact: mukul@engr.uconn.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24932009
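For a finite candidate set, Pareto-optimality reduces to filtering out dominated cost vectors. The sketch below uses hypothetical (duplication-cost, loss-cost) totals; the xscape tools compute such sets over whole ranges of event-cost ratios, which this toy does not attempt:

```python
def pareto_front(points):
    """Return the cost vectors not dominated by any other: a point
    dominates another when it is no worse in every component and
    strictly better in at least one (lower is better)."""
    def dominates(p, q):
        return (all(a <= b for a, b in zip(p, q))
                and any(a < b for a, b in zip(p, q)))
    return [p for p in points if not any(dominates(q, p) for q in points)]

# Hypothetical per-reconciliation (duplication, loss) cost totals.
candidates = [(3, 7), (5, 5), (4, 6), (6, 6), (7, 3)]
front = pareto_front(candidates)   # (6, 6) is dominated by (5, 5)
```

Each surviving point corresponds to a reconciliation that is optimal for some choice of event costs, which is exactly why examining the whole front decouples the analysis from any single cost assignment.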
NASA Astrophysics Data System (ADS)
Rao, Dhananjai M.; Chernyakhovsky, Alexander; Rao, Victoria
2008-05-01
Humanity is facing an increasing number of highly virulent and communicable diseases such as avian influenza. Researchers believe that avian influenza has potential to evolve into one of the deadliest pandemics. Combating these diseases requires in-depth knowledge of their epidemiology. An effective methodology for discovering epidemiological knowledge is to utilize a descriptive, evolutionary, ecological model and use bio-simulations to study and analyze it. These types of bio-simulations fall under the category of computational evolutionary methods because the individual entities participating in the simulation are permitted to evolve in a natural manner by reacting to changes in the simulated ecosystem. This work describes the application of the aforementioned methodology to discover epidemiological knowledge about avian influenza using a novel eco-modeling and bio-simulation environment called SEARUMS. The mathematical principles underlying SEARUMS, its design, and the procedure for using SEARUMS are discussed. The bio-simulations and multi-faceted case studies conducted using SEARUMS elucidate its ability to pinpoint timelines, epicenters, and socio-economic impacts of avian influenza. This knowledge is invaluable for proactive deployment of countermeasures in order to minimize negative socioeconomic impacts, combat the disease, and avert a pandemic.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rao, Dhananjai M.; Chernyakhovsky, Alexander; Rao, Victoria
2008-05-08
Humanity is facing an increasing number of highly virulent and communicable diseases such as avian influenza. Researchers believe that avian influenza has potential to evolve into one of the deadliest pandemics. Combating these diseases requires in-depth knowledge of their epidemiology. An effective methodology for discovering epidemiological knowledge is to utilize a descriptive, evolutionary, ecological model and use bio-simulations to study and analyze it. These types of bio-simulations fall under the category of computational evolutionary methods because the individual entities participating in the simulation are permitted to evolve in a natural manner by reacting to changes in the simulated ecosystem. This work describes the application of the aforementioned methodology to discover epidemiological knowledge about avian influenza using a novel eco-modeling and bio-simulation environment called SEARUMS. The mathematical principles underlying SEARUMS, its design, and the procedure for using SEARUMS are discussed. The bio-simulations and multi-faceted case studies conducted using SEARUMS elucidate its ability to pinpoint timelines, epicenters, and socio-economic impacts of avian influenza. This knowledge is invaluable for proactive deployment of countermeasures in order to minimize negative socioeconomic impacts, combat the disease, and avert a pandemic.
The tangled bank of amino acids.
Goldstein, Richard A; Pollock, David D
2016-07-01
The use of amino acid substitution matrices to model protein evolution has yielded important insights into both the evolutionary process and the properties of specific protein families. In order to make these models tractable, standard substitution matrices represent the average results of the evolutionary process rather than the underlying molecular biophysics and population genetics, treating proteins as a set of independently evolving sites rather than as an integrated biomolecular entity. With advances in computing and the increasing availability of sequence data, we now have an opportunity to move beyond current substitution matrices to more interpretable mechanistic models with greater fidelity to the evolutionary process of mutation and selection and the holistic nature of the selective constraints. As part of this endeavour, we consider how epistatic interactions induce spatial and temporal rate heterogeneity, and demonstrate how these generally ignored factors can reconcile standard substitution rate matrices and the underlying biology, allowing us to better understand the meaning of these substitution rates. Using computational simulations of protein evolution, we can demonstrate the importance of both spatial and temporal heterogeneity in modelling protein evolution. © 2016 The Authors Protein Science published by Wiley Periodicals, Inc. on behalf of The Protein Society.
Evolving binary classifiers through parallel computation of multiple fitness cases.
Cagnoni, Stefano; Bergenti, Federico; Mordonini, Monica; Adorni, Giovanni
2005-06-01
This paper describes two versions of a novel approach to developing binary classifiers, based on two evolutionary computation paradigms: cellular programming and genetic programming. Such an approach achieves high computation efficiency both during evolution and at runtime. Evolution speed is optimized by allowing multiple solutions to be computed in parallel. Runtime performance is optimized explicitly using parallel computation in the case of cellular programming or implicitly taking advantage of the intrinsic parallelism of bitwise operators on standard sequential architectures in the case of genetic programming. The approach was tested on a digit recognition problem and compared with a reference classifier.
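The "intrinsic parallelism of bitwise operators" mentioned above can be sketched in a few lines: packing one fitness case per bit lets a single machine-word expression evaluate a candidate classifier on all cases simultaneously. The toy target concept (XOR) and the packing scheme are illustrative assumptions, not the paper's actual setup.

```python
# One fitness case per bit: a single bitwise expression evaluates the
# candidate classifier on all packed cases simultaneously.
cases = [  # (x1, x2, expected) rows for a toy target concept: x1 XOR x2
    (0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0),
    (0, 0, 0), (1, 1, 0), (0, 1, 1), (1, 0, 1),
]

def pack(bits):
    # Pack a list of 0/1 values into the bits of one integer.
    v = 0
    for i, b in enumerate(bits):
        v |= b << i
    return v

x1 = pack([c[0] for c in cases])
x2 = pack([c[1] for c in cases])
target = pack([c[2] for c in cases])

candidate = x1 ^ x2                       # evaluated on all 8 cases at once
errors = bin(candidate ^ target).count("1")
print(errors)  # 0: the candidate matches every fitness case
```

On a 64-bit architecture the same expression would score 64 fitness cases per operation, which is the implicit speedup the abstract describes for the genetic-programming variant.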
Computational Intelligence and Its Impact on Future High-Performance Engineering Systems
NASA Technical Reports Server (NTRS)
Noor, Ahmed K. (Compiler)
1996-01-01
This document contains presentations from the joint UVA/NASA Workshop on Computational Intelligence held at the Virginia Consortium of Engineering and Science Universities, Hampton, Virginia, June 27-28, 1995. The presentations addressed activities in the areas of fuzzy logic, neural networks, and evolutionary computations. Workshop attendees represented NASA, the National Science Foundation, the Department of Energy, National Institute of Standards and Technology (NIST), the Jet Propulsion Laboratory, industry, and academia. The workshop objectives were to assess the state of technology in the Computational intelligence area and to provide guidelines for future research.
Explicit Building Block Multiobjective Evolutionary Computation: Methods and Applications
2005-06-16
ERIC Educational Resources Information Center
Lamb, Richard L.; Firestone, Jonah B.
2017-01-01
Conflicting explanations and unrelated information in science classrooms increase cognitive load and decrease efficiency in learning. This reduced efficiency ultimately limits students' ability to solve reasoning problems in the sciences. In reasoning, it is the ability of students to sift through and identify critical pieces of information that is of…
2010-11-01
Conducted with glider flight instructors. The roles that flight experience and risk-taking propensity play in anticipating ... were also examined. Cross-sectional data were obtained by observing 144 glider flight instructors, active or not, working at five gliding centres. Summary: Flight experience, risk taking, and hazardous attitudes of glider flight instructors (Ann-Renee Blais).
Growth Control and Disease Mechanisms in Computational Embryogeny
NASA Technical Reports Server (NTRS)
Shapiro, Andrew A.; Yogev, Or; Antonsson, Erik K.
2008-01-01
This paper presents a novel approach to applying growth control and disease mechanisms in computational embryogeny. Our method, which mimics fundamental processes from biology, enables individuals to reach maturity in a controlled process within a stochastic environment. Three mechanisms were implemented: disease mechanisms, gene suppression, and thermodynamic balancing. This approach was integrated into a structural evolutionary model that evolved continuum 3-D structures supporting an external load. Using these mechanisms, we were able to evolve individuals that reached a fixed size limit through the growth process, which was an integral part of the complete development process. The size of the individuals was determined purely by the evolutionary process, with different individuals maturing to different sizes. Individuals evolved with these characteristics have been found to be very robust in supporting a wide range of external loads.
Can An Evolutionary Process Create English Text?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bailey, David H.
Critics of the conventional theory of biological evolution have asserted that while natural processes might result in some limited diversity, nothing fundamentally new can arise from 'random' evolution. In response, biologists such as Richard Dawkins have demonstrated that a computer program can generate a specific short phrase via evolution-like iterations starting with random gibberish. While such demonstrations are intriguing, they are flawed in that they have a fixed, pre-specified future target, whereas in real biological evolution there is no fixed future target, but only a complicated 'fitness landscape'. In this study, a significantly more sophisticated evolutionary scheme is employed to produce text segments reminiscent of a Charles Dickens novel. The aggregate size of these segments is larger than the computer program and the input Dickens text, even when comparing compressed data (as a measure of information content).
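The fixed-target demonstration attributed to Dawkins (the "weasel" program) can be sketched as follows. Note this is the flawed baseline the abstract contrasts against, not the study's own scheme; the target string, alphabet, population size, and mutation rate are the conventional illustrative choices.

```python
import random

# Classic fixed-target demonstration: hill-climbing toward a preset phrase.
TARGET = "METHINKS IT IS LIKE A WEASEL"
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "

def score(s):
    # Number of positions matching the fixed target.
    return sum(a == b for a, b in zip(s, TARGET))

def mutate(s, rate=0.05):
    # Each character is replaced with a random one with probability `rate`.
    return "".join(random.choice(ALPHABET) if random.random() < rate else c
                   for c in s)

random.seed(0)
parent = "".join(random.choice(ALPHABET) for _ in TARGET)
generations = 0
while score(parent) < len(TARGET):
    offspring = [mutate(parent) for _ in range(100)]
    parent = max(offspring + [parent], key=score)  # elitist selection
    generations += 1
print(parent)
```

Because the "fitness function" is distance to a pre-specified string, the program converges quickly; it is exactly this pre-specification that the abstract identifies as the flaw real evolution does not share.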
Evolution and Vaccination of Influenza Virus.
Lam, Ham Ching; Bi, Xuan; Sreevatsan, Srinand; Boley, Daniel
2017-08-01
In this study, we present an application paradigm in which an unsupervised machine learning approach is applied to the high-dimensional influenza genetic sequences to investigate whether vaccine is a driving force to the evolution of influenza virus. We first used a visualization approach to visualize the evolutionary paths of vaccine-controlled and non-vaccine-controlled influenza viruses in a low-dimensional space. We then quantified the evolutionary differences between their evolutionary trajectories through the use of within- and between-scatter matrices computation to provide the statistical confidence to support the visualization results. We used the influenza surface Hemagglutinin (HA) gene for this study as the HA gene is the major target of the immune system. The visualization is achieved without using any clustering methods or prior information about the influenza sequences. Our results clearly showed that the evolutionary trajectories between vaccine-controlled and non-vaccine-controlled influenza viruses are different and vaccine as an evolution driving force cannot be completely eliminated.
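The within- and between-scatter matrices used to quantify trajectory differences can be computed as in classical discriminant analysis. The sketch below assumes the standard definitions; the toy 2-D points stand in for the study's low-dimensional embedding of HA sequences.

```python
import numpy as np

def scatter_matrices(X, labels):
    # Within-class (Sw) and between-class (Sb) scatter matrices for
    # samples X (n_samples x n_features) with one class label per row.
    X = np.asarray(X, dtype=float)
    labels = np.asarray(labels)
    mu = X.mean(axis=0)
    d = X.shape[1]
    Sw = np.zeros((d, d))
    Sb = np.zeros((d, d))
    for c in np.unique(labels):
        Xc = X[labels == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)        # spread around each class mean
        diff = (mc - mu)[:, None]
        Sb += len(Xc) * (diff @ diff.T)      # class-mean spread around global mean
    return Sw, Sb

# Two toy "trajectories" in a 2-D embedding:
X = [[0, 0], [0, 2], [4, 0], [4, 2]]
labels = [0, 0, 1, 1]
Sw, Sb = scatter_matrices(X, labels)
```

A large Sb relative to Sw indicates well-separated trajectories, which is the statistical confidence the visual comparison is backed by.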
The evolutionary dynamics of language.
Steels, Luc; Szathmáry, Eörs
2018-02-01
The well-established framework of evolutionary dynamics can be applied to the fascinating open problems how human brains are able to acquire and adapt language and how languages change in a population. Schemas for handling grammatical constructions are the replicating unit. They emerge and multiply with variation in the brains of individuals and undergo selection based on their contribution to needed expressive power, communicative success and the reduction of cognitive effort. Adopting this perspective has two major benefits. (i) It makes a bridge to neurobiological models of the brain that have also adopted an evolutionary dynamics point of view, thus opening a new horizon for studying how human brains achieve the remarkably complex competence for language. And (ii) it suggests a new foundation for studying cultural language change as an evolutionary dynamics process. The paper sketches this novel perspective, provides references to empirical data and computational experiments, and points to open problems. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
Tamura, Koichiro; Tao, Qiqing; Kumar, Sudhir
2018-01-01
Abstract RelTime estimates divergence times by relaxing the assumption of a strict molecular clock in a phylogeny. It shows excellent performance in estimating divergence times for both simulated and empirical molecular sequence data sets in which evolutionary rates varied extensively throughout the tree. RelTime is computationally efficient and scales well with increasing size of data sets. Until now, however, RelTime has not had a formal mathematical foundation. Here, we show that the basis of the RelTime approach is a relative rate framework (RRF) that combines comparisons of evolutionary rates in sister lineages with the principle of minimum rate change between evolutionary lineages and their respective descendants. We present analytical solutions for estimating relative lineage rates and divergence times under RRF. We also discuss the relationship of RRF with other approaches, including the Bayesian framework. We conclude that RelTime will be useful for phylogenies with branch lengths derived not only from molecular data, but also morphological and biochemical traits. PMID:29893954
Evolutionary Algorithms for Boolean Functions in Diverse Domains of Cryptography.
Picek, Stjepan; Carlet, Claude; Guilley, Sylvain; Miller, Julian F; Jakobovic, Domagoj
2016-01-01
The role of Boolean functions is prominent in several areas including cryptography, sequences, and coding theory. Therefore, various methods for the construction of Boolean functions with desired properties are of direct interest. New motivations on the role of Boolean functions in cryptography with attendant new properties have emerged over the years. There are still many combinations of design criteria left unexplored and in this matter evolutionary computation can play a distinct role. This article concentrates on two scenarios for the use of Boolean functions in cryptography. The first uses Boolean functions as the source of the nonlinearity in filter and combiner generators. Although relatively well explored using evolutionary algorithms, it still presents an interesting goal in terms of the practical sizes of Boolean functions. The second scenario appeared rather recently where the objective is to find Boolean functions that have various orders of the correlation immunity and minimal Hamming weight. In both these scenarios we see that evolutionary algorithms are able to find high-quality solutions where genetic programming performs the best.
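For concreteness, the cryptographic properties discussed above are typically evaluated from a Boolean function's Walsh-Hadamard spectrum. The sketch below uses that standard definition on a 3-variable example; the function is illustrative and far below the practical sizes the article targets.

```python
def walsh_spectrum(tt, n):
    # Walsh-Hadamard coefficients of a Boolean function given as a
    # truth table tt of length 2**n (entries 0/1); naive O(4**n) version.
    spec = []
    for w in range(2 ** n):
        total = 0
        for x in range(2 ** n):
            dot = bin(x & w).count("1") & 1   # parity of the inner product w.x
            total += (-1) ** (tt[x] ^ dot)
        spec.append(total)
    return spec

def nonlinearity(tt, n):
    # Hamming distance to the nearest affine function.
    return 2 ** (n - 1) - max(abs(c) for c in walsh_spectrum(tt, n)) // 2

# Illustrative 3-variable function f(x1, x2, x3) = x1*x2 XOR x3.
tt = [((x >> 2) & 1) & ((x >> 1) & 1) ^ (x & 1) for x in range(8)]
print(nonlinearity(tt, 3), sum(tt))  # nonlinearity 2, Hamming weight 4
```

An evolutionary algorithm in this setting would use such property computations inside its fitness function, trading the naive transform for the fast Walsh-Hadamard transform at realistic sizes.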
Evolutionary Construction of Block-Based Neural Networks in Consideration of Failure
NASA Astrophysics Data System (ADS)
Takamori, Masahito; Koakutsu, Seiichi; Hamagami, Tomoki; Hirata, Hironori
In this paper we propose a modified gene coding and a failure-aware evolutionary construction method for Block-Based Neural Networks (BBNNs). In the modified gene coding, the genes for weights are arranged on the chromosome according to the positional relation between the weight and structure genes. This increases the efficiency of search by crossover, which is expected to improve the convergence rate of construction and shorten construction time. In the failure-aware evolutionary construction, a structure adapted to a failure is built in the state in which the failure occurred, so that a BBNN can be reconstructed in a short time when a failure occurs. To evaluate the proposed method, we apply it to pattern classification and autonomous mobile robot control problems. The computational experiments indicate that the proposed method can improve the convergence rate of construction and shorten construction and reconstruction times.
Computer-Based Instruction Authoring Tools System (CATS): Lesson Maintenance
1990-07-01
V. N. Hutton, Michael R. Flaningam, Barbara Tarker, Ann Rybowiak, Susan Sulzbach, Mark Lyon, Brian Thomason. Navy Personnel Research and Development Center. Program Element 0604722A.
1990-06-30
Use of Monte-Carlo Simulations in Polyurethane Polymerization Processes
1995-11-01
In such situations, the mechanisms of molecular species diffusion must be considered. Gupta et al. (Ref. 10) have demonstrated the use of Monte-Carlo simulations in this setting.
Investigating Mental Workload Changes in a Long Duration Supervisory Control Task
2015-05-06
Recent Naval Postgraduate School Publications
1988-08-30
1985-02-01
Privacy Analysis of the Internet Protocol
2002-12-01
The design follows the mixing approach first proposed for e-mail by David Chaum [Cha81]. The Onion Routing system maintains a set of mixing centers called onion routers.
Recent Naval Postgraduate School Publications.
1982-04-01
Information Extraction from Large-Multi-Layer Social Networks
2015-08-06
Methods that fall into this category include spectral algorithms, modularity methods, and methods that rely on statistical inference.
HERALD OF COMMUNICATIONS, 1963, VOL. 23, NO. 3 (275).
Topics include: communication facilities and computer techniques that help direct the industry; carrier-telephony equipment of the type KV-12; widening of ...; facsimile apparatuses for the elimination of flaws in the processing of telegrams; public supervision of the performance quality of communication workers; a simplified cable finder; results of the competition for the best suggestion in the field of postal-service mechanization; and the training of postal workers at a polytechnic school.
Computer Center CDC Libraries/NSRDC (Subprograms).
1981-02-01
A FORTRAN IV plotting subprogram (functional category J5) supporting up to 9 dependent variables per plot. Usage: COMMON /PLO/ NRUN, NPLOT, ITP(6), ITY(6), ITX(6), where NRUN is the number of this run (default: 1), NPLOT the number of the plot (default: 1), and ITP, ITY, and ITX the page, Y, and X titles (default: blank).
Technology Demonstration of the Zero Emissions Chromium Electroplating System
2008-02-01
Phase I analyzed trivalent chromium; Phase II analyzed total chromium in PRD fluid and iron (Figure 16 shows the Phase II iron results). Each sample was analyzed twice, and an average was computed; Figure 17 shows the results. (ERDC/CERL TR-05-35, Vol. 1.)
Control Demonstration of a Thin Deformable In-Plane Actuated Mirror
2006-03-01
A four-quadrant electrode grid sitting behind a pre-shaped membrane mirror uses electrostatic forces to deform the surface. The dSPACE computer system was chosen to receive the Wavescope data for its MATLAB and Simulink capabilities and is stocked with a UART (Universal Asynchronous Receiver/Transmitter).
Fault Tolerance for Fight Through (FTFT)
2013-02-01
Now this information pyramid is being inverted: the lowest, most populated level is being "elevated."
1993-12-01
Mechanical Engineering Associate, PhD; Laboratory: PL/VT; University of Texas at San Antonio, San Antonio, TX. The modules can be primitive or compound: primitive modules represent the elementary computation units and define their interfaces. Performance remained linear under varying conditions for the range of processor numbers.
Parallel-Computing Architecture for JWST Wavefront-Sensing Algorithms
2011-09-01
Additional hardware tests yield diminishing results due to the increasing cost and complexity of each test. Phase retrieval is an image-based wavefront-sensing technique. For broadband illumination problems, hand-tuning the right matrix sizes has been found to account for a speedup of 86x.
Working Papers in Dialogue Modeling. Volume 1
1977-01-01
The participants communicated remotely by typing into computer terminals using the TENEX "link" facility: whatever either person types appears ...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Capellen, J.; Svec, H.J.; Sage, C.R.
1975-08-01
This report covers the year 1972, and lists approximately 10,000 articles of interest to mass spectroscopists. This two-volume report consists of three sections. Vol. I contains bibliography and author index sections. The bibliography section lists the authors, the title, and the publication data for each article. The author index lists the authors' names and the reference numbers of their articles. (auth)
2017-02-17
A New Interface Specification Methodology and its Application to Transducer Synthesis
1988-05-01
Design descriptions span three domains: behavioral, structural, and physical. Within each domain, descriptive methods are distinguished by the level of abstraction they emphasize. The Gajski-Kuhn Y-chart's three axes correspond to these three domains for describing designs.
Deductive Synthesis of the Unification Algorithm,
1981-06-01
Zohar Manna (Computer Science Department) and Richard Waldinger (Artificial Intelligence Center).
Recent Naval Postgraduate School Publications.
1981-05-01
XTALOPT: An open-source evolutionary algorithm for crystal structure prediction
NASA Astrophysics Data System (ADS)
Lonie, David C.; Zurek, Eva
2011-02-01
The implementation and testing of XTALOPT, an evolutionary algorithm for crystal structure prediction, is outlined. We present our new periodic displacement (ripple) operator which is ideally suited to extended systems. It is demonstrated that hybrid operators, which combine two pure operators, reduce the number of duplicate structures in the search. This allows for better exploration of the potential energy surface of the system in question, while simultaneously zooming in on the most promising regions. A continuous workflow, which makes better use of computational resources as compared to traditional generation based algorithms, is employed. Various parameters in XTALOPT are optimized using a novel benchmarking scheme. XTALOPT is available under the GNU Public License, has been interfaced with various codes commonly used to study extended systems, and has an easy to use, intuitive graphical interface. Program summaryProgram title:XTALOPT Catalogue identifier: AEGX_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEGX_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GPL v2.1 or later [1] No. of lines in distributed program, including test data, etc.: 36 849 No. of bytes in distributed program, including test data, etc.: 1 149 399 Distribution format: tar.gz Programming language: C++ Computer: PCs, workstations, or clusters Operating system: Linux Classification: 7.7 External routines: QT [2], OpenBabel [3], AVOGADRO [4], SPGLIB [8] and one of: VASP [5], PWSCF [6], GULP [7]. Nature of problem: Predicting the crystal structure of a system from its stoichiometry alone remains a grand challenge in computational materials science, chemistry, and physics. Solution method: Evolutionary algorithms are stochastic search techniques which use concepts from biological evolution in order to locate the global minimum on their potential energy surface. 
Our evolutionary algorithm, XTALOPT, is freely available to the scientific community for use and collaboration under the GNU Public License. Running time: User dependent. The program runs until stopped by the user.
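A steady-state evolutionary search with a hybrid operator, in the spirit of XTALOPT's combined operators and continuous workflow, might be sketched as below on a toy "energy" function. The operators, parameters, and energy function are illustrative stand-ins, not XTALOPT's actual implementation.

```python
import math
import random

def energy(x):
    # Toy stand-in for a structure's total energy; lower is fitter.
    return sum(xi ** 2 for xi in x)

def crossover(a, b):
    # Pure operator 1: single-point cut-and-splice of two parents.
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def ripple(x, amplitude=0.1):
    # Pure operator 2: a toy analogue of a periodic-displacement
    # ("ripple") move, adding a sinusoid across the coordinates.
    phase = random.uniform(0, 2 * math.pi)
    return [xi + amplitude * math.sin(2 * math.pi * i / len(x) + phase)
            for i, xi in enumerate(x)]

def hybrid(a, b):
    # Hybrid operator: two pure operators chained in one step.
    return ripple(crossover(a, b))

random.seed(1)
pop = [[random.uniform(-5, 5) for _ in range(6)] for _ in range(20)]
init_best = min(energy(x) for x in pop)

# Continuous (steady-state) workflow: children replace the current worst
# individual one at a time instead of in synchronized generations.
for _ in range(500):
    a, b = random.sample(sorted(pop, key=energy)[:10], 2)
    child = hybrid(a, b)
    worst = max(range(len(pop)), key=lambda i: energy(pop[i]))
    if energy(child) < energy(pop[worst]):
        pop[worst] = child

best = min(pop, key=energy)
```

The steady-state replacement is what keeps compute busy: a new candidate is evaluated as soon as resources free up, rather than waiting for a whole generation to finish, which is the workflow advantage the abstract describes.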
Evolution, learning, and cognition
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Y.C.
1988-01-01
The book comprises more than fifteen articles in the areas of neural networks and connectionist systems, classifier systems, adaptive network systems, genetic algorithm, cellular automata, artificial immune systems, evolutionary genetics, cognitive science, optical computing, combinatorial optimization, and cybernetics.
Application of high technology in highway transportation.
DOT National Transportation Integrated Search
1985-01-01
Highway and traffic engineering practice is rapidly changing as communications technology and computer systems are being adopted to facilitate the work of the practitioners and expand their capabilities. This field has been an evolutionary one since ...
Numerical Control/Computer Aided Manufacturing (NC/CAM), A Descom Study
1979-07-01
CAM machines operate directly from computers, but most get instructions in the form of punched tape. The applications of NC/CAM are virtually...Although most NC/CAM equipment is metal working, its applications include electronics manufacturing, glass making, food processing, materiel handling...drafting, woodworking, plastics and inspection, just to name a few. Numerical control, like most technologies, is an advancing and evolutionary process
Supermultiplicative Speedups of Probabilistic Model-Building Genetic Algorithms
2009-02-01
... of interest to physicists as well as practitioners in evolutionary computation. The project was later extended to the one-dimensional SK spin glass with power-law interactions.
NASA Astrophysics Data System (ADS)
Alunno-Bruscia, Marianne; van der Veer, Henk W.; Kooijman, Sebastiaan A. L. M.
2011-11-01
This second special issue of the Journal of Sea Research on development and applications of Dynamic Energy Budget (DEB) theory concludes the European Research Project AquaDEB (2007-2011). In this introductory paper we summarise the progress made during the running time of this 5 years' project, present context for the papers in this volume and discuss future directions. The main scientific objectives in AquaDEB were (i) to study and compare the sensitivity of aquatic species (mainly molluscs and fish) to environmental variability within the context of DEB theory for metabolic organisation, and (ii) to evaluate the inter-relationships between different biological levels (individual, population, ecosystem) and temporal scales (life cycle, population dynamics, evolution). AquaDEB phase I focussed on quantifying bio-energetic processes of various aquatic species ( e.g. molluscs, fish, crustaceans, algae) and phase II on: (i) comparing of energetic and physiological strategies among species through the DEB parameter values and identifying the factors responsible for any differences in bioenergetics and physiology; (ii) considering different scenarios of environmental disruption (excess of nutrients, diffuse or massive pollution, exploitation by man, climate change) to forecast effects on growth, reproduction and survival of key species; (iii) scaling up the models for a few species from the individual level up to the level of evolutionary processes. Apart from the three special issues in the Journal of Sea Research — including the DEBIB collaboration (see vol. 
65 issue 2), a theme issue on DEB theory appeared in the Philosophical Transactions of the Royal Society B (vol 365, 2010); a large number of publications were produced; the third edition of the DEB book appeared (2010); open-source software was substantially expanded (over 1000 functions); a large open-source systematic collection of ecophysiological data and DEB parameters has been set up; and a series of DEB tele-courses and symposia have been further developed and expanded, bringing together people from a wide variety of backgrounds (experimental and theoretical biologists, mathematicians, engineers, physicists, chemists, environmental sciences, computer scientists) and training levels in DEB theory. Some 15 PhD students graduated during the running time of AquaDEB with a strong DEB component in their projects and over 15 will complete their thesis within a few years. Five post-doctoral projects were also part of the training network. Several universities (Brest, Marseille, Lisbon, Bergen) included DEB courses in their standard curriculum for biology students.
NASA Astrophysics Data System (ADS)
Wagh, Aditi
Two strands of work motivate the three studies in this dissertation. First, evolutionary change can be viewed as a computational complex system in which a small set of rules operating at the individual level results in different population-level outcomes under different conditions. Extensive research has documented students' difficulties with learning about evolutionary change (Rosengren et al., 2012), particularly in terms of levels slippage (Wilensky & Resnick, 1999). Second, though building and using computational models is becoming increasingly common in K-12 science education, we know little about how these two modalities compare. This dissertation adopts agent-based modeling as a representational system to compare these modalities in the conceptual context of micro-evolutionary processes. Drawing on interviews, Study 1 examines middle-school students' productive ways of reasoning about micro-evolutionary processes and finds that the specific framing of traits plays a key role in whether slippage explanations are cued. Study 2, which was conducted in two schools with about 150 students, forms the crux of the dissertation. It compares learning processes and outcomes when students build their own models or explore a pre-built model. Analysis of Camtasia videos of student pairs reveals that builders' and explorers' ways of accessing rules, and of making sense of observed trends, differ in character. Builders notice rules through the available blocks-based primitives, often bypassing their enactment, while explorers attend to rules primarily through the enactment. Moreover, builders' sense-making of observed trends is more rule-driven while explorers' is more enactment-driven. Pre- and post-tests reveal that builders show greater facility with accessing rules, providing explanations that exhibit targeted assembly. Explorers use rules to construct explanations exhibiting non-targeted assembly.
Interviews reveal varying degrees of shifts away from slippage in both modalities, with students who built models not incorporating slippage explanations in their responses. Study 3 compares these modalities with a control using traditional activities. Pre- and post-tests reveal that both modalities produced greater facility with accessing and assembling rules than the control. The dissertation offers implications for the design of learning environments for evolutionary change, for the design of the two modalities based on their strengths and weaknesses, and for related teacher training.
More efficient evolutionary strategies for model calibration with watershed model for demonstration
NASA Astrophysics Data System (ADS)
Baggett, J. S.; Skahill, B. E.
2008-12-01
Evolution strategies allow automatic calibration of more complex models than traditional gradient-based approaches, but they are more computationally intensive. We present several efficiency enhancements for evolution strategies, many of which are not new, but which, when combined, have been shown to dramatically decrease the number of model runs required for calibration of synthetic problems. To reduce the number of expensive model runs we employ a surrogate objective function for an adaptively determined fraction of the population at each generation (Kern et al., 2006). We demonstrate improvements to the adaptive ranking strategy that increase its efficiency while sacrificing little reliability, and that further reduce the number of model runs required in densely sampled parts of parameter space. Furthermore, we include a gradient individual in each generation that is usually not selected when the search is in a global phase or when the derivatives are poorly approximated, but that, when selected near a smooth local minimum, can dramatically increase convergence speed (Tahk et al., 2007). Finally, the selection of the gradient individual is used to adapt the size of the population near local minima. We show, by incorporating these enhancements into the Covariance Matrix Adaptation Evolution Strategy (CMAES; Hansen, 2006), that their combined effect is greater than the sum of their individual contributions. This hybrid evolution strategy exploits smooth structure when it is present but degrades, at worst, to an ordinary evolution strategy if smoothness is not present. Calibration of 2D-3D synthetic models with the modified CMAES requires approximately 10%-25% of the model runs of ordinary CMAES. A preliminary demonstration of this hybrid strategy will be shown for watershed model calibration problems. Hansen, N. (2006). The CMA Evolution Strategy: A Comparing Review. In J.A. Lozano, P. Larrañaga, I. Inza and E. Bengoetxea (Eds.), Towards a New Evolutionary Computation: Advances in Estimation of Distribution Algorithms, pp. 75-102. Springer. Kern, S., N. Hansen and P. Koumoutsakos (2006). Local Meta-Models for Optimization Using Evolution Strategies. In Ninth International Conference on Parallel Problem Solving from Nature (PPSN IX), Proceedings, pp. 939-948. Berlin: Springer. Tahk, M., Woo, H., and Park, M. (2007). A hybrid optimization of evolutionary and gradient search. Engineering Optimization, 39, 87-104.
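The surrogate-assisted idea of Kern et al. (2006), namely spending expensive model runs only on the offspring a cheap meta-model ranks best, can be sketched roughly as follows. This is a toy (mu, lambda)-ES with a fixed step-size decay standing in for full covariance matrix adaptation; the objective, surrogate, and all parameter values are illustrative, not the paper's.

```python
import random

def surrogate_assisted_es(f, surrogate, x0, sigma=1.0, lam=12, mu=3,
                          exact_frac=0.5, generations=150):
    """(mu, lambda)-ES sketch: pre-rank offspring with a cheap surrogate and
    spend expensive evaluations only on the most promising fraction."""
    dim = len(x0)
    mean = list(x0)
    for _ in range(generations):
        offspring = [[m + sigma * random.gauss(0, 1) for m in mean]
                     for _ in range(lam)]
        offspring.sort(key=surrogate)                 # cheap pre-ranking
        n_exact = max(mu, int(exact_frac * lam))
        scored = sorted((f(x), x) for x in offspring[:n_exact])  # expensive runs
        parents = [x for _, x in scored[:mu]]
        mean = [sum(p[d] for p in parents) / mu for d in range(dim)]
        sigma *= 0.97  # crude step-size decay in place of full CMA adaptation
    return mean, f(mean)

expensive = lambda x: sum(v * v for v in x)   # stand-in for a costly model run
cheap = lambda x: sum(abs(v) for v in x)      # stand-in surrogate model
best, fbest = surrogate_assisted_es(expensive, cheap, [2.0, -1.5])
```

Here only half the offspring ever reach the expensive objective; the paper's adaptive version additionally tunes that fraction per generation based on how well the surrogate ranking agrees with the exact one.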
Petty, Stephen E; Nicas, Mark; Boiarski, Anthony A
2011-01-01
This study examines a method for estimating the dermal absorption of benzene contained in hydrocarbon liquids that contact the skin. The method applies to crude oil, gasoline, organic solvents, penetrants, and oils. The flux of benzene through occluded skin as a function of the percent vol/vol benzene in the liquid is derived by fitting a curve to experimental data; the function is supralinear at benzene concentrations ≤5% vol/vol. When a liquid other than pure benzene is on nonoccluded skin, benzene may preferentially evaporate from the liquid, thereby decreasing the benzene flux. We present a time-averaging method for estimating the reduced dermal flux during evaporation. Example calculations are presented for benzene at 2% vol/vol in gasoline, and for benzene at 0.1% vol/vol in a less volatile liquid. We also discuss other factors affecting dermal absorption.
On Improving Efficiency of Differential Evolution for Aerodynamic Shape Optimization Applications
NASA Technical Reports Server (NTRS)
Madavan, Nateri K.
2004-01-01
Differential Evolution (DE) is a simple and robust evolutionary strategy that has been proven effective in determining the global optimum for several difficult optimization problems. Although DE offers several advantages over traditional optimization approaches, its use in applications such as aerodynamic shape optimization, where the objective function evaluations are computationally expensive, is limited by the large number of function evaluations often required. In this paper various approaches for improving the efficiency of DE are reviewed and discussed. Several approaches that have proven effective for other evolutionary algorithms are modified and implemented in a DE-based aerodynamic shape optimization method that uses a Navier-Stokes solver for the objective function evaluations. Parallelization techniques on distributed computers are used to reduce turnaround times. Results are presented for standard test optimization problems and for the inverse design of a turbine airfoil. The efficiency improvements achieved by the different approaches are evaluated and compared.
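The baseline DE scheme the paper improves upon can be illustrated with a minimal sketch of the classic DE/rand/1/bin variant. The cheap sphere objective stands in for the expensive Navier-Stokes evaluation; all parameter values are conventional defaults, not the paper's settings.

```python
import random

def differential_evolution(f, bounds, pop_size=20, F=0.8, CR=0.9, generations=200):
    """Minimise f over a box using the classic DE/rand/1/bin scheme."""
    dim = len(bounds)
    pop = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    fitness = [f(x) for x in pop]
    for _ in range(generations):
        for i in range(pop_size):
            # Mutation: difference of two random individuals added to a third.
            a, b, c = random.sample([j for j in range(pop_size) if j != i], 3)
            mutant = [pop[a][d] + F * (pop[b][d] - pop[c][d]) for d in range(dim)]
            # Binomial crossover with one guaranteed mutant component.
            jrand = random.randrange(dim)
            trial = [mutant[d] if (random.random() < CR or d == jrand) else pop[i][d]
                     for d in range(dim)]
            trial = [min(max(t, lo), hi) for t, (lo, hi) in zip(trial, bounds)]
            f_trial = f(trial)
            if f_trial <= fitness[i]:        # greedy one-to-one survivor selection
                pop[i], fitness[i] = trial, f_trial
    i_best = min(range(pop_size), key=fitness.__getitem__)
    return pop[i_best], fitness[i_best]

sphere = lambda x: sum(v * v for v in x)     # illustrative cheap objective
x_best, f_best = differential_evolution(sphere, [(-5.0, 5.0)] * 3)
```

The inner loop makes the cost structure plain: every trial vector costs one objective evaluation, which is exactly what efficiency enhancements of the kind reviewed in the paper try to reduce.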
Evolutionary neurobiology and aesthetics.
Smith, Christopher Upham
2005-01-01
If aesthetics is a human universal, it should have a neurobiological basis. Although use of all the senses is, as Aristotle noted, pleasurable, the distance senses are primarily involved in aesthetics. The aesthetic response emerges from the central processing of sensory input. This occurs very rapidly, beneath the level of consciousness, and only the feeling of pleasure emerges into the conscious mind. This is exemplified by landscape appreciation, where it is suggested that a computation built into the nervous system during Paleolithic hunter-gathering is at work. Another inbuilt computation leading to an aesthetic response is the part-whole relationship. This, it is argued, may be traced to the predator-prey "arms races" of evolutionary history. Mate selection also may be responsible for part of our response to landscape and visual art. Aesthetics lies at the core of human mentality, and its study is consequently of importance not only to philosophers and art critics but also to neurobiologists.
A hybrid multi-objective evolutionary algorithm for wind-turbine blade optimization
NASA Astrophysics Data System (ADS)
Sessarego, M.; Dixon, K. R.; Rival, D. E.; Wood, D. H.
2015-08-01
A concurrent-hybrid non-dominated sorting genetic algorithm (hybrid NSGA-II) has been developed and applied to the simultaneous optimization of the annual energy production, flapwise root-bending moment, and mass of the NREL 5 MW wind-turbine blade. By hybridizing a multi-objective evolutionary algorithm (MOEA) with gradient-based local search, it is believed that the optimal set of blade designs could be achieved at lower computational cost than with a conventional MOEA. To measure the convergence between the hybrid and non-hybrid NSGA-II on a wind-turbine blade optimization problem, a computationally intensive case was performed using the non-hybrid NSGA-II. From this case, a three-dimensional surface representing the optimal trade-off between the annual energy production, flapwise root-bending moment, and blade mass was obtained. The inclusion of local gradients in the blade optimization, however, shows no improvement in convergence for this three-objective problem.
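The ranking step at the heart of NSGA-II can be sketched with a plain non-dominated sort. This is a minimal, unoptimised O(n²) version; the three-objective point set is invented for illustration (e.g. negated energy production, root-bending moment, mass, all minimised), not blade-design data.

```python
def dominates(a, b):
    """a dominates b if a is no worse in every objective and better in at
    least one (minimisation convention)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated_sort(points):
    """Return fronts of indices: front 0 is the Pareto-optimal set, front 1
    the points dominated only by front 0, and so on."""
    n = len(points)
    dominated_by = [set() for _ in range(n)]
    dom_count = [0] * n
    for i in range(n):
        for j in range(n):
            if dominates(points[i], points[j]):
                dominated_by[i].add(j)
            elif dominates(points[j], points[i]):
                dom_count[i] += 1
    fronts = [[i for i in range(n) if dom_count[i] == 0]]
    while fronts[-1]:
        nxt = []
        for i in fronts[-1]:
            for j in dominated_by[i]:
                dom_count[j] -= 1
                if dom_count[j] == 0:   # all its dominators are in earlier fronts
                    nxt.append(j)
        fronts.append(nxt)
    return fronts[:-1]

# Three minimised objectives per design (illustrative values):
pts = [(1, 5, 3), (2, 4, 2), (3, 3, 4), (1, 4, 2), (4, 6, 5)]
fronts = non_dominated_sort(pts)
```

Front 0 of the final population is what traces out the three-dimensional trade-off surface described in the abstract.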
Decentralized Grid Scheduling with Evolutionary Fuzzy Systems
NASA Astrophysics Data System (ADS)
Fölling, Alexander; Grimme, Christian; Lepping, Joachim; Papaspyrou, Alexander
In this paper, we address the problem of finding workload exchange policies for decentralized Computational Grids using an Evolutionary Fuzzy System. To this end, we establish a non-invasive collaboration model on the Grid layer which requires minimal information about the participating High Performance and High Throughput Computing (HPC/HTC) centers and which leaves the local resource managers completely untouched. In this environment of fully autonomous sites, independent users are assumed to submit their jobs to the Grid middleware layer of their local site, which in turn decides on the delegation and execution either on the local system or on remote sites in a situation-dependent, adaptive way. We find for different scenarios that the exchange policies show good performance characteristics not only with respect to traditional metrics such as average weighted response time and utilization, but also in terms of robustness and stability in changing environments.
Derivative Trade Optimizing Model Utilizing GP Based on Behavioral Finance Theory
NASA Astrophysics Data System (ADS)
Matsumura, Koki; Kawamoto, Masaru
This paper proposes a new technique that builds strategy trees for derivative (option) trading investment decisions based on behavioral finance theory and optimizes them using evolutionary computation, in order to achieve high profitability. The strategy tree uses technical analysis based on statistical, experience-based techniques for the investment decision. The trading model is represented by various technical indexes, and the strategy tree is optimized by genetic programming (GP), one of the evolutionary computation methods. Moreover, this paper proposes a method that uses prospect theory, from behavioral finance, to set a psychological bias for profit and loss, and attempts to select the appropriate strike price of the option for higher investment efficiency. As a result, the technique produced good results and demonstrated the effectiveness of the trading model through the optimized dealing strategy.
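Strategy trees of the kind described are typically encoded as expression trees that GP evolves by mutation and crossover. A minimal sketch of random generation, evaluation, and subtree mutation follows; the single input x stands in for one technical index, and every name and constant here is illustrative rather than the paper's encoding.

```python
import random

rng = random.Random(0)
OPS = {"+": lambda a, b: a + b, "-": lambda a, b: a - b, "*": lambda a, b: a * b}

def random_tree(depth):
    """Grow a random arithmetic tree over one input x (a toy technical index)."""
    if depth == 0 or rng.random() < 0.3:
        return "x" if rng.random() < 0.5 else rng.uniform(-1.0, 1.0)
    return (rng.choice(sorted(OPS)), random_tree(depth - 1), random_tree(depth - 1))

def evaluate(tree, x):
    """Evaluate a tree for a given value of the input index x."""
    if tree == "x":
        return x
    if isinstance(tree, float):
        return tree
    op, left, right = tree
    return OPS[op](evaluate(left, x), evaluate(right, x))

def mutate(tree, depth=2):
    """Subtree mutation: replace a randomly chosen subtree with a fresh one."""
    if not isinstance(tree, tuple) or rng.random() < 0.3:
        return random_tree(depth)
    op, left, right = tree
    if rng.random() < 0.5:
        return (op, mutate(left, depth), right)
    return (op, left, mutate(right, depth))

tree = random_tree(3)
signal = evaluate(tree, 0.5)   # e.g. the sign could drive a buy/sell decision
```

A full GP run would score each tree by back-tested profitability (weighted, per the paper, by a prospect-theoretic value function) and select the fittest trees for the next generation.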
Evolutionary psychology: new perspectives on cognition and motivation.
Cosmides, Leda; Tooby, John
2013-01-01
Evolutionary psychology is the second wave of the cognitive revolution. The first wave focused on computational processes that generate knowledge about the world: perception, attention, categorization, reasoning, learning, and memory. The second wave views the brain as composed of evolved computational systems, engineered by natural selection to use information to adaptively regulate physiology and behavior. This shift in focus, from knowledge acquisition to the adaptive regulation of behavior, provides new ways of thinking about every topic in psychology. It suggests a mind populated by a large number of adaptive specializations, each equipped with content-rich representations, concepts, inference systems, and regulatory variables, which are functionally organized to solve the complex problems of survival and reproduction encountered by the ancestral hunter-gatherers from whom we are descended. We present recent empirical examples that illustrate how this approach has been used to discover new features of attention, categorization, reasoning, learning, emotion, and motivation.
Blood Program in World War II. Medical Department, United States Army
1964-01-01
Recombinant transfer in the basic genome of E. coli
Dixit, Purushottam; Studier, F. William; Pang, Tin Yau; ...
2015-07-07
An approximation to the ~4-Mbp basic genome shared by 32 strains of E. coli representing six evolutionary groups has been derived and analyzed computationally. A multiple alignment of the 32 complete genome sequences was filtered to remove mobile elements and identify the most reliable ~90% of the aligned length of each of the resulting 496 basic-genome pairs. Patterns of single-bp mutations (SNPs) in aligned pairs distinguish clonally inherited regions from regions where either genome has acquired DNA fragments from diverged genomes by homologous recombination since their last common ancestor. Such recombinant transfer is pervasive across the basic genome, mostly between genomes in the same evolutionary group, and generates many unique mosaic patterns. The six least-diverged genome pairs have one or two recombinant transfers of length ~40–115 kbp (and few if any other transfers), each containing one or more gene clusters known to confer strong selective advantage in some environments. Moderately diverged genome pairs (0.4–1% SNPs) show mosaic patterns of interspersed clonal and recombinant regions of varying lengths throughout the basic genome, whereas more highly diverged pairs within an evolutionary group or pairs between evolutionary groups having >1.3% SNPs have few clonal matches longer than a few kbp. Many recombinant transfers appear to incorporate fragments of the entering DNA produced by restriction systems of the recipient cell. A simple computational model can closely fit the data. As a result, most recombinant transfers seem likely to be due to generalized transduction by co-evolving populations of phages, which could efficiently distribute variability throughout bacterial genomes.
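The clonal-versus-recombinant distinction rests on local SNP density: clonally inherited regions carry few SNPs, whereas fragments acquired from diverged genomes arrive SNP-dense. A crude fixed-window version of that classification can be sketched as below; the window size, threshold, and SNP positions are invented for illustration, and the paper's actual segmentation is a statistical model, not this rule.

```python
def classify_windows(snp_positions, genome_length, window=5000, clonal_max_snps=5):
    """Label fixed windows of a pairwise alignment 'clonal' or 'recombinant'
    by SNP count: a crude stand-in for the paper's model-based segmentation."""
    labels = []
    for start in range(0, genome_length, window):
        count = sum(start <= p < start + window for p in snp_positions)
        labels.append("recombinant" if count > clonal_max_snps else "clonal")
    return labels

# A sparse clonal stretch followed by one dense recombinant block (synthetic):
snps = [1200, 7800, 14100] + list(range(20000, 25000, 120))
labels = classify_windows(snps, 30000)
```

On this synthetic input only the window covering the dense block is flagged, mimicking the mosaic pattern of interspersed clonal and recombinant regions described above.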
Recombinant transfer in the basic genome of E. coli
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dixit, Purushottam; Studier, F. William; Pang, Tin Yau
Evolutionary dynamics on graphs: Efficient method for weak selection
NASA Astrophysics Data System (ADS)
Fu, Feng; Wang, Long; Nowak, Martin A.; Hauert, Christoph
2009-04-01
Investigating the evolutionary dynamics of game theoretical interactions in populations where individuals are arranged on a graph can be challenging in terms of computation time. Here, we propose an efficient method to study any type of game on arbitrary graph structures for weak selection. In this limit, evolutionary game dynamics represents a first-order correction to neutral evolution. Spatial correlations can be empirically determined under neutral evolution and provide the basis for formulating the game dynamics as a discrete Markov process by incorporating a detailed description of the microscopic dynamics based on the neutral correlations. This framework is then applied to one of the most intriguing questions in evolutionary biology: the evolution of cooperation. We demonstrate that the degree heterogeneity of a graph impedes cooperation and that the success of tit for tat depends not only on the number of rounds but also on the degree of the graph. Moreover, considering the mutation-selection equilibrium shows that the symmetry of the stationary distribution of states under weak selection is skewed in favor of defectors for larger selection strengths. In particular, degree heterogeneity, a prominent feature of scale-free networks, generally results in a more pronounced increase in the critical benefit-to-cost ratio required for evolution to favor cooperation as compared to regular graphs. This conclusion is corroborated by an analysis of the effects of population structures on the fixation probabilities of strategies in general 2×2 games for different types of graphs. Computer simulations confirm the predictive power of our method and illustrate the improved accuracy as compared to previous studies.
Courbebaisse, Marie; Gaillard, François; Tissier, Anne-Marie; Fournier, Catherine; Le Nestour, Alexis; Corréas, Jean-Michel; Slimani-Thevenet, Hind; Martinez, Frank; Léon, Carine; Eladari, Dominique; Timsit, Marc-Olivier; Otal, Philippe; Hignette, Chantal; Friedlander, Gérard; Méjean, Arnaud; Houillier, Pascal; Kamar, Nassim; Legendre, Christophe
2016-08-08
The predictors of long-term renal function in living kidney donors are currently debated. Our objective was to describe the predictors of the functional gain of the remaining kidney after kidney donation. We hypothesized that the GFR of the remaining kidney divided by the volume of this kidney (rk-GFR/vol) would reflect the density of functional nephrons and be inversely associated with the functional gain of the remaining kidney. We conducted a prospective monocentric study including 63 living donors (26 men; 50.3±11.8 years old) who had (51)Cr-EDTA-measured GFR and split renal function assessed by scintigraphy before donation (between 2004 and 2009), and GFR measured again at 5.7±0.5 years after donation. For 52 donors, the volume of the remaining kidney (measured and estimated with the ellipsoid formula using renal computed tomography scannography) was determined before donation. We tested our hypothesis in an external validation cohort of 39 living donors (13 men; 51.0±9.4 years old) from another single center during the same time period. For the main cohort, the mean measured GFR was 97.6±13.0 ml/min per 1.73 m(2) before donation and 63.8±9.4 ml/min per 1.73 m(2) at 5 years. Functional gain averaged 16.2±7.2 ml/min per 1.73 m(2) (+35.3%±16.7%). Multivariate analysis showed that age, body mass index, and rk-GFR/vol at donation were negatively correlated with functional gain and had strong predictive power for the 5-year functional gain (adjusted 5-year functional gain for age: -0.4 [95% confidence interval (95% CI), -0.5 to -0.1]; body mass index: -0.3 [95% CI, -0.6 to -0.1]; rk-GFR/vol: -55.1 [95% CI, -92.3 to -17.9]). We tested this model in the external validation cohort (adjusted 5-year functional gain for age: -0.1 [95% CI, -0.5 to 0.3]; body mass index: -0.9 [95% CI, -1.8 to -0.1]; rk-GFR/vol: -97.6 [95% CI, -137.5 to -57.6]) and confirmed that rk-GFR/vol was inversely associated with 5-year functional gain.
For given age and body mass index, the long-term functional gain of the remaining kidney is inversely associated with the new variable rk-GFR/vol at donation. Copyright © 2016 by the American Society of Nephrology.
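As a hedged illustration of how the key variable is assembled, rk-GFR/vol combines the scintigraphic split of measured GFR with an ellipsoid estimate of renal volume (volume = pi/6 × length × width × depth, the standard sonographic/CT approximation). The kidney dimensions and the 50:50 split below are hypothetical values for the sketch, not study data; only the 97.6 ml/min per 1.73 m(2) mean GFR comes from the abstract.

```python
import math

def ellipsoid_volume_ml(length_cm, width_cm, depth_cm):
    """Renal volume (ml) from the standard ellipsoid approximation
    pi/6 * length * width * depth."""
    return math.pi / 6 * length_cm * width_cm * depth_cm

def rk_gfr_per_vol(measured_gfr, split_fraction, volume_ml):
    """GFR attributed to the remaining kidney (scintigraphic split fraction)
    divided by that kidney's volume."""
    return measured_gfr * split_fraction / volume_ml

vol_ml = ellipsoid_volume_ml(11.0, 5.5, 4.5)   # hypothetical kidney dimensions
density = rk_gfr_per_vol(97.6, 0.5, vol_ml)    # cohort mean GFR, assumed 50:50 split
```

Under the study's hypothesis, a higher value of this ratio (denser functional nephrons per ml) predicts a smaller compensatory gain after donation.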
Reconstruction of Mammary Gland Structure Using Three-Dimensional Computer-Based Microscopy
2004-08-01
A Prototype Decision Support System for the Location of Military Water Points.
1980-06-01
Pseudorandom Number Generators for Mobile Devices: An Examination and Attempt to Improve Randomness
2013-09-01
Analyzing the Effects of Technological Change: A Computable General Equilibrium Approach
1988-09-01
Exploratory Modeling and the use of Simulation for Policy Analysis
1992-01-01
NASA Astrophysics Data System (ADS)
Papers are presented on such topics as the wireless data network in PCS, advances in digital mobile networks, ATM switching experiments, broadband applications, network planning, and advances in SONET/SDH implementations. Consideration is also given to gigabit computer networks, techniques for modeling large high-speed networks, coding and modulation, the next-generation lightwave system, signaling systems for broadband ISDN, satellite technologies, and advances in standardization of low-rate signal processing.
Computer-Assisted Communication Device for Botulinum-Intoxicated Patients
2008-01-01
Neurale Netwerken en Radarsystemen (Neural Networks and Radar Systems)
1989-08-01
Intelligent Scene Analysis and Recognition
2010-03-30
United States Air Force Summer Research Program -- 1993. Volume 1. Program Management Report
1993-12-01
Weberized Mumford-Shah Model with Bose-Einstein Photon Noise
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shen Jianhong, E-mail: jhshen@math.umn.edu; Jung, Yoon-Mo
Human vision works equally well in a large dynamic range of light intensities, from only a few photons to typical midday sunlight. Contributing to such remarkable flexibility is a famous law in perceptual (both visual and aural) psychology and psychophysics known as Weber's Law. The current paper develops a new segmentation model based on the integration of Weber's Law and the celebrated Mumford-Shah segmentation model (Comm. Pure Appl. Math., vol. 42, pp. 577-685, 1989). Explained in detail are issues concerning why the classical Mumford-Shah model lacks light adaptivity, and why its 'weberized' version can more faithfully reflect human vision's superior segmentation capability in a variety of illuminance conditions from dawn to dusk. It is also argued that the popular Gaussian noise model is physically inappropriate for the weberization procedure. As a result, the intrinsic thermal noise of photon ensembles is introduced based on Bose and Einstein's distributions in quantum statistics, which turns out to be compatible with weberization both analytically and computationally. The current paper focuses on both the theory and computation of the weberized Mumford-Shah model with Bose-Einstein noise. In particular, Ambrosio-Tortorelli's Γ-convergence approximation theory is adapted (Boll. Un. Mat. Ital. B, vol. 6, pp. 105-123, 1992), and stable numerical algorithms are developed for the associated pair of nonlinear Euler-Lagrange PDEs.
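For reference, the classical Mumford-Shah functional cited above balances data fidelity, smoothness away from the edge set Γ, and edge length; the weights α, β, γ are generic notation here, not the paper's:

```latex
E[u,\Gamma] \;=\; \alpha \int_{\Omega} (u - f)^2 \, dx
\;+\; \beta \int_{\Omega \setminus \Gamma} |\nabla u|^2 \, dx
\;+\; \gamma \, \mathcal{H}^1(\Gamma),
```

where f is the observed image, u the piecewise-smooth approximation, and \mathcal{H}^1(\Gamma) the one-dimensional Hausdorff measure (length) of the edge set. Weber's Law states that the just-noticeable intensity increment scales with the background intensity, \Delta I / I \approx \text{const}, which is why a uniform fidelity and smoothness penalty, insensitive to the local intensity level, lacks the light adaptivity the weberized model supplies.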
40 CFR 80.82 - Butane blending.
Code of Federal Regulations, 2011 CFR
2011-07-01
...% aromatics≤2.0 vol% benzene≤0.03 vol% sulfur≤140 ppm until December 31, 2003; ≤120 ppm in 2004; ≤30 ppm... results demonstrate the butane has the following properties: olefins≤10.0 vol% aromatics≤2.0 vol% benzene...
40 CFR 80.82 - Butane blending.
Code of Federal Regulations, 2010 CFR
2010-07-01
...% aromatics≤2.0 vol% benzene≤0.03 vol% sulfur≤140 ppm until December 31, 2003; ≤120 ppm in 2004; ≤30 ppm... results demonstrate the butane has the following properties: olefins≤10.0 vol% aromatics≤2.0 vol% benzene...
Computational complexity of ecological and evolutionary spatial dynamics
Ibsen-Jensen, Rasmus; Chatterjee, Krishnendu; Nowak, Martin A.
2015-01-01
There are deep, yet largely unexplored, connections between computer science and biology. Both disciplines examine how information proliferates in time and space. Central results in computer science describe the complexity of algorithms that solve certain classes of problems. An algorithm is deemed efficient if it can solve a problem in polynomial time, which means the running time of the algorithm is a polynomial function of the length of the input. There are classes of harder problems for which the fastest possible algorithm requires exponential time. Another criterion is the space requirement of the algorithm. There is a crucial distinction between algorithms that can find a solution, verify a solution, or list several distinct solutions in given time and space. The complexity hierarchy that is generated in this way is the foundation of theoretical computer science. Precise complexity results can be notoriously difficult. The famous question whether polynomial time equals nondeterministic polynomial time (i.e., P = NP) is one of the hardest open problems in computer science and all of mathematics. Here, we consider simple processes of ecological and evolutionary spatial dynamics. The basic question is: What is the probability that a new invader (or a new mutant) will take over a resident population? We derive precise complexity results for a variety of scenarios. We therefore show that some fundamental questions in this area cannot be answered by simple equations (assuming that P is not equal to NP). PMID:26644569
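For the simplest, well-mixed version of the invasion question posed above, the takeover probability has a closed form (the classical Moran-process fixation probability); a minimal sketch, separate from the spatial scenarios whose computational complexity the paper analyzes:

```python
def fixation_probability(r: float, n: int) -> float:
    """Probability that a single mutant with relative fitness r
    takes over a resident population of size n (Moran process,
    well-mixed population)."""
    if r == 1.0:  # neutral mutant
        return 1.0 / n
    return (1.0 - 1.0 / r) / (1.0 - 1.0 / r ** n)

# A neutral mutant fixes with probability 1/N; an advantageous
# mutant (r > 1) does better, a deleterious one (r < 1) worse.
print(fixation_probability(1.0, 100))  # 0.01
print(fixation_probability(2.0, 100) > fixation_probability(0.9, 100))
```

The point of the paper is precisely that once spatial structure is added, such simple equations no longer suffice (assuming P ≠ NP).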
NASA Astrophysics Data System (ADS)
Żukowicz, Marek; Markiewicz, Michał
2016-09-01
The aim of this article is to present a mathematical definition of the object model known in computer science as a TreeList, and to show how this model can be applied to the design of an evolutionary algorithm whose purpose is to generate structures based on this object. The first chapter introduces the reader to the problem of representing data using the TreeList object. The second chapter describes the problem of testing data structures based on TreeList. The third presents a mathematical model of the TreeList object and the parameters used to determine the utility of structures created through this model, as well as the evolutionary strategy that generates these structures for testing purposes. The last chapter provides a brief summary and plans for future research related to the algorithm presented in the article.
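The abstract does not give the formal definition, so the following is a hypothetical sketch only: a tree whose nodes carry lists, plus one mutation-style operator of the kind an evolutionary structure generator might use (all names and parameters are assumptions, not the paper's definitions):

```python
import random

class TreeListNode:
    """Hypothetical TreeList node: a list of values plus child nodes."""
    def __init__(self, values=None):
        self.values = list(values or [])
        self.children = []

def size(node):
    """Total number of nodes in the tree."""
    return 1 + sum(size(c) for c in node.children)

def depth(node):
    """Length of the longest root-to-leaf path."""
    return 1 + max((depth(c) for c in node.children), default=0)

def mutate(node, rng, max_children=3):
    """One evolutionary-style mutation: attach a new child to a
    randomly chosen node, mimicking structure generation for testing."""
    nodes, stack = [node], [node]
    while stack:
        n = stack.pop()
        for c in n.children:
            nodes.append(c)
            stack.append(c)
    target = rng.choice(nodes)
    if len(target.children) < max_children:
        target.children.append(TreeListNode([rng.randint(0, 9)]))
    return node

rng = random.Random(0)
root = TreeListNode([1, 2])
for _ in range(10):
    mutate(root, rng)
print(size(root), depth(root))
```

Size and depth are the kind of parameters one could feed into a fitness function judging the utility of generated test structures.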
Parametric Sensitivity Analysis of Oscillatory Delay Systems with an Application to Gene Regulation.
Ingalls, Brian; Mincheva, Maya; Roussel, Marc R
2017-07-01
A parametric sensitivity analysis for periodic solutions of delay-differential equations is developed. Because phase shifts cause the sensitivity coefficients of a periodic orbit to diverge, we focus on sensitivities of the extrema, from which amplitude sensitivities are computed, and of the period. Delay-differential equations are often used to model gene expression networks. In these models, the parametric sensitivities of a particular genotype define the local geometry of the evolutionary landscape. Thus, sensitivities can be used to investigate directions of gradual evolutionary change. An oscillatory protein synthesis model whose properties are modulated by RNA interference is used as an example. This model consists of a set of coupled delay-differential equations involving three delays. Sensitivity analyses are carried out at several operating points. Comments on the evolutionary implications of the results are offered.
Detecting and Analyzing Genetic Recombination Using RDP4.
Martin, Darren P; Murrell, Ben; Khoosal, Arjun; Muhire, Brejnev
2017-01-01
Recombination between nucleotide sequences is a major process influencing the evolution of most species on Earth. The evolutionary value of recombination has been widely debated and so too has its influence on evolutionary analysis methods that assume nucleotide sequences replicate without recombining. When nucleic acids recombine, the evolution of the daughter or recombinant molecule cannot be accurately described by a single phylogeny. This simple fact can seriously undermine the accuracy of any phylogenetics-based analytical approach which assumes that the evolutionary history of a set of recombining sequences can be adequately described by a single phylogenetic tree. There are presently a large number of available methods and associated computer programs for analyzing and characterizing recombination in various classes of nucleotide sequence datasets. Here we examine the use of some of these methods to derive and test recombination hypotheses using multiple sequence alignments.
Jacobs, Christopher; Lambourne, Luke; Xia, Yu; ...
2017-01-20
Here, system-level metabolic network models enable the computation of growth and metabolic phenotypes from an organism's genome. In particular, flux balance approaches have been used to estimate the contribution of individual metabolic genes to organismal fitness, offering the opportunity to test whether such contributions carry information about the evolutionary pressure on the corresponding genes. Previous failure to identify the expected negative correlation between such computed gene-loss cost and sequence-derived evolutionary rates in Saccharomyces cerevisiae has been ascribed to a real biological gap between a gene's fitness contribution to an organism "here and now" and the same gene's historical importance as evidenced by its accumulated mutations over millions of years of evolution. Here we show that this negative correlation does exist, and can be exposed by revisiting a broadly employed assumption of flux balance models. In particular, we introduce a new metric that we call "function-loss cost", which estimates the cost of a gene-loss event as the total potential functional impairment caused by that loss. This new metric displays a significant negative correlation with evolutionary rate across several thousand minimal environments. We demonstrate that the improvement gained by using function-loss cost over gene-loss cost is explained by replacing the base assumption that isoenzymes provide unlimited capacity for backup with the assumption that isoenzymes are completely non-redundant. We further show that this change of assumption regarding isoenzymes increases the recall of epistatic interactions predicted by the flux balance model at the cost of a reduction in the precision of the predictions. In addition to suggesting that the gene-to-reaction mapping in genome-scale flux balance models should be used with caution, our analysis provides new evidence that evolutionary gene importance captures much more than strict essentiality.
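The contrast between the two metrics can be illustrated without a full flux-balance solver. The toy gene-to-reaction map and fluxes below are invented for illustration; this is not the authors' implementation, only the accounting difference between the two isoenzyme assumptions:

```python
# Toy gene-to-reaction mapping: each reaction lists the genes that
# can catalyse it (two genes on one reaction = isoenzymes), plus a
# stand-in for its FBA-computed flux.
reactions = {
    "R1": {"genes": {"g1"}, "flux": 5.0},
    "R2": {"genes": {"g2", "g3"}, "flux": 3.0},  # g2, g3 are isoenzymes
    "R3": {"genes": {"g3"}, "flux": 2.0},
}

def gene_loss_cost(gene):
    """Classic assumption: an isoenzyme fully backs up the lost gene,
    so only reactions left with NO catalysing gene lose their flux."""
    return sum(r["flux"] for r in reactions.values()
               if gene in r["genes"] and len(r["genes"]) == 1)

def function_loss_cost(gene):
    """Revised assumption (isoenzymes completely non-redundant):
    every reaction the lost gene touches counts as impaired."""
    return sum(r["flux"] for r in reactions.values()
               if gene in r["genes"])

for g in ("g1", "g2", "g3"):
    print(g, gene_loss_cost(g), function_loss_cost(g))
```

Under the classic assumption g2 looks cost-free (g3 backs it up); under the function-loss assumption its contribution to R2 is counted, which is what restores the correlation with evolutionary rate in the paper's analysis.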
Protein 3D Structure Computed from Evolutionary Sequence Variation
Sheridan, Robert; Hopf, Thomas A.; Pagnani, Andrea; Zecchina, Riccardo; Sander, Chris
2011-01-01
The evolutionary trajectory of a protein through sequence space is constrained by its function. Collections of sequence homologs record the outcomes of millions of evolutionary experiments in which the protein evolves according to these constraints. Deciphering the evolutionary record held in these sequences and exploiting it for predictive and engineering purposes presents a formidable challenge. The potential benefit of solving this challenge is amplified by the advent of inexpensive high-throughput genomic sequencing. In this paper we ask whether we can infer evolutionary constraints from a set of sequence homologs of a protein. The challenge is to distinguish true co-evolution couplings from the noisy set of observed correlations. We address this challenge using a maximum entropy model of the protein sequence, constrained by the statistics of the multiple sequence alignment, to infer residue pair couplings. Surprisingly, we find that the strength of these inferred couplings is an excellent predictor of residue-residue proximity in folded structures. Indeed, the top-scoring residue couplings are sufficiently accurate and well-distributed to define the 3D protein fold with remarkable accuracy. We quantify this observation by computing, from sequence alone, all-atom 3D structures of fifteen test proteins from different fold classes, ranging in size from 50 to 260 residues, including a G-protein coupled receptor. These blinded inferences are de novo, i.e., they do not use homology modeling or sequence-similar fragments from known structures. The co-evolution signals provide sufficient information to determine accurate 3D protein structure to 2.7–4.8 Å Cα-RMSD error relative to the observed structure, over at least two-thirds of the protein (method called EVfold, details at http://EVfold.org).
This discovery provides insight into essential interactions constraining protein evolution and will facilitate a comprehensive survey of the universe of protein structures, new strategies in protein and drug design, and the identification of functional genetic variants in normal and disease genomes. PMID:22163331
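A much simpler proxy for the paper's residue-pair couplings is column-pair mutual information over the multiple sequence alignment. Unlike the maximum-entropy model, MI cannot disentangle direct from indirect (transitive) correlations, but it illustrates the raw co-evolution signal; a toy sketch:

```python
from math import log
from collections import Counter

def mutual_information(msa, i, j):
    """Mutual information between columns i and j of a multiple
    sequence alignment given as a list of equal-length strings."""
    n = len(msa)
    fi = Counter(s[i] for s in msa)
    fj = Counter(s[j] for s in msa)
    fij = Counter((s[i], s[j]) for s in msa)
    mi = 0.0
    for (a, b), c in fij.items():
        p_ab = c / n
        mi += p_ab * log(p_ab * n * n / (fi[a] * fj[b]))
    return mi

# toy alignment: columns 0 and 1 co-vary perfectly; column 2 is constant
msa = ["ACG", "ACG", "GTG", "GTG"]
mi_coupled = mutual_information(msa, 0, 1)    # log(2) ~ 0.693
mi_uncoupled = mutual_information(msa, 0, 2)  # 0.0
print(mi_coupled > mi_uncoupled)
```

In EVfold-style analyses the max-ent couplings replace this MI score precisely because strong indirect correlations would otherwise produce false contact predictions.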
Surgery in World War II. Orthopedic Surgery in the Zone of Interior
1970-01-01
Faster Evolution of More Multifunctional Logic Circuits
NASA Technical Reports Server (NTRS)
Stoica, Adrian; Zebulum, Ricardo
2005-01-01
A modification in a method of automated evolutionary synthesis of voltage-controlled multifunctional logic circuits makes it possible to synthesize more circuits in less time. Prior to the modification, the computations for synthesizing a four-function logic circuit by this method took about 10 hours. Using the method as modified, it is possible to synthesize a six-function circuit in less than half an hour. The concepts of automated evolutionary synthesis and voltage-controlled multifunctional logic circuits were described in a number of prior NASA Tech Briefs articles. To recapitulate: A circuit is designed to perform one of several different logic functions, depending on the value of an applied control voltage. The circuit design is synthesized following an automated evolutionary approach that is so named because it is modeled partly after the repetitive trial-and-error process of biological evolution. In this process, random populations of integer strings that encode electronic circuits play a role analogous to that of chromosomes. An evolved circuit is tested by computational simulation (prior to testing in real hardware to verify a final design). Then, in a fitness-evaluation step, responses of the circuit are compared with specifications of target responses and circuits are ranked according to how close they come to satisfying specifications. The results of the evaluation provide guidance for refining designs through further iteration.
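The evolutionary loop described above (random populations of integer strings, fitness evaluation, ranking, refinement) can be sketched with a toy fitness standing in for circuit simulation. Everything below is illustrative, not the NASA implementation: in the EHW setting each string would encode FPTA switch states and fitness would come from comparing simulated circuit responses with target responses.

```python
import random

def evolve(target, pop_size=40, mutation_rate=0.1, generations=200, seed=1):
    """Toy elitist genetic algorithm over integer strings; fitness
    here is (negative) Hamming distance to a target string."""
    rng = random.Random(seed)
    length, alphabet = len(target), 8
    pop = [[rng.randrange(alphabet) for _ in range(length)]
           for _ in range(pop_size)]

    def fitness(ind):
        return -sum(a != b for a, b in zip(ind, target))

    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)      # ranking step
        if fitness(pop[0]) == 0:
            break
        parents = pop[: pop_size // 2]           # selection (elitist)
        children = []
        while len(children) < pop_size - len(parents):
            p1, p2 = rng.sample(parents, 2)
            cut = rng.randrange(1, length)       # one-point crossover
            child = p1[:cut] + p2[cut:]
            for k in range(length):              # per-gene mutation
                if rng.random() < mutation_rate:
                    child[k] = rng.randrange(alphabet)
            children.append(child)
        pop = parents + children
    best = max(pop, key=fitness)
    return best, -fitness(best)

best, errors = evolve([3, 1, 4, 1, 5, 2, 6, 5])
print(errors)
```

The expensive part in the real method is the fitness evaluation (circuit simulation or hardware testing), which is why the reported speedup from 10 hours to under half an hour matters.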
Capitanescu, F; Rege, S; Marvuglia, A; Benetto, E; Ahmadi, A; Gutiérrez, T Navarrete; Tiruta-Barna, L
2016-07-15
Empowering decision makers with cost-effective solutions for reducing the environmental burden of industrial processes, at both the design and operation stages, is nowadays a major worldwide concern. The paper addresses this issue for the sector of drinking water production plants (DWPPs), seeking optimal solutions that trade off operation cost against life cycle assessment (LCA)-based environmental impact while satisfying outlet water quality criteria. This leads to a challenging bi-objective constrained optimization problem, which relies on a computationally expensive process-modelling simulator of the DWPP and has to be solved within a limited computational budget. Since mathematical programming methods are unusable in this case, the paper examines the performance, in tackling these challenges, of six off-the-shelf state-of-the-art global meta-heuristic optimization algorithms suitable for such simulation-based optimization, namely the Strength Pareto Evolutionary Algorithm (SPEA2), Non-dominated Sorting Genetic Algorithm (NSGA-II), Indicator-Based Evolutionary Algorithm (IBEA), Multi-Objective Evolutionary Algorithm based on Decomposition (MOEA/D), Differential Evolution (DE), and Particle Swarm Optimization (PSO). The results of optimization reveal that good reductions in both the operating cost and the environmental impact of the DWPP can be obtained. Furthermore, NSGA-II outperforms the other competing algorithms, while MOEA/D and DE perform unexpectedly poorly.
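The core step of NSGA-II, partitioning candidate (cost, impact) pairs into Pareto fronts, is easy to sketch; a simple O(n²) version (the published algorithm adds bookkeeping and crowding distance, omitted here):

```python
def dominates(a, b):
    """a dominates b if it is no worse in every objective (both
    minimized, e.g. operating cost and LCA impact) and strictly
    better in at least one."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def non_dominated_fronts(points):
    """Partition distinct points into Pareto fronts, best first."""
    remaining = list(points)
    fronts = []
    while remaining:
        front = [p for p in remaining
                 if not any(dominates(q, p) for q in remaining if q != p)]
        fronts.append(front)
        remaining = [p for p in remaining if p not in front]
    return fronts

pts = [(1, 5), (2, 2), (5, 1), (3, 3), (4, 4)]
print(non_dominated_fronts(pts))
```

The first front is the cost/impact trade-off curve a decision maker would be shown; later fronts rank the dominated candidates.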
An, Ji-Yong; Zhang, Lei; Zhou, Yong; Zhao, Yu-Jun; Wang, Da-Fu
2017-08-18
Self-interacting proteins (SIPs) are important for their biological activity owing to the inherent interactions among their secondary structures or domains. However, given the limitations of experimental self-interaction detection, one major challenge in the study of SIP prediction is how to exploit computational approaches for SIP detection based on the evolutionary information contained in protein sequences. In this work, we present a novel computational approach named WELM-LAG, which combines the Weighted Extreme Learning Machine (WELM) classifier with Local Average Group (LAG) features to predict SIPs from protein sequences. The major improvement of our method lies in an effective feature-extraction scheme used to represent candidate self-interacting proteins by exploring the evolutionary information embedded in the PSI-BLAST-constructed position-specific scoring matrix (PSSM), and then employing a reliable and robust WELM classifier to carry out classification. In addition, the Principal Component Analysis (PCA) approach is used to reduce the impact of noise. The WELM-LAG method gave very high average accuracies of 92.94 and 96.74% on yeast and human datasets, respectively. Meanwhile, we compared it with the state-of-the-art support vector machine (SVM) classifier and other existing methods on the human and yeast datasets. Comparative results indicated that our approach is very promising and may provide a cost-effective alternative for predicting SIPs. We also developed a freely available web server, WELM-LAG-SIPs, to predict SIPs. The web server is available at http://219.219.62.123:8888/WELMLAG/ .
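A fixed-length descriptor from a variable-length PSSM is the key trick such feature schemes rely on. The sketch below is one plausible local-average-group construction (split the PSSM rows into equal groups and average each group column-wise); the published LAG descriptor may differ in its details:

```python
def local_average_group(pssm, groups=4):
    """Sketch of a LAG-style feature: average the PSSM rows (one per
    residue) within each of `groups` contiguous segments, yielding a
    groups * width vector independent of sequence length.
    Requires len(pssm) >= groups."""
    n, width = len(pssm), len(pssm[0])
    features = []
    for g in range(groups):
        start, end = g * n // groups, (g + 1) * n // groups
        block = pssm[start:end]
        for col in range(width):
            features.append(sum(row[col] for row in block) / len(block))
    return features

# toy PSSM: 8 residues x 4 scores (real PSSMs have 20 columns)
pssm = [[float(i)] * 4 for i in range(8)]
feats = local_average_group(pssm, groups=4)
print(len(feats))  # 16
```

A fixed-length vector like this is what can then be handed to a classifier such as WELM or an SVM, optionally after PCA.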
Nature-Inspired Cognitive Evolution to Play MS. Pac-Man
NASA Astrophysics Data System (ADS)
Tan, Tse Guan; Teo, Jason; Anthony, Patricia
Recent developments in nature-inspired computation have heightened the need for research into its three main application areas: scientific, engineering, and industrial. Some studies report that such approaches can solve dynamic problems and are useful for improving the performance of various complex systems. So far, however, there has been little discussion of how effectively these models apply to computer and video games in particular. The focus of this research is the hybridization of nature-inspired computation methods for optimizing neural-network-based cognition in video games, in this case the combination of a neural network with an evolutionary algorithm. In essence, a neural network is an attempt to mimic the extremely complex human brain, building an artificial brain that is able to learn intelligently. An evolutionary algorithm, in turn, simulates biological evolutionary processes, evolving candidate solutions to a problem or task by applying genetic operators such as crossover, mutation, and selection. This paper investigates the ability of Evolution Strategies (ES) to evolve the internal parameters (i.e. weight and bias values) of a feed-forward artificial neural network to automatically generate Ms. Pac-Man controllers. The main objective of this game is to clear a maze of dots while avoiding the ghosts and achieving the highest possible score. The experimental results show that an ES-based system can be successfully applied to automatically generate artificial intelligence for a complex, dynamic and highly stochastic video game environment.
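The ES-plus-neural-network combination can be miniaturized: a (1+1) evolution strategy perturbing the flat weight vector of a tiny feed-forward net, with the XOR task standing in for the game-score fitness. This is a sketch under those stand-in assumptions, not the authors' setup:

```python
import math
import random

def net(w, x1, x2):
    """Tiny 2-2-1 feed-forward net: 9 weights packed in a flat list,
    tanh hidden units, linear output."""
    h1 = math.tanh(w[0] * x1 + w[1] * x2 + w[2])
    h2 = math.tanh(w[3] * x1 + w[4] * x2 + w[5])
    return w[6] * h1 + w[7] * h2 + w[8]

XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def mse(w):
    """Mean squared error on XOR; in the game setting this would be
    replaced by (negative) score from playing Ms. Pac-Man."""
    return sum((net(w, a, b) - t) ** 2 for (a, b), t in XOR) / len(XOR)

rng = random.Random(42)
w = [rng.uniform(-1, 1) for _ in range(9)]
err, sigma = mse(w), 0.5
for _ in range(3000):
    cand = [wi + rng.gauss(0, sigma) for wi in w]
    cand_err = mse(cand)
    if cand_err <= err:          # elitist (1+1)-ES selection
        w, err = cand, cand_err
    sigma *= 0.999               # slowly anneal the mutation step
print(round(err, 3))
```

No gradients are used anywhere, which is the practical appeal of ES when the fitness is a noisy game score rather than a differentiable loss.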
ERIC Educational Resources Information Center
Vitali, Julius
1990-01-01
Explains an experimental photographic technique starting with a realistic photograph. Using various media (oil painting, video/computer photography, and multiprint imagery) the artist changes the photograph's compositional elements. Outlines the phases of this evolutionary process. Illustrates four images created by the technique. (DB)
Hybrid Architectures for Evolutionary Computing Algorithms
2008-01-01
Insect-Inspired Flight Control for Unmanned Aerial Vehicles
NASA Technical Reports Server (NTRS)
Thakoor, Sarita; Stange, G.; Srinivasan, M.; Chahl, Javaan; Hine, Butler; Zornetzer, Steven
2005-01-01
Flight-control and navigation systems inspired by the structure and function of the visual system and brain of insects have been proposed for a class of developmental miniature robotic aircraft called "biomorphic flyers", described earlier in "Development of Biomorphic Flyers" (NPO-30554), NASA Tech Briefs, Vol. 28, No. 11 (November 2004), page 54. These form a subset of biomorphic explorers, which, as reported in several articles in past issues of NASA Tech Briefs ["Biomorphic Explorers" (NPO-20142), Vol. 22, No. 9 (September 1998), page 71; "Bio-Inspired Engineering of Exploration Systems" (NPO-21142), Vol. 27, No. 5 (May 2003), page 54; and "Cooperative Lander-Surface/Aerial Microflyer Missions for Mars Exploration" (NPO-30286), Vol. 28, No. 5 (May 2004), page 36], are proposed small robots, equipped with microsensors and communication systems, that would incorporate crucial functions of mobility, adaptability, and even cooperative behavior. These functions are inherent to biological organisms but represent challenging frontiers for technical systems. Biomorphic flyers could be used on Earth or on remote planets to explore sites that are otherwise difficult or impossible to reach. An example of a search/surveillance task currently being tested is obtaining high-resolution aerial imagery using a variety of miniaturized electronic cameras. The control functions to be implemented by the systems in development include holding altitude, avoiding hazards, following terrain, navigating by reference to recognizable terrain features, stabilizing flight, and landing smoothly. Flying insects perform these and other functions remarkably well, even though an insect brain contains fewer than 10^(-4) as many neurons as does the human brain. Although most insects have immobile, fixed-focus eyes and lack stereoscopy (and hence cannot perceive depth directly), they utilize a number of ingenious strategies for perceiving, and navigating in, three dimensions.
Despite their lack of stereoscopy, insects infer distances to potential obstacles and other objects from the image-motion cues that result from their own motion through the environment. The motion of texture in images, as a source of motion cues, is known generally as optic (or optical) flow. Computationally, a strategy based on optical flow is simpler than stereoscopy for avoiding hazards and following terrain. Hence, this strategy offers the potential to design vision-based control computing subsystems that would be more compact, weigh less, and demand less power than subsystems of equivalent capability based on a conventional stereoscopic approach.
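For a head-on obstacle, the optical-flow strategy reduces to a one-line estimate: time to contact equals the angular size of an object divided by its rate of expansion, with no range sensor needed. A minimal sketch (the numeric scenario is invented for illustration):

```python
def time_to_contact(theta, dtheta_dt):
    """Time to contact from image measurements alone: for a small
    object of width w at distance d closing at speed v,
    theta ~ w/d and dtheta/dt ~ w*v/d**2, so their ratio is d/v."""
    return theta / dtheta_dt

# object 1 m wide at 10 m, closing at 2 m/s:
# theta ~ 0.1 rad, dtheta/dt ~ 0.02 rad/s  ->  tau = d/v = 5 s
print(time_to_contact(0.1, 0.02))
```

This is the sort of quantity a flyer can compute directly from camera imagery to decide when to pull up or flare for landing.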
The Emergence of Operational Art in the Great Sioux War 1876-1877
2013-12-10
Numerical simulation of evolutionary erodible bedforms using the particle finite element method
NASA Astrophysics Data System (ADS)
Bravo, Rafael; Becker, Pablo; Ortiz, Pablo
2017-07-01
This paper presents a numerical strategy for the simulation of flows with evolutionary erodible boundaries. The fluid equations are fully resolved in 3D, while the sediment transport is modelled using the Exner equation and solved with an explicit Lagrangian procedure based on a fixed 2D mesh. Flow and sediment are coupled geometrically, by deforming the fluid mesh in the vertical direction, and through the velocities, with the empirical sediment flux computed using the Meyer-Peter and Müller model. A comparison with real channel experiments is performed, giving good agreement.
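The Exner bed update itself is compact: (1 - p) ∂z/∂t = -∂q_s/∂x, with a closure such as the Meyer-Peter and Müller formula supplying the flux q_s from the resolved flow. A 1D explicit sketch with a prescribed flux (the paper's 3D flow coupling and Lagrangian treatment are not reproduced here):

```python
def exner_step(z, q, dx, dt, porosity=0.4):
    """One explicit Exner update on a 1D bed:
    (1 - p) dz/dt = -dq/dx, central differences in the interior,
    boundaries held fixed. z: bed elevations; q: sediment flux at
    the same nodes (here taken as given)."""
    n = len(z)
    znew = z[:]
    for i in range(1, n - 1):
        dqdx = (q[i + 1] - q[i - 1]) / (2 * dx)
        znew[i] = z[i] - dt * dqdx / (1 - porosity)
    return znew

z = [0.0] * 11
q = [0.1 * i for i in range(11)]   # flux increasing downstream -> erosion
z1 = exner_step(z, q, dx=1.0, dt=0.1)
print(z1[5] < 0.0)                 # interior bed erodes
```

A spatially increasing flux removes more sediment than it delivers at each interior node, so the bed lowers, which is the qualitative behaviour the coupled scheme evolves in 3D.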
NASA Astrophysics Data System (ADS)
Song, Chen; Zhong-Cheng, Wu; Hong, Lv
2018-03-01
Building energy forecasting plays an important role in energy management and planning. Using a mind evolutionary algorithm to find optimal network weights and thresholds for a BP neural network can overcome the BP network's tendency to fall into a local minimum. The optimized network is used for time-series prediction and for same-month prediction, yielding two predicted values. These two predicted values are then fed into a neural network to obtain the final forecast value. The effectiveness of the method was verified by experiments with the energy consumption of three buildings in Hefei.
Geometric morphometrics and virtual anthropology: advances in human evolutionary studies.
Rein, Thomas R; Harvati, Katerina
2014-01-01
Geometric morphometric methods have been increasingly used in paleoanthropology in the last two decades, lending greater power to the analysis and interpretation of the human fossil record. More recently the advent of the wide use of computed tomography and surface scanning, implemented in combination with geometric morphometrics (GM), characterizes a new approach, termed Virtual Anthropology (VA). These methodological advances have led to a number of developments in human evolutionary studies. We present some recent examples of GM and VA related research in human evolution with an emphasis on work conducted at the University of Tübingen and other German research institutions.
VizieR Online Data Catalog: Low-mass helium white dwarfs evolutionary models (Istrate+, 2016)
NASA Astrophysics Data System (ADS)
Istrate, A.; Marchant, P.; Tauris, T. M.; Langer, N.; Stancliffe, R. J.; Grassitelli, L.
2016-07-01
Evolutionary models of low-mass helium white dwarfs including element diffusion and rotational mixing. The WDs are produced by modelling binary evolution through the LMXB channel, with final WD masses between ~0.16 and ~0.44 solar masses. The models are computed using MESA for different metallicities: Z=0.02, 0.01, 0.001, and 0.0002. For each metallicity, the models are divided into three categories: (1) basic (neither diffusion nor rotation is considered); (2) diffusion (element diffusion is considered); (3) rotation+diffusion (both element diffusion and rotational mixing are considered) (4 data files).
NASA Astrophysics Data System (ADS)
Pini, Giovanni; Tuci, Elio
2008-06-01
In biology and psychology, the capability of natural organisms to learn from observation of, and interaction with, conspecifics is referred to as social learning. Roboticists have recently developed an interest in social learning, since it might represent an effective strategy for enhancing the adaptivity of a team of autonomous robots. In this study, we show that a methodological approach based on artificial neural networks shaped by evolutionary computation techniques can be successfully employed to synthesise the individual and social learning mechanisms for robots required to learn a desired action (i.e. phototaxis or antiphototaxis).