Science.gov

Sample records for da vinci code

  1. The Real Code of Leonardo da Vinci

    PubMed Central

    Ose, Leiv

    2008-01-01

    Leonardo da Vinci was born in Italy. He is well known among researchers and scientists for his remarkable scientific work. His investigations of atherosclerosis judiciously combine three separate fields of research. In 1506, he finished his masterpiece, the painting of the Mona Lisa. A careful clinical examination of the famous painting reveals a yellow irregular leather-like spot at the inner end of the left upper eyelid and a soft bumpy well-defined swelling of the dorsum of the right hand beneath the index finger, about 3 cm long. This is probably the first case of familial hypercholesterolemia (FH). The FH code of Leonardo da Vinci was later taken up by scientists such as Carl Müller, who described xanthoma tuberosum and angina pectoris. Akira Endo searched for microbial metabolites that would inhibit HMG-CoA reductase, the rate-limiting enzyme in cholesterol synthesis, and finally Michael Brown and Joseph Goldstein published a remarkable series of elegant and insightful papers in the 1970s and 1980s, establishing that the cellular uptake of low-density lipoprotein (LDL) essentially requires the LDL receptor. In conclusion: this was the real Code of Leonardo da Vinci. PMID:19924278

  2. The Da Vinci code dynamically de-coded.

    PubMed

    Cohen, Mariam

    2005-01-01

    The novel The Da Vinci Code, by Dan Brown, has been on best-seller lists for over two years. An examination of Brown's previous novels reveals a well-designed plot line shared by all four novels that not only makes them good "thrillers" but also gives them a mythological structure that draws on common unconscious fantasies in the same way that fairy tales do. One aspect of this mythological structure is the use of evil conspiracies (and benign ones as well) for the protagonist to overcome. In addition, The Da Vinci Code presents a religious theme involving legends about Mary Magdalene. This theme touches on the role of a feminine aspect of divinity in allowing for an erotic connection with the divine.

  3. Leonardo da Vinci and the Downburst.

    NASA Astrophysics Data System (ADS)

    Gedzelman, Stanley David

    1990-05-01

    Evidence from the drawings, experiments, and writings of Leonardo da Vinci is presented to demonstrate that da Vinci recognized and possibly discovered the downburst and understood its associated airflow. Other early references to vortex flows resembling downbursts are mentioned.

  4. Media Images Abbott and Costello Meet the End of the World: Who Is the Enemy in "The Da Vinci Code" and "An Inconvenient Truth?"

    ERIC Educational Resources Information Center

    Beck, Bernard

    2007-01-01

    Popular culture requires readily identifiable villains. Subcultural groups often serve this role, creating controversies. Controversies based on religion are especially bitter. As a rule, religion in the movies is inoffensively sentimental, but "The Da Vinci Code" is both popular and provocative, treading on the dangerous ground of Jesus's…

  5. How to Think Like Leonardo da Vinci

    ERIC Educational Resources Information Center

    Caouette, Ralph

    2008-01-01

    To be effective and relevant in twenty-first-century learning, art needs to be more inclusive. In this article, the author discusses how teachers can find a good example in Leonardo da Vinci for building an art program. His art, design, and curiosity are the perfect foundation for any art program, at any level. (Contains 3 resources and 3 online…

  6. Hidden sketches by Leonardo da Vinci revealed

    NASA Astrophysics Data System (ADS)

    Dumé, Belle

    2009-02-01

    Three drawings on the back of Leonardo da Vinci's The Virgin and Child with St Anne (circa 1508) have been discovered by researchers led by Michel Menu from the Centre de Recherche et de Restauration des Musées de France (C2RMF) and the Louvre Museum in Paris.

  7. [Leonardo da Vinci--a dyslectic genius?].

    PubMed

    Røsstad, Anna

    2002-12-10

    Leonardo da Vinci's texts consist almost exclusively of scientific notes. Working on a book on Leonardo's art, I studied all Leonardo's published texts carefully for any new information. In some prefaces I came to suspect that Leonardo might have suffered from dyslexia. This article considers the question of whether it is possible to find indications of dyslexia in Leonardo's texts and in the accounts of his life.

  8. Leonardo da Vinci's studies of the heart.

    PubMed

    Shoja, Mohammadali M; Agutter, Paul S; Loukas, Marios; Benninger, Brion; Shokouhi, Ghaffar; Namdar, Husain; Ghabili, Kamyar; Khalili, Majid; Tubbs, R Shane

    2013-08-20

    Leonardo da Vinci's detailed drawings are justly celebrated; however, less well known are his accounts of the structures and functions of the organs. In this paper, we focus on his illustrations of the heart, his conjectures about heart and blood vessel function, his experiments on model systems to test those conjectures, and his unprecedented conclusions about the way in which the cardiovascular system operates. In particular, da Vinci seems to have been the first to recognize that the heart is a muscle and that systole is the active phase of the pump. He also seems to have understood the functions of the auricles and pulmonary veins, identified the relationship between the cardiac cycle and the pulse, and explained the hemodynamic mechanism of valve opening and closure. He also described anatomical variations and changes in structure and function that occurred with age. We outline da Vinci's varied career and suggest ways in which his personality, experience, skills and intellectual heritage contributed to these advances in understanding. We also consider his influence on later studies in anatomy and physiology.

  9. The Case: Bunche-Da Vinci Learning Partnership Academy

    ERIC Educational Resources Information Center

    Eisenberg, Nicole; Winters, Lynn; Alkin, Marvin C.

    2005-01-01

    The Bunche-Da Vinci case described in this article presents a situation at Bunche Elementary School that four theorists were asked to address in their evaluation designs (see EJ791771, EJ719772, EJ791773, and EJ792694). The Bunche-Da Vinci Learning Partnership Academy, an elementary school located between an urban port city and a historically…

  10. Leonardo da Vinci's contributions to neuroscience.

    PubMed

    Pevsner, Jonathan

    2002-04-01

    Leonardo da Vinci (1452-1519) made far-reaching contributions to many areas of science, technology and art. Leonardo's pioneering research into the brain led him to discoveries in neuroanatomy (such as those of the frontal sinus and meningeal vessels) and neurophysiology (he was the first to pith a frog). His injection of hot wax into the brain of an ox provided a cast of the ventricles, and represents the first known use of a solidifying medium to define the shape and size of an internal body structure. Leonardo developed an original, mechanistic model of sensory physiology. He undertook his research with the broad goal of providing physical explanations of how the brain processes visual and other sensory input, and integrates that information via the soul.

  11. Tree Branching: Leonardo da Vinci's Rule versus Biomechanical Models

    PubMed Central

    Minamino, Ryoko; Tateno, Masaki

    2014-01-01

    This study examined Leonardo da Vinci's rule (i.e., the sum of the cross-sectional area of all tree branches above a branching point at any height is equal to the cross-sectional area of the trunk or the branch immediately below the branching point) using simulations based on two biomechanical models: the uniform stress and elastic similarity models. Model calculations of the daughter/mother ratio (i.e., the ratio of the total cross-sectional area of the daughter branches to the cross-sectional area of the mother branch at the branching point) showed that both biomechanical models agreed with da Vinci's rule when the branching angles of daughter branches and the weights of lateral daughter branches were small; however, the models deviated from da Vinci's rule as the weights and/or the branching angles of lateral daughter branches increased. The calculated values of the two models were largely similar but differed in some ways. Field measurements of Fagus crenata and Abies homolepis also fit this trend, wherein models deviated from da Vinci's rule with increasing relative weights of lateral daughter branches. However, this deviation was small for a branching pattern in nature, where empirical measurements were taken under realistic measurement conditions; thus, da Vinci's rule did not critically contradict the biomechanical models in the case of real branching patterns, though the model calculations described the contradiction between da Vinci's rule and the biomechanical models. The field data for Fagus crenata fit the uniform stress model best, indicating that stress uniformity is the key constraint of branch morphology in Fagus crenata rather than elastic similarity or da Vinci's rule. On the other hand, mechanical constraints are not necessarily significant in the morphology of Abies homolepis branches, depending on the number of daughter branches. Rather, these branches were often in agreement with da Vinci's rule. PMID:24714065
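As the abstract defines it, da Vinci's rule is a conservation of cross-sectional area across a branching point, and the daughter/mother ratio is the quantity the models compute. As an illustrative sketch (not the authors' simulation code), the ratio can be computed from branch diameters as:

```python
import math

def davinci_ratio(mother_diam, daughter_diams):
    """Daughter/mother ratio: total cross-sectional area of the daughter
    branches divided by the cross-sectional area of the mother branch."""
    mother_area = math.pi * (mother_diam / 2) ** 2
    daughter_area = sum(math.pi * (d / 2) ** 2 for d in daughter_diams)
    return daughter_area / mother_area

# da Vinci's rule predicts a ratio of 1.0: area is conserved across the
# branching point. A 6-8-10 diameter triple satisfies it exactly,
# since 6^2 + 8^2 = 10^2.
print(davinci_ratio(10.0, [8.0, 6.0]))  # → 1.0
```

A ratio below 1.0 would indicate tapering beyond what the rule predicts, which is the kind of deviation the biomechanical models produce for large branching angles.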

  12. Leonardo da Vinci (1452-1519)

    NASA Astrophysics Data System (ADS)

    Murdin, P.

    2000-11-01

    Painter, inventor and polymath, born in Vinci (near Empoli), Italy. Although astronomy does not figure prominently in Leonardo's works, he realized the possibility of constructing a telescope (`making glasses to see the Moon enlarged'). He suggested that `… in order to observe the nature of the planets, open the roof and bring the image of a single planet onto the base of a concave mirror. The image o...

  13. Leonardo da Vinci and the sinuses of Valsalva.

    PubMed

    Robicsek, F

    1991-08-01

    Recent studies indicate that eddy currents generated in the sinuses of Valsalva play an important role in the physiologic closure of the aortic valve. This process is briefly discussed, and evidence is presented that it was well known to, and elaborated upon by, the Renaissance artist Leonardo da Vinci, as illustrated by his words and drawings.

  14. Studying and Working Abroad. Leonardo da Vinci Series: Good Practices.

    ERIC Educational Resources Information Center

    Commission of the European Communities, Brussels (Belgium). Directorate-General for Education and Culture.

    This document profiles recent successful examples of students studying and working abroad as part of the European Commission's Leonardo da Vinci program, which is designed to give students across the European Union the opportunity to experience vocational training in a foreign country. The following examples are presented: (1) 3 Finnish students…

  15. Training and Health. Leonardo da Vinci Series: Good Practices.

    ERIC Educational Resources Information Center

    Commission of the European Communities, Brussels (Belgium). Directorate-General for Education and Culture.

    This document profiles programs in the fields of health and medicine that are offered through the European Commission's Leonardo da Vinci program. The following programs are profiled: (1) CYTOTRAIN (a transnational vocational training program in cervical cancer screening); (2) Apollo (a program of open and distance learning for paramedical…

  16. Women and Technical Professions. Leonardo da Vinci Series: Good Practices.

    ERIC Educational Resources Information Center

    Commission of the European Communities, Brussels (Belgium). Directorate-General for Education and Culture.

    This document profiles programs for women in technical professions that are offered through the European Commission's Leonardo da Vinci program. The following programs are profiled: (1) Artemis and Diana (vocational guidance programs to help direct girls toward technology-related careers); (2) CEEWIT (an Internet-based information and…

  17. The Potential da Vinci in All of Us

    ERIC Educational Resources Information Center

    Petto, Sarah; Petto, Andrew

    2009-01-01

    The study of the human form is fundamental to both science and art curricula. For vertebrates, perhaps no feature is more important than the skeleton in determining observable form and function. As Leonardo da Vinci's famous Proportions of the Human Figure (Vitruvian Man) illustrates, the size, shape, and proportions of the human body are defined by…

  18. [Lobectomy for lung cancer using the Da Vinci surgical system].

    PubMed

    Nakamura, Hiroshige

    2014-05-01

    Robot-assisted surgery using the da Vinci surgical system has attracted attention in lung cancer surgery because of its excellent operability: tremor-free jointed forceps working under the clear vision of a three-dimensional high-definition camera. Although this form of advanced medical care is not yet approved for insurance coverage, it is at the stage of clinical research and is expected to be useful in hilar exposure, lymph node dissection, and suturing of the lung parenchyma or bronchus. Lung cancer surgery with the da Vinci system has the advantage of combining the merits of thoracotomy with the minimal invasiveness of video-assisted thoracic surgery. However, safety management, education, and significant cost are problems to be resolved. Several important issues, such as sharing knowledge and technology of robotic surgery, education, training, development of new instruments, and acquisition of advanced medical insurance coverage, are discussed for the future development of robotic surgical systems. PMID:24946522

  20. On the sexual intercourse drawings of Leonardo da Vinci.

    PubMed

    Morris, A G

    1986-04-12

    Leonardo da Vinci's marvellous anatomical drawings have been praised by both artists and medical historians over the centuries. Specific reference is made here to Leonardo's drawings of the act of sexual intercourse. It is shown that his illustrations are not based purely on original observation. Rather, they are an attempt to illustrate and clarify anatomy and physiology as presented in the textbooks of the time.

  1. Da Vinci's codex and the anatomy of healthcare.

    PubMed

    Stephens-Borg, Keith

    2012-08-01

    We usually display a laid-back approach to medical jargon throughout our theatre work. The word 'perioperative' is built from the Greek word 'peri' (around) and the Latin 'operari' (to work). Latin and Greek became the prefixed language of choice for Leonardo da Vinci, and his research was pivotal in determining the way in which surgical procedures are documented. Ancient manuscripts aided the unfolding of the secrets of anatomy, and Leonardo revealed that art was the key in expressive detailed explanation.

  2. A Proposal to Build Evaluation Capacity at the Bunche-Da Vinci Learning Partnership Academy

    ERIC Educational Resources Information Center

    King, Jean A.

    2005-01-01

    The author describes potential evaluation capacity-building activities in contrast to the specifics of an evaluation design. Her response to the case of the Bunche-Da Vinci Learning Partnership Academy is developed in three parts: (1) an initial framing of the Bunche-Da Vinci situation; (2) what should be done before signing a contract; and (3)…

  3. A Creative Approach to the Common Core Standards: The Da Vinci Curriculum

    ERIC Educational Resources Information Center

    Chaucer, Harry

    2012-01-01

    "A Creative Approach to the Common Core Standards: The Da Vinci Curriculum" challenges educators to design programs that boldly embrace the Common Core State Standards by imaginatively drawing from the genius of great men and women such as Leonardo da Vinci. A central figure in the High Renaissance, Leonardo made extraordinary contributions as a…

  4. Leonardo da Vinci and Kethem-Kiveris vena.

    PubMed

    Dolezal, Antonín; Skorepova-Honzlova, Zita; Jelen, Karel

    2012-01-01

    Leonardo da Vinci's drawing of coitus pictures the contemporary hypotheses regarding this act. The authors analyze the mamillo-uterine connection depicted by the artist and come to believe that it represents the hypothetical kiveris vena, a female vein described by the anatomist Master Nicolai Physicus of the Salerno School. Hebrew roots were found in the name. The connection is also described by Mondino in the Anathomia, and the same connection can be found in the picture of the pregnant woman in the Fasciculus Medicinæ by Johannes de Ketham.

  5. LEONARDO DA VINCI AND THE ORIGIN OF SEMEN.

    PubMed

    Noble, Denis; DiFrancesco, Dario; Zancani, Diego

    2014-12-20

    It is well known that Leonardo da Vinci made several drawings of the human male anatomy. The early drawings (before 1500) were incorrect in identifying the origin of semen, where he followed accepted teaching of his time. It is widely thought that he did not correct this mistake, a view that is reflected in several biographies. In fact, he made a later drawing (after 1500) in which the description of the anatomy is remarkably accurate and must have been based on careful dissection. In addition to highlighting this fact, acknowledged previously in only one other source, this article reviews the background to Leonardo's knowledge of the relevant anatomy.

  6. Visual tracking of da Vinci instruments for laparoscopic surgery

    NASA Astrophysics Data System (ADS)

    Speidel, S.; Kuhn, E.; Bodenstedt, S.; Röhl, S.; Kenngott, H.; Müller-Stich, B.; Dillmann, R.

    2014-03-01

    Intraoperative tracking of laparoscopic instruments is a prerequisite to realize further assistance functions. Since endoscopic images are always available, this sensor input can be used to localize the instruments without special devices or robot kinematics. In this paper, we present an image-based markerless 3D tracking of different da Vinci instruments in near real-time without an explicit model. The method is based on different visual cues to segment the instrument tip, calculates a tip point, and uses a multiple-object particle filter for tracking. Accuracy and robustness are evaluated with in vivo data.
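The pipeline the abstract describes (segment the tip, extract a tip point, track it with a particle filter) can be illustrated with a generic bootstrap particle filter on synthetic 2D tip observations. This is a sketch of the general technique under assumed noise parameters, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def particle_filter_step(particles, weights, observation,
                         motion_std=2.0, obs_std=3.0):
    """One bootstrap-filter update: diffuse particles with a random-walk
    motion model, reweight by a Gaussian likelihood of the observed tip
    point, then resample."""
    # Predict: random-walk motion model.
    particles = particles + rng.normal(0.0, motion_std, particles.shape)
    # Update: Gaussian likelihood of the observation given each particle.
    d2 = np.sum((particles - observation) ** 2, axis=1)
    weights = weights * np.exp(-d2 / (2 * obs_std ** 2))
    weights /= weights.sum()
    # Resample (multinomial, for brevity; systematic resampling is the
    # usual refinement).
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], np.full(len(particles), 1.0 / len(particles))

# Track a synthetic instrument tip moving diagonally across the image.
n = 500
particles = rng.uniform(0, 100, size=(n, 2))
weights = np.full(n, 1.0 / n)
for t in range(30):
    true_tip = np.array([10.0 + 2 * t, 20.0 + t])
    obs = true_tip + rng.normal(0.0, 3.0, 2)  # noisy segmentation result
    particles, weights = particle_filter_step(particles, weights, obs)
estimate = particles.mean(axis=0)  # posterior mean ≈ current tip position
```

In the paper's setting the observation would come from the visual-cue segmentation of the endoscopic image rather than from synthetic noise, and the state would be 3D.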

  7. Leonardo da Vinci and the origin of semen

    PubMed Central

    Noble, Denis; DiFrancesco, Dario; Zancani, Diego

    2014-01-01

    It is well known that Leonardo da Vinci made several drawings of the human male anatomy. The early drawings (before 1500) were incorrect in identifying the origin of semen, where he followed accepted teaching of his time. It is widely thought that he did not correct this mistake, a view that is reflected in several biographies. In fact, he made a later drawing (after 1500) in which the description of the anatomy is remarkably accurate and must have been based on careful dissection. In addition to highlighting this fact, acknowledged previously in only one other source, this article reviews the background to Leonardo's knowledge of the relevant anatomy. PMID:27494016

  8. Leonardo Da Vinci and stroke - vegetarian diet as a possible cause.

    PubMed

    Oztürk, Serefnur; Altieri, Marta; Troisi, Pina

    2010-01-01

    Leonardo da Vinci (April 15, 1452 to May 2, 1519) was an Italian Renaissance architect, musician, anatomist, inventor, engineer, sculptor, geometer, and painter. It has been gleaned from the many available historical documents that da Vinci was a vegetarian who respected and loved animals, and that he suffered from right hemiparesis in the last 5 years of his life. A vegetarian diet has both positive and negative influences on the cerebrovascular system. In this report, a possible relation between a vegetarian diet and stroke is discussed from various perspectives as related to Leonardo da Vinci's stroke.

  9. Sine ars scientia nihil est: Leonardo da Vinci and beyond.

    PubMed

    Kickhöfel, Eduardo H P

    2009-01-01

    The aim of this article is to reflect on the relationship between art and science so far as it concerns a symposium on neurosciences. We undertake a historical overview of that relationship, paying particular attention to the sui generis case of Leonardo da Vinci, who very often is regarded as the man who worked on art and science with equal ease. We then explain why his idea of merging these two forms of knowledge failed, considering the clear-cut distinction between art and science in his time. With this clarification, we explore the matter today. We look at Raphael's The Transfiguration, in which the representation of the possessed boy is seen by neuroscientists as indicative of an epileptic seizure. We also look at the ideas of neuroscientists Semir Zeki and Vilayanur Ramachandran, who study particular aspects of brain function and suggest a new merging of art and science.

  10. Leonardo da Vinci: the search for the soul.

    PubMed

    Del Maestro, R F

    1998-11-01

    The human race has always contemplated the question of the anatomical location of the soul. During the Renaissance the controversy crystallized into those individuals who supported the heart ("cardiocentric soul") and others who supported the brain ("cephalocentric soul") as the abode for this elusive entity. Leonardo da Vinci (1452-1519) joined a long list of other explorers in the "search for the soul." The method he used to resolve this anatomical problem involved the accumulation of information from ancient and contemporary sources, careful notetaking, discussions with acknowledged experts, and his own personal search for the truth. Leonardo used a myriad of innovative methods acquired from his knowledge of painting, sculpture, and architecture to define more clearly the site of the "senso comune"--the soul. In this review the author examines the sources of this ancient question, the knowledge base tapped by Leonardo for his personal search for the soul, and the views of key individuals who followed him.

  11. [Regarding the Manuscript D " Dell' occhio " of Leonardo da Vinci].

    PubMed

    Heitz, Robert F

    2009-01-01

    Leonardo da Vinci's Manuscript D consists of five double-page sheets which, folded in two, comprise ten folios. This document, in the old Tuscan dialect and mirror writing, reveals Leonardo's ideas on the anatomy of the eye in relation to the formation of images and visual perception. Leonardo explains in particular the behavior of rays in the eye in terms of refraction and reflection, and is very mechanistic in his conception of the eye and of the visual process. The most significant innovations found in these folios are the concept of the eye as a camera obscura and the intersection of light rays in the interior of the eye. His texts nevertheless show hesitation, doubts and a troubled confusion, reflecting the ideas and uncertainties of his era. He did not share his results in his lifetime, despite both printing and etching being readily available to him.

  12. Thinking like Leonardo da Vinci and its implications for the modern doctor.

    PubMed

    Baum, Neil

    2013-01-01

    Most people, when asked to name the most creative, innovative, and multidimensional people in history, would agree that Leonardo da Vinci is at or very near the top of that list. Wouldn't it be nice to think like da Vinci? This article shares the seven unique principles of thinking that enabled da Vinci to be the greatest painter, sculptor, architect, musician, mathematician, engineer, inventor, anatomist, geologist, cartographer, botanist, and writer of his (if not of all) time. This article will take you deep into the notebooks and codices of da Vinci, and suggest ways his ideas can be used by anyone in the healthcare profession to become a better healthcare provider. PMID:24228380

  14. Load evaluation of the da Vinci surgical system for transoral robotic surgery.

    PubMed

    Fujiwara, Kazunori; Fukuhara, Takahiro; Niimi, Koji; Sato, Takahiro; Kitano, Hiroya

    2015-12-01

    Transoral robotic surgery, performed with the da Vinci surgical system (da Vinci), is a surgical approach for benign and malignant lesions of the oral cavity and laryngopharynx. It provides several unique advantages, including a 3-dimensional magnified view and the ability to see and work around curves or angles. However, the current da Vinci surgical system does not provide haptic feedback. This is problematic because the potential risks specific to transoral use of the da Vinci include tooth injury, mucosal laceration, ocular injury and mandibular fracture. To assess the potential for intraoperative injuries, we measured the loads exerted by the endoscope and the instrument of the da Vinci Si surgical system. We pressed the endoscope and instrument of the da Vinci Si against a load cell six times each and measured the dynamic load and the time-to-maximum load. We also struck the da Vinci Si endoscope and instrument against the load cell six times each and measured the impact load. The maximum dynamic load was 7.27 ± 1.31 kg for the endoscope and 1.90 ± 0.72 kg for the instrument. The corresponding times-to-maximum load were 1.72 ± 0.22 and 1.29 ± 0.34 s; the impact loads were significantly lower than the dynamic loads. It remains possible that a major load is exerted on adjacent structures by continuous contact with the endoscope and instrument of the da Vinci Si. However, there is a minor delay in reaching the maximum load, so careful monitoring by an on-site assistant may help prevent injury to adjacent structures.
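The reported figures follow the convention of mean ± sample standard deviation over the six trials. As a minimal sketch of that computation (the six raw measurements below are hypothetical; only the summary statistics appear in the abstract):

```python
import statistics

# Hypothetical endoscope trial data in kg; the paper's actual six
# measurements are not given in the abstract.
endoscope_loads = [7.9, 6.1, 8.3, 5.8, 7.4, 8.1]

mean = statistics.mean(endoscope_loads)
std = statistics.stdev(endoscope_loads)  # sample (n-1) standard deviation
print(f"maximum dynamic load: {mean:.2f} ± {std:.2f} kg")
```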

  15. [Project Leonardo-da-Vinci for better nursing care].

    PubMed

    Gábor, Katalin; Csanádi, Lajosné; Helembai, Kornélia; Szögi, Zoltánné; Tulkán, Ibolya; Unginé, Kántor Katalin

    2002-08-18

    The aim of the present paper is to inform physicians about the work completed by nurses and professors of baccalaureate nursing in the framework of the Leonardo da Vinci project, organised and sponsored by the European Union. The goal of the project was to increase the effectiveness of chief nurses through a further training programme in the field of management. The Szeged team chose human resource management, since in this field the greatest improvement can be achieved with the smallest financial investment. We measured nurse turnover and absenteeism, changes in level of education, and nurses' and patients' satisfaction at the beginning and at the end of the period studied. Except for patient satisfaction, all parameters had improved by the end of the tested period. The project provided a unique opportunity to compare the state of Hungarian nursing with that of the countries of the European Union, to exchange experience and to learn new methods. In the framework of this project a two-volume book was prepared containing the EU's recommendations. The book is widely available in English and in French.

  16. Leonardo da Vinci and the first hemodynamic observations.

    PubMed

    Martins e Silva, J

    2008-02-01

    Leonardo da Vinci was a genius whose accomplishments and ideas come down to us today, five centuries later, with the freshness of innovation and the fascination of discovery. This brief review begins with a summary of Leonardo's life and a description of the most important works of art that he bequeathed us, and then concentrates on his last great challenge. There was a point at which Leonardo's passion for art gave way to the study of human anatomy, not only to improve his drawing but to go beyond what had been simply a representation of form to understand the underlying functioning. Among his many interests, we focus on his study of the heart and blood vessels, which he observed carefully in animals and human autopsies, and reproduced in drawings of great quality with annotations of astonishing acuteness. The experience that he had acquired from observing the flow of water in currents and around obstacles, and the conclusions that he drew concerning hydrodynamics, were central to his interpretation of the mechanisms of the heart and of blood flow, to which he devoted much of his time between 1508 and 1513. From these studies, immortalized in drawings of great clarity, come what are acknowledged to be the first hemodynamic records, in which Leonardo demonstrates the characteristics of blood flow in the aorta and great vessels, the importance of blood reflux, and the formation of eddies in the sinuses of Valsalva during aortic valve closure. Through his assiduous and careful observations, and his subsequent deductions, Leonardo put forward detailed findings on hemodynamic questions that advanced technology has only recently enabled us to confirm.

  17. Evolution of robots throughout history from Hephaestus to Da Vinci Robot.

    PubMed

    Iavazzo, Christos; Gkegke, Xanthi-Ekaterini D; Iavazzo, Paraskevi-Evangelia; Gkegkes, Ioannis D

    2014-01-01

    The da Vinci robot is increasingly used in operations, bringing the advantages of robotics to medicine. This historical article aims to present the evolution of robots in medicine from the time of ancient myths, through the Renaissance, to the current revolutionary applications. We endeavored to collect several elegant narratives on the topic. A little imagination can help the reader find the similarities. A trip from the Greek myths of Hephaestus, through Aristotle and Leonardo Da Vinci, to the robots of Karel Capek and Isaac Asimov, and finally the invention of medical robots, is presented. PMID:25811686

  19. Leonardo da Vinci, One Year on...a Different Look at Vocational Training in Europe.

    ERIC Educational Resources Information Center

    Le Magazine, 1996

    1996-01-01

    Discusses the success of the Leonardo da Vinci program, a European laboratory of innovation in vocational training, a priority focus of investment in human resources and intelligence, and a way to mobilize innovative forces beyond national boundaries. Trends identified by the program focus on new information and communication technologies. (JOW)

  20. Transparency of Vocational Qualifications: The Leonardo da Vinci Approach. CEDEFOP Panorama Series.

    ERIC Educational Resources Information Center

    Bjornavold, Jens; Pettersson, Sten

    This report gives an overview of the situation of transparency of vocational qualifications by presenting measures introduced at the European Community level and by drawing attention to projects within the Leonardo da Vinci Program dealing with the issue. A 16-page executive summary appears first. Chapter 1 provides general background and aims.…

  1. Modifications of transaxillary approach in endoscopic da Vinci-assisted thyroid and parathyroid gland surgery.

    PubMed

    Al Kadah, Basel; Piccoli, Micaela; Mullineris, Barbara; Colli, Giovanni; Janssen, Martin; Siemer, Stephan; Schick, Bernhard

    2015-03-01

    Endoscopic surgery for treatment of thyroid and parathyroid pathologies is increasingly gaining attention. The da Vinci system has already been widely used in different fields of medicine and, quite recently, in thyroid and parathyroid surgery. Herein, we report modifications of the transaxillary approach in endoscopic surgery of thyroid and parathyroid gland pathologies using the da Vinci system. Sixteen patients, 14 suffering from struma nodosa and two from parathyroid adenomas, were treated using the da Vinci system at the ENT Department of Homburg/Saar University in cooperation with the Department of General Surgery of the New Sant'Agostino Hospital, Modena, Italy. Two different retractors, endoscopic preparation of the access, and three different incision modalities were used. The endoscopic preparation of the access gave us a better view during preparation and reduced surgical time compared with the use of a headlamp. To introduce the da Vinci instruments at the end of the access preparation, skin incisions were made over the axilla: one incision in eight patients, two incisions in four patients and three incisions in a further four patients. The two- and three-incision modalities allowed introduction of the da Vinci instruments without arm conflicts. The use of a new retractor (the Modena retractor), compared with a self-developed retractor, eased the endoscopic preparation of the access and the repositioning of the retractor. The scar was hidden in the axilla and, independent of the incisions selected, the cosmetic results were judged by the patients to be excellent. The neurovascular structures, such as the inferior laryngeal nerve, superior laryngeal nerve and vessels, as well as the different pathologies, were clearly visualized in 3D in all 16 cases. No paralysis of the vocal cord was observed. All patients had a benign pathology on histological examination. The endoscopic surgery of the thyroid and parathyroid gland can be

  2. Towards the Implementation of an Autonomous Camera Algorithm on the da Vinci Platform.

    PubMed

    Eslamian, Shahab; Reisner, Luke A; King, Brady W; Pandya, Abhilash K

    2016-01-01

    Camera positioning is critical for all telerobotic surgical systems. Inadequate visualization of the remote site can lead to serious errors that can jeopardize the patient. An autonomous camera algorithm has been developed on a medical robot (da Vinci) simulator. It is found to be robust in key scenarios of operation. This system behaves with predictable and expected actions for the camera arm with respect to the tool positions. The implementation of this system is described herein. The simulation closely models the methodology needed to implement autonomous camera control in a real hardware system. The camera control algorithm follows three rules: (1) keep the view centered on the tools, (2) keep the zoom level optimized such that the tools never leave the field of view, and (3) avoid unnecessary movement of the camera that may distract/disorient the surgeon. Our future work will apply this algorithm to the real da Vinci hardware.
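    The three camera rules described above can be sketched in a few lines of code. This is an illustrative sketch only, not the authors' implementation; all function names, the field-of-view angle, and the deadband threshold are hypothetical assumptions chosen to make the rules concrete.

    ```python
    import math

    # Hypothetical tuning constants (not from the paper).
    DEADBAND = 0.02                    # m: ignore target shifts smaller than this (rule 3)
    FOV_HALF_ANGLE = math.radians(30)  # assumed camera half field of view
    MARGIN = 1.2                       # zoom-out safety factor so tools never leave the view

    def camera_target(tool_a, tool_b):
        """Rule 1: aim the camera at the midpoint between the two tool tips."""
        return tuple((a + b) / 2 for a, b in zip(tool_a, tool_b))

    def camera_distance(tool_a, tool_b):
        """Rule 2: back the camera off far enough that the tool separation
        fits inside the field of view, with a safety margin."""
        separation = math.dist(tool_a, tool_b)
        return MARGIN * (separation / 2) / math.tan(FOV_HALF_ANGLE)

    def update_camera(state, tool_a, tool_b):
        """Rule 3: only move the camera when the target has shifted enough
        to matter, avoiding small, distracting motions."""
        target = camera_target(tool_a, tool_b)
        if state is None or math.dist(state["target"], target) > DEADBAND:
            state = {"target": target,
                     "distance": camera_distance(tool_a, tool_b)}
        return state

    state = update_camera(None, (0.00, 0.00, 0.10), (0.06, 0.00, 0.10))
    print(state["target"])  # camera aims at the midpoint of the two tool tips
    ```

    A real controller would of course smooth the camera motion and work in the robot's joint space, but the deadband-plus-midpoint structure above captures the stated behaviour: the view stays centered, the zoom tracks tool separation, and sub-threshold tool motion leaves the camera still.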

  3. From Jacobeaus to the da Vinci: thoracoscopic applications of the robot.

    PubMed

    Al-Mufarrej, Faisal; Margolis, Marc; Tempesta, Barbara; Strother, Eric; Najam, Farzad; Gharagozloo, Farid

    2010-02-01

    With the increasing recognition of the benefits of minimally invasive surgery, surgical technology has evolved significantly since Jacobeaus' first attempt at thoracoscopy 100 years ago. Currently, video-assisted thoracic surgery occupies a significant role in the diagnosis and treatment of benign and malignant diseases of the chest. However, the clinical application of video-assisted thoracic surgery is limited by the technical shortcomings of the approach. Although the da Vinci system (Intuitive Surgical) is not the first robotic surgical system, it has been the most successful and widely applicable. After early applications in general and urologic surgery, the da Vinci robot extended its arms into the field of thoracic surgery, broadening the applicability of minimally invasive thoracic surgery. We review the available literature on robot-assisted thoracic surgery in an attempt to better define the current role of the robot in pulmonary, mediastinal, and esophageal surgery.

  5. Visual degradation in Leonardo da Vinci's iconic self-portrait: A nanoscale study

    NASA Astrophysics Data System (ADS)

    Conte, A. Mosca; Pulci, O.; Misiti, M. C.; Lojewska, J.; Teodonio, L.; Violante, C.; Missori, M.

    2014-06-01

    The discoloration of ancient paper, due to the development of oxidized groups acting as chromophores in its chief component, cellulose, is responsible for severe visual degradation of ancient artifacts. By adopting a non-destructive approach based on the combination of optical reflectance measurements and time-dependent density functional theory ab-initio calculations, we describe and quantify the chromophores affecting Leonardo da Vinci's iconic self-portrait. Their relative concentrations are very similar to those measured in modern and ancient samples aged in humid environments. This analysis quantifies the present level of optical degradation of Leonardo da Vinci's self-portrait which, compared with future measurements, will allow its degradation rate to be assessed. This is fundamental information for planning appropriate conservation strategies.

  6. Surgical techniques: robot-assisted laparoscopic colposacropexy with the da Vinci(®) surgical system.

    PubMed

    Matthews, Catherine A

    2009-03-01

    Colposacropexy is the gold-standard operation for repair of apical vaginal support defects. While it is feasible to perform this operation using conventional laparoscopic techniques, a limited number of surgeons have mastered the advanced minimally invasive skills that are required. Introduction of the da Vinci(®) robotic system with instruments that have improved dexterity and precision and a camera system with three-dimensional imaging presents an opportunity for more surgeons treating women with pelvic organ prolapse to perform the procedure laparoscopically. This paper will outline a technique that is exactly modeled after the open procedure for completion of a robotic-assisted colposacropexy using the da Vinci(®) surgical system. PMID:27628451

  7. The Handedness of Leonardo da Vinci: A Tale of the Complexities of Lateralisation

    ERIC Educational Resources Information Center

    McManus, I. C.; Drury, Helena

    2004-01-01

    The handedness of Leonardo da Vinci is controversial. Although there is little doubt that many of his well-attributed drawings were drawn with the left hand, the hatch marks of the shading going downwards from left to right, it is not clear that he was a natural left-hander, there being some suggestion that he may have become left-handed as the…

  8. da Vinci robot-assisted keyhole neurosurgery: a cadaver study on feasibility and safety.

    PubMed

    Marcus, Hani J; Hughes-Hallett, Archie; Cundy, Thomas P; Yang, Guang-Zhong; Darzi, Ara; Nandi, Dipankar

    2015-04-01

    The goal of this cadaver study was to evaluate the feasibility and safety of da Vinci robot-assisted keyhole neurosurgery. Several keyhole craniotomies were fashioned including supraorbital subfrontal, retrosigmoid and supracerebellar infratentorial. In each case, a simple durotomy was performed, and the flap was retracted. The da Vinci surgical system was then used to perform arachnoid dissection towards the deep-seated intracranial cisterns. It was not possible to simultaneously pass the 12-mm endoscope and instruments through the keyhole craniotomy in any of the approaches performed, limiting visualization. The articulated instruments provided greater dexterity than existing tools, but the instrument arms could not be placed in parallel through the keyhole craniotomy and, therefore, could not be advanced to the deep cisterns without significant clashing. The da Vinci console offered considerable ergonomic advantages over the existing operating room arrangement, allowing the operating surgeon to remain non-sterile and seated comfortably throughout the procedure. However, the lack of haptic feedback was a notable limitation. In conclusion, while robotic platforms have the potential to greatly enhance the performance of transcranial approaches, there is strong justification for research into next-generation robots, better suited to keyhole neurosurgery.

  9. [History of robotics: from archytas of tarentum until Da Vinci robot. (Part II)].

    PubMed

    Sánchez-Martín, F M; Jiménez Schlegl, P; Millán Rodríguez, F; Salvador-Bayarri, J; Monllau Font, V; Palou Redorta, J; Villavicencio Mavrich, H

    2007-03-01

    Robotic surgery is a reality. In order to understand how new robots work, it is interesting to know the history of ancient (see Part I) and modern robotics. The desire to design automatic machines imitating humans has persisted for more than 4000 years. Archytas of Tarentum (around 400 B.C.), Heron of Alexandria, Hsieh-Fec, Al-Jazari, Bacon, Turriano, Leonardo da Vinci, Vaucanson and von Kempelen were robot inventors. In 1942, Asimov published the three laws of robotics. Advances in mechanics, electronics and informatics in the 20th century produced robots able to perform very complex, self-governing tasks. In 1985, the PUMA 560 robot was used to introduce a needle into the brain. Later, surgical robots such as World First, Robodoc, Gaspar, Acrobot, Zeus, AESOP, Probot and PAKI-RCP were designed. In 2000, the FDA approved the da Vinci Surgical System (Intuitive Surgical Inc, Sunnyvale, CA, USA), a very sophisticated robot to assist surgeons. Currently, urological procedures such as prostatectomy, cystectomy and nephrectomy are performed with the da Vinci, and urology has become a very suitable specialty for robotic surgery.

  11. [The Vitruvian Man: an anatomical drawing for proportions by Leonardo Da Vinci].

    PubMed

    Le Floch-Prigent, P

    2008-12-01

    The aim of the study was to find and analyse the text by Vitruvius that inspired the famous drawing by Leonardo da Vinci (circa 1490) kept in the Galleria dell'Accademia in Venezia, Italy: the man inscribed in a circle and in a square. The book "De Architectura" by Vitruvius Marcus Pollio was printed several times from the Renaissance onwards, when both the Roman architecture of antiquity and this text became very popular. Since the French translation by Claude Perrault in 1684, it has been easy to find a French translation alongside the original Latin text (Paris, 2003, Les Belles Lettres, French text by Pierre Gros). The drawing by Leonardo da Vinci illustrates the quotation of Vitruvius with great accuracy and fidelity (with the exception of two of the 12 main relationships). The genius of Leonardo da Vinci was to keep only one trunk, head and neck for the two pairs of limbs, scapular and pelvic; to make the circle tangent to the lower edge of the square; to adjust a few features of the quotation for the equilibrium of the whole figure; and, of course, to bring his incredible skill as a draughtsman (one of the best of his century). The drawing was made on a sheet of paper 344x245 mm, in black ink that has become dark brown with time; several lines complete the figure above and below, and a short caption and a horizontal scale appear just under the drawing. The celebrity of the drawing, a symbol of the Renaissance, of the equilibrium of man and mankind, and of the universality of the artists and intellectuals of the time (Humanism), has made it iconic, and it has been constantly reproduced and adapted, especially for advertisements and logos, not only in the medical field.

  12. Associated Da Vinci and magellan robotic systems for successful treatment of nutcracker syndrome.

    PubMed

    Thaveau, Fabien; Nicolini, Philippe; Lucereau, Benoit; Georg, Yannick; Lejay, Anne; Chakfe, Nabil

    2015-01-01

    Here, we report the case of a 26-year-old woman suffering from nutcracker syndrome with concurrent disabling pelvic congestion syndrome. She was given the minimally invasive treatment of left renal vein transposition with the Da Vinci(®) robotic system (Intuitive Surgical, Sunnyvale, CA), followed the next day by a gonadal vein and pelvic varicose embolization using a robotic intraluminal navigation with the Magellan™ robotic system (Hansen Medical, Mountain View, CA). The procedure was uneventful, and the patient had good results at 6 months of follow-up, including a patent left renal vein and complete relief of symptoms.

  13. [The anatomy of a reduced skull model--visualisation of Leonardo da Vinci's anthropology].

    PubMed

    Ahner, E

    2008-04-01

    The article focuses on a rare example of a miniature skull of unknown origin. The profoundness of the anatomical details, combined with outstanding virtuosity, is reminiscent of Leonardo da Vinci's anatomical skull studies and calls for additional interpretation beyond the emblematic "memento mori" character. Following the miscellaneous topics of his skull studies, an anatomical-anthropological interpretation is proposed. For such a project, the merging of anthropology, the history of medicine and the history of art was mandatory. Concerning some discrepancies in the anatomical realism, the depiction of a pathology is discussed, as well as the visualisation of a historic concept of brain function.

  14. Early experience with telemanipulative robot-assisted laparoscopic cholecystectomy using da Vinci.

    PubMed

    Kim, Victor B; Chapman, William H H; Albrecht, Robert J; Bailey, B Marcus; Young, James A; Nifong, L Wiley; Chitwood, W Randolph

    2002-02-01

    In the past decade, robot-assisted surgery has become increasingly used to assist in minimally invasive surgical procedures. In this article we review the evolution of robotic devices, from the first use of an industrial robot for stereotactic biopsies to pioneering work with robots used for hip and prostate surgery, to the development of robotic guidance systems that enabled solo endoscopic surgery, to telemanipulative surgery with master-servant computer-enhanced robotic devices. In addition, we review our early experience with da Vinci Robotic Surgical Systems (Intuitive Surgical, Inc., Mountain View, CA, U.S.A.), which we used to perform robot-assisted laparoscopic cholecystectomies.

  15. OCT structural examination of Madonna dei Fusi by Leonardo da Vinci

    NASA Astrophysics Data System (ADS)

    Targowski, Piotr; Iwanicka, Magdalena; Sylwestrzak, Marcin; Kaszewska, Ewa A.; Frosinini, Cecilia

    2013-05-01

    Madonna dei Fusi ('Madonna of the Yarnwinder') is a spectacular example of Italian Renaissance painting, attributed to Leonardo da Vinci. The aim of this study is to give an account of past restoration procedures. The evidence of a former retouching campaign will be presented with cross-sectional images obtained non-invasively with Optical Coherence Tomography (OCT). Specifically, the locations of overpaintings/retouchings with respect to the original paint layer and secondary varnishes will be given. Additionally, evidence of a former transfer of the pictorial layer to a new canvas support will be shown, detected by the presence of the canvas structure incised into the paint layer.

  16. On the purported discovery of the bronchial circulation by Leonardo da Vinci.

    PubMed

    Mitzner, W; Wagner, E

    1992-09-01

    Among modern physiologists and anatomists, there has been a nearly universal acceptance that Leonardo da Vinci was the first to identify the anatomy of the bronchial circulation. However, because of certain ambiguities in both his anatomic drawing that was supposed to have shown this circulation and the accompanying descriptive text, we questioned whether he really could have been the first to discover this small but important vasculature. To address this question, we set out to repeat Leonardo's dissections in the ox. We reasoned that perhaps the normally tiny bronchial vessels would be considerably more noticeable in this very large species. Our dissections, however, failed to provide any evidence that Leonardo's drawing was that of the bronchial circulation. Furthermore we observed a set of distinct small pulmonary veins to the left upper and right middle lobes that Leonardo, given his lack of understanding of the function of the lung and its circulation, could have easily mistaken for a separate circulation. We thus conclude that Leonardo da Vinci did not describe the anatomy of the bronchial circulation. We believe that the first person to clearly and unequivocally describe the anatomy of this circulation was the Dutch Professor of Anatomy and Botany, Frederich Ruysch.

  17. Leonardo da Vinci and Andreas Vesalius; the shoulder girdle and the spine, a comparison.

    PubMed

    Ganseman, Y; Broos, P

    2008-01-01

    Leonardo da Vinci and Andreas Vesalius were two important Renaissance figures; Vesalius was a surgeon-anatomist who produced innovative work on the study of the human body, while Leonardo da Vinci was an artist who produced strikingly accurate and beautiful drawings of the human body. Below we compare both masters with regard to their knowledge of the working of the muscles, their method and system of dissection, and their system and presentation of the drawings. The investigation consisted of a comparison between both anatomists, in particular concerning their studies of the shoulder girdle and spine, by reviewing their original work as well as the existing literature on this subject. The investigation led to the conclusion that the drawings in question marked a change in history and were of high quality, centuries ahead of their time. Both were anatomists, both were revolutionary; only one changed history in his own time, while the other changed history centuries later. Leonardo made beautiful drawings that match, or even surpass, the drawings of today. Vesalius laid the foundation for medicine as a science as it remains to this day. Their lives differed as strongly as their impact. In the light of their time, their achievement was extraordinary. PMID:18807610

  19. The uncatchable smile in Leonardo da Vinci's La Bella Principessa portrait.

    PubMed

    Soranzo, Alessandro; Newberry, Michelle

    2015-08-01

    A portrait of uncertain origin recently came to light which, after extensive research and examination, was shown to be that rarest of things: a newly discovered Leonardo da Vinci painting entitled La Bella Principessa. This research presents a new illusion which is similar to that identified in the Mona Lisa; La Bella Principessa's mouth appears to change slant depending on both the Viewing Distance and the Level of Blur applied to a digital version of the portrait. Through a series of psychophysics experiments, it was found that a perceived change in the slant of the La Bella Principessa's mouth influences her expression of contentment thus generating an illusion that we have coined the "uncatchable smile". The elusive quality of the Mona Lisa's smile has been previously reported (Science, 290 (2000) 1299) and so the existence of a similar illusion in a portrait painted prior to the Mona Lisa becomes more interesting. The question remains whether Leonardo da Vinci intended this illusion. In any case, it can be argued that the ambiguity created adds to the portrait's allure.

  20. [History of robotics: from Archytas of Tarentum until da Vinci robot. (Part I)].

    PubMed

    Sánchez Martín, F M; Millán Rodríguez, F; Salvador Bayarri, J; Palou Redorta, J; Rodríguez Escovar, F; Esquena Fernández, S; Villavicencio Mavrich, H

    2007-02-01

    Robotic surgery is the newest technological option in urology. To understand how new robots work, it is interesting to know their history. The desire to design machines imitating humans has persisted for more than 4000 years. There are references to King-su Tse (classical China) building automata around 500 B.C. Archytas of Tarentum (around 400 B.C.) is considered the father of mechanical engineering and one of the classic reference points of Western robotics. Heron of Alexandria, Hsieh-Fec, Al-Jazari, Roger Bacon, Juanelo Turriano, Leonardo da Vinci, Vaucanson and von Kempelen were robot inventors of the Middle Ages, the Renaissance and Classicism. In the 19th century, automaton production reached a peak and all branches of engineering underwent great development. In 1942, Asimov published the three laws of robotics, building on advances in mechanics, electronics and informatics. In the 20th century, robots able to perform very complex, self-governing tasks were developed, such as the da Vinci Surgical System (Intuitive Surgical Inc, Sunnyvale, CA, USA), a very sophisticated robot to assist surgeons.

  1. Virtual Mobility in Reality: A Study of the Use of ICT in Finnish Leonardo da Vinci Mobility Projects.

    ERIC Educational Resources Information Center

    Valjus, Sonja

    An e-mail survey and interviews collected data on use of information and communications technology (ICT) in Finnish Leonardo da Vinci mobility projects from 2000-02. Findings showed that the most common ICT tools used were e-mail, digital tools, and the World Wide Web; ICT was used during all project phases; the most common problems concerned…

  2. Educating in the Design and Construction of Built Environments Accessible to Disabled People: The Leonardo da Vinci AWARD Project

    ERIC Educational Resources Information Center

    Frattari, Antonio; Dalpra, Michela; Bernardi, Fabio

    2013-01-01

    An interdisciplinary partnership within an European Leonardo da Vinci project has developed a new approach aimed at educating secondary school students in the creation of built environments accessible to disabled people and at sensitizing them towards the inclusion of people with disabilities in all realms of social life. The AWARD (Accessible…

  3. Application of da Vinci(®) Robot in simple or radical hysterectomy: Tips and tricks.

    PubMed

    Iavazzo, Christos; Gkegkes, Ioannis D

    2016-01-01

    The first robotic simple hysterectomy was performed more than 10 years ago. These days, robotic-assisted hysterectomy is accepted as an alternative surgical approach and is applied in both benign and malignant surgical entities. The two important points that should be taken into account to optimize postoperative outcomes in the early period of a surgeon's training are how to achieve optimal oncological and functional results. Overcoming the technical challenges, as with any innovative surgical method, improves both operative time and patient safety. The standardization of the technique and the recognition of critical anatomical landmarks are essential for optimal oncological and clinical outcomes in both simple and radical robotic-assisted hysterectomy. Based on our experience, our intention is to present user-friendly tips and tricks to optimize the application of the da Vinci® robot in simple or radical hysterectomies. PMID:27403078

  5. Bell's palsy: the answer to the riddle of Leonardo da Vinci's 'Mona Lisa'.

    PubMed

    Maloney, W J

    2011-05-01

    The smile of the famed portrait 'The Mona Lisa' has perplexed both art historians and researchers for the past 500 years. There has been a multitude of theories expounded to explain the nature of the model's enigmatic smile. The origin of the model's wry smile can be demonstrated through a careful analysis of both documented facts concerning the portrait--some gathered only recently through the use of modern technology--and a knowledge of the clinical presentation of Bell's palsy. Bell's palsy is more prevalent in women who are either pregnant or who have recently given birth. This paper postulates that the smile of the portrait's model was due to Leonardo da Vinci's anatomically precise representation of a new mother affected by Bell's palsy subsequent to her recent pregnancy.

  6. Urodynamics in the anatomical work of Leonardo da Vinci (1452-1519).

    PubMed

    Schultheiss, D; Grünewald, V; Jonas, U

    1999-06-01

    Leonardo da Vinci (1452-1519) embodied the symbiosis of art and medicine and can be regarded as the founder of medical illustration in the time of the Renaissance. His anatomy studies were not published in his time, which explains why Leonardo's outstanding knowledge of anatomy, physiology, and medicine had no impact on his scientific contemporaries and is therefore primarily of retrospective importance in the history of medicine. The collection of anatomical illustrations remained unknown until their rediscovery in the eighteenth century and their wide publication at the beginning of our century. This article systematically reviews Leonardo's genitourinary drawings with regard to urodynamic aspects of the upper and lower urinary tract, highlighting topics such as vesicoureteral reflux and urinary sphincter mechanisms.

  7. Microbiological Analysis of Surfaces of Leonardo Da Vinci's Atlantic Codex: Biodeterioration Risk.

    PubMed

    Tarsitani, Gianfranco; Moroni, Catia; Cappitelli, Francesca; Pasquariello, Giovanna; Maggi, Oriana

    2014-01-01

Following the discovery of discoloration on some pages of the Atlantic Codex (AC) of Leonardo da Vinci kept in the Biblioteca Ambrosiana in Milan, investigations were carried out to verify the presence of microorganisms such as bacteria and fungi. An efficient, noninvasive sampling method was used that allowed us to characterize the microbial facies of the material, which was then examined using conventional microbiological techniques. The microclimatic conditions in the storage room as well as the water content of the volume were also assessed. The combined observations support the conclusion that the discoloration of suspected biological origin on some pages of the AC is not related to the presence of, or a current attack by, microbial agents.

  9. From Leonardo to da Vinci: the history of robot-assisted surgery in urology.

    PubMed

    Yates, David R; Vaessen, Christophe; Roupret, Morgan

    2011-12-01

What's known on the subject? and What does the study add? Numerous urological procedures can now be performed with robotic assistance. Though not definitively proven superior to conventional laparoscopy or traditional open surgery in the setting of a randomised trial, in experienced centres robot-assisted surgery allows for excellent surgical outcomes and is a valuable tool to augment modern surgical practice. Our review highlights the depth of history that underpins the robotic surgical platform we utilise today, whilst also detailing the current place of robot-assisted surgery in urology in 2011. The evolution of robots in general, and as platforms to augment surgical practice, is an intriguing story that spans cultures, continents and centuries. A timeline from Yan Shi (1023-957 BC), Archytas of Tarentum (400 BC), Aristotle (322 BC), Heron of Alexandria (10-70 AD), Leonardo da Vinci (1495), the Industrial Revolution (1790) and 'telepresence' (1950) to the da Vinci® Surgical System (1999) shows the incredible depth of history and development that underpins the modern surgical robot we use to treat our patients. Robot-assisted surgery is now well established in urology and, although not currently regarded as a 'gold standard' approach for any urological procedure, is increasingly used for index operations of the prostate, kidney and bladder. We perceive that robotic evolution will continue, securing the place of robots in the history of urological surgery. Herein, we detail the history of robots in general, in surgery and in urology, highlighting the current place of robot-assisted surgery in radical prostatectomy, partial nephrectomy, pyeloplasty and radical cystectomy.

  10. Understanding the adoption dynamics of medical innovations: affordances of the da Vinci robot in the Netherlands.

    PubMed

    Abrishami, Payam; Boer, Albert; Horstman, Klasien

    2014-09-01

This study explored the rather rapid adoption of a new surgical device - the da Vinci robot - in the Netherlands despite the high costs and its controversial clinical benefits. We used the concept 'affordances' as a conceptual-analytic tool to refer to the perceived promises, symbolic meanings, and utility values of an innovation constructed in the wider social context of use. This concept helps us empirically understand robot adoption. Data from 28 in-depth interviews with diverse purposively-sampled stakeholders, and from medical literature, policy documents, Health Technology Assessment reports, congress websites and patients' weblogs/forums between April 2009 and February 2014 were systematically analysed from the perspective of affordances. We distinguished five interrelated affordances of the robot that accounted for shaping and fulfilling its rapid adoption: 'characteristics-related' affordances such as smart nomenclature and novelty, symbolising high-tech clinical excellence; 'research-related' affordances offering medical-technical scientific excellence; 'entrepreneurship-related' affordances for performing better-than-the-competition; 'policy-related' affordances indicating the robot's liberalised provision and its reduced financial risks; and 'communication-related' affordances of the robot in shaping patients' choices and the public's expectations by resonating promising discourses while pushing uncertainties into the background. These affordances make the take-up and use of the da Vinci robot sound perfectly rational and inevitable. This Dutch case study demonstrates the fruitfulness of the affordances approach to empirically capturing the contextual dynamics of technology adoption in health care: exploring in-depth actors' interaction with the technology while considering the interpretative spaces created in situations of use. This approach can best elicit real-life value of innovations, values as defined through the eyes of (potential) users.

  12. [The art of Leonardo Da Vinci as a resource to science and the ideal of nursing care].

    PubMed

    Nascimento, Maria Aparecida de Luca; de Brito, Isabela Jorge; Dehoul, Marcelo da Silva

    2003-01-01

Theoretical reflection whose goal is to demonstrate the artistry a nursing team must bring to a technical procedure: the transfer of solutions from a standard vial to a microdrip vial. The discussion is based on Leonardo da Vinci's theoretical framework, inspired by his work 'Vitruvian Man', so that bodily harmony is preserved. The authors emphasize its relationship to nursing care, viewed in its broadest sense, and to nursing's own motto--'Science, Art and Ideal'.

  13. Realization of a single image haze removal system based on DaVinci DM6467T processor

    NASA Astrophysics Data System (ADS)

    Liu, Zhuang

    2014-10-01

Video monitoring systems (VMS) are extensively applied in target recognition, traffic management, remote sensing, autonomous navigation and national defence. However, a VMS depends strongly on the weather: in foggy conditions the quality of the images it receives is distinctly degraded and its effective range is reduced, so the VMS performs poorly in bad weather. Research on enhancing fog-degraded images therefore has high theoretical and practical value. This paper presents the design of a fog-degraded image enhancement system based on the TI DaVinci processor. The system captures images from digital cameras and applies enhancement processing to obtain a clear image. The processor used is the dual-core TI DaVinci DM6467T (ARM@500 MHz + DSP@1 GHz). A MontaVista Linux operating system runs on the ARM subsystem, which handles I/O and application processing; the DSP handles signal processing, and the results are made available to the ARM subsystem in shared memory. Thanks to the DaVinci processor, the system provides image processing capability equivalent to an x86 computer at lower power cost and in a smaller volume. Results show that the system can process images at 25 frames per second at D1 resolution.
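The abstract does not name the enhancement algorithm run on the DM6467T. A widely used approach to single-image haze removal is the dark channel prior (He, Sun and Tang); the following is a minimal pure-Python sketch of that idea, illustrative only and not the paper's implementation, with the patch minimum, atmospheric-light pick and transmission estimate all simplified:

```python
# Illustrative single-image dehazing via the dark channel prior.
# Pure Python on nested lists; the paper's actual algorithm and its
# DSP implementation are not specified in the abstract.

def dark_channel(img, patch=1):
    """Min over RGB per pixel, then min over a (2*patch+1)^2 window.
    img: list of rows of (r, g, b) tuples with values in [0, 1]."""
    h, w = len(img), len(img[0])
    mins = [[min(px) for px in row] for row in img]
    return [[min(mins[j][i]
                 for j in range(max(0, y - patch), min(h, y + patch + 1))
                 for i in range(max(0, x - patch), min(w, x + patch + 1)))
             for x in range(w)] for y in range(h)]

def dehaze(img, omega=0.95, t0=0.1):
    """Estimate atmospheric light and transmission, then invert
    the haze model I = J*t + A*(1 - t) to recover J."""
    h, w = len(img), len(img[0])
    dark = dark_channel(img)
    # Atmospheric light: colour of the pixel with the brightest dark channel.
    ay, ax = max(((y, x) for y in range(h) for x in range(w)),
                 key=lambda p: dark[p[0]][p[1]])
    A = img[ay][ax]
    out = []
    for y in range(h):
        row = []
        for x in range(w):
            # Per-pixel transmission estimate (patch minimum omitted
            # for brevity); t0 keeps the division stable in dense haze.
            t = max(t0, 1.0 - omega * min(img[y][x][c] / max(A[c], 1e-6)
                                          for c in range(3)))
            row.append(tuple((img[y][x][c] - A[c]) / t + A[c]
                             for c in range(3)))
        out.append(row)
    return out
```

On the real system a loop like this would run on the DSP core; the point here is only the structure of the recovery: estimate the transmission, then invert the haze model per pixel.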

  14. Design of image stabilization system for space remote sensor based on DaVinci technology

    NASA Astrophysics Data System (ADS)

    Li, Haoyang; Liu, Zhaojun; Xu, Pengmei

    2011-08-01

Many factors affect space remote sensor imaging, causing degradation of image contrast and resolution that can be solved neither by improving the resolution of the imaging components nor by post-processing the images alone. To meet the imaging requirements of a space remote sensor, an image stabilization system should be included. In this paper, combining micro-mechanical and digital image stabilization, an image stabilization system based on DaVinci technology is designed, comprising an imaging and sensing unit, an operating and controlling unit and a fast steering mirror unit. The TI TMS320DM6446 serves as the main processor of the system, performing focal plane control, image acquisition, motion vector estimation, digital image stabilization, fast steering mirror control and image output. The workflow is as follows. First, through the optical system, the ground scene is imaged by the focal plane. Short-exposure images acquired by the focal plane are transferred as a series to the computing and controlling unit. The inter-frame motion vector is then computed from the images using the gray projection algorithm and used, together with the image series, as input to iterative back projection, yielding the final picture. Meanwhile, the control value derived from the inter-frame motion vector is sent to the fast steering mirror unit to compensate for and damp vibrations. Experimental results demonstrate that the image stabilization system improves the imaging performance of the space remote sensor.
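The gray projection step described above can be sketched in a few lines: each frame is collapsed into row and column intensity sums, and the inter-frame shift is the offset that best aligns the two 1-D projections. A minimal illustration, not the paper's code:

```python
# Sketch of the gray projection algorithm for inter-frame motion
# estimation: each frame collapses to row/column intensity sums and
# the global shift is the offset that best aligns the 1-D projections.

def projections(frame):
    """Row and column sums of a 2-D grayscale frame (list of lists)."""
    rows = [sum(r) for r in frame]
    cols = [sum(r[i] for r in frame) for i in range(len(frame[0]))]
    return rows, cols

def best_shift(p, q, max_shift=4):
    """Offset s minimising the mean squared difference between p
    and q shifted by s, over their overlapping samples."""
    best, best_err = 0, float("inf")
    for s in range(-max_shift, max_shift + 1):
        pairs = [(p[i], q[i + s]) for i in range(len(p))
                 if 0 <= i + s < len(q)]
        err = sum((a - b) ** 2 for a, b in pairs) / len(pairs)
        if err < best_err:
            best, best_err = s, err
    return best

def motion_vector(prev, curr, max_shift=4):
    """(row_shift, col_shift) of curr relative to prev."""
    prev_r, prev_c = projections(prev)
    curr_r, curr_c = projections(curr)
    return (best_shift(prev_r, curr_r, max_shift),
            best_shift(prev_c, curr_c, max_shift))
```

Matching two 1-D projections instead of full 2-D blocks is what makes the method cheap enough for real-time use on a processor of the DM6446 class.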

  15. The proportion of the face in younger adults using the thumb rule of Leonardo da Vinci.

    PubMed

    Oguz, O

    1996-01-01

The present study was conducted to examine whether the thumb rule of Leonardo da Vinci could serve as an objective method for determining the natural and artistic proportions of the human face. A sample of 400 subjects (200 male and 200 female, 22-25 years old) was used. Measurements were made of the length of the thumb, the length of the ear, the approximate distances between the hairline and the glabella or eyebrows, between the glabella or eyebrows and the tip of the nose, between the nose and the chin, and between the ear and the lateral aspect of the eye. The results for both males and females showed significant (p < 0.01) correlations between the length of the thumb and the facial proportions examined. Additionally, the height of the face was found to be almost three times the length of the thumb. The measurements obtained from female subjects were, however, on average smaller than those taken from males. These results could be of value in the evaluation of the face for those working in plastic surgery or art.

  16. A midline sagittal brain view depicted in Da Vinci's "Saint Jerome in the wilderness".

    PubMed

    Valença, M M; Aragão, M de F V Vasco; Castillo, M

    2013-01-01

It is estimated that around the year 1480 Leonardo da Vinci painted Saint Jerome in the Wilderness, representing the saint during his years of retreat in the Syrian desert where he lived the life of a hermit. One may interpret Leonardo's Saint Jerome in the Wilderness as St. Jerome practicing self-chastisement with a stone in his right hand, seemingly punching his chest repeatedly. The stone, the lion and a cardinal's hat are conventionally linked to the saint. A skull was also almost always present with the image of the saint, symbolically representing penance. With careful analysis of the painting one can identify the skull, which is hidden in an arc represented as a lion's tail. The image is of a hemicranium (midline sagittal view) showing the intracranial dura, including the falx and tentorium, and the venous system with the sinuses and major deep veins. This may have been the first time the intracranial sinuses and the major deep venous vessels were illustrated.

  17. The handedness of Leonardo da Vinci: a tale of the complexities of lateralisation.

    PubMed

    McManus, I C; Drury, Helena

    2004-07-01

    The handedness of Leonardo da Vinci is controversial. Although there is little doubt that many of his well-attributed drawings were drawn with the left hand, the hatch marks of the shading going downwards from left to right, it is not clear that he was a natural left-hander, there being some suggestion that he may have become left-handed as the result of an injury to his right hand in early adulthood. Leonardo's lateralisation may be illuminated by an obscure passage in his notebooks in which he describes crouching down to look into a dark cave, putting his left hand on his knee, and shading his eyes with his right hand. We carried out a questionnaire survey, using 33 written and photographic items, to find whether this behaviour was typical of right handers or left handers. In fact the 'Leonardo task' showed almost no direct association with handedness, meaning that it contributes little to the immediate problem of elucidating Leonardo's handedness. However, the lateralisation of the task did relate to other aspects of behavioural laterality in surprisingly complex ways. This suggests that individual differences in handedness, and behavioural laterality in general, have a structural complexity which is not fully encompassed by simple measures of direction or degree of handedness.

  18. Leonardo da Vinci's "A skull sectioned": skull and dental formula revisited.

    PubMed

    Gerrits, Peter O; Veening, Jan G

    2013-05-01

What can be learned from historical anatomical drawings, and how can they be incorporated into anatomical teaching? The drawing "A skull sectioned" (RL 19058v) by Leonardo da Vinci (1452-1519) hides more detailed information than reported earlier. A well-chosen section cut exposes the sectioned paranasal sinuses and the ductus nasolacrimalis; a dissected lateral wall of the maxilla is also present. Furthermore, at the level of the foramen mentale, the drawing displays compact and spongious bony components, together with a cross-section through the foramen mentale and its connection with the canalis mandibulae. Leonardo was the first to describe a correct dental formula (6424) and made efforts to place this formula above the related dental elements. However, taking into account the morphological features of the individual elements of the maxilla, it can be suggested that Leonardo sketched a "peculiar dental element" in the position of the right maxillary premolar in the dental sketch. That he made no comment on this special element is remarkable. Leonardo could have had sufficient knowledge of the precise morphology of maxillary and mandibular premolars, since he depicted these elements in the dissected skull. That he also had access to premolars in situ corroborates our suggestion that "something went wrong" in this part of the drawing. The present study shows that historical anatomical drawings are very useful for interactive learning of detailed anatomy by students of medicine and dentistry.

  19. [The history of prostate cancer from the beginning to DaVinci].

    PubMed

    Hatzinger, M; Hubmann, R; Moll, F; Sohn, M

    2012-07-01

For hardly any other organ can the development of medicine and of technical advances over the last 150 years be so clearly illustrated as for the prostate. The history of radical prostatectomy was initially characterised by the problems of approaching this relatively inaccessible organ. In 1867, Theodor Billroth in Vienna performed the first partial prostatectomy via a perineal access. In 1904, Hugh Hampton Young and William Stewart Halsted at the Johns Hopkins Hospital in Baltimore, USA, carried out the first successful extracapsular perineal prostatectomy and opened a new era. In Germany, Prof. Friedrich Voelcker in Halle developed the so-called ischiorectal prostatectomy in 1924, but it was left to Terence Millin to publish the first series of retropubic prostatectomies in 1945. In 1952, the sacroperineal approach according to Thiermann and the sacral prostatectomy were introduced. Finally, in 1991 another new era in prostate surgery started with the first laparoscopic prostatectomy, a development that culminated in 2000 with the presentation of the laparoscopic DaVinci prostatectomy by Binder. Originally a stepchild of urological surgery, to be avoided whenever possible for fear of serious complications, the prostate has in the course of time become an object of desire: the stepchild has become the favourite child. PMID:23035261

  20. Robot-Assisted Cardiac Surgery Using the Da Vinci Surgical System: A Single Center Experience

    PubMed Central

    Kim, Eung Re; Lim, Cheong; Kim, Dong Jin; Kim, Jun Sung; Park, Kay Hyun

    2015-01-01

    Background We report our initial experiences of robot-assisted cardiac surgery using the da Vinci Surgical System. Methods Between February 2010 and March 2014, 50 consecutive patients underwent minimally invasive robot-assisted cardiac surgery. Results Robot-assisted cardiac surgery was employed in two cases of minimally invasive direct coronary artery bypass, 17 cases of mitral valve repair, 10 cases of cardiac myxoma removal, 20 cases of atrial septal defect repair, and one isolated CryoMaze procedure. Average cardiopulmonary bypass time and average aorta cross-clamping time were 194.8±48.6 minutes and 126.1±22.6 minutes in mitral valve repair operations and 132.0±32.0 minutes and 76.1±23.1 minutes in myxoma removal operations, respectively. During atrial septal defect closure operations, the average cardiopulmonary bypass time was 128.3±43.1 minutes. The median length of stay was between five and seven days. The only complication was that one patient needed reoperation to address bleeding. There were no hospital mortalities. Conclusion Robot-assisted cardiac surgery is safe and effective for mitral valve repair, atrial septal defect closure, and cardiac myxoma removal surgery. Reducing operative time depends heavily on the experience of the entire robotic surgical team. PMID:25883892

  1. Tele-surgical simulation system for training in the use of da Vinci surgery.

    PubMed

    Suzuki, Shigeyuki; Suzuki, Naoki; Hayashibe, Mitsuhiro; Hattori, Asaki; Konishi, Kozo; Kakeji, Yoshihiro; Hashizume, Makoto

    2005-01-01

Laparoscopic surgery, including robotic surgery, allows the surgeon to conduct minimally invasive surgery. The surgeon must master difficult skills to compensate for the narrow field of view, the limited workspace, and the lack of depth sensation. To counteract these drawbacks, we have been developing a training simulation system that allows surgeons to practice and master surgical procedures. In addition, our system aims to distribute a simulation program, to provide a means of collaboration between remote hospitals, and to enable guidance from an expert surgeon. In this paper, we present the surgical simulation for da Vinci surgery, in particular a cholecystectomy. The integral parts of this system are a soft tissue model created by the sphere-filled method, enabling real-time deformation based on a patient's data; a force feedback device known as the PHANToM; and an Internet connection. Using this system, a surgeon can perform surgical maneuvers such as pushing, grasping, and detachment in real-time manipulation. Moreover, using broadband communication, we can perform tele-surgical simulation for training. PMID:15718794

  2. [Leonardo da Vinci the first human body imaging specialist. A brief communication on the thorax oseum images].

    PubMed

    Cicero, Raúl; Criales, José Luis; Cardoso, Manuel

    2009-01-01

The impressive development of computed tomography (CT) techniques such as three-dimensional helical CT produces a spatial image of the bony thorax. At the beginning of the 16th century, Leonardo da Vinci drew the thorax oseum with great precision. These drawings show an outstanding similarity to the images obtained by three-dimensional helical CT. The painstaking work of the Renaissance genius is a prime example of the careful study of human anatomy. Modern imaging techniques require perfect anatomical knowledge of the human body in order to generate exact interpretations of images. Leonardo's example lives on for anybody devoted to modern imaging studies.

  3. The LEONARDO-DA-VINCI pilot project "e-learning-assistant" - Situation-based learning in nursing education.

    PubMed

    Pfefferle, Petra Ina; Van den Stock, Etienne; Nauerth, Annette

    2010-07-01

E-learning will play an important role in the training portfolio of students in higher and vocational education. Within the LEONARDO-DA-VINCI action programme, the European Union funded transnational pilot projects aiming to improve the usage and quality of e-learning tools in education and professional training. The overall aim of the LEONARDO-DA-VINCI pilot project "e-learning-assistant" was to create new didactic and technical e-learning tools for Europe-wide use in nursing education. Based on a new situation-oriented learning approach, nursing teachers enrolled in the project were instructed to adapt, develop and implement e-learning and blended learning units. Nursing modules matching the training contents were developed by teachers from the partner institutions, implemented in the project centres and evaluated by students. The user package "e-learning-assistant", the product of the project, includes two teacher training units, the authoring tool "synapse" for creating situation-based e-learning units, a student learning platform containing blended learning modules in nursing, and an open-source web-based communication centre.

  4. Surgical treatment of gastroesophageal reflux disease and upside-down stomach using the Da Vinci robotic system. A prospective study.

    PubMed

    Hartmann, Jens; Jacobi, Christoph A; Menenakos, Charalambos; Ismail, Mahmoud; Braumann, Chris

    2008-03-01

So far, the impact of the telerobotic surgical approach to gastroesophageal reflux disease (GERD) has remained unclear. In this prospective study, we evaluated the Da Vinci (Intuitive Surgical) robotic system for antireflux surgery. In April 2003, we set up a pilot study to evaluate the efficacy of laparoscopic telerobotic surgery using the three-arm Da Vinci system. Optimal trocar positions, operating and setup times, conversion rate, intraoperative complications, and perioperative morbidity and mortality rates were analyzed. The median age was 53 years (range 25-74) in 118 patients (52 female/66 male). The surgical indication was an upside-down stomach in 17 patients and GERD in 101. The median operating time fell from 105 min to 91 min after 40 procedures, and the setup time from 24.5 min to 10.4 min after 10 procedures. The system is safe, and it appears superior to traditional laparoscopy during dissection in the region of the esophageal hiatus, which compensates for the long setup and operating times. Disadvantages are the high costs, the time needed to master the setup and the system, and the necessity of exact trocar positioning. PMID:18027060

  5. Realization and optimization of AES algorithm on the TMS320DM6446 based on DaVinci technology

    NASA Astrophysics Data System (ADS)

    Jia, Wen-bin; Xiao, Fu-hai

    2013-03-01

The application of the AES algorithm in digital cinema systems protects video data from theft and malicious tampering, solving its security problems. To meet the requirements for real-time, transparent encryption of high-speed audio and video data streams in the information security field, this paper analyzes the principle of the AES algorithm in depth and, on the TMS320DM6446 hardware platform with the DaVinci software framework, proposes concrete methods for realizing the AES algorithm in a digital video system, along with optimization solutions. Test results show that digital movies encrypted with AES-128 cannot be played without decryption, which ensures their security. A comparison of AES-128 performance before and after optimization verifies the correctness and effectiveness of the improved implementation.
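The transparent, frame-by-frame stream-encryption structure the paper describes can be sketched as follows. Python's standard library has no AES, so a SHA-256 counter-mode keystream stands in here for the AES-128 cipher the paper actually uses; the per-frame pipeline, not the cipher, is the point of the sketch:

```python
# Sketch of a transparent frame-by-frame encryption pipeline.
# NOTE: the keystream below is a SHA-256 counter-mode stand-in for
# AES-128, used only because the standard library lacks AES.
import hashlib

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Counter-mode keystream: hash(key | nonce | counter) blocks."""
    out = bytearray()
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce
                              + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(out[:length])

def crypt_frame(key: bytes, frame_no: int, frame: bytes) -> bytes:
    """XOR the frame with its per-frame keystream; the same call
    decrypts, so playback stays transparent after decryption."""
    ks = keystream(key, frame_no.to_bytes(8, "big"), len(frame))
    return bytes(a ^ b for a, b in zip(frame, ks))

key = b"16-byte-demo-key"
frame = bytes(range(64))            # stand-in for one video frame
enc = crypt_frame(key, 0, frame)    # encrypted frame cannot play normally
dec = crypt_frame(key, 0, enc)      # round trip restores the frame
assert dec == frame and enc != frame
```

Because encryption is an XOR with a keystream, encryption and decryption are the same operation, which is what makes such a scheme transparent to the rest of the video pipeline.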

  6. Leonardo da Vinci's drapery studies: characterization of lead white pigments by µ-XRD and 2D scanning XRF

    NASA Astrophysics Data System (ADS)

    Gonzalez, Victor; Calligaro, Thomas; Pichon, Laurent; Wallez, Gilles; Mottin, Bruno

    2015-11-01

This work focuses on the composition and microstructure of the lead white pigment employed in a set of artworks, using a combination of µ-XRD and 2D scanning XRF applied directly to five drapery studies attributed to Leonardo da Vinci (1452-1519) and conserved in the Département des Arts Graphiques, Musée du Louvre, and in the Musée des Beaux-Arts de Rennes. Trace elements present in the composition as well as in the lead white highlights were imaged by 2D scanning XRF. Mineral phases were determined in a fully noninvasive way using a special µ-XRD diffractometer, and phase proportions were estimated by Rietveld refinement. The analytical results will contribute to differentiating lead white grades and to elucidating the artist's technique.
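The abstract says only that phase proportions were estimated by Rietveld refinement. For context, the standard relation (Hill and Howard) converting refined scale factors into weight fractions is sketched below, where S is the refined scale factor of a phase, Z the number of formula units per unit cell, M the mass of the formula unit and V the unit-cell volume:

```latex
% Weight fraction of phase p from Rietveld scale factors:
W_p = \frac{S_p \,(ZMV)_p}{\sum_i S_i \,(ZMV)_i}
```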

  7. How did Leonardo perceive himself? Metric iconography of da Vinci's self-portraits

    NASA Astrophysics Data System (ADS)

    Tyler, Christopher W.

    2010-02-01

Some eighteen portraits of Leonardo in old age are now recognized, consolidating the impression from his best-established self-portrait of an old man with long white hair and beard. His appearance when younger, however, is generally regarded as unknown, although he was described as very beautiful as a youth. Applying the principles of metric iconography, the quantitative analysis of painted images, provides an avenue for identifying other portraits that may be proposed as valid portraits of Leonardo at various stages of his life, by himself and by his contemporaries. Overall, this approach identifies portraits of Leonardo by Verrocchio, Raphael, Botticelli, and others. Beyond this physiognomic analysis, Leonardo's first known drawing provides further insight into his core motivations. Topographic considerations make clear that the drawing is of the hills behind Vinci, with a view overlooking the rocky promontory of the town and the plain stretching out before it. The outcroppings in the foreground bear a striking resemblance to those of his unique composition, 'The Virgin of the Rocks', suggesting a deep childhood appreciation of this wild terrain and an identification with that religious man of the mountains, John the Baptist, who was also the subject of Leonardo's last known painting. Following this trail leads to a line of possible self-portraits continuing the age-regression concept back to a self view at about two years of age.

  8. Console-integrated stereoscopic OsiriX 3D volume-rendered images for da Vinci colorectal robotic surgery.

    PubMed

    Volonté, Francesco; Pugin, Francois; Buchs, Nicolas Christian; Spaltenstein, Joël; Hagen, Monika; Ratib, Osman; Morel, Philippe

    2013-04-01

    The increased distance between surgeon and surgical field is a significant problem in laparoscopic surgery. Robotic surgery, although providing advantages for the operator, increases this gap by completely removing force feedback. Enhancement with visual tools can therefore be beneficial. The goal of this preliminary work was to create a custom plugin for OsiriX to display volume-rendered images in the da Vinci surgeon's console. The TilePro multi-input display made the generated stereoscopic pairs appear to have depth. Tumor position, vascular supply, spatial location, and relationship between organs appear directly within the surgeon's field of view. This study presents a case of totally robotic right colectomy for cancer using this new technology. Sight diversion was no longer necessary. Depth perception was subjectively perceived as profitable. Total immersion in the operative field helped compensate for the lack of tactile feedback specific to robotic intervention. This innovative tool is a step forward toward augmented-reality robot-assisted surgery. PMID:22549904

  9. The oldest anatomical handmade skull of the world c. 1508: 'the ugliness of growing old' attributed to Leonardo da Vinci.

    PubMed

    Missinne, Stefaan J

    2014-06-01

The author discusses a previously unknown early sixteenth-century Renaissance handmade anatomical miniature skull. The small, naturalistic skull, made from an agate (calcedonia) stone mixture (mistioni), shows remarkable osteologic details. Dr. Saban was the first to link the skull to Leonardo. The three-dimensional perspective and the search for the senso comune are discussed. Anatomical errors both in Leonardo's drawings and in this skull are presented. The article ends with the issue of physiognomy, his grotesque faces, the Perspectiva Communis and his experimenting c. 1508 with the stone mixture and the human skull. Evidence is presented, including the Italian scale based on Crazie and Braccia, chemical analysis pointing to a mine in Volterra, and Leonardo's search for the soul in the skull. Written references in the inventory of Salai (1524), the inventory of the Villa Riposo (Raffaello Borghini, 1584) and Don Ambrogio Mazenta (1635) are reviewed. The author attributes the skull, c. 1508, to Leonardo da Vinci.

  10. The mother relationship and artistic inhibition in the lives of Leonardo da Vinci and Erik H. Erikson.

    PubMed

    Capps, Donald

    2008-12-01

    In four earlier articles, I focused on the theme of the relationship of melancholia and the mother, and suggested that the melancholic self may experience humor (Capps, 2007a), play (Capps, 2007b), dreams (Capps, 2008a), and art (Capps, 2008b) as restorative resources. I argued that Erik H. Erikson found these resources to be valuable remedies for his own melancholic condition, which had its origins in the fact that he was illegitimate and was raised solely by his mother until he was three years old, when she remarried. In this article, I focus on two themes in Freud's Leonardo da Vinci and a memory of his childhood (1964): Leonardo's relationship with his mother in early childhood and his inhibitions as an artist. I relate these two themes to Erikson's own early childhood and his failure to achieve his goal as an aspiring artist in his early twenties. The article concludes with a discussion of Erikson's frustrated aspirations to become an artist and his emphasis, in his psychoanalytic work, on children's play. PMID:19093682

  12. 2D and 3D optical diagnostic techniques applied to Madonna dei Fusi by Leonardo da Vinci

    NASA Astrophysics Data System (ADS)

    Fontana, R.; Gambino, M. C.; Greco, M.; Marras, L.; Materazzi, M.; Pampaloni, E.; Pelagotti, A.; Pezzati, L.; Poggi, P.; Sanapo, C.

    2005-06-01

    3D measurement and modelling have traditionally been applied to statues, buildings, archeological sites or similar large structures, but rarely to paintings. Recently, however, 3D measurements have also been performed successfully on easel paintings, making it possible to detect and document the painting's surface. We used 3D models to integrate the results of various 2D imaging techniques on a common reference frame. These applications show how the 3D shape information, complemented with 2D colour maps as well as with other types of sensory data, provides the most interesting information. The 3D data acquisition was carried out by means of two devices: a high-resolution laser micro-profilometer, composed of a commercial distance meter mounted on a scanning device, and a laser-line scanner. The 2D data acquisitions were carried out using a scanning device for simultaneous RGB colour imaging and IR reflectography, and a UV fluorescence multispectral image acquisition system. We present here the results of the techniques described, applied to the analysis of an important painting of the Italian Renaissance: 'Madonna dei Fusi', attributed to Leonardo da Vinci.

  13. Amid the possible causes of a very famous foxing: molecular and microscopic insight into Leonardo da Vinci's self‐portrait

    PubMed Central

    Tafer, Hakim; Sterflinger, Katja; Pinzari, Flavia

    2015-01-01

    Summary Leonardo da Vinci's self‐portrait is affected by foxing spots. The portrait has no fungal or bacterial infections in place, but is contaminated with airborne spores and fungal material that could play a role in its disfigurement. The knowledge of the nature of the stains is of great concern because future conservation treatments should be derived from scientific investigations. The lack of reliable scientific data, due to the non‐culturability of the microorganisms inhabiting the portrait, prompted the investigation of the drawing using non‐invasive and micro‐invasive sampling, in combination with scanning electron microscope (SEM) imaging and molecular techniques. The fungus Eurotium halophilicum was found in foxing spots using SEM analyses. Oxalates of fungal origin were also documented. Both findings are consistent with the hypothesis that tonophilic fungi germinate on paper metabolizing organic acids, oligosaccharides and proteic compounds, which react chemically with the material at a low water activity, forming brown products and oxidative reactions resulting in foxing spots. Additionally, molecular techniques enabled a screening of the fungi inhabiting the portrait and showed differences when different sampling techniques were employed. Swab samples showed a high abundance of lichenized Ascomycota, while the membrane filters showed a dominance of Acremonium sp. colonizing the drawing. PMID:26111623

  14. Peri-operative comparison between daVinci-assisted radical prostatectomy and open radical prostatectomy in obese patients

    NASA Astrophysics Data System (ADS)

    Le, Carter Q.; Ho, Khai-Linh V.; Slezak, Jeffrey M.; Blute, Michael L.; Gettman, Matthew T.

    2007-02-01

    Introduction: While the effects of increasing body mass index on prostate cancer epidemiology and surgical approach have recently been studied, its effects on surgical outcomes are less clear. We studied the perioperative outcomes of obese (BMI >= 30) men treated with daVinci-assisted laparoscopic radical prostatectomy (DLP) and compared them to those treated with open radical retropubic prostatectomy (RRP) in a contemporary time frame. Method: After Institutional Review Board approval, we used the Mayo Clinic Radical Prostatectomy database to identify patients who had undergone DLP by a single surgeon and those who had undergone open RRP by a single surgeon between December 2002 and March 2005. Baseline demographics, peri- and post-operative courses, and complications were collected by retrospective chart review, and variables from the two cohorts were compared using the chi-square method and the least-squares method of linear regression where appropriate. Results: 59 patients who underwent DLP and 76 who underwent RRP were available for study. Baseline demographics were not statistically different between the two cohorts. Although DLP had a significantly lower clinical stage than RRP (p=0.02), pathological stage was not statistically different (p=0.10). Transfusion rates, hospital stay, overall complications, and pathological Gleason score were also not significantly different, nor were PSA progression, positive margin rate, or continence at 1 year. After bilateral nerve-sparing, the rate of erections suitable for intercourse with or without therapy at 1 year was 88.5% (23/26) for DLP and 61.2% (30/49) for RRP (p=0.01). Follow-up time was similar. Conclusion: For obese patients, DLP appears to have similar perioperative, as well as short-term oncologic and functional, outcomes when compared to open RRP.
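    The chi-square comparison behind the reported potency difference (23/26 after DLP vs 30/49 after RRP, p=0.01) can be reproduced approximately with the textbook 2x2 shortcut formula. This is an illustrative pure-Python sketch, not the authors' analysis code, and it omits refinements such as the Yates continuity correction.

    ```python
    # Pearson chi-square statistic for a 2x2 contingency table [[a, b], [c, d]],
    # using the shortcut formula chi2 = n*(ad - bc)^2 / ((a+b)(c+d)(a+c)(b+d)).
    def chi2_2x2(a, b, c, d):
        n = a + b + c + d
        num = n * (a * d - b * c) ** 2
        den = (a + b) * (c + d) * (a + c) * (b + d)
        return num / den

    # Rows: DLP, RRP; columns: potent / not potent at 1 year (from the abstract).
    stat = chi2_2x2(23, 3, 30, 19)
    print(round(stat, 2))  # ~6.08, above the 3.84 critical value (df=1, alpha=0.05)
    ```

    The statistic exceeding 3.84 is consistent with the abstract's p=0.01; the authors' exact procedure (and any correction they applied) is not specified beyond "chi-square method".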

  15. Michelangelo in Florence, Leonardo in Vinci.

    ERIC Educational Resources Information Center

    Herberholz, Barbara

    2003-01-01

    Provides background information on the lives and works of Michelangelo and Leonardo da Vinci. Focuses on the artwork of the artists and the museums where their work is displayed. Includes museum photographs of their work. (CMK)

  16. Integrating Leonardo da Vinci's principles of demonstration, uncertainty, and cultivation in contemporary nursing education.

    PubMed

    Story, Lachel; Butts, Janie

    2014-03-01

    Nurses today are facing an ever changing health care system. Stimulated by health care reform and limited resources, nursing education is being challenged to prepare nurses for this uncertain environment. Looking to the past can offer possible solutions to the issues nursing education is confronting. Seven principles of da Vincian thinking have been identified (Gelb, 2004). As a follow-up to an exploration of the curiosità principle (Butts & Story, 2013), this article will explore the three principles of dimostrazione, sfumato, and corporalita. Nursing faculty can set the stage for a meaningful educational experience through these principles of demonstration (dimostrazione), uncertainty (sfumato), and cultivation (corporalita). Preparing nurses not only to manage but also to flourish in the current health care environment will enhance both the nurse's and the patient's experience.

  17. Placement of {sup 125}I implants with the da Vinci robotic system after video-assisted thoracoscopic wedge resection: A feasibility study

    SciTech Connect

    Pisch, Julianna . E-mail: jpisch@bethisraelny.org; Belsley, Scott J.; Ashton, Robert; Wang Lin; Woode, Rudolph; Connery, Cliff

    2004-11-01

    Purpose: To evaluate the feasibility of using the da Vinci robotic system for radioactive seed placement in the wedge resection margin of pigs' lungs. Methods and materials: Video-assisted thoracoscopic wedge resection was performed in the upper and lower lobes in pigs. Dummy {sup 125}I seeds embedded in absorbable sutures were sewn into the resection margin with the aid of the da Vinci robotic system without complications. In the 'loop technique,' the seeds were placed in a cylindrical pattern; in the 'longitudinal technique,' they were placed above and lateral to the resection margin. Orthogonal radiographs were taken in the operating room. For dose calculation, Variseed 66.7 (Build 11312) software was used. Results: With looping seed placement, the dose at 1 cm from the source was 97.0 Gy in the coronal view and 107.3 Gy in the lateral view. For longitudinal seed placement, the corresponding doses were 89.5 Gy and 70.0 Gy. Conclusion: Robotic technology allows direct placement of radioactive seeds into the resection margin by endoscopic surgery. It overcomes the technical difficulties of manipulating in the narrow chest cavity. With the advent of robotic technology, new options in the treatment of lung cancer, as well as other malignant tumors, will become available.

  18. Specific learning curve for port placement and docking of da Vinci(®) Surgical System: one surgeon's experience in robotic-assisted radical prostatectomy.

    PubMed

    Dal Moro, F; Secco, S; Valotto, C; Artibani, W; Zattoni, F

    2012-12-01

    Port placement and docking of the da Vinci(®) Surgical System are fundamental in robotic-assisted laparoscopic radical prostatectomy (RALP). The aim of our study was to investigate learning curves for port placement and docking of robots (PPDR) in RALP. This is a retrospective review of prospectively collected data on PPDR in 526 patients who underwent RALP in our institute from April 2005 to May 2010. Data included patient features such as body mass index (BMI), and pre-, intra- and post-operative data. Intra-operative information included operation time, subdivided into anesthesia, PPDR and console times. Of the 526 patients who underwent RALP, only those in whom PPDR was performed by the same surgeon, who had no prior laparoscopic or robotic experience (F.D.M.), were studied, totalling 257 cases. The PPDR phase revealed an evident learning curve, comparable with other robotic phases. Efficiency improved until approximately the 60th case (P < 0.001), due more to effective port placement than to docking of robotic arms. In our experience, conversion to open surgery is so rare that statistical evaluation is not meaningful; conversion due to robotic device failure is also very rare. This study of da Vinci procedures in RALP revealed a learning curve during PPDR and throughout the robotic-assisted procedure, reaching a plateau after 60 cases. PMID:27628472

  19. Robotic-assisted laparoscopic radical nephrectomy using the Da Vinci Si system: how to improve surgeon autonomy. Our step-by-step technique.

    PubMed

    Davila, Hugo H; Storey, Raul E; Rose, Marc C

    2016-09-01

    Herein, we describe several steps to improve surgeon autonomy during a left Robotic-Assisted Laparoscopic Radical Nephrectomy (RALRN), using the Da Vinci Si system. Our kidney cancer program is based at 2 community hospitals. We use the Da Vinci Si system. Access is obtained with the following trocars: two 8 mm robotic, one 8 mm robotic of bariatric length (arm 3), one 15 mm for the assistant and one 12 mm for the camera. We use curved monopolar scissors in robotic arm 1, a bipolar Maryland in arm 2, and Prograsp Forceps in arm 3, and we alternate throughout the surgery between EndoWrist clip appliers and the vessel sealer. Here, we describe three steps and the use of 3 robotic instruments to improve surgeon autonomy. Step 1: the lower pole of the kidney was dissected and retracted upwards and laterally; this maneuver was performed using the 3rd robotic arm with the Prograsp Forceps. Step 2: the monopolar scissors (robotic arm 1) was replaced with the robotic EndoWrist clip applier, 10 mm Hem-o-Lok; the renal artery and vein were controlled and transected by the main surgeon. Step 3: the superior, posterolateral dissection was completed and all bleeders were carefully coagulated by the surgeon with the EndoWrist one vessel sealer. We have now performed 15 RALRN following these steps. Our results were: blood loss 300 cc, console time 140 min, operating room time 200 min, anesthesia time 180 min, hospital stay 2.5 days, 1 incisional hernia; pathology: (13) RCC clear cell, (1) chromophobe and (1) papillary type 1; tumor stage: (5) T1b, (8) T2a, (2) T2b. We provide a concise, step-by-step technique for radical nephrectomy (RN) using the Da Vinci Si robotic system that may provide more autonomy to the surgeon, while maintaining surgical outcomes equivalent to standard laparoscopic RN.

  20. A psychoanalytic understanding of the desire for knowledge as reflected in Freud's Leonardo da Vinci and a memory of his childhood.

    PubMed

    Blass, Rachel B

    2006-10-01

    The author offers an understanding of the psychoanalytic notion of the desire for knowledge and the possibility of attaining it as it finds expression in Freud's Leonardo da Vinci and a memory of his childhood. This understanding has not been explicitly articulated by Freud but may be considered integral to psychoanalysis' Weltanschauung as shaped by Freud's legacy. It emerges through an attempt to explain basic shifts, contradictions, inconsistencies and tensions that become apparent from a close reading of the text of Leonardo. Articulating this implicit understanding of knowledge provides the grounds for a stance on epistemology that is integral to psychoanalysis and relevant to contemporary psychoanalytic concerns on this topic. This epistemology focuses on the necessary involvement of passion, rather than detachment, in the search for knowledge and views the psychoanalytic aim of self-knowledge as a derivative, and most immediate expression, of a broader and more basic human drive to know.

  1. Elastography Using Multi-Stream GPU: An Application to Online Tracked Ultrasound Elastography, In-Vivo and the da Vinci Surgical System

    PubMed Central

    Deshmukh, Nishikant P.; Kang, Hyun Jae; Billings, Seth D.; Taylor, Russell H.; Hager, Gregory D.; Boctor, Emad M.

    2014-01-01

    A system for real-time ultrasound (US) elastography will advance interventions for the diagnosis and treatment of cancer by enabling methods such as thermal monitoring of tissue ablation. A multi-stream graphics processing unit (GPU) based accelerated normalized cross-correlation (NCC) elastography, with a maximum frame rate of 78 frames per second, is presented in this paper. A study of NCC window size is undertaken to determine its effect on frame rate and the quality of output elastography images. This paper also presents a novel system for Online Tracked Ultrasound Elastography (O-TRuE), which extends prior work on an offline method. By tracking the US probe with an electromagnetic (EM) tracker, the system selects in-plane radio frequency (RF) data frames for generating high quality elastograms. A novel method for evaluating the quality of an elastography output stream is presented, suggesting that O-TRuE generates more stable elastograms than those generated by untracked, free-hand palpation. Since EM tracking cannot be used in all systems, an integration of real-time elastography and the da Vinci Surgical System is presented and evaluated for elastography stream quality based on our metric. The da Vinci surgical robot is outfitted with a laparoscopic US probe, and palpation motions are autonomously generated by customized software. It is found that a stable output stream can be achieved, which is affected by both the frequency and amplitude of palpation. The GPU framework is validated using data from in-vivo pig liver ablation; the generated elastography images identify the ablated region, outlined more clearly than in the corresponding B-mode US images. PMID:25541954
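    The windowed NCC step at the core of this kind of pipeline can be illustrated with a small pure-NumPy sketch. This is an illustrative toy, not the authors' multi-stream GPU implementation: each window of a pre-compression RF line is matched against shifted windows of the post-compression line, and the best-scoring integer shift becomes the local displacement, whose gradient along depth approximates strain. The synthetic 0.2% compression and all parameter values are assumptions chosen for the demo.

    ```python
    import numpy as np

    def ncc(a, b):
        """Normalized cross-correlation of two equal-length windows."""
        a = a - a.mean()
        b = b - b.mean()
        denom = np.sqrt((a * a).sum() * (b * b).sum())
        return float((a * b).sum() / denom) if denom else 0.0

    def displacement(pre, post, win=64, search=8):
        """For each window of `pre`, find the integer shift into `post`
        that maximizes NCC; strain ~ gradient of this profile with depth."""
        shifts = []
        for start in range(0, len(pre) - win - search, win):
            ref = pre[start:start + win]
            scores = [ncc(ref, post[start + s:start + s + win])
                      for s in range(search + 1)]
            shifts.append(int(np.argmax(scores)))
        return np.array(shifts)

    # Synthetic check: resampling the RF line by 0.998 mimics ~0.2% axial
    # compression, so the estimated displacement grows with depth.
    rf = np.random.default_rng(0).standard_normal(2048)
    post = np.interp(np.arange(2048) * 0.998, np.arange(2048), rf)
    disp = displacement(rf, post)
    ```

    On real RF data the window size (`win`) trades elastogram quality against frame rate, which is the trade-off the paper studies; subsample (fractional) shift estimation and GPU batching are the natural next steps beyond this sketch.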

  2. Using Program Theory-Driven Evaluation Science to Crack the Da Vinci Code

    ERIC Educational Resources Information Center

    Donaldson, Stewart I.

    2005-01-01

    Program theory-driven evaluation science uses substantive knowledge, as opposed to method proclivities, to guide program evaluations. It aspires to update, clarify, simplify, and make more accessible the evolving theory of evaluation practice commonly referred to as theory-driven or theory-based evaluation. The evaluator in this chapter provides a…

  3. The left ventricle as a mechanical engine: from Leonardo da Vinci to the echocardiographic assessment of peak power output-to-left ventricular mass.

    PubMed

    Dini, Frank L; Guarini, Giacinta; Ballo, Piercarlo; Carluccio, Erberto; Maiello, Maria; Capozza, Paola; Innelli, Pasquale; Rosa, Gian M; Palmiero, Pasquale; Galderisi, Maurizio; Razzolini, Renato; Nodari, Savina

    2013-03-01

    The interpretation of the heart as a mechanical engine dates back to the teachings of Leonardo da Vinci, who was the first to apply the laws of mechanics to the function of the heart. Similar to any mechanical engine, whose performance is proportional to the power generated with respect to weight, the left ventricle can be viewed as a power generator whose performance can be related to left ventricular mass. Stress echocardiography may provide valuable information on the relationship between cardiac performance and recruited left ventricular mass that may be used in distinguishing between adaptive and maladaptive left ventricular remodeling. Peak power output-to-mass, obtained during exercise or pharmacological stress echocardiography, is a measure that reflects the number of watts that are developed by 100 g of left ventricular mass under maximal stimulation. Power output-to-mass may be calculated as left ventricular power output per 100 g of left ventricular mass: 100× left ventricular power output divided by left ventricular mass (W/100 g). A simplified formula to calculate power output-to-mass is as follows: 0.222 × cardiac output (l/min) × mean blood pressure (mmHg)/left ventricular mass (g). When the integrity of myocardial structure is compromised, a mismatch becomes apparent between maximal cardiac power output and left ventricular mass; when this occurs, a reduction of the peak power output-to-mass index is observed.
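    The simplified formula lends itself to a direct numeric illustration. In this sketch the input values are hypothetical and the function name is ours; the 0.222 constant folds together the mmHg-to-Pa and L/min-to-m³/s unit conversions with the per-100 g scaling described in the abstract.

    ```python
    def power_to_mass(cardiac_output_l_min, mean_bp_mmhg, lv_mass_g):
        """Peak power output-to-mass index (W/100 g), per the simplified
        formula: 0.222 x CO (L/min) x mean BP (mmHg) / LV mass (g)."""
        return 0.222 * cardiac_output_l_min * mean_bp_mmhg / lv_mass_g

    # Hypothetical resting example: CO 5 L/min, mean BP 90 mmHg, LV mass 150 g.
    print(round(power_to_mass(5.0, 90.0, 150.0), 3))  # 0.666 W/100 g
    ```

    Under maximal stress-echo stimulation, both cardiac output and mean blood pressure rise, so a healthy ventricle shows a marked increase of this index; a blunted peak value signals the power/mass mismatch the authors describe.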

  4. Reforming Upper Secondary Education in Europe. The Leonardo da Vinci Project Post-16 Strategies. Surveys of Strategies for Post-16 Education To Improve the Parity of Esteem for Initial Vocational Education in Eight European Educational Systems. Theory into Practice 92. Institute for Educational Research Publication Series B.

    ERIC Educational Resources Information Center

    Lasonen, Johanna, Ed.

    This book contains the following papers on the Leonardo da Vinci project: "Looking for Post-16 Education Strategies for Parity of Esteem in Europe" (Lasonen); "Improving Parity of Esteem as a Policy Goal" (Makinen, Volanen); "Alternative Strategies for Parity of Esteem between General/Academic and Vocational Education in Europe" (Kamarainen);…

  5. A second Leonardo da Vinci?

    PubMed

    Nakano, Mitsuko; Endo, Toshitaka; Tanaka, Shigeki

    2003-10-01

    We describe a young woman who suddenly began mirror writing with her right hand and has not reverted to normal writing for more than 6 years, although she writes normally with her left hand. She is ambidextrous, although she had previously used only her right hand for writing and drawing. Since it is much easier for her to use right-handed mirror writing, she uses her left hand only for writing meant to be read by others and her right hand for all other writing. Her hobbies are sculpture and painting, and her chief complaint is migraine accompanied by sensory and perceptive disturbances.

  6. What Is the Moral Imperative of Workplace Learning: Unlocking the DaVinci Code of Human Resource Development?

    ERIC Educational Resources Information Center

    Short, Tom

    2006-01-01

    In the course of the author's doctoral study, he is exploring the strategic linkages between learning activities in the modern workplace and the long-term success they bring to organisations. For many years, this challenge has been the Holy Grail of human resource (HR) development practitioners, who invest heavily on training and professional…

  7. VINCI: the VLT Interferometer commissioning instrument

    NASA Astrophysics Data System (ADS)

    Kervella, Pierre; Coudé du Foresto, Vincent; Glindemann, Andreas; Hofmann, Reiner

    2000-07-01

    The Very Large Telescope Interferometer (VLTI) is a complex system, made of a large number of separate elements. To prepare for an early and successful operation, it will require a period of extensive testing and verification to ensure that the many devices involved work properly together and can produce meaningful data. This paper describes the concept chosen for the VLTI commissioning instrument, LEONARDO da VINCI, and details its functionalities. It is a fiber-based two-way beam combiner, associated with an artificial star and an alignment verification unit. The technical commissioning of the VLTI is foreseen as a stepwise process: fringes will first be obtained with the commissioning instrument in an autonomous mode (no other parts of the VLTI involved); then the VLTI telescopes and optical trains will be tested in autocollimation; finally, fringes will be observed on the sky.

  8. Tourism. Leonardo da Vinci Series: Good Practices.

    ERIC Educational Resources Information Center

    Commission of the European Communities, Brussels (Belgium). Directorate-General for Education and Culture.

    This brochure, part of a series about good practices in vocational training in the European Union, describes 10 projects that have promoted investment in human resources through training in the tourism sector to promote sustainable, or responsible, tourism. The projects and their countries of origin are as follows: (1) BEEFT, training of mobility…

  9. [Studies of vision by Leonardo da Vinci].

    PubMed

    Berggren, L

    2001-01-01

    Leonardo was an advocate of the intromission theory of vision. Light rays from the object to the eye caused visual perceptions which were transported to the brain ventricles via a hollow optic nerve. Leonardo introduced wax injections to explore the ventricular system. Perceptions were assumed to go to the "senso comune" in the middle (3rd) ventricle, also the seat of the soul. The processing station "imprensiva" in the anterior lateral horns together with memory "memoria" in the posterior (4th) ventricle integrated the visual perceptions into visual experience. - Leonardo's sketches with circular lenses in the center of the eye reveal that his dependence on medieval optics prevailed over anatomical observations. Drawings of the anatomy of the sectioned eye are missing, although Leonardo had invented a new embedding technique: in order to dissect the eye without spilling its contents, the eye was first boiled in egg white and then cut. The procedure was later repeated and showed that the ovoid lens becomes spherical after boiling. - Leonardo described how light rays were refracted and reflected in the eye, but his imperfect anatomy prevented a development of physiological optics. He was, however, the first to compare the eye with a pin-hole camera (camera obscura). Leonardo's drawings of the inverted pictures on the back wall of a camera obscura inspired its use as an instrument for artistic practice. The camera obscura was for centuries a model for explaining human vision.

  10. Da Vinci robot emergency undocking protocol.

    PubMed

    O'Sullivan, O E; O'Sullivan, S; Hewitt, M; O'Reilly, B A

    2016-09-01

    The role of robot-assisted surgery across gynaecology is evolving, with increasing numbers of procedures being undertaken with varying degrees of complexity. While the risk of conversion is low at approximately 1%, the reasons for conversion are variable. These range from technical issues with the robot, to surgical complications such as haemorrhage, to anaesthetic issues such as an inability to ventilate the patient adequately. While many conversions to an open or laparoscopic approach are not due to life-threatening indications, it is important that the theatre staff are aware of the indication and can perform an emergency undocking as effectively, efficiently and safely as possible when the need arises. Unfortunately, there is a paucity of literature available outlining such protocols. For this reason, we developed an emergency undocking protocol clearly outlining the role of each theatre staff member and the need for clear, concise communication. PMID:27126584

  11. Leonardo Da Vinci, the genius and the monsters. Casual encounters?

    PubMed

    Ciseri, Lorenzo Montemagno

    2014-01-01

    This article analyses Leonardo's interest in monsters and deformed reality, one of the lesser known aspects of his vast and multifaceted output. With the possible exception of his studies of physiognomy, relevant drawings, sketches and short stories represent a marginal aspect of his work, but they are nevertheless significant for historians of teratology. The purpose of this study is to provide a broad overview of the relationship between Leonardo and both the literature on mythological monsters and the reports on monstrous births that he either read about or witnessed personally. While aspects of his appreciation and attention to beauty and the pursuit of perfection and good proportions are the elements most emphasised in Leonardo's work, other no less interesting aspects related to deformity have been considered of marginal importance. My analysis will demonstrate that Leonardo approached the realm of monstrosity as if he considered abnormality a mirror of normality, deformity a mirror of harmony, and disease a mirror of health, as if to emphasise that, ultimately, it is the monster that gives the world the gift of normality. Two special cases of monstrosity are analysed: the famous monster of Ravenna, whose image was found among his papers, and a very rare case of parasitic conjoined twins (thoracopagus parasiticus) portrayed for the first time alive, probably in Florence, by Leonardo himself. PMID:25702382

  12. Distance Learning. Leonardo da Vinci Series: Good Practices.

    ERIC Educational Resources Information Center

    Commission of the European Communities, Brussels (Belgium). Directorate-General for Education and Culture.

    This brochure, part of a series about good practices in vocational training in the European Union, describes 12 projects that use distance learning to promote lifelong learning in adults. The projects and their countries of origin are as follows: (1) 3D Project, training in the use of IT tools for 3D simulation and animation and practical…

  13. Scientific Aspects of Leonardo da Vinci's Drawings: An Interdisciplinary Model.

    ERIC Educational Resources Information Center

    Struthers, Sally A.

    While interdisciplinary courses can help demonstrate the relevance of learning to students and reinforce education from different fields, they can be difficult to implement and are often not cost effective. An interdisciplinary art history course at Ohio's Sinclair Community College incorporates science into the art history curriculum, making use…

  14. [Operation-assisted robot·da Vinci (lung)].

    PubMed

    Nakamura, Hiroshige; Taniguchi, Yuji

    2014-07-01

    The greatest advantage of robotic surgery is the markedly free movement of joint-equipped robotic forceps under 3-dimensional high-definition vision. Accurate operation makes complex procedures straightforward and may overcome weak points of previous thoracoscopic surgery. Efficiency and safety improve as skills are acquired. However, the spread of robotic surgery in the general thoracic surgery field has been delayed compared with other fields. The surgical indications include primary lung cancer, thymic diseases, and mediastinal tumors, but it is unclear whether the technical advantages felt by operators translate directly into benefits for patients. Moreover, problems concerning cost and education have not been solved. Although evidence for robotic thoracic surgery is insufficient, it may be regarded as an extension of thoracoscopic surgery, and reports showing its usefulness for primary lung cancer, myasthenia gravis, and thymoma have been accumulating. The important tasks now are to carry out clinical trials for advanced medical care and to obtain insurance coverage. Although important problems such as safety, education, training, and cost must be solved for future development, advancing robot technology has the potential to markedly change general thoracic surgery. PMID:25138949

  16. The PAKY, HERMES, AESOP, ZEUS, and da Vinci robotic systems.

    PubMed

    Kim, Hyung L; Schulam, Peter

    2004-11-01

    In 1965 Gordon Moore, cofounder of Intel Corporation, made his famous observation now known as Moore's law. He predicted that computing capacity would double every 18 to 24 months. Since then, Moore's law has held true; the number of transistors per integrated circuit has doubled every couple of years. This relentless advance in computer technology ensures future advances in robotic technology. The ultimate goal of robotics is to allow surgeons to perform difficult procedures with a level of precision and improved clinical outcomes not possible by conventional methods. Robotics has the potential to enable surgeons with various levels of surgical skill to achieve a uniform outcome. As long as urologists continue to embrace technological advances and incorporate beneficial technology into their practice, the outlook for patients remains bright.

  17. Possible role of DaVinci Robot in uterine transplantation.

    PubMed

    Iavazzo, Christos; Gkegkes, Ioannis D

    2015-01-01

    Minimally invasive surgery, specifically robotic surgery, has become a common technique among gynecological surgeons over the last decade. The realization of the first human uterine transplantation opened new perspectives in the treatment of uterine agenesis and of infertility in women with a history of hysterectomy at a young age. A robot-assisted technique may enhance the safety of the procedure by facilitating the microvascular anastomosis, vaginal anastomosis, and ligament fixation. This study proposes the formation of a multicenter collaboration group to organize a protocol with the aim of clarifying the possible role of robotic surgery in uterine transplantation. PMID:26401113

  18. MONA, LISA and VINCI Soon Ready to Travel to Paranal

    NASA Astrophysics Data System (ADS)

    2000-11-01

    First Instruments for the VLT Interferometer Summary A few months from now, light from celestial objects will be directed for the first time towards ESO's Very Large Telescope Interferometer (VLTI) at the Paranal Observatory (Chile). During this "First Light" event and the subsequent test phase, the light will be recorded with a special test instrument, VINCI (VLT INterferometer Commissioning Instrument). The main components of this high-tech instrument are aptly named MONA (a system that combines the light beams from several telescopes by means of optical fibers) and LISA (the infrared camera). VINCI was designed and constructed within a fruitful collaboration between ESO and several research institutes and industrial companies in France and Germany. It is now being assembled at the ESO Headquarters in Garching (Germany) and will soon be ready for installation at the telescope on Paranal. With the VLTI and VINCI, Europe's astronomers are now entering the first, crucial phase of an exciting scientific and technology venture that will ultimately put the world's most powerful optical/IR interferometric facility in their hands. PR Photo 31/00: VINCI during tests at the ESO Headquarters in Garching. PR Photo 31/00 shows the various components of the complex VINCI instrument for the VLT Interferometer, during the current tests at the Optical Laboratory at the ESO Headquarters in Garching (Germany). It will later be installed in "clean-room" conditions within the Interferometric Laboratory at the Paranal Observatory. This electronic photo was obtained for documentary purposes. VINCI (VLT INterferometer Commissioning Instrument) is the "First Light" instrument for the Very Large Telescope Interferometer (VLTI) at the Paranal Observatory (Chile). 
Early in 2001, it will be used for the first tests

  19. The Rosslyn Code: Can Physics Explain a 500-Year Old Melody Etched in the Walls of a Scottish Chapel?

    SciTech Connect

    Wilson, Chris

    2011-10-19

    For centuries, historians have puzzled over a series of 213 symbols carved into the stone of Scotland’s Rosslyn Chapel. (Disclaimer: You may recognize this chapel from The Da Vinci Code, but this is real and unrelated!) Several years ago, a composer and science enthusiast noticed that the symbols bore a striking similarity to Chladni patterns, the elegant images that form on a two-dimensional surface when it vibrates at certain frequencies. This man’s theory: A 500-year-old melody was inscribed in the chapel using the language of physics. But not everyone is convinced. Slate senior editor Chris Wilson travelled to Scotland to investigate the claims and listen to this mysterious melody, whatever it is. Come find out what he discovered, including images of the patterns and audio of the music they inspired.

  20. The rare DAT coding variant Val559 perturbs DA neuron function, changes behavior, and alters in vivo responses to psychostimulants.

    PubMed

    Mergy, Marc A; Gowrishankar, Raajaram; Gresch, Paul J; Gantz, Stephanie C; Williams, John; Davis, Gwynne L; Wheeler, C Austin; Stanwood, Gregg D; Hahn, Maureen K; Blakely, Randy D

    2014-11-01

    Despite the critical role of the presynaptic dopamine (DA) transporter (DAT, SLC6A3) in DA clearance and psychostimulant responses, evidence that DAT dysfunction supports risk for mental illness is indirect. Recently, we identified a rare, nonsynonymous Slc6a3 variant that produces the DAT substitution Ala559Val in two male siblings who share a diagnosis of attention-deficit hyperactivity disorder (ADHD), with other studies identifying the variant in subjects with bipolar disorder (BPD) and autism spectrum disorder (ASD). Previously, using transfected cell studies, we observed that although DAT Val559 displays normal total and surface DAT protein levels, and normal DA recognition and uptake, the variant transporter exhibits anomalous DA efflux (ADE) and lacks capacity for amphetamine (AMPH)-stimulated DA release. To pursue the significance of these findings in vivo, we engineered DAT Val559 knock-in mice, and here we demonstrate in this model the presence of elevated extracellular DA levels, altered somatodendritic and presynaptic D2 DA receptor (D2R) function, a blunted ability of DA terminals to support depolarization and AMPH-evoked DA release, and disruptions in basal and psychostimulant-evoked locomotor behavior. Together, our studies demonstrate an in vivo functional impact of the DAT Val559 variant, providing support for the ability of DAT dysfunction to impact risk for mental illness.

  1. 78 FR 58376 - American Asset Development, Inc., aVinci Media Corp., Ceragenix Pharmaceuticals, Inc., Marshall...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-23

    ... From the Federal Register Online via the Government Publishing Office SECURITIES AND EXCHANGE COMMISSION American Asset Development, Inc., aVinci Media Corp., Ceragenix Pharmaceuticals, Inc., Marshall... Pharmaceuticals, Inc. because it has not filed any periodic reports since the period ended December 31, 2009....

  2. [From Leonardo Da Vinci to present days; from the history of antiplague costume].

    PubMed

    Kalmykov, A A; Aminev, R M; Korneev, A G; Polyakov, V S; Artebyakin, S V

    2016-01-01

    A special garment worn by physicians in medieval Europe for protection in plague foci can be considered the prototype of the antiplague costume. The French doctor Charles de Lorme (1619) is considered the inventor of the first antiplague costume. Much later, in 1878, the Russian professor V. V. Pashutin proposed a costume resembling a hermetically sealed "bag" with a special breathing device, aimed at protecting medical staff. Later, professor O. I. Dogel's respirator became well known (1889). At the beginning of the 20th century, a charcoal filter mask invented by N. D. Zelinsky was used as part of the antiplague costume. Requirements for the use of modern means of individual protection when working in foci of especially dangerous infections are defined by sanitary-epidemiological rules, which cover laboratory workers' working and protective clothing, respiratory protection, and the types, operation, and procedures for donning, removing, and disinfecting antiplague costumes, pneumocostumes, pneumohelmets, isolation suits, gas-protection boxes, etc. PMID:27120957

  4. A new technique for robotic thyroidectomy: "the daVinci gasless single-incision axillary approach".

    PubMed

    Rodriguez, Francisco N S; Low, Rick A; Singer, Jeffrey A; Bornstein, Alan M; Bradford Doxey, J; Hashimoto, Luis A; Rassadi, Roozbeh; Dolce, Charles J; Hollingworth, Alexzandra; Hayes, Chester; Shively, Cynthia J

    2011-09-01

    Robotic thyroidectomy has been recently introduced as a new modality of treatment for selected benign and malignant thyroid lesions. The standard technique, popularized by a leading Korean group, combines an axillary and a thoracic approach to accomplish thyroid resection without neck incision. We recently introduced a modified technique that has enabled us to complete robotic thyroidectomy through a single axillary incision. We herein report our initial successful experience in 35 cases with the modified technique.

  5. Back to the Drawing Board Reconstructing DaVinci's Vitruvian Man to Teach Anatomy

    ERIC Educational Resources Information Center

    Babaian, C.

    2009-01-01

    In today's high tech world, one hardly expects to see the original chalkboard or blackboard utilized in research, teaching, or scientific communication, but having spent an equal number of years doing both art and biology and dabbling in computer graphics, the author has found the simple technology of the chalkboard and chalk to have incredible…

  6. Moving towards Optimising Demand-Led Learning: The 2005-2007 ECUANET Leonardo Da Vinci Project

    ERIC Educational Resources Information Center

    Dealtry, Richard; Howard, Keith

    2008-01-01

    Purpose: The purpose of this paper is to present the key project learning points and outcomes as a guideline for the future quality management of demand-led learning and development. Design/methodology/approach: The research methodology was based upon a corporate university blueprint architecture and browser toolkit developed by a member of the…

  7. Social and Occupational Integration of Disadvantaged People. Leonardo da Vinci Good Practices Series.

    ERIC Educational Resources Information Center

    Commission of the European Communities, Brussels (Belgium). Directorate-General for Education and Culture.

    This document profiles nine European programs that exemplify good practice in social and occupational integration of disadvantaged people. The programs profiled are as follows: (1) Restaurant Venezia (a CD-ROM program to improve the reading and writing skills of young people in Luxembourg who have learning difficulties); (2) an integrated…

  8. Building Skills and Qualifications among SME Employees. Leonardo da Vinci Good Practices Series.

    ERIC Educational Resources Information Center

    Commission of the European Communities, Brussels (Belgium). Directorate-General for Education and Culture.

    This document profiles 10 European programs that exemplify good practice in building skills and qualifications among employees of small and medium enterprises (SMEs). The programs profiled are as follows: (1) TRICTSME (a program providing World Wide Web-based information and communication technologies training for SMEs in manufacturing); (2)…

  9. Depth of Monocular Elements in a Binocular Scene: The Conditions for da Vinci Stereopsis

    ERIC Educational Resources Information Center

    Cook, Michael; Gillam, Barbara

    2004-01-01

    Quantitative depth based on binocular resolution of visibility constraints is demonstrated in a novel stereogram representing an object, visible to 1 eye only, and seen through an aperture or camouflaged against a background. The monocular region in the display is attached to the binocular region, so that the stereogram represents an object which…

  10. Leonardo da Vinci, visual perspective and the crystalline sphere (lens): if only Leonardo had had a freezer.

    PubMed

    Hilloowala, Rumy

    2004-06-01

    This study confirms Leonardo's claim to have experimented on the bovine eye to determine the internal anatomy of the eye. The experiment, as described by Leonardo, was repeated in our laboratory. The study further discusses Leonardo's primary interest in the study of the eye (especially the lens): to determine how the image of an object, which enters the eye in an inverted form, is righted. The study shows the evolution of Leonardo's understanding of the anatomy and the physiology of vision. Initially, in keeping with his reading of the literature, he placed the lens in the centre, but made it globular. Later he promulgated two theories, reflection from the uvea and refraction within the lens, to explain reversal of the image in the eye. Subsequently he rejected the first theory and, putting credence in the second, experimented (1509) to show that the lens is globular and centrally placed. The fact that present knowledge about the lens is at variance with his findings is not because he did not carry out the experiment, as suggested by some modern authors, but because of the limitations of the techniques available to him at the time.

  11. Totally thoracoscopic surgery for the treatment of atrial septal defect without the robotic Da Vinci surgical system

    PubMed Central

    2013-01-01

    Background More and more surgeons and patients have focused on minimally invasive surgical techniques in the 21st century. The totally thoracoscopic operation provides another minimally invasive surgical option for patients with ASD (atrial septal defect). In this study, we report our experience with 61 patients with atrial septal defect who underwent totally thoracoscopic operation and discuss the feasibility and safety of the new technique. Methods From January 2010 to October 2012, 61 patients with atrial septal defect underwent totally thoracoscopic closure rather than traditional median sternotomy. We divided the 61 patients into two groups based on the operation sequence and compared the data of group A (the first 30 cases) and group B (the last 31 cases). The mean age of the patients was 35.1 ± 12.8 years (range, 6.3 to 63.5 years), and mean weight was 52.7 ± 11.9 kg (range, 30.5 to 80 kg). Mean size of the atrial septal defect was 16.8 ± 11.3 mm (range, 13 to 39 mm) based on the echocardiography. Results All patients underwent the totally thoracoscopic operation successfully; 36 patients received a pericardial patch and 25 patients were sutured directly. Seven patients underwent concomitant tricuspid valvuloplasty with the Key technique. No death, reoperation or complete atrioventricular block occurred. The mean cardiopulmonary bypass time was 68.5 ± 19.1 min (range, 31.0 to 153.0 min), the mean aortic cross-clamp time was 27.2 ± 11.3 min (range, 0.0 to 80.0 min) and the mean operation time was 149.8 ± 35.7 min (range, 63.0 to 300.0 min). Postoperative mechanical ventilation averaged 4.9 ± 2.5 hours (range, 3.5 to 12.6 hours), and the duration of intensive care unit stay 20.0 ± 4.8 hours (range, 15.5 to 25 hours). The mean volume of blood drainage was 158 ± 38 ml (range, 51 to 800 ml). No death, residual shunt, lung atelectasis or moderate tricuspid regurgitation was found at 3-month follow-up. 
Conclusion The totally thoracoscopic operation is feasible and safe for patients with ASD, with or without tricuspid regurgitation. This technique provides another minimally invasive surgical option for patients with atrial septal defect. PMID:23634811

  12. Fallopian tube carcinoma in a patient with a pelvic kidney: surgical management with the da Vinci robot.

    PubMed

    Hoffman, Mitchel S

    2012-06-01

    A patient with a known pelvic kidney and early fallopian tube carcinoma was managed with robotically assisted surgery. Associated congenital anomalies were noted and described. The cancer was stage 1C, grade 3, and she is without evidence of recurrent cancer 2 years after completion of chemotherapy. PMID:27628284

  13. Identification of a novel Drosophila melanogaster heat-shock gene, lethal(2)denticleless [l(2)dtl], coding for an 83-kDa protein.

    PubMed

    Kurzik-Dumke, U; Neubauer, M; Debes, A

    1996-06-01

    In this study, we describe the identification of a novel Drosophila melanogaster (Dm) gene, l(2)dtl, characterized by elevated expression under heat-shock (HS) conditions. It encodes a protein of 83 kDa with no homology to known members of the HSP90 family or to other proteins. Gene l(2)dtl is located on the right arm of the second chromosome at locus 59F5, close to the tumor suppressor gene l(2)tid, a homolog of dnaJ, which encodes a chaperone strongly conserved in evolution. In the following, we present the sequence of l(2)dtl, the putative protein it encodes, and its molecular localization in a closely interspaced gene cluster consisting of at least four nested genes spanning an approximately 10-kb genomic interval. Furthermore, we present the temporal expression of l(2)dtl in the wild type under normal and HS conditions, and describe the isolation and phenotype of eight embryonic lethal l(2)dtl mutants.

  14. Uplink Coding

    NASA Technical Reports Server (NTRS)

    Pollara, Fabrizio; Hamkins, Jon; Dolinar, Sam; Andrews, Ken; Divsalar, Dariush

    2006-01-01

    This viewgraph presentation reviews uplink coding. The purpose and goals of the briefing are (1) Show a plan for using uplink coding and describe benefits (2) Define possible solutions and their applicability to different types of uplink, including emergency uplink (3) Concur with our conclusions so we can embark on a plan to use proposed uplink system (4) Identify the need for the development of appropriate technology and infusion in the DSN (5) Gain advocacy to implement uplink coding in flight projects Action Item EMB04-1-14 -- Show a plan for using uplink coding, including showing where it is useful or not (include discussion of emergency uplink coding).

  15. Detection of Multiple Budding Yeast Cells and a Partial Sequence of 43-kDa Glycoprotein Coding Gene of Paracoccidioides brasiliensis from a Case of Lacaziosis in a Female Pacific White-Sided Dolphin (Lagenorhynchus obliquidens).

    PubMed

    Minakawa, Tomoko; Ueda, Keiichi; Tanaka, Miyuu; Tanaka, Natsuki; Kuwamura, Mitsuru; Izawa, Takeshi; Konno, Toshihiro; Yamate, Jyoji; Itano, Eiko Nakagawa; Sano, Ayako; Wada, Shinpei

    2016-08-01

    Lacaziosis, formerly called lobomycosis, is a zoonotic mycosis caused by Lacazia loboi, found in humans and dolphins, and endemic in countries bordering the Atlantic, Indian and Pacific Oceans, including the Japanese coast. Susceptible cetacean species include the bottlenose dolphin (Tursiops truncatus), the Indian Ocean bottlenose dolphin (T. aduncus), and the estuarine dolphin (Sotalia guianensis); however, no cases have been recorded in other cetacean species. We diagnosed a case of lacaziosis in a Pacific white-sided dolphin (Lagenorhynchus obliquidens) kept in an aquarium in Japan. The dolphin was a female estimated to be more than 14 years old at the end of June 2015 and was captured on a coast of the Japan Sea in 2001. Multiple, lobose, and solid granulomatous lesions with or without ulcers appeared on her jaw, back, flipper and fluke skin in July 2014. The granulomatous skin lesions in the present case were similar to those of our previous cases. Multiple budding and chains of round yeast cells were detected in the biopsied samples. The partial sequence of the 43-kDa glycoprotein coding gene was confirmed by nested PCR and sequencing, which revealed a genotype different from both Amazonian and Japanese lacaziosis in bottlenose dolphins and 99% identical to sequences derived from Paracoccidioides brasiliensis, a sister fungal species to L. loboi. This is the first case of lacaziosis in a Pacific white-sided dolphin. PMID:26883513

  17. Sharing code.

    PubMed

    Kubilius, Jonas

    2014-01-01

    Sharing code is becoming increasingly important in the wake of Open Science. In this review I describe and compare two popular code-sharing utilities, GitHub and Open Science Framework (OSF). GitHub is a mature, industry-standard tool but is not tailored to researchers. In comparison, OSF offers a one-stop solution for researchers, but much of its functionality is still under development. I conclude by listing alternative, lesser-known tools for code and materials sharing.

  18. Speech coding

    SciTech Connect

    Ravishankar, C., Hughes Network Systems, Germantown, MD

    1998-05-08

    Speech is the predominant means of communication between human beings, and since the invention of the telephone by Alexander Graham Bell in 1876, speech services have remained the core service in almost all telecommunication systems. Original analog methods of telephony had the disadvantage of the speech signal getting corrupted by noise, cross-talk and distortion. Long-haul transmissions, which use repeaters to compensate for the loss in signal strength on transmission links, also increase the associated noise and distortion. On the other hand, digital transmission is relatively immune to noise, cross-talk and distortion, primarily because of the capability to faithfully regenerate the digital signal at each repeater purely based on a binary decision. Hence end-to-end performance of the digital link essentially becomes independent of the length and operating frequency bands of the link. From a transmission point of view, digital transmission has therefore been the preferred approach due to its higher immunity to noise. The need to carry digital speech became extremely important from a service-provision point of view as well. Modern requirements have introduced the need for robust, flexible and secure services that can carry a multitude of signal types (such as voice, data and video) without a fundamental change in infrastructure. Such a requirement could not have been easily met without the advent of digital transmission systems, thereby requiring speech to be coded digitally. The term Speech Coding often refers to techniques that represent or code speech signals either directly as a waveform or as a set of parameters by analyzing the speech signal. In either case, the codes are transmitted to the distant end, where speech is reconstructed or synthesized using the received set of codes. A more generic term that is applicable to these techniques and is often used interchangeably with speech coding is the term voice coding. This term is more generic in the sense that the
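The waveform-coding branch described above can be illustrated with μ-law companding, the logarithmic compression used in G.711 telephony. This is a minimal sketch under stated assumptions (the function names are mine, and the 8-bit quantizer is a simplified stand-in for the real codec's segmented encoding), not the abstract's own implementation:

```python
import math

MU = 255  # companding parameter used by North American G.711 telephony

def mu_law_encode(x: float, mu: int = MU) -> float:
    """Logarithmically compress a sample in [-1, 1] (waveform coding)."""
    return math.copysign(math.log1p(mu * abs(x)) / math.log1p(mu), x)

def mu_law_decode(y: float, mu: int = MU) -> float:
    """Invert the companding to recover the original sample."""
    return math.copysign(math.expm1(abs(y) * math.log1p(mu)) / mu, y)

def quantize(y: float, bits: int = 8) -> float:
    """Round the companded value to a signed fixed-point grid."""
    levels = 2 ** (bits - 1)
    return round(y * levels) / levels

# A quiet speech sample: companding keeps its quantization error small.
sample = 0.05
coded = quantize(mu_law_encode(sample))
recovered = mu_law_decode(coded)
```

Because the compression expands low-level samples before quantizing, quiet passages of speech suffer far less rounding error than they would under uniform 8-bit coding.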

  19. TRACKING CODE DEVELOPMENT FOR BEAM DYNAMICS OPTIMIZATION

    SciTech Connect

    Yang, L.

    2011-03-28

    Dynamic aperture (DA) optimization with direct particle tracking is a straightforward approach when the computing power permits. It can include various realistic errors and is closer to reality than theoretical estimations. In this approach, a fast, parallel tracking code can be very helpful. In this presentation, we describe an implementation of the storage ring particle tracking code TESLA for beam dynamics optimization. It supports MPI-based parallel computing and is robust as a DA calculation engine. This code has been used in the NSLS-II dynamics optimizations and has shown promising performance.
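The brute-force procedure the abstract describes, launching particles at increasing amplitudes and tracking them for many turns, can be sketched with a toy one-dimensional Hénon map (a linear one-turn rotation plus a sextupole-like kick). This is an illustrative stand-in, not TESLA itself; the tune, turn count, and aperture limit are arbitrary assumed values:

```python
import math

def survives(x0: float, tune: float = 0.252, turns: int = 1000,
             limit: float = 1.0) -> bool:
    """Track one particle through a 1D Henon map (linear one-turn rotation
    plus a sextupole-like kick); report whether it stays within |x| < limit."""
    mu = 2.0 * math.pi * tune
    c, s = math.cos(mu), math.sin(mu)
    x, p = x0, 0.0
    for _ in range(turns):
        p_kicked = p + x * x          # thin sextupole kick
        x, p = c * x + s * p_kicked, -s * x + c * p_kicked
        if abs(x) > limit:
            return False              # particle lost: outside the aperture
    return True

def dynamic_aperture(step: float = 0.005, max_amp: float = 1.0) -> float:
    """Scan outward in starting amplitude; DA = last surviving amplitude."""
    a = 0.0
    while a + step < max_amp and survives(a + step):
        a += step
    return a

da = dynamic_aperture()
```

A production code does the same survival scan in full 6-D phase space with realistic lattice errors, which is why parallel tracking engines matter.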

  20. MCNP code

    SciTech Connect

    Cramer, S.N.

    1984-01-01

    The MCNP code is the major Monte Carlo coupled neutron-photon transport research tool at the Los Alamos National Laboratory, and it represents the most extensive Monte Carlo development program in the United States available in the public domain. The present code is the direct descendant of the original Monte Carlo work of Fermi, von Neumann, and Ulam at Los Alamos in the 1940s. Development has continued uninterrupted since that time, and the current version of MCNP (or its predecessors) has always included state-of-the-art methods in the Monte Carlo simulation of radiation transport, basic cross-section data, geometry capability, variance reduction, and estimation procedures. The authors of the present code have oriented its development toward general user application. The documentation, though extensive, is presented in a clear and simple manner with many examples, illustrations, and sample problems. In addition to providing the desired results, the output listings give a wealth of detailed information (some optional) concerning each stage of the calculation. The code system is continually updated to take advantage of advances in computer hardware and software, including interactive modes of operation, diagnostic interrupts and restarts, and a variety of graphical and video aids.
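At its core, Monte Carlo transport of the kind MCNP performs samples random free paths between particle interactions. A minimal hypothetical analog (not MCNP code), photons crossing a purely absorbing slab checked against the analytic Beer-Lambert result, looks like this; MCNP itself adds full 3-D geometry, scattering physics, cross-section data, and variance reduction:

```python
import math
import random

def transmission(thickness: float, n_photons: int = 20000, seed: int = 1) -> float:
    """Monte Carlo estimate of the fraction of photons crossing a purely
    absorbing slab, sampling exponential free paths (unit mean free path)."""
    rng = random.Random(seed)
    transmitted = 0
    for _ in range(n_photons):
        # distance to the first interaction, drawn from an exponential law
        if rng.expovariate(1.0) > thickness:
            transmitted += 1
    return transmitted / n_photons

est = transmission(2.0)      # slab two mean free paths thick
exact = math.exp(-2.0)       # analytic Beer-Lambert attenuation
```

With 20,000 histories the estimate agrees with exp(-2) to within a few tenths of a percent; real codes spend most of their sophistication on reducing exactly this statistical variance.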

  1. QR Codes

    ERIC Educational Resources Information Center

    Lai, Hsin-Chih; Chang, Chun-Yen; Li, Wen-Shiane; Fan, Yu-Lin; Wu, Ying-Tien

    2013-01-01

    This study presents an m-learning method that incorporates Integrated Quick Response (QR) codes. This learning method not only achieves the objectives of outdoor education, but it also increases applications of Cognitive Theory of Multimedia Learning (CTML) (Mayer, 2001) in m-learning for practical use in a diverse range of outdoor locations. When…

  2. The polar wind of the fast rotating Be star Achernar. VINCI/VLTI interferometric observations of an elongated polar envelope

    NASA Astrophysics Data System (ADS)

    Kervella, P.; Domiciano de Souza, A.

    2006-07-01

    Context: Be stars show evidence of mass loss and circumstellar envelopes (CSE) from UV resonance lines, near-IR excesses, and the presence of episodic hydrogen emission lines. The geometry of these envelopes is still uncertain, although it is often assumed that they are formed by a disk around the stellar equator and a hot polar wind. Aims: We probe the close environment of the fast rotating Be star Achernar at angular scales of a few milliarcseconds (mas) in the infrared, in order to constrain the geometry of a possible polar CSE. Methods: We obtained long-baseline interferometric observations of Achernar with the VINCI/VLTI beam combiner in the H and K bands, using various telescope configurations and baseline lengths with a wide azimuthal coverage. Results: The observed visibility measurements along the polar direction are significantly lower than the visibility function of the photosphere of the star alone, in particular at low spatial frequencies. This points to the presence of an asymmetric diffuse CSE elongated along the polar direction of the star. To our data, we fit a simple model consisting of two components: a 2D elliptical Gaussian superimposed on a uniform ellipse representing the distorted photosphere of the fast rotating star. Conclusions: We clearly detected a CSE elongated along the polar axis of the star, as well as rotational flattening of the stellar photosphere. For the uniform-ellipse photosphere we derive a major axis of θ_eq = 2.13 ± 0.05 mas and a minor axis of θ_pol = 1.51 ± 0.02 mas. The relative near-IR flux measured for the CSE compared to the stellar photosphere is f = 4.7 ± 0.3%. Its angular dimensions are loosely constrained by the available data at ρ_eq = 2.7 ± 1.3 mas and ρ_pol = 17.6 ± 4.9 mas. This CSE could be linked to free-free emission from the radiative pressure driven wind originating from the hot polar caps of the star.
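Along a single axis, the two-component model described in the abstract reduces to a flux-weighted sum of two classical visibility curves: a uniform disk for the photosphere and a Gaussian for the envelope. A rough numeric sketch under assumed values (K-band wavelength 2.2 μm, the fitted polar sizes and flux ratio quoted above; function names are mine, and this is not the authors' fitting code):

```python
import math

def bessel_j1(z: float, terms: int = 30) -> float:
    """Power series for the Bessel function J1 (adequate for |z| < ~10)."""
    return sum((-1) ** k * (z / 2) ** (2 * k + 1)
               / (math.factorial(k) * math.factorial(k + 1))
               for k in range(terms))

MAS = math.pi / (180 * 3600 * 1000)  # one milliarcsecond in radians

def v_uniform_disk(theta_mas: float, baseline_m: float, wl_m: float) -> float:
    """Visibility of a uniform disk of angular diameter theta_mas."""
    x = math.pi * theta_mas * MAS * baseline_m / wl_m
    return 2.0 * bessel_j1(x) / x if x else 1.0

def v_gaussian(fwhm_mas: float, baseline_m: float, wl_m: float) -> float:
    """Visibility of a Gaussian brightness profile of given FWHM."""
    x = math.pi * fwhm_mas * MAS * baseline_m / wl_m
    return math.exp(-x * x / (4.0 * math.log(2.0)))

def v_total(f: float, theta_mas: float, fwhm_mas: float,
            baseline_m: float, wl_m: float = 2.2e-6) -> float:
    """Flux-weighted sum: photosphere (weight 1 - f) plus envelope (f)."""
    return ((1.0 - f) * v_uniform_disk(theta_mas, baseline_m, wl_m)
            + f * v_gaussian(fwhm_mas, baseline_m, wl_m))

# polar direction: 1.51 mas photosphere, 17.6 mas envelope, f = 4.7%
v_pol = v_total(0.047, 1.51, 17.6, 60.0)
```

Because the large Gaussian envelope resolves out at much shorter baselines than the compact photosphere, the combined curve sits below the photosphere-only curve at low spatial frequencies, which is exactly the signature the abstract reports.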

  3. Imaging atherosclerosis with hybrid [18F]fluorodeoxyglucose positron emission tomography/computed tomography imaging: what Leonardo da Vinci could not see.

    PubMed

    Cocker, Myra S; Mc Ardle, Brian; Spence, J David; Lum, Cheemun; Hammond, Robert R; Ongaro, Deidre C; McDonald, Matthew A; Dekemp, Robert A; Tardif, Jean-Claude; Beanlands, Rob S B

    2012-12-01

    Prodigious efforts and landmark discoveries have led toward significant advances in our understanding of atherosclerosis. Despite significant efforts, atherosclerosis continues globally to be a leading cause of mortality and reduced quality of life. With surges in the prevalence of obesity and diabetes, atherosclerosis is expected to have an even more pronounced impact upon the global burden of disease. It is imperative to develop strategies for the early detection of disease. Positron emission tomography (PET) imaging utilizing [(18)F]fluorodeoxyglucose (FDG) may provide a non-invasive means of characterizing inflammatory activity within atherosclerotic plaque, thus serving as a surrogate biomarker for detecting vulnerable plaque. The aim of this review is to explore the rationale for performing FDG imaging, provide an overview into the mechanism of action, and summarize findings from the early application of FDG PET imaging in the clinical setting to evaluate vascular disease. Alternative imaging biomarkers and approaches are briefly discussed.

  4. [The "myologie dynamique" by Girolamo Fabrizi da Aquapendente in the scientific language in the Renaissance age (XVI-XVII)].

    PubMed

    Stroppiana, L

    1989-01-01

    Beginning in the XV century, mechanical materialism evolved into a "biological mechanics" within scientific doctrine. Among the greatest exponents of this new current were two Italians, Leonardo da Vinci (1452-1519) and Girolamo Fabrizi da Acquapendente (1533-1619). Following the direction set by Leonardo, myology took on a dynamic meaning and valence instead of remaining a static science. Later, Fabrizi resumed and investigated the subject, above all in its less known aspects, elaborating an original theory. With Acquapendente, anatomy lost its merely descriptive peculiarity and evolved into an analysis of structure in connection with function. Moreover, he set mechanical language and mathematical formulation against the syllogism. A new scientific approach would later be established by Galileo Galilei in physics and by Giovanni Alfonso Borelli in biology. PMID:11640090

  6. Homological stabilizer codes

    SciTech Connect

    Anderson, Jonas T.

    2013-03-15

    In this paper we define homological stabilizer codes on qubits which encompass codes such as Kitaev's toric code and the topological color codes. These codes are defined solely by the graphs they reside on. This feature allows us to use properties of topological graph theory to determine the graphs which are suitable as homological stabilizer codes. We then show that all toric codes are equivalent to homological stabilizer codes on 4-valent graphs. We show that the topological color codes and toric codes correspond to two distinct classes of graphs. We define the notion of label set equivalencies and show that under a small set of constraints the only homological stabilizer codes without local logical operators are equivalent to Kitaev's toric code or to the topological color codes. Highlights: • We show that Kitaev's toric codes are equivalent to homological stabilizer codes on 4-valent graphs. • We show that toric codes and color codes correspond to homological stabilizer codes on distinct graphs. • We find and classify all 2D homological stabilizer codes. • We find optimal codes among the homological stabilizer codes.
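The graph-based stabilizer construction described in this record can be made concrete for its simplest instance, Kitaev's toric code. The sketch below is our own toy illustration (lattice size and edge labelling are arbitrary choices, not from the paper): it builds the star and plaquette stabilizer supports on an L × L torus and checks the even-overlap condition that makes the X-type and Z-type stabilizers commute.

```python
# Minimal sketch of Kitaev's toric code on an L x L torus (our own toy
# construction for illustration). Qubits sit on edges: a vertex "star"
# applies X to its 4 incident edges, a face "plaquette" applies Z to its
# 4 boundary edges.
L = 3

def edge(x, y, d):  # d = 0: horizontal edge at (x, y); d = 1: vertical edge
    return (x % L, y % L, d)

stars = [{edge(x, y, 0), edge(x - 1, y, 0), edge(x, y, 1), edge(x, y - 1, 1)}
         for x in range(L) for y in range(L)]
plaquettes = [{edge(x, y, 0), edge(x, y + 1, 0), edge(x, y, 1), edge(x + 1, y, 1)}
              for x in range(L) for y in range(L)]

# X- and Z-type stabilizers commute iff their supports share an even number
# of edges; on the torus every star/plaquette pair overlaps on 0 or 2 edges.
assert all(len(s & p) % 2 == 0 for s in stars for p in plaquettes)

n = 2 * L * L                                            # physical qubits
independent = (len(stars) - 1) + (len(plaquettes) - 1)   # one product constraint each
print(n, independent, n - independent)                   # 18 16 2 -> 2 logical qubits
```

The two leftover degrees of freedom (n minus the number of independent stabilizers) are the toric code's two logical qubits, matching the genus-1 topology of the torus.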

  7. Coding of Neuroinfectious Diseases.

    PubMed

    Barkley, Gregory L

    2015-12-01

    Accurate coding is an important function of neurologic practice. This contribution to Continuum is part of an ongoing series that presents helpful coding information along with examples related to the issue topic. Tips for diagnosis coding, Evaluation and Management coding, procedure coding, or a combination are presented, depending on which is most applicable to the subject area of the issue. PMID:26633789

  8. Model Children's Code.

    ERIC Educational Resources Information Center

    New Mexico Univ., Albuquerque. American Indian Law Center.

    The Model Children's Code was developed to provide a legally correct model code that American Indian tribes can use to enact children's codes that fulfill their legal, cultural and economic needs. Code sections cover the court system, jurisdiction, juvenile offender procedures, minor-in-need-of-care, and termination. Almost every Code section is…

  9. To Code or Not To Code?

    ERIC Educational Resources Information Center

    Parkinson, Brian; Sandhu, Parveen; Lacorte, Manel; Gourlay, Lesley

    1998-01-01

    This article considers arguments for and against the use of coding systems in classroom-based language research and touches on some relevant considerations from ethnographic and conversational analysis approaches. The four authors each explain and elaborate on their practical decision to code or not to code events or utterances at a specific point…

  10. Bar Code Reader

    NASA Astrophysics Data System (ADS)

    Clair, Jean J.

    1980-05-01

    The bar code system will be used in every market and supermarket. The code, which is normalised in the US and Europe (EAN code), gives information on price, storage, and nature, and allows real-time management of the shop.
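As a concrete aside on the EAN symbology mentioned above: the last digit of an EAN-13 number is a weighted checksum that a reader uses to catch scan errors. The function name below is our own; the rule itself (weights alternating 1 and 3 over the first 12 digits) is the standard EAN-13 check-digit calculation.

```python
# Standard EAN-13 check-digit rule; helper name is ours.
def ean13_check_digit(first12: str) -> int:
    # Weights alternate 1, 3, 1, 3, ... from the leftmost of the 12 digits.
    total = sum(int(d) * (3 if i % 2 else 1) for i, d in enumerate(first12))
    return (10 - total % 10) % 10

print(ean13_check_digit("978030640615"))  # -> 7, completing 9780306406157
```

A scanner recomputes this digit after every read; any single-digit misread changes the weighted sum and is rejected.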

  11. Cryptographer

    ERIC Educational Resources Information Center

    Sullivan, Megan

    2005-01-01

    For the general public, the field of cryptography has recently become famous as the method used to uncover secrets in Dan Brown's fictional bestseller, The Da Vinci Code. But the science of cryptography has been popular for centuries--secret hieroglyphics discovered in Egypt suggest that code-making dates back almost 4,000 years. In today's…

  12. Accumulate repeat accumulate codes

    NASA Technical Reports Server (NTRS)

    Abbasfar, Aliazam; Divsalar, Dariush; Yao, Kung

    2004-01-01

    In this paper we propose an innovative channel coding scheme called 'Accumulate Repeat Accumulate' (ARA) codes. This class of codes can be viewed as serial turbo-like codes, or as a subclass of Low Density Parity Check (LDPC) codes, thus belief propagation can be used for iterative decoding of ARA codes on a graph. The encoder structure for this class can be viewed as a precoded Repeat Accumulate (RA) code or as a precoded Irregular Repeat Accumulate (IRA) code, where simply an accumulator is chosen as the precoder. Thus ARA codes have a simple and very fast encoder structure when representing LDPC codes. Based on density evolution for LDPC codes, we show through some examples of ARA codes that for maximum variable node degree 5 a minimum bit SNR as low as 0.08 dB from channel capacity for rate 1/2 can be achieved as the block size goes to infinity. Thus, for a fixed low maximum variable node degree, its threshold outperforms not only the RA and IRA codes but also the best known LDPC codes with the same maximum node degree. Furthermore, by puncturing the accumulators, any desired high-rate codes close to rate 1 can be obtained with thresholds that stay uniformly close to the channel capacity thresholds. Iterative decoding simulation results are provided. The ARA codes also have a projected graph or protograph representation that allows for high-speed decoder implementation.
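The repeat-interleave-accumulate pipeline this abstract describes is easy to sketch. The toy encoder below is a simplified, non-systematic rate-1/q illustration of the RA/ARA structure; the fixed pseudorandom interleaver and parameter names are our own simplifications, not the paper's construction.

```python
import random

def accumulate(bits):
    # Running XOR: the "accumulator", a rate-1 recursive code 1/(1+D).
    out, acc = [], 0
    for b in bits:
        acc ^= b
        out.append(acc)
    return out

def ra_encode(info, q=3, seed=0):
    # Repeat each information bit q times, interleave, then accumulate.
    repeated = [b for b in info for _ in range(q)]
    order = list(range(len(repeated)))
    random.Random(seed).shuffle(order)  # toy fixed pseudorandom interleaver
    return accumulate([repeated[i] for i in order])

def ara_encode(info, q=3, seed=0):
    # ARA: an accumulator used as a precoder in front of the RA encoder.
    return ra_encode(accumulate(info), q, seed)

print(ara_encode([1, 0, 1, 1]))  # 12 coded bits for 4 info bits (rate 1/3)
```

The only difference between the RA and ARA sketches is the extra accumulator acting as the precoder, which is exactly the structural change the abstract credits for the improved thresholds.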

  13. Discussion on LDPC Codes and Uplink Coding

    NASA Technical Reports Server (NTRS)

    Andrews, Ken; Divsalar, Dariush; Dolinar, Sam; Moision, Bruce; Hamkins, Jon; Pollara, Fabrizio

    2007-01-01

    This slide presentation reviews the progress of the workgroup on Low-Density Parity-Check (LDPC) codes for space link coding. The workgroup is tasked with developing and recommending new error correcting codes for near-Earth, Lunar, and deep space applications. Included in the presentation is a summary of the technical progress of the workgroup. Charts that show the LDPC decoder sensitivity to symbol scaling errors are reviewed, as well as a chart showing the performance of several frame synchronizer algorithms compared to that of some good codes, and LDPC decoder tests at ESTL. Also reviewed is a study on Coding, Modulation, and Link Protocol (CMLP), and the recommended codes. A design for the Pseudo-Randomizer with LDPC Decoder and CRC is also reviewed. A chart that summarizes the three proposed coding systems is also presented.
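All of the decoder work summarized here rests on the same parity-check principle: a received word r is a valid codeword iff H·rᵀ = 0 (mod 2). As a toy stand-in for a real LDPC matrix (which would be large, sparse, and decoded iteratively with belief propagation), the small Hamming-style check below shows how a nonzero syndrome flags, and here even locates, an error.

```python
# Toy parity-check demo: H is the (7,4) Hamming code's check matrix,
# standing in for a real LDPC parity-check matrix.
H = [[1, 0, 1, 0, 1, 0, 1],
     [0, 1, 1, 0, 0, 1, 1],
     [0, 0, 0, 1, 1, 1, 1]]

def syndrome(r):
    # r is a valid codeword iff every parity check sums to 0 mod 2.
    return [sum(h * b for h, b in zip(row, r)) % 2 for row in H]

valid = [0, 1, 1, 0, 0, 1, 1]   # Hamming codeword for data bits 1011
print(syndrome(valid))           # [0, 0, 0]

corrupted = valid[:]
corrupted[4] ^= 1                # flip bit 5 in transit
print(syndrome(corrupted))       # [1, 0, 1] -> reads as binary 5, the flipped position
```

For this particular H the syndrome spells out the error position in binary; LDPC decoders instead pass probabilities along the sparse graph defined by H, but the zero-syndrome acceptance test is identical.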

  14. Manually operated coded switch

    DOEpatents

    Barnette, Jon H.

    1978-01-01

    The disclosure relates to a manually operated recodable coded switch in which a code may be inserted, tried and used to actuate a lever controlling an external device. After attempting a code, the switch's code wheels must be returned to their zero positions before another try is made.
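The return-to-zero interlock described in this disclosure can be modelled as a small state machine. The class below is our own toy rendering of that behaviour, not the patented mechanism: one attempt per arming, and only an all-zero wheel position re-arms the switch.

```python
# Toy state machine (our illustration, not the patented device): after any
# attempt, all code wheels must be returned to zero before the next attempt.
class CodedSwitch:
    def __init__(self, code):
        self._code = tuple(code)
        self.wheels = [0] * len(code)
        self._armed = True

    def set_wheels(self, positions):
        self.wheels = list(positions)
        if all(p == 0 for p in self.wheels):
            self._armed = True           # zero reset re-arms the switch

    def try_code(self):
        if not self._armed:
            raise RuntimeError("return all wheels to zero before retrying")
        self._armed = False              # one attempt per arming
        return tuple(self.wheels) == self._code

switch = CodedSwitch((1, 2, 3))
switch.set_wheels([3, 2, 1])
print(switch.try_code())                 # False, and retries are now locked out
switch.set_wheels([0, 0, 0])             # reset to zero
switch.set_wheels([1, 2, 3])
print(switch.try_code())                 # True: actuate the lever
```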

  15. Parafermion stabilizer codes

    NASA Astrophysics Data System (ADS)

    Güngördü, Utkan; Nepal, Rabindra; Kovalev, Alexey A.

    2014-10-01

    We define and study parafermion stabilizer codes, which can be viewed as generalizations of Kitaev's one-dimensional (1D) model of unpaired Majorana fermions. Parafermion stabilizer codes can protect against low-weight errors acting on a small subset of parafermion modes in analogy to qudit stabilizer codes. Examples of several smallest parafermion stabilizer codes are given. A locality-preserving embedding of qudit operators into parafermion operators is established that allows one to map known qudit stabilizer codes to parafermion codes. We also present a local 2D parafermion construction that combines topological protection of Kitaev's toric code with additional protection relying on parity conservation.
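Parafermion stabilizer codes generalize qudit stabilizer codes, and the underlying syndrome idea is easiest to see at the smallest qubit scale. The snippet below is our own illustration, not from the paper: the 3-qubit bit-flip code with stabilizers Z₁Z₂ and Z₂Z₃, where each single bit-flip error produces a distinct syndrome.

```python
# Syndrome extraction for the 3-qubit bit-flip code (our qubit-scale
# illustration of the stabilizer idea the paper generalizes).
def bit_flip_syndrome(errors):
    # errors[i] = 1 if qubit i suffered an X (bit-flip) error.
    # Returned pair = measurement outcomes of Z1Z2 and Z2Z3 (0 means +1).
    return (errors[0] ^ errors[1], errors[1] ^ errors[2])

assert bit_flip_syndrome([0, 0, 0]) == (0, 0)   # no error
assert bit_flip_syndrome([1, 0, 0]) == (1, 0)   # flip on qubit 1
assert bit_flip_syndrome([0, 1, 0]) == (1, 1)   # flip on qubit 2
assert bit_flip_syndrome([0, 0, 1]) == (0, 1)   # flip on qubit 3
```

Because the four syndromes are distinct, any single low-weight error can be identified and undone, which is the same protection mechanism the paper establishes for low-weight errors on small subsets of parafermion modes.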

  16. ARA type protograph codes

    NASA Technical Reports Server (NTRS)

    Divsalar, Dariush (Inventor); Abbasfar, Aliazam (Inventor); Jones, Christopher R. (Inventor); Dolinar, Samuel J. (Inventor); Thorpe, Jeremy C. (Inventor); Andrews, Kenneth S. (Inventor); Yao, Kung (Inventor)

    2008-01-01

    An apparatus and method for encoding low-density parity check codes. Together with a repeater, an interleaver and an accumulator, the apparatus comprises a precoder, thus forming accumulate-repeat-accumulate (ARA codes). Protographs representing various types of ARA codes, including AR3A, AR4A and ARJA codes, are described. High performance is obtained when compared to the performance of current repeat-accumulate (RA) or irregular-repeat-accumulate (IRA) codes.

  17. QR Codes 101

    ERIC Educational Resources Information Center

    Crompton, Helen; LaFrance, Jason; van 't Hooft, Mark

    2012-01-01

    A QR (quick-response) code is a two-dimensional scannable code, similar in function to a traditional bar code that one might find on a product at the supermarket. The main difference between the two is that, while a traditional bar code can hold a maximum of only 20 digits, a QR code can hold up to 7,089 characters, so it can contain much more…

  18. Nonbinary Quantum Convolutional Codes Derived from Negacyclic Codes

    NASA Astrophysics Data System (ADS)

    Chen, Jianzhang; Li, Jianping; Yang, Fan; Huang, Yuanyuan

    2015-01-01

    In this paper, some families of nonbinary quantum convolutional codes are constructed by using negacyclic codes. These nonbinary quantum convolutional codes are different from quantum convolutional codes in the literature. Moreover, we construct a family of optimal quantum convolutional codes.

  19. Asymmetric quantum convolutional codes

    NASA Astrophysics Data System (ADS)

    La Guardia, Giuliano G.

    2016-01-01

    In this paper, we construct the first families of asymmetric quantum convolutional codes (AQCCs). These new AQCCs are constructed by means of the CSS-type construction applied to suitable families of classical convolutional codes, which are also constructed here. The new codes have non-catastrophic generator matrices, and they have great asymmetry. Since our constructions are performed algebraically, i.e. we develop general algebraic methods and properties to perform the constructions, it is possible to derive several families of such codes and not only codes with specific parameters. Additionally, several different types of such codes are obtained.

  20. Cellulases and coding sequences

    DOEpatents

    Li, Xin-Liang; Ljungdahl, Lars G.; Chen, Huizhong

    2001-02-20

    The present invention provides three fungal cellulases, their coding sequences, recombinant DNA molecules comprising the cellulase coding sequences, recombinant host cells and methods for producing same. The present cellulases are from Orpinomyces PC-2.

  2. QR Code Mania!

    ERIC Educational Resources Information Center

    Shumack, Kellie A.; Reilly, Erin; Chamberlain, Nik

    2013-01-01

    space, has error-correction capacity, and can be read from any direction. These codes are used in manufacturing, shipping, and marketing, as well as in education. QR codes can be created to produce…

  3. EMF wire code research

    SciTech Connect

    Jones, T.

    1993-11-01

    This paper examines the results of previous wire code research to determine the relationship among wire codes, electromagnetic fields, and childhood cancer. The paper suggests that, in the original Savitz study, the selection procedure created biases toward producing a false positive association between high wire codes and childhood cancer.

  4. Software Certification - Coding, Code, and Coders

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Holzmann, Gerard J.

    2011-01-01

    We describe a certification approach for software development that has been adopted at our organization. JPL develops robotic spacecraft for the exploration of the solar system. The flight software that controls these spacecraft is considered to be mission critical. We argue that the goal of a software certification process cannot be the development of "perfect" software, i.e., software that can be formally proven to be correct under all imaginable and unimaginable circumstances. More realistically, the goal is to guarantee a software development process that is conducted by knowledgeable engineers, who follow generally accepted procedures to control known risks, while meeting agreed upon standards of workmanship. We target three specific issues that must be addressed in such a certification procedure: the coding process, the code that is developed, and the skills of the coders. The coding process is driven by standards (e.g., a coding standard) and tools. The code is mechanically checked against the standard with the help of state-of-the-art static source code analyzers. The coders, finally, are certified in on-site training courses that include formal exams.

  5. Coding for Electronic Mail

    NASA Technical Reports Server (NTRS)

    Rice, R. F.; Lee, J. J.

    1986-01-01

    Scheme for coding facsimile messages promises to reduce data transmission requirements to one-tenth current level. Coding scheme paves way for true electronic mail in which handwritten, typed, or printed messages or diagrams are sent virtually instantaneously - between buildings or between continents. Scheme, called Universal System for Efficient Electronic Mail (USEEM), uses unsupervised character recognition and adaptive noiseless coding of text. Image quality of resulting delivered messages improved over messages transmitted by conventional coding. Coding scheme compatible with direct-entry electronic mail as well as facsimile reproduction. Text transmitted in this scheme automatically translated to word-processor form.
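USEEM's adaptive noiseless coder is not specified in this brief, but the general idea of noiseless (lossless) text coding can be shown with a minimal static Huffman coder: frequent symbols get short bit strings, rare ones get long strings, and no code is a prefix of another. This is a simplified stand-in, not the USEEM algorithm.

```python
import heapq
from collections import Counter

# Minimal static Huffman coder (illustrative stand-in for adaptive
# noiseless text coding; real adaptive coders update the model on the fly).
def huffman_codes(text):
    # Heap entries are (count, tiebreak_id, node); nodes are chars or pairs.
    heap = [(n, i, ch) for i, (ch, n) in enumerate(Counter(text).items())]
    heapq.heapify(heap)
    next_id = len(heap)
    while len(heap) > 1:
        n1, _, a = heapq.heappop(heap)
        n2, _, b = heapq.heappop(heap)
        heapq.heappush(heap, (n1 + n2, next_id, (a, b)))
        next_id += 1
    codes = {}
    def walk(node, prefix):
        if isinstance(node, tuple):          # internal node: recurse
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:                                # leaf: assign the code
            codes[node] = prefix or "0"      # single-symbol alphabet gets "0"
    walk(heap[0][2], "")
    return codes

print(huffman_codes("abracadabra"))  # 'a' (5 occurrences) gets the shortest code
```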

  6. XSOR codes users manual

    SciTech Connect

    Jow, Hong-Nian; Murfin, W.B.; Johnson, J.D.

    1993-11-01

    This report describes the source term estimation codes, XSORs. The codes are written for three pressurized water reactors (Surry, Sequoyah, and Zion) and two boiling water reactors (Peach Bottom and Grand Gulf). The ensemble of codes has been named "XSOR". The purpose of the XSOR codes is to estimate the source terms which would be released to the atmosphere in severe accidents. A source term includes the release fractions of several radionuclide groups, the timing and duration of releases, the rates of energy release, and the elevation of releases. The codes have been developed by Sandia National Laboratories for the US Nuclear Regulatory Commission (NRC) in support of the NUREG-1150 program. The XSOR codes are fast-running parametric codes and are used as surrogates for detailed mechanistic codes. The XSOR codes also provide the capability to explore the phenomena and their uncertainty which are not currently modeled by the mechanistic codes. The uncertainty distributions of input parameters may be used by an XSOR code to estimate the uncertainty of source terms.

  7. DLLExternalCode

    SciTech Connect

    Greg Flach, Frank Smith

    2014-05-14

    DLLExternalCode is a general dynamic-link library (DLL) interface for linking GoldSim (www.goldsim.com) with external codes. The overall concept is to use GoldSim as top level modeling software with interfaces to external codes for specific calculations. The DLLExternalCode DLL that performs the linking function is designed to take a list of code inputs from GoldSim, create an input file for the external application, run the external code, and return a list of outputs, read from files created by the external application, back to GoldSim. Instructions for creating the input file, running the external code, and reading the output are contained in an instructions file that is read and interpreted by the DLL.

  9. Parafermion stabilizer codes

    NASA Astrophysics Data System (ADS)

    Gungordu, Utkan; Nepal, Rabindra; Kovalev, Alexey

    2015-03-01

    We define and study parafermion stabilizer codes [Phys. Rev. A 90, 042326 (2014)] which can be viewed as generalizations of Kitaev's one dimensional model of unpaired Majorana fermions. Parafermion stabilizer codes can protect against low-weight errors acting on a small subset of parafermion modes in analogy to qudit stabilizer codes. Examples of several smallest parafermion stabilizer codes are given. Our results show that parafermions can achieve a better encoding rate than Majorana fermions. A locality preserving embedding of qudit operators into parafermion operators is established which allows one to map known qudit stabilizer codes to parafermion codes. We also present a local 2D parafermion construction that combines topological protection of Kitaev's toric code with additional protection relying on parity conservation. This work was supported in part by the NSF under Grants No. Phy-1415600 and No. NSF-EPSCoR 1004094.

  10. Mathematical Fiction for Senior Students and Undergraduates: Novels, Plays, and Film

    ERIC Educational Resources Information Center

    Padula, Janice

    2006-01-01

    Mathematical fiction has probably existed since ideas have been written down and certainly as early as 414 BC (Kasman, 2000). Mathematical fiction is a recently rediscovered and growing literature, as sales of the novels: "The Curious Incident of the Dog in the Night-time" (Haddon, 2003) and "The Da Vinci Code" (Brown, 2004) attest. Science…

  11. Industrial Code Development

    NASA Technical Reports Server (NTRS)

    Shapiro, Wilbur

    1991-01-01

    The industrial codes will consist of modules of 2-D and simplified 2-D or 1-D codes, intended for expeditious parametric studies, analysis, and design of a wide variety of seals. Integration into a unified system is accomplished by the industrial Knowledge Based System (KBS), which will also provide user friendly interaction, contact sensitive and hypertext help, design guidance, and an expandable database. The types of analysis to be included with the industrial codes are interfacial performance (leakage, load, stiffness, friction losses, etc.), thermoelastic distortions, and dynamic response to rotor excursions. The first three codes to be completed and which are presently being incorporated into the KBS are the incompressible cylindrical code, ICYL, and the compressible cylindrical code, GCYL.

  12. Updating the Read Codes

    PubMed Central

    Robinson, David; Schulz, Erich; Brown, Philip; Price, Colin

    1997-01-01

    The Read Codes are a hierarchically-arranged controlled clinical vocabulary introduced in the early 1980s and now consisting of three maintained versions of differing complexity. The code sets are dynamic, and are updated quarterly in response to requests from users including clinicians in both primary and secondary care, software suppliers, and advice from a network of specialist healthcare professionals. The codes' continual evolution of content, both across and within versions, highlights tensions between different users and uses of coded clinical data. Internal processes, external interactions and new structural features implemented by the NHS Centre for Coding and Classification (NHSCCC) for user interactive maintenance of the Read Codes are described, and over 2000 items of user feedback episodes received over a 15-month period are analysed. PMID:9391934
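The hierarchical arrangement mentioned here is positional in 5-byte Read Codes: each character adds one level of the chapter tree, and codes are padded to five characters with dots, so a code's parent is obtained by dropping its last significant character. The sketch below shows that convention; the sample codes are illustrative placeholders, not entries from any released code set.

```python
# Positional hierarchy of 5-byte Read Codes (sample codes are placeholders).
def parent(code):
    stem = code.rstrip(".")
    # Drop the last significant character and re-pad to 5 with dots;
    # a one-character (chapter-level) code has no parent.
    return stem[:-1].ljust(5, ".") if len(stem) > 1 else None

def ancestors(code):
    chain, p = [], parent(code)
    while p is not None:
        chain.append(p)
        p = parent(p)
    return chain

print(ancestors("G30.."))   # ['G3...', 'G....']
```

This positional scheme is what lets a query for a chapter-level code retrieve all of its descendants by simple prefix matching.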

  13. Mechanical code comparator

    DOEpatents

    Peter, Frank J.; Dalton, Larry J.; Plummer, David W.

    2002-01-01

    A new class of mechanical code comparators is described which have broad potential for application in safety, surety, and security applications. These devices can be implemented as micro-scale electromechanical systems that isolate a secure or otherwise controlled device until an access code is entered. This access code is converted into a series of mechanical inputs to the mechanical code comparator, which compares the access code to a pre-input combination, entered previously into the mechanical code comparator by an operator at the system security control point. These devices provide extremely high levels of robust security. Being totally mechanical in operation, an access control system properly based on such devices cannot be circumvented by software attack alone.

  14. Generating code adapted for interlinking legacy scalar code and extended vector code

    DOEpatents

    Gschwind, Michael K

    2013-06-04

    Mechanisms for intermixing code are provided. Source code is received for compilation using an extended Application Binary Interface (ABI) that extends a legacy ABI and uses a different register configuration than the legacy ABI. First compiled code is generated based on the source code, the first compiled code comprising code for accommodating the difference in register configurations used by the extended ABI and the legacy ABI. The first compiled code and second compiled code are intermixed to generate intermixed code, the second compiled code being compiled code that uses the legacy ABI. The intermixed code comprises at least one call instruction that is one of a call from the first compiled code to the second compiled code or a call from the second compiled code to the first compiled code. The code for accommodating the difference in register configurations is associated with the at least one call instruction.

  15. Phonological coding during reading

    PubMed Central

    Leinenger, Mallorie

    2014-01-01

    The exact role that phonological coding (the recoding of written, orthographic information into a sound based code) plays during silent reading has been extensively studied for more than a century. Despite the large body of research surrounding the topic, varying theories as to the time course and function of this recoding still exist. The present review synthesizes this body of research, addressing the topics of time course and function in tandem. The varying theories surrounding the function of phonological coding (e.g., that phonological codes aid lexical access, that phonological codes aid comprehension and bolster short-term memory, or that phonological codes are largely epiphenomenal in skilled readers) are first outlined, and the time courses that each maps onto (e.g., that phonological codes come online early (pre-lexical) or that phonological codes come online late (post-lexical)) are discussed. Next the research relevant to each of these proposed functions is reviewed, discussing the varying methodologies that have been used to investigate phonological coding (e.g., response time methods, reading while eyetracking or recording EEG and MEG, concurrent articulation) and highlighting the advantages and limitations of each with respect to the study of phonological coding. In response to the view that phonological coding is largely epiphenomenal in skilled readers, research on the use of phonological codes in prelingually, profoundly deaf readers is reviewed. Finally, implications for current models of word identification (activation-verification model (Van Orden, 1987), dual-route model (e.g., Coltheart, Rastle, Perry, Langdon, & Ziegler, 2001), parallel distributed processing model (Seidenberg & McClelland, 1989)) are discussed. PMID:25150679

  16. Industrial Computer Codes

    NASA Technical Reports Server (NTRS)

    Shapiro, Wilbur

    1996-01-01

    This is an overview of new and updated industrial codes for seal design and testing. GCYLT (gas cylindrical seals -- turbulent), SPIRALI (spiral-groove seals -- incompressible), KTK (knife to knife) Labyrinth Seal Code, and DYSEAL (dynamic seal analysis) are covered. GCYLT uses G-factors for Poiseuille and Couette turbulence coefficients. SPIRALI is updated to include turbulence and inertia, but maintains the narrow groove theory. KTK labyrinth seal code handles straight or stepped seals. And DYSEAL provides dynamics for the seal geometry.

  17. Doubled Color Codes

    NASA Astrophysics Data System (ADS)

    Bravyi, Sergey

    Combining protection from noise and computational universality is one of the biggest challenges in the fault-tolerant quantum computing. Topological stabilizer codes such as the 2D surface code can tolerate a high level of noise but implementing logical gates, especially non-Clifford ones, requires a prohibitively large overhead due to the need of state distillation. In this talk I will describe a new family of 2D quantum error correcting codes that enable a transversal implementation of all logical gates required for the universal quantum computing. Transversal logical gates (TLG) are encoded operations that can be realized by applying some single-qubit rotation to each physical qubit. TLG are highly desirable since they introduce no overhead and do not spread errors. It has been known before that a quantum code can have only a finite number of TLGs which rules out computational universality. Our scheme circumvents this no-go result by combining TLGs of two different quantum codes using the gauge-fixing method pioneered by Paetznick and Reichardt. The first code, closely related to the 2D color code, enables a transversal implementation of all single-qubit Clifford gates such as the Hadamard gate and the π / 2 phase shift. The second code that we call a doubled color code provides a transversal T-gate, where T is the π / 4 phase shift. The Clifford+T gate set is known to be computationally universal. The two codes can be laid out on the honeycomb lattice with two qubits per site such that the code conversion requires parity measurements for six-qubit Pauli operators supported on faces of the lattice. I will also describe numerical simulations of logical Clifford+T circuits encoded by the distance-3 doubled color code. Based on a joint work with Andrew Cross.

  18. FAA Smoke Transport Code

    SciTech Connect

    Domino, Stefan; Luketa-Hanlin, Anay; Gallegos, Carlos

    2006-10-27

    FAA Smoke Transport Code, a physics-based Computational Fluid Dynamics tool, which couples heat, mass, and momentum transfer, has been developed to provide information on smoke transport in cargo compartments with various geometries and flight conditions. The software package contains a graphical user interface for specification of geometry and boundary conditions, analysis module for solving the governing equations, and a post-processing tool. The current code was produced by making substantial improvements and additions to a code obtained from a university. The original code was able to compute steady, uniform, isothermal turbulent pressurization. In addition, a preprocessor and postprocessor were added to arrive at the current software package.

  19. Bar Code Labels

    NASA Technical Reports Server (NTRS)

    1988-01-01

    American Bar Codes, Inc. developed special bar code labels for inventory control of space shuttle parts and other space system components. ABC labels are made in a company-developed anodized-aluminum process and consecutively marked with bar code symbology and human readable numbers. They offer extreme abrasion resistance and indefinite resistance to ultraviolet radiation, capable of withstanding 700 degree temperatures without deterioration and up to 1400 degrees with special designs. They offer high resistance to salt spray, cleaning fluids and mild acids. ABC is now producing these bar code labels commercially for industrial customers who also need labels to resist harsh environments.

  20. Tokamak Systems Code

    SciTech Connect

    Reid, R.L.; Barrett, R.J.; Brown, T.G.; Gorker, G.E.; Hooper, R.J.; Kalsi, S.S.; Metzler, D.H.; Peng, Y.K.M.; Roth, K.E.; Spampinato, P.T.

    1985-03-01

    The FEDC Tokamak Systems Code calculates tokamak performance, cost, and configuration as a function of plasma engineering parameters. This version of the code models experimental tokamaks. It does not currently consider tokamak configurations that generate electrical power or incorporate breeding blankets. The code has a modular (or subroutine) structure to allow independent modeling for each major tokamak component or system. A primary benefit of modularization is that a component module may be updated without disturbing the remainder of the systems code as long as the input to or output from the module remains unchanged.

  1. MORSE Monte Carlo code

    SciTech Connect

    Cramer, S.N.

    1984-01-01

    The MORSE code is a large general-use multigroup Monte Carlo code system. Although no claims can be made regarding its superiority in either theoretical details or Monte Carlo techniques, MORSE has been, since its inception at ORNL in the late 1960s, the most widely used Monte Carlo radiation transport code. The principal reason for this popularity is that MORSE is relatively easy to use, independent of any installation or distribution center, and it can be easily customized to fit almost any specific need. Features of the MORSE code are described.

  2. Research on universal combinatorial coding.

    PubMed

    Lu, Jun; Zhang, Zhuo; Mo, Juan

    2014-01-01

    The concept of universal combinatorial coding is proposed. Many coding methods are related to one another to some degree, which suggests that a universal coding method objectively exists and can serve as a bridge connecting them. Universal combinatorial coding is lossless and is based on combinatorics theory; its combinational and exhaustive properties relate it closely to existing coding methods. It does not depend on the probability statistics of the information source, and it has characteristics spanning the three branches of coding. This paper analyzes the relationship between universal combinatorial coding and a variety of coding methods, investigates several application technologies of the method, and analyzes its efficiency theoretically. The multiple characteristics and applications of universal combinatorial coding are unique among existing coding methods, and the method has both theoretical research and practical application value.

  4. Clip and Save.

    ERIC Educational Resources Information Center

    Hubbard, Guy

    2003-01-01

    Focuses on the facial expression in the "Mona Lisa" by Leonardo da Vinci. Offers background information on da Vinci as well as learning activities for students. Includes a reproduction of the "Mona Lisa" and information about the painting. (CMK)

  5. Fast Coding Unit Encoding Mechanism for Low Complexity Video Coding

    PubMed Central

    Wu, Yueying; Jia, Kebin; Gao, Guandong

    2016-01-01

    In high efficiency video coding (HEVC), the coding tree contributes to excellent compression performance. However, the coding tree also brings extremely high computational complexity. Innovative work on improving the coding tree to further reduce encoding time is presented in this paper. A novel low complexity coding tree mechanism is proposed for HEVC fast coding unit (CU) encoding. Firstly, this paper makes an in-depth study of the relationship among CU distribution, quantization parameter (QP) and content change (CC). Secondly, a CU coding tree probability model is proposed for modeling and predicting CU distribution. Eventually, a CU coding tree probability update is proposed, aiming to address probabilistic model distortion problems caused by CC. Experimental results show that the proposed low complexity CU coding tree mechanism significantly reduces encoding time by 27% for lossy coding and 42% for visually lossless and lossless coding. The proposed low complexity CU coding tree mechanism is devoted to improving coding performance under various application conditions. PMID:26999741

  6. Code of Ethics

    ERIC Educational Resources Information Center

    Division for Early Childhood, Council for Exceptional Children, 2009

    2009-01-01

    The Code of Ethics of the Division for Early Childhood (DEC) of the Council for Exceptional Children is a public statement of principles and practice guidelines supported by the mission of DEC. The foundation of this Code is based on sound ethical reasoning related to professional practice with young children with disabilities and their families…

  7. Lichenase and coding sequences

    DOEpatents

    Li, Xin-Liang; Ljungdahl, Lars G.; Chen, Huizhong

    2000-08-15

    The present invention provides a fungal lichenase, i.e., an endo-1,3-1,4-.beta.-D-glucanohydrolase, its coding sequence, recombinant DNA molecules comprising the lichenase coding sequences, recombinant host cells and methods for producing same. The present lichenase is from Orpinomyces PC-2.

  8. Legacy Code Modernization

    NASA Technical Reports Server (NTRS)

    Hribar, Michelle R.; Frumkin, Michael; Jin, Haoqiang; Waheed, Abdul; Yan, Jerry; Saini, Subhash (Technical Monitor)

    1998-01-01

    Over the past decade, high performance computing has evolved rapidly; systems based on commodity microprocessors have been introduced in quick succession from at least seven vendors/families. Porting codes to every new architecture is a difficult problem; in particular, here at NASA, there are many large CFD applications that are very costly to port to new machines by hand. The LCM ("Legacy Code Modernization") Project is the development of an integrated parallelization environment (IPE) which performs the automated mapping of legacy CFD (Fortran) applications to state-of-the-art high performance computers. While most projects to port codes focus on the parallelization of the code, we consider porting to be an iterative process consisting of several steps: 1) code cleanup, 2) serial optimization,3) parallelization, 4) performance monitoring and visualization, 5) intelligent tools for automated tuning using performance prediction and 6) machine specific optimization. The approach for building this parallelization environment is to build the components for each of the steps simultaneously and then integrate them together. The demonstration will exhibit our latest research in building this environment: 1. Parallelizing tools and compiler evaluation. 2. Code cleanup and serial optimization using automated scripts 3. Development of a code generator for performance prediction 4. Automated partitioning 5. Automated insertion of directives. These demonstrations will exhibit the effectiveness of an automated approach for all the steps involved with porting and tuning a legacy code application for a new architecture.

  9. Synthesizing Certified Code

    NASA Technical Reports Server (NTRS)

    Whalen, Michael; Schumann, Johann; Fischer, Bernd

    2002-01-01

    Code certification is a lightweight approach to demonstrate software quality on a formal level. Its basic idea is to require producers to provide formal proofs that their code satisfies certain quality properties. These proofs serve as certificates which can be checked independently. Since code certification uses the same underlying technology as program verification, it also requires many detailed annotations (e.g., loop invariants) to make the proofs possible. However, manually adding these annotations to the code is time-consuming and error-prone. We address this problem by combining code certification with automatic program synthesis. We propose an approach to generate simultaneously, from a high-level specification, code and all annotations required to certify the generated code. Here, we describe a certification extension of AUTOBAYES, a synthesis tool which automatically generates complex data analysis programs from compact specifications. AUTOBAYES contains sufficient high-level domain knowledge to generate detailed annotations. This allows us to use a general-purpose verification condition generator to produce a set of proof obligations in first-order logic. The obligations are then discharged using the automated theorem prover E-SETHEO. We demonstrate our approach by certifying operator safety for a generated iterative data classification program without manual annotation of the code.

  10. Combustion chamber analysis code

    NASA Astrophysics Data System (ADS)

    Przekwas, A. J.; Lai, Y. G.; Krishnan, A.; Avva, R. K.; Giridharan, M. G.

    1993-05-01

    A three-dimensional, time dependent, Favre averaged, finite volume Navier-Stokes code has been developed to model compressible and incompressible flows (with and without chemical reactions) in liquid rocket engines. The code has a non-staggered formulation with generalized body-fitted-coordinates (BFC) capability. Higher order differencing methodologies such as MUSCL and Osher-Chakravarthy schemes are available. Turbulent flows can be modeled using any of the five turbulent models present in the code. A two-phase, two-liquid, Lagrangian spray model has been incorporated into the code. Chemical equilibrium and finite rate reaction models are available to model chemically reacting flows. The discrete ordinate method is used to model effects of thermal radiation. The code has been validated extensively against benchmark experimental data and has been applied to model flows in several propulsion system components of the SSME and the STME.

  12. Energy Conservation Code Decoded

    SciTech Connect

    Cole, Pam C.; Taylor, Zachary T.

    2006-09-01

    Designing an energy-efficient, affordable, and comfortable home is a lot easier thanks to a slimmer, easier-to-read booklet, the 2006 International Energy Conservation Code (IECC), published in March 2006. States, counties, and cities have begun reviewing the new code as a potential upgrade to their existing codes. Maintained under the public consensus process of the International Code Council, the IECC is designed to do just what its title says: promote the design and construction of energy-efficient homes and commercial buildings. "Homes" in this case means traditional single-family homes, duplexes, condominiums, and apartment buildings having three or fewer stories. The U.S. Department of Energy, which played a key role in proposing the changes that resulted in the new code, is offering a free training course that covers the residential provisions of the 2006 IECC.

  13. Evolving genetic code

    PubMed Central

    OHAMA, Takeshi; INAGAKI, Yuji; BESSHO, Yoshitaka; OSAWA, Syozo

    2008-01-01

    In 1985, we reported that a bacterium, Mycoplasma capricolum, used a deviant genetic code, namely UGA, a “universal” stop codon, was read as tryptophan. This finding, together with the deviant nuclear genetic codes in not a few organisms and a number of mitochondria, shows that the genetic code is not universal, and is in a state of evolution. To account for the changes in codon meanings, we proposed the codon capture theory stating that all the code changes are non-disruptive, without accompanying changes in the amino acid sequences of proteins. Supporting evidence for the theory is presented in this review. A possible evolutionary process from the ancient to the present-day genetic code is also discussed. PMID:18941287
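    The UGA reassignment described in this abstract can be illustrated with a toy translation table. This is a deliberately tiny sketch: the six-codon table and the short RNA string are illustrative assumptions, not a real genome fragment (real translation uses the full 64-codon table).

```python
# Standard code: UGA is a stop codon; Mycoplasma capricolum reads it as Trp (W).
# Only the handful of codons used below are included -- illustrative, not complete.
STANDARD = {"AUG": "M", "GCU": "A", "UGG": "W", "UGA": "*", "UAA": "*", "UAG": "*"}
MYCOPLASMA = dict(STANDARD, UGA="W")   # single codon reassignment

def translate(rna, table):
    """Translate codon by codon, stopping at the first stop codon ('*')."""
    out = []
    for i in range(0, len(rna) - 2, 3):
        aa = table[rna[i:i + 3]]
        if aa == "*":
            break
        out.append(aa)
    return "".join(out)

rna = "AUGGCUUGAGCU"
print(translate(rna, STANDARD))    # 'MA'   -- UGA terminates translation
print(translate(rna, MYCOPLASMA))  # 'MAWA' -- UGA read through as tryptophan
```

    The same mechanism underlies the codon capture theory: a codon can change meaning without disrupting existing proteins if it has (temporarily) disappeared from coding sequences.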

  14. Quantum convolutional codes derived from constacyclic codes

    NASA Astrophysics Data System (ADS)

    Yan, Tingsu; Huang, Xinmei; Tang, Yuansheng

    2014-12-01

    In this paper, three families of quantum convolutional codes are constructed. The first one and the second one can be regarded as a generalization of Theorems 3, 4, 7 and 8 [J. Chen, J. Li, F. Yang and Y. Huang, Int. J. Theor. Phys., doi:10.1007/s10773-014-2214-6 (2014)], in the sense that we drop the constraint q ≡ 1 (mod 4). Furthermore, the second one and the third one attain the quantum generalized Singleton bound.

  15. Pyramid image codes

    NASA Technical Reports Server (NTRS)

    Watson, Andrew B.

    1990-01-01

    All vision systems, both human and machine, transform the spatial image into a coded representation. Particular codes may be optimized for efficiency or to extract useful image features. Researchers explored image codes based on primary visual cortex in man and other primates. Understanding these codes will advance the art in image coding, autonomous vision, and computational human factors. In cortex, imagery is coded by features that vary in size, orientation, and position. Researchers have devised a mathematical model of this transformation, called the Hexagonal oriented Orthogonal quadrature Pyramid (HOP). In a pyramid code, features are segregated by size into layers, with fewer features in the layers devoted to large features. Pyramid schemes provide scale invariance, and are useful for coarse-to-fine searching and for progressive transmission of images. The HOP Pyramid is novel in three respects: (1) it uses a hexagonal pixel lattice, (2) it uses oriented features, and (3) it accurately models most of the prominent aspects of primary visual cortex. The transform uses seven basic features (kernels), which may be regarded as three oriented edges, three oriented bars, and one non-oriented blob. Application of these kernels to non-overlapping seven-pixel neighborhoods yields six oriented, high-pass pyramid layers, and one low-pass (blob) layer.
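    The layer structure described above can be sketched with a generic pyramid decomposition. To be clear, this is not the HOP transform (which uses a hexagonal lattice and seven oriented kernels); it is a minimal square-lattice pyramid, shown only to illustrate how features are segregated by size into layers with exact reconstruction.

```python
import numpy as np

def build_pyramid(image, levels=2):
    """Split an image into high-pass detail layers plus one low-pass layer.
    Each detail layer holds features of one size band, as in pyramid codes."""
    layers, cur = [], image.astype(float)
    for _ in range(levels):
        h, w = cur.shape
        low = cur.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))  # 2x2 box low-pass
        up = np.kron(low, np.ones((2, 2)))                         # upsample back
        layers.append(cur - up)                                    # detail (high-pass)
        cur = low
    layers.append(cur)                                             # coarsest low-pass layer
    return layers

def reconstruct(layers):
    """Invert the pyramid: upsample the low-pass and add back each detail layer."""
    cur = layers[-1]
    for detail in reversed(layers[:-1]):
        cur = np.kron(cur, np.ones((2, 2))) + detail
    return cur

img = np.arange(64, dtype=float).reshape(8, 8)
layers = build_pyramid(img, levels=2)   # shapes: 8x8 detail, 4x4 detail, 2x2 low-pass
```

    Note how the coarser layers contain far fewer coefficients, which is what makes pyramid schemes natural for coarse-to-fine search and progressive transmission.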

  16. Report number codes

    SciTech Connect

    Nelson, R.N.

    1985-05-01

    This publication lists all report number codes processed by the Office of Scientific and Technical Information. The report codes are substantially based on the American National Standards Institute, Standard Technical Report Number (STRN)-Format and Creation Z39.23-1983. The Standard Technical Report Number (STRN) provides one of the primary methods of identifying a specific technical report. The STRN consists of two parts: The report code and the sequential number. The report code identifies the issuing organization, a specific program, or a type of document. The sequential number, which is assigned in sequence by each report issuing entity, is not included in this publication. Part I of this compilation is alphabetized by report codes followed by issuing installations. Part II lists the issuing organization followed by the assigned report code(s). In both Parts I and II, the names of issuing organizations appear for the most part in the form used at the time the reports were issued. However, for some of the more prolific installations which have had name changes, all entries have been merged under the current name.

  17. Compressible Astrophysics Simulation Code

    SciTech Connect

    Howell, L.; Singer, M.

    2007-07-18

    This is an astrophysics simulation code involving a radiation diffusion module developed at LLNL coupled to compressible hydrodynamics and adaptive mesh infrastructure developed at LBNL. One intended application is to neutrino diffusion in core collapse supernovae.

  18. Seals Flow Code Development

    NASA Technical Reports Server (NTRS)

    1991-01-01

    In recognition of a deficiency in the current modeling capability for seals, an effort was established by NASA to develop verified computational fluid dynamic concepts, codes, and analyses for seals. The objectives were to develop advanced concepts for the design and analysis of seals, to effectively disseminate the information to potential users by way of annual workshops, and to provide experimental verification for the models and codes under a wide range of operating conditions.

  19. Robust Nonlinear Neural Codes

    NASA Astrophysics Data System (ADS)

    Yang, Qianli; Pitkow, Xaq

    2015-03-01

    Most interesting natural sensory stimuli are encoded in the brain in a form that can only be decoded nonlinearly. But despite being a core function of the brain, nonlinear population codes are rarely studied and poorly understood. Interestingly, the few existing models of nonlinear codes are inconsistent with known architectural features of the brain. In particular, these codes have information content that scales with the size of the cortical population, even if that violates the data processing inequality by exceeding the amount of information entering the sensory system. Here we provide a valid theory of nonlinear population codes by generalizing recent work on information-limiting correlations in linear population codes. Although these generalized, nonlinear information-limiting correlations bound the performance of any decoder, they also make decoding more robust to suboptimal computation, allowing many suboptimal decoders to achieve nearly the same efficiency as an optimal decoder. Although these correlations are extremely difficult to measure directly, particularly for nonlinear codes, we provide a simple, practical test by which one can use choice-related activity in small populations of neurons to determine whether decoding is suboptimal or optimal and limited by correlated noise. We conclude by describing an example computation in the vestibular system where this theory applies. QY and XP were supported by a grant from the McNair foundation.

  20. KENO-V code

    SciTech Connect

    Cramer, S.N.

    1984-01-01

    The KENO-V code is the current release of the Oak Ridge multigroup Monte Carlo criticality code development. The original KENO, with 16 group Hansen-Roach cross sections and P/sub 1/ scattering, was one of the first multigroup Monte Carlo codes, and it and its successors have always been a much-used research tool for criticality studies. KENO-V is able to accept large neutron cross section libraries (a 218 group set is distributed with the code) and has a general P/sub N/ scattering capability. A supergroup feature allows execution of large problems on small computers, but at the expense of increased calculation time and system input/output operations. This supergroup feature is activated automatically by the code in a manner which utilizes as much computer memory as is available. The primary purpose of KENO-V is to calculate the system k/sub eff/, from small bare critical assemblies to large reflected arrays of differing fissile and moderator elements. In this respect KENO-V neither has nor requires the many options and sophisticated biasing techniques of general Monte Carlo codes.

  1. Coded aperture compressive temporal imaging.

    PubMed

    Llull, Patrick; Liao, Xuejun; Yuan, Xin; Yang, Jianbo; Kittle, David; Carin, Lawrence; Sapiro, Guillermo; Brady, David J

    2013-05-01

    We use mechanical translation of a coded aperture for code division multiple access compression of video. We discuss the compressed video's temporal resolution and present experimental results for reconstructions of > 10 frames of temporal data per coded snapshot.
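    The measurement model behind such a coded snapshot can be sketched as follows. This is an illustrative forward model only: the mask size, the per-frame vertical shift standing in for "mechanical translation", and the random seed are assumptions, and the hard part of the paper (reconstructing the frames from one snapshot) is omitted.

```python
import numpy as np

def cacti_snapshot(frames, mask):
    """Coded snapshot: each temporal frame is modulated by a shifted copy of
    one binary mask (modeling a translating coded aperture), then all frames
    are summed into a single measurement."""
    T, H, W = frames.shape
    y = np.zeros((H, W))
    for t in range(T):
        shifted = np.roll(mask, shift=t, axis=0)   # translate the code each frame
        y += shifted * frames[t]
    return y

rng = np.random.default_rng(0)
frames = rng.random((12, 32, 32))                  # 12 temporal frames per snapshot
mask = (rng.random((32, 32)) > 0.5).astype(float)  # binary coded aperture
y = cacti_snapshot(frames, mask)                   # one 32x32 coded measurement
```

    Because each frame sees a different shifted code, the temporal information is multiplexed into one image in a code-division sense, which is what a reconstruction algorithm later exploits.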

  2. Prioritized LT Codes

    NASA Technical Reports Server (NTRS)

    Woo, Simon S.; Cheng, Michael K.

    2011-01-01

    The original Luby Transform (LT) coding scheme is extended to account for data transmissions where some information symbols in a message block are more important than others. Prioritized LT codes provide unequal error protection (UEP) of data on an erasure channel by modifying the original LT encoder. The prioritized algorithm improves high-priority data protection without penalizing low-priority data recovery. Moreover, low-latency decoding is also obtained for high-priority data due to fast encoding. Prioritized LT codes only require a slight change in the original encoding algorithm, and no changes at all at the decoder. Hence, with a small complexity increase in the LT encoder, an improved UEP and low-decoding latency performance for high-priority data can be achieved. LT encoding partitions a data stream into fixed-sized message blocks each with a constant number of information symbols. To generate a code symbol from the information symbols in a message, the Robust-Soliton probability distribution is first applied in order to determine the number of information symbols to be used to compute the code symbol. Then, the specific information symbols are chosen uniform randomly from the message block. Finally, the selected information symbols are XORed to form the code symbol. The Prioritized LT code construction includes an additional restriction that code symbols formed by a relatively small number of XORed information symbols select some of these information symbols from the pool of high-priority data. Once high-priority data are fully covered, encoding continues with the conventional LT approach where code symbols are generated by selecting information symbols from the entire message block including all different priorities. Therefore, if code symbols derived from high-priority data experience an unusual high number of erasures, Prioritized LT codes can still reliably recover both high- and low-priority data. 
This hybrid approach decides not only "how to encode
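    The encoding procedure described above can be sketched in a few lines. The degree threshold used to decide what counts as a "relatively small" number of XORed symbols, the block size, and the high-priority pool size are illustrative assumptions, and no decoder is shown.

```python
import math
import random

def robust_soliton(k, c=0.1, delta=0.5):
    """Normalized Robust-Soliton degree distribution over degrees 1..k."""
    s = c * math.log(k / delta) * math.sqrt(k)
    rho = [1.0 / k] + [1.0 / (d * (d - 1)) for d in range(2, k + 1)]
    tau = [0.0] * k
    cap = int(round(k / s))
    for d in range(1, cap):
        tau[d - 1] = s / (k * d)
    if 1 <= cap <= k:
        tau[cap - 1] = s * math.log(s / delta) / k
    w = [r + t for r, t in zip(rho, tau)]
    z = sum(w)
    return [x / z for x in w]

def encode_symbol(block, dist, high_priority, rng, low_degree=3):
    """Generate one LT code symbol from a block of integer information symbols.
    Prioritized twist: low-degree symbols draw their information symbols from
    the high-priority index pool (threshold of 3 is an illustrative choice)."""
    degree = rng.choices(range(1, len(block) + 1), weights=dist, k=1)[0]
    pool = list(high_priority) if degree <= low_degree else list(range(len(block)))
    chosen = rng.sample(pool, min(degree, len(pool)))
    symbol = 0
    for i in chosen:
        symbol ^= block[i]          # XOR the selected information symbols
    return chosen, symbol

rng = random.Random(7)
block = [rng.randrange(256) for _ in range(16)]   # 16 information symbols
dist = robust_soliton(len(block))
idx, sym = encode_symbol(block, dist, high_priority=range(4), rng=rng)
```

    As the abstract notes, only the encoder changes: a standard LT belief-propagation decoder can consume these symbols unmodified.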

  3. Induction technology optimization code

    SciTech Connect

    Caporaso, G.J.; Brooks, A.L.; Kirbie, H.C.

    1992-08-21

    A code has been developed to evaluate relative costs of induction accelerator driver systems for relativistic klystrons. The code incorporates beam generation, transport and pulsed power system constraints to provide an integrated design tool. The code generates an injector/accelerator combination which satisfies the top level requirements and all system constraints once a small number of design choices have been specified (rise time of the injector voltage and aspect ratio of the ferrite induction cores, for example). The code calculates dimensions of accelerator mechanical assemblies and values of all electrical components. Cost factors for machined parts, raw materials and components are applied to yield a total system cost. These costs are then plotted as a function of the two design choices to enable selection of an optimum design based on various criteria. The Induction Technology Optimization Study (ITOS) was undertaken to examine viable combinations of a linear induction accelerator and a relativistic klystron (RK) for high power microwave production. It is proposed that microwaves from the RK will power a high-gradient accelerator structure for linear collider development. Previous work indicates that the RK will require a nominal 3-MeV, 3-kA electron beam with a 100-ns flat top. The proposed accelerator-RK combination will be a high average power system capable of sustained microwave output at a 300-Hz pulse repetition frequency. The ITOS code models many combinations of injector, accelerator, and pulse power designs that will supply an RK with the beam parameters described above.

  4. Coded source neutron imaging

    SciTech Connect

    Bingham, Philip R; Santos-Villalobos, Hector J

    2011-01-01

    Coded aperture techniques have been applied to neutron radiography to address limitations in neutron flux and resolution of neutron detectors in a system labeled coded source imaging (CSI). By coding the neutron source, a magnified imaging system is designed with small spot size aperture holes (10 and 100 μm) for improved resolution beyond the detector limits and with many holes in the aperture (50% open) to account for flux losses due to the small pinhole size. An introduction to neutron radiography and coded aperture imaging is presented. A system design is developed for a CSI system with a development of equations for limitations on the system based on the coded image requirements and the neutron source characteristics of size and divergence. Simulation has been applied to the design using McStas to provide qualitative measures of performance with simulations of pinhole array objects followed by a quantitative measure through simulation of a tilted edge and calculation of the modulation transfer function (MTF) from the line spread function. MTF results for both 100 μm and 10 μm aperture hole diameters show resolutions matching the hole diameters.
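    The last step mentioned above, computing an MTF from a line spread function, is standard image-quality practice and can be sketched as follows. The Gaussian blur width and the sampling grid are illustrative assumptions, and this sketch omits the sub-pixel edge binning that a real tilted-edge analysis performs.

```python
import numpy as np
from math import erf, sqrt

# Edge spread function (ESF): an ideal step edge blurred by a Gaussian system.
x = np.linspace(-10.0, 10.0, 201)          # sample positions across the edge
sigma = 1.5                                 # assumed system blur width
esf = np.array([0.5 * (1.0 + erf(xi / (sigma * sqrt(2.0)))) for xi in x])

lsf = np.gradient(esf, x)                   # differentiate ESF -> line spread function
lsf /= lsf.sum()                            # normalize LSF area

# MTF is the normalized magnitude of the Fourier transform of the LSF.
freqs = np.fft.rfftfreq(lsf.size, d=x[1] - x[0])
mtf = np.abs(np.fft.rfft(lsf))
mtf /= mtf[0]                               # MTF(0) = 1 by convention
```

    For a Gaussian blur the result should track the analytic MTF, exp(-2 π² σ² f²), so the sketch doubles as a self-check of the procedure.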

  5. Code query by example

    NASA Astrophysics Data System (ADS)

    Vaucouleur, Sebastien

    2011-02-01

    We introduce code query by example for customisation of evolvable software products in general and of enterprise resource planning systems (ERPs) in particular. The concept is based on an initial empirical study on practices around ERP systems. We motivate our design choices based on those empirical results, and we show how the proposed solution helps with respect to the infamous upgrade problem: the conflict between the need for customisation and the need for upgrade of ERP systems. We further show how code query by example can be used as a form of lightweight static analysis, to detect automatically potential defects in large software products. Code query by example as a form of lightweight static analysis is particularly interesting in the context of ERP systems: it is often the case that programmers working in this field are not computer science specialists but more of domain experts. Hence, they require a simple language to express custom rules.

  6. Adaptation and visual coding

    PubMed Central

    Webster, Michael A.

    2011-01-01

    Visual coding is a highly dynamic process and continuously adapting to the current viewing context. The perceptual changes that result from adaptation to recently viewed stimuli remain a powerful and popular tool for analyzing sensory mechanisms and plasticity. Over the last decade, the footprints of this adaptation have been tracked to both higher and lower levels of the visual pathway and over a wider range of timescales, revealing that visual processing is much more adaptable than previously thought. This work has also revealed that the pattern of aftereffects is similar across many stimulus dimensions, pointing to common coding principles in which adaptation plays a central role. However, why visual coding adapts has yet to be fully answered. PMID:21602298

  7. FAA Smoke Transport Code

    2006-10-27

    FAA Smoke Transport Code, a physics-based Computational Fluid Dynamics tool, which couples heat, mass, and momentum transfer, has been developed to provide information on smoke transport in cargo compartments with various geometries and flight conditions. The software package contains a graphical user interface for specification of geometry and boundary conditions, an analysis module for solving the governing equations, and a post-processing tool. The current code was produced by making substantial improvements and additions to a code obtained from a university. The original code was able to compute steady, uniform, isothermal turbulent pressurization. In addition, a preprocessor and postprocessor were added to arrive at the current software package.

  8. Seals Code Development Workshop

    NASA Technical Reports Server (NTRS)

    Hendricks, Robert C. (Compiler); Liang, Anita D. (Compiler)

    1996-01-01

    The 1995 Seals Workshop industrial code (INDSEAL) release includes ICYL, GCYLT, IFACE, GFACE, SPIRALG, SPIRALI, DYSEAL, and KTK. The scientific code (SCISEAL) release includes conjugate heat transfer and multidomain with rotordynamic capability. Several seals and bearings codes (e.g., HYDROFLEX, HYDROTRAN, HYDROB3D, FLOWCON1, FLOWCON2) are presented and results compared. Current computational and experimental emphasis includes multiple connected cavity flows with goals of reducing parasitic losses and gas ingestion. Labyrinth seals continue to play a significant role in sealing, with face, honeycomb, and new sealing concepts under investigation for advanced engine concepts in view of strict environmental constraints. The clean sheet approach to engine design is advocated with program directions and anticipated percentage SFC reductions cited. Future activities center on engine applications with coupled seal/power/secondary flow streams.

  9. Autocatalysis, information and coding.

    PubMed

    Wills, P R

    2001-01-01

    Autocatalytic self-construction in macromolecular systems requires the existence of a reflexive relationship between structural components and the functional operations they perform to synthesise themselves. The possibility of reflexivity depends on formal, semiotic features of the catalytic structure-function relationship, that is, the embedding of catalytic functions in the space of polymeric structures. Reflexivity is a semiotic property of some genetic sequences. Such sequences may serve as the basis for the evolution of coding as a result of autocatalytic self-organisation in a population of assignment catalysts. Autocatalytic selection is a mechanism whereby matter becomes differentiated in primitive biochemical systems. In the case of coding self-organisation, it corresponds to the creation of symbolic information. Prions are present-day entities whose replication through autocatalysis reflects aspects of biological semiotics less obvious than genetic coding.

  10. Code inspection instructional validation

    NASA Technical Reports Server (NTRS)

    Orr, Kay; Stancil, Shirley

    1992-01-01

    The Shuttle Data Systems Branch (SDSB) of the Flight Data Systems Division (FDSD) at Johnson Space Center contracted with Southwest Research Institute (SwRI) to validate the effectiveness of an interactive video course on the code inspection process. The purpose of this project was to determine if this course could be effective for teaching NASA analysts the process of code inspection. In addition, NASA was interested in the effectiveness of this unique type of instruction (Digital Video Interactive), for providing training on software processes. This study found the Carnegie Mellon course, 'A Cure for the Common Code', effective for teaching the process of code inspection. In addition, analysts prefer learning with this method of instruction, or this method in combination with other methods. As is, the course is definitely better than no course at all; however, findings indicate changes are needed. Following are conclusions of this study. (1) The course is instructionally effective. (2) The simulation has a positive effect on student's confidence in his ability to apply new knowledge. (3) Analysts like the course and prefer this method of training, or this method in combination with current methods of training in code inspection, over the way training is currently being conducted. (4) Analysts responded favorably to information presented through scenarios incorporating full motion video. (5) Some course content needs to be changed. (6) Some content needs to be added to the course. SwRI believes this study indicates interactive video instruction combined with simulation is effective for teaching software processes. Based on the conclusions of this study, SwRI has outlined seven options for NASA to consider. SwRI recommends the option which involves creation of new source code and data files, but uses much of the existing content and design from the current course. Although this option involves a significant software development effort, SwRI believes this option

  11. Securing mobile code.

    SciTech Connect

    Link, Hamilton E.; Schroeppel, Richard Crabtree; Neumann, William Douglas; Campbell, Philip LaRoche; Beaver, Cheryl Lynn; Pierson, Lyndon George; Anderson, William Erik

    2004-10-01

    If software is designed so that the software can issue functions that will move that software from one computing platform to another, then the software is said to be 'mobile'. There are two general areas of security problems associated with mobile code. The 'secure host' problem involves protecting the host from malicious mobile code. The 'secure mobile code' problem, on the other hand, involves protecting the code from malicious hosts. This report focuses on the latter problem. We have found three distinct camps of opinions regarding how to secure mobile code. There are those who believe special distributed hardware is necessary, those who believe special distributed software is necessary, and those who believe neither is necessary. We examine all three camps, with a focus on the third. In the distributed software camp we examine some commonly proposed techniques including Java, D'Agents and Flask. For the specialized hardware camp, we propose a cryptographic technique for 'tamper-proofing' code over a large portion of the software/hardware life cycle by careful modification of current architectures. This method culminates by decrypting/authenticating each instruction within a physically protected CPU, thereby protecting against subversion by malicious code. Our main focus is on the camp that believes that neither specialized software nor hardware is necessary. We concentrate on methods of code obfuscation to render an entire program or a data segment on which a program depends incomprehensible. The hope is to prevent or at least slow down reverse engineering efforts and to prevent goal-oriented attacks on the software and execution. The field of obfuscation is still in a state of development with the central problem being the lack of a basis for evaluating the protection schemes. We give a brief introduction to some of the main ideas in the field, followed by an in depth analysis of a technique called 'white-boxing'. 
We put forth some new attacks and improvements

  12. Aeroacoustic Prediction Codes

    NASA Technical Reports Server (NTRS)

    Gliebe, P; Mani, R.; Shin, H.; Mitchell, B.; Ashford, G.; Salamah, S.; Connell, S.; Huff, Dennis (Technical Monitor)

    2000-01-01

    This report describes work performed on Contract NAS3-27720AoI 13 as part of the NASA Advanced Subsonic Transport (AST) Noise Reduction Technology effort. Computer codes were developed to provide quantitative prediction, design, and analysis capability for several aircraft engine noise sources. The objective was to provide improved, physics-based tools for exploration of noise-reduction concepts and understanding of experimental results. Methods and codes focused on fan broadband and 'buzz saw' noise and on low-emissions combustor noise, and complement work done by other contractors under the NASA AST program to develop methods and codes for fan harmonic tone noise and jet noise. The methods and codes developed and reported herein employ a wide range of approaches, from the strictly empirical to the completely computational, with some being semiempirical, analytical, and/or analytical/computational. Emphasis was on capturing the essential physics while still considering method or code utility as a practical design and analysis tool for everyday engineering use. Codes and prediction models were developed for: (1) an improved empirical correlation model for fan rotor exit flow mean and turbulence properties, for use in predicting broadband noise generated by rotor exit flow turbulence interaction with downstream stator vanes; (2) fan broadband noise models for rotor and stator/turbulence interaction sources including 3D effects, noncompact-source effects, directivity modeling, and extensions to the rotor supersonic tip-speed regime; (3) fan multiple-pure-tone in-duct sound pressure prediction methodology based on computational fluid dynamics (CFD) analysis; and (4) low-emissions combustor prediction methodology and computer code based on CFD and actuator disk theory. In addition, the relative importance of dipole and quadrupole source mechanisms was studied using direct CFD source computation for a simple cascade/gust interaction problem, and an empirical combustor

  13. Codes with Monotonic Codeword Lengths.

    ERIC Educational Resources Information Center

    Abrahams, Julia

    1994-01-01

    Discusses the minimum average codeword length coding under the constraint that the codewords are monotonically nondecreasing in length. Bounds on the average length of an optimal monotonic code are derived, and sufficient conditions are given such that algorithms for optimal alphabetic codes can be used to find the optimal monotonic code. (six…

  14. Accumulate Repeat Accumulate Coded Modulation

    NASA Technical Reports Server (NTRS)

    Abbasfar, Aliazam; Divsalar, Dariush; Yao, Kung

    2004-01-01

    In this paper we propose an innovative coded modulation scheme called 'Accumulate Repeat Accumulate Coded Modulation' (ARA coded modulation). This class of codes can be viewed as serial turbo-like codes, or as a subclass of Low Density Parity Check (LDPC) codes, combined with high-level modulation. Thus, at the decoder, belief propagation can be used for iterative decoding of ARA coded modulation on a graph, provided a demapper transforms the received in-phase and quadrature samples into bit reliabilities.

  15. Electrical Circuit Simulation Code

    SciTech Connect

    Wix, Steven D.; Waters, Arlon J.; Shirley, David

    2001-08-09

    Massively-Parallel Electrical Circuit Simulation Code. CHILESPICE is a massively-parallel distributed-memory electrical circuit simulation tool that contains many enhanced radiation, time-based, and thermal features and models. Large-scale electronic circuit simulation. Shared memory, parallel processing, enhanced convergence. Sandia-specific device models.

  16. The revised genetic code

    NASA Astrophysics Data System (ADS)

    Ninio, Jacques

    1990-03-01

    Recent findings on the genetic code are reviewed, including selenocysteine usage, deviations in the assignments of sense and nonsense codons, RNA editing, natural ribosomal frameshifts and non-orthodox codon-anticodon pairings. A multi-stage codon reading process is presented.
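    The codon reassignments reviewed above can be illustrated with a toy translator. This is a purely illustrative sketch, not from the review: the tiny codon table and the `sec_context` flag are simplified assumptions standing in for the real, context-dependent selenocysteine-insertion machinery.

```python
# Hypothetical mini-translator illustrating a codon reassignment: in the
# standard code UGA is a stop codon, but in a selenoprotein mRNA context
# it can be read as selenocysteine (Sec). The table covers only the
# codons used in the example below.
CODON_TABLE = {"AUG": "Met", "UUU": "Phe", "UGA": "Stop", "GGC": "Gly"}

def translate(mrna, sec_context=False):
    """Read codons three bases at a time; optionally reassign UGA to Sec."""
    table = dict(CODON_TABLE)
    if sec_context:
        table["UGA"] = "Sec"  # the reassignment modeled by this sketch
    peptide = []
    for i in range(0, len(mrna) - 2, 3):
        aa = table.get(mrna[i:i + 3], "?")
        if aa == "Stop":
            break
        peptide.append(aa)
    return peptide
```

With the same message, `translate("AUGUGAGGC")` terminates at UGA, while `translate("AUGUGAGGC", sec_context=True)` reads straight through it as Sec.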

  17. Dual Coding in Children.

    ERIC Educational Resources Information Center

    Burton, John K.; Wildman, Terry M.

    The purpose of this study was to test the applicability of the dual coding hypothesis to children's recall performance. The hypothesis predicts that visual interference will have a small effect on the recall of visually presented words or pictures, but that acoustic interference will cause a decline in recall of visually presented words and…

  18. Dress Codes and Uniforms.

    ERIC Educational Resources Information Center

    Lumsden, Linda; Miller, Gabriel

    2002-01-01

    Students do not always make choices that adults agree with when deciding what to wear to school. Dress-code issues are explored in this Research Roundup, and guidance is offered to principals seeking to maintain a positive school climate. In "Do School Uniforms Fit?" Kerry White discusses arguments for and against school uniforms and summarizes the…

  19. Code of Ethics.

    ERIC Educational Resources Information Center

    Association of College Unions-International, Bloomington, IN.

    The code of ethics for the college union and student activities professional is presented by the Association of College Unions-International. The preamble identifies the objectives of the college union as providing campus community centers and social programs that enhance the quality of life for members of the academic community. Ethics for…

  20. Odor Coding Sensor

    NASA Astrophysics Data System (ADS)

    Hayashi, Kenshi

    Odor is one of the important sensing parameters for human life. However, odor has not been quantified by measuring instruments because of its vagueness. In this paper, a method of measuring odor with odor coding, which uses vector quantities of plural odor molecular information, and its applications are described.

  1. Sharing the Code.

    ERIC Educational Resources Information Center

    Olsen, Florence

    2003-01-01

    Colleges and universities are beginning to consider collaborating on open-source-code projects as a way to meet critical software and computing needs. Points out the attractive features of noncommercial open-source software and describes some examples in use now, especially for the creation of Web infrastructure. (SLD)

  2. Building Codes and Regulations.

    ERIC Educational Resources Information Center

    Fisher, John L.

    The hazard of fire is of great concern to libraries due to combustible books and new plastics used in construction and interiors. Building codes and standards can offer architects and planners guidelines to follow but these standards should be closely monitored, updated, and researched for fire prevention. (DS)

  3. Code Optimization Techniques

    SciTech Connect

    MAGEE,GLEN I.

    2000-08-03

    Computers transfer data in a number of different ways. Whether through a serial port, a parallel port, over a modem, over an ethernet cable, or internally from a hard disk to memory, some data may be lost or corrupted. To compensate for that loss, numerous error detection and correction algorithms have been developed. One of the most common error correction codes is the Reed-Solomon code, which is a special subset of BCH (Bose-Chaudhuri-Hocquenghem) linear cyclic block codes. In the AURA project, an unmanned aircraft sends the data it collects back to Earth so it can be analyzed during flight and possible flight modifications made. To counter possible data corruption during transmission, the data is encoded using a multi-block Reed-Solomon implementation with a possibly shortened final block. In order to maximize the amount of data transmitted, it was necessary to reduce the computation time of a Reed-Solomon encoding to three percent of the processor's time. To achieve such a reduction, many code optimization techniques were employed. This paper outlines the steps taken to reduce the processing time of a Reed-Solomon encoding and the insight into modern optimization techniques gained from the experience.
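    The abstract does not spell out which optimizations were used, but one classic Reed-Solomon speed-up gives the flavor of such work: replacing bitwise finite-field multiplication with exponent/log lookup tables. This is an illustrative sketch only (the primitive polynomial 0x11D is an assumed, commonly used choice, not necessarily the AURA project's).

```python
# Illustrative GF(2^8) arithmetic sketch: a table-lookup multiply replaces
# an 8-step shift/XOR loop with two table loads and one integer add.
PRIM = 0x11D  # a common primitive polynomial for GF(2^8); assumption

# Build exponent and log tables once at startup. EXP is doubled in length
# so gf_mul_fast can skip the "mod 255" reduction on the summed logs.
EXP = [0] * 512
LOG = [0] * 256
x = 1
for i in range(255):
    EXP[i] = x
    LOG[x] = i
    x <<= 1
    if x & 0x100:
        x ^= PRIM
for i in range(255, 512):
    EXP[i] = EXP[i - 255]

def gf_mul_slow(a, b):
    """Bitwise 'Russian peasant' multiply in GF(2^8); correct but slow."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        if a & 0x100:
            a ^= PRIM
        b >>= 1
    return r

def gf_mul_fast(a, b):
    """Table-lookup multiply, equivalent to gf_mul_slow for all inputs."""
    if a == 0 or b == 0:
        return 0
    return EXP[LOG[a] + LOG[b]]
```

Since a Reed-Solomon encoder performs this multiply in its inner loop for every message symbol, removing the per-bit loop is exactly the kind of change that drives large reductions in encoding time.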

  4. The Redox Code

    PubMed Central

    Jones, Dean P.

    2015-01-01

    Abstract Significance: The redox code is a set of principles that defines the positioning of the nicotinamide adenine dinucleotide (NAD, NADP) and thiol/disulfide and other redox systems as well as the thiol redox proteome in space and time in biological systems. The code is richly elaborated in an oxygen-dependent life, where activation/deactivation cycles involving O2 and H2O2 contribute to spatiotemporal organization for differentiation, development, and adaptation to the environment. Disruption of this organizational structure during oxidative stress represents a fundamental mechanism in system failure and disease. Recent Advances: Methodology in assessing components of the redox code under physiological conditions has progressed, permitting insight into spatiotemporal organization and allowing for identification of redox partners in redox proteomics and redox metabolomics. Critical Issues: Complexity of redox networks and redox regulation is being revealed step by step, yet much still needs to be learned. Future Directions: Detailed knowledge of the molecular patterns generated from the principles of the redox code under defined physiological or pathological conditions in cells and organs will contribute to understanding the redox component in health and disease. Ultimately, there will be a scientific basis to a modern redox medicine. Antioxid. Redox Signal. 23, 734–746. PMID:25891126

  5. Code of Ethics.

    ERIC Educational Resources Information Center

    American Sociological Association, Washington, DC.

    The American Sociological Association's code of ethics for sociologists is presented. For sociological research and practice, 10 requirements for ethical behavior are identified, including: maintaining objectivity and integrity; fully reporting findings and research methods, without omission of significant data; reporting fully all sources of…

  6. Binary coding for hyperspectral imagery

    NASA Astrophysics Data System (ADS)

    Wang, Jing; Chang, Chein-I.; Chang, Chein-Chi; Lin, Chinsu

    2004-10-01

    Binary coding is one of the simplest ways to characterize spectral features. One commonly used method is a binary coding-based image software system, called Spectral Analysis Manager (SPAM) for remotely sensed imagery developed by Mazer et al. For a given spectral signature, the SPAM calculates its spectral mean and inter-band spectral difference and uses them as thresholds to generate a binary code word for this particular spectral signature. Such a coding scheme is generally effective and also very simple to implement. This paper revisits the SPAM and further develops three new SPAM-based binary coding methods, called equal probability partition (EPP) binary coding, halfway partition (HP) binary coding, and median partition (MP) binary coding. These three binary coding methods along with the SPAM will be evaluated for spectral discrimination and identification. In doing so, a new criterion, called a posteriori discrimination probability (APDP), is also introduced for performance measure.
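    A minimal sketch of the mean-threshold idea behind such coding follows. It is a simplification, not the actual SPAM algorithm: the full scheme also thresholds inter-band spectral differences, and the helper names here are invented for illustration.

```python
def binary_code(signature):
    """Simplified SPAM-style coding: a band gets a 1 if it is at or above
    the signature's spectral mean, else 0. (The real SPAM scheme also
    encodes inter-band differences.)"""
    mean = sum(signature) / len(signature)
    return [1 if v >= mean else 0 for v in signature]

def hamming(code_a, code_b):
    """Hamming distance between two code words; small distance suggests
    similar spectral shape, which is the basis for discrimination."""
    return sum(a != b for a, b in zip(code_a, code_b))
```

For example, `binary_code([0.1, 0.9, 0.5, 0.2])` has spectral mean 0.425 and yields the code word `[0, 1, 1, 0]`; comparing code words by Hamming distance gives a cheap first-pass spectral match.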

  7. Finite Element Analysis Code

    2006-03-08

    MAPVAR-KD is designed to transfer solution results from one finite element mesh to another. MAPVAR-KD draws heavily from the structure and coding of MERLIN II, but it employs a new finite element data base, EXODUS II, and offers enhanced speed and new capabilities not available in MERLIN II. In keeping with the MERLIN II documentation, the computational algorithms used in MAPVAR-KD are described. User instructions are presented. Example problems are included to demonstrate the operation of the code and the effects of various input options. MAPVAR-KD is a modification of MAPVAR in which the search algorithm was replaced by a kd-tree-based search for better performance on large problems.

  8. The NIMROD Code

    NASA Astrophysics Data System (ADS)

    Schnack, D. D.; Glasser, A. H.

    1996-11-01

    NIMROD is a new code system that is being developed for the analysis of modern fusion experiments. It is being designed from the beginning to make the maximum use of massively parallel computer architectures and computer graphics. The NIMROD physics kernel solves the three-dimensional, time-dependent two-fluid equations with neo-classical effects in toroidal geometry of arbitrary poloidal cross section. The NIMROD system also includes a pre-processor, a grid generator, and a post-processor. User interaction with NIMROD is facilitated by a modern graphical user interface (GUI). The NIMROD project is using Quality Function Deployment (QFD) team management techniques to minimize re-engineering and reduce code development time. This paper gives an overview of the NIMROD project. Operation of the GUI is demonstrated, and the first results from the physics kernel are given.

  9. Finite Element Analysis Code

    SciTech Connect

    Sjaardema, G.; Wellman, G.; Gartling, D.

    2006-03-08

    MAPVAR-KD is designed to transfer solution results from one finite element mesh to another. MAPVAR-KD draws heavily from the structure and coding of MERLIN II, but it employs a new finite element data base, EXODUS II, and offers enhanced speed and new capabilities not available in MERLIN II. In keeping with the MERLIN II documentation, the computational algorithms used in MAPVAR-KD are described. User instructions are presented. Example problems are included to demonstrate the operation of the code and the effects of various input options. MAPVAR-KD is a modification of MAPVAR in which the search algorithm was replaced by a kd-tree-based search for better performance on large problems.

  10. Confocal coded aperture imaging

    DOEpatents

    Tobin, Jr., Kenneth William; Thomas, Jr., Clarence E.

    2001-01-01

    A method for imaging a target volume comprises the steps of: radiating a small bandwidth of energy toward the target volume; focusing the small bandwidth of energy into a beam; moving the target volume through a plurality of positions within the focused beam; collecting a beam of energy scattered from the target volume with a non-diffractive confocal coded aperture; generating a shadow image of said aperture from every point source of radiation in the target volume; and, reconstructing the shadow image into a 3-dimensional image of the every point source by mathematically correlating the shadow image with a digital or analog version of the coded aperture. The method can comprise the step of collecting the beam of energy scattered from the target volume with a Fresnel zone plate.

  11. Sinusoidal transform coding

    NASA Technical Reports Server (NTRS)

    Mcaulay, Robert J.; Quatieri, Thomas F.

    1988-01-01

    It has been shown that an analysis/synthesis system based on a sinusoidal representation of speech leads to synthetic speech that is essentially perceptually indistinguishable from the original. Strategies for coding the amplitudes, frequencies and phases of the sine waves have been developed that have led to a multirate coder operating at rates from 2400 to 9600 bps. The encoded speech is highly intelligible at all rates with a uniformly improving quality as the data rate is increased. A real-time fixed-point implementation has been developed using two ADSP2100 DSP chips. The methods used for coding and quantizing the sine-wave parameters for operation at the various frame rates are described.
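    The analysis/synthesis idea above can be sketched with a simple DFT peak picker. This is not the Lincoln Laboratory implementation (which uses refined peak tracking and parameter quantization); the function names and the plain-DFT approach are illustrative assumptions.

```python
import cmath
import math

def sinusoidal_analysis(frame, k):
    """Pick the k strongest DFT bins of a frame as (bin, amplitude, phase)
    triples -- the sine-wave parameters a sinusoidal coder would quantize
    and transmit."""
    n = len(frame)
    spectrum = [sum(frame[t] * cmath.exp(-2j * math.pi * b * t / n)
                    for t in range(n)) for b in range(n // 2 + 1)]
    bins = sorted(range(len(spectrum)), key=lambda b: -abs(spectrum[b]))[:k]
    return [(b, 2 * abs(spectrum[b]) / n, cmath.phase(spectrum[b]))
            for b in bins]

def sinusoidal_synthesis(params, n):
    """Rebuild the frame as a sum of the coded sine waves."""
    return [sum(a * math.cos(2 * math.pi * b * t / n + ph)
                for b, a, ph in params) for t in range(n)]
```

A frame consisting of a single sinusoid survives a round trip through analysis and synthesis essentially unchanged; real speech needs many peaks per frame plus frame-to-frame tracking, which is where the coding effort described in the abstract goes.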

  12. Finite Element Analysis Code

    SciTech Connect

    Forsythe, C.; Smith, M.; Sjaardema, G.

    2005-06-26

    Exotxt is an analysis code that reads finite element results data stored in an exodusII file and generates a file in a structured text format. The text file can be edited or modified via a number of text formatting tools. Exotxt is used by analysts to translate data from the binary exodusII format into a structured text format which can then be edited or modified and then either translated back to exodusII format or to another format.

  13. Status of MARS Code

    SciTech Connect

    N.V. Mokhov

    2003-04-09

    Status and recent developments of the MARS 14 Monte Carlo code system for simulation of hadronic and electromagnetic cascades in shielding, accelerator, and detector components in the energy range from a fraction of an electronvolt up to 100 TeV are described. These include physics models in both the strong and electromagnetic interaction sectors, variance reduction techniques, residual dose, geometry, tracking, and histogramming, as well as the MAD-MARS Beam Line Build and graphical user interface.

  14. Reeds computer code

    NASA Technical Reports Server (NTRS)

    Bjork, C.

    1981-01-01

    The REEDS (rocket exhaust effluent diffusion single layer) computer code is used for the estimation of certain rocket exhaust effluent concentrations and dosages and their distributions near the Earth's surface following a rocket launch event. Output from REEDS is used in producing near real time air quality and environmental assessments of the effects of certain potentially harmful effluents, namely HCl, Al2O3, CO, and NO.

  15. Bar coded retroreflective target

    SciTech Connect

    Vann, C.S.

    2000-01-25

    This small, inexpensive, non-contact laser sensor can detect the location of a retroreflective target in a relatively large volume and up to six degrees of position. The tracker's laser beam is formed into a plane of light which is swept across the space of interest. When the beam illuminates the retroreflector, some of the light returns to the tracker. The intensity, angle, and time of the return beam is measured to calculate the three dimensional location of the target. With three retroreflectors on the target, the locations of three points on the target are measured, enabling the calculation of all six degrees of target position. Until now, devices for three-dimensional tracking of objects in a large volume have been heavy, large, and very expensive. Because of the simplicity and unique characteristics of this tracker, it is capable of three-dimensional tracking of one to several objects in a large volume, yet it is compact, light-weight, and relatively inexpensive. Alternatively, a tracker produces a diverging laser beam which is directed towards a fixed position, and senses when a retroreflective target enters the fixed field of view. An optically bar coded target can be read by the tracker to provide information about the target. The target can be formed of a ball lens with a bar code on one end. As the target moves through the field, the ball lens causes the laser beam to scan across the bar code.

  16. MELCOR computer code manuals

    SciTech Connect

    Summers, R.M.; Cole, R.K. Jr.; Smith, R.C.; Stuart, D.S.; Thompson, S.L.; Hodge, S.A.; Hyman, C.R.; Sanders, R.L.

    1995-03-01

    MELCOR is a fully integrated, engineering-level computer code that models the progression of severe accidents in light water reactor nuclear power plants. MELCOR is being developed at Sandia National Laboratories for the U.S. Nuclear Regulatory Commission as a second-generation plant risk assessment tool and the successor to the Source Term Code Package. A broad spectrum of severe accident phenomena in both boiling and pressurized water reactors is treated in MELCOR in a unified framework. These include: thermal-hydraulic response in the reactor coolant system, reactor cavity, containment, and confinement buildings; core heatup, degradation, and relocation; core-concrete attack; hydrogen production, transport, and combustion; fission product release and transport; and the impact of engineered safety features on thermal-hydraulic and radionuclide behavior. Current uses of MELCOR include estimation of severe accident source terms and their sensitivities and uncertainties in a variety of applications. This publication of the MELCOR computer code manuals corresponds to MELCOR 1.8.3, released to users in August, 1994. Volume 1 contains a primer that describes MELCOR's phenomenological scope, organization (by package), and documentation. The remainder of Volume 1 contains the MELCOR Users Guides, which provide the input instructions and guidelines for each package. Volume 2 contains the MELCOR Reference Manuals, which describe the phenomenological models that have been implemented in each package.

  17. Bar coded retroreflective target

    DOEpatents

    Vann, Charles S.

    2000-01-01

    This small, inexpensive, non-contact laser sensor can detect the location of a retroreflective target in a relatively large volume and up to six degrees of position. The tracker's laser beam is formed into a plane of light which is swept across the space of interest. When the beam illuminates the retroreflector, some of the light returns to the tracker. The intensity, angle, and time of the return beam is measured to calculate the three dimensional location of the target. With three retroreflectors on the target, the locations of three points on the target are measured, enabling the calculation of all six degrees of target position. Until now, devices for three-dimensional tracking of objects in a large volume have been heavy, large, and very expensive. Because of the simplicity and unique characteristics of this tracker, it is capable of three-dimensional tracking of one to several objects in a large volume, yet it is compact, light-weight, and relatively inexpensive. Alternatively, a tracker produces a diverging laser beam which is directed towards a fixed position, and senses when a retroreflective target enters the fixed field of view. An optically bar coded target can be read by the tracker to provide information about the target. The target can be formed of a ball lens with a bar code on one end. As the target moves through the field, the ball lens causes the laser beam to scan across the bar code.

  18. Orthopedics coding and funding.

    PubMed

    Baron, S; Duclos, C; Thoreux, P

    2014-02-01

    The French tarification à l'activité (T2A) prospective payment system is a financial system in which a health-care institution's resources are based on performed activity. Activity is described via the PMSI medical information system (programme de médicalisation du système d'information). The PMSI classifies hospital cases by clinical and economic categories known as diagnosis-related groups (DRG), each with an associated price tag. Coding a hospital case involves giving as realistic a description as possible so as to categorize it in the right DRG and thus ensure appropriate payment. For this, it is essential to understand what determines the pricing of inpatient stay: namely, the code for the surgical procedure, the patient's principal diagnosis (reason for admission), codes for comorbidities (everything that adds to management burden), and the management of the length of inpatient stay. The PMSI is used to analyze the institution's activity and dynamism: change on previous year, relation to target, and comparison with competing institutions based on indicators such as the mean length of stay performance indicator (MLS PI). The T2A system improves overall care efficiency. Quality of care, however, is not presently taken account of in the payment made to the institution, as there are no indicators for this; work needs to be done on this topic.

  19. Structural coding versus free-energy predictive coding.

    PubMed

    van der Helm, Peter A

    2016-06-01

    Focusing on visual perceptual organization, this article contrasts the free-energy (FE) version of predictive coding (a recent Bayesian approach) to structural coding (a long-standing representational approach). Both use free-energy minimization as metaphor for processing in the brain, but their formal elaborations of this metaphor are fundamentally different. FE predictive coding formalizes it by minimization of prediction errors, whereas structural coding formalizes it by minimization of the descriptive complexity of predictions. Here, both sides are evaluated. A conclusion regarding competence is that FE predictive coding uses a powerful modeling technique, but that structural coding has more explanatory power. A conclusion regarding performance is that FE predictive coding-though more detailed in its account of neurophysiological data-provides a less compelling cognitive architecture than that of structural coding, which, for instance, supplies formal support for the computationally powerful role it attributes to neuronal synchronization.

  20. Computer-Based Coding of Occupation Codes for Epidemiological Analyses.

    PubMed

    Russ, Daniel E; Ho, Kwan-Yuet; Johnson, Calvin A; Friesen, Melissa C

    2014-05-01

    Mapping job titles to standardized occupation classification (SOC) codes is an important step in evaluating changes in health risks over time as measured in inspection databases. However, manual SOC coding is cost prohibitive for very large studies. Computer based SOC coding systems can improve the efficiency of incorporating occupational risk factors into large-scale epidemiological studies. We present a novel method of mapping verbatim job titles to SOC codes using a large table of prior knowledge available in the public domain that included detailed description of the tasks and activities and their synonyms relevant to each SOC code. Job titles are compared to our knowledge base to find the closest matching SOC code. A soft Jaccard index is used to measure the similarity between a previously unseen job title and the knowledge base. Additional information such as standardized industrial codes can be incorporated to improve the SOC code determination by providing additional context to break ties in matches. PMID:25221787
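    The abstract does not define its soft Jaccard index, so the sketch below shows one common formulation of the idea: tokens count as shared when a string-similarity score clears a threshold, letting misspelled job titles still match. The use of `difflib.SequenceMatcher`, the 0.8 threshold, and the one-directional matching are all illustrative assumptions, not the paper's method.

```python
from difflib import SequenceMatcher

def token_sim(a, b):
    """Similarity in [0, 1] between two tokens; difflib's ratio is a
    stand-in for whatever string metric the actual system uses."""
    return SequenceMatcher(None, a, b).ratio()

def soft_jaccard(title_a, title_b, threshold=0.8):
    """One 'soft' Jaccard variant: a token counts as shared when its best
    match in the other title clears the threshold, so a verbatim title
    like 'computer programer' can still match 'computer programmer'."""
    ta = set(title_a.lower().split())
    tb = set(title_b.lower().split())
    union = ta | tb
    if not union:
        return 0.0
    matched = sum(1 for x in ta
                  if any(token_sim(x, y) >= threshold for y in tb))
    return matched / len(union)
```

Identical titles score 1.0, while a one-letter misspelling still yields a high score instead of the zero overlap an exact-match Jaccard would report; a coding system would then pick the SOC entry in the knowledge base with the highest score.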

  1. Preliminary Assessment of Turbomachinery Codes

    NASA Technical Reports Server (NTRS)

    Mazumder, Quamrul H.

    2007-01-01

    This report assesses different CFD codes developed and currently being used at Glenn Research Center to predict turbomachinery fluid flow and heat transfer behavior. This report will consider the following codes: APNASA, TURBO, GlennHT, H3D, and SWIFT. Each code will be described separately in the following section with its current modeling capabilities, level of validation, pre/post processing, and future development and validation requirements. This report addresses only previously published validations of the codes. However, the codes have been further developed to extend their capabilities.

  2. New quantum MDS-convolutional codes derived from constacyclic codes

    NASA Astrophysics Data System (ADS)

    Li, Fengwei; Yue, Qin

    2015-12-01

    In this paper, we utilize a family of Hermitian dual-containing constacyclic codes to construct classical and quantum MDS convolutional codes. Our classical and quantum convolutional codes are optimal in the sense that they attain the classical (quantum) generalized Singleton bound.

  3. Safety and efficacy of non-vitamin K oral anticoagulant treatment compared with warfarin in patients with non-valvular atrial fibrillation who develop acute ischemic stroke or transient ischemic attack: a multicenter prospective cohort study (daVinci study).

    PubMed

    Saji, Naoki; Kimura, Kazumi; Tateishi, Yohei; Fujimoto, Shigeru; Kaneko, Nobuyuki; Urabe, Takao; Tsujino, Akira; Iguchi, Yasuyuki

    2016-11-01

    The safety and efficacy of non-vitamin K oral anticoagulant (NOAC) compared with warfarin in treating patients with non-valvular atrial fibrillation (NVAF) who developed acute ischemic stroke or transient ischemic attack (AIS/TIA), particularly those receiving tissue-plasminogen activator (tPA) therapy, remains unclear. Between April 2012 and December 2014, we conducted a multicenter prospective cohort study to assess the current clinical practice for treating such patients. We divided the patients into two groups according to the administration of oral anticoagulants (warfarin or NOACs) and tPA therapy. The risk of any hemorrhagic or ischemic event was compared within 1 month after the onset of stroke. We analyzed 235 patients with AIS/TIA including 73 who received tPA therapy. Oral anticoagulants were initiated within 2-4 inpatient days. NOACs were administered to 49.8 % of patients, who were predominantly male, younger, had small infarcts, lower NIHSS scores, and had a lower all-cause mortality rate (0 vs. 4.2 %, P = 0.06) and a lower risk of any ischemic events (6.0 vs. 7.6 %, P = 0.797) compared with warfarin users. The prevalence of all hemorrhagic events was equivalent between the two groups. Early initiation of NOACs after tPA therapy appeared to lower the risk of hemorrhagic events, although there was no significant difference (0 vs. 5.6 %, P = 0.240). Although more clinicians are apt to prescribe NOACs in minor ischemic stroke, NOAC treatment may provide a potential benefit in such cases. Early initiation of NOACs after tPA therapy may reduce the risk of hemorrhagic events compared with warfarin.

  4. Reflections on Post-16 Strategies in European Countries. Interim Report of the Leonardo da Vinci/Multiplier Effect Project III.3.a. Priority 2: Forging Links between Educational Establishments and Enterprises (1997-2000) ID 27009. Working Papers, No. 9.

    ERIC Educational Resources Information Center

    Stenstrom, Marja-Leena, Ed.

    This four-part publication contains 19 papers on educational practices and promises for post-16 education in European countries. Part I, the introduction, contains these three papers: "Sharpening Post-16 Education Strategies: Building on the Results of the Previous Projects" (Johanna Lasonen); "'Parity of Esteem' and 'Integrated…

  5. Authorship Attribution of Source Code

    ERIC Educational Resources Information Center

    Tennyson, Matthew F.

    2013-01-01

    Authorship attribution of source code is the task of deciding who wrote a program, given its source code. Applications include software forensics, plagiarism detection, and determining software ownership. A number of methods for the authorship attribution of source code have been presented in the past. A review of those existing methods is…

  6. Energy Codes and Standards: Facilities

    SciTech Connect

    Bartlett, Rosemarie; Halverson, Mark A.; Shankle, Diana L.

    2007-01-01

    Energy codes and standards play a vital role in the marketplace by setting minimum requirements for energy-efficient design and construction. They outline uniform requirements for new buildings as well as additions and renovations. This article covers basic knowledge of codes and standards; development processes of each; adoption, implementation, and enforcement of energy codes and standards; and voluntary energy efficiency programs.

  7. Coding Issues in Grounded Theory

    ERIC Educational Resources Information Center

    Moghaddam, Alireza

    2006-01-01

    This paper discusses grounded theory as one of the qualitative research designs. It describes how grounded theory generates from data. Three phases of grounded theory--open coding, axial coding, and selective coding--are discussed, along with some of the issues which are the source of debate among grounded theorists, especially between its…

  8. Finite Element Analysis Code

    2005-06-26

    Exotxt is an analysis code that reads finite element results data stored in an exodusII file and generates a file in a structured text format. The text file can be edited or modified with a number of text-formatting tools. Exotxt is used by analysts to translate data from the binary exodusII format into a structured text format, which can then be edited or modified and then translated back to exodusII format or to another format.

  9. Finite Element Analysis Code

    SciTech Connect

    Sjaardema, G.; Forsythe, C.

    2005-05-07

    CONEX is a code for sequentially joining in time multiple exodusII database files that all represent the same base mesh topology and geometry. It is used to create a single results or restart file from multiple results or restart files, which typically arise from multiple restarted analyses. CONEX is used to postprocess the results from a series of finite element analyses; it joins the data from multiple results databases into a single database, which makes the results easier to postprocess.

  10. Finite Element Analysis Code

    2005-05-07

    CONEX is a code for sequentially joining in time multiple exodusII database files that all represent the same base mesh topology and geometry. It is used to create a single results or restart file from multiple results or restart files, which typically arise from multiple restarted analyses. CONEX is used to postprocess the results from a series of finite element analyses; it joins the data from multiple results databases into a single database, which makes the results easier to postprocess.

  11. New quantum codes constructed from quaternary BCH codes

    NASA Astrophysics Data System (ADS)

    Xu, Gen; Li, Ruihu; Guo, Luobin; Ma, Yuena

    2016-10-01

    In this paper, we first study the construction of new quantum error-correcting codes (QECCs) from three classes of quaternary imprimitive BCH codes. As a result, the improved maximal designed distance of these narrow-sense imprimitive Hermitian dual-containing quaternary BCH codes is determined to be much larger than the result given by Aly et al. (IEEE Trans Inf Theory 53:1183-1188, 2007) for each code length. Thus, families of new QECCs are obtained, and the constructed QECCs have larger distance than those in the previous literature. Second, we apply a combinatorial construction to the imprimitive BCH codes and their primitive counterparts and construct many new linear quantum codes with good parameters, some of which have parameters exceeding the finite Gilbert-Varshamov bound for linear quantum codes.

  12. Low Density Parity Check Codes: Bandwidth Efficient Channel Coding

    NASA Technical Reports Server (NTRS)

    Fong, Wai; Lin, Shu; Maki, Gary; Yeh, Pen-Shu

    2003-01-01

    Low-density parity-check (LDPC) codes provide near-Shannon-capacity performance for NASA missions. These codes have high coding rates, R = 0.82 and 0.875, with moderate code lengths, n = 4096 and 8176. Their decoders have inherently parallel structures, which allows for high-speed implementation. Two codes based on Euclidean geometry (EG) were selected for flight ASIC implementation. These codes are cyclic and quasi-cyclic in nature and therefore have a simple encoder structure, which yields power and size benefits. These codes also have a large minimum distance, as large as d_min = 65, giving them powerful error-correcting capabilities and very low error floors. This paper presents the development of the LDPC flight encoder and decoder, its applications, and its status.
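
    The defining property of any LDPC code, including the EG-based codes above, is a sparse parity-check matrix H: a word c is a codeword exactly when H·c = 0 (mod 2). The toy (7,3) check matrix below is an illustrative assumption, not one of the NASA flight codes.

```python
# Toy illustration of the parity-check principle behind LDPC codes:
# a word is a valid codeword iff its syndrome H*c (mod 2) is all zero.
# This small H is made up for illustration, not a flight code.
H = [
    [1, 1, 1, 0, 1, 0, 0],
    [1, 1, 0, 1, 0, 1, 0],
    [1, 0, 1, 1, 0, 0, 1],
]

def syndrome(word):
    """Compute the mod-2 syndrome of `word` against H; all-zero means valid."""
    return [sum(h * b for h, b in zip(row, word)) % 2 for row in H]

valid = [0, 0, 0, 0, 0, 0, 0]        # the all-zero word is always a codeword
corrupted = [1, 0, 0, 0, 0, 0, 0]    # a single flipped bit
print(syndrome(valid))       # [0, 0, 0]
print(syndrome(corrupted))   # [1, 1, 1] -- nonzero syndrome flags the error
```

    A belief-propagation decoder would iterate on exactly these sparse checks, which is what makes the hardware inherently parallel.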

  13. New quantum codes constructed from quaternary BCH codes

    NASA Astrophysics Data System (ADS)

    Xu, Gen; Li, Ruihu; Guo, Luobin; Ma, Yuena

    2016-07-01

    In this paper, we firstly study construction of new quantum error-correcting codes (QECCs) from three classes of quaternary imprimitive BCH codes. As a result, the improved maximal designed distance of these narrow-sense imprimitive Hermitian dual-containing quaternary BCH codes are determined to be much larger than the result given according to Aly et al. (IEEE Trans Inf Theory 53:1183-1188, 2007) for each different code length. Thus, families of new QECCs are newly obtained, and the constructed QECCs have larger distance than those in the previous literature. Secondly, we apply a combinatorial construction to the imprimitive BCH codes with their corresponding primitive counterpart and construct many new linear quantum codes with good parameters, some of which have parameters exceeding the finite Gilbert-Varshamov bound for linear quantum codes.

  14. Measuring Diagnoses: ICD Code Accuracy

    PubMed Central

    O'Malley, Kimberly J; Cook, Karon F; Price, Matt D; Wildes, Kimberly Raiford; Hurdle, John F; Ashton, Carol M

    2005-01-01

    Objective To examine potential sources of errors at each step of the described inpatient International Classification of Diseases (ICD) coding process. Data Sources/Study Setting The use of disease codes from the ICD has expanded from classifying morbidity and mortality information for statistical purposes to diverse sets of applications in research, health care policy, and health care finance. By describing a brief history of ICD coding, detailing the process for assigning codes, identifying where errors can be introduced into the process, and reviewing methods for examining code accuracy, we help code users more systematically evaluate code accuracy for their particular applications. Study Design/Methods We summarize the inpatient ICD diagnostic coding process from patient admission to diagnostic code assignment. We examine potential sources of errors at each step and offer code users a tool for systematically evaluating code accuracy. Principal Findings Main error sources along the “patient trajectory” include amount and quality of information at admission, communication among patients and providers, the clinician's knowledge and experience with the illness, and the clinician's attention to detail. Main error sources along the “paper trail” include variance in the electronic and written records, coder training and experience, facility quality-control efforts, and unintentional and intentional coder errors, such as misspecification, unbundling, and upcoding. Conclusions By clearly specifying the code assignment process and heightening their awareness of potential error sources, code users can better evaluate the applicability and limitations of codes for their particular situations. ICD codes can then be used in the most appropriate ways. PMID:16178999

  15. Two-terminal video coding.

    PubMed

    Yang, Yang; Stanković, Vladimir; Xiong, Zixiang; Zhao, Wei

    2009-03-01

    Following recent works on the rate region of the quadratic Gaussian two-terminal source coding problem and limit-approaching code designs, this paper examines multiterminal source coding of two correlated, i.e., stereo, video sequences to save the sum rate over independent coding of both sequences. Two multiterminal video coding schemes are proposed. In the first scheme, the left sequence of the stereo pair is coded by H.264/AVC and used at the joint decoder to facilitate Wyner-Ziv coding of the right video sequence. The first I-frame of the right sequence is successively coded by H.264/AVC Intracoding and Wyner-Ziv coding. An efficient stereo matching algorithm based on loopy belief propagation is then adopted at the decoder to produce pixel-level disparity maps between the corresponding frames of the two decoded video sequences on the fly. Based on the disparity maps, side information for both motion vectors and motion-compensated residual frames of the right sequence is generated at the decoder before Wyner-Ziv encoding. In the second scheme, source splitting is employed on top of classic and Wyner-Ziv coding for compression of both I-frames to allow flexible rate allocation between the two sequences. Experiments with both schemes on stereo video sequences using H.264/AVC, LDPC codes for Slepian-Wolf coding of the motion vectors, and scalar quantization in conjunction with LDPC codes for Wyner-Ziv coding of the residual coefficients give a slightly lower sum rate than separate H.264/AVC coding of both sequences at the same video quality.

  16. Genetic code for sine

    NASA Astrophysics Data System (ADS)

    Abdullah, Alyasa Gan; Wah, Yap Bee

    2015-02-01

    The computation of approximate values of the trigonometric sine was discovered by Bhaskara I (c. 600-c. 680), a seventh-century Indian mathematician, and is known as Bhaskara I's sine approximation formula. The formula is given in his treatise titled Mahabhaskariya. In the 14th century, Madhava of Sangamagrama, a Kerala mathematician-astronomer, constructed a table of trigonometric sines of various angles. Madhava's table gives the measure of angles in arcminutes, arcseconds, and sixtieths of an arcsecond. The search for more accurate formulas led to the discovery of the power series expansion by Madhava of Sangamagrama (c. 1350-c. 1425), the founder of the Kerala school of astronomy and mathematics. In 1715, the Taylor series was introduced by Brook Taylor, an English mathematician. If the Taylor series is centered at zero, it is called a Maclaurin series, named after the Scottish mathematician Colin Maclaurin. Some of the important Maclaurin series expansions include the trigonometric functions. This paper introduces the genetic code of the sine of an angle without using a power series expansion. The genetic code using the square root approach reveals the pattern in the signs (plus, minus) and the sequence of numbers in the sine of an angle. The square root approach complements the Pythagorean method, provides a better understanding of calculating an angle, and will be useful for teaching the concepts of angles in trigonometry.
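
    Bhaskara I's approximation mentioned above has a compact closed form; in radians it reads sin(x) ≈ 16x(π − x) / (5π² − 4x(π − x)) for 0 ≤ x ≤ π. The sketch below compares it against the true sine (the paper's own square-root "genetic code" method is different and is not reproduced here).

```python
import math

# Bhaskara I's 7th-century sine approximation, radian form:
#   sin(x) ~= 16x(pi - x) / (5*pi^2 - 4x(pi - x)),  0 <= x <= pi
def bhaskara_sin(x):
    return 16 * x * (math.pi - x) / (5 * math.pi**2 - 4 * x * (math.pi - x))

# The formula is exact at 0, 30, 90, 150, and 180 degrees, and its
# maximum absolute error on [0, pi] is below 0.002.
for deg in (0, 30, 60, 90):
    x = math.radians(deg)
    print(deg, round(bhaskara_sin(x), 4), round(math.sin(x), 4))
```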

  17. FAST GYROSYNCHROTRON CODES

    SciTech Connect

    Fleishman, Gregory D.; Kuznetsov, Alexey A.

    2010-10-01

    Radiation produced by charged particles gyrating in a magnetic field is highly significant in the astrophysics context. Persistently increasing resolution of astrophysical observations calls for corresponding three-dimensional modeling of the radiation. However, available exact equations are prohibitively slow in computing a comprehensive table of high-resolution models required for many practical applications. To remedy this situation, we develop approximate gyrosynchrotron (GS) codes capable of quickly calculating the GS emission (in non-quantum regime) from both isotropic and anisotropic electron distributions in non-relativistic, mildly relativistic, and ultrarelativistic energy domains applicable throughout a broad range of source parameters including dense or tenuous plasmas and weak or strong magnetic fields. The computation time is reduced by several orders of magnitude compared with the exact GS algorithm. The new algorithm performance can gradually be adjusted to the user's needs depending on whether precision or computation speed is to be optimized for a given model. The codes are made available for users as a supplement to this paper.

  18. New optimal quantum convolutional codes

    NASA Astrophysics Data System (ADS)

    Zhu, Shixin; Wang, Liqi; Kai, Xiaoshan

    2015-04-01

    One of the greatest challenges in proving the feasibility of quantum computers is protecting the quantum nature of information. Quantum convolutional codes are aimed at protecting a stream of quantum information in long-distance communication; they are the correct generalization to the quantum domain of their classical analogs. In this paper, we construct some classes of quantum convolutional codes by employing classical constacyclic codes. These codes are optimal in the sense that they attain the Singleton bound for pure convolutional stabilizer codes.

  19. Circular codes, symmetries and transformations.

    PubMed

    Fimmel, Elena; Giannerini, Simone; Gonzalez, Diego Luis; Strüngmann, Lutz

    2015-06-01

    Circular codes, putative remnants of primeval comma-free codes, have gained considerable attention in recent years. In fact, they represent a second kind of genetic code, potentially involved in detecting and maintaining the normal reading frame in protein-coding sequences. The discovery of a universal code across species has raised many theoretical and experimental questions. However, there is a key aspect relating circular codes to symmetries and transformations that remains largely unexplored. In this article we address the issue by studying the symmetries and transformations that connect different circular codes. The main result is that the class of 216 C3 maximal self-complementary codes can be partitioned into 27 equivalence classes defined by a particular set of transformations. We show that such transformations can be put in a group-theoretic framework with an intuitive geometric interpretation. More general mathematical results about symmetry transformations, valid for any kind of circular code, are also presented. Our results pave the way to the study of the biological consequences of the mathematical structure behind circular codes and contribute to shedding light on the evolutionary steps that led to the observed symmetries of present codes. PMID:25008961

  20. Making your code citable with the Astrophysics Source Code Library

    NASA Astrophysics Data System (ADS)

    Allen, Alice; DuPrie, Kimberly; Schmidt, Judy; Berriman, G. Bruce; Hanisch, Robert J.; Mink, Jessica D.; Nemiroff, Robert J.; Shamir, Lior; Shortridge, Keith; Taylor, Mark B.; Teuben, Peter J.; Wallin, John F.

    2016-01-01

    The Astrophysics Source Code Library (ASCL, ascl.net) is a free online registry of codes used in astronomy research. With nearly 1,200 codes, it is the largest indexed resource for astronomy codes in existence. Established in 1999, it offers software authors a path to citation of their research codes even without publication of a paper describing the software, and offers scientists a way to find codes used in refereed publications, thus improving the transparency of the research. It also provides a method to quantify the impact of source codes in a fashion similar to the science metrics of journal articles. Citations using ASCL IDs are accepted by major astronomy journals and if formatted properly are tracked by ADS and other indexing services. The number of citations to ASCL entries increased sharply from 110 citations in January 2014 to 456 citations in September 2015. The percentage of code entries in ASCL that were cited at least once rose from 7.5% in January 2014 to 17.4% in September 2015. The ASCL's mid-2014 infrastructure upgrade added an easy entry submission form, more flexible browsing, search capabilities, and an RSS feeder for updates. A Changes/Additions form added this past fall lets authors submit links for papers that use their codes for addition to the ASCL entry even if those papers don't formally cite the codes, thus increasing the transparency of that research and capturing the value of their software to the community.

  1. Practices in Code Discoverability: Astrophysics Source Code Library

    NASA Astrophysics Data System (ADS)

    Allen, A.; Teuben, P.; Nemiroff, R. J.; Shamir, L.

    2012-09-01

    Here we describe the Astrophysics Source Code Library (ASCL), which takes an active approach to sharing astrophysics source code. The ASCL's editor seeks out both new and old peer-reviewed papers that describe methods or experiments involving the development or use of source code, and adds entries for the codes found to the library. This approach ensures that source codes are added without requiring authors to actively submit them, resulting in a comprehensive listing that covers a significant number of the astrophysics source codes used in peer-reviewed studies. The ASCL now contains over 340 codes and continues to grow; in 2011 it added an average of 19 codes per month. An advisory committee has been established to provide input and guide the development and expansion of the site, and a marketing plan has been developed and is being executed. All ASCL source codes have been used to generate results published in or submitted to a refereed journal and are freely available either via a download site or from an identified source. This paper provides the history and description of the ASCL, lists the requirements for including codes, examines the advantages of the ASCL, and outlines some of its future plans.

  2. Peripheral coding of taste

    PubMed Central

    Liman, Emily R.; Zhang, Yali V.; Montell, Craig

    2014-01-01

    Five canonical tastes, bitter, sweet, umami (amino acid), salty and sour (acid) are detected by animals as diverse as fruit flies and humans, consistent with a near universal drive to consume fundamental nutrients and to avoid toxins or other harmful compounds. Surprisingly, despite this strong conservation of basic taste qualities between vertebrates and invertebrates, the receptors and signaling mechanisms that mediate taste in each are highly divergent. The identification over the last two decades of receptors and other molecules that mediate taste has led to stunning advances in our understanding of the basic mechanisms of transduction and coding of information by the gustatory systems of vertebrates and invertebrates. In this review, we discuss recent advances in taste research, mainly from the fly and mammalian systems, and we highlight principles that are common across species, despite stark differences in receptor types. PMID:24607224

  3. Electromagnetic particle simulation codes

    NASA Technical Reports Server (NTRS)

    Pritchett, P. L.

    1985-01-01

    Electromagnetic particle simulations solve the full set of Maxwell's equations. They thus include the effects of self-consistent electric and magnetic fields, magnetic induction, and electromagnetic radiation. The algorithms for an electromagnetic code which works directly with the electric and magnetic fields are described. The fields and current are separated into transverse and longitudinal components. The transverse E and B fields are integrated in time using a leapfrog scheme applied to the Fourier components. The particle pushing is performed via the relativistic Lorentz force equation for the particle momentum. As an example, simulation results are presented for the electron cyclotron maser instability which illustrate the importance of relativistic effects on the wave-particle resonance condition and on wave dispersion.
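
    The particle-pushing step described above, advancing the relativistic momentum with the Lorentz force, can be sketched minimally as follows. This is a simple explicit step in normalized units (m = c = 1, illustrative field values), not the leapfrog/Fourier field integrator of the code the abstract describes.

```python
import math

# Minimal sketch of a relativistic particle push: update momentum p with
# the Lorentz force q(E + v x B), then position x with v = p/gamma.
# Units m = c = 1; a production PIC code would use a leapfrog/Boris step.
def velocity(p):
    gamma = math.sqrt(1 + sum(pi * pi for pi in p))   # relativistic factor
    return [pi / gamma for pi in p]

def cross(a, b):
    return [a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0]]

def push(q, p, x, E, B, dt):
    v = velocity(p)
    force = [q * (E[i] + cross(v, B)[i]) for i in range(3)]
    p = [p[i] + dt * force[i] for i in range(3)]
    v = velocity(p)
    x = [x[i] + dt * v[i] for i in range(3)]
    return p, x

# Gyration in a uniform B field along z: the magnetic force does no work,
# so |p| stays (nearly) constant; the explicit step drifts only slightly.
p, x = [1.0, 0.0, 0.0], [0.0, 0.0, 0.0]
for _ in range(100):
    p, x = push(-1.0, p, x, [0.0, 0.0, 0.0], [0.0, 0.0, 1.0], 0.01)
print(sum(pi * pi for pi in p))   # close to 1
```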

  4. Surface acoustic wave coding for orthogonal frequency coded devices

    NASA Technical Reports Server (NTRS)

    Malocha, Donald (Inventor); Kozlovski, Nikolai (Inventor)

    2011-01-01

    Methods and systems for coding SAW OFC devices to mitigate code collisions in a wireless multi-tag system. Each device produces plural stepped frequencies as an OFC signal with a chip offset delay to increase code diversity. A method for assigning a different OFC to each device includes using a matrix based on the number of OFCs needed and the number of chips per code, populating each matrix cell with an OFC chip, and assigning the codes from the matrix to the devices. The asynchronous passive multi-tag system includes plural surface acoustic wave devices, each producing a different OFC signal having the same number of chips and including a chip offset time delay; an algorithm for assigning OFCs to each device; and a transceiver to transmit an interrogation signal and receive OFC signals in response with minimal code collisions during transmission.
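
    The matrix-based assignment idea can be sketched as follows; the cyclic-shift rule used here is an assumed illustration of how a matrix of frequency chips can keep any two devices from sharing a chip in the same time slot, not the patented scheme itself.

```python
# Hedged sketch of matrix-based OFC assignment: build one chip sequence
# per device so that in every time slot all devices use distinct frequency
# chips. A cyclic shift is one simple way to achieve this (illustrative
# only; the patented algorithm may differ).
def assign_ofcs(n_devices, chips_per_code):
    codes = []
    for d in range(n_devices):
        # device d gets the chip indices cyclically shifted by d
        codes.append([(d + t) % chips_per_code for t in range(chips_per_code)])
    return codes

for row in assign_ofcs(4, 4):
    print(row)
```

    In each column (time slot) of the printed matrix, every device occupies a different frequency chip, which is the collision-mitigation property the abstract describes.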

  5. Transionospheric Propagation Code (TIPC)

    SciTech Connect

    Roussel-Dupre, R.; Kelley, T.A.

    1990-10-01

    The Transionospheric Propagation Code (TIPC) is a computer program developed at Los Alamos National Laboratory to perform tasks related to the detection of VHF signals following propagation through the ionosphere. The code is written in Fortran 77, runs interactively, and was designed to be as machine-independent as possible. A menu format, in which the user is prompted to supply appropriate parameters for a given task, is used for input, while the output is primarily graphical. The user can select from five basic tasks: transionospheric propagation, signal filtering, signal processing, DTOA study, and DTOA uncertainty study. For the first task, a specified signal is convolved with the impulse response function of the ionosphere to obtain the transionospheric signal. The user may choose from four analytic forms for the input pulse or supply a tabular form, and may optionally add Gaussian-distributed white noise to the input signal. The deterministic ionosphere is characterized to first order in terms of a total electron content (TEC) along the propagation path. In addition, a scattering model parameterized in terms of a frequency coherence bandwidth is available. In the second task, detection is simulated by convolving a given filter response with the transionospheric signal. The user may choose a wideband filter or a narrowband Gaussian filter, or input a filter response. The third task provides quadrature detection, envelope detection, and three different techniques for time-tagging the arrival of the transionospheric signal at specified receivers; the latter algorithms can be used to determine a TEC and thus remove the effects of the ionosphere to first order. Task four allows the user to construct a table of delta-times-of-arrival (DTOAs) versus TECs for a specified pair of receivers.
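
    The core operations of TIPC's first two tasks, convolving a signal with an impulse response and adding Gaussian-distributed white noise, can be sketched generically. The toy pulse and kernel below are assumptions; the real code uses an ionospheric impulse response parameterized by TEC.

```python
import random

# Generic building blocks for the TIPC-style tasks: discrete convolution
# of a signal with an impulse response, plus additive Gaussian noise.
# The pulse and kernel values are illustrative, not ionospheric models.
def convolve(signal, impulse_response):
    n = len(signal) + len(impulse_response) - 1
    out = [0.0] * n
    for i, s in enumerate(signal):
        for j, h in enumerate(impulse_response):
            out[i + j] += s * h
    return out

def add_gaussian_noise(signal, sigma, seed=0):
    rng = random.Random(seed)
    return [s + rng.gauss(0.0, sigma) for s in signal]

pulse = [0.0, 1.0, 0.0]      # toy input pulse (unit impulse, delayed)
h = [0.5, 0.3, 0.2]          # toy impulse response
received = convolve(pulse, h)
print(received)              # [0.0, 0.5, 0.3, 0.2, 0.0]
noisy = add_gaussian_noise(received, sigma=0.01)
```

    Detection (task two) is the same convolution applied again, this time with the filter's response as the kernel.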

  6. Some easily analyzable convolutional codes

    NASA Technical Reports Server (NTRS)

    Mceliece, R.; Dolinar, S.; Pollara, F.; Vantilborg, H.

    1989-01-01

    Convolutional codes have played and will play a key role in the downlink telemetry systems on many NASA deep-space probes, including Voyager, Magellan, and Galileo. One of the chief difficulties associated with the use of convolutional codes, however, is the notorious difficulty of analyzing them. Given a convolutional code as specified, say, by its generator polynomials, it is no easy matter to say how well that code will perform on a given noisy channel. The usual first step in such an analysis is to compute the code's free distance; this can be done with an algorithm whose complexity is exponential in the code's constraint length. The second step is often to calculate the transfer function in one, two, or three variables, or at least a few terms in its power series expansion. This step is quite hard, and for many codes of relatively short constraint lengths, it can be intractable. However, a large class of convolutional codes was discovered for which the free distance can be computed by inspection, and for which there is a closed-form expression for the three-variable transfer function. Although for large constraint lengths these codes have relatively low rates, they are nevertheless interesting and potentially useful. Furthermore, the ideas developed here to analyze these specialized codes may well extend to a much larger class.
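
    For readers unfamiliar with the objects being analyzed, a convolutional encoder is just a shift register with XOR taps. The sketch below is the textbook rate-1/2, constraint-length-3 code with generators (7, 5) in octal, an illustrative standard example, not one of the specialized codes the abstract introduces.

```python
# Minimal rate-1/2, constraint-length-3 convolutional encoder with the
# classic generator polynomials (7, 5) octal. Each input bit produces
# two output bits from XOR taps on a 2-bit shift register.
def conv_encode(bits):
    s1 = s2 = 0                      # shift-register state
    out = []
    for b in bits:
        out.append(b ^ s1 ^ s2)      # generator 111 (octal 7)
        out.append(b ^ s2)           # generator 101 (octal 5)
        s1, s2 = b, s1               # shift the register
    return out

print(conv_encode([1, 0, 1, 1]))     # [1, 1, 1, 0, 0, 0, 0, 1]
```

    The free distance of a code like this is the minimum Hamming weight over all nonzero output sequences; finding it in general requires the exponential-time search the abstract mentions.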

  7. Interframe vector wavelet coding technique

    NASA Astrophysics Data System (ADS)

    Wus, John P.; Li, Weiping

    1997-01-01

    Wavelet coding is often used to divide an image into multi-resolution wavelet coefficients, which are quantized and coded. By 'vectorizing' scalar wavelet coding and combining this with vector quantization (VQ), vector wavelet coding (VWC) can be implemented. Using a finite number of states, finite-state vector quantization (FSVQ) takes advantage of the similarity between frames by incorporating memory into the video coding system. Lattice VQ eliminates the potential mismatch that could occur using pre-trained VQ codebooks. It also eliminates the need for codebook storage in the VQ process, thereby creating a more robust coding system. Therefore, by using the VWC method in conjunction with the FSVQ system and lattice VQ, a high-quality, very-low-bit-rate coding system is proposed. A coding system using a simple FSVQ, in which the current state is determined by the previous channel symbol only, is developed. To achieve a higher degree of compression, a tree-like FSVQ system is implemented. The groupings in this tree-like structure are made from the lower subbands to the higher subbands in order to exploit the parent-child relationship inherent in subband analysis. Class A and Class B video sequences from the MPEG-IV testing evaluations are used to evaluate this coding method.

  8. Nonlinear, nonbinary cyclic group codes

    NASA Technical Reports Server (NTRS)

    Solomon, G.

    1992-01-01

    New cyclic group codes of length 2^m - 1 over (m - j)-bit symbols are introduced. These codes can be systematically encoded and decoded algebraically. The code rates are very close to Reed-Solomon (RS) codes and are much better than Bose-Chaudhuri-Hocquenghem (BCH) codes (a former alternative). The binary (m - j)-tuples are identified with a subgroup of the binary m-tuples which represents the field GF(2^m). Encoding is systematic and involves a two-stage procedure consisting of the usual linear feedback register (using the division or check polynomial) and a small table lookup. For low rates, a second shift-register encoding operation may be invoked. Decoding uses the RS error-correcting procedures for the m-tuple codes for m = 4, 5, and 6.

  9. Explosive Formulation Code Naming SOP

    SciTech Connect

    Martz, H. E.

    2014-09-19

    The purpose of this SOP is to provide a procedure for giving individual HME formulations code names. A code name for an individual HME formulation consists of an explosive family code, given by the classified guide, followed by a dash, -, and a number. If the formulation requires preparation such as packing or aging, these add additional groups of symbols to the X-ray specimen name.
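
    The naming pattern the SOP describes (family code, a dash, a number, then optional preparation symbol groups) can be expressed as a tiny formatter. The family code and preparation symbols below are made up for illustration; real family codes come from the classified guide.

```python
# Hypothetical formatter for the SOP's naming pattern: family code,
# dash, number, then any preparation symbol groups appended with dashes.
# "XF", "PK", and "AG30" are invented placeholders, not real codes.
def specimen_name(family, number, preparations=()):
    parts = [f"{family}-{number}"]
    parts.extend(preparations)
    return "-".join(parts)

print(specimen_name("XF", 3))                  # XF-3
print(specimen_name("XF", 3, ("PK", "AG30")))  # XF-3-PK-AG30
```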

  10. Variable Coded Modulation software simulation

    NASA Astrophysics Data System (ADS)

    Sielicki, Thomas A.; Hamkins, Jon; Thorsen, Denise

    This paper reports on the design and performance of a new Variable Coded Modulation (VCM) system. This VCM system comprises eight of NASA's recommended codes from the Consultative Committee for Space Data Systems (CCSDS) standards, including four turbo and four AR4JA/C2 low-density parity-check codes, together with six modulation types (BPSK, QPSK, 8-PSK, 16-APSK, 32-APSK, 64-APSK). The signaling protocol for the transmission mode is based on a CCSDS recommendation. The coded modulation may be dynamically chosen, block to block, to optimize throughput.

  11. Astrophysics Source Code Library Enhancements

    NASA Astrophysics Data System (ADS)

    Hanisch, R. J.; Allen, A.; Berriman, G. B.; DuPrie, K.; Mink, J.; Nemiroff, R. J.; Schmidt, J.; Shamir, L.; Shortridge, K.; Taylor, M.; Teuben, P. J.; Wallin, J.

    2015-09-01

    The Astrophysics Source Code Library (ASCL) is a free online registry of codes used in astronomy research; it currently contains over 900 codes and is indexed by ADS. The ASCL has recently moved a new infrastructure into production. The new site provides a true database for the code entries and integrates the WordPress news and information pages and the discussion forum into one site. Previous capabilities are retained and permalinks to ascl.net continue to work. This improvement offers more functionality and flexibility than the previous site, is easier to maintain, and offers new possibilities for collaboration. This paper covers these recent changes to the ASCL.

  12. Implementation issues in source coding

    NASA Technical Reports Server (NTRS)

    Sayood, Khalid; Chen, Yun-Chung; Hadenfeldt, A. C.

    1989-01-01

    An edge-preserving image coding scheme that can be operated in both a lossy and a lossless manner was developed. The technique is an extension of the lossless encoding algorithm developed for the Mars Observer spectral data. It can also be viewed as a modification of the DPCM algorithm. A packet video simulator was also developed from an existing modified packet network simulator. The coding scheme for this system is a modification of the mixture block coding (MBC) scheme described in the last report. Coding algorithms for packet video were also investigated.
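
    The DPCM idea the abstract builds on is simple: transmit prediction errors instead of samples. The sketch below uses a previous-sample predictor as an assumption; the actual Mars Observer algorithm uses a different predictor and entropy coder.

```python
# Sketch of differential pulse-code modulation (DPCM): encode each sample
# as its difference from a prediction (here, simply the previous sample).
# Small residuals compress well, and the round trip is exactly lossless.
def dpcm_encode(samples):
    pred, residuals = 0, []
    for s in samples:
        residuals.append(s - pred)   # transmit only the prediction error
        pred = s
    return residuals

def dpcm_decode(residuals):
    pred, samples = 0, []
    for r in residuals:
        pred += r                    # rebuild each sample from the error
        samples.append(pred)
    return samples

data = [100, 102, 101, 105, 110]
enc = dpcm_encode(data)
print(enc)                           # [100, 2, -1, 4, 5]
assert dpcm_decode(enc) == data      # lossless round trip
```

    An edge-preserving variant would switch predictors near large residuals so that sharp transitions are not smeared.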

  13. The FLUKA Code: An Overview

    NASA Technical Reports Server (NTRS)

    Ballarini, F.; Battistoni, G.; Campanella, M.; Carboni, M.; Cerutti, F.; Empl, A.; Fasso, A.; Ferrari, A.; Gadioli, E.; Garzelli, M. V.; Lantz, M.; Liotta, M.; Mairani, A.; Mostacci, A.; Muraro, S.; Ottolenghi, A.; Pelliccioni, M.; Pinsky, L.; Ranft, J.; Roesler, S.; Sala, P. R.; Scannicchio, D.; Trovati, S.; Villari, R.; Wilson, T.

    2006-01-01

    FLUKA is a multipurpose Monte Carlo code that can transport a variety of particles over a wide energy range in complex geometries. The code is a joint project of INFN and CERN; part of its development is also supported by the University of Houston and NASA. FLUKA is successfully applied in several fields, including, but not limited to, particle physics, cosmic-ray physics, dosimetry, radioprotection, hadron therapy, space radiation, accelerator design, and neutronics. The code is the standard tool used at CERN for dosimetry, radioprotection, and beam-machine interaction studies. Here we give a glimpse into the code's physics models, with particular emphasis on the hadronic and nuclear sector.

  14. High Order Modulation Protograph Codes

    NASA Technical Reports Server (NTRS)

    Nguyen, Thuy V. (Inventor); Nosratinia, Aria (Inventor); Divsalar, Dariush (Inventor)

    2014-01-01

    Digital communication coding methods for designing protograph-based bit-interleaved code modulation that is general and applies to any modulation. The general coding framework can support not only multiple rates but also adaptive modulation. The method is a two stage lifting approach. In the first stage, an original protograph is lifted to a slightly larger intermediate protograph. The intermediate protograph is then lifted via a circulant matrix to the expected codeword length to form a protograph-based low-density parity-check code.
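
    The "lifting" step described above replaces each edge of a small protograph with a Z x Z circulant permutation block, expanding the base matrix to the full codeword length. The sketch below shows a single circulant lifting stage under assumed shift values; the patented method uses two stages and specific shifts not reproduced here.

```python
# Hedged sketch of protograph lifting: each nonzero entry of the small
# base matrix becomes a Z x Z circulant permutation block, each zero a
# Z x Z zero block. Shifts here are arbitrary illustrative choices.
def circulant(Z, shift):
    """Z x Z permutation matrix with ones on the cyclic diagonal `shift`."""
    return [[1 if (c - r) % Z == shift else 0 for c in range(Z)] for r in range(Z)]

def lift(base, shifts, Z):
    rows, cols = len(base) * Z, len(base[0]) * Z
    H = [[0] * cols for _ in range(rows)]
    for i, row in enumerate(base):
        for j, entry in enumerate(row):
            if entry:
                block = circulant(Z, shifts[i][j])
                for r in range(Z):
                    for c in range(Z):
                        H[i * Z + r][j * Z + c] = block[r][c]
    return H

base = [[1, 1, 0], [0, 1, 1]]                 # tiny 2x3 protograph
H = lift(base, [[0, 1, 0], [0, 2, 1]], Z=3)   # lift by a factor of 3
print(len(H), len(H[0]))                      # 6 9
```

    The circulant structure is what makes the resulting LDPC code quasi-cyclic and hardware-friendly.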

  15. The FLUKA Code: an Overview

    SciTech Connect

    Ballarini, F.; Battistoni, G.; Campanella, M.; Carboni, M.; Cerutti, F.; Empl, A.; Fasso, A.; Ferrari, A.; Gadioli, E.; Garzelli, M.V.; Lantz, M.; Liotta, M.; Mairani, A.; Mostacci, A.; Muraro, S.; Ottolenghi, A.; Pelliccioni, M.; Pinsky, L.; Ranft, J.; Roesler, S.; Sala, P.R.; /Milan U. /INFN, Milan /Pavia U. /INFN, Pavia /CERN /Siegen U. /Houston U. /SLAC /Frascati /NASA, Houston /ENEA, Frascati

    2005-11-09

    FLUKA is a multipurpose Monte Carlo code which can transport a variety of particles over a wide energy range in complex geometries. The code is a joint project of INFN and CERN: part of its development is also supported by the University of Houston and NASA. FLUKA is successfully applied in several fields, including, but not limited to, particle physics, cosmic ray physics, dosimetry, radioprotection, hadron therapy, space radiation, accelerator design and neutronics. The code is the standard tool used at CERN for dosimetry, radioprotection and beam-machine interaction studies. Here we give a glimpse into the code's physics models, with particular emphasis on the hadronic and nuclear sector.

  16. The KIDTALK Behavior and Language Code: Manual and Coding Protocol.

    ERIC Educational Resources Information Center

    Delaney, Elizabeth M.; Ezell, Sara S.; Solomon, Ned A.; Hancock, Terry B.; Kaiser, Ann P.

    Developed as part of the Milieu Language Teaching Project at the John F. Kennedy Center at Vanderbilt University in Nashville, Tennessee, this KIDTALK Behavior-Language Coding Protocol and manual measures behavior occurring during adult-child interactions. The manual is divided into 5 distinct sections: (1) the adult behavior codes describe…

  17. Telescope Adaptive Optics Code

    2005-07-28

    The Telescope AO Code has general adaptive optics capabilities plus specialized models for three telescopes with either adaptive optics or active optics systems. It has the capability to generate either single-layer or distributed Kolmogorov turbulence phase screens using the FFT. Missing low order spatial frequencies are added using the Karhunen-Loeve expansion. The phase structure curve is extremely close to the theoretical. Secondly, it has the capability to simulate an adaptive optics control system. The default parameters are those of the Keck II adaptive optics system. Thirdly, it has a general wave optics capability to model the science camera halo due to scintillation from atmospheric turbulence and the telescope optics. Although this capability was implemented for the Gemini telescopes, the only default parameter specific to the Gemini telescopes is the primary mirror diameter. Finally, it has a model for the LSST active optics alignment strategy. This last model is highly specific to the LSST.

  18. Patched Conic Trajectory Code

    NASA Technical Reports Server (NTRS)

    Park, Brooke Anderson; Wright, Henry

    2012-01-01

    PatCon code was developed to help mission designers run trade studies on launch and arrival times for any given planet. Initially developed in Fortran, the required inputs included launch date, arrival date, and other orbital parameters of the launch planet and arrival planets at the given dates. These parameters include the position of the planets, the eccentricity, semi-major axes, argument of periapsis, ascending node, and inclination of the planets. With these inputs, a patched conic approximation is used to determine the trajectory. The patched conic approximation divides the planetary mission into three parts: (1) the departure phase, in which the two relevant bodies are Earth and the spacecraft, and where the trajectory is a departure hyperbola with Earth at the focus; (2) the cruise phase, in which the two bodies are the Sun and the spacecraft, and where the trajectory is a transfer ellipse with the Sun at the focus; and (3) the arrival phase, in which the two bodies are the target planet and the spacecraft, where the trajectory is an arrival hyperbola with the planet as the focus.
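
The departure-phase computation rests on standard two-body relations; a fragment of that arithmetic (a Python sketch, not PatCon's Fortran source) connects the hyperbolic excess speed demanded by the transfer ellipse to the injection burn from a circular parking orbit:

```python
import math

MU_EARTH = 398600.4418  # Earth's gravitational parameter, km^3/s^2

def injection_delta_v(r_park_km, v_inf_kms):
    """Delta-v to leave a circular parking orbit of radius r_park_km onto a
    departure hyperbola with hyperbolic excess speed v_inf_kms
    (the patched-conic departure phase, Earth at the focus)."""
    v_circ = math.sqrt(MU_EARTH / r_park_km)
    # vis-viva on the hyperbola at periapsis: v^2 = v_inf^2 + 2*mu/r
    v_peri = math.sqrt(v_inf_kms ** 2 + 2.0 * MU_EARTH / r_park_km)
    return v_peri - v_circ
```

With v_inf = 0 this reduces to the escape burn, (sqrt(2) - 1) times the circular speed, roughly 3.2 km/s from a 300 km parking orbit.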

  19. Error coding simulations in C

    NASA Technical Reports Server (NTRS)

    Noble, Viveca K.

    1994-01-01

    When data is transmitted through a noisy channel, errors are produced within the data rendering it indecipherable. Through the use of error control coding techniques, the bit error rate can be reduced to any desired level without sacrificing the transmission data rate. The Astrionics Laboratory at Marshall Space Flight Center has decided to use a modular, end-to-end telemetry data simulator to simulate the transmission of data from flight to ground and various methods of error control. The simulator includes modules for random data generation, data compression, Consultative Committee for Space Data Systems (CCSDS) transfer frame formation, error correction/detection, error generation and error statistics. The simulator utilizes a concatenated coding scheme which includes CCSDS standard (255,223) Reed-Solomon (RS) code over GF(2(exp 8)) with interleave depth of 5 as the outermost code, (7, 1/2) convolutional code as an inner code and CCSDS recommended (n, n-16) cyclic redundancy check (CRC) code as the innermost code, where n is the number of information bits plus 16 parity bits. The received signal-to-noise for a desired bit error rate is greatly reduced through the use of forward error correction techniques. Even greater coding gain is provided through the use of a concatenated coding scheme. Interleaving/deinterleaving is necessary to randomize burst errors which may appear at the input of the RS decoder. The burst correction capability length is increased in proportion to the interleave depth. The modular nature of the simulator allows for inclusion or exclusion of modules as needed. This paper describes the development and operation of the simulator, the verification of a C-language Reed-Solomon code, and the possibility of using Comdisco SPW(tm) as a tool for determining optimal error control schemes.
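
The innermost CCSDS-recommended CRC uses the generator polynomial x^16 + x^12 + x^5 + 1 (0x1021); a bit-serial sketch is shown below (illustrative Python with an assumed all-ones preset, not the simulator's C module):

```python
def crc16_ccitt(data: bytes, crc: int = 0xFFFF) -> int:
    """Bit-serial CRC-16 with generator polynomial 0x1021
    (x^16 + x^12 + x^5 + 1) and an all-ones preset."""
    for byte in data:
        crc ^= byte << 8
        for _ in range(8):
            # Shift left; on carry-out, fold in the generator polynomial.
            crc = ((crc << 1) ^ 0x1021) if crc & 0x8000 else (crc << 1)
            crc &= 0xFFFF
    return crc
```

The 16 parity bits appended to the n-16 information bits are simply this register value; the receiver recomputes it to detect residual errors that escape the RS/convolutional stages.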

  20. Error coding simulations in C

    NASA Astrophysics Data System (ADS)

    Noble, Viveca K.

    1994-10-01

    When data is transmitted through a noisy channel, errors are produced within the data rendering it indecipherable. Through the use of error control coding techniques, the bit error rate can be reduced to any desired level without sacrificing the transmission data rate. The Astrionics Laboratory at Marshall Space Flight Center has decided to use a modular, end-to-end telemetry data simulator to simulate the transmission of data from flight to ground and various methods of error control. The simulator includes modules for random data generation, data compression, Consultative Committee for Space Data Systems (CCSDS) transfer frame formation, error correction/detection, error generation and error statistics. The simulator utilizes a concatenated coding scheme which includes CCSDS standard (255,223) Reed-Solomon (RS) code over GF(2(exp 8)) with interleave depth of 5 as the outermost code, (7, 1/2) convolutional code as an inner code and CCSDS recommended (n, n-16) cyclic redundancy check (CRC) code as the innermost code, where n is the number of information bits plus 16 parity bits. The received signal-to-noise for a desired bit error rate is greatly reduced through the use of forward error correction techniques. Even greater coding gain is provided through the use of a concatenated coding scheme. Interleaving/deinterleaving is necessary to randomize burst errors which may appear at the input of the RS decoder. The burst correction capability length is increased in proportion to the interleave depth. The modular nature of the simulator allows for inclusion or exclusion of modules as needed. This paper describes the development and operation of the simulator, the verification of a C-language Reed-Solomon code, and the possibility of using Comdisco SPW(tm) as a tool for determining optimal error control schemes.

  1. Coding in pigeons: Multiple-coding versus single-code/default strategies.

    PubMed

    Pinto, Carlos; Machado, Armando

    2015-05-01

    To investigate the coding strategies that pigeons may use in temporal discrimination tasks, pigeons were trained on a matching-to-sample procedure with three sample durations (2s, 6s and 18s) and two comparisons (red and green hues). One comparison was correct following 2-s samples and the other was correct following both 6-s and 18-s samples. Tests were then run to contrast the predictions of two hypotheses concerning the pigeons' coding strategies, the multiple-coding and the single-code/default. According to the multiple-coding hypothesis, three response rules are acquired, one for each sample. According to the single-code/default hypothesis, only two response rules are acquired, one for the 2-s sample and a "default" rule for any other duration. In retention interval tests, pigeons preferred the "default" key, a result predicted by the single-code/default hypothesis. In no-sample tests, pigeons preferred the key associated with the 2-s sample, a result predicted by multiple-coding. Finally, in generalization tests, when the sample duration equaled 3.5s, the geometric mean of 2s and 6s, pigeons preferred the key associated with the 6-s and 18-s samples, a result predicted by the single-code/default hypothesis. The pattern of results suggests the need for models that take into account multiple sources of stimulus control.
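
The two hypotheses amount to different response-rule tables, which can be caricatured as follows (the key labels are invented here; only the rule structure comes from the abstract):

```python
def single_code_default(sample_s):
    """Single-code/default hypothesis: one rule for the 2-s sample,
    a 'default' rule for any other duration."""
    return "short-key" if sample_s == 2 else "default-key"

def multiple_coding(sample_s):
    """Multiple-coding hypothesis: one rule per trained sample duration;
    an untrained duration matches no rule (returns None)."""
    rules = {2: "short-key", 6: "default-key", 18: "default-key"}
    return rules.get(sample_s)
```

The hypotheses agree on the trained durations but diverge off the training set, e.g. at the 3.5-s generalization probe, which is why the test conditions above can discriminate between them.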

  2. Indices for Testing Neural Codes

    PubMed Central

    Victor, Jonathan D.; Nirenberg, Sheila

    2009-01-01

    One of the most critical challenges in systems neuroscience is determining the neural code. A principled framework for addressing this can be found in information theory. With this approach, one can determine whether a proposed code can account for the stimulus-response relationship. Specifically, one can compare the transmitted information between the stimulus and the hypothesized neural code with the transmitted information between the stimulus and the behavioral response. If the former is smaller than the latter (i.e., if the code cannot account for the behavior), the code can be ruled out. The information-theoretic index most widely used in this context is Shannon’s mutual information. The Shannon test, however, is not ideal for this purpose: while the codes it will rule out are truly nonviable, there will be some nonviable codes that it will fail to rule out. Here we describe a wide range of alternative indices that can be used for ruling codes out. The range includes a continuum from Shannon information to measures of the performance of a Bayesian decoder. We analyze the relationship of these indices to each other and their complementary strengths and weaknesses for addressing this problem. PMID:18533812
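
Shannon's mutual information, the baseline index discussed here, can be computed directly from a discrete joint distribution (a minimal sketch; rows index one variable, columns the other):

```python
import math

def mutual_information(joint):
    """Shannon mutual information I(X;Y) in bits, given a joint
    probability table `joint` with rows over X and columns over Y."""
    px = [sum(row) for row in joint]          # marginal of X
    py = [sum(col) for col in zip(*joint)]    # marginal of Y
    mi = 0.0
    for i, row in enumerate(joint):
        for j, p in enumerate(row):
            if p > 0:
                mi += p * math.log2(p / (px[i] * py[j]))
    return mi
```

Independent variables give zero bits; a deterministic one-to-one relationship between two equiprobable values gives exactly one bit.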

  3. Video coding with dynamic background

    NASA Astrophysics Data System (ADS)

    Paul, Manoranjan; Lin, Weisi; Lau, Chiew Tong; Lee, Bu-Sung

    2013-12-01

    Motion estimation (ME) and motion compensation (MC) using variable block size, sub-pixel search, and multiple reference frames (MRFs) are the major reasons for the improved coding performance of the H.264 video coding standard over other contemporary coding standards. The concept of MRFs is suitable for repetitive motion, uncovered background, non-integer pixel displacement, lighting change, etc. The requirement of index codes for the reference frames, computational time in ME & MC, and memory buffer for coded frames limits the number of reference frames used in practical applications. In typical video sequences, the previous frame is used as a reference frame in 68-92% of cases. In this article, we propose a new video coding method using a reference frame [i.e., the most common frame in scene (McFIS)] generated by dynamic background modeling. McFIS is more effective in terms of rate-distortion and computational time performance compared to the MRF techniques. It also has an inherent scene change detection (SCD) capability for adaptive group of pictures (GOP) size determination. As a result, we integrate SCD (for GOP determination) with reference frame generation. The experimental results show that the proposed coding scheme outperforms H.264 video coding with five reference frames and the two relevant state-of-the-art algorithms by 0.5-2.0 dB with less computational time.
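
The "most common frame" idea can be caricatured as a per-pixel mode over a window of frames; this is a crude stand-in for the paper's dynamic background model, not the McFIS algorithm itself:

```python
from collections import Counter

def most_common_frame(frames):
    """Toy 'most common frame': for each pixel position, take the most
    frequent value over the frame window, so transient foreground objects
    fall away and the static background survives."""
    height, width = len(frames[0]), len(frames[0][0])
    return [[Counter(f[y][x] for f in frames).most_common(1)[0][0]
             for x in range(width)]
            for y in range(height)]
```

A moving object occupies any given pixel in only a minority of frames, so the mode recovers the background there, which is exactly what makes such a frame a good long-term reference for uncovered regions.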

  4. ACCELERATION PHYSICS CODE WEB REPOSITORY.

    SciTech Connect

    WEI, J.

    2006-06-26

    In the framework of the CARE HHH European Network, we have developed a web-based dynamic accelerator-physics code repository. We describe the design, structure and contents of this repository, illustrate its usage, and discuss our future plans, with emphasis on code benchmarking.

  5. Computer algorithm for coding gain

    NASA Technical Reports Server (NTRS)

    Dodd, E. E.

    1974-01-01

    Development of a computer algorithm for coding gain for use in an automated communications link design system. Using an empirical formula which defines coding gain as used in space communications engineering, an algorithm is constructed on the basis of available performance data for nonsystematic convolutional encoding with soft-decision (eight-level) Viterbi decoding.
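
Coding gain is the reduction in required Eb/N0 at a target bit error rate; the definition can be sketched against uncoded BPSK on an AWGN channel (an illustrative baseline, not the paper's empirical formula for convolutional/Viterbi performance):

```python
import math

def bpsk_ber(ebn0_db):
    """Uncoded BPSK bit error rate on an AWGN channel."""
    ebn0 = 10 ** (ebn0_db / 10)
    return 0.5 * math.erfc(math.sqrt(ebn0))

def coding_gain_db(target_ber, coded_required_ebn0_db):
    """Coding gain at `target_ber`: the uncoded required Eb/N0 (found by
    bisection on the monotone BER curve) minus the coded system's
    required Eb/N0."""
    lo, hi = 0.0, 20.0
    for _ in range(60):
        mid = (lo + hi) / 2
        if bpsk_ber(mid) > target_ber:
            lo = mid
        else:
            hi = mid
    return lo - coded_required_ebn0_db
```

At a BER of 1e-5, uncoded BPSK needs roughly 9.6 dB, so a coded link that reaches the same BER at 4.5 dB has about 5.1 dB of coding gain.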

  6. QPhiX Code Generator

    SciTech Connect

    Joo, Balint

    2014-09-16

    A simple code-generator to generate the low level code kernels used by the QPhiX Library for Lattice QCD. Generates Kernels for Wilson-Dslash, and Wilson-Clover kernels. Can be reused to write other optimized kernels for Intel Xeon Phi(tm), Intel Xeon(tm) and potentially other architectures.

  7. Accelerator Physics Code Web Repository

    SciTech Connect

    Zimmermann, F.; Basset, R.; Bellodi, G.; Benedetto, E.; Dorda, U.; Giovannozzi, M.; Papaphilippou, Y.; Pieloni, T.; Ruggiero, F.; Rumolo, G.; Schmidt, F.; Todesco, E.; Zotter, B.W.; Payet, J.; Bartolini, R.; Farvacque, L.; Sen, T.; Chin, Y.H.; Ohmi, K.; Oide, K.; Furman, M.; /LBL, Berkeley /Oak Ridge /Pohang Accelerator Lab. /SLAC /TRIUMF /Tech-X, Boulder /UC, San Diego /Darmstadt, GSI /Rutherford /Brookhaven

    2006-10-24

    In the framework of the CARE HHH European Network, we have developed a web-based dynamic accelerator-physics code repository. We describe the design, structure and contents of this repository, illustrate its usage, and discuss our future plans, with emphasis on code benchmarking.

  8. LFSC - Linac Feedback Simulation Code

    SciTech Connect

    Ivanov, Valentin; /Fermilab

    2008-05-01

    The computer program LFSC is a numerical tool for simulation of beam-based feedback in high performance linacs. The code LFSC is based on an earlier version developed by a collective of authors at SLAC (L. Hendrickson, R. McEwen, T. Himel, H. Shoaee, S. Shah, P. Emma, P. Schultz) during 1990-2005. That code was successively used in simulation of the SLC, TESLA, CLIC and NLC projects. It can simulate both pulse-to-pulse feedback, on a timescale corresponding to 5-100 Hz, and slower feedbacks operating in the 0.1-1 Hz range in the Main Linac and Beam Delivery System. The code LFSC runs under Matlab for the MS Windows operating system. It contains about 30,000 lines of source code in more than 260 subroutines. The code uses LIAR ('Linear Accelerator Research code') for particle tracking under ground motion and technical noise perturbations. It uses the Guinea Pig code to simulate the luminosity performance. A set of input files includes the lattice description (XSIF format) and plain text files with numerical parameters, wake fields, ground motion data, etc. The Matlab environment provides a flexible system for graphical output.
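
The pulse-to-pulse feedback concept can be reduced to a one-gain integral loop (a toy model in Python, not LFSC's Matlab implementation):

```python
def run_feedback(disturbance, gain=0.5):
    """Pulse-to-pulse feedback sketch: on each pulse, measure the residual
    beam offset and fold gain * measurement into an accumulated correction
    applied to the next pulse."""
    correction = 0.0
    residuals = []
    for d in disturbance:
        measured = d - correction   # what the BPM would see this pulse
        residuals.append(measured)
        correction += gain * measured
    return residuals
```

For a step disturbance the residual decays geometrically at rate (1 - gain) per pulse; the gain trades convergence speed against amplification of pulse-to-pulse noise, which is the trade such simulations are built to explore.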

  9. South Carolina TEC Student Code.

    ERIC Educational Resources Information Center

    Edwards, C. A., Ed.; Kiser, J. A., Ed.

    This student code has statewide application to South Carolina Technical Colleges and Technical Education Centers (TEC). Provisions are divided into eight articles: (1) General Provisions, including the purpose of a student code, the precept of internal solution of problems, and definitions; (2) Student Rights, including Bill of Rights protections;…

  10. Cracking the bioelectric code

    PubMed Central

    Tseng, AiSun; Levin, Michael

    2013-01-01

    Patterns of resting potential in non-excitable cells of living tissue are now known to be instructive signals for pattern formation during embryogenesis, regeneration and cancer suppression. The development of molecular-level techniques for tracking ion flows and functionally manipulating the activity of ion channels and pumps has begun to reveal the mechanisms by which voltage gradients regulate cell behaviors and the assembly of complex large-scale structures. A recent paper demonstrated that a specific voltage range is necessary for demarcation of eye fields in the frog embryo. Remarkably, artificially setting other somatic cells to the eye-specific voltage range resulted in formation of eyes in aberrant locations, including tissues that are not in the normal anterior ectoderm lineage: eyes could be formed in the gut, on the tail, or in the lateral plate mesoderm. These data challenge the existing models of eye fate restriction and tissue competence maps, and suggest the presence of a bioelectric code—a mapping of physiological properties to anatomical outcomes. This Addendum summarizes the current state of knowledge in developmental bioelectricity, proposes three possible interpretations of the bioelectric code that functionally maps physiological states to anatomical outcomes, and highlights the biggest open questions in this field. We also suggest a speculative hypothesis at the intersection of cognitive science and developmental biology: that bioelectrical signaling among non-excitable cells coupled by gap junctions simulates neural network-like dynamics, and underlies the information processing functions required by complex pattern formation in vivo. Understanding and learning to control the information stored in physiological networks will have transformative implications for developmental biology, regenerative medicine and synthetic bioengineering. PMID:23802040

  11. PARAVT: Parallel Voronoi tessellation code

    NASA Astrophysics Data System (ADS)

    González, R. E.

    2016-10-01

    In this study, we present a new open source code for massive parallel computation of Voronoi tessellations (VT hereafter) in large data sets. The code is aimed at astrophysical applications, where VT densities and neighbors are widely used. There are several serial Voronoi tessellation codes; however, no open source, parallel implementations are available to handle the large number of particles/galaxies in current N-body simulations and sky surveys. Parallelization is implemented under MPI, and the VT computation uses the Qhull library. Domain decomposition takes into account consistent boundary computation between tasks, and includes periodic conditions. In addition, the code computes neighbor lists, the Voronoi density, the Voronoi cell volume, the density gradient for each particle, and densities on a regular grid. The code implementation and user guide are publicly available at https://github.com/regonzar/paravt.
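
A serial, brute-force stand-in for the Voronoi volume estimate is nearest-site assignment on a grid (PARAVT itself uses Qhull under MPI; this sketch only illustrates the quantity being computed):

```python
def voronoi_cell_volumes(sites, grid_n=64, box=1.0):
    """Approximate the Voronoi cell volume of each 2D site: assign every
    grid cell center to its nearest site and sum the cell areas. The
    inverse of this volume is the usual VT density estimate."""
    counts = [0] * len(sites)
    h = box / grid_n
    for i in range(grid_n):
        for j in range(grid_n):
            x, y = (i + 0.5) * h, (j + 0.5) * h
            nearest = min(range(len(sites)),
                          key=lambda k: (x - sites[k][0]) ** 2
                                        + (y - sites[k][1]) ** 2)
            counts[nearest] += 1
    return [c * h * h for c in counts]
```

The volumes tile the box, so they sum to the box area; a parallel code must reproduce this exactly across task boundaries, which is what the consistent boundary computation in the abstract refers to.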

  12. Tristan code and its application

    NASA Astrophysics Data System (ADS)

    Nishikawa, K.-I.

    Since TRISTAN: The 3-D Electromagnetic Particle Code was introduced in 1990, it has been used for many applications, including simulations of global solar wind-magnetosphere interaction. The most essential ingredients of this code have been published in the ISSS-4 book. In this abstract we describe some of the issues and an application of this code for the study of global solar wind-magnetosphere interaction, including a substorm study. The basic code (tristan.f) for the global simulation and a local simulation of reconnection with a Harris model (issrec2.f) are available at http:/www.physics.rutger.edu/˜kenichi. For beginners, the code (isssrc2.f) with simpler boundary conditions is a suitable starting point for running simulations. The future of global particle simulations for a global geospace general circulation model (GGCM) with predictive capability (for the Space Weather Program) is discussed.

  13. Best practices for code release

    NASA Astrophysics Data System (ADS)

    Berriman, G. Bruce

    2016-01-01

    In this talk, I want to describe what I think are the best practices for releasing code and having it adopted by end users. Make sure your code is licensed, so users will know how the software can be used and modified, and place your code in a public repository (making sure that you follow institutional policies in doing this). Yet licensing and releasing code are not enough: the code must be organized and documented so users can understand what it does, what its limitations are, and how to build and use it. I will describe what I think are best practices in developing the content to support release, including tutorials, design documents, specifications of interfaces and so on. Much of what I have learned is based on ten years of experience in supporting releases of the Montage Image Mosaic Engine.

  14. ETR/ITER systems code

    SciTech Connect

    Barr, W.L.; Bathke, C.G.; Brooks, J.N.; Bulmer, R.H.; Busigin, A.; DuBois, P.F.; Fenstermacher, M.E.; Fink, J.; Finn, P.A.; Galambos, J.D.; Gohar, Y.; Gorker, G.E.; Haines, J.R.; Hassanein, A.M.; Hicks, D.R.; Ho, S.K.; Kalsi, S.S.; Kalyanam, K.M.; Kerns, J.A.; Lee, J.D.; Miller, J.R.; Miller, R.L.; Myall, J.O.; Peng, Y-K.M.; Perkins, L.J.; Spampinato, P.T.; Strickler, D.J.; Thomson, S.L.; Wagner, C.E.; Willms, R.S.; Reid, R.L.

    1988-04-01

    A tokamak systems code capable of modeling experimental test reactors has been developed and is described in this document. The code, named TETRA (for Tokamak Engineering Test Reactor Analysis), consists of a series of modules, each describing a tokamak system or component, controlled by an optimizer/driver. This code development was a national effort in that the modules were contributed by members of the fusion community and integrated into a code by the Fusion Engineering Design Center. The code has been checked out on the Cray computers at the National Magnetic Fusion Energy Computing Center and has satisfactorily simulated the Tokamak Ignition/Burn Experimental Reactor II (TIBER) design. A feature of this code is the ability to perform optimization studies through the use of a numerical software package, which iterates prescribed variables to satisfy a set of prescribed equations or constraints. This code will be used to perform sensitivity studies for the proposed International Thermonuclear Experimental Reactor (ITER). 22 figs., 29 tabs.

  15. Coding design for error correcting output codes based on perceptron

    NASA Astrophysics Data System (ADS)

    Zhou, Jin-Deng; Wang, Xiao-Dan; Zhou, Hong-Jian; Cui, Yong-Hua; Jing, Sun

    2012-05-01

    It is known that error-correcting output codes (ECOC) are a common way to model multiclass classification problems, in which research on encoding based on data is attracting more and more attention. We propose a method for learning ECOC with the help of a single-layer perceptron neural network. To achieve this goal, the code elements of the ECOC are mapped to the weights of the network for the given decoding strategy, and an objective function with constrained weights is used as the cost function of the network. After training, we obtain a coding matrix containing many subgroups of classes. Experimental results on artificial data and University of California Irvine (UCI) datasets, with a logistic linear classifier and a support vector machine as the binary learners, show that our scheme provides better classification performance with a shorter coding matrix than other state-of-the-art encoding strategies.
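
The decoding step that a learned coding matrix feeds into is typically minimum-distance ECOC decoding; a generic Hamming-distance sketch (independent of the authors' perceptron-based training) looks like this:

```python
def ecoc_decode(outputs, code_matrix):
    """ECOC decoding: given the binary learners' outputs, return the index
    of the class whose codeword (row of `code_matrix`) is closest in
    Hamming distance."""
    def hamming(a, b):
        return sum(x != y for x, y in zip(a, b))
    distances = [hamming(outputs, row) for row in code_matrix]
    return distances.index(min(distances))
```

Each column of the matrix defines one binary subproblem (a split of the classes into two subgroups); rows with large pairwise Hamming distance let the ensemble absorb errors from individual binary learners.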

  16. State building energy codes status

    SciTech Connect

    1996-09-01

    This document contains the State Building Energy Codes Status prepared by Pacific Northwest National Laboratory for the U.S. Department of Energy under Contract DE-AC06-76RL01830 and dated September 1996. The U.S. Department of Energy's Office of Codes and Standards has developed this document to provide an information resource for individuals interested in energy efficiency of buildings and the relevant building energy codes in each state and U.S. territory. This is considered to be an evolving document and will be updated twice a year. In addition, special state updates will be issued as warranted.

  17. Understanding the Code: upholding dignity.

    PubMed

    Griffith, Richard

    2015-04-01

    The Nursing and Midwifery Council, the statutory professional regulator for registered district nurses, has introduced a revised code of standards that came into effect on 31 March 2015. The Code makes clear that while district nurses can interpret the values and principles for use in community settings, the standards are not negotiable or discretionary. They must be applied, otherwise the district nurse's fitness to practice will be called into question. In the second of a series of articles analysing the legal implications of the Code on district nurse practice, the author considers the first standard, which requires district nurses to treat people as individuals and to uphold their dignity.

  18. Facilitating Internet-Scale Code Retrieval

    ERIC Educational Resources Information Center

    Bajracharya, Sushil Krishna

    2010-01-01

    Internet-Scale code retrieval deals with the representation, storage, and access of relevant source code from a large amount of source code available on the Internet. Internet-Scale code retrieval systems support common emerging practices among software developers related to finding and reusing source code. In this dissertation we focus on some…

  19. Bandwidth efficient coding for satellite communications

    NASA Technical Reports Server (NTRS)

    Lin, Shu; Costello, Daniel J., Jr.; Miller, Warner H.; Morakis, James C.; Poland, William B., Jr.

    1992-01-01

    An error control coding scheme was devised to achieve large coding gain and high reliability by using coded modulation with reduced decoding complexity. To achieve a 3 to 5 dB coding gain and moderate reliability, the decoding complexity is quite modest. In fact, to achieve a 3 dB coding gain, the decoding complexity is quite simple, no matter whether trellis coded modulation or block coded modulation is used. However, to achieve coding gains exceeding 5 dB, the decoding complexity increases drastically, and the implementation of the decoder becomes very expensive and impractical. The use of coded modulation in conjunction with concatenated (or cascaded) coding is proposed. A good short bandwidth efficient modulation code is used as the inner code and a relatively powerful Reed-Solomon code is used as the outer code. With properly chosen inner and outer codes, a concatenated coded modulation scheme not only can achieve large coding gains and high reliability with good bandwidth efficiency but also can be practically implemented. This combination of coded modulation and concatenated coding offers a way of achieving the best of three worlds: reliability and coding gain, bandwidth efficiency, and decoding complexity.
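
Concatenation itself can be illustrated with deliberately tiny stand-in codes: a single parity check playing the outer Reed-Solomon role and a rate-1/3 repetition code playing the inner modulation-code role (a structural sketch only; the real scheme uses far stronger components):

```python
def inner_encode(bits):
    """Toy inner code: rate-1/3 repetition (stand-in for the short
    bandwidth-efficient modulation code)."""
    return [b for b in bits for _ in range(3)]

def inner_decode(chips):
    """Majority vote over each group of three received chips."""
    return [int(sum(chips[i:i + 3]) >= 2) for i in range(0, len(chips), 3)]

def outer_encode(data):
    """Toy outer code: append a single even-parity bit
    (stand-in for the Reed-Solomon outer code)."""
    return data + [sum(data) % 2]

def outer_check(word):
    """Outer-code check: even parity must hold after inner decoding."""
    return sum(word) % 2 == 0
```

The inner decoder cleans up most channel errors before the outer code ever sees them, so the outer code only has to handle the rarer inner-decoder failures; that division of labor is what lets concatenation deliver large gains at modest complexity.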

  20. The moving mesh code SHADOWFAX

    NASA Astrophysics Data System (ADS)

    Vandenbroucke, B.; De Rijcke, S.

    2016-07-01

    We introduce the moving mesh code SHADOWFAX, which can be used to evolve a mixture of gas, subject to the laws of hydrodynamics and gravity, and any collisionless fluid only subject to gravity, such as cold dark matter or stars. The code is written in C++ and its source code is made available to the scientific community under the GNU Affero General Public Licence. We outline the algorithm and the design of our implementation, and demonstrate its validity through the results of a set of basic test problems, which are also part of the public version. We also compare SHADOWFAX with a number of other publicly available codes using different hydrodynamical integration schemes, illustrating the advantages and disadvantages of the moving mesh technique.

  1. Property Control through Bar Coding.

    ERIC Educational Resources Information Center

    Kingma, Gerben J.

    1984-01-01

    A public utility company uses laser wands to read bar-coded labels on furniture and equipment. The system allows an 80 percent savings of the time required to create reports for inventory control. (MLF)

  2. Seals Flow Code Development 1993

    NASA Technical Reports Server (NTRS)

    Liang, Anita D. (Compiler); Hendricks, Robert C. (Compiler)

    1994-01-01

    Seals Workshop of 1993 code releases include SPIRALI for spiral grooved cylindrical and face seal configurations; IFACE for face seals with pockets, steps, tapers, turbulence, and cavitation; GFACE for gas face seals with 'lift pad' configurations; and SCISEAL, a CFD code for research and design of seals of cylindrical configuration. GUI (graphical user interface) and code usage were discussed, with hands-on usage of the codes, discussions, comparisons, and industry feedback. Other highlights of Seals Workshop-93 include environmental and customer driven seal requirements; 'what's coming'; and brush seal developments including flow visualization, numerical analysis, bench testing, T-700 engine testing, tribological pairing and ceramic configurations, and cryogenic and hot gas facility brush seal results. Also discussed are seals for hypersonic engines and dynamic results for spiral groove and smooth annular seals.

  3. Efficient codes and balanced networks.

    PubMed

    Denève, Sophie; Machens, Christian K

    2016-03-01

    Recent years have seen a growing interest in inhibitory interneurons and their circuits. A striking property of cortical inhibition is how tightly it balances excitation. Inhibitory currents not only match excitatory currents on average, but track them on a millisecond time scale, whether they are caused by external stimuli or spontaneous fluctuations. We review, together with experimental evidence, recent theoretical approaches that investigate the advantages of such tight balance for coding and computation. These studies suggest a possible revision of the dominant view that neurons represent information with firing rates corrupted by Poisson noise. Instead, tight excitatory/inhibitory balance may be a signature of a highly cooperative code, orders of magnitude more precise than a Poisson rate code. Moreover, tight balance may provide a template that allows cortical neurons to construct high-dimensional population codes and learn complex functions of their inputs.

  4. NFPA's Hydrogen Technologies Code Project

    SciTech Connect

    Rivkin, C. H.

    2008-12-01

    This article discusses the development of National Fire Protection Association 2 (NFPA 2), a comprehensive hydrogen safety code. It analyses the contents of this document, with particular attention focused on new requirements for siting hydrogen storage systems. These new requirements use computational fluid dynamic modeling and risk assessment procedures to develop requirements that are based on both technical analyses and defined risk criteria. The intent is to develop requirements based on procedures that can be replicated based on the information provided in the code document. This code will require documentation of the modeling inputs, risk criteria and analyses in the supporting information. This article also includes a description of the codes and standards that address hydrogen technologies in general.

  5. Training course on code implementation.

    PubMed

    Allain, A; De Arango, R

    1992-01-01

    The International Baby Food Action Network (IBFAN) is a coalition of over 40 citizen groups in 70 countries. IBFAN monitors the progress worldwide of the implementation of the International Code of Marketing of Breastmilk Substitutes. The Code is intended to regulate the advertising and promotional techniques used to sell infant formula. The 1991 IBFAN report shows that 75 countries have taken some action to implement the International Code. During 1992, the IBFAN Code Documentation Center in Malaysia conducted 2 training courses to help countries draft legislation to implement and monitor compliance with the International Code. In April, government officials from 19 Asian and African countries attended the first course in Malaysia; the second course was conducted in Spanish in Guatemala and attended by officials from 15 Latin American and Caribbean countries. The resource people included representatives from NGOs in Africa, Asia, Latin America, Europe and North America with experience in Code implementation and monitoring at the national level. The main purpose of each course was to train government officials to use the International Code as a starting point for national legislation to protect breastfeeding. Participants reviewed recent information on lactation management, the advantages of breastfeeding, current trends in breastfeeding and the marketing practices of infant formula manufacturers. The participants studied the terminology contained in the International Code and terminology used by infant formula manufacturers to include breastmilk supplements such as follow-on formulas and cereal-based baby foods. Relevant World Health Assembly resolutions such as the one adopted in 1986 on the need to ban free and low-cost supplies to hospitals were examined. The legal aspects of the current Baby Friendly Hospital Initiative (BFHI) and the progress in the 12 BFHI test countries concerning the elimination of supplies were also examined. International Labor

  6. The Integrated TIGER Series Codes

    SciTech Connect

    Kensek, Ronald P.; Franke, Brian C.; Laub, Thomas W.

    2006-01-15

    ITS is a powerful and user-friendly software package permitting state-of-the-art Monte Carlo solution of linear time-independent coupled electron/photon radiation transport problems, with or without the presence of macroscopic electric and magnetic fields of arbitrary spatial dependence. Our goal has been to simultaneously maximize operational simplicity and physical accuracy. Through a set of preprocessor directives, the user selects one of the many ITS codes. The ease with which the makefile system is applied combines with an input scheme based on order-independent descriptive keywords that makes maximum use of defaults and internal error checking to provide experimentalists and theorists alike with a method for the routine but rigorous solution of sophisticated radiation transport problems. Physical rigor is provided by employing accurate cross sections, sampling distributions, and physical models for describing the production and transport of the electron/photon cascade from 1.0 GeV down to 1.0 keV. The availability of source code permits the more sophisticated user to tailor the codes to specific applications and to extend the capabilities of the codes to more complex applications. Version 5.0, the latest version of ITS, contains (1) improvements to the ITS 3.0 continuous-energy codes, (2) multigroup codes with adjoint transport capabilities, (3) parallel implementations of all ITS codes, (4) a general purpose geometry engine for linking with CAD or other geometry formats, and (5) the Cholla facet geometry library. Moreover, the general user friendliness of the software has been enhanced through increased internal error checking and improved code portability.

  7. Computer-Access-Code Matrices

    NASA Technical Reports Server (NTRS)

    Collins, Earl R., Jr.

    1990-01-01

    Authorized users respond to changing challenges with changing passwords. Scheme for controlling access to computers defeats eavesdroppers and "hackers". Based on password system of challenge and password or sign, challenge, and countersign correlated with random alphanumeric codes in matrices of two or more dimensions. Codes stored on floppy disk or plug-in card and changed frequently. For even higher security, matrices of four or more dimensions used, just as cubes compounded into hypercubes in concurrent processing.
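
The challenge/response matrix scheme in this record can be sketched as follows; the matrix size, code length, and all function names are hypothetical, chosen only to illustrate the lookup idea of answering a (row, column) challenge with the random code stored in that cell:

```python
import secrets
import string

# Hypothetical sketch: a 2-D matrix of random alphanumeric codes. The
# challenger names a (row, column) cell; the authorized user answers with the
# code stored there. Matrices would be kept on removable media (floppy disk or
# plug-in card, per the record) and regenerated frequently.
ALPHABET = string.ascii_uppercase + string.digits

def make_matrix(rows, cols, code_len=4):
    return [[''.join(secrets.choice(ALPHABET) for _ in range(code_len))
             for _ in range(cols)] for _ in range(rows)]

def respond(matrix, challenge):
    row, col = challenge          # the challenge names a cell, not a code
    return matrix[row][col]

matrix = make_matrix(8, 8)
# a correct countersign is simply the code at the challenged cell
assert respond(matrix, (3, 5)) == matrix[3][5]
```

Higher-dimensional variants (the "hypercube" matrices the record mentions) would extend the challenge tuple to four or more indices.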

  8. Summary of Code of Ethics.

    PubMed

    Eklund, Kerri

    2016-01-01

    The Guide to the Code of Ethics for Nurses is an excellent guideline for all nurses regardless of their area of practice. I greatly enjoyed reading the revisions in place within the 2015 edition and refreshing my nursing conscience. I plan to always keep my Guide to the Code of Ethics for Nurses near in order to keep my moral compass from veering off the path of quality care. PMID:27183735

  9. Summary of Code of Ethics.

    PubMed

    Eklund, Kerri

    2016-01-01

    The Guide to the Code of Ethics for Nurses is an excellent guideline for all nurses regardless of their area of practice. I greatly enjoyed reading the revisions in place within the 2015 edition and refreshing my nursing conscience. I plan to always keep my Guide to the Code of Ethics for Nurses near in order to keep my moral compass from veering off the path of quality care.

  10. Edge equilibrium code for tokamaks

    SciTech Connect

    Li, Xujing; Drozdov, Vladimir V.

    2014-01-15

    The edge equilibrium code (EEC) described in this paper is developed for simulations of the near edge plasma using the finite element method. It solves the Grad-Shafranov equation in toroidal coordinates and uses adaptive grids aligned with magnetic field lines. Hermite finite elements are chosen for the numerical scheme. A fast Newton scheme which is the same as implemented in the equilibrium and stability code (ESC) is applied here to adjust the grids.

  11. electromagnetics, eddy current, computer codes

    2002-03-12

    TORO Version 4 is designed for finite element analysis of steady, transient and time-harmonic, multi-dimensional, quasi-static problems in electromagnetics. The code allows simulation of electrostatic fields, steady current flows, magnetostatics and eddy current problems in plane or axisymmetric, two-dimensional geometries. TORO is easily coupled to heat conduction and solid mechanics codes to allow multi-physics simulations to be performed.

  12. Criminal Code, 5 December 1986.

    PubMed

    1988-01-01

    Under this Code, performing an illegal abortion or causing a woman to have an abortion are serious offenses reflecting the Mongolian State's interest in population growth. Abortions are allowed only in circumstances specified by medical authorities. The Code also provides for punishment for the use of coercion to force a woman to marry or to prevent her from marrying and for the obstruction of women's equal rights.

  13. The Integrated TIGER Series Codes

    2006-01-15

    ITS is a powerful and user-friendly software package permitting state-of-the-art Monte Carlo solution of linear time-independent coupled electron/photon radiation transport problems, with or without the presence of macroscopic electric and magnetic fields of arbitrary spatial dependence. Our goal has been to simultaneously maximize operational simplicity and physical accuracy. Through a set of preprocessor directives, the user selects one of the many ITS codes. The ease with which the makefile system is applied combines with an input scheme based on order-independent descriptive keywords that makes maximum use of defaults and internal error checking to provide experimentalists and theorists alike with a method for the routine but rigorous solution of sophisticated radiation transport problems. Physical rigor is provided by employing accurate cross sections, sampling distributions, and physical models for describing the production and transport of the electron/photon cascade from 1.0 GeV down to 1.0 keV. The availability of source code permits the more sophisticated user to tailor the codes to specific applications and to extend the capabilities of the codes to more complex applications. Version 5.0, the latest version of ITS, contains (1) improvements to the ITS 3.0 continuous-energy codes, (2) multigroup codes with adjoint transport capabilities, (3) parallel implementations of all ITS codes, (4) a general purpose geometry engine for linking with CAD or other geometry formats, and (5) the Cholla facet geometry library. Moreover, the general user friendliness of the software has been enhanced through increased internal error checking and improved code portability.

  14. Spaceflight Validation of Hzetrn Code

    NASA Technical Reports Server (NTRS)

    Wilson, J. W.; Shinn, J. L.; Singleterry, R. C.; Badavi, F. F.; Badhwar, G. D.; Reitz, G.; Beaujean, R.; Cucinotta, F. A.

    1999-01-01

    HZETRN is being developed as a fast deterministic radiation transport code applicable to neutrons, protons, and multiply charged ions in the space environment. It was recently applied to 50 hours of IMP8 data measured during the August 4, 1972 solar event to map the hourly exposures within the human body under several shield configurations. This calculation required only 18 hours on a VAX 4000 machine. A similar calculation using the Monte Carlo method would have required two years of dedicated computer time. The code has been benchmarked against well documented and tested Monte Carlo proton transport codes with good success. The code will allow important trade studies to be made with relative ease due to the computational speed and will be useful in assessing design alternatives in an integrated system software environment. Since there are no well tested Monte Carlo codes for HZE particles, we have been engaged in flight validation of the HZETRN results. To date we have made comparisons with TEPC, CR-39, charged-particle telescopes, and Bonner spheres. This broad range of detectors allows us to test a number of functions related to differing physical processes which add to the complicated radiation fields within a spacecraft or the human body; these functions can be calculated by the HZETRN code system. In the present report we will review these results.

  15. The cosmic code comparison project

    NASA Astrophysics Data System (ADS)

    Heitmann, Katrin; Lukić, Zarija; Fasel, Patricia; Habib, Salman; Warren, Michael S.; White, Martin; Ahrens, James; Ankeny, Lee; Armstrong, Ryan; O'Shea, Brian; Ricker, Paul M.; Springel, Volker; Stadel, Joachim; Trac, Hy

    2008-10-01

    Current and upcoming cosmological observations allow us to probe structures on smaller and smaller scales, entering highly nonlinear regimes. In order to obtain theoretical predictions in these regimes, large cosmological simulations have to be carried out. The promised high accuracy from observations makes the simulation task very demanding: the simulations have to be at least as accurate as the observations. This requirement can only be fulfilled by carrying out an extensive code verification program. The first step of such a program is the comparison of different cosmology codes including gravitational interactions only. In this paper, we extend a recently carried out code comparison project to include five more simulation codes. We restrict our analysis to a small cosmological volume which allows us to investigate properties of halos. For the matter power spectrum and the mass function, the previous results hold, with the codes agreeing at the 10% level over wide dynamic ranges. We extend our analysis to the comparison of halo profiles and investigate the halo count as a function of local density. We introduce and discuss ParaView as a flexible analysis tool for cosmological simulations, the use of which immensely simplifies the code comparison task.

  16. Rotating-Pump Design Code

    NASA Technical Reports Server (NTRS)

    Walker, James F.; Chen, Shu-Cheng; Scheer, Dean D.

    2006-01-01

    Pump Design (PUMPDES) is a computer program for designing a rotating pump for liquid hydrogen, liquid oxygen, liquid nitrogen, water, methane, or ethane. Using realistic properties of these fluids provided by another program called GASPAK, this code performs a station-by-station, mean-line analysis along the pump flow path, obtaining thermodynamic properties of the pumped fluid at each station and evaluating hydraulic losses along the flow path. The variables at each station are obtained under constraints that are consistent with the underlying physical principles. The code evaluates the performance of each stage and the overall pump. In addition, by judiciously choosing the givens and the unknowns, the code can perform a geometric inverse design function: that is, it can compute a pump geometry that yields the closest approximation of a given design point. The code contains two major parts: one for an axial-rotor/inducer and one for a multistage centrifugal pump. The inducer and the centrifugal pump are functionally integrated. The code can be used in designing and/or evaluating the inducer/centrifugal-pump combination or the centrifugal pump alone. The code is written in standard Fortran 77.

  17. Serial-data correlator/code translator

    NASA Technical Reports Server (NTRS)

    Morgan, L. E.

    1977-01-01

    System, consisting of sampling flip flop, memory (either RAM or ROM), and memory buffer, correlates sampled data with predetermined acceptance code patterns, translates acceptable code patterns to nonreturn-to-zero code, and identifies data dropouts.

  18. Entanglement-assisted codeword stabilized quantum codes

    SciTech Connect

    Shin, Jeonghwan; Heo, Jun; Brun, Todd A.

    2011-12-15

    Entangled qubits can increase the capacity of quantum error-correcting codes based on stabilizer codes. In addition, by using entanglement, quantum stabilizer codes can be constructed from classical linear codes that do not satisfy the dual-containing constraint. We show that it is possible to construct both additive and nonadditive quantum codes using the codeword stabilized quantum code framework. Nonadditive codes may offer improved performance over the more common stabilizer codes. Like other entanglement-assisted codes, the encoding procedure acts only on the qubits on Alice's side, and only these qubits are assumed to pass through the channel. However, errors in the codeword stabilized quantum code framework give rise to effective Z errors on Bob's side. We use this scheme to construct entanglement-assisted nonadditive quantum codes, in particular, ((5,16,2;1)) and ((7,4,5;4)) codes.

  19. Number of minimum-weight code words in a product code

    NASA Technical Reports Server (NTRS)

    Miller, R. L.

    1978-01-01

    Consideration is given to the number of minimum-weight code words in a product code. The code is considered as a tensor product of linear codes over a finite field. Complete theorems and proofs are presented.
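
The counting result can be checked by brute force on a toy example; the component codes below (two [3,2] single-parity-check codes, each with minimum distance 2 and three minimum-weight words) are an illustrative choice, not taken from the paper:

```python
import itertools

# Brute-force check on a toy product code: the minimum distance should be
# d1*d2 = 2*2 = 4, and the number of minimum-weight words should be the
# product of the component counts, 3*3 = 9.
def encode(info):                    # 2x2 information bits -> 3x3 codeword
    rows = [r + ((r[0] + r[1]) % 2,) for r in info]               # row parities
    rows.append(tuple((rows[0][j] + rows[1][j]) % 2 for j in range(3)))
    return rows                      # every row and column has even weight

weights = {}
for bits in itertools.product((0, 1), repeat=4):
    cw = encode([bits[0:2], bits[2:4]])
    w = sum(sum(r) for r in cw)
    weights[w] = weights.get(w, 0) + 1

min_weight = min(w for w in weights if w > 0)   # -> 4
num_min = weights[min_weight]                    # -> 9
```

Each minimum-weight word is the "rank-one" pattern supported on the product of the supports of one minimum-weight word from each component code, which is where the product count comes from.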

  20. The Proteomic Code: a molecular recognition code for proteins

    PubMed Central

    Biro, Jan C

    2007-01-01

    Background The Proteomic Code is a set of rules by which information in genetic material is transferred into the physico-chemical properties of amino acids. It determines how individual amino acids interact with each other during folding and in specific protein-protein interactions. The Proteomic Code is part of the redundant Genetic Code. Review The 25-year-old history of this concept is reviewed from the first independent suggestions by Biro and Mekler, through the works of Blalock, Root-Bernstein, Siemion, Miller and others, followed by the discovery of a Common Periodic Table of Codons and Nucleic Acids in 2003 and culminating in the recent conceptualization of partial complementary coding of interacting amino acids as well as the theory of the nucleic acid-assisted protein folding. Methods and conclusions A novel cloning method for the design and production of specific, high-affinity-reacting proteins (SHARP) is presented. This method is based on the concept of proteomic codes and is suitable for large-scale, industrial production of specifically interacting peptides. PMID:17999762

  1. NASA Rotor 37 CFD Code Validation: Glenn-HT Code

    NASA Technical Reports Server (NTRS)

    Ameri, Ali A.

    2010-01-01

    In order to advance the goals of NASA aeronautics programs, it is necessary to continuously evaluate and improve the computational tools used for research and design at NASA. One such code is the Glenn-HT code which is used at NASA Glenn Research Center (GRC) for turbomachinery computations. Although the code has been thoroughly validated for turbine heat transfer computations, it has not been utilized for compressors. In this work, Glenn-HT was used to compute the flow in a transonic compressor and comparisons were made to experimental data. The results presented here are in good agreement with this data. Most of the measures of performance are well within the measurement uncertainties and the exit profiles of interest agree with the experimental measurements.

  2. Driver Code for Adaptive Optics

    NASA Technical Reports Server (NTRS)

    Rao, Shanti

    2007-01-01

    A special-purpose computer code for a deformable-mirror adaptive-optics control system transmits pixel-registered control from (1) a personal computer running software that generates the control data to (2) a circuit board with 128 digital-to-analog converters (DACs) that generate voltages to drive the deformable-mirror actuators. This program reads control-voltage codes from a text file, then sends them, via the computer's parallel port, to a circuit board with four AD5535 (or equivalent) chips. Whereas a similar prior computer program was capable of transmitting data to only one chip at a time, this program can send data to four chips simultaneously. This program is in the form of C-language code that can be compiled and linked into an adaptive-optics software system. The program as supplied includes source code for integration into the adaptive-optics software, documentation, and a component that provides a demonstration of loading DAC codes from a text file. On a standard Windows desktop computer, the software can update 128 channels in 10 ms. On Real-Time Linux with a digital I/O card, the software can update 1024 channels (8 boards in parallel) every 8 ms.

  3. International assessment of PCA codes

    SciTech Connect

    Neymotin, L.; Lui, C.; Glynn, J.; Archarya, S.

    1993-11-01

    Over the past three years (1991-1993), an extensive international exercise for intercomparison of a group of six Probabilistic Consequence Assessment (PCA) codes was undertaken. The exercise was jointly sponsored by the Commission of European Communities (CEC) and OECD Nuclear Energy Agency. This exercise was a logical continuation of a similar effort undertaken by OECD/NEA/CSNI in 1979-1981. The PCA codes are currently used by different countries for predicting radiological health and economic consequences of severe accidents at nuclear power plants (and certain types of non-reactor nuclear facilities) resulting in releases of radioactive materials into the atmosphere. The codes participating in the exercise were: ARANO (Finland), CONDOR (UK), COSYMA (CEC), LENA (Sweden), MACCS (USA), and OSCAAR (Japan). In parallel with this inter-code comparison effort, two separate groups performed a similar set of calculations using two of the participating codes, MACCS and COSYMA. Results of the intercode and inter-MACCS comparisons are presented in this paper. The MACCS group included four participants: GREECE: Institute of Nuclear Technology and Radiation Protection, NCSR Demokritos; ITALY: ENEL, ENEA/DISP, and ENEA/NUC-RIN; SPAIN: Universidad Politecnica de Madrid (UPM) and Consejo de Seguridad Nuclear; USA: Brookhaven National Laboratory, US NRC and DOE.

  4. AEST: Adaptive Eigenvalue Stability Code

    NASA Astrophysics Data System (ADS)

    Zheng, L.-J.; Kotschenreuther, M.; Waelbroeck, F.; van Dam, J. W.; Berk, H.

    2002-11-01

    An adaptive eigenvalue linear stability code is developed. The aim is on one hand to include the non-ideal MHD effects into the global MHD stability calculation for both low and high n modes and on the other hand to resolve the numerical difficulty involving the MHD singularity on rational surfaces at marginal stability. Our code follows parts of the philosophy of DCON by abandoning relaxation methods based on radial finite element expansion in favor of an efficient shooting procedure with adaptive gridding. The δW criterion is replaced by the shooting procedure and subsequent matrix eigenvalue problem. Since the technique of expanding a general solution into a summation of independent solutions is employed, the rank of the matrices involved is just a few hundred. This makes it easier to solve the eigenvalue problem with non-ideal MHD effects, such as FLR or even full kinetic effects, as well as the plasma rotation effect, taken into account. To include kinetic effects, the approach of solving for the distribution function as a local eigenvalue ω problem as in the GS2 code will be employed in the future. Comparison of the ideal MHD version of the code with DCON, PEST, and GATO will be discussed. The non-ideal MHD version of the code will be employed to study as an application the transport barrier physics in tokamak discharges.
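
The shooting-plus-eigenvalue idea can be illustrated on a far simpler problem than the MHD equations AEST solves; the toy boundary-value problem below is purely illustrative:

```python
import math

# Toy shooting illustration (not AEST's equations): find the lowest eigenvalue
# of -u'' = lam*u with u(0) = u(1) = 0, whose exact value is pi**2, by
# integrating from x = 0 and bisecting on lam until the boundary residual
# u(1) changes sign.
def shoot(lam, n=2000):
    h = 1.0 / n
    u, du = 0.0, 1.0                  # normalize the free slope u'(0) = 1
    for _ in range(n):
        u += h * du                   # semi-implicit Euler step
        du -= h * lam * u
    return u                          # residual: vanishes at an eigenvalue

lo, hi = 5.0, 15.0                    # residuals at 5 and 15 bracket pi**2
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if shoot(lo) * shoot(mid) <= 0:   # sign change: eigenvalue in [lo, mid]
        hi = mid
    else:
        lo = mid
eigenvalue = 0.5 * (lo + hi)          # close to pi**2 ~ 9.8696
```

AEST replaces the scalar residual with a matrix built from the independent solutions, so the final step is a small matrix eigenvalue problem rather than a sign test.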

  5. Constructions for finite-state codes

    NASA Technical Reports Server (NTRS)

    Pollara, F.; Mceliece, R. J.; Abdel-Ghaffar, K.

    1987-01-01

    A class of codes called finite-state (FS) codes is defined and investigated. These codes, which generalize both block and convolutional codes, are defined by their encoders, which are finite-state machines with parallel inputs and outputs. A family of upper bounds on the free distance of a given FS code is derived from known upper bounds on the minimum distance of block codes. A general construction for FS codes is then given, based on the idea of partitioning a given linear block code into cosets of one of its subcodes, and it is shown that in many cases the FS codes constructed in this way have a d_free which is as large as possible. These codes are found without the need for lengthy computer searches, and have potential applications for future deep-space coding systems. The issue of catastrophic error propagation (CEP) for FS codes is also investigated.
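
The coset-partition step of the construction can be sketched on a small example; the particular code and subcode below are hypothetical illustrations, not the paper's constructions:

```python
# Partition a small binary linear block code C into cosets of a subcode S.
# Illustrative choice: C is the [4,3] single-parity-check code (all even-weight
# words) and S is the subcode generated by 1100 and 0011.
def span(gens):
    words = {(0,) * len(gens[0])}
    for g in gens:
        words |= {tuple((a + b) % 2 for a, b in zip(w, g)) for w in words}
    return words

C = span([(1, 1, 0, 0), (0, 1, 1, 0), (0, 0, 1, 1)])   # 8 codewords
S = span([(1, 1, 0, 0), (0, 0, 1, 1)])                 # 4-word subcode

cosets = {}
for c in C:
    # canonical coset leader: lexicographically smallest word in c + S
    rep = min(tuple((a + b) % 2 for a, b in zip(c, s)) for s in S)
    cosets.setdefault(rep, set()).add(c)
# |C| / |S| = 8 / 4 = 2 cosets, each of size 4
```

In the FS-code construction, the encoder's state transitions select which coset the next output block is drawn from.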

  6. A coded tracking telemetry system

    USGS Publications Warehouse

    Howey, P.W.; Seegar, W.S.; Fuller, M.R.; Titus, K.

    1989-01-01

    We describe the general characteristics of an automated radio telemetry system designed to operate for prolonged periods on a single frequency. Each transmitter sends a unique coded signal to a receiving system that encodes and records only the appropriate, pre-programmed codes. A record of the time of each reception is stored on diskettes in a micro-computer. This system enables continuous monitoring of infrequent signals (e.g. one per minute or one per hour), thus extending operation life or allowing size reduction of the transmitter, compared to conventional wildlife telemetry. Furthermore, when using unique codes transmitted on a single frequency, biologists can monitor many individuals without exceeding the radio frequency allocations for wildlife.
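
The receiver's accept-and-timestamp logic can be sketched as follows; the code values and function names are hypothetical:

```python
# Hypothetical sketch of the single-frequency coded-telemetry receiver: many
# transmitters share one frequency, each sending a unique ID code, and the
# receiver logs only pre-programmed codes together with a reception time.
ACCEPTED = {0x2A7, 0x13C, 0x0F1}       # codes programmed into the receiver

def log_reception(log, code, timestamp):
    if code in ACCEPTED:                # any unknown code is simply discarded
        log.append((timestamp, code))
    return log

log = []
log_reception(log, 0x2A7, 1000.0)
log_reception(log, 0x999, 1001.5)       # not pre-programmed: ignored
log_reception(log, 0x13C, 1002.0)
# log now holds one timestamped entry per accepted reception
```

Because only matching codes are recorded, transmitters can beacon very infrequently, which is what extends battery life or shrinks the transmitter.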

  7. FORWARD Codes: Now with Widgets!

    NASA Astrophysics Data System (ADS)

    Forland, B.; Gibson, S. E.; Kucera, T. A.

    2013-05-01

    The FORWARD suite of SolarSoft IDL codes converts an analytic model or simulation data cube into a form directly comparable to observations. Observables such as extreme ultraviolet, soft X-ray, white light, and polarization images from the Coronal Multichannel Polarimeter (CoMP) can be reproduced. The observer's viewpoint is also incorporated in the forward analysis and the codes can output the results in a variety of forms in order to easily create movies, Carrington maps, or simply observable information at a particular point in the plane of the sky. We present a newly developed front end to the FORWARD codes which utilizes IDL widgets to facilitate ease of use by the solar physics community. Our ultimate goal is to provide as useful a tool as possible for a broad range of scientific applications.

  8. ASME Code Efforts Supporting HTGRs

    SciTech Connect

    D.K. Morton

    2012-09-01

    In 1999, an international collaborative initiative for the development of advanced (Generation IV) reactors was started. The idea behind this effort was to bring nuclear energy closer to the needs of sustainability, to increase proliferation resistance, and to support concepts able to produce energy (both electricity and process heat) at competitive costs. The U.S. Department of Energy has supported this effort by pursuing the development of the Next Generation Nuclear Plant, a high temperature gas-cooled reactor. This support has included research and development of pertinent data, initial regulatory discussions, and engineering support of various codes and standards development. This report discusses the various applicable American Society of Mechanical Engineers (ASME) codes and standards that are being developed to support these high temperature gas-cooled reactors during construction and operation. ASME is aggressively pursuing these codes and standards to support an international effort to build the next generation of advanced reactors so that all can benefit.

  9. Pulse code modulated signal synchronizer

    NASA Technical Reports Server (NTRS)

    Kobayashi, H. S. (Inventor)

    1974-01-01

    A bit synchronizer for a split phase PCM transmission is reported that includes three loop circuits which receive incoming phase coded PCM signals. In the first loop, called a Q-loop, a generated, phase coded, PCM signal is multiplied with the incoming signals, and the frequency and phase of the generated signal are nulled to that of the incoming subcarrier signal. In the second loop, called a B-loop, a circuit multiplies a generated signal with incoming signals to null the phase of the generated signal in a bit phase locked relationship to the incoming signal. In a third loop, called the I-loop, a phase coded PCM signal is multiplied with the incoming signals for decoding the bit information from the PCM signal. A counter means is used for timing of the generated signals and timing of sample intervals for each bit period.

  10. ASME Code Efforts Supporting HTGRs

    SciTech Connect

    D.K. Morton

    2011-09-01

    In 1999, an international collaborative initiative for the development of advanced (Generation IV) reactors was started. The idea behind this effort was to bring nuclear energy closer to the needs of sustainability, to increase proliferation resistance, and to support concepts able to produce energy (both electricity and process heat) at competitive costs. The U.S. Department of Energy has supported this effort by pursuing the development of the Next Generation Nuclear Plant, a high temperature gas-cooled reactor. This support has included research and development of pertinent data, initial regulatory discussions, and engineering support of various codes and standards development. This report discusses the various applicable American Society of Mechanical Engineers (ASME) codes and standards that are being developed to support these high temperature gas-cooled reactors during construction and operation. ASME is aggressively pursuing these codes and standards to support an international effort to build the next generation of advanced reactors so that all can benefit.

  11. ASME Code Efforts Supporting HTGRs

    SciTech Connect

    D.K. Morton

    2010-09-01

    In 1999, an international collaborative initiative for the development of advanced (Generation IV) reactors was started. The idea behind this effort was to bring nuclear energy closer to the needs of sustainability, to increase proliferation resistance, and to support concepts able to produce energy (both electricity and process heat) at competitive costs. The U.S. Department of Energy has supported this effort by pursuing the development of the Next Generation Nuclear Plant, a high temperature gas-cooled reactor. This support has included research and development of pertinent data, initial regulatory discussions, and engineering support of various codes and standards development. This report discusses the various applicable American Society of Mechanical Engineers (ASME) codes and standards that are being developed to support these high temperature gas-cooled reactors during construction and operation. ASME is aggressively pursuing these codes and standards to support an international effort to build the next generation of advanced reactors so that all can benefit.

  12. FORWARD Codes: Now with Widget!

    NASA Astrophysics Data System (ADS)

    Gibson, Sarah; Forland, B.; Kucera, T. A.

    2013-07-01

    The FORWARD suite of SolarSoft IDL codes converts an analytic or simulation data cube into a form directly comparable to observations. Observables such as extreme ultraviolet, soft X-ray, white light, and polarization images from the Coronal Multichannel Polarimeter (CoMP) can be reproduced. The observer's viewpoint is also incorporated in the forward analysis and the codes can output the results in a variety of forms in order to easily create movies, Carrington maps, or simply plasma properties at a particular point in the plane of the sky. We present a newly developed front end to the FORWARD codes which utilizes IDL widgets. Our ultimate goal is to provide as useful a tool as possible for a broad range of scientific applications.

  13. Sensor Authentication: Embedded Processor Code

    SciTech Connect

    Svoboda, John

    2012-09-25

    Described is the C code running on the embedded Microchip 32-bit PIC32MX575F256H located on the INL-developed noise analysis circuit board. The code performs the following functions: controls the noise analysis circuit board preamplifier voltage gains of 1, 10, 100, 1000; initializes the analog-to-digital conversion hardware, input channel selection, Fast Fourier Transform (FFT) function, USB communications interface, and internal memory allocations; initiates high-resolution 4096-point 200 kHz data acquisition; computes the complex 2048-point FFT and FFT magnitude; services the Host command set; transfers raw data to the Host; transfers the FFT result to the Host; and performs communication error checking.
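
The acquire-then-FFT-magnitude chain can be sketched in Python (the board itself runs C; the sample counts here are scaled down and purely illustrative):

```python
import cmath
import math

# Sketch of the record's processing chain: take N samples, compute a radix-2
# FFT, and keep the magnitude of the first N/2 bins (the one-sided spectrum).
def fft(x):
    n = len(x)                            # n must be a power of two
    if n == 1:
        return list(x)
    even, odd = fft(x[0::2]), fft(x[1::2])
    tw = [cmath.exp(-2j * cmath.pi * k / n) * odd[k] for k in range(n // 2)]
    return [even[k] + tw[k] for k in range(n // 2)] + \
           [even[k] - tw[k] for k in range(n // 2)]

def fft_magnitude(samples):
    spec = fft(samples)
    return [abs(v) for v in spec[:len(spec) // 2]]

# a pure tone at bin 3 of a 64-sample record peaks at magnitude index 3
samples = [math.sin(2 * math.pi * 3 * i / 64) for i in range(64)]
mag = fft_magnitude(samples)
```

On the board, the same split (4096 acquired samples, 2048 retained FFT points) corresponds to keeping the one-sided complex spectrum of the real-valued acquisition.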

  14. Hybrid codes: Methods and applications

    SciTech Connect

    Winske, D.; Omidi, N.

    1991-01-01

    In this chapter we discuss "hybrid" algorithms used in the study of low frequency electromagnetic phenomena, where one or more ion species are treated kinetically via standard PIC methods used in particle codes and the electrons are treated as a single charge-neutralizing massless fluid. Other types of hybrid models are possible, as discussed by Winske and Quest, but hybrid codes with particle ions and massless fluid electrons have become the most common for simulating space plasma physics phenomena in the last decade, as we discuss in this paper.

  15. COLAcode: COmoving Lagrangian Acceleration code

    NASA Astrophysics Data System (ADS)

    Tassev, Svetlin V.

    2016-02-01

    COLAcode is a serial particle mesh-based N-body code illustrating the COLA (COmoving Lagrangian Acceleration) method; it solves for Large Scale Structure (LSS) in a frame that is comoving with observers following trajectories calculated in Lagrangian Perturbation Theory (LPT). It differs from standard N-body codes by trading accuracy at small scales to gain computational speed without sacrificing accuracy at large scales. This is useful for generating large ensembles of accurate mock halo catalogs required to study galaxy clustering and weak lensing; such catalogs are needed to perform detailed error analysis for ongoing and future surveys of LSS.

  16. Multiple component codes based generalized LDPC codes for high-speed optical transport.

    PubMed

    Djordjevic, Ivan B; Wang, Ting

    2014-07-14

    A class of generalized low-density parity-check (GLDPC) codes suitable for optical communications is proposed, which consists of multiple local codes. It is shown that Hamming, BCH, and Reed-Muller codes can be used as local codes, and that the maximum a posteriori probability (MAP) decoding of these local codes by the Ashikhmin-Lytsin algorithm is feasible in terms of complexity and performance. We demonstrate that record coding gains can be obtained from properly designed GLDPC codes, derived from multiple component codes. We then show that several recently proposed classes of LDPC codes such as convolutional and spatially-coupled codes can be described using the concept of GLDPC coding, which indicates that the GLDPC coding can be used as a unified platform for advanced FEC enabling ultra-high speed optical transport. The proposed class of GLDPC codes is also suitable for code-rate adaptation, to adjust the error correction strength depending on the optical channel conditions.
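
One of the named local codes, the [7,4] Hamming code, can be made concrete as follows. Note the paper decodes local codes with MAP decoding via the Ashikhmin-Lytsin algorithm; this sketch substitutes plain hard-decision syndrome decoding only to show the local-code role:

```python
# The [7,4] Hamming code as a GLDPC "local code", with single-error
# syndrome decoding.  Row r of H has a 1 wherever bit r of the 1-indexed
# column number is set, so a nonzero syndrome reads out the error position.
H = [(1, 0, 1, 0, 1, 0, 1),
     (0, 1, 1, 0, 0, 1, 1),
     (0, 0, 0, 1, 1, 1, 1)]

def syndrome(word):
    return tuple(sum(h[i] * word[i] for i in range(7)) % 2 for h in H)

def correct(word):
    s = syndrome(word)
    pos = s[0] + 2 * s[1] + 4 * s[2]     # 0 means no detectable error
    if pos:
        word = list(word)
        word[pos - 1] ^= 1               # flip the indicated bit
    return tuple(word)

codeword = (0, 0, 1, 0, 1, 1, 0)         # syndrome(codeword) == (0, 0, 0)
received = (0, 1, 1, 0, 1, 1, 0)         # one bit flipped in transit
# correct(received) recovers codeword
```

In a GLDPC code, many such local decoders run over overlapping subsets of the global codeword, exchanging (in the MAP case, soft) information between iterations.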

  17. Code Mixing and Modernization across Cultures.

    ERIC Educational Resources Information Center

    Kamwangamalu, Nkonko M.

    A review of recent studies addressed the functional uses of code mixing across cultures. Expressions of code mixing (CM) are not random; in fact, a number of functions of code mixing can easily be delineated, for example, the concept of "modernization." "Modernization" is viewed with respect to how bilingual code mixers perceive themselves, how…

  18. An Interactive Concatenated Turbo Coding System

    NASA Technical Reports Server (NTRS)

    Liu, Ye; Tang, Heng; Lin, Shu; Fossorier, Marc

    1999-01-01

    This paper presents a concatenated turbo coding system in which a Reed-Solomon outer code is concatenated with a binary turbo inner code. In the proposed system, the outer code decoder and the inner turbo code decoder interact to achieve both good bit error and frame error performances. The outer code decoder helps the inner turbo code decoder to terminate its decoding iteration while the inner turbo code decoder provides soft-output information to the outer code decoder to carry out a reliability-based soft-decision decoding. In the case that the outer code decoding fails, the outer code decoder instructs the inner code decoder to continue its decoding iterations until the outer code decoding is successful or a preset maximum number of decoding iterations is reached. This interaction between outer and inner code decoders reduces decoding delay. Also presented in the paper are an effective criterion for stopping the iteration process of the inner code decoder and a new reliability-based decoding algorithm for nonbinary codes.

  19. Bounding the distance of quantum surface codes

    NASA Astrophysics Data System (ADS)

    Fetaya, Ethan

    2012-06-01

    Homological quantum codes (also called topological codes) are low density parity check error correcting codes that come from surfaces and higher dimension manifolds. Homological codes from surfaces, i.e., surface codes, have also been suggested as a possible way to construct stable quantum memory and fault-tolerant computation. It has been conjectured that all homological codes have a square root bound on their distance and therefore cannot produce good codes. This claim has been disputed in dimension four using the geometric property of systolic freedom. We will show in this paper that the conjecture holds in dimension two due to the negation of systolic freedom, i.e., systolic rigidity.

  20. Entanglement-assisted quantum convolutional coding

    SciTech Connect

    Wilde, Mark M.; Brun, Todd A.

    2010-04-15

    We show how to protect a stream of quantum information from decoherence induced by a noisy quantum communication channel. We exploit preshared entanglement and a convolutional coding structure to develop a theory of entanglement-assisted quantum convolutional coding. Our construction produces a Calderbank-Shor-Steane (CSS) entanglement-assisted quantum convolutional code from two arbitrary classical binary convolutional codes. The rate and error-correcting properties of the classical convolutional codes directly determine the corresponding properties of the resulting entanglement-assisted quantum convolutional code. We explain how to encode our CSS entanglement-assisted quantum convolutional codes starting from a stream of information qubits, ancilla qubits, and shared entangled bits.

  1. QR Codes: Taking Collections Further

    ERIC Educational Resources Information Center

    Ahearn, Caitlin

    2014-01-01

    With some thought and direction, QR (quick response) codes are a great tool to use in school libraries to enhance access to information. From March through April 2013, Caitlin Ahearn interned at Sanborn Regional High School (SRHS) under the supervision of Pam Harland. As a result of Harland's un-Deweying of the nonfiction collection at SRHS,…

  2. GOES satellite time code dissemination

    NASA Technical Reports Server (NTRS)

    Beehler, R. E.

    1983-01-01

    The GOES time code system, the performance achieved to date, and some potential improvements in the future are discussed. The disseminated time code is originated from a triply redundant set of atomic standards, time code generators and related equipment maintained by NBS at NOAA's Wallops Island, VA satellite control facility. It is relayed by two GOES satellites located at 75 W and 135 W longitude on a continuous basis to users within North and South America (with overlapping coverage) and well out into the Atlantic and Pacific ocean areas. Downlink frequencies are near 468 MHz. The signals from both satellites are monitored and controlled from the NBS labs at Boulder, CO with additional monitoring input from geographically separated receivers in Washington, D.C. and Hawaii. Performance experience with the received time codes for periods ranging from several years to one day is discussed. Results are also presented for simultaneous, common-view reception by co-located receivers and by receivers separated by several thousand kilometers.

  3. Tri-Coding of Information.

    ERIC Educational Resources Information Center

    Simpson, Timothy J.

    Paivio's Dual Coding Theory has received widespread recognition for its connection between visual and aural channels of internal information processing. The use of only two channels, however, cannot satisfactorily explain the effects witnessed every day. This paper presents a study suggesting the presence of a third, kinesthetic channel, currently…

  4. AEDS Property Classification Code Manual.

    ERIC Educational Resources Information Center

    Association for Educational Data Systems, Washington, DC.

    The control and inventory of property items using data processing machines requires a form of numerical description or code which will allow a maximum of description in a minimum of space on the data card. An adaptation of a standard industrial classification system is given to cover any expendable warehouse item or non-expendable piece of…

  5. Generating Constant Weight Binary Codes

    ERIC Educational Resources Information Center

    Knight, D.G.

    2008-01-01

    The determination of bounds for A(n, d, w), the maximum possible number of binary vectors of length n, weight w, and pairwise Hamming distance no less than d, is a classic problem in coding theory. Such sets of vectors have many applications. A description is given of how the problem can be used in a first-year undergraduate computational…
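For small parameters, A(n, d, w) can be determined by brute force, which is exactly the kind of first-year computational exercise the abstract alludes to. The sketch below verifies A(6, 4, 3) = 4 by representing constant-weight words as support sets (for equal-weight words, Hamming distance is twice the number of non-shared support positions):

```python
from itertools import combinations

# Brute-force determination of A(6, 4, 3): the largest binary code of
# length 6, constant weight 3, and pairwise Hamming distance >= 4.
n, d, w = 6, 4, 3
vectors = list(combinations(range(n), w))   # weight-w words as support sets

def dist(a, b):
    # For equal-weight words, distance = 2 * (w - |support overlap|).
    return 2 * (w - len(set(a) & set(b)))

def feasible(code):
    return all(dist(a, b) >= d for a, b in combinations(code, 2))

# A 4-word code exists, but no 5-word code does, so A(6, 4, 3) = 4.
four = next(c for c in combinations(vectors, 4) if feasible(c))
assert feasible(four)
assert not any(feasible(c) for c in combinations(vectors, 5))
```

The same exhaustive approach scales only to very small n, which motivates the bounding techniques studied in coding theory.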

  6. Overview of CODE V development

    NASA Astrophysics Data System (ADS)

    Harris, Thomas I.

    1991-01-01

    This paper is part of a session that is aimed at briefly describing some of today''s optical design software packages with emphasis on the program''s philosophy and technology. CODE V is the ongoing result of a development process that began in the 1960''s it is now the result of many people''s efforts. This paper summarizes the roots of the program some of its history dominant philosophies and technologies that have contributed to its usefulness and some that drive its continued development. ROOTS OF CODE V Conceived in the early 60''s This was at a time when there was skepticism that " automatic design" could design lenses equal or better than " hand" methods. The concepts underlying CODE V and its predecessors were based on ten years of experience and exposure to the problems of a group of lens designers in a design-for-manufacture environment. The basic challenge was to show that lens design could be done better easier and faster by high quality computer-assisted design tools. The earliest development was for our own use as an engineering services organization -an in-house tool for custom design. As a tool it had to make us efficient in providing lens design and engineering services as a self-sustaining business. PHILOSOPHY OF OVTIM!ZATION IN CODE V Error function formation Based on experience as a designer we felt very strongly that there should be a clear separation of

  7. Three-dimensional stellarator codes

    PubMed Central

    Garabedian, P. R.

    2002-01-01

    Three-dimensional computer codes have been used to develop quasisymmetric stellarators with modular coils that are promising candidates for a magnetic fusion reactor. The mathematics of plasma confinement raises serious questions about the numerical calculations. Convergence studies have been performed to assess the best configurations. Comparisons with recent data from large stellarator experiments serve to validate the theory. PMID:12140367

  8. FORTRAN Static Source Code Analyzer

    NASA Technical Reports Server (NTRS)

    Merwarth, P.

    1982-01-01

    FORTRAN Static Source Code Analyzer program (SAP) automatically gathers and reports statistics on occurrences of statements and structures within FORTRAN program. Provisions are made for weighting each statistic, providing user with overall figure of complexity. Statistics, as well as figures of complexity, are gathered on module-by-module basis. Overall summed statistics are accumulated for complete input source file.
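A toy analogue of SAP's statistic gathering can be sketched as follows; the statement categories and weights here are hypothetical stand-ins for SAP's user-weighted statistics, applied to a fixed-form FORTRAN fragment:

```python
# Toy statement analyzer in the spirit of SAP: count statement kinds and
# combine them with (hypothetical) per-statement weights into an overall
# figure of complexity.
WEIGHTS = {"IF": 3, "DO": 3, "GOTO": 4, "CALL": 2, "OTHER": 1}

def classify(stmt):
    for kw in ("IF", "DO", "GOTO", "CALL"):
        if stmt.startswith(kw):
            return kw
    return "OTHER"

def analyze(source):
    """Return per-statement counts and the weighted complexity figure."""
    stats = dict.fromkeys(WEIGHTS, 0)
    for line in source.upper().splitlines():
        if line[:1] in ("C", "*"):      # fixed-form comment card
            continue
        stmt = line[6:].strip()         # statement field starts in column 7
        if stmt:
            stats[classify(stmt)] += 1
    return stats, sum(WEIGHTS[k] * v for k, v in stats.items())

src = """C     SAMPLE MODULE
      DO 10 I = 1, N
      IF (A(I) .GT. 0) CALL POS(A(I))
   10 CONTINUE
"""
stats, complexity = analyze(src)
assert stats["DO"] == 1 and stats["IF"] == 1
assert complexity == 7                  # 3 (DO) + 3 (IF) + 1 (CONTINUE)
```

SAP computed such statistics module by module and summed them for the whole input file; the sketch shows only the per-module step.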

  9. FORTRAN Static Source Code Analyzer

    NASA Technical Reports Server (NTRS)

    Merwarth, P.

    1984-01-01

    FORTRAN Static Source Code Analyzer program, SAP (DEC VAX version), automatically gathers statistics on occurrences of statements and structures within FORTRAN program and provides reports of those statistics. Provisions made for weighting each statistic and provide an overall figure of complexity.

  10. Authentication codes that permit arbitration

    SciTech Connect

    Simmons, G.J.

    1987-01-01

    The objective of authentication is to detect attempted deceptions in a communications channel. Traditionally this has been restricted to providing the authorized receiver with a capability of detecting unauthentic messages. The known codes have all left open the possibility for either the transmitter to disavow a message that he actually sent to the receiver, i.e., an authentic message, or else for the receiver to falsely attribute a message of his own devising to the transmitter. Of course the party being deceived would know that he was the victim of a deception by the other, but would be unable to 'prove' this to a third party. Ideally, authentication should provide a means to detect attempted deceptions by insiders (the transmitter or receiver) as well as outsiders (the opponent). It has been an open question whether it was possible to devise authentication codes that would permit a third party, an arbiter, to decide (in probability) whether the transmitter or the receiver was cheating in the event of a dispute. We answer this question in the affirmative, first by constructing an example of an authentication code that both permits the receiver to detect outsider deceptions and permits a designated arbiter to detect insider deceptions, and then by generalizing this construction to an infinite class of such codes.
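As a toy illustration of the basic authentication-code idea (detecting outsider deceptions only), consider an affine tag over a prime field. This is a generic universal-hash-style example, not the arbitrated construction of the abstract, and p, a, b are arbitrary illustrative values:

```python
# Toy affine authentication code over a prime field: tag = a*m + b (mod p).
# The shared key (a, b) lets the receiver detect an outsider's substitution;
# it does NOT provide the arbitration property the paper constructs.
p = 1009
key = (123, 456)          # secret shared by transmitter and receiver

def tag(m, key):
    a, b = key
    return (a * m + b) % p

def accept(m, t, key):
    return tag(m, key) == t

m, t = 42, tag(42, key)
assert accept(m, t, key)            # authentic message passes
assert not accept(m + 1, t, key)    # an outsider's altered message is caught
```

An opponent who does not know the key can forge a valid (message, tag) pair only by guessing, with success probability on the order of 1/p; arbitration additionally requires key material held by the arbiter, which this sketch omits.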

  11. Computer Code Generates Homotopic Grids

    NASA Technical Reports Server (NTRS)

    Moitra, Anutosh

    1992-01-01

    HOMAR is a computer code that uses a homotopic procedure to produce two-dimensional grids in cross-sectional planes; these grids are then stacked to produce quasi-three-dimensional grid systems for aerospace configurations. The program produces grids for use in both Euler and Navier-Stokes computations of flows. Written in FORTRAN 77.

  12. Coded continuous wave meteor radar

    NASA Astrophysics Data System (ADS)

    Vierinen, Juha; Chau, Jorge L.; Pfeffer, Nico; Clahsen, Matthias; Stober, Gunter

    2016-03-01

    The concept of a coded continuous wave specular meteor radar (SMR) is described. The radar uses a continuously transmitted pseudorandom phase-modulated waveform, which has several advantages compared to conventional pulsed SMRs. The coding avoids range and Doppler aliasing, which are in some cases problematic with pulsed radars. Continuous transmissions maximize pulse compression gain, allowing operation at lower peak power than a pulsed system. With continuous coding, the temporal and spectral resolution are not dependent on the transmit waveform and they can be fairly flexibly changed after performing a measurement. The low signal-to-noise ratio before pulse compression, combined with independent pseudorandom transmit waveforms, allows multiple geographically separated transmitters to be used in the same frequency band simultaneously without significantly interfering with each other. Because the same frequency band can be used by multiple transmitters, the same interferometric receiver antennas can be used to receive multiple transmitters at the same time. The principles of the signal processing are discussed, in addition to discussion of several practical ways to increase computation speed, and how to optimally detect meteor echoes. Measurements from a campaign performed with a coded continuous wave SMR are shown and compared with two standard pulsed SMR measurements. The type of meteor radar described in this paper would be suited for use in a large-scale multi-static network of meteor radar transmitters and receivers. Such a system would be useful for increasing the number of meteor detections to obtain improved meteor radar data products.
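The pulse-compression principle behind the coded CW radar can be sketched with a pseudorandom binary phase code and a correlator; the code length, delay, and echo amplitude below are illustrative:

```python
import random

# Pseudorandom binary phase code (+1/-1 chips), a stand-in for the
# continuously transmitted pseudorandom waveform described above.
random.seed(1)
N = 1000
code = [random.choice((-1, 1)) for _ in range(N)]

delay = 137                                     # unknown target range (samples)
echo = [0.0] * delay + [0.1 * c for c in code]  # weak, delayed copy of the code

def xcorr(rx, tx, lag):
    """Correlate the received signal against the transmit code at one lag."""
    return sum(r * t for r, t in zip(rx[lag:], tx))

# Pulse compression: the correlation peak recovers the delay even though the
# per-sample echo amplitude is far below the code amplitude (gain ~ N).
est = max(range(len(echo) - N + 1), key=lambda lag: abs(xcorr(echo, code, lag)))
assert est == delay
```

Because independent pseudorandom codes are nearly uncorrelated, a second transmitter's code would contribute only sidelobe-level output here, which is the property that lets multiple transmitters share a band in the multi-static networks the paper envisions.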

  13. Transversal Clifford gates on folded surface codes

    NASA Astrophysics Data System (ADS)

    Moussa, Jonathan E.

    2016-10-01

    Surface and color codes are two forms of topological quantum error correction in two spatial dimensions with complementary properties. Surface codes have lower-depth error detection circuits and well-developed decoders to interpret and correct errors, while color codes have transversal Clifford gates and better code efficiency in the number of physical qubits needed to achieve a given code distance. A formal equivalence exists between color codes and folded surface codes, but it does not guarantee the transferability of any of these favorable properties. However, the equivalence does imply the existence of constant-depth circuit implementations of logical Clifford gates on folded surface codes. We achieve and improve this result by constructing two families of folded surface codes with transversal Clifford gates. This construction is presented generally for qudits of any dimension. The specific application of these codes to universal quantum computation based on qubit fusion is also discussed.

  14. Accumulate-Repeat-Accumulate-Accumulate Codes

    NASA Technical Reports Server (NTRS)

    Divsalar, Dariush; Dolinar, Samuel; Thorpe, Jeremy

    2007-01-01

    Accumulate-repeat-accumulate-accumulate (ARAA) codes have been proposed, inspired by the recently proposed accumulate-repeat-accumulate (ARA) codes. These are error-correcting codes suitable for use in a variety of wireless data-communication systems that include noisy channels. ARAA codes can be regarded as serial turbo-like codes or as a subclass of low-density parity-check (LDPC) codes, and, like ARA codes, they have projected graph or protograph representations; these characteristics make it possible to design high-speed iterative decoders that utilize belief-propagation algorithms. The objective in proposing ARAA codes as a subclass of ARA codes was to enhance the error-floor performance of ARA codes while maintaining simple encoding structures and low maximum variable node degree.
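The accumulator building block in these codes is just a running XOR (a rate-1 1/(1+D) convolution). A toy repeat-accumulate encoder illustrates it; the actual ARAA construction chains further accumulators and uses designed protographs, which this sketch omits:

```python
import random

def accumulate(bits):
    """Accumulator: running XOR of the input bit stream."""
    out, s = [], 0
    for b in bits:
        s ^= b
        out.append(s)
    return out

def ra_encode(info, q=3, seed=0):
    """Toy repeat-accumulate encoder: repeat q times, permute, accumulate.
    The interleaver permutation here is an arbitrary illustrative choice."""
    repeated = [b for b in info for _ in range(q)]
    perm = list(range(len(repeated)))
    random.Random(seed).shuffle(perm)
    return accumulate([repeated[i] for i in perm])

assert accumulate([1, 1, 0, 0]) == [1, 0, 0, 0]
cw = ra_encode([1, 0, 1, 1])
assert len(cw) == 12            # rate 1/q: 4 info bits -> 12 coded bits
```

The accumulator is what keeps encoding simple (one XOR per bit) while the repetition and interleaving create the sparse-graph structure that belief propagation decodes.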

  15. The neuronal code(s) of the cerebellum.

    PubMed

    Heck, Detlef H; De Zeeuw, Chris I; Jaeger, Dieter; Khodakhah, Kamran; Person, Abigail L

    2013-11-01

    Understanding how neurons encode information in sequences of action potentials is of fundamental importance to neuroscience. The cerebellum is widely recognized for its involvement in the coordination of movements, which requires muscle activation patterns to be controlled with millisecond precision. Understanding how cerebellar neurons accomplish such high temporal precision is critical to understanding cerebellar function. Inhibitory Purkinje cells, the only output neurons of the cerebellar cortex, and their postsynaptic target neurons in the cerebellar nuclei, fire action potentials at high, sustained frequencies, suggesting spike rate modulation as a possible code. Yet, millisecond precise spatiotemporal spike activity patterns in Purkinje cells and inferior olivary neurons have also been observed. These results and ongoing studies suggest that the neuronal code used by cerebellar neurons may span a wide time scale from millisecond precision to slow rate modulations, likely depending on the behavioral context. PMID:24198351

  16. Amino acid codes in mitochondria as possible clues to primitive codes

    NASA Technical Reports Server (NTRS)

    Jukes, T. H.

    1981-01-01

    Differences between mitochondrial codes and the universal code indicate that an evolutionary simplification has taken place, rather than a return to a more primitive code. However, these differences make it evident that the universal code is not the only code possible, and therefore earlier codes may have differed markedly from the present code. The present universal code is probably a 'frozen accident.' The change in CUN codons from leucine to threonine (Neurospora vs. yeast mitochondria) indicates that neutral or near-neutral changes occurred in the corresponding proteins when this code change took place, caused presumably by a mutation in a tRNA gene.
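The CUN reassignment discussed above (leucine in the universal code, threonine in yeast mitochondria, per NCBI translation tables 1 and 3) can be expressed as a simple lookup-table difference:

```python
# Side-by-side lookup for the four CUN codons: leucine in the standard code
# versus threonine in the yeast mitochondrial code.
CUN = ("CUU", "CUC", "CUA", "CUG")
STANDARD = {c: "Leu" for c in CUN}
YEAST_MITO = {c: "Thr" for c in CUN}

def translate(rna, table):
    """Translate an RNA string codon by codon with the given (partial) table."""
    return [table.get(rna[i:i + 3], "?") for i in range(0, len(rna) - 2, 3)]

assert translate("CUACUG", STANDARD) == ["Leu", "Leu"]
assert translate("CUACUG", YEAST_MITO) == ["Thr", "Thr"]
```

The same CUN-rich sequence thus yields different proteins under the two codes, which is why the abstract argues the substitution must have been neutral or near-neutral when the reassignment occurred.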

  17. Genetic coding and gene expression - new Quadruplet genetic coding model

    NASA Astrophysics Data System (ADS)

    Shankar Singh, Rama

    2012-07-01

    Successful demonstration of the human genome project has opened the door not only to developing personalized medicine and cures for genetic diseases, but it may also answer the complex and difficult question of the origin of life. It may make the 21st century a century of the biological sciences as well. Based on the central dogma of biology, genetic codons in conjunction with tRNA play a key role in translating RNA bases into a sequence of amino acids, leading to a synthesized protein. This is the most critical step in synthesizing the right protein needed for personalized medicine and curing genetic diseases. So far, only triplet codons involving three bases of RNA, transcribed from DNA bases, have been used. Since this approach has several inconsistencies and limitations, even the promise of personalized medicine has not been realized. The new Quadruplet genetic coding model proposed and developed here involves all four RNA bases, which in conjunction with tRNA will synthesize the right protein. The transcription and translation process used will be the same, but the Quadruplet codons will help overcome most of the inconsistencies and limitations of the triplet codes. Details of this new Quadruplet genetic coding model and its subsequent potential applications, including relevance to the origin of life, will be presented.
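The codon-space arithmetic behind the proposal is straightforward to check:

```python
# Codon-space sizes: four RNA bases give 4^3 = 64 triplet codons (encoding
# the 20 amino acids plus stop signals) versus 4^4 = 256 quadruplet codons.
triplet_codons = 4 ** 3
quadruplet_codons = 4 ** 4
assert (triplet_codons, quadruplet_codons) == (64, 256)
```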

  18. Coding and transmission of subband coded images on the Internet

    NASA Astrophysics Data System (ADS)

    Wah, Benjamin W.; Su, Xiao

    2001-09-01

    Subband-coded images can be transmitted in the Internet using either the TCP or the UDP protocol. Delivery by TCP gives superior decoding quality but with very long delays when the network is unreliable, whereas delivery by UDP has negligible delays but with degraded quality when packets are lost. Although images are delivered currently over the Internet by TCP, we study in this paper the use of UDP to deliver multi-description reconstruction-based subband-coded images. First, in order to facilitate recovery from UDP packet losses, we propose a joint sender-receiver approach for designing optimized reconstruction-based subband transform (ORB-ST) in multi-description coding (MDC). Second, we carefully evaluate the delay-quality trade-offs between the TCP delivery of SDC images and the UDP and combined TCP/UDP delivery of MDC images. Experimental results show that our proposed ORB-ST performs well in real Internet tests, and UDP and combined TCP/UDP delivery of MDC images provide a range of attractive alternatives to TCP delivery.

  19. Box codes of lengths 48 and 72

    NASA Technical Reports Server (NTRS)

    Solomon, G.; Jin, Y.

    1993-01-01

    A self-dual code length 48, dimension 24, with Hamming distance essentially equal to 12 is constructed here. There are only six code words of weight eight. All the other code words have weights that are multiples of four and have a minimum weight equal to 12. This code may be encoded systematically and arises from a strict binary representation of the (8,4;5) Reed-Solomon (RS) code over GF (64). The code may be considered as six interrelated (8,7;2) codes. The Mattson-Solomon representation of the cyclic decomposition of these codes and their parity sums are used to detect an odd number of errors in any of the six codes. These may then be used in a correction algorithm for hard or soft decision decoding. A (72,36;15) box code was constructed from a (63,35;8) cyclic code. The theoretical justification is presented herein. A second (72,36;15) code is constructed from an inner (63,27;16) Bose Chaudhuri Hocquenghem (BCH) code and expanded to length 72 using box code algorithms for extension. This code was simulated and verified to have a minimum distance of 15 with even weight words congruent to zero modulo four. The decoding for hard and soft decision is still more complex than the first code constructed above. Finally, an (8,4;5) RS code over GF (512) in the binary representation of the (72,36;15) box code gives rise to a (72,36;16*) code with nine words of weight eight, and all the rest have weights greater than or equal to 16.

  1. The Mystery Behind the Code: Differentiated Instruction with Quick Response Codes in Secondary Physical Education

    ERIC Educational Resources Information Center

    Adkins, Megan; Wajciechowski, Misti R.; Scantling, Ed

    2013-01-01

    Quick response codes, better known as QR codes, are small barcodes scanned to receive information about a specific topic. This article explains QR code technology and the utility of QR codes in the delivery of physical education instruction. Consideration is given to how QR codes can be used to accommodate learners of varying ability levels as…

  2. Biological Information Transfer Beyond the Genetic Code: The Sugar Code

    NASA Astrophysics Data System (ADS)

    Gabius, H.-J.

    In the era of genetic engineering, cloning, and genome sequencing the focus of research on the genetic code has received an even further accentuation in the public eye. In attempting, however, to understand intra- and intercellular recognition processes comprehensively, the two biochemical dimensions established by nucleic acids and proteins are not sufficient to satisfactorily explain all molecular events in, for example, cell adhesion or routing. The consideration of further code systems is essential to bridge this gap. A third biochemical alphabet forming code words with an information storage capacity second to no other substance class in rather small units (words, sentences) is established by monosaccharides (letters). As hardware oligosaccharides surpass peptides by more than seven orders of magnitude in the theoretical ability to build isomers, when the total of conceivable hexamers is calculated. In addition to the sequence complexity, the use of magnetic resonance spectroscopy and molecular modeling has been instrumental in discovering that even small glycans can often reside in not only one but several distinct low-energy conformations (keys). Intriguingly, conformers can display notably different capacities to fit snugly into the binding site of nonhomologous receptors (locks). This process, experimentally verified for two classes of lectins, is termed "differential conformer selection." It adds potential for shifts of the conformer equilibrium to modulate ligand properties dynamically and reversibly to the well-known changes in sequence (including anomeric positioning and linkage points) and in pattern of substitution, for example, by sulfation. In the intimate interplay with sugar receptors (lectins, enzymes, and antibodies) the message of coding units of the sugar code is deciphered. Their recognition will trigger postbinding signaling and the intended biological response. Knowledge about the driving forces for the molecular rendezvous, i

  3. Alternative translation initiation site in the DA strain of Theiler's murine encephalomyelitis virus.

    PubMed Central

    Kong, W P; Roos, R P

    1991-01-01

    Polyprotein processing studies of Theiler's murine encephalomyelitis virus (TMEV), a group of mouse picornaviruses, demonstrated synthesis of a protein we have called l during in vitro translations from the RNA of DA, a demyelinating strain of TMEV, but not GDVII, an acute neurovirulent strain. We have proposed that l is synthesized from an alternative initiation site in the DA leader (L) coding area out of phase with the polyprotein reading frame (R. P. Roos, W.-P. Kong, and B. L. Semler, J. Virol. 63:5344-5353, 1989). We now provide support for this proposal from experiments involving in vitro translation of three separate mutations of an infectious DA cDNA clone: DA"l"-1, which contains a base mismatch at the putative initiation codon of l; DAL-1, which contains a base mismatch at the presumed authentic initiation site of L at the beginning of the polyprotein; and DAL:NheI, which contains nucleotides coding for a four-amino-acid insertion in the L coding area with a termination codon in the l reading frame. Our results demonstrate that the DA strain uses an alternative initiation site and reading frame to synthesize l in vitro. l may have a role in the biological activity of the virus. PMID:2033677

  4. Dopamine reward prediction error coding.

    PubMed

    Schultz, Wolfram

    2016-03-01

    Reward prediction errors consist of the differences between received and predicted rewards. They are crucial for basic forms of learning about rewards and make us strive for more rewards-an evolutionary beneficial trait. Most dopamine neurons in the midbrain of humans, monkeys, and rodents signal a reward prediction error; they are activated by more reward than predicted (positive prediction error), remain at baseline activity for fully predicted rewards, and show depressed activity with less reward than predicted (negative prediction error). The dopamine signal increases nonlinearly with reward value and codes formal economic utility. Drugs of addiction generate, hijack, and amplify the dopamine reward signal and induce exaggerated, uncontrolled dopamine effects on neuronal plasticity. The striatum, amygdala, and frontal cortex also show reward prediction error coding, but only in subpopulations of neurons. Thus, the important concept of reward prediction errors is implemented in neuronal hardware.
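The abstract's definition (prediction error = received minus predicted reward) can be sketched with a standard Rescorla-Wagner-style update; the learning rate alpha is an illustrative choice, not a value from the paper:

```python
# Minimal prediction-error learner: delta = received - predicted reward,
# and the prediction moves a fraction alpha of the way toward the reward.
def learn(rewards, alpha=0.2):
    v, deltas = 0.0, []
    for r in rewards:
        delta = r - v            # positive error: more reward than predicted
        deltas.append(delta)
        v += alpha * delta       # prediction updated by the error
    return v, deltas

v, deltas = learn([1.0] * 50)    # the same reward delivered repeatedly
assert deltas[0] == 1.0          # surprising at first: large positive error
assert abs(deltas[-1]) < 0.01    # fully predicted: error decays toward zero
```

This reproduces the qualitative dopamine pattern described above: strong responses to unexpected rewards that fade to baseline once the reward is fully predicted.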

  5. Visual analysis of code security

    SciTech Connect

    Goodall, John R; Radwan, Hassan; Halseth, Lenny

    2010-01-01

    To help increase the confidence that software is secure, researchers and vendors have developed different kinds of automated software security analysis tools. These tools analyze software for weaknesses and vulnerabilities, but the individual tools catch different vulnerabilities and produce voluminous data with many false positives. This paper describes a system that brings together the results of disparate software analysis tools into a visual environment to support the triage and exploration of code vulnerabilities. Our system allows software developers to explore vulnerability results to uncover hidden trends, triage the most important code weaknesses, and show who is responsible for introducing software vulnerabilities. By correlating and normalizing multiple software analysis tools' data, the overall vulnerability detection coverage of software is increased. A visual overview and powerful interaction allows the user to focus attention on the most pressing vulnerabilities within huge volumes of data, and streamlines the secure software development workflow through integration with development tools.

  6. GeoPhysical Analysis Code

    SciTech Connect

    2011-05-21

    GPAC is a code that integrates open source libraries for element formulations, linear algebra, and I/O with two main LLNL-written components: (i) a set of standard finite element physics solvers for resolving Darcy fluid flow, explicit mechanics, implicit mechanics, and fluid-mediated fracturing, including resolution of contact both implicitly and explicitly, and (ii) an MPI-based parallelization implementation for use on generic HPC distributed memory architectures. The resultant code can be used alone for linearly elastic problems and problems involving hydraulic fracturing, where the mesh topology is dynamically changed. The key application domain is low-rate stimulation and fracture control in subsurface reservoirs (e.g., enhanced geothermal sites and unconventional shale gas stimulation). GPAC also has interfaces to call external libraries for, e.g., material models and equations of state; however, LLNL-developed EOS and material models will not be part of the current release.

  8. Sensor Authentication: Embedded Processor Code

    2012-09-25

    Described is the C code running on the embedded Microchip 32-bit PIC32MX575F256H located on the INL-developed noise analysis circuit board. The code performs the following functions: controls the noise analysis circuit board preamplifier voltage gains of 1, 10, 100, 000; initializes the analog-to-digital conversion hardware, input channel selection, Fast Fourier Transform (FFT) function, USB communications interface, and internal memory allocations; initiates high-resolution 4096-point 200 kHz data acquisition; computes the complex 2048-point FFT and FFT magnitude; services the Host command set; transfers raw data to the Host; transfers the FFT result to the Host; and performs communication error checking.

  9. Multidimensional Fuel Performance Code: BISON

    SciTech Connect

    2014-09-03

    BISON is a finite element based nuclear fuel performance code applicable to a variety of fuel forms including light water reactor fuel rods, TRISO fuel particles, and metallic rod and plate fuel (Refs. [a, b, c]). It solves the fully-coupled equations of thermomechanics and species diffusion and includes important fuel physics such as fission gas release and material property degradation with burnup. BISON is based on the MOOSE framework (Ref. [d]) and can therefore efficiently solve problems on 1-, 2- or 3-D meshes using standard workstations or large high performance computers. BISON is also coupled to a MOOSE-based mesoscale phase field material property simulation capability (Refs. [e, f]). As described here, BISON includes the code library named FOX, which was developed concurrent with BISON. FOX contains material and behavioral models that are specific to oxide fuels.

  10. Computer access security code system

    NASA Technical Reports Server (NTRS)

    Collins, Earl R., Jr. (Inventor)

    1990-01-01

    A security code system for controlling access to computer and computer-controlled entry situations comprises a plurality of subsets of alpha-numeric characters disposed in random order in matrices of at least two dimensions forming theoretical rectangles, cubes, etc., such that when access is desired, at least one pair of previously unused character subsets not found in the same row or column of the matrix is chosen at random and transmitted by the computer. The proper response to gain access is transmittal of subsets which complete the rectangle, and/or a parallelepiped whose opposite corners were defined by first groups of code. Once used, subsets are not used again to absolutely defeat unauthorized access by eavesdropping, and the like.
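A minimal sketch of the two-dimensional rectangle challenge-response described above. The matrix contents, size, and labels are invented for illustration: the system transmits the subsets at two opposite corners, and access requires returning the subsets at the other two corners.

```python
import random

# Hypothetical sketch of the 2-D rectangle challenge-response; the matrix
# of character subsets and its size are invented for illustration.
def challenge(matrix, rng):
    r1, r2 = rng.sample(range(len(matrix)), 2)      # two distinct rows
    c1, c2 = rng.sample(range(len(matrix[0])), 2)   # two distinct columns
    sent = (matrix[r1][c1], matrix[r2][c2])         # transmitted corners
    expected = (matrix[r1][c2], matrix[r2][c1])     # corners completing it
    return sent, expected

def verify(expected, response):
    return set(expected) == set(response)           # corner order is free

matrix = [["A1", "B2", "C3"], ["D4", "E5", "F6"], ["G7", "H8", "I9"]]
sent, expected = challenge(matrix, random.Random(7))
```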

  11. Decoding: Codes and hardware implementation

    NASA Technical Reports Server (NTRS)

    Sulzer, M. P.; Woodman, R. F.

    1983-01-01

    The MST radars vary considerably from one installation to the next in the type of hardware, operating schedule, and associated personnel. Most such systems do not have the computing power to decode in software when the decoding must be performed for each received pulse, as is required for certain sets of phase codes. These sets provide the best signal-to-sidelobe ratio when operating at the minimum baud length allowed by the bandwidth of the transmitter. The development of hardware phase decoders and the applicability of each to decoding MST radar signals are discussed. A new design is presented for a decoder that is very inexpensive to build, easy to add to an existing system, and capable of decoding on each received pulse using codes with a baud length as short as one microsecond.
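The per-pulse decoding step itself is a cross-correlation of the received samples with the transmitted binary phase code. A small sketch, using a 13-element Barker code chosen here only as a familiar example, shows the echo compressing to a single peak:

```python
# Sketch of decoding a binary phase-coded pulse by correlation; the
# 13-element Barker code is used only as a familiar example.
BARKER13 = [1, 1, 1, 1, 1, -1, -1, 1, 1, -1, 1, -1, 1]

def decode(received, code):
    n = len(code)
    return [sum(received[i + j] * code[j] for j in range(n))
            for i in range(len(received) - n + 1)]

# A noiseless echo starting at lag 5 compresses to a peak of 13 there,
# with sidelobes no larger than 1 in magnitude.
rx = [0] * 5 + BARKER13 + [0] * 5
lags = decode(rx, BARKER13)
```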

  12. GeoPhysical Analysis Code

    2011-05-21

    GPAC is a code that integrates open source libraries for element formulations, linear algebra, and I/O with two main LLNL-written components: (i) a set of standard finite-element physics solvers for resolving Darcy fluid flow, explicit mechanics, implicit mechanics, and fluid-mediated fracturing, including resolution of contact both implicitly and explicitly, and (ii) an MPI-based parallelization implementation for use on generic HPC distributed memory architectures. The resultant code can be used alone for linearly elastic problems and problems involving hydraulic fracturing, where the mesh topology is dynamically changed. The key application domain is low-rate stimulation and fracture control in subsurface reservoirs (e.g., enhanced geothermal sites and unconventional shale gas stimulation). GPAC also has interfaces to call external libraries for, e.g., material models and equations of state; however, LLNL-developed EOS and material models will not be part of the current release.

  13. The EGS5 Code System

    SciTech Connect

    Hirayama, Hideo; Namito, Yoshihito; Bielajew, Alex F.; Wilderman, Scott J.; Nelson, Walter R.

    2005-12-20

    In the nineteen years since EGS4 was released, it has been used in a wide variety of applications, particularly in medical physics, radiation measurement studies, and industrial development. Every new user and every new application bring new challenges for Monte Carlo code designers, and code refinements and bug fixes eventually result in a code that becomes difficult to maintain. Several of the code modifications represented significant advances in electron and photon transport physics and required a more substantial intervention than code patching. Moreover, the arcane MORTRAN3[48] computer language of EGS4 was highest on the complaint list of EGS4 users. The size of the EGS4 user base is difficult to measure, as there never existed a formal user registration process; however, some idea of the numbers may be gleaned from the number of EGS4 manuals produced and distributed at SLAC: almost three thousand. Consequently, the EGS5 project was undertaken. It was decided to employ the FORTRAN 77 compiler yet include, as much as possible, the structural beauty and power of MORTRAN3. This report consists of four chapters and several appendices. Chapter 1 is an introduction to EGS5 and to this report in general. We suggest that you read it. Chapter 2 is a major update of similar chapters in the old EGS4 report[126] (SLAC-265) and the old EGS3 report[61] (SLAC-210), in which all the details of the old physics (i.e., models carried over from EGS4) and the new physics are gathered together. The descriptions of the new physics are extensive, and not for the faint of heart. Detailed knowledge of the contents of Chapter 2 is not essential in order to use EGS, but sophisticated users should be aware of its contents. In particular, details of the restrictions on the range of applicability of EGS are dispersed throughout the chapter. First-time users of EGS should skip Chapter 2 and come back to it later if necessary.
With the release of the EGS4 version

  14. CBP PHASE I CODE INTEGRATION

    SciTech Connect

    Smith, F.; Brown, K.; Flach, G.; Sarkar, S.

    2011-09-30

    The goal of the Cementitious Barriers Partnership (CBP) is to develop a reasonable and credible set of software tools to predict the structural, hydraulic, and chemical performance of cement barriers used in nuclear applications over extended time frames (greater than 100 years for operating facilities and greater than 1000 years for waste management). The simulation tools will be used to evaluate and predict the behavior of cementitious barriers used in near surface engineered waste disposal systems including waste forms, containment structures, entombments, and environmental remediation. These cementitious materials are exposed to dynamic environmental conditions that cause changes in material properties via (i) aging, (ii) chloride attack, (iii) sulfate attack, (iv) carbonation, (v) oxidation, and (vi) primary constituent leaching. A set of state-of-the-art software tools has been selected as a starting point to capture these important aging and degradation phenomena. Integration of existing software developed by the CBP partner organizations was determined to be the quickest method of meeting the CBP goal of providing a computational tool that improves the prediction of the long-term behavior of cementitious materials. These partner codes were selected based on their maturity and ability to address the problems outlined above. The GoldSim Monte Carlo simulation program (GTG 2010a, GTG 2010b) was chosen as the code integration platform (Brown & Flach 2009b). GoldSim (current Version 10.5) is a Windows based graphical object-oriented computer program that provides a flexible environment for model development (Brown & Flach 2009b). The linking of GoldSim to external codes has previously been successfully demonstrated (Eary 2007, Mattie et al. 2007). GoldSim is capable of performing deterministic and probabilistic simulations and of modeling radioactive decay and constituent transport. 
As part of the CBP project, a general Dynamic Link Library (DLL) interface was

  15. Multidimensional Fuel Performance Code: BISON

    2014-09-03

    BISON is a finite element based nuclear fuel performance code applicable to a variety of fuel forms including light water reactor fuel rods, TRISO fuel particles, and metallic rod and plate fuel (Refs. [a, b, c]). It solves the fully-coupled equations of thermomechanics and species diffusion and includes important fuel physics such as fission gas release and material property degradation with burnup. BISON is based on the MOOSE framework (Ref. [d]) and can therefore efficiently solve problems on 1-, 2- or 3-D meshes using standard workstations or large high performance computers. BISON is also coupled to a MOOSE-based mesoscale phase field material property simulation capability (Refs. [e, f]). As described here, BISON includes the code library named FOX, which was developed concurrent with BISON. FOX contains material and behavioral models that are specific to oxide fuels.

  16. Investigation of Near Shannon Limit Coding Schemes

    NASA Technical Reports Server (NTRS)

    Kwatra, S. C.; Kim, J.; Mo, Fan

    1999-01-01

    Turbo codes can deliver performance that is very close to the Shannon limit. This report investigates algorithms for convolutional turbo codes and block turbo codes, both of which can achieve performance near the Shannon limit. The performance of the schemes is obtained using computer simulations. There are three sections in this report. The first section is the introduction, which covers fundamentals of coding, block coding, and convolutional coding. In the second section, the basic concepts of convolutional turbo codes are introduced and the performance of turbo codes, especially high-rate turbo codes, is provided from the simulation results. After introducing all the parameters that help turbo codes achieve such good performance, it is concluded that the output weight distribution should be the main consideration in designing turbo codes. Based on the output weight distribution, performance bounds for turbo codes are given. Then, the relationships between the output weight distribution and factors such as the generator polynomial, the interleaver, and the puncturing pattern are examined. A criterion for the best selection of system components is provided. The puncturing pattern algorithm is discussed in detail, and different puncturing patterns are compared for each high rate. For most high-rate codes, the puncturing pattern does not show any significant effect on code performance if a pseudo-random interleaver is used in the system. For some special-rate codes with poor performance, an alternative puncturing algorithm is designed that restores their performance close to the Shannon limit. Finally, in section three, for iterative decoding of block codes, the method of building a trellis for block codes, the structure of the iterative decoding system, and the calculation of extrinsic values are discussed.
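As a concrete illustration of how puncturing sets the code rate, the sketch below deletes parity bits from a rate-1/3 turbo-coded stream by a periodic pattern. The patterns are invented for illustration and are not the report's optimized patterns.

```python
# Sketch of periodic puncturing of a rate-1/3 turbo-coded stream; the
# patterns below are invented for illustration, not taken from the report.
def puncture(systematic, parity1, parity2, pat1, pat2):
    """Keep every systematic bit; keep parity bits where the pattern is 1."""
    out, p = [], len(pat1)
    for i, s in enumerate(systematic):
        out.append(s)
        if pat1[i % p]:
            out.append(parity1[i])
        if pat2[i % p]:
            out.append(parity2[i])
    return out

# Keeping one parity bit per 4 info bits from each constituent encoder gives
# 8 info bits -> 12 transmitted bits, i.e. rate 2/3.
info = [1, 0, 1, 1, 0, 0, 1, 0]
tx = puncture(info, [0] * 8, [1] * 8, [1, 0, 0, 0], [0, 0, 1, 0])
```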

  17. Predictive coding of multisensory timing

    PubMed Central

    Shi, Zhuanghua; Burr, David

    2016-01-01

    The sense of time is foundational for perception and action, yet it frequently departs significantly from physical time. In this paper we review recent progress on temporal contextual effects, multisensory temporal integration, temporal recalibration, and related computational models. We suggest that subjective time arises from minimizing prediction errors and adaptive recalibration, which can be unified in the framework of predictive coding, a framework rooted in Helmholtz’s ‘perception as inference’.

  18. HADES, A Radiographic Simulation Code

    SciTech Connect

    Aufderheide, M.B.; Slone, D.M.; Schach von Wittenau, A.E.

    2000-08-18

    We describe features of the HADES radiographic simulation code. We begin with a discussion of why it is useful to simulate transmission radiography. The capabilities of HADES are described, followed by an application of HADES to a dynamic experiment recently performed at the Los Alamos Neutron Science Center. We describe quantitative comparisons between experimental data and HADES simulations using a copper step wedge. We conclude with a short discussion of future work planned for HADES.

  19. SWOC: Spectral Wavelength Optimization Code

    NASA Astrophysics Data System (ADS)

    Ruchti, G. R.

    2016-06-01

    SWOC (Spectral Wavelength Optimization Code) determines the wavelength ranges that provide the optimal amount of information to achieve the required science goals for a spectroscopic study. It computes a figure-of-merit for different spectral configurations using a user-defined list of spectral features, and, utilizing a set of flux-calibrated spectra, determines the spectral regions showing the largest differences among the spectra.
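The scoring idea can be sketched as follows. This is a hypothetical figure of merit, not SWOC's actual formula: a window scores higher when it covers more user-listed spectral features and when the flux-calibrated spectra differ more inside it. Feature wavelengths, weighting, and spectra are all invented for illustration.

```python
# Hypothetical sketch of a wavelength-window figure of merit: reward windows
# that cover more user-listed spectral features and show larger flux
# differences among the input spectra. Features, weights, and spectra are
# invented for illustration; SWOC's actual formula may differ.
def figure_of_merit(window, features, spectra):
    lo, hi = window
    n_features = sum(lo <= f <= hi for f in features)
    # flux spread among the spectra, summed over grid wavelengths in the window
    in_win = [[flux for wl, flux in spec if lo <= wl <= hi] for spec in spectra]
    spread = sum(max(col) - min(col) for col in zip(*in_win))
    return n_features + spread

features = [4861.0, 5175.0, 5890.0]            # e.g. Hbeta, Mg b, Na D
grid = [4800.0, 5000.0, 5200.0, 5900.0]        # shared wavelength grid
spec_a = list(zip(grid, [1.0, 0.9, 0.8, 0.7]))
spec_b = list(zip(grid, [1.0, 0.5, 0.8, 0.2]))
best = max([(4700.0, 5300.0), (5800.0, 6000.0)],
           key=lambda w: figure_of_merit(w, features, [spec_a, spec_b]))
```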

  20. Predictive coding of multisensory timing

    PubMed Central

    Shi, Zhuanghua; Burr, David

    2016-01-01

    The sense of time is foundational for perception and action, yet it frequently departs significantly from physical time. In this paper we review recent progress on temporal contextual effects, multisensory temporal integration, temporal recalibration, and related computational models. We suggest that subjective time arises from minimizing prediction errors and adaptive recalibration, which can be unified in the framework of predictive coding, a framework rooted in Helmholtz’s ‘perception as inference’. PMID:27695705

  1. Anelastic Strain Recovery Analysis Code

    1995-04-05

    ASR4 is a nonlinear least-squares regression of Anelastic Strain Recovery (ASR) data for the purpose of determining in situ stress orientations and magnitudes. ASR4 fits the viscoelastic model of Warpinski and Teufel to measured ASR data, calculates the stress orientations directly, and calculates stress magnitudes if sufficient input data are available. The code also calculates the stress orientation using strain-rosette equations, and it calculates stress magnitudes using Blanton's approach, assuming sufficient input data are available.
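The regression idea can be sketched with a single-exponential stand-in for the Warpinski-Teufel model; the real model and data are more elaborate, and a coarse grid search stands in here for the nonlinear least-squares solver.

```python
import math

# Illustrative stand-in only: a single-exponential recovery
# eps(t) = A * (1 - exp(-t / tau)) fit by least squares via a coarse grid
# search; ASR4's actual viscoelastic model and solver are more elaborate.
def fit_recovery(times, strains):
    best = None
    for A in (x * 0.01 for x in range(1, 201)):
        for tau in (x * 0.5 for x in range(1, 101)):
            sse = sum((s - A * (1.0 - math.exp(-t / tau))) ** 2
                      for t, s in zip(times, strains))
            if best is None or sse < best[0]:
                best = (sse, A, tau)
    return best[1], best[2]

times = [1.0, 2.0, 4.0, 8.0, 16.0]
strains = [1.2 * (1.0 - math.exp(-t / 4.0)) for t in times]   # synthetic data
A, tau = fit_recovery(times, strains)
```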

  2. Emergency department coding and billing.

    PubMed

    Edelberg, Caral

    2004-02-01

    ED coding and billing are challenging additions to the responsibilities of emergency physicians. Assurance that each is performed in the most efficient and accurate manner possible is an essential component of today's emergency medicine practice. Minimizing the risk of submitting fraudulent claims is critical, because it assures the efficient and timely billing of all ED services. For the practice to thrive, each is necessary. PMID:15062501

  3. Computer Code for Nanostructure Simulation

    NASA Technical Reports Server (NTRS)

    Filikhin, Igor; Vlahovic, Branislav

    2009-01-01

    Due to their small size, nanostructures can have stress and thermal gradients that are larger than any macroscopic analogue. These gradients can lead to specific regions that are susceptible to failure via processes such as plastic deformation by dislocation emission, chemical debonding, and interfacial alloying. A program has been developed that rigorously simulates and predicts optoelectronic properties of nanostructures of virtually any geometrical complexity and material composition. It can be used in simulations of energy level structure, wave functions, density of states of spatially configured phonon-coupled electrons, excitons in quantum dots, quantum rings, quantum ring complexes, and more. The code can be used to calculate stress distributions and thermal transport properties for a variety of nanostructures and interfaces, transport and scattering at nanoscale interfaces and surfaces under various stress states, and alloy compositional gradients. The code allows users to perform modeling of charge transport processes through quantum-dot (QD) arrays as functions of inter-dot distance, array order versus disorder, QD orientation, shape, size, and chemical composition for applications in photovoltaics and physical properties of QD-based biochemical sensors. The code can be used to study the hot exciton formation/relaxation dynamics in arrays of QDs of different shapes and sizes at different temperatures. It also can be used to understand the relations among the deposition parameters, inherent stresses, strain deformation, heat flow, and failure of nanostructures.

  4. Image Sequence Coding by Octrees

    NASA Astrophysics Data System (ADS)

    Leonardi, Riccardo

    1989-11-01

    This work addresses the problem of representing an image sequence as a set of octrees. The purpose is to generate a flexible data structure to model video signals, for applications such as motion estimation, video coding and/or analysis. An image sequence can be represented as a 3-dimensional causal signal, which becomes a 3-dimensional array of data when the signal has been digitized. If it is desirable to track long-term spatio-temporal correlation, a series of octree structures may be embedded on this 3D array. Each octree looks at a subset of data in the spatio-temporal space. At the lowest level (leaves of the octree), adjacent pixels of neighboring frames are captured. A combination of these is represented at the parent level of each group of 8 children. This combination may result in a more compact representation of the information of these pixels (coding application) or in a local estimate of some feature of interest (e.g., velocity, classification, object boundary). This combination can be iterated bottom-up to get a hierarchical description of the image sequence characteristics. A coding strategy using such data structure involves the description of the octree shape using one bit per node except for leaves of the tree located at the lowest level, and the value (or parametric model) assigned to each one of these leaves. Experiments have been performed to represent Common Image Format (CIF) sequences.
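The bottom-up combination can be sketched with the simplest lossless rule, collapsing any group of eight identical children into one parent leaf; the paper's parametric models and feature estimates would replace this rule.

```python
# Sketch of bottom-up octree construction over an 8x8x8 spatio-temporal
# block, using the simplest combination rule: eight identical children
# collapse into one parent leaf. Parametric models or feature estimates
# (velocity, classification, ...) would replace this rule in practice.
def build(block, x, y, z, size):
    if size == 1:
        return block[x][y][z]                       # leaf: a single voxel
    h = size // 2
    kids = [build(block, x + dx, y + dy, z + dz, h)
            for dx in (0, h) for dy in (0, h) for dz in (0, h)]
    if all(not isinstance(k, list) and k == kids[0] for k in kids):
        return kids[0]                              # merge identical leaves
    return kids                                     # internal node: 8 children

uniform = [[[7] * 8 for _ in range(8)] for _ in range(8)]
tree = build(uniform, 0, 0, 0, 8)   # a constant block collapses to one leaf
mixed = [[[7] * 8 for _ in range(8)] for _ in range(8)]
mixed[0][0][0] = 1                  # one differing voxel forces a subtree
```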

  5. Multineuronal codes in retinal signaling.

    PubMed Central

    Meister, M

    1996-01-01

    The visual world is presented to the brain through patterns of action potentials in the population of optic nerve fibers. Single-neuron recordings show that each retinal ganglion cell has a spatially restricted receptive field, a limited integration time, and a characteristic spectral sensitivity. Collectively, these response properties define the visual message conveyed by that neuron's action potentials. Since the size of the optic nerve is strictly constrained, one expects the retina to generate a highly efficient representation of the visual scene. By contrast, the receptive fields of nearby ganglion cells often overlap, suggesting great redundancy among the retinal output signals. Recent multineuron recordings may help resolve this paradox. They reveal concerted firing patterns among ganglion cells, in which small groups of nearby neurons fire synchronously with delays of only a few milliseconds. As there are many more such firing patterns than ganglion cells, such a distributed code might allow the retina to compress a large number of distinct visual messages into a small number of optic nerve fibers. This paper will review the evidence for a distributed coding scheme in the retinal output. The performance limits of such codes are analyzed with simple examples, illustrating that they allow a powerful trade-off between spatial and temporal resolution. PMID:8570603

  6. Coded continuous wave meteor radar

    NASA Astrophysics Data System (ADS)

    Vierinen, J.; Chau, J. L.; Pfeffer, N.; Clahsen, M.; Stober, G.

    2015-07-01

    The concept of coded continuous wave meteor radar is introduced. The radar uses a continuously transmitted pseudo-random waveform, which has several advantages: coding avoids range aliased echoes, which are often seen with commonly used pulsed specular meteor radars (SMRs); continuous transmissions maximize pulse compression gain, allowing operation with significantly lower peak transmit power; the temporal resolution can be changed after performing a measurement, as it does not depend on pulse spacing; and the low signal to noise ratio allows multiple geographically separated transmitters to be used in the same frequency band without significantly interfering with each other. The latter allows the same receiver antennas to be used to receive multiple transmitters. The principles of the signal processing are discussed, in addition to discussion of several practical ways to increase computation speed, and how to optimally detect meteor echoes. Measurements from a campaign performed with a coded continuous wave SMR are shown and compared with two standard pulsed SMR measurements. The type of meteor radar described in this paper would be suited for use in a large scale multi-static network of meteor radar transmitters and receivers. This would, for example, provide higher spatio-temporal resolution for mesospheric wind field measurements.
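The central trick, recovering echo delay by correlating against the known pseudo-random transmit code, can be sketched as follows; the code length, the echo delay, and the noiseless channel are all assumptions of the example.

```python
import random

# Sketch of pseudo-random code correlation for range (delay) estimation;
# the code length, the echo delay, and the noiseless channel are all
# assumptions of this example.
rng = random.Random(1)
code = [rng.choice([-1, 1]) for _ in range(512)]

def delay_estimate(rx, code, max_lag):
    n = len(code) - max_lag                 # samples correlated per lag
    corrs = [sum(rx[lag + i] * code[i] for i in range(n))
             for lag in range(max_lag)]
    return corrs.index(max(corrs))          # peak lag = echo delay

rx = [0] * 37 + code                        # noiseless echo delayed 37 samples
lag = delay_estimate(rx, code, 64)
```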

  7. Tandem Mirror Reactor Systems Code (Version I)

    SciTech Connect

    Reid, R.L.; Finn, P.A.; Gohar, M.Y.; Barrett, R.J.; Gorker, G.E.; Spampinaton, P.T.; Bulmer, R.H.; Dorn, D.W.; Perkins, L.J.; Ghose, S.

    1985-09-01

    A computer code was developed to model a Tandem Mirror Reactor. This is the first Tandem Mirror Reactor model to couple, in detail, the highly linked physics, magnetics, and neutronic analysis into a single code. This report describes the code architecture, provides a summary description of the modules comprising the code, and includes an example execution of the Tandem Mirror Reactor Systems Code. Results from this code for two sensitivity studies are also included. These studies are: (1) to determine the impact of center cell plasma radius, length, and ion temperature on reactor cost and performance at constant fusion power; and (2) to determine the impact of reactor power level on cost.

  8. User instructions for the CIDER Dose Code

    SciTech Connect

    Eslinger, P.W.; Lessor, K.S.; Ouderkirk, S.J.

    1994-05-01

    This document provides user instructions for the CIDER (Calculation of Individual Doses from Environmental Radionuclides) computer code. The CIDER code computes estimates of annual doses for reference individuals with a known residence and food consumption history. This document also provides user instructions for four utility codes used to build input data libraries for CIDER. These utility codes are ENVFAC (environmental factors), FOOFAC (food factors), LIFFAC (lifestyle factors), and ORGFAC (organ factors). Finally, this document provides user instructions for the EXPAND utility code. The EXPAND code processes a result file from CIDER and extracts a summary of the dose information for reporting or plotting purposes.

  9. 78 FR 18321 - International Code Council: The Update Process for the International Codes and Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-26

    ... Fuel Gas Code. International Green Construction Code. International Mechanical Code. ICC Performance... for Residential Construction in High Wind Regions. ICC 700: National Green Building Standard The... proposals from interested individuals and organizations involved in the construction industry as well as...

  10. An implicit Smooth Particle Hydrodynamic code

    SciTech Connect

    Charles E. Knapp

    2000-04-01

    An implicit version of the Smooth Particle Hydrodynamic (SPH) code SPHINX has been written and is working. In conjunction with the SPHINX code, the new implicit code models fluids and solids under a wide range of conditions. SPH codes are Lagrangian, meshless, and use particles to model the fluids and solids. The implicit code makes use of Krylov iterative techniques for solving large linear systems and a Newton-Raphson method for non-linear corrections. It uses numerical derivatives to construct the Jacobian matrix. It uses sparse techniques to save on memory storage and to reduce the amount of computation. It is believed that this is the first implicit SPH code to use Newton-Krylov techniques, and also the first implicit SPH code to model solids. A description of SPH and the techniques used in the implicit code are presented. Then, the results of a number of test cases are discussed, which include a shock tube problem, a Rayleigh-Taylor problem, a breaking dam problem, and a single jet of gas problem. The results are shown to be in very good agreement with analytic solutions, experimental results, and the explicit SPHINX code. For the single jet of gas problem it has been demonstrated that the implicit code can do a problem in much shorter time than the explicit code. The problem was, however, very unphysical, but it demonstrates the potential of the implicit code. It is a first step toward a useful implicit SPH code.
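Two of the ingredients named above, a numerical (finite-difference) Jacobian and Newton-Raphson correction, can be sketched on a toy 2-equation system; a real implicit SPH step would solve a large sparse system with Krylov iterations rather than this direct 2x2 solve.

```python
# Toy sketch of Newton-Raphson with a numerical (forward-difference)
# Jacobian on a 2-equation system; an implicit SPH code would instead
# solve a large sparse linear system with Krylov methods at each step.
def F(u):
    x, y = u
    return [x * x + y - 2.0, x + y * y - 2.0]

def newton(u, tol=1e-10, eps=1e-7):
    for _ in range(50):
        f = F(u)
        if max(abs(v) for v in f) < tol:
            break
        # numerical Jacobian, column by column, by forward differences
        J = [[(F([u[0] + eps * (j == 0), u[1] + eps * (j == 1)])[i] - f[i]) / eps
              for j in range(2)] for i in range(2)]
        det = J[0][0] * J[1][1] - J[0][1] * J[1][0]
        # Newton correction: solve J du = -f (direct 2x2 solve)
        du0 = (-f[0] * J[1][1] + f[1] * J[0][1]) / det
        du1 = (-f[1] * J[0][0] + f[0] * J[1][0]) / det
        u = [u[0] + du0, u[1] + du1]
    return u

x, y = newton([1.5, 0.5])    # converges to the root x = y = 1
```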

  11. Accumulate-Repeat-Accumulate-Accumulate-Codes

    NASA Technical Reports Server (NTRS)

    Divsalar, Dariush; Dolinar, Sam; Thorpe, Jeremy

    2004-01-01

    Inspired by recently proposed Accumulate-Repeat-Accumulate (ARA) codes [15], in this paper we propose a channel coding scheme called Accumulate-Repeat-Accumulate-Accumulate (ARAA) codes. These codes can be seen as serial turbo-like codes or as a subclass of Low Density Parity Check (LDPC) codes, and they have a projected graph or protograph representation; this allows for a high-speed iterative decoder implementation using belief propagation. An ARAA code can be viewed as a precoded Repeat-and-Accumulate (RA) code with puncturing in concatenation with another accumulator, where simply an accumulator is chosen as the precoder; thus ARAA codes have a very fast encoder structure. Using density evolution on their associated protographs, we find examples of rate-1/2 ARAA codes with maximum variable node degree 4 for which a minimum bit-SNR as low as 0.21 dB from the channel capacity limit can be achieved as the block size goes to infinity. Such a low threshold cannot be achieved by RA or Irregular RA (IRA) or unstructured irregular LDPC codes with the same constraint on the maximum variable node degree. Furthermore, by puncturing the accumulators we can construct families of higher rate ARAA codes with thresholds that stay close to their respective channel capacity thresholds uniformly. Iterative decoding simulation results show comparable performance with the best-known LDPC codes but with very low error floor even at moderate block sizes.
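The building blocks named above (repetition, interleaving, and the accumulator, a running XOR implementing 1/(1+D) over GF(2)) can be sketched directly; the permutation and parameters below are illustrative, not an actual ARAA protograph design.

```python
# Sketch of the ARAA building blocks: repetition, interleaving, and the
# accumulator 1/(1+D) over GF(2) (a running XOR). The permutation and
# parameters are illustrative, not an actual ARAA protograph design.
def repeat(bits, q):
    return [b for b in bits for _ in range(q)]

def interleave(bits, perm):
    return [bits[p] for p in perm]

def accumulate(bits):
    out, acc = [], 0
    for b in bits:
        acc ^= b            # running parity: accumulator over GF(2)
        out.append(acc)
    return out

info = [1, 0, 1, 1]
perm = [5, 2, 7, 0, 3, 6, 1, 4]     # fixed illustrative permutation
codeword = accumulate(interleave(repeat(info, 2), perm))
```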

  12. The future of PanDA in ATLAS distributed computing

    NASA Astrophysics Data System (ADS)

    De, K.; Klimentov, A.; Maeno, T.; Nilsson, P.; Oleynik, D.; Panitkin, S.; Petrosyan, A.; Schovancova, J.; Vaniachine, A.; Wenaus, T.

    2015-12-01

    Experiments at the Large Hadron Collider (LHC) face unprecedented computing challenges. Heterogeneous resources are distributed worldwide at hundreds of sites, thousands of physicists analyse the data remotely, the volume of processed data is beyond the exabyte scale, while data processing requires more than a few billion hours of computing usage per year. The PanDA (Production and Distributed Analysis) system was developed to meet the scale and complexity of LHC distributed computing for the ATLAS experiment. In the process, the old batch job paradigm of locally managed computing in HEP was discarded in favour of a far more automated, flexible and scalable model. The success of PanDA in ATLAS is leading to widespread adoption and testing by other experiments. PanDA is the first exascale workload management system in HEP, already operating at more than a million computing jobs per day, and processing over an exabyte of data in 2013. There are many new challenges that PanDA will face in the near future, in addition to new challenges of scale, heterogeneity and increasing user base. PanDA will need to handle rapidly changing computing infrastructure, will require factorization of code for easier deployment, will need to incorporate additional information sources including network metrics in decision making, be able to control network circuits, handle dynamically sized workload processing, provide improved visualization, and face many other challenges. In this talk we will focus on the new features, planned or recently implemented, that are relevant to the next decade of distributed computing workload management using PanDA.

  13. User's manual for Axisymmetric Diffuser Duct (ADD) code. Volume 1: General ADD code description

    NASA Technical Reports Server (NTRS)

    Anderson, O. L.; Hankins, G. B., Jr.; Edwards, D. E.

    1982-01-01

    This User's Manual contains a complete description of the computer codes known as the AXISYMMETRIC DIFFUSER DUCT code or ADD code. It includes a list of references which describe the formulation of the ADD code and comparisons of calculation with experimental flows. The input/output and general use of the code is described in the first volume. The second volume contains a detailed description of the code including the global structure of the code, list of FORTRAN variables, and descriptions of the subroutines. The third volume contains a detailed description of the CODUCT code which generates coordinate systems for arbitrary axisymmetric ducts.

  14. Henrique da Rocha Lima*

    PubMed Central

    Bernardes Filho, Fred; Avelleira, João Carlos Regazzi

    2015-01-01

    Brazilian physician and researcher Henrique da Rocha Lima was born in 1879 in the city of Rio de Janeiro, where he studied medicine and obtained the degree of M.D. in 1901. He specialized in Clinical Medicine in Germany and was the ambassador in European countries of the scientific medicine that emerged from the Oswaldo Cruz Institute in the early twentieth century. Rocha Lima has discovered the causative agent of typhus and had a major contribution to the studies of yellow fever, Chagas disease, Carrión’s disease and histoplasmosis. His genius, his research and his discoveries projected his name, and, with it, the image of Brazil in the international scientific scene. PMID:26131867

  15. 21 CFR 106.90 - Coding.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... CONSUMPTION INFANT FORMULA QUALITY CONTROL PROCEDURES Quality Control Procedures for Assuring Nutrient Content of Infant Formulas § 106.90 Coding. The manufacturer shall code all infant formulas in...

  16. Block truncation signature coding for hyperspectral analysis

    NASA Astrophysics Data System (ADS)

    Chakravarty, Sumit; Chang, Chein-I.

    2008-08-01

    This paper introduces a new signature coding which is designed based on the well-known Block Truncation Coding (BTC). It comprises bit-maps of the signature blocks generated by different threshold criteria. Two new BTC-based algorithms are developed for signature coding, called Block Truncation Signature Coding (BTSC) and 2-level BTSC (2BTSC). In order to compare the developed BTC-based algorithms with current binary signature coding schemes, such as the Spectral Program Analysis Manager (SPAM) developed by Mazer et al. and Spectral Feature-based Binary Coding (SFBC) by Qian et al., three different thresholding functions, local block mean, local block gradient, and local block correlation, are derived to improve BTSC performance, where the combined bit-maps generated by these thresholds can provide better spectral signature characterization. Experimental results reveal that the new BTC-based signature coding performs more effectively in characterizing spectral variations than currently available binary signature coding methods.

  17. Search for optimal distance spectrum convolutional codes

    NASA Technical Reports Server (NTRS)

    Connor, Matthew C.; Perez, Lance C.; Costello, Daniel J., Jr.

    1993-01-01

    In order to communicate reliably and to reduce the required transmitter power, NASA uses coded communication systems on most of its deep space satellites and probes (e.g. Pioneer, Voyager, Galileo, and the TDRSS network). These communication systems use binary convolutional codes. Better codes make the system more reliable and require less transmitter power. However, there are no good construction techniques for convolutional codes; finding good convolutional codes thus requires an exhaustive search over the ensemble of all possible codes. In this paper, an efficient convolutional code search algorithm was implemented on an IBM RS6000 Model 580. The combination of algorithm efficiency and computational power enabled us to find, for the first time, the optimal rate 1/2, memory 14, convolutional code.
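The search described here hinges on ranking every candidate code by its free distance. A minimal sketch of that ranking for rate-1/2 codes (Dijkstra over the encoder state graph; the generator tap masks below are illustrative, not a code from the paper):

```python
from heapq import heappush, heappop

def free_distance(g1, g2, memory):
    """Free distance of a rate-1/2 convolutional code with generator tap
    masks g1, g2: the cheapest path that leaves the all-zero state and
    re-merges with it (Dijkstra over the state graph)."""
    nstates = 1 << memory
    def branch(state, bit):
        r = (state << 1) | bit                      # register: past inputs + new bit
        w = bin(r & g1).count("1") % 2 + bin(r & g2).count("1") % 2
        return r & (nstates - 1), w
    start, w0 = branch(0, 1)                        # force divergence with input 1
    best = {start: w0}
    heap = [(w0, start)]
    while heap:
        w, s = heappop(heap)
        if s == 0:
            return w                                # re-merged: weight = d_free
        if w > best.get(s, w):
            continue
        for bit in (0, 1):
            ns, bw = branch(s, bit)
            if w + bw < best.get(ns, float("inf")):
                best[ns] = w + bw
                heappush(heap, (w + bw, ns))

# exhaustive search over every memory-2 generator pair
d, g1, g2 = max((free_distance(a, b, 2), a, b)
                for a in range(1, 8) for b in range(1, 8))
```

For memory 2 the search recovers the classic (7,5) octal code with free distance 5; the paper's memory-14 search is the same loop over a vastly larger ensemble, which is why algorithmic efficiency matters.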

  18. A Better Handoff for Code Officials

    SciTech Connect

    Conover, David R.; Yerkes, Sara

    2010-09-24

    The U.S. Department of Energy's Building Energy Codes Program has partnered with ICC to release the new Building Energy Codes Resource Guide: Code Officials Edition. We created this binder of practical materials for a simple reason: code officials are busy learning and enforcing several codes at once for the diverse buildings across their jurisdictions. This doesn’t leave much time to search www.energycodes.gov, www.iccsafe.org, or the range of other helpful web-based resources for the latest energy codes tools, support, and information. So, we decided to bring the most relevant materials to code officials in a way that works best with their daily routine, and point to where they can find even more. Like a coach’s game plan, the Resource Guide is an "energy playbook" for code officials.

  19. Modular optimization code package: MOZAIK

    NASA Astrophysics Data System (ADS)

    Bekar, Kursat B.

    This dissertation addresses the development of a modular optimization code package, MOZAIK, for geometric shape optimization problems in nuclear engineering applications. MOZAIK's first mission, determining the optimal shape of the D2O moderator tank for the current and new beam tube configurations for the Penn State Breazeale Reactor's (PSBR) beam port facility, is used to demonstrate its capabilities and test its performance. MOZAIK was designed as a modular optimization sequence including three primary independent modules: the initializer, the physics module, and the optimizer, each having a specific task. By using fixed interface blocks among the modules, the code attains its two most important characteristics: generic form and modularity. The benefit of this modular structure is that the contents of the modules can be switched depending on the requirements of accuracy, computational efficiency, or compatibility with the other modules. Oak Ridge National Laboratory's discrete ordinates transport code TORT was selected as the transport solver in the physics module of MOZAIK, and two different optimizers, Min-max and Genetic Algorithms (GA), were implemented in the optimizer module of the code package. A distributed memory parallelism was also applied to MOZAIK via MPI (Message Passing Interface) to execute the physics module concurrently on a number of processors for various states in the same search. Moreover, dynamic scheduling was enabled to enhance load balance among the processors while running MOZAIK's physics module, thus improving the parallel speedup and efficiency. In this way, the total computation time consumed by the physics module is reduced by a factor close to M, where M is the number of processors. This capability also encourages the use of MOZAIK for shape optimization problems in nuclear applications because many traditional codes related to radiation transport do not have parallel execution capability. A set of computational models based on the
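The dynamic-scheduling idea, handing each free worker the next physics state rather than pre-assigning fixed chunks, can be sketched with a thread pool standing in for the MPI layer. The objective function is a hypothetical placeholder, not MOZAIK's TORT solve:

```python
from concurrent.futures import ThreadPoolExecutor

def physics_eval(shape):
    """Placeholder objective for one candidate tank shape; a real run
    would launch a transport solve here."""
    return sum(p * p for p in shape)

# candidate shapes for one generation of the search
candidates = [(i * 0.1, 1.0 - i * 0.1) for i in range(16)]

with ThreadPoolExecutor(max_workers=4) as pool:
    # each worker picks up a new state as soon as it finishes the
    # previous one, keeping all workers busy even when solve times vary
    scores = list(pool.map(physics_eval, candidates))

best = min(scores)
```

With M workers and solves of similar cost, wall time drops by a factor close to M, which is the speedup claim in the abstract.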

  20. TDRSS telecommunication system PN code analysis

    NASA Technical Reports Server (NTRS)

    Gold, R.

    1977-01-01

    The pseudonoise (PN) code library for the Tracking and Data Relay Satellite System (TDRSS) Services was defined and described. The code library was chosen to minimize user transponder hardware requirements and optimize system performance. Special precautions were taken to insure sufficient code phase separation to minimize cross-correlation sidelobes, and to avoid the generation of spurious code components which would interfere with system performance.
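The code-phase-separation concern comes down to the correlation properties of the PN family. A toy sketch (a generic 5-stage m-sequence generator for illustration, not the actual TDRSS code library):

```python
def lfsr_seq(taps, n, length):
    """m-sequence from an n-stage Fibonacci LFSR (taps are 1-indexed
    stage numbers; all-ones seed; primitive recurrence assumed)."""
    state = [1] * n
    out = []
    for _ in range(length):
        out.append(state[-1])
        fb = 0
        for t in taps:
            fb ^= state[t - 1]
        state = [fb] + state[:-1]
    return out

def periodic_corr(a, b):
    """Periodic correlation of two equal-length binary sequences,
    mapped to +/-1, at every relative code phase."""
    n = len(a)
    x = [1 - 2 * v for v in a]
    y = [1 - 2 * v for v in b]
    return [sum(x[i] * y[(i + s) % n] for i in range(n)) for s in range(n)]

seq = lfsr_seq([5, 3], 5, 31)      # primitive degree-5 recurrence, period 2^5 - 1
auto = periodic_corr(seq, seq)     # peak 31 in phase, -1 at every other phase
```

The ideal two-valued autocorrelation (peak 31, then a flat -1 floor) is what gives sharp code-phase discrimination; keeping cross-correlation sidelobes between different users' codes equally low is the separate family-design constraint the abstract refers to.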

  1. Temporal Coding of Volumetric Imagery

    NASA Astrophysics Data System (ADS)

    Llull, Patrick Ryan

    'Image volumes' refer to realizations of images in other dimensions such as time, spectrum, and focus. Recent advances in scientific, medical, and consumer applications demand improvements in image volume capture. Though image volume acquisition continues to advance, it maintains the same sampling mechanisms that have been used for decades; every voxel must be scanned and is presumed independent of its neighbors. Under these conditions, improving performance comes at the cost of increased system complexity, data rates, and power consumption. This dissertation explores systems and methods capable of efficiently improving sensitivity and performance for image volume cameras, and specifically proposes several sampling strategies that utilize temporal coding to improve imaging system performance and enhance our awareness for a variety of dynamic applications. Video cameras and camcorders sample the video volume (x,y,t) at fixed intervals to gain understanding of the volume's temporal evolution. Conventionally, one must reduce the spatial resolution to increase the framerate of such cameras. Using temporal coding via physical translation of an optical element known as a coded aperture, the coded aperture compressive temporal imaging (CACTI) camera demonstrates a method with which to embed the temporal dimension of the video volume into spatial (x,y) measurements, thereby greatly improving temporal resolution with minimal loss of spatial resolution. This technique, which is among a family of compressive sampling strategies developed at Duke University, temporally codes the exposure readout functions at the pixel level. Since video cameras nominally integrate the remaining image volume dimensions (e.g. spectrum and focus) at capture time, spectral (x,y,t,lambda) and focal (x,y,t,z) image volumes are traditionally captured via sequential changes to the spectral and focal state of the system, respectively.
The CACTI camera's ability to embed video volumes into images leads to exploration
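The forward model of such temporal coding is compact: a known per-pixel, per-frame binary code modulates the video volume, and the detector integrates the coded frames into a single snapshot. A sketch of that measurement step only (reconstruction, the hard inverse problem, is omitted; all sizes are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
T, H, W = 8, 32, 32
video = rng.random((T, H, W))                 # the (x, y, t) image volume
masks = rng.integers(0, 2, size=(T, H, W))    # temporal code, e.g. a translating coded aperture
coded = (masks * video).sum(axis=0)           # single coded exposure: T frames -> 1 image
```

Recovering the T frames from `coded` and the known `masks` is an underdetermined linear inverse problem, which compressive-sampling methods solve by exploiting sparsity priors on the video volume.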

  2. Design of additive quantum codes via the code-word-stabilized framework

    SciTech Connect

    Kovalev, Alexey A.; Pryadko, Leonid P.; Dumer, Ilya

    2011-12-15

    We consider the design of quantum stabilizer codes via a two-step, low-complexity approach based on the framework of codeword-stabilized (CWS) codes. In this framework, each quantum CWS code can be specified by a graph and a binary code. For codes that can be obtained from a given graph, we give several upper bounds on the distance of a generic (additive or nonadditive) CWS code, and a Gilbert-Varshamov lower bound for the existence of additive CWS codes. We also consider additive cyclic CWS codes and show that these codes correspond to a previously unexplored class of single-generator cyclic stabilizer codes. We present several families of simple stabilizer codes with relatively good parameters.

  3. The new Code of Professional Conduct.

    PubMed

    Semple, Martin; Cable, Stuart

    The Nursing and Midwifery Council (NMC) has approved a new Code of Professional Conduct (NMC 2002a). This article discusses the main elements of the new code, examines the implications for the profession and encourages you to think about the implications for your own nursing practice. It identifies actions that you should take to comply with the code.

  4. Ethical Codes: A Standard for Ethical Behavior.

    ERIC Educational Resources Information Center

    Egan, Katherine

    1990-01-01

    Examines the codes of ethics of three major education associations (the National Association of Secondary School Principals, the National Education Association, and the American Association of School Administrators) and their usefulness in developing a school-specific code. The codes' language reveals how these organizations think about students,…

  5. Ultra-narrow bandwidth voice coding

    DOEpatents

    Holzrichter, John F.; Ng, Lawrence C.

    2007-01-09

    A system of removing excess information from a human speech signal and coding the remaining signal information, transmitting the coded signal, and reconstructing the coded signal. The system uses one or more EM wave sensors and one or more acoustic microphones to determine at least one characteristic of the human speech signal.

  6. Update and inclusion of resuspension model codes

    SciTech Connect

    Porch, W.M.; Greenly, G.D.; Mitchell, C.S.

    1983-12-01

    Model codes for estimating radiation doses from plutonium particles associated with resuspended dust were improved. Only one new code (RSUS) is required in addition to the MATHEW/ADPIC set of codes. The advantage is that it estimates resuspension based on wind blown dust fluxes derived for different soil types. 2 references. (ACR)

  7. The general theory of convolutional codes

    NASA Technical Reports Server (NTRS)

    Mceliece, R. J.; Stanley, R. P.

    1993-01-01

    This article presents a self-contained introduction to the algebraic theory of convolutional codes. This introduction is partly a tutorial, but at the same time contains a number of new results which will prove useful for designers of advanced telecommunication systems. Among the new concepts introduced here are the Hilbert series for a convolutional code and the class of compact codes.

  8. Subband coding for image data archiving

    NASA Technical Reports Server (NTRS)

    Glover, D.; Kwatra, S. C.

    1992-01-01

    The use of subband coding on image data is discussed. An overview of subband coding is given. Advantages of subbanding for browsing and progressive resolution are presented. Implementations for lossless and lossy coding are discussed. Algorithm considerations and simple implementations of subband systems are given.
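A minimal example of the lossless side of such a scheme is the integer Haar (S-) transform: one pass splits a signal into a half-resolution browse band and a detail band, and the integer arithmetic makes it exactly invertible. An illustrative sketch, not the implementation in the paper:

```python
def haar_forward(x):
    """One level of a lossless integer Haar (S-) transform: a coarse
    'browse' band of pair averages plus a detail band of differences."""
    low = [(a + b) // 2 for a, b in zip(x[0::2], x[1::2])]
    high = [a - b for a, b in zip(x[0::2], x[1::2])]
    return low, high

def haar_inverse(low, high):
    """Exact integer reconstruction from the two bands."""
    out = []
    for l, h in zip(low, high):
        a = l + (h + 1) // 2
        out += [a, a - h]
    return out
```

Sending `low` first supports browsing at half resolution; sending `high` afterwards refines the image losslessly, which is the progressive-resolution advantage the abstract highlights.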

  9. Subband coding for image data archiving

    NASA Technical Reports Server (NTRS)

    Glover, Daniel; Kwatra, S. C.

    1993-01-01

    The use of subband coding on image data is discussed. An overview of subband coding is given. Advantages of subbanding for browsing and progressive resolution are presented. Implementations for lossless and lossy coding are discussed. Algorithm considerations and simple implementations of subband systems are given.

  10. TRACK : the new beam dynamics code.

    SciTech Connect

    Aseev, V. N.; Ostroumov, P. N.; Lessner, E. S.; Mustapha, B.; Physics

    2005-01-01

    The new ray-tracing code TRACK originally developed to fulfill the special requirements of the RIA accelerator systems is a general beam dynamics code. It is currently being used for the design and simulation of future proton and heavy-ion linacs at several Labs. This paper presents a general description of the code TRACK emphasizing its main new features and recent updates.

  11. An Investigation of Different String Coding Methods.

    ERIC Educational Resources Information Center

    Goyal, Pankaj

    1984-01-01

    Investigates techniques for automatic coding of English language strings which involve titles drawn from bibliographic files, but do not require prior knowledge of source. Coding methods (basic, maximum entropy principle), results of test using 6,260 titles from British National Bibliography, and variations in code element ordering are…

  12. A Mathematical Representation of the Genetic Code

    NASA Astrophysics Data System (ADS)

    Hill, Vanessa J.; Rowlands, Peter

    Algebraic and geometric representations of the genetic code are used to show their functions in coding for amino acids. The algebra is a 64-part vector quaternion combination, and the geometry is based on the structure of the regular icosidodecahedron. An almost perfect pattern suggests that this is a biologically significant way of representing the genetic code.

  13. Regularized robust coding for face recognition.

    PubMed

    Yang, Meng; Zhang, Lei; Yang, Jian; Zhang, David

    2013-05-01

    Recently the sparse representation based classification (SRC) has been proposed for robust face recognition (FR). In SRC, the testing image is coded as a sparse linear combination of the training samples, and the representation fidelity is measured by the l2-norm or l1-norm of the coding residual. Such a sparse coding model assumes that the coding residual follows a Gaussian or Laplacian distribution, which may not be effective enough to describe the coding residual in practical FR systems. Meanwhile, the sparsity constraint on the coding coefficients makes the computational cost of SRC very high. In this paper, we propose a new face coding model, namely regularized robust coding (RRC), which could robustly regress a given signal with regularized regression coefficients. By assuming that the coding residual and the coding coefficient are respectively independent and identically distributed, the RRC seeks a maximum a posteriori solution of the coding problem. An iteratively reweighted regularized robust coding (IR(3)C) algorithm is proposed to solve the RRC model efficiently. Extensive experiments on representative face databases demonstrate that the RRC is much more effective and efficient than state-of-the-art sparse representation based methods in dealing with face occlusion, corruption, lighting, and expression changes, etc.
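The spirit of the iteratively reweighted scheme can be sketched as alternating two steps: fit regularized regression coefficients under per-pixel weights, then recompute the weights from the residual so that occluded or corrupted pixels are downweighted. The weight function below is a hypothetical choice for illustration, not the paper's:

```python
import numpy as np

def irls_code(D, y, lam=0.1, iters=15):
    """Iteratively reweighted regularized coding (sketch): weighted
    ridge regression of signal y on dictionary D, re-estimating the
    per-pixel weights from the residual each round."""
    n, k = D.shape
    w = np.ones(n)
    for _ in range(iters):
        DW = D.T * w                                   # D^T W
        alpha = np.linalg.solve(DW @ D + lam * np.eye(k), DW @ y)
        r = y - D @ alpha
        sigma = r.std() + 1e-8
        w = 1.0 / (1.0 + (r / sigma) ** 2)             # heavy residual -> low weight
    return alpha, w
```

Pixels with gross errors end up with near-zero weight, so the regression is driven by the uncorrupted pixels, which is the robustness mechanism the abstract describes.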

  14. Genomics: Evolution of the Genetic Code.

    PubMed

    Keeling, Patrick J

    2016-09-26

    The genetic code is not quite universal. The rare variations that we know of reveal selective pressures on the code and on the translation machinery. New data suggest the code changes through ambiguous intermediates and that termination is context dependent. PMID:27676305

  15. Reconstruction of coded aperture images

    NASA Technical Reports Server (NTRS)

    Bielefeld, Michael J.; Yin, Lo I.

    1987-01-01

    Balanced correlation method and the Maximum Entropy Method (MEM) were implemented to reconstruct a laboratory X-ray source as imaged by a Uniformly Redundant Array (URA) system. Although the MEM method has advantages over the balanced correlation method, it is computationally time consuming because of the iterative nature of its solution. Massively Parallel Processing, with its parallel array structure, is ideally suited for such computations. These preliminary results indicate that it is possible to use the MEM method in future coded-aperture experiments with the help of the MPP.

  16. PKDGRAV3: Parallel gravity code

    NASA Astrophysics Data System (ADS)

    Potter, Douglas; Stadel, Joachim

    2016-09-01

    Pkdgrav3 is an 𝒪(N) gravity calculation method; it uses a binary tree algorithm with fifth order fast multipole expansion of the gravitational potential, using cell-cell interactions. Periodic boundary conditions require very little data movement and allow a high degree of parallelism; the code includes GPU acceleration for all force calculations, leading to a significant speed-up with respect to previous versions (ascl:1305.005). Pkdgrav3 also has a sophisticated time-stepping criterion based on an estimation of the local dynamical time.

  17. Princeton spectral equilibrium code: PSEC

    SciTech Connect

    Ling, K.M.; Jardin, S.C.

    1985-05-15

    A fast computer code has been developed to calculate free-boundary solutions to the plasma equilibrium equation that are consistent with the currents in external coils and conductors. The free-boundary formulation is based on the minimization of a mean-square error epsilon-c while the fixed-boundary solution is based on a variational principle and spectral representation of the coordinates x(psi, theta) and z(psi, theta). Specific calculations using the Columbia University Torus II, the Poloidal Divertor Experiment (PDX), and the Tokamak Fusion Test Reactor (TFTR) geometries are performed.

  18. Princeton spectral equilibrium code: PSEC

    SciTech Connect

    Ling, K.M.; Jardin, S.C.

    1984-03-01

    A fast computer code has been developed to calculate free-boundary solutions to the plasma equilibrium equation that are consistent with the currents in external coils and conductors. The free-boundary formulation is based on the minimization of a mean-square error epsilon while the fixed-boundary solution is based on a variational principle and spectral representation of the coordinates x(psi,theta) and z(psi,theta). Specific calculations using the Columbia University Torus II, the Poloidal Divertor Experiment (PDX), and the Tokamak Fusion Test Reactor (TFTR) geometries are performed.

  19. Using the DEWSBR computer code

    SciTech Connect

    Cable, G.D.

    1989-09-01

    A computer code is described which is designed to determine the fraction of time during which a given ground location is observable from one or more members of a satellite constellation in earth orbit. Ground visibility parameters are determined from the orientation and strength of an appropriate ionized cylinder (used to simulate a beam experiment) at the selected location. Satellite orbits are computed in a simplified two-body approximation computation. A variety of printed and graphical outputs is provided. 9 refs., 50 figs., 2 tabs.
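The geometric core of such a visibility code is small: propagate a circular two-body orbit, rotate the ground site with the Earth, and count samples where the satellite is above the local horizon. A sketch under those simplifications only (spherical Earth, zero-elevation mask; the beam/ionized-cylinder model is not reproduced):

```python
import numpy as np

MU = 398600.4418      # km^3/s^2, Earth's gravitational parameter
RE = 6378.137         # km, Earth's equatorial radius
W_E = 7.2921159e-5    # rad/s, Earth's rotation rate

def visible_fraction(alt_km, inc_deg, lat_deg, n_steps=20000):
    """Fraction of one day during which a ground site at the given
    latitude sees a satellite in a circular orbit (two-body motion,
    spherical Earth, zero-elevation horizon mask)."""
    a = RE + alt_km
    n = np.sqrt(MU / a**3)                     # mean motion, rad/s
    t = np.linspace(0.0, 86400.0, n_steps)
    u = n * t                                  # argument of latitude
    inc = np.radians(inc_deg)
    # satellite position in an inertial frame (ascending node on the x-axis)
    r_sat = a * np.stack([np.cos(u),
                          np.sin(u) * np.cos(inc),
                          np.sin(u) * np.sin(inc)])
    # ground site rotates with the Earth
    lat = np.radians(lat_deg)
    th = W_E * t
    r_gs = RE * np.stack([np.cos(lat) * np.cos(th),
                          np.cos(lat) * np.sin(th),
                          np.full_like(t, np.sin(lat))])
    # above the local horizon when (r_sat - r_gs) . r_gs > 0
    visible = ((r_sat - r_gs) * r_gs).sum(axis=0) > 0.0
    return visible.mean()
```

A geostationary-altitude equatorial satellite stays continuously visible from an equatorial site, while a low equatorial orbit never rises for a site at 60 degrees latitude; a constellation code repeats this calculation per satellite and combines the visibility masks.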

  20. Mosaic of coded aperture arrays

    DOEpatents

    Fenimore, Edward E.; Cannon, Thomas M.

    1980-01-01

    The present invention pertains to a mosaic of coded aperture arrays which is capable of imaging off-axis sources with minimum detector size. Mosaics of the basic array pattern create a circular (periodic) correlation of the object on a section of the picture plane. This section consists of elements of the central basic pattern as well as elements from neighboring patterns and is a cyclic version of the basic pattern. Since all object points contribute a complete cyclic version of the basic pattern, a section of the picture, which is the size of the basic aperture pattern, contains all the information necessary to image the object with no artifacts.
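The cyclic property claimed here is easy to demonstrate: tile a basic pattern into a mosaic, and any basic-sized window cut from the interior is a cyclic shift of the basic pattern, so an off-axis source still casts a complete code. (Arbitrary random pattern for illustration, not a real URA.)

```python
import numpy as np

rng = np.random.default_rng(1)
basic = rng.integers(0, 2, size=(5, 7))       # basic aperture pattern
mosaic = np.tile(basic, (3, 3))               # 3 x 3 mosaic of the pattern

# shadow cast by a source offset (dy, dx) aperture cells off axis:
dy, dx = 2, 3
window = mosaic[5 + dy : 10 + dy, 7 + dx : 14 + dx]   # basic-sized detector section
shifted = np.roll(basic, (-dy, -dx), axis=(0, 1))     # cyclic version of the pattern
```

Because `window` equals a cyclic shift of `basic` for any in-range offset, a detector only the size of the basic pattern records a complete, decodable code for every source position, which is the minimum-detector-size claim of the patent.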

  1. Pump CFD code validation tests

    NASA Astrophysics Data System (ADS)

    Brozowski, L. A.

    1993-12-01

    Pump CFD code validation tests were accomplished by obtaining nonintrusive flow characteristic data at key locations in generic current liquid rocket engine turbopump configurations. Data were obtained with a laser two-focus (L2F) velocimeter at scaled design flow. Three components were surveyed: a 1970's-designed impeller, a 1990's-designed impeller, and a four-bladed unshrouded inducer. Two-dimensional velocities were measured upstream and downstream of the two impellers. Three-dimensional velocities were measured upstream, downstream, and within the blade row of the unshrouded inducer.

  2. Neural Coding for Effective Rehabilitation

    PubMed Central

    2014-01-01

    Successful neurological rehabilitation depends on accurate diagnosis, effective treatment, and quantitative evaluation. Neural coding, a technology for interpretation of functional and structural information of the nervous system, has contributed to advancements in neuroimaging, brain-machine interfaces (BMI), and the design of training devices for rehabilitation purposes. In this review, we summarize the latest breakthroughs in neuroimaging from the microscale to the macroscale level with potential diagnostic applications for rehabilitation. We also review the achievements in electrocorticography (ECoG) coding in both animal models and human beings for BMI design, electromyography (EMG) interpretation for interaction with external robotic systems, and robot-assisted quantitative evaluation of the progress of rehabilitation programs. Future rehabilitation will be more home-based, automated, and self-administered by patients. Further investigations and breakthroughs are mainly needed in improving the computational efficiency of neuroimaging and multichannel ECoG by selection of localized neuroinformatics, validating the effectiveness of BMI-guided rehabilitation programs, and simplifying system operation in training devices. PMID:25258708

  3. GeoPhysical Analysis Code

    2012-12-21

    GPAC is a code that integrates open source libraries for element formulations, linear algebra, and I/O with two main LLNL-written components: (i) a set of standard finite, discrete, and discontinuous displacement element physics solvers for resolving Darcy fluid flow, explicit mechanics, implicit mechanics, fault rupture and earthquake nucleation, and fluid-mediated fracturing, including resolution of physical behaviors both implicitly and explicitly, and (ii) an MPI-based parallelization implementation for use on generic HPC distributed memory architectures. The resultant code can be used alone for linearly elastic problems; problems involving hydraulic fracturing, where the mesh topology is dynamically changed; fault rupture modeling and seismic risk assessment; and general granular materials behavior. The key application domain is low-rate stimulation and fracture control in subsurface reservoirs (e.g., enhanced geothermal sites and unconventional shale gas stimulation). GPAC also has interfaces to call external libraries for, e.g., material models and equations of state; however, LLNL-developed EOS and material models will not be part of the current release. GPAC's secondary applications include modeling fault evolution to predict the statistical distribution of earthquake events and to capture granular materials behavior under different load paths.

  4. GeoPhysical Analysis Code

    SciTech Connect

    2012-12-21

    GPAC is a code that integrates open source libraries for element formulations, linear algebra, and I/O with two main LLNL-written components: (i) a set of standard finite, discrete, and discontinuous displacement element physics solvers for resolving Darcy fluid flow, explicit mechanics, implicit mechanics, fault rupture and earthquake nucleation, and fluid-mediated fracturing, including resolution of physical behaviors both implicitly and explicitly, and (ii) an MPI-based parallelization implementation for use on generic HPC distributed memory architectures. The resultant code can be used alone for linearly elastic problems; problems involving hydraulic fracturing, where the mesh topology is dynamically changed; fault rupture modeling and seismic risk assessment; and general granular materials behavior. The key application domain is low-rate stimulation and fracture control in subsurface reservoirs (e.g., enhanced geothermal sites and unconventional shale gas stimulation). GPAC also has interfaces to call external libraries for, e.g., material models and equations of state; however, LLNL-developed EOS and material models will not be part of the current release. GPAC's secondary applications include modeling fault evolution to predict the statistical distribution of earthquake events and to capture granular materials behavior under different load paths.

  5. The Clawpack Community of Codes

    NASA Astrophysics Data System (ADS)

    Mandli, K. T.; LeVeque, R. J.; Ketcheson, D.; Ahmadia, A. J.

    2014-12-01

    Clawpack, the Conservation Laws Package, has long been one of the standards for solving hyperbolic conservation laws, but over the years it has extended well beyond this role. Today a community of open-source codes has developed that addresses a multitude of different needs, including non-conservative balance laws, high-order accurate methods, and parallelism, while remaining extensible and easy to use, largely through the judicious use of Python and the original Fortran codes that it wraps. This talk will present some of the recent developments in projects under the Clawpack umbrella, notably the GeoClaw and PyClaw projects. GeoClaw was originally developed as a tool for simulating tsunamis using adaptive mesh refinement but has since encompassed a large number of other geophysically relevant flows, including storm surge and debris flows. PyClaw originated as a Python version of the original Clawpack algorithms but has since become both a testing ground for new algorithmic advances in the Clawpack framework and an easily extensible framework for solving hyperbolic balance laws. Some of these extensions include the addition of WENO high-order methods, massively parallel capabilities, and adaptive mesh refinement technologies, made possible largely by the flexibility of the Python language and community libraries such as NumPy and PETSc. Because of the tight integration with Python technologies, both packages have also benefited from the focus on reproducibility in the Python community, notably IPython notebooks.

  6. Transform coding for space applications

    NASA Technical Reports Server (NTRS)

    Glover, Daniel

    1993-01-01

    Data compression coding requirements for aerospace applications differ somewhat from the compression requirements for entertainment systems. On the one hand, entertainment applications are bit rate driven with the goal of getting the best quality possible with a given bandwidth. Science applications are quality driven with the goal of getting the lowest bit rate for a given level of reconstruction quality. In the past, the required quality level has been nothing less than perfect, allowing only the use of lossless compression methods (if that). With the advent of better, faster, cheaper missions, an opportunity has arisen for lossy data compression methods to find a use in science applications as requirements for perfect quality reconstruction run into cost constraints. This paper presents a review of the data compression problem from the space application perspective. Transform coding techniques are described and some simple, integer transforms are presented. The application of these transforms to space-based data compression problems is discussed. Integer transforms have an advantage over conventional transforms in computational complexity. Space applications are different from broadcast or entertainment in that it is desirable to have a simple encoder (in space) and tolerate a more complicated decoder (on the ground) rather than vice versa. Energy compaction with the new transforms is compared with the Walsh-Hadamard (WHT), Discrete Cosine (DCT), and Integer Cosine (ICT) transforms.
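The appeal of integer transforms for a simple space-side encoder is concrete in the Walsh-Hadamard case: the whole transform is additions and subtractions. A sketch of the unnormalized WHT butterflies (illustrative only; the paper's new integer transforms and its comparisons are not reproduced here):

```python
import numpy as np

def wht(x):
    """Unnormalized Walsh-Hadamard transform: butterfly passes using
    only integer adds and subtracts (length must be a power of two)."""
    x = np.array(x, dtype=np.int64)
    h = 1
    while h < len(x):
        for i in range(0, len(x), 2 * h):
            for j in range(i, i + h):
                x[j], x[j + h] = x[j] + x[j + h], x[j] - x[j + h]
        h *= 2
    return x
```

Applying the transform twice returns n times the input, so the decoder (on the ground) runs the same butterfly network followed by a scale; smooth blocks concentrate their energy in the first few coefficients, which is what makes quantization of the transformed block effective.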

  7. Code of ethics: principles for ethical leadership.

    PubMed

    Flite, Cathy A; Harman, Laurinda B

    2013-01-01

    The code of ethics for a professional association incorporates values, principles, and professional standards. A review and comparative analysis of a 1934 pledge and codes of ethics from 1957, 1977, 1988, 1998, 2004, and 2011 for a health information management association was conducted. Highlights of some changes in the healthcare delivery system are identified as a general context for the codes of ethics. The codes of ethics are examined in terms of professional values and changes in the language used to express the principles of the various codes.

  8. Bar Coding and Tracking in Pathology.

    PubMed

    Hanna, Matthew G; Pantanowitz, Liron

    2015-06-01

    Bar coding and specimen tracking are intricately linked to pathology workflow and efficiency. In the pathology laboratory, bar coding facilitates many laboratory practices, including specimen tracking, automation, and quality management. Data obtained from bar coding can be used to identify, locate, standardize, and audit specimens to achieve maximal laboratory efficiency and patient safety. Variables that need to be considered when implementing and maintaining a bar coding and tracking system include assets to be labeled, bar code symbologies, hardware, software, workflow, and laboratory and information technology infrastructure as well as interoperability with the laboratory information system. This article addresses these issues, primarily focusing on surgical pathology.

  9. Syndrome source coding and its universal generalization

    NASA Technical Reports Server (NTRS)

    Ancheta, T. C., Jr.

    1975-01-01

    A method of using error-correcting codes to obtain data compression, called syndrome source coding, is described in which the source sequence is treated as an error pattern whose syndrome forms the compressed data. It is shown that syndrome source coding can achieve arbitrarily small distortion with the number of compressed digits per source digit arbitrarily close to the entropy of a binary memoryless source. A universal generalization of syndrome source coding is formulated which provides robustly effective, distortionless coding of source ensembles.
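The mechanism can be made concrete with the (7,4) Hamming code: treat each 7-bit source block as an error pattern, transmit only its 3-bit syndrome, and decompress by looking up the minimum-weight pattern in that syndrome's coset. Blocks with at most one 1 (the typical case for a low-entropy binary source) round-trip exactly. (A textbook illustration, not the paper's construction.)

```python
import numpy as np

# parity-check matrix of the (7,4) Hamming code: column i is the
# binary expansion of i + 1
H = np.array([[((i + 1) >> b) & 1 for i in range(7)] for b in range(3)])

def compress(block):
    """3-bit syndrome of a 7-bit source block treated as an error
    pattern: 7 source digits -> 3 compressed digits."""
    return H @ np.asarray(block) % 2

def decompress(syn):
    """Coset leader for the syndrome: for the Hamming code, the single-1
    pattern at the position the syndrome spells out (or all zeros)."""
    e = np.zeros(7, dtype=int)
    pos = int(syn[0] + 2 * syn[1] + 4 * syn[2])
    if pos:
        e[pos - 1] = 1
    return e
```

A block with two or more 1s decodes to the wrong coset leader, which is the residual distortion; stronger codes shrink it, and the rate approaches the source entropy as the abstract states.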

  10. National Agenda for Hydrogen Codes and Standards

    SciTech Connect

    Blake, C.

    2010-05-01

    This paper provides an overview of hydrogen codes and standards with an emphasis on the national effort supported and managed by the U.S. Department of Energy (DOE). With the help and cooperation of standards and model code development organizations, industry, and other interested parties, DOE has established a coordinated national agenda for hydrogen and fuel cell codes and standards. With the adoption of the Research, Development, and Demonstration Roadmap and with its implementation through the Codes and Standards Technical Team, DOE helps strengthen the scientific basis for requirements incorporated in codes and standards that, in turn, will facilitate international market receptivity for hydrogen and fuel cell technologies.

  11. Optimal Codes for the Burst Erasure Channel

    NASA Technical Reports Server (NTRS)

    Hamkins, Jon

    2010-01-01

    Deep space communications over noisy channels lead to certain packets that are not decodable. These packets leave gaps, or bursts of erasures, in the data stream. Burst erasure correcting codes overcome this problem. These are forward erasure correcting codes that allow one to recover the missing gaps of data. Much of the recent work on this topic concentrated on Low-Density Parity-Check (LDPC) codes. These are more complicated to encode and decode than Single Parity Check (SPC) codes or Reed-Solomon (RS) codes, and so far have not been able to achieve the theoretical limit for burst erasure protection. A block interleaved maximum distance separable (MDS) code (e.g., an SPC or RS code) offers near-optimal burst erasure protection, in the sense that no other scheme of equal total transmission length and code rate could improve the guaranteed correctible burst erasure length by more than one symbol. The optimality does not depend on the length of the code, i.e., a short MDS code block interleaved to a given length would perform as well as a longer MDS code interleaved to the same overall length. As a result, this approach offers lower decoding complexity with better burst erasure protection compared to other recent designs for the burst erasure channel (e.g., LDPC codes). A limitation of the design is its lack of robustness to channels that have impairments other than burst erasures (e.g., additive white Gaussian noise), making its application best suited for correcting data erasures in layers above the physical layer. The efficiency of a burst erasure code is the length of its burst erasure correction capability divided by the theoretical upper limit on this length. The inefficiency is one minus the efficiency. 
The illustration compares the inefficiency of interleaved RS codes to Quasi-Cyclic (QC) LDPC codes, Euclidean Geometry (EG) LDPC codes, extended Irregular Repeat Accumulate (eIRA) codes, array codes, and random LDPC codes previously proposed for burst erasure
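The block-interleaving argument is simple to exercise with SPC codewords: interleaving to depth d spreads a burst of up to d erasures across d different codewords, so each word sees at most one erasure, which a single parity symbol repairs. A sketch (SPC for concreteness; an interleaved RS code works the same way with longer correctible bursts):

```python
def xor(symbols):
    p = 0
    for s in symbols:
        p ^= s
    return p

def spc_encode(data, k=4):
    """Single-parity-check codewords: k data symbols plus an XOR parity."""
    return [data[i:i + k] + [xor(data[i:i + k])] for i in range(0, len(data), k)]

def interleave(cw):
    """Column-wise readout: adjacent stream symbols come from different
    codewords, so a burst hits each codeword at most once."""
    return [cw[r][c] for c in range(len(cw[0])) for r in range(len(cw))]

def deinterleave(stream, depth, n):
    return [[stream[c * depth + r] for c in range(n)] for r in range(depth)]

def repair(word):
    """Fill at most one erasure (None) from the parity, then drop the parity."""
    if None in word:
        i = word.index(None)
        word = word[:i] + [xor([s for s in word if s is not None])] + word[i + 1:]
    return word[:-1]

data = list(range(16))
stream = interleave(spc_encode(data))     # 4 codewords of length 5, depth-4 interleaver
for i in range(3, 7):                     # burst of 4 erasures = interleaver depth
    stream[i] = None
words = deinterleave(stream, depth=4, n=5)
recovered = [s for w in words for s in repair(w)]
```

Any burst no longer than the interleaver depth is guaranteed correctable, with trivial decoding, which is the low-complexity, near-optimal trade the article argues for over LDPC designs.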

  12. DNA: Polymer and molecular code

    NASA Astrophysics Data System (ADS)

    Shivashankar, G. V.

    1999-10-01

    The thesis work focuses on two aspects of DNA: the polymer and the molecular code. Our approach was to bring single-molecule micromanipulation methods to the study of DNA. It included a home-built optical microscope combined with an atomic force microscope and an optical tweezer. This combined approach led to a novel method to graft a single DNA molecule onto a force cantilever using the optical tweezer and local heating. With this method, a force versus extension assay of double-stranded DNA was realized. The resolution was about 10 picoN. To improve on this force measurement resolution, a simple light backscattering technique was developed and used to probe the DNA polymer flexibility and its fluctuations. It combined the optical tweezer, to trap a DNA-tethered bead, with laser backscattering, to detect the bead's Brownian fluctuations. With this technique the resolution was about 0.1 picoN with a millisecond access time, and the whole entropic part of the DNA force-extension curve was measured. With this experimental strategy, we measured the polymerization of the protein RecA on an isolated double-stranded DNA. We observed the progressive decoration of RecA on the λ DNA molecule, which results in the extension of λ due to unwinding of the double helix. The dynamics of polymerization, the resulting change in the DNA entropic elasticity, and the role of ATP hydrolysis were the main parts of the study. A simple model for RecA assembly on DNA was proposed. This work presents a first step in the study of genetic recombination. Recently we have started a study of equilibrium binding which utilizes fluorescence polarization methods to probe the polymerization of RecA on single-stranded DNA. In addition to the study of the material properties of DNA and DNA-RecA, we have developed experiments for which the code of the DNA is central. We studied one aspect of DNA as a molecular code, using different techniques.
In particular the programmatic use of template specificity makes

  13. 24 CFR 200.926c - Model code provisions for use in partially accepted code jurisdictions.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 24 Housing and Urban Development 2 2011-04-01 2011-04-01 false Model code provisions for use in... Minimum Property Standards § 200.926c Model code provisions for use in partially accepted code..., those portions of one of the model codes with which the property must comply. Schedule for Model...

  14. Using Coding Apps to Support Literacy Instruction and Develop Coding Literacy

    ERIC Educational Resources Information Center

    Hutchison, Amy; Nadolny, Larysa; Estapa, Anne

    2016-01-01

    In this article the authors present the concept of Coding Literacy and describe the ways in which coding apps can support the development of Coding Literacy and disciplinary and digital literacy skills. Through detailed examples, we describe how coding apps can be integrated into literacy instruction to support learning of the Common Core English…

  15. Quantum error-correcting codes from algebraic geometry codes of Castle type

    NASA Astrophysics Data System (ADS)

    Munuera, Carlos; Tenório, Wanderson; Torres, Fernando

    2016-10-01

    We study algebraic geometry codes producing quantum error-correcting codes by the CSS construction. We pay particular attention to the family of Castle codes. We show that many of the examples known in the literature in fact belong to this family of codes. We systematize these constructions by showing the common theory that underlies all of them.

  16. Quantum error-correcting codes from algebraic geometry codes of Castle type

    NASA Astrophysics Data System (ADS)

    Munuera, Carlos; Tenório, Wanderson; Torres, Fernando

    2016-07-01

    We study algebraic geometry codes producing quantum error-correcting codes by the CSS construction. We pay particular attention to the family of Castle codes. We show that many of the examples known in the literature in fact belong to this family of codes. We systematize these constructions by showing the common theory that underlies all of them.

  17. Advanced coding and modulation schemes for TDRSS

    NASA Technical Reports Server (NTRS)

    Harrell, Linda; Kaplan, Ted; Berman, Ted; Chang, Susan

    1993-01-01

    This paper describes the performance of the Ungerboeck and pragmatic 8-Phase Shift Key (PSK) Trellis Code Modulation (TCM) coding techniques with and without a (255,223) Reed-Solomon outer code as they are used for Tracking Data and Relay Satellite System (TDRSS) S-Band and Ku-Band return services. The performance of these codes at high data rates is compared to uncoded Quadrature PSK (QPSK) and rate 1/2 convolutionally coded QPSK in the presence of Radio Frequency Interference (RFI), self-interference, and hardware distortions. This paper shows that the outer Reed-Solomon code is necessary to achieve a 10^-5 Bit Error Rate (BER) with an acceptable level of degradation in the presence of RFI. This paper also shows that the TCM codes with or without the Reed-Solomon outer code do not perform well in the presence of self-interference. In fact, the uncoded QPSK signal performs better than the TCM coded signal in the self-interference situation considered in this analysis. Finally, this paper shows that the E_b/N_0 degradation due to TDRSS hardware distortions is approximately 1.3 dB with a TCM coded signal or a rate 1/2 convolutionally coded QPSK signal and is 3.2 dB with an uncoded QPSK signal.

  18. Multiplexed coding in the human basal ganglia

    NASA Astrophysics Data System (ADS)

    Andres, D. S.; Cerquetti, D.; Merello, M.

    2016-04-01

    A classic controversy in neuroscience is whether information carried by spike trains is encoded by a time averaged measure (e.g. a rate code), or by complex time patterns (i.e. a time code). Here we apply a tool to quantitatively analyze the neural code. We make use of an algorithm based on the calculation of the temporal structure function, which makes it possible to distinguish which scales of a signal are dominated by a complex temporal organization or a randomly generated process. In terms of the neural code, this kind of analysis makes it possible to detect temporal scales at which a time-pattern coding scheme or alternatively a rate code is present. Additionally, by finding the temporal scale at which the correlation between interspike intervals fades, the length of the basic information unit of the code can be established, and hence the word length of the code can be found. We apply this algorithm to neuronal recordings obtained from the Globus Pallidus pars interna of a human patient with Parkinson’s disease, and show that a time-pattern coding scheme and a rate coding scheme co-exist at different temporal scales, offering a new example of multiplexed neuronal coding.
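    A minimal sketch of the structure-function idea described above, using made-up signals rather than pallidal recordings: the first-order temporal structure function S(τ) = ⟨|x(t+τ) − x(t)|⟩ grows with τ at scales dominated by temporal patterns, and stays roughly flat at scales dominated by a random, rate-like process.

```python
import random

def structure_function(x, taus, p=1):
    """Temporal structure function S(tau) = <|x(t+tau) - x(t)|^p> of a discrete signal."""
    out = []
    for tau in taus:
        diffs = [abs(x[t + tau] - x[t]) ** p for t in range(len(x) - tau)]
        out.append(sum(diffs) / len(diffs))
    return out

random.seed(0)
ramp = [0.01 * t for t in range(1000)]             # purely patterned (structured) signal
noise = [random.random() for _ in range(1000)]     # scale-free random signal
S_ramp = structure_function(ramp, [1, 10, 100])    # grows linearly with tau
S_noise = structure_function(noise, [1, 10, 100])  # roughly constant in tau
assert S_ramp[2] > 10 * S_ramp[0]                  # structure dominates these scales
assert abs(S_noise[2] - S_noise[0]) < 0.1          # no temporal organization
```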

  19. The Astrophysics Source Code Library: An Update

    NASA Astrophysics Data System (ADS)

    Allen, Alice; Nemiroff, R. J.; Shamir, L.; Teuben, P. J.

    2012-01-01

    The Astrophysics Source Code Library (ASCL), founded in 1999, takes an active approach to sharing astrophysical source code. ASCL's editor seeks out both new and old peer-reviewed papers that describe methods or experiments that involve the development or use of source code, and adds entries for the found codes to the library. This approach ensures that source codes are added without requiring authors to actively submit them, resulting in a comprehensive listing that covers a significant number of the astrophysics source codes used in peer-reviewed studies. The ASCL moved to a new location in 2010; it now holds over 300 codes and continues to grow. In 2011, the ASCL (http://asterisk.apod.com/viewforum.php?f=35) has on average added 19 new codes per month; we encourage scientists to submit their codes for inclusion. An advisory committee has been established to provide input and guide the development and expansion of its new site, and a marketing plan has been developed and is being executed. All ASCL source codes have been used to generate results published in or submitted to a refereed journal and are freely available either via a download site or from an identified source. This presentation covers the history of the ASCL, examines its current state and benefits along with the means of and requirements for including codes, and outlines its future plans.

  20. Breeding quantum error-correcting codes

    SciTech Connect

    Dong Ying; Hu Dan; Yu Sixia

    2010-02-15

    The stabilizer code, one major family of quantum error-correcting codes (QECC), is specified by the joint eigenspace of a commuting set of Pauli observables. It turns out that noncommuting sets of Pauli observables can be used to construct more efficient QECCs, such as the entanglement-assisted QECCs, which are built directly from any linear classical codes whose detailed properties are needed to determine the parameters of the resulting quantum codes. Here we propose another family of QECCs, namely, the breeding QECCs, that also employ noncommuting sets of Pauli observables and can be built from any classical additive codes, either linear or nonlinear, with the advantage that their parameters can be read off directly from the corresponding classical codes. Besides, since nonlinear codes are generally more efficient than linear codes, our breeding codes have better parameters than those codes built from linear codes. The terminology is justified by the fact that our QECCs are related to the ordinary QECCs in exactly the same way that the breeding protocols are related to the hashing protocols in the entanglement purification.

  1. Verification tests for contaminant transport codes

    SciTech Connect

    Rowe, R.K.; Nadarajah, P.

    1996-12-31

    The importance of verifying contaminant transport codes and the techniques that may be used in this verification process are discussed. Commonly used contaminant transport codes are characterized as belonging to one of several types or classes of solution, such as analytic, finite layer, boundary element, finite difference and finite element. Both the level of approximation and the solution methodology should be verified for each contaminant transport code. One powerful method that may be used in contaminant transport code verification is cross-checking (benchmarking) with other codes. This technique is used to check the results of codes from one solution class against the results of codes from another solution class. In this paper cross-checking is performed for three classes of solution: analytic, finite layer, and finite element.
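    As a toy illustration of cross-checking between two of the solution classes mentioned above (the problem, parameters, and tolerance here are invented for the sketch, not taken from the paper): a one-dimensional diffusion problem with a constant-concentration boundary has the analytic solution C(x,t) = erfc(x / (2·sqrt(D·t))), against which an explicit finite-difference solver can be benchmarked.

```python
import math

# Cross-check (benchmark) a finite-difference diffusion code against the
# analytic solution C(x,t) = erfc(x / (2*sqrt(Dc*t))) for a semi-infinite
# domain with a constant-concentration boundary at x = 0.
Dc, dx, dt, nx = 1.0, 0.1, 0.004, 101        # dt < dx^2 / (2*Dc) for stability
C = [0.0] * nx
C[0] = 1.0                                   # fixed boundary concentration
r = Dc * dt / dx**2
steps = 50                                   # advance to t = steps * dt = 0.2
for _ in range(steps):
    C = [1.0] + [C[i] + r * (C[i+1] - 2*C[i] + C[i-1])
                 for i in range(1, nx - 1)] + [0.0]
t = steps * dt
i = 5                                        # probe the solution at x = 0.5
analytic = math.erfc(i * dx / (2.0 * math.sqrt(Dc * t)))
assert abs(C[i] - analytic) < 0.05           # the two solution classes agree
```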

  2. Turbo codes for deep-space communications

    NASA Technical Reports Server (NTRS)

    Divsalar, D.; Pollara, F.

    1995-01-01

    Turbo codes were recently proposed by Berrou, Glavieux, and Thitimajshima, and it has been claimed these codes achieve near-Shannon-limit error correction performance with relatively simple component codes and large interleavers. A required E_b/N_0 of 0.7 dB was reported for a bit error rate of 10^-5, using a rate 1/2 turbo code. However, some important details that are necessary to reproduce these results were omitted. This article confirms the accuracy of these claims, and presents a complete description of an encoder/decoder pair that could be suitable for deep-space applications, where lower rate codes can be used. We describe a new simple method for trellis termination, analyze the effect of interleaver choice on the weight distribution of the code, and introduce the use of unequal rate component codes, which yield better performance.
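    A hedged sketch of the general turbo-encoder structure the article builds on. The component generators (1, 5/7 in octal), the interleaver, and the message are illustrative choices, and the trellis-termination step the article develops is omitted: two identical recursive systematic convolutional (RSC) encoders share the systematic bits, with the second encoder fed an interleaved copy.

```python
import random

def rsc_parity(bits):
    """Parity stream of a memory-2 RSC encoder with generators (1, 5/7 octal)."""
    r1 = r2 = 0
    parity = []
    for u in bits:
        a = u ^ r1 ^ r2          # feedback polynomial 1 + D + D^2 (octal 7)
        parity.append(a ^ r2)    # forward polynomial  1 + D^2     (octal 5)
        r2, r1 = r1, a
    return parity

def turbo_encode(bits, perm):
    systematic = list(bits)
    p1 = rsc_parity(bits)                       # first component encoder
    p2 = rsc_parity([bits[i] for i in perm])    # second sees interleaved bits
    return systematic, p1, p2                   # rate 1/3 before any puncturing

random.seed(1)
msg = [random.randint(0, 1) for _ in range(16)]
perm = list(range(16))
random.shuffle(perm)                            # pseudo-random interleaver
sys_, p1, p2 = turbo_encode(msg, perm)
```

    The interleaver choice matters because, as the article analyzes, it shapes the weight distribution of the overall code.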

  3. Foliated Quantum Error-Correcting Codes.

    PubMed

    Bolt, A; Duclos-Cianci, G; Poulin, D; Stace, T M

    2016-08-12

    We show how to construct a large class of quantum error-correcting codes, known as Calderbank-Steane-Shor codes, from highly entangled cluster states. This becomes a primitive in a protocol that foliates a series of such cluster states into a much larger cluster state, implementing foliated quantum error correction. We exemplify this construction with several familiar quantum error-correction codes and propose a generic method for decoding foliated codes. We numerically evaluate the error-correction performance of a family of finite-rate Calderbank-Steane-Shor codes known as turbo codes, finding that they perform well over moderate depth foliations. Foliated codes have applications for quantum repeaters and fault-tolerant measurement-based quantum computation. PMID:27563942

  4. New asymmetric quantum codes over Fq

    NASA Astrophysics Data System (ADS)

    Ma, Yuena; Feng, Xiaoyi; Xu, Gen

    2016-07-01

    Two families of new asymmetric quantum codes are constructed in this paper. The first family consists of asymmetric quantum codes with length n = q^m - 1 over F_q, where q ≥ 5 is a prime power. The second consists of asymmetric quantum codes with length n = 3^m - 1. These asymmetric quantum codes are derived from the CSS construction and pairs of nested BCH codes. Moreover, when the defining sets satisfy T1 = T2^(-q), the real Z-distance of our asymmetric quantum codes is much larger than δ_max + 1, where δ_max is the maximal designed distance of the dual-containing narrow-sense BCH code, and the parameters presented here are better than those available in the literature.

  5. From Verified Models to Verifiable Code

    NASA Technical Reports Server (NTRS)

    Lensink, Leonard; Munoz, Cesar A.; Goodloe, Alwyn E.

    2009-01-01

    Declarative specifications of digital systems often contain parts that can be automatically translated into executable code. Automated code generation may reduce or eliminate the kinds of errors typically introduced through manual code writing. For this approach to be effective, the generated code should be reasonably efficient and, more importantly, verifiable. This paper presents a prototype code generator for the Prototype Verification System (PVS) that translates a subset of PVS functional specifications into an intermediate language and subsequently to multiple target programming languages. Several case studies are presented to illustrate the tool's functionality. The generated code can be analyzed by software verification tools such as verification condition generators, static analyzers, and software model-checkers to increase the confidence that the generated code is correct.

  6. Evaluation of help model replacement codes

    SciTech Connect

    Whiteside, Tad; Hang, Thong; Flach, Gregory

    2009-07-01

    This work evaluates the computer codes that are proposed to be used to predict percolation of water through the closure-cap and into the waste containment zone at the Department of Energy closure sites. This work compares the currently used water-balance code (HELP) with newly developed computer codes that use unsaturated flow (Richards’ equation). It provides a literature review of the HELP model and the proposed codes, which result in two recommended codes for further evaluation: HYDRUS-2D3D and VADOSE/W. This further evaluation involved performing actual simulations on a simple model and comparing the results of those simulations to those obtained with the HELP code and the field data. From the results of this work, we conclude that the new codes perform nearly the same, although moving forward, we recommend HYDRUS-2D3D.

  7. Foliated Quantum Error-Correcting Codes

    NASA Astrophysics Data System (ADS)

    Bolt, A.; Duclos-Cianci, G.; Poulin, D.; Stace, T. M.

    2016-08-01

    We show how to construct a large class of quantum error-correcting codes, known as Calderbank-Steane-Shor codes, from highly entangled cluster states. This becomes a primitive in a protocol that foliates a series of such cluster states into a much larger cluster state, implementing foliated quantum error correction. We exemplify this construction with several familiar quantum error-correction codes and propose a generic method for decoding foliated codes. We numerically evaluate the error-correction performance of a family of finite-rate Calderbank-Steane-Shor codes known as turbo codes, finding that they perform well over moderate depth foliations. Foliated codes have applications for quantum repeaters and fault-tolerant measurement-based quantum computation.

  8. Code portability and data management considerations in the SAS3D LMFBR accident-analysis code

    SciTech Connect

    Dunn, F.E.

    1981-01-01

    The SAS3D code was produced from a predecessor in order to reduce or eliminate interrelated problems in the areas of code portability, the large size of the code, inflexibility in the use of memory and the size of cases that can be run, code maintenance, and running speed. Many conventional solutions, such as variable dimensioning, disk storage, virtual memory, and existing code-maintenance utilities were not feasible or did not help in this case. A new data management scheme was developed, coding standards and procedures were adopted, special machine-dependent routines were written, and a portable source code processing code was written. The resulting code is quite portable, quite flexible in the use of memory and the size of cases that can be run, much easier to maintain, and faster running. SAS3D is still a large, long running code that only runs well if sufficient main memory is available.

  9. All-optical code-division multiple-access applications: 2^n extended-prime codes.

    PubMed

    Zhang, J G; Kwong, W C; Mann, S

    1997-09-10

    A new family of 2^n codes, called 2^n extended-prime codes, is proposed for all-optical code-division multiple-access networks. These 2^n codes are derived from so-called extended-prime codes so that their cross-correlation functions are not greater than 1, as opposed to 2 for the recently proposed 2^n prime codes. As a result, the new codes can support a larger number of active users at a given bit-error rate than 2^n prime codes can, while the power-efficient, waveguide-integrable, all-serial coding and correlating configurations proposed for the 2^n prime codes can still be employed.

  10. Interface requirements to couple thermal-hydraulic codes to 3D neutronic codes

    SciTech Connect

    Langenbuch, S.; Austregesilo, H.; Velkov, K.

    1997-07-01

    The present situation of thermal-hydraulic codes and 3D neutronics codes is briefly described, and general considerations for coupling these codes are discussed. Two different basic approaches to coupling are identified and their relative advantages and disadvantages are discussed. The implementation of the coupling for 3D neutronics codes in the system ATHLET is presented. This interface has since been used to couple three different 3D neutronics codes.

  11. Strict optical orthogonal codes for purely asynchronous code-division multiple-access applications.

    PubMed

    Zhang, J G

    1996-12-10

    Strict optical orthogonal codes are presented for purely asynchronous optical code-division multiple-access (CDMA) applications. The proposed code can strictly guarantee the peaks of its cross-correlation functions and the sidelobes of any of its autocorrelation functions to have a value of 1 in purely asynchronous data communications. The basic theory of the proposed codes is given. An experiment on optical CDMA systems is also demonstrated to verify the characteristics of the proposed code.
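    The correlation constraints that strict optical orthogonal codes must satisfy can be checked mechanically. A hedged sketch, using a classic (13, 3) optical orthogonal code pair (mark positions {0, 1, 4} and {0, 2, 7}) rather than the codes of the article:

```python
n = 13  # code length; each codeword places 3 "marks" (chips set to 1)

def seq(marks):
    return [1 if i in marks else 0 for i in range(n)]

def periodic_corr(x, y, shift):
    """Periodic correlation of two 0/1 chip sequences at a given shift."""
    return sum(x[i] * y[(i + shift) % n] for i in range(n))

a, b = seq({0, 1, 4}), seq({0, 2, 7})
# Autocorrelation sidelobes (all nonzero shifts) and cross-correlation peaks
# must not exceed 1 for strict orthogonality.
auto_side = max(periodic_corr(a, a, s) for s in range(1, n))
cross_peak = max(periodic_corr(a, b, s) for s in range(n))
assert auto_side == 1 and cross_peak == 1
```

    The property holds because the cyclic differences of the mark positions of the two codewords are all distinct, which is the standard construction criterion for such codes.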

  12. TVD: Total Variation Diminishing code

    NASA Astrophysics Data System (ADS)

    Pen, Ue-Li; Arras, Phil; Wong, ShingKwong

    2013-04-01

    TVD solves the magnetohydrodynamic (MHD) equations by updating the fluid variables along each direction using the flux-conservative, second-order, total variation diminishing (TVD), upwind scheme of Jin & Xin. The magnetic field is updated separately in two-dimensional advection-constraint steps. The electromotive force (EMF) is computed in the advection step using the TVD scheme, and this same EMF is used immediately in the constraint step in order to preserve ∇ · B = 0 without the need to store intermediate fluxes. The code is extended to three dimensions using operator splitting, and Runge-Kutta is used to get second-order accuracy in time. TVD offers high resolution per grid cell, second-order accuracy in space and time, and enforcement of the ∇ · B = 0 constraint to machine precision. Written in Fortran, it has no memory overhead and is fast. It is also available in a fully scalable message-passing parallel MPI implementation.
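    As a much-reduced, hypothetical sketch of the TVD idea (one-dimensional linear advection with a minmod limiter, not the Jin & Xin MHD scheme the code actually implements), the following update keeps the total variation from growing and introduces no spurious oscillations:

```python
def minmod(a, b):
    """Slope limiter: the smaller-magnitude slope, or zero across extrema."""
    if a * b <= 0:
        return 0.0
    return a if abs(a) < abs(b) else b

def tvd_step(u, c):
    """One minmod-limited second-order upwind step for u_t + a u_x = 0,
    on a periodic grid, with Courant number c = a*dt/dx in (0, 1]."""
    n = len(u)
    slopes = [minmod(u[i] - u[i-1], u[(i+1) % n] - u[i]) for i in range(n)]
    # Upwind face value with a limited half-slope correction.
    face = [u[i] + 0.5 * (1.0 - c) * slopes[i] for i in range(n)]
    return [u[i] - c * (face[i] - face[i-1]) for i in range(n)]

u = [1.0 if 10 <= i < 20 else 0.0 for i in range(50)]   # square pulse
tv0 = sum(abs(u[i] - u[i-1]) for i in range(50))
for _ in range(200):
    u = tvd_step(u, 0.5)
tv = sum(abs(u[i] - u[i-1]) for i in range(50))
assert tv <= tv0 + 1e-12                                # total variation did not grow
assert max(u) <= 1.0 + 1e-12 and min(u) >= -1e-12       # no spurious oscillations
```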

  13. CORTRAN code user manual. [LMFBR

    SciTech Connect

    Cheatham, R.L.; Crawford, S.L.; Khan, E.U.

    1981-02-01

    CORTRAN has been developed as a relatively fast running design code for core-wide steady-state and transient analysis of Liquid Metal Fast Breeder Reactor (LMFBR) cores. The preliminary version of this computer program uses subchannel analysis techniques to compute the velocity and temperature fields on a multiassembly basis for three types of transient forcing functions: total power, total flow, and inlet coolant temperature. Interassembly heat transfer, intra-assembly heat transfer, and intra-assembly flow redistribution due to buoyancy are taken into account. Heat generation within the fuel rods and assembly duct walls is also included. Individual pin radial peaking factors (peak to average for each assembly) can be either read in or calculated from specified normalized neutronic power densities (six per assembly).

  14. NSCool: Neutron star cooling code

    NASA Astrophysics Data System (ADS)

    Page, Dany

    2016-09-01

    NSCool is a 1D (i.e., spherically symmetric) neutron star cooling code written in Fortran 77. The package also contains a series of EOSs (equations of state) to build stars, a series of pre-built stars, and a TOV (Tolman-Oppenheimer-Volkoff) integrator to build stars from an EOS. It can also handle “strange stars” that have a huge density discontinuity between the quark matter and the covering thin baryonic crust. NSCool solves the heat transport and energy balance equations in full general relativity, resulting in a time sequence of temperature profiles (and, in particular, a Teff-age curve). Several heating processes are included, and more can easily be incorporated. In particular it can evolve a star undergoing accretion, with the resulting deep crustal heating, under a steady or time-variable accretion rate. NSCool is robust, very fast, and highly modular, making it easy to add new subroutines for new processes.

  15. Advanced Code for Photocathode Design

    SciTech Connect

    Ives, Robert Lawrence; Jensen, Kevin; Montgomery, Eric; Bui, Thuc

    2015-12-15

    The Phase I activity demonstrated that PhotoQE could be upgraded and modified to allow input through a graphical user interface (GUI). Calls to platform-dependent (e.g., IMSL) functions were removed, and Fortran 77 components were rewritten for Fortran 95 compliance. The subroutines, specifically the common block structures and shared data parameters, were reworked to allow the GUI to update material parameter data, and the system was targeted for desktop personal computer operation. The new structure overcomes the previously rigid and unmodifiable library structures by implementing new materials library data sets and repositioning the library values in external files. Material data may originate from published literature or experimental measurements. Further optimization and restructuring would allow custom and specific emission models for beam codes that rely on parameterized photoemission algorithms. These would be based on simplified and parametric representations updated and extended from previous versions (e.g., Modified Fowler-Dubridge, Modified Three-Step, etc.).

  16. Quantum coding with finite resources

    PubMed Central

    Tomamichel, Marco; Berta, Mario; Renes, Joseph M.

    2016-01-01

    The quantum capacity of a memoryless channel determines the maximal rate at which we can communicate reliably over asymptotically many uses of the channel. Here we illustrate that this asymptotic characterization is insufficient in practical scenarios where decoherence severely limits our ability to manipulate large quantum systems in the encoder and decoder. In practical settings, we should instead focus on the optimal trade-off between three parameters: the rate of the code, the size of the quantum devices at the encoder and decoder, and the fidelity of the transmission. We find approximate and exact characterizations of this trade-off for various channels of interest, including dephasing, depolarizing and erasure channels. In each case, the trade-off is parameterized by the capacity and a second channel parameter, the quantum channel dispersion. In the process, we develop several bounds that are valid for general quantum channels and can be computed for small instances. PMID:27156995

  17. Terrain-Responsive Atmospheric Code

    1991-11-20

    The Terrain-Responsive Atmospheric Code (TRAC) is a real-time emergency response modeling capability designed to advise Emergency Managers of the path, timing, and projected impacts from an atmospheric release. TRAC evaluates the effects of both radiological and non-radiological hazardous substances, gases and particulates. Using available surface and upper air meteorological information, TRAC realistically treats complex sources and atmospheric conditions, such as those found in mountainous terrain. TRAC calculates atmospheric concentration, deposition, and dose for more than 25,000 receptor locations within 80 km of the release point. Human-engineered output products support critical decisions on the type, location, and timing of protective actions for workers and the public during an emergency.

  18. Parallelizing the XSTAR Photoionization Code

    NASA Astrophysics Data System (ADS)

    Noble, M. S.; Ji, L.; Young, A.; Lee, J. C.

    2009-09-01

    We describe two means by which XSTAR, a code which computes physical conditions and emission spectra of photoionized gases, has been parallelized. The first is pvmxstar, a wrapper which can be used in place of the serial xstar2xspec script to foster concurrent execution of the XSTAR command line application on independent sets of parameters. The second is pmodel, a plugin for the Interactive Spectral Interpretation System (ISIS) which allows arbitrary components of a broad range of astrophysical models to be distributed across processors during fitting and confidence limits calculations, by scientists with little training in parallel programming. Plugging the XSTAR family of analytic models into pmodel enables multiple ionization states (e.g., of a complex absorber/emitter) to be computed simultaneously, alleviating the often prohibitive expense of the traditional serial approach. Initial performance results indicate that these methods substantially enlarge the problem space to which XSTAR may be applied within practical timeframes.

  19. Numerical classification of coding sequences

    NASA Technical Reports Server (NTRS)

    Collins, D. W.; Liu, C. C.; Jukes, T. H.

    1992-01-01

    DNA sequences coding for protein may be represented by counts of nucleotides or codons. A complete reading frame may be abbreviated by its base count, e.g. A76C158G121T74, or with the corresponding codon table, e.g. (AAA)0(AAC)1(AAG)9 ... (TTT)0. We propose that these numerical designations be used to augment current methods of sequence annotation. Because base counts and codon tables do not require revision as knowledge of function evolves, they are well-suited to act as cross-references, for example to identify redundant GenBank entries. These descriptors may be compared, in place of DNA sequences, to extract homologous genes from large databases. This approach permits rapid searching with good selectivity.
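    The proposed descriptors are easy to compute. A minimal sketch, assuming an in-frame coding sequence (the toy sequence below is invented for illustration):

```python
from collections import Counter
from itertools import product

def base_count(seq):
    """Abbreviate a reading frame by its base count, e.g. A76C158G121T74."""
    c = Counter(seq)
    return "".join(f"{b}{c[b]}" for b in "ACGT")

def codon_table(seq):
    """Count all 64 codons of an in-frame sequence, e.g. (AAA)0(AAC)1..."""
    codons = Counter(seq[i:i+3] for i in range(0, len(seq) - len(seq) % 3, 3))
    return {"".join(t): codons["".join(t)] for t in product("ACGT", repeat=3)}

seq = "ATGAAACGCTAA"                  # toy reading frame: Met-Lys-Arg-Stop
print(base_count(seq))                # -> A6C2G2T2
table = codon_table(seq)
assert table["ATG"] == 1 and table["AAA"] == 1 and sum(table.values()) == 4
```

    Because these counts never change as functional annotation evolves, two entries with identical descriptors are strong candidates for being redundant database records.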

  20. Bitplane Image Coding With Parallel Coefficient Processing.

    PubMed

    Auli-Llinas, Francesc; Enfedaque, Pablo; Moure, Juan C; Sanchez, Victor

    2016-01-01

    Image coding systems have been traditionally tailored for multiple instruction, multiple data (MIMD) computing. In general, they partition the (transformed) image into codeblocks that can be coded in the cores of MIMD-based processors. Each core executes a sequential flow of instructions to process the coefficients in the codeblock, independently and asynchronously from the other cores. Bitplane coding is a common strategy to code such data. Most of its mechanisms require sequential processing of the coefficients. Recent years have seen the rise of processing accelerators with enhanced computational performance and power efficiency whose architecture is mainly based on the single instruction, multiple data (SIMD) principle. SIMD computing refers to the execution of the same instruction on multiple data in a lockstep, synchronous way. Unfortunately, current bitplane coding strategies cannot fully profit from such processors due to their inherently sequential coding tasks. This paper presents bitplane image coding with parallel coefficient (BPC-PaCo) processing, a coding method that can process many coefficients within a codeblock in parallel and synchronously. To this end, the scanning order, the context formation, the probability model, and the arithmetic coder of the coding engine have been reformulated. The experimental results suggest that the penalty in coding performance of BPC-PaCo with respect to the traditional strategies is almost negligible.
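    A hedged sketch of the bitplane view of a codeblock (the toy coefficients are invented; this illustrates the data layout, not the BPC-PaCo coder itself): each coefficient contributes one bit per plane, most significant plane first, and a SIMD-style coder would process all bits of one plane in lockstep rather than scanning coefficients one by one.

```python
coeffs = [13, 2, 7, 0, 9]                  # toy quantized codeblock
nplanes = max(coeffs).bit_length()
# One bit vector per plane, most significant first; a SIMD coder can
# process every entry of a plane simultaneously.
planes = [[(c >> p) & 1 for c in coeffs]
          for p in range(nplanes - 1, -1, -1)]
# Reassemble the coefficients to check the decomposition is lossless.
rebuilt = [sum(bit << (nplanes - 1 - p) for p, bit in enumerate(col))
           for col in zip(*planes)]
assert rebuilt == coeffs
```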

  1. Spin glasses and error-correcting codes

    NASA Technical Reports Server (NTRS)

    Belongie, M. L.

    1994-01-01

    In this article, we study a model for error-correcting codes that comes from spin glass theory and leads to both new codes and a new decoding technique. Using the theory of spin glasses, it has been proven that a simple construction yields a family of binary codes whose performance asymptotically approaches the Shannon bound for the Gaussian channel. The limit is approached as the number of information bits per codeword approaches infinity while the rate of the code approaches zero. Thus, the codes rapidly become impractical. We present simulation results that show the performance of a few manageable examples of these codes. In the correspondence that exists between spin glasses and error-correcting codes, the concept of a thermal average leads to a method of decoding that differs from the standard method of finding the most likely information sequence for a given received codeword. Whereas the standard method corresponds to calculating the thermal average at temperature zero, calculating the thermal average at a certain optimum temperature results instead in the sequence of most likely information bits. Since linear block codes and convolutional codes can be viewed as examples of spin glasses, this new decoding method can be used to decode these codes in a way that minimizes the bit error rate instead of the codeword error rate. We present simulation results that show a small improvement in bit error rate by using the thermal average technique.
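    A toy, hypothetical illustration of the distinction the article draws (the likelihood ratios are made up, and the code is the (3,2) single parity check code, not a spin-glass code): decoding to the most likely codeword corresponds to the thermal average at temperature zero, while the sequence of most likely information bits comes from per-bit marginals, i.e. a thermal average at finite temperature.

```python
from itertools import product

codewords = [c for c in product([0, 1], repeat=3) if sum(c) % 2 == 0]
L = [1.4, 1.5, 1.6]          # hypothetical per-bit likelihood ratios P(y|1)/P(y|0)

def posterior(c):
    """Unnormalized posterior probability of codeword c given the channel output."""
    w = 1.0
    for bit, l in zip(c, L):
        w *= l if bit else 1.0
    return w

# "Temperature zero": the single most likely codeword (minimizes word error rate).
seq_map = max(codewords, key=posterior)
# Finite temperature: per-bit marginals (minimizes bit error rate).
total = sum(posterior(c) for c in codewords)
marg = [sum(posterior(c) for c in codewords if c[i] == 1) / total for i in range(3)]
bit_map = tuple(1 if p > 0.5 else 0 for p in marg)
# The bitwise decision need not even be a codeword:
assert seq_map == (0, 1, 1) and bit_map == (1, 1, 1)
```

    The example shows why the two decoding rules can disagree: minimizing the bit error rate and minimizing the codeword error rate are different objectives.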

  2. Speech coding research at Bell Laboratories

    NASA Astrophysics Data System (ADS)

    Atal, Bishnu S.

    2001-05-01

    The field of speech coding is now over 70 years old. It started from the desire to transmit voice signals over telegraph cables. The availability of digital computers in the mid 1960s made it possible to test complex speech coding algorithms rapidly. The introduction of linear predictive coding (LPC) started a new era in speech coding. The fundamental philosophy of speech coding went through a major shift, resulting in a new generation of low bit rate speech coders, such as multi-pulse and code-excited LPC. The semiconductor revolution produced faster and faster DSP chips and made linear predictive coding practical. Code-excited LPC has become the method of choice for low bit rate speech coding applications and is used in most voice transmission standards for cell phones. Digital speech communication is rapidly evolving from circuit-switched to packet-switched networks to provide integrated transmission of voice, data, and video signals. The new communication environment is also moving the focus of speech coding research from compression to low cost, reliable, and secure transmission of voice signals on digital networks, and provides the motivation for creating a new class of speech coders suitable for future applications.

  3. The trellis complexity of convolutional codes

    NASA Technical Reports Server (NTRS)

    Mceliece, R. J.; Lin, W.

    1995-01-01

    It has long been known that convolutional codes have a natural, regular trellis structure that facilitates the implementation of Viterbi's algorithm. It has gradually become apparent that linear block codes also have a natural, though not in general a regular, 'minimal' trellis structure, which allows them to be decoded with a Viterbi-like algorithm. In both cases, the complexity of the Viterbi decoding algorithm can be accurately estimated by the number of trellis edges per encoded bit. It would, therefore, appear that we are in a good position to make a fair comparison of the Viterbi decoding complexity of block and convolutional codes. Unfortunately, however, this comparison is somewhat muddled by the fact that some convolutional codes, the punctured convolutional codes, are known to have trellis representations that are significantly less complex than the conventional trellis. In other words, the conventional trellis representation for a convolutional code may not be the minimal trellis representation. Thus, ironically, at present we seem to know more about the minimal trellis representation for block than for convolutional codes. In this article, we provide a remedy, by developing a theory of minimal trellises for convolutional codes. (A similar theory has recently been given by Sidorenko and Zyablov). This allows us to make a direct performance-complexity comparison for block and convolutional codes. A by-product of our work is an algorithm for choosing, from among all generator matrices for a given convolutional code, what we call a trellis-minimal generator matrix, from which the minimal trellis for the code can be directly constructed. Another by-product is that, in the new theory, punctured convolutional codes no longer appear as a special class, but simply as high-rate convolutional codes whose trellis complexity is unexpectedly small.
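The regular trellis structure and its edge-count complexity measure can be made concrete on a small example. The sketch below uses the standard rate-1/2, constraint-length-3 convolutional code with octal generators (7, 5), chosen for illustration (the article does not single out this code). Its 4-state trellis has 8 edges per trellis section, i.e. 4 trellis edges per encoded bit.

```python
def conv_encode(bits):
    """Rate-1/2 convolutional encoder, octal generators (7, 5), zero start state."""
    s = 0                       # state = (u[n-1], u[n-2]) packed into two bits
    out = []
    for u in bits:
        u1, u2 = (s >> 1) & 1, s & 1
        out += [u ^ u1 ^ u2, u ^ u2]
        s = ((u << 1) | u1) & 3
    return out

def viterbi_decode(received, nbits):
    """Hard-decision Viterbi decoding over the 4-state trellis: at each
    section, keep the best (minimum Hamming metric) path into each state."""
    INF = float("inf")
    metric = [0, INF, INF, INF]            # encoder starts in state 0
    paths = [[], [], [], []]
    for t in range(nbits):
        r0, r1 = received[2 * t], received[2 * t + 1]
        new_metric = [INF] * 4
        new_paths = [None] * 4
        for s in range(4):
            if metric[s] == INF:
                continue
            u1, u2 = (s >> 1) & 1, s & 1
            for u in (0, 1):               # two edges leave every state
                o0, o1 = u ^ u1 ^ u2, u ^ u2
                ns = ((u << 1) | u1) & 3
                m = metric[s] + (o0 ^ r0) + (o1 ^ r1)
                if m < new_metric[ns]:
                    new_metric[ns] = m
                    new_paths[ns] = paths[s] + [u]
        metric, paths = new_metric, new_paths
    return paths[min(range(4), key=metric.__getitem__)]
```

Since this code has free distance 5, the decoder corrects any single channel bit error over a terminated block, as the test below exercises.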

  4. Serial-Turbo-Trellis-Coded Modulation with Rate-1 Inner Code

    NASA Technical Reports Server (NTRS)

    Divsalar, Dariush; Dolinar, Sam; Pollara, Fabrizio

    2004-01-01

    Serially concatenated turbo codes have been proposed to satisfy requirements for low bit- and word-error rates and for low (in comparison with related previous codes) complexity of coding and decoding algorithms and thus low complexity of coding and decoding circuitry. These codes are applicable to such high-level modulations as octonary phase-shift keying (8PSK) and 16-state quadrature amplitude modulation (16QAM); the coded-modulation scheme obtained by applying one of these codes to one of these modulations is denoted, generally, as serially concatenated trellis-coded modulation (SCTCM). These codes could be particularly beneficial for communication systems that must be designed and operated subject to limitations on bandwidth and power. Some background information is prerequisite to a meaningful summary of this development. Trellis-coded modulation (TCM) is now a well-established technique in digital communications. A turbo code combines binary component codes (which typically include trellis codes) with interleaving. A turbo code of the type that has been studied prior to this development is composed of parallel concatenated convolutional codes (PCCCs) implemented by two or more constituent systematic encoders joined through one or more interleavers. The input information bits feed the first encoder and, after having been scrambled by the interleaver, enter the second encoder. A code word of a parallel concatenated code consists of the input bits to the first encoder followed by the parity check bits of both encoders. The suboptimal iterative decoding structure for such a code is modular, and consists of a set of concatenated decoding modules, one for each constituent code, connected through an interleaver identical to the one on the encoder side. Each decoder performs weighted soft decoding of the input sequence. PCCCs yield very large coding gains at the cost of a reduction in the data rate and/or an increase in bandwidth.

  5. Turbo Codes with Modified Code Matched Interleaver for Coded-Cooperation in Half-Duplex Wireless Relay Networks

    NASA Astrophysics Data System (ADS)

    Ejaz, Saqib; Yang, Feng-Fan

    2015-03-01

    The parallel encoding and decoding structure of turbo codes makes them natural candidates for coded-cooperative scenarios. In this paper, we focus on one of the key components of turbo codes, the interleaver, and analyze its effect on the performance of coded-cooperative communication. The impact of an interleaver on the overall performance of cooperative systems depends on the type of interleaver and its location in the cooperative encoding scheme. We consider the code matched interleaver (CMI) as an optimum choice and present its role in a coded-cooperation scenario. The search and convergence of a CMI for long interleaver sizes is an issue; therefore, a modification in the search conditions is included without any compromise on the performance of the CMI. We also present an analytical method to determine the maximum S-constraint length for a CMI design. Further, we analyze the performance of two different encoding schemes of turbo codes, i.e., distributed turbo code (DTC) and distributed multiple turbo code (DMTC), after inclusion of the CMI. Monte Carlo simulations show that the CMI increases the diversity gain relative to other conventional interleavers such as the uniform random interleaver. The channel is assumed to be Rayleigh fading among all communication nodes.
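The S-constraint mentioned above refers to the spreading property of S-random-style interleavers: indices that lie within S of each other must be mapped more than S apart, which breaks up low-weight error patterns. A minimal trial-and-error construction is sketched below with invented parameters; it is not the code matched interleaver search of the paper, only the simpler baseline it is compared against.

```python
import random

def s_random_interleaver(n, S, seed=0, max_restarts=200):
    """Greedy trial-and-error search for an S-random permutation of size n:
    any two indices within S of each other are mapped more than S apart."""
    rng = random.Random(seed)
    for _ in range(max_restarts):
        remaining = list(range(n))
        rng.shuffle(remaining)
        perm = []
        while remaining:
            recent = perm[-S:]             # positions written in the last S steps
            for j, cand in enumerate(remaining):
                if all(abs(cand - p) > S for p in recent):
                    perm.append(remaining.pop(j))
                    break
            else:
                break                      # dead end: restart with a fresh shuffle
        if len(perm) == n:
            return perm
    raise RuntimeError("no S-random permutation found; try a smaller S")
```

The restart loop reflects the convergence issue the abstract alludes to: as S grows toward roughly sqrt(n/2), the greedy search fails more often and more restarts are needed.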

  6. Multidimensional Trellis Coded Phase Modulation Using a Multilevel Concatenation Approach. Part 1; Code Design

    NASA Technical Reports Server (NTRS)

    Rajpal, Sandeep; Rhee, Do Jun; Lin, Shu

    1997-01-01

    The first part of this paper presents a simple and systematic technique for constructing multidimensional M-ary phase shift keying (MPSK) trellis coded modulation (TCM) codes. The construction is based on a multilevel concatenation approach in which binary convolutional codes with good free branch distances are used as the outer codes and block MPSK modulation codes are used as the inner codes (or the signal spaces). Conditions on phase invariance of these codes are derived and a multistage decoding scheme for these codes is proposed. The proposed technique can be used to construct good codes for both the additive white Gaussian noise (AWGN) and fading channels, as is shown in the second part of this paper.

  7. Physical Activity Monitoring: Gadgets and Uses. Article #6 in a 6-Part Series

    ERIC Educational Resources Information Center

    Mears, Derrick

    2010-01-01

    An early 15th century drawing by Leonardo da Vinci depicted a device that used gears and a pendulum that moved in synchronization with the wearer as he or she walked. This is believed to be the early origins of today's physical activity monitoring devices. Today's devices have vastly expanded on da Vinci's ancient concept with a myriad of options…

  8. Correct coding for the orthopedic surgeon.

    PubMed

    Malek, M Mike; Friedman, Melvin M; Beach, William

    2002-04-01

    Coding accurately is one of the main principles of a successful practice. Some changes that we will see shortly include deletion of the term "separate procedure," deletion of the term "with and/or without," deletion of the term "any method," revision of the criteria for choosing E/M levels, and 52 new and revised Hand Surgery codes. Some other changes to come will be category II and category III codes. More changes are occurring as this is written, and the best advice is to stay tuned. It is obvious to the authors that coding is mainly for reimbursement purposes. The orthopedic surgeon must remain vigilant and must not pass this task on to someone else. Ignorance of coding methods is not an excuse [2]. We must all watch carefully and speak up when necessary. In this day of decreasing reimbursement, we can all increase our revenue stream without working any harder if we code our work properly, completely, and promptly.

  9. HERCULES: A Pattern Driven Code Transformation System

    SciTech Connect

    Kartsaklis, Christos; Hernandez, Oscar R; Hsu, Chung-Hsing; Ilsche, Thomas; Joubert, Wayne; Graham, Richard L

    2012-01-01

    New parallel computers are emerging, but developing efficient scientific code for them remains difficult. A scientist must manage not only the science-domain complexity but also the performance-optimization complexity. HERCULES is a code transformation system designed to help the scientist separate the two concerns, which improves code maintenance and facilitates performance optimization. The system combines three technologies, code patterns, transformation scripts and compiler plugins, to provide the scientist with an environment to quickly implement code transformations that suit their needs. Unlike existing code optimization tools, HERCULES is unique in its focus on user-level accessibility. In this paper we discuss the design, implementation and an initial evaluation of HERCULES.

  10. Code Samples Used for Complexity and Control

    NASA Astrophysics Data System (ADS)

    Ivancevic, Vladimir G.; Reid, Darryn J.

    2015-11-01

    The following sections are included: * MathematicaⓇ Code * Generic Chaotic Simulator * Vector Differential Operators * NLS Explorer * 2C++ Code * C++ Lambda Functions for Real Calculus * Accelerometer Data Processor * Simple Predictor-Corrector Integrator * Solving the BVP with the Shooting Method * Linear Hyperbolic PDE Solver * Linear Elliptic PDE Solver * Method of Lines for a Set of the NLS Equations * C# Code * Iterative Equation Solver * Simulated Annealing: A Function Minimum * Simple Nonlinear Dynamics * Nonlinear Pendulum Simulator * Lagrangian Dynamics Simulator * Complex-Valued Crowd Attractor Dynamics * Freeform Fortran Code * Lorenz Attractor Simulator * Complex Lorenz Attractor * Simple SGE Soliton * Complex Signal Presentation * Gaussian Wave Packet * Hermitian Matrices * Euclidean L2-Norm * Vector/Matrix Operations * Plain C-Code: Levenberg-Marquardt Optimizer * Free Basic Code: 2D Crowd Dynamics with 3000 Agents

  11. An Experiment in Scientific Code Semantic Analysis

    NASA Technical Reports Server (NTRS)

    Stewart, Mark E. M.

    1998-01-01

    This paper concerns a procedure that analyzes aspects of the meaning or semantics of scientific and engineering code. This procedure involves taking a user's existing code, adding semantic declarations for some primitive variables, and parsing this annotated code using multiple, distributed expert parsers. These semantic parsers are designed to recognize formulae in different disciplines, including physical and mathematical formulae and geometrical position in a numerical scheme. The parsers will automatically recognize and document some static, semantic concepts and locate some program semantic errors. Results are shown for a subroutine test case and a collection of combustion code routines. This ability to locate some semantic errors and document semantic concepts in scientific and engineering code should reduce the time, risk, and effort of developing and using these codes.
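One semantic check of the kind such parsers can support is dimensional consistency of physical formulae. The following self-contained sketch of the idea (not the paper's actual parser) tags values with exponents of a few SI base dimensions and rejects additions that mix them:

```python
class Quantity:
    """A value tagged with exponents of the base dimensions (m, s, kg)."""
    def __init__(self, value, m=0, s=0, kg=0):
        self.value, self.dim = value, (m, s, kg)

    def __add__(self, other):
        if self.dim != other.dim:          # adding a length to a time is a bug
            raise TypeError("dimension mismatch: %s vs %s" % (self.dim, other.dim))
        return Quantity(self.value + other.value, *self.dim)

    def __mul__(self, other):              # multiplication adds exponents
        return Quantity(self.value * other.value,
                        *[a + b for a, b in zip(self.dim, other.dim)])

    def __truediv__(self, other):          # division subtracts exponents
        return Quantity(self.value / other.value,
                        *[a - b for a, b in zip(self.dim, other.dim)])
```

Annotating a handful of primitive variables this way lets the checker propagate dimensions through a whole formula and flag the mismatch at the exact operation that introduces it.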

  12. Wire codes, magnetic fields, and childhood cancer

    SciTech Connect

    Kheifets, L.I.; Kavet, R.; Sussman, S.S.

    1997-05-01

    Childhood cancer has been modestly associated with wire codes, an exposure surrogate for power frequency magnetic fields, but less consistently with measured fields. The authors analyzed data on the population distribution of wire codes and their relationship with several measured magnetic field metrics. In a given geographic area, there is a marked trend for decreased prevalence from low to high wire code categories, but there are differences between areas. For average measured fields, there is a positive relationship between the mean of the distributions and wire codes but a large overlap among the categories. Better discrimination is obtained for the extremes of the measurement values when comparing the highest and the lowest wire code categories. Instability of measurements, intermittent fields, or other exposure conditions do not appear to provide a viable explanation for the differences between wire codes and magnetic fields with respect to the strength and consistency of their respective association with childhood cancer.

  13. Survey of nuclear fuel-cycle codes

    SciTech Connect

    Thomas, C.R.; de Saussure, G.; Marable, J.H.

    1981-04-01

    A two-month survey of nuclear fuel-cycle models was undertaken. This report presents the information forthcoming from the survey. Of the nearly thirty codes reviewed in the survey, fifteen have been identified as potentially useful in fulfilling the tasks of the Nuclear Energy Analysis Division (NEAD) as defined in their FY 1981-1982 Program Plan. Six of the fifteen codes are given individual reviews. The individual reviews address such items as the funding agency, the author and organization, the date of completion of the code, adequacy of documentation, computer requirements, history of use, variables that are input and forecast, type of reactors considered, part of fuel cycle modeled and scope of the code (international or domestic, long-term or short-term, regional or national). The report recommends that the Model Evaluation Team perform an evaluation of the EUREKA uranium mining and milling code.

  14. Material model library for explicit numerical codes

    SciTech Connect

    Hofmann, R.; Dial, B.W.

    1982-08-01

    A material model logic structure has been developed which is useful for most explicit finite-difference and explicit finite-element Lagrange computer codes. This structure has been implemented and tested in the STEALTH codes to provide an example for researchers who wish to implement it in generically similar codes. In parallel with these models, material parameter libraries have been created for the implemented models for materials which are often needed in DoD applications.

  15. Improvements to SOIL: An Eulerian hydrodynamics code

    SciTech Connect

    Davis, C.G.

    1988-04-01

    Possible improvements to SOIL, an Eulerian hydrodynamics code that can do coupled radiation diffusion and strength of materials, are presented in this report. Our research is based on the inspection of other Eulerian codes and theoretical reports on hydrodynamics. Several conclusions from the present study suggest that some improvements are in order, such as second-order advection, adaptive meshes, and speedup of the code by vectorization and/or multitasking. 29 refs., 2 figs.
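The gain from second-order advection can be sketched in a few lines. These are generic 1-D schemes on a periodic grid, chosen for illustration; they are not SOIL's actual differencing.

```python
def upwind_step(u, c, dx, dt):
    """One first-order upwind step for du/dt + c du/dx = 0 (c > 0), periodic."""
    n, lam = len(u), c * dt / dx           # lam: Courant number (stable if <= 1)
    return [u[i] - lam * (u[i] - u[i - 1]) for i in range(n)]

def lax_wendroff_step(u, c, dx, dt):
    """One second-order Lax-Wendroff step: sharper fronts and less numerical
    diffusion than upwinding, at the cost of dispersive ripples."""
    n, lam = len(u), c * dt / dx
    return [u[i]
            - 0.5 * lam * (u[(i + 1) % n] - u[i - 1])
            + 0.5 * lam * lam * (u[(i + 1) % n] - 2 * u[i] + u[i - 1])
            for i in range(n)]
```

Both steps conserve the advected quantity exactly on a periodic grid; the practical difference shows up in how quickly a sharp profile is smeared after many steps.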

  16. Codes and Standards Technical Team Roadmap

    SciTech Connect

    2013-06-01

    The Hydrogen Codes and Standards Tech Team (CSTT) mission is to enable and facilitate the appropriate research, development, & demonstration (RD&D) for the development of safe, performance-based defensible technical codes and standards that support the technology readiness and are appropriate for widespread consumer use of fuel cells and hydrogen-based technologies with commercialization by 2020. Therefore, it is important that the necessary codes and standards be in place no later than 2015.

  17. De-coding and re-coding RNA recognition by PUF and PPR repeat proteins.

    PubMed

    Hall, Traci M Tanaka

    2016-02-01

    PUF and PPR proteins are two families of α-helical repeat proteins that recognize single-stranded RNA sequences. Both protein families hold promise as scaffolds for designed RNA-binding domains. A modular protein RNA recognition code was apparent from the first crystal structures of a PUF protein in complex with RNA, and recent studies continue to advance our understanding of natural PUF protein recognition (de-coding) and our ability to engineer specificity (re-coding). Degenerate recognition motifs make de-coding specificity of individual PPR proteins challenging. Nevertheless, re-coding PPR protein specificity using a consensus recognition code has been successful.
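The modularity of the PUF code means base prediction reduces to a per-repeat lookup. The toy table below uses the commonly cited simplified two-residue code for the natural repeats (Cys/Gln recognizing A, Asn/Gln recognizing U, Ser/Glu recognizing G); the lookup itself is our illustration, not the paper's, and real recognition has known exceptions.

```python
# Simplified two-residue PUF recognition code: the two RNA-contacting side
# chains of a repeat vs. the base bound. Illustrative subset only.
PUF_CODE = {("Cys", "Gln"): "A", ("Asn", "Gln"): "U", ("Ser", "Glu"): "G"}

def predict_target(repeats):
    """Predict one RNA base per repeat; '?' marks residue combinations the
    simple consensus code does not cover (cf. the degenerate PPR motifs)."""
    return "".join(PUF_CODE.get(pair, "?") for pair in repeats)
```

The '?' fallback is the crux of the de-coding problem for PPR proteins: with degenerate motifs, many repeats fall outside any clean consensus table.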

  18. Employment opportunities for non-coding RNAs.

    PubMed

    Morey, Céline; Avner, Philip

    2004-06-01

    Analysis of the genomes of several higher eukaryotic organisms, including mouse and human, has reached the striking conclusion that the mammalian transcriptome is constituted in large part of non-protein-coding transcripts. Conversely, the number of protein-coding genes was, at least initially, overestimated. A growing number of studies report the involvement of non-coding transcripts in a large variety of regulatory processes. This review examines the different types of non-coding RNAs (ncRNAs) and discusses their putative mode of action with particular reference to large ncRNAs and their role in epigenetic regulation.

  19. The NYU inverse swept wing code

    NASA Technical Reports Server (NTRS)

    Bauer, F.; Garabedian, P.; Mcfadden, G.

    1983-01-01

    An inverse swept wing code is described that is based on the widely used transonic flow program FLO22. The new code incorporates a free boundary algorithm permitting the pressure distribution to be prescribed over a portion of the wing surface. A special routine is included to calculate the wave drag, which can be minimized in its dependence on the pressure distribution. An alternate formulation of the boundary condition at infinity was introduced to enhance the speed and accuracy of the code. A FORTRAN listing of the code and a listing of a sample run are presented. There is also a user's manual as well as glossaries of input and output parameters.

  20. Chronic care management coding for neurologists

    PubMed Central

    2015-01-01

    Chronic care management provides a way for neurologists to code for time spent by clinical office staff who coordinate services for patients with major chronic illnesses. Medicare allows payment for one such code; some third party payers accept 2 additional codes. When using these codes, the physician develops a Care Plan that organizes the patient's medical and psychosocial needs. Clinical office staff communicates among the patient's physicians, therapists, community services, the patient, family, and caregiver. The patient chooses only one physician whose office provides these coordination services. Rules include 24/7 access for urgent phone contact and use of an electronic health record system. PMID:26526602

  1. QR codes: next level of social media.

    PubMed

    Gottesman, Wesley; Baum, Neil

    2013-01-01

    The QR code (short for quick response code) system was invented in Japan for the auto industry. Its purpose was to track vehicles during manufacture; it was designed to allow high-speed component scanning. Now the scanning can be easily accomplished via cell phone, making the technology useful and within reach of your patients. There are numerous applications for QR codes in the contemporary medical practice. This article describes QR codes and how they might be applied for marketing and practice management. PMID:23866649

  2. MINET (momentum integral network) code documentation

    SciTech Connect

    Van Tuyle, G J; Nepsee, T C; Guppy, J G

    1989-12-01

    The MINET computer code, developed for the transient analysis of fluid flow and heat transfer, is documented in this four-part reference. In Part 1, the MINET models, which are based on a momentum integral network method, are described. The various aspects of utilizing the MINET code are discussed in Part 2, The User's Manual. The third part is a code description, detailing the basic code structure and the various subroutines and functions that make up MINET. In Part 4, example input decks, as well as recent validation studies and applications of MINET are summarized. 32 refs., 36 figs., 47 tabs.

  3. Subsystem codes with spatially local generators

    SciTech Connect

    Bravyi, Sergey

    2011-01-15

    We study subsystem codes whose gauge group has local generators in two-dimensional (2D) geometry. It is shown that there exists a family of such codes defined on lattices of size LxL with the number of logical qubits k and the minimum distance d both proportional to L. The gauge group of these codes involves only two-qubit generators of type XX and ZZ coupling nearest-neighbor qubits (and some auxiliary one-qubit generators). Our proof is not constructive as it relies on a certain version of the Gilbert-Varshamov bound for classical codes. Along the way, we introduce and study properties of generalized Bacon-Shor codes that might be of independent interest. Secondly, we prove that any 2D subsystem [n,k,d] code with spatially local generators obeys the upper bounds kd = O(n) and d^2 = O(n). The analogous upper bound proved recently for 2D stabilizer codes is kd^2 = O(n). Our results thus demonstrate that subsystem codes can be more powerful than stabilizer codes under the spatial locality constraint.
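The quoted bounds are easy to sanity-check against the textbook Bacon-Shor parameters (n = L^2 physical qubits, k = 1 logical qubit, d = L), which saturate d^2 = O(n); this arithmetic check is ours, not a construction from the paper.

```python
def bacon_shor_params(L):
    """[[n, k, d]] parameters of the standard L x L Bacon-Shor subsystem code."""
    return L * L, 1, L

for L in range(2, 50):
    n, k, d = bacon_shor_params(L)
    assert k * d <= n      # consistent with the kd = O(n) bound
    assert d * d <= n      # saturates the d^2 = O(n) bound (d * d == n here)
```

The codes the paper proves to exist are stronger than this standard family: with both k and d proportional to L, the product kd grows linearly in n rather than as sqrt(n).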

  4. Utilizing GPUs to Accelerate Turbomachinery CFD Codes

    NASA Technical Reports Server (NTRS)

    MacCalla, Weylin; Kulkarni, Sameer

    2016-01-01

    GPU computing has established itself as a way to accelerate parallel codes in the high performance computing world. This work focuses on speeding up APNASA, a legacy CFD code used at NASA Glenn Research Center, while also drawing conclusions about the nature of GPU computing and the requirements to make GPGPU worthwhile on legacy codes. Rewriting and restructuring of the source code was avoided to limit the introduction of new bugs. The code was profiled and investigated for parallelization potential, then OpenACC directives were used to indicate parallel parts of the code. The use of OpenACC directives was not able to reduce the runtime of APNASA on either the NVIDIA Tesla discrete graphics card, or the AMD accelerated processing unit. Additionally, it was found that in order to justify the use of GPGPU, the amount of parallel work being done within a kernel would have to greatly exceed the work being done by any one portion of the APNASA code. It was determined that in order for an application like APNASA to be accelerated on the GPU, it should not be modular in nature, and the parallel portions of the code must contain a large portion of the code's computation time.

  5. THE PYTHON SHELL FOR THE ORBIT CODE

    SciTech Connect

    Shishlo, Andrei P; Gorlov, Timofey V; Holmes, Jeffrey A

    2009-01-01

    The development of a Python driver shell for the ORBIT simulation code is presented. The original ORBIT code uses the SuperCode shell to organize accelerator-related simulations. It is outdated and unsupported, and it is an obstacle to future code development. The necessity and consequences of replacing the old shell language are discussed. A set of core modules and extensions that are currently in PyORBIT are presented. They include particle containers, parsers for MAD and SAD lattice files, a Python wrapper for MPI libraries, space charge calculators, TEAPOT trackers, and a laser stripping extension module.

  6. Flexible Generation of Kalman Filter Code

    NASA Technical Reports Server (NTRS)

    Richardson, Julian; Wilson, Edward

    2006-01-01

    Domain-specific program synthesis can automatically generate high quality code in complex domains from succinct specifications, but the range of programs which can be generated by a given synthesis system is typically narrow. Obtaining code which falls outside this narrow scope necessitates either 1) extension of the code generator, which is usually very expensive, or 2) manual modification of the generated code, which is often difficult and which must be redone whenever changes are made to the program specification. In this paper, we describe adaptations and extensions of the AUTOFILTER Kalman filter synthesis system which greatly extend the range of programs which can be generated. Users augment the input specification with a specification of code fragments and how those fragments should interleave with or replace parts of the synthesized filter. This allows users to generate a much wider range of programs without their needing to modify the synthesis system or edit generated code. We demonstrate the usefulness of the approach by applying it to the synthesis of a complex state estimator which combines code from several Kalman filters with user-specified code. The work described in this paper allows the complex design decisions necessary for real-world applications to be reflected in the synthesized code. When executed on simulated input data, the generated state estimator was found to produce estimates comparable to those produced by a hand-coded estimator.
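The kind of filter being synthesized can be illustrated by its simplest instance, a scalar Kalman filter for a random-walk state observed in noise. This is a generic textbook sketch, unrelated to the AUTOFILTER-generated estimator; parameters are invented for the demo.

```python
def kalman_1d(zs, q, r, x0=0.0, p0=10.0):
    """Scalar Kalman filter for a random-walk state observed in noise.
    q: process-noise variance, r: measurement-noise variance."""
    x, p = x0, p0
    estimates = []
    for z in zs:
        p = p + q                 # predict: state model is a random walk
        k = p / (p + r)           # Kalman gain
        x = x + k * (z - x)       # correct with the measurement innovation
        p = (1.0 - k) * p
        estimates.append(x)
    return estimates

# Demo: noisy measurements of a value near 5.0.
est = kalman_1d([5.1, 4.9, 5.0, 5.2, 4.8] * 4, q=1e-3, r=1.0)
```

A synthesis system generates the multivariate analogue of these few lines (matrix predict/update steps) directly from a specification of the state and measurement models, which is what makes interleaving user code with the generated filter delicate.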

  7. Rate-Compatible Protograph LDPC Codes

    NASA Technical Reports Server (NTRS)

    Nguyen, Thuy V. (Inventor); Nosratinia, Aria (Inventor); Divsalar, Dariush (Inventor)

    2014-01-01

    Digital communication coding methods resulting in rate-compatible low density parity-check (LDPC) codes built from protographs. Described digital coding methods start with a desired code rate and a selection of the numbers of variable nodes and check nodes to be used in the protograph. Constraints are set to satisfy a linear minimum distance growth property for the protograph. All possible edges in the graph are searched for the minimum iterative decoding threshold and the protograph with the lowest iterative decoding threshold is selected. Protographs designed in this manner are used in decode and forward relay channels.
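A protograph becomes a full LDPC parity-check matrix by "lifting": each base-matrix entry is replaced by a Z x Z circulant permutation block (or an all-zero block). The sketch below shows the generic lifting step; the base matrix and shift values are invented for illustration and are not the optimized protographs of the patent.

```python
def lift_protograph(base, Z, shifts):
    """Expand a protograph base matrix into a quasi-cyclic LDPC parity-check
    matrix. base[i][j] in {0, 1} marks edges; shifts[i][j] gives the circulant
    shift used for each edge."""
    rows, cols = len(base) * Z, len(base[0]) * Z
    H = [[0] * cols for _ in range(rows)]
    for i, row in enumerate(base):
        for j, e in enumerate(row):
            if e:
                s = shifts[i][j] % Z
                for z in range(Z):         # place a shifted identity block
                    H[i * Z + z][j * Z + (z + s) % Z] = 1
    return H
```

Lifting preserves the protograph's degree profile: every lifted row and column inherits the weight of its base row or column, which is why thresholds can be optimized on the small base graph.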

  8. The Fireball integrated code package

    SciTech Connect

    Dobranich, D.; Powers, D.A.; Harper, F.T.

    1997-07-01

    Many deep-space satellites contain a plutonium heat source. An explosion, during launch, of a rocket carrying such a satellite offers the potential for the release of some of the plutonium. The fireball following such an explosion exposes any released plutonium to a high-temperature chemically-reactive environment. Vaporization, condensation, and agglomeration processes can alter the distribution of plutonium-bearing particles. The Fireball code package simulates the integrated response of the physical and chemical processes occurring in a fireball and the effect these processes have on the plutonium-bearing particle distribution. This integrated treatment of multiple phenomena represents a significant improvement in the state of the art for fireball simulations. Preliminary simulations of launch-second scenarios indicate: (1) most plutonium vaporization occurs within the first second of the fireball; (2) large non-aerosol-sized particles contribute very little to plutonium vapor production; (3) vaporization and both homogeneous and heterogeneous condensation occur simultaneously; (4) homogeneous condensation transports plutonium down to the smallest-particle sizes; (5) heterogeneous condensation precludes homogeneous condensation if sufficient condensation sites are available; and (6) agglomeration produces larger-sized particles but slows rapidly as the fireball grows.

  9. Visual Coding in Locust Photoreceptors

    PubMed Central

    Faivre, Olivier; Juusola, Mikko

    2008-01-01

    Information capture by photoreceptors ultimately limits the quality of visual processing in the brain. Using conventional sharp microelectrodes, we studied how locust photoreceptors encode random (white-noise, WN) and naturalistic (1/f stimuli, NS) light patterns in vivo and how this coding changes with mean illumination and ambient temperature. We also examined the role of their plasma membrane in shaping voltage responses. We found that brightening or warming increase and accelerate voltage responses, but reduce noise, enabling photoreceptors to encode more information. For WN stimuli, this was accompanied by broadening of the linear frequency range. On the contrary, with NS the signaling took place within a constant bandwidth, possibly revealing a ‘preference’ for inputs with 1/f statistics. The faster signaling was caused by acceleration of the elementary phototransduction current - leading to bumps - and their distribution. The membrane linearly translated phototransduction currents into voltage responses without limiting the throughput of these messages. As the bumps reflected fast changes in membrane resistance, the data suggest that their shape is predominantly driven by fast changes in the light-gated conductance. On the other hand, the slower bump latency distribution is likely to represent slower enzymatic intracellular reactions. Furthermore, the Q10s of bump duration and latency distribution depended on light intensity. Altogether, this study suggests that biochemical constraints imposed upon signaling change continuously as locust photoreceptors adapt to environmental light and temperature conditions. PMID:18478123

  10. SIRUS spectral signature analysis code

    NASA Astrophysics Data System (ADS)

    Bishop, Gary J.; Caola, Mike J.; Geatches, Rachel M.; Roberts, Nick C.

    2003-09-01

    The Advanced Technology Centre (ATC) is responsible for developing IR signature prediction capabilities for its parent body, BAE SYSTEMS. To achieve this, the SIRUS code has been developed and used on a variety of projects for well over a decade. SIRUS is capable of providing accurate IR predictions for air breathing and rocket motor propelled vehicles. SIRUS models various physical components to derive its predictions. A key component is the radiance reflected from the surface of the modeled vehicle. This is modeled by fitting parameters to the measured Bi-Directional Reflectance Function (BDRF) of the surface material(s). The ATC have successfully implemented a parameterization scheme based on the published OPTASM model, and this is described. However, inconsistencies between reflectance measurements and values calculated from the parameterized fit have led to an elliptical parameter enhancement. The implementation of this is also described. Finally, an end-to-end measurement-parameterization capability is described, based on measurements taken with SOC600 instrumentation.

  11. Risk based ASME Code requirements

    SciTech Connect

    Gore, B.F.; Vo, T.V.; Balkey, K.R.

    1992-09-01

    The objective of this ASME Research Task Force is to develop and to apply a methodology for incorporating quantitative risk analysis techniques into the definition of in-service inspection (ISI) programs for a wide range of industrial applications. An additional objective, directed towards the field of nuclear power generation, is ultimately to develop a recommendation for comprehensive revisions to the ISI requirements of Section XI of the ASME Boiler and Pressure Vessel Code. This will require development of a firm technical basis for such requirements, which does not presently exist. Several years of additional research will be required before this can be accomplished. A general methodology suitable for application to any industry has been defined and published. It has recently been refined and further developed during application to the field of nuclear power generation. In the nuclear application probabilistic risk assessment (PRA) techniques and information have been incorporated. With additional analysis, PRA information is used to determine the consequence of a component rupture (increased reactor core damage probability). A procedure has also been recommended for using the resulting quantified risk estimates to determine target component rupture probability values to be maintained by inspection activities. Structural risk and reliability analysis (SRRA) calculations are then used to determine characteristics which an inspection strategy must possess in order to maintain component rupture probabilities below target values. The methodology, results of example applications, and plans for future work are discussed.

  12. Certifying Auto-Generated Flight Code

    NASA Technical Reports Server (NTRS)

    Denney, Ewen

    2008-01-01

    Model-based design and automated code generation are being used increasingly at NASA. Many NASA projects now use MathWorks Simulink and Real-Time Workshop for at least some of their modeling and code development. However, there are substantial obstacles to more widespread adoption of code generators in safety-critical domains. Since code generators are typically not qualified, there is no guarantee that their output is correct, and consequently the generated code still needs to be fully tested and certified. Moreover, the regeneration of code can require complete recertification, which offsets many of the advantages of using a generator. Indeed, manual review of autocode can be more challenging than for hand-written code. Since the direct V&V of code generators is too laborious and complicated due to their complex (and often proprietary) nature, we have developed a generator plug-in to support the certification of the auto-generated code. Specifically, the AutoCert tool supports certification by formally verifying that the generated code is free of different safety violations, by constructing an independently verifiable certificate, and by explaining its analysis in a textual form suitable for code reviews. The generated documentation also contains substantial tracing information, allowing users to trace between model, code, documentation, and V&V artifacts. This enables missions to obtain assurance about the safety and reliability of the code without excessive manual V&V effort and, as a consequence, eases the acceptance of code generators in safety-critical contexts. The generation of explicit certificates and textual reports is particularly well-suited to supporting independent V&V. The primary contribution of this approach is the combination of human-friendly documentation with formal analysis. The key technical idea is to exploit the idiomatic nature of auto-generated code in order to automatically infer logical annotations. 
The annotation inference algorithm
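The key idea named in the abstract — exploiting the idiomatic nature of auto-generated code to infer logical annotations — can be caricatured in a few lines. This toy sketch is not AutoCert's actual algorithm (which is far more sophisticated and works on generated C, not Python); it merely shows how rigid code-generator idioms make pattern-based annotation attachment feasible.

```python
import re

def infer_annotations(code_lines):
    """Toy idiom-based annotation inference: because auto-generated
    code follows rigid patterns, a simple pattern match can attach a
    logical annotation (here, a loop-bounds invariant) that a verifier
    could later discharge. Sketch of the idea only, not AutoCert."""
    annotated = []
    loop = re.compile(r"for (\w+) in range\((\w+)\):")
    for line in code_lines:
        m = loop.search(line)
        if m:
            var, bound = m.groups()
            annotated.append(f"# invariant: 0 <= {var} < {bound}")
        annotated.append(line)
    return annotated

# A fragment shaped like typical generator output.
generated = [
    "for i in range(n):",
    "    out[i] = 0.0",
]
result = infer_annotations(generated)
```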

  13. Upper and lower bounds on quantum codes

    NASA Astrophysics Data System (ADS)

    Smith, Graeme Stewart Baird

This thesis provides bounds on the performance of quantum error correcting codes when used for quantum communication and quantum key distribution. The first two chapters provide a bare-bones introduction to classical and quantum error correcting codes, respectively. The next four chapters present achievable rates for quantum codes in various scenarios. The final chapter is dedicated to an upper bound on the quantum channel capacity. Chapter 3 studies coding for adversarial noise using quantum list codes, showing there exist quantum codes with high rates and short lists. These can be used, together with a very short secret key, to communicate with high fidelity at noise levels for which perfect fidelity is impossible. Chapter 4 explores the performance of a family of degenerate codes when used to communicate over Pauli channels, showing they can be used to communicate over almost any Pauli channel at rates that are impossible for a nondegenerate code and that exceed those of previously known degenerate codes. By studying the scaling of the optimal block length as a function of the channel's parameters, we develop a heuristic for designing even better codes. Chapter 5 describes an equivalence between a family of noisy preprocessing protocols for quantum key distribution and entanglement distillation protocols whose target state belongs to a class of private states called "twisted states." In Chapter 6, the codes of Chapter 4 are combined with the protocols of Chapter 5 to provide higher key rates for one-way quantum key distribution than were previously thought possible. Finally, Chapter 7 presents a new upper bound on the quantum channel capacity that is both additive and convex, and which can be interpreted as the capacity of the channel for communication given access to side channels from a class of zero capacity "cloning" channels.
This "clone assisted capacity" is equal to the unassisted capacity for channels that are degradable, which we use to find new upper
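The nondegenerate-code benchmark that Chapter 4's degenerate codes are measured against is the hashing bound: random stabilizer codes achieve rate 1 - H(p) on a Pauli channel with error-probability vector p. A short sketch of that benchmark for the depolarizing channel (the channel parameters below are an arbitrary example):

```python
import math

def shannon_entropy(probs):
    """H(p) in bits, skipping zero entries."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def hashing_bound(p_i, p_x, p_y, p_z):
    """Hashing-bound rate 1 - H(p), achievable by random
    (nondegenerate) stabilizer codes on a Pauli channel; the degenerate
    codes discussed in the thesis exceed this on some channels."""
    return 1.0 - shannon_entropy([p_i, p_x, p_y, p_z])

# Depolarizing channel with total error probability p: each of the
# X, Y, Z errors occurs with probability p/3.
p = 0.1
rate = hashing_bound(1 - p, p / 3, p / 3, p / 3)
```

A noiseless channel gives rate 1, and the rate falls toward zero as the error probability grows, vanishing near p ≈ 0.1893 for the depolarizing channel.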

  14. Syndrome-source-coding and its universal generalization. [error correcting codes for data compression

    NASA Technical Reports Server (NTRS)

    Ancheta, T. C., Jr.

    1976-01-01

    A method of using error-correcting codes to obtain data compression, called syndrome-source-coding, is described in which the source sequence is treated as an error pattern whose syndrome forms the compressed data. It is shown that syndrome-source-coding can achieve arbitrarily small distortion with the number of compressed digits per source digit arbitrarily close to the entropy of a binary memoryless source. A 'universal' generalization of syndrome-source-coding is formulated which provides robustly effective distortionless coding of source ensembles. Two examples are given, comparing the performance of noiseless universal syndrome-source-coding to (1) run-length coding and (2) Lynch-Davisson-Schalkwijk-Cover universal coding for an ensemble of binary memoryless sources.
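The scheme described above can be made concrete with the (7,4) Hamming code: treat a sparse 7-bit source block as an "error pattern," store only its 3-bit syndrome, and decompress by mapping the syndrome back to the minimum-weight coset leader (the most likely sparse block with that syndrome). This is a minimal sketch of the principle, not the paper's universal generalization.

```python
from itertools import product

# Parity-check matrix of the (7,4) Hamming code
# (column j is the binary representation of j+1).
H = [
    [0, 0, 0, 1, 1, 1, 1],
    [0, 1, 1, 0, 0, 1, 1],
    [1, 0, 1, 0, 1, 0, 1],
]

def syndrome(word):
    """Compress: the 3-bit syndrome H*word (mod 2) of a 7-bit block."""
    return tuple(sum(h * w for h, w in zip(row, word)) % 2 for row in H)

# Decompress: map each syndrome to its minimum-weight coset leader,
# i.e. the most likely sparse source block with that syndrome.
leader = {}
for word in product((0, 1), repeat=7):
    s = syndrome(word)
    if s not in leader or sum(word) < sum(leader[s]):
        leader[s] = word

source_block = (0, 0, 0, 0, 1, 0, 0)    # sparse block: one nonzero bit
compressed = syndrome(source_block)     # only 3 bits stored, not 7
restored = leader[compressed]
```

For the Hamming code every weight-0 or weight-1 block is its coset's unique leader, so such blocks are recovered exactly (distortionless); denser blocks would be distorted, which is what the universal generalization in the abstract addresses.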

  15. Recent changes in Criminal Procedure Code and Indian Penal Code relevant to medical profession.

    PubMed

    Agarwal, Swapnil S; Kumar, Lavlesh; Mestri, S C

    2010-02-01

Some sections of the Criminal Procedure Code and the Indian Penal Code have a direct bearing on medical practitioners. With changing times, a few of them have been revised, and these changes are presented in this article.

  16. User Instructions for the Systems Assessment Capability, Rev. 1, Computer Codes Volume 3: Utility Codes

    SciTech Connect

    Eslinger, Paul W.; Aaberg, Rosanne L.; Lopresti, Charles A.; Miley, Terri B.; Nichols, William E.; Strenge, Dennis L.

    2004-09-14

    This document contains detailed user instructions for a suite of utility codes developed for Rev. 1 of the Systems Assessment Capability. The suite of computer codes for Rev. 1 of Systems Assessment Capability performs many functions.

  17. Interface requirements to couple thermal hydraulics codes to severe accident codes: ICARE/CATHARE

    SciTech Connect

    Camous, F.; Jacq, F.; Chatelard, P.

    1997-07-01

In order to describe with the same code the whole sequence of severe LWR accidents, up to vessel failure, the Institute of Protection and Nuclear Safety has performed a coupling of the severe accident code ICARE2 to the thermal-hydraulics code CATHARE2. The resulting code, ICARE/CATHARE, is designed to be as pertinent as possible in all phases of the accident. This paper is mainly devoted to the description of the ICARE2-CATHARE2 coupling.

  18. Campus Speech Codes Said to Violate Rights

    ERIC Educational Resources Information Center

    Lipka, Sara

    2007-01-01

    Most college and university speech codes would not survive a legal challenge, according to a report released in December by the Foundation for Individual Rights in Education, a watchdog group for free speech on campuses. The report labeled many speech codes as overly broad or vague, and cited examples such as Furman University's prohibition of…

  19. Applications of Coding in Network Communications

    ERIC Educational Resources Information Center

    Chang, Christopher SungWook

    2012-01-01

    This thesis uses the tool of network coding to investigate fast peer-to-peer file distribution, anonymous communication, robust network construction under uncertainty, and prioritized transmission. In a peer-to-peer file distribution system, we use a linear optimization approach to show that the network coding framework significantly simplifies…
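The core idea of network coding — intermediate nodes forward linear combinations of packets rather than copies — is easiest to see over GF(2) in the classic butterfly topology. A minimal sketch (the topology and packet contents are illustrative, not from the thesis):

```python
def xor_packets(a, b):
    """GF(2) linear combination (bitwise XOR) of equal-length packets."""
    return bytes(x ^ y for x, y in zip(a, b))

# Butterfly-network intuition: the bottleneck node sends a XOR b once;
# each sink already holds one original packet from a side link and
# recovers the other by XOR-ing it out of the coded packet.
p1 = b"hello"
p2 = b"world"
coded = xor_packets(p1, p2)        # single transmission on the bottleneck
at_sink1 = xor_packets(coded, p1)  # sink 1 holds p1, recovers p2
at_sink2 = xor_packets(coded, p2)  # sink 2 holds p2, recovers p1
```

Both sinks thus receive both packets while the bottleneck link carries only one coded packet, which routing alone cannot achieve; this multicast gain underlies the peer-to-peer file distribution analysis the abstract describes.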

  20. Only Speech Codes Should Be Censored

    ERIC Educational Resources Information Center

    Pavela, Gary

    2006-01-01

    In this article, the author discusses the enforcement of "hate speech" codes and confirms research that considers why U.S. colleges and universities continue to promulgate student disciplinary rules prohibiting expression that "subordinates" others or is "demeaning, offensive, or hateful." Such continued adherence to speech codes is by now…