Science.gov

Sample records for da vinci code

  1. Leonardo da Vinci and the Downburst.

    NASA Astrophysics Data System (ADS)

    Gedzelman, Stanley David

    1990-05-01

    Evidence from the drawings, experiments, and writings of Leonardo da Vinci is presented to demonstrate that da Vinci recognized and, possibly, discovered the downburst and understood its associated airflow. Other early references to vortex flows resembling downbursts are mentioned.

  2. How to Think Like Leonardo da Vinci

    ERIC Educational Resources Information Center

    Caouette, Ralph

    2008-01-01

    To be effective and relevant in twenty-first-century learning, art needs to be more inclusive. In this article, the author discusses how teachers can find a good example in Leonardo da Vinci for building an art program. His art, design, and curiosity are the perfect foundation for any art program, at any level. (Contains 3 resources and 3 online…

  3. Hidden sketches by Leonardo da Vinci revealed

    NASA Astrophysics Data System (ADS)

    Dumé, Belle

    2009-02-01

    Three drawings on the back of Leonardo da Vinci's The Virgin and Child with St Anne (circa 1508) have been discovered by researchers led by Michel Menu from the Centre de Recherche et de Restauration des Musées de France (C2RMF) and the Louvre Museum in Paris.

  4. [Leonardo da Vinci--a dyslectic genius?].

    PubMed

    Røsstad, Anna

    2002-12-10

    Leonardo da Vinci's texts consist almost exclusively of scientific notes. Working on a book on Leonardo's art, I studied all Leonardo's published texts carefully for any new information. In some prefaces I came to suspect that Leonardo might have suffered from dyslexia. This article considers the question of whether it is possible to find indications of dyslexia in Leonardo's texts and in the accounts of his life.

  5. Leonardo da Vinci's studies of the heart.

    PubMed

    Shoja, Mohammadali M; Agutter, Paul S; Loukas, Marios; Benninger, Brion; Shokouhi, Ghaffar; Namdar, Husain; Ghabili, Kamyar; Khalili, Majid; Tubbs, R Shane

    2013-08-20

    Leonardo da Vinci's detailed drawings are justly celebrated; however, less well known are his accounts of the structures and functions of the organs. In this paper, we focus on his illustrations of the heart, his conjectures about heart and blood vessel function, his experiments on model systems to test those conjectures, and his unprecedented conclusions about the way in which the cardiovascular system operates. In particular, da Vinci seems to have been the first to recognize that the heart is a muscle and that systole is the active phase of the pump. He also seems to have understood the functions of the auricles and pulmonary veins, identified the relationship between the cardiac cycle and the pulse, and explained the hemodynamic mechanism of valve opening and closure. He also described anatomical variations and changes in structure and function that occurred with age. We outline da Vinci's varied career and suggest ways in which his personality, experience, skills and intellectual heritage contributed to these advances in understanding. We also consider his influence on later studies in anatomy and physiology.

  6. The Case: Bunche-Da Vinci Learning Partnership Academy

    ERIC Educational Resources Information Center

    Eisenberg, Nicole; Winters, Lynn; Alkin, Marvin C.

    2005-01-01

    The Bunche-Da Vinci case described in this article presents a situation at Bunche Elementary School that four theorists were asked to address in their evaluation designs (see EJ791771, EJ719772, EJ791773, and EJ792694). The Bunche-Da Vinci Learning Partnership Academy, an elementary school located between an urban port city and a historically…

  7. Leonardo da Vinci's contributions to neuroscience.

    PubMed

    Pevsner, Jonathan

    2002-04-01

    Leonardo da Vinci (1452-1519) made far-reaching contributions to many areas of science, technology and art. Leonardo's pioneering research into the brain led him to discoveries in neuroanatomy (such as those of the frontal sinus and meningeal vessels) and neurophysiology (he was the first to pith a frog). His injection of hot wax into the brain of an ox provided a cast of the ventricles, and represents the first known use of a solidifying medium to define the shape and size of an internal body structure. Leonardo developed an original, mechanistic model of sensory physiology. He undertook his research with the broad goal of providing physical explanations of how the brain processes visual and other sensory input, and integrates that information via the soul.

  8. Tree branching: Leonardo da Vinci's rule versus biomechanical models.

    PubMed

    Minamino, Ryoko; Tateno, Masaki

    2014-01-01

    This study examined Leonardo da Vinci's rule (i.e., the sum of the cross-sectional area of all tree branches above a branching point at any height is equal to the cross-sectional area of the trunk or the branch immediately below the branching point) using simulations based on two biomechanical models: the uniform stress and elastic similarity models. Model calculations of the daughter/mother ratio (i.e., the ratio of the total cross-sectional area of the daughter branches to the cross-sectional area of the mother branch at the branching point) showed that both biomechanical models agreed with da Vinci's rule when the branching angles of daughter branches and the weights of lateral daughter branches were small; however, the models deviated from da Vinci's rule as the weights and/or the branching angles of lateral daughter branches increased. The calculated values of the two models were largely similar but differed in some ways. Field measurements of Fagus crenata and Abies homolepis also fit this trend, wherein models deviated from da Vinci's rule with increasing relative weights of lateral daughter branches. However, this deviation was small for a branching pattern in nature, where empirical measurements were taken under realistic measurement conditions; thus, da Vinci's rule did not critically contradict the biomechanical models in the case of real branching patterns, though the model calculations described the contradiction between da Vinci's rule and the biomechanical models. The field data for Fagus crenata fit the uniform stress model best, indicating that stress uniformity is the key constraint of branch morphology in Fagus crenata rather than elastic similarity or da Vinci's rule. On the other hand, mechanical constraints are not necessarily significant in the morphology of Abies homolepis branches, depending on the number of daughter branches. Rather, these branches were often in agreement with da Vinci's rule.
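    The daughter/mother ratio defined in this abstract is easy to compute directly. The sketch below checks da Vinci's rule on hypothetical branch diameters (not data from the study): when the squared diameters of the daughters sum to the squared diameter of the mother, the areas balance and the ratio is 1.

```python
# Illustrative check of da Vinci's rule: the summed cross-sectional area of
# the daughter branches should equal that of the mother branch below the
# branching point. All diameters here are hypothetical examples.
import math

def cross_section(diameter):
    """Cross-sectional area of a branch approximated as a circle."""
    return math.pi * (diameter / 2) ** 2

def daughter_mother_ratio(mother_d, daughter_ds):
    """Ratio of total daughter area to mother area (1.0 = da Vinci's rule)."""
    total_daughters = sum(cross_section(d) for d in daughter_ds)
    return total_daughters / cross_section(mother_d)

# A mother branch of diameter 10 splitting into daughters of 8 and 6:
# 8**2 + 6**2 = 10**2, so the areas balance exactly and the ratio is 1.
ratio = daughter_mother_ratio(10.0, [8.0, 6.0])
```

In the paper's terms, the biomechanical models predict this ratio drifting away from 1 as lateral daughter weights and branching angles grow.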

  9. The daVinci Telemanipulator as a Mechanical Tracking System

    NASA Astrophysics Data System (ADS)

    Käst, Johannes; Neuhaus, Jochen; Nickel, Felix; Kenngott, Hannes; Engel, Markus; Short, Elaine; Reiter, Michael; Meinzer, Hans-Peter; Maier-Hein, Lena

    The daVinci telemanipulator (Intuitive Surgical, Sunnyvale, California) is a master-slave system for robot-assisted minimally invasive surgery. Because it is equipped with integrated joint sensors, it can be used as a mechanical tracking system via the daVinci API. In this work, we evaluate the precision and accuracy of a daVinci with the aid of an accuracy phantom of known dimensions. The positioning error determined is on the order of 6 mm and is therefore too high for the majority of medical applications. To reduce this error, we propose a calibration of the joint sensors.

  10. Studying and Working Abroad. Leonardo da Vinci Series: Good Practices.

    ERIC Educational Resources Information Center

    Commission of the European Communities, Brussels (Belgium). Directorate-General for Education and Culture.

    This document profiles recent successful examples of students studying and working abroad as part of the European Commission's Leonardo da Vinci program, which is designed to give students across the European Union the opportunity to experience vocational training in a foreign country. The following examples are presented: (1) 3 Finnish students…

  11. The DaVinci Project: Multimedia in Art and Chemistry.

    ERIC Educational Resources Information Center

    Simonson, Michael; Schlosser, Charles

    1998-01-01

    Provides an overview of the DaVinci Project, a collaboration of students, teachers, and researchers in chemistry and art to develop multimedia materials for grades 3-12 visualizing basic concepts in chemistry and visual art. Topics addressed include standards in art and science; the conceptual framework for the project; and project goals,…

  12. The Potential da Vinci in All of Us

    ERIC Educational Resources Information Center

    Petto, Sarah; Petto, Andrew

    2009-01-01

    The study of the human form is fundamental to both science and art curricula. For vertebrates, perhaps no feature is more important than the skeleton in determining observable form and function. As Leonardo da Vinci's famous Proportions of the Human Figure (Vitruvian Man) illustrates, the size, shape, and proportions of the human body are defined by…

  13. Training and Health. Leonardo da Vinci Series: Good Practices.

    ERIC Educational Resources Information Center

    Commission of the European Communities, Brussels (Belgium). Directorate-General for Education and Culture.

    This document profiles programs in the fields of health and medicine that are offered through the European Commission's Leonardo da Vinci program. The following programs are profiled: (1) CYTOTRAIN (a transnational vocational training program in cervical cancer screening); (2) Apollo (a program of open and distance learning for paramedical…

  14. Women and Technical Professions. Leonardo da Vinci Series: Good Practices.

    ERIC Educational Resources Information Center

    Commission of the European Communities, Brussels (Belgium). Directorate-General for Education and Culture.

    This document profiles programs for women in technical professions that are offered through the European Commission's Leonardo da Vinci program. The following programs are profiled: (1) Artemis and Diana (vocational guidance programs to help direct girls toward technology-related careers); (2) CEEWIT (an Internet-based information and…

  15. Leonardo da Vinci (1452-1519)

    NASA Astrophysics Data System (ADS)

    Murdin, P.

    2000-11-01

    Painter, inventor and polymath, born in Vinci (near Empoli), Italy. Although astronomy does not figure largely in Leonardo's works, he realized the possibility of constructing a telescope (`making glasses to see the Moon enlarged'). He suggested that `… in order to observe the nature of the planets, open the roof and bring the image of a single planet onto the base of a concave mirror. The image o...

  16. DaVinci's Mona Lisa entering the next dimension.

    PubMed

    Carbon, Claus-Christian; Hesslinger, Vera M

    2013-01-01

    For several of Leonardo da Vinci's paintings, such as The Virgin and Child with St Anne or the Mona Lisa, there exist copies produced by his own studio. In case of the Mona Lisa, a quite exceptional, rediscovered studio copy was presented to the public in 2012 by the Prado Museum in Madrid. Not only does it mirror its famous counterpart superficially; it also features the very same corrections to the lower layers, which indicates that da Vinci and the 'copyist' must have elaborated their panels simultaneously. On the basis of subjective (thirty-two participants estimated painter-model constellations) as well as objective data (analysis of trajectories between landmarks of both paintings), we revealed that both versions differ slightly in perspective. We reconstructed the original studio setting and found evidence that the disparity between both paintings mimics human binocular disparity. This points to the possibility that the two Giocondas together might represent the first stereoscopic image in world history.

  17. Da Vinci's codex and the anatomy of healthcare.

    PubMed

    Stephens-Borg, Keith

    2012-08-01

    We usually display a laid-back approach to medical jargon throughout our theatre work. The word 'perioperative' is built from the Greek word 'peri' (around) and the Latin 'operari' (to work). Latin and Greek became the prefixed language of choice for Leonardo da Vinci, and his research was pivotal in determining the way in which surgical procedures are documented. Ancient manuscripts aided the unfolding of the secrets of anatomy, and Leonardo revealed that art was the key in expressive detailed explanation.

  18. A Creative Approach to the Common Core Standards: The Da Vinci Curriculum

    ERIC Educational Resources Information Center

    Chaucer, Harry

    2012-01-01

    "A Creative Approach to the Common Core Standards: The Da Vinci Curriculum" challenges educators to design programs that boldly embrace the Common Core State Standards by imaginatively drawing from the genius of great men and women such as Leonardo da Vinci. A central figure in the High Renaissance, Leonardo made extraordinary contributions as a…

  19. Leonardo da Vinci and the origin of semen

    PubMed Central

    Noble, Denis; DiFrancesco, Dario; Zancani, Diego

    2014-01-01

    It is well known that Leonardo da Vinci made several drawings of the human male anatomy. The early drawings (before 1500) were incorrect in identifying the origin of semen, where he followed accepted teaching of his time. It is widely thought that he did not correct this mistake, a view that is reflected in several biographies. In fact, he made a later drawing (after 1500) in which the description of the anatomy is remarkably accurate and must have been based on careful dissection. In addition to highlighting this fact, acknowledged previously in only one other source, this article reviews the background to Leonardo's knowledge of the relevant anatomy. PMID:27494016

  20. Visual tracking of da Vinci instruments for laparoscopic surgery

    NASA Astrophysics Data System (ADS)

    Speidel, S.; Kuhn, E.; Bodenstedt, S.; Röhl, S.; Kenngott, H.; Müller-Stich, B.; Dillmann, R.

    2014-03-01

    Intraoperative tracking of laparoscopic instruments is a prerequisite to realize further assistance functions. Since endoscopic images are always available, this sensor input can be used to localize the instruments without special devices or robot kinematics. In this paper, we present an image-based markerless 3D tracking of different da Vinci instruments in near real-time without an explicit model. The method is based on different visual cues to segment the instrument tip, calculates a tip point and uses a multiple object particle filter for tracking. The accuracy and robustness is evaluated with in vivo data.
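    The abstract names a multiple-object particle filter as the tracking core but gives no implementation details. A generic predict-weight-resample cycle for a single 2D tip point, with entirely hypothetical noise parameters and test scene, might look like:

```python
# Minimal particle filter for tracking a 2D instrument-tip point. This is a
# generic sketch of the technique, not the paper's implementation; the motion
# and observation noise values and the test scene are invented for illustration.
import math
import random

def particle_filter_step(particles, observation, motion_noise=2.0, obs_noise=5.0):
    """One cycle: diffuse particles, weight by fit to the observed tip, resample."""
    # Predict: random-walk motion model.
    predicted = [(x + random.gauss(0, motion_noise),
                  y + random.gauss(0, motion_noise)) for x, y in particles]
    # Weight: Gaussian likelihood of the observed tip point.
    ox, oy = observation
    weights = [math.exp(-((x - ox) ** 2 + (y - oy) ** 2) / (2 * obs_noise ** 2))
               for x, y in predicted]
    total = sum(weights) or 1.0
    weights = [w / total for w in weights]
    # Resample proportionally to the weights.
    return random.choices(predicted, weights=weights, k=len(predicted))

def estimate(particles):
    """Mean position of the (equally weighted) resampled particles."""
    n = len(particles)
    return (sum(x for x, _ in particles) / n, sum(y for _, y in particles) / n)

# Track a tip observed at (50, 50) in a 100 x 100 image region.
random.seed(0)
particles = [(random.uniform(0, 100), random.uniform(0, 100)) for _ in range(500)]
for _ in range(10):
    particles = particle_filter_step(particles, (50.0, 50.0))
tip_x, tip_y = estimate(particles)
```

In the paper's setting, the observation would come from the visual-cue segmentation of the instrument tip, and one such filter would run per instrument.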

  1. Leonardo da Vinci and Kethem-Kiveris vena.

    PubMed

    Dolezal, Antonín; Skorepova-Honzlova, Zita; Jelen, Karel

    2012-01-01

    Leonardo da Vinci's drawing of coitus depicts the contemporary hypotheses regarding this act. The authors analyze the mamillo-uterine connection depicted by the artist and come to believe that this is the hypothetical kiveris vena, a female vein described by the anatomist Master Nicolai Physicus of the Salerno School. Hebrew roots were found in the name. The connection is also described by Mondino in The Anathomia, and the same connection can be found in the picture of the pregnant woman in Fasciculus Medicinæ by Johannes De Ketham.

  2. LEONARDO DA VINCI AND THE ORIGIN OF SEMEN.

    PubMed

    Noble, Denis; DiFrancesco, Dario; Zancani, Diego

    2014-12-20

    It is well known that Leonardo da Vinci made several drawings of the human male anatomy. The early drawings (before 1500) were incorrect in identifying the origin of semen, where he followed accepted teaching of his time. It is widely thought that he did not correct this mistake, a view that is reflected in several biographies. In fact, he made a later drawing (after 1500) in which the description of the anatomy is remarkably accurate and must have been based on careful dissection. In addition to highlighting this fact, acknowledged previously in only one other source, this article reviews the background to Leonardo's knowledge of the relevant anatomy.

  3. DaVinci canvas: a telerobotic surgical system with integrated, robot-assisted, laparoscopic ultrasound capability.

    PubMed

    Leven, Joshua; Burschka, Darius; Kumar, Rajesh; Zhang, Gary; Blumenkranz, Steve; Dai, Xiangtian Donald; Awad, Mike; Hager, Gregory D; Marohn, Mike; Choti, Mike; Hasser, Chris; Taylor, Russell H

    2005-01-01

    We present daVinci Canvas: a telerobotic surgical system with integrated robot-assisted laparoscopic ultrasound capability. DaVinci Canvas consists of the integration of a rigid laparoscopic ultrasound probe with the daVinci robot, video tracking of ultrasound probe motions, endoscope and ultrasound calibration and registration, autonomous robot motions, and the display of registered 2D and 3D ultrasound images. Although we used laparoscopic liver cancer surgery as a focusing application, our broader aim was the development of a versatile system that would be useful for many procedures.

  4. Leonardo Da Vinci and stroke - vegetarian diet as a possible cause.

    PubMed

    Oztürk, Serefnur; Altieri, Marta; Troisi, Pina

    2010-01-01

    Leonardo da Vinci (April 15, 1452 to May 2, 1519) was an Italian Renaissance architect, musician, anatomist, inventor, engineer, sculptor, geometer, and painter. It has been gleaned from the many available historical documents that da Vinci was a vegetarian who respected and loved animals, and that he suffered from right hemiparesis in the last 5 years of his life. A vegetarian diet has both positive and negative influences on the cerebrovascular system. In this report, a possible relation between a vegetarian diet and stroke is discussed from various perspectives as related to Leonardo da Vinci's stroke.

  5. Leonardo da Vinci: the search for the soul.

    PubMed

    Del Maestro, R F

    1998-11-01

    The human race has always contemplated the question of the anatomical location of the soul. During the Renaissance the controversy crystallized into those individuals who supported the heart ("cardiocentric soul") and others who supported the brain ("cephalocentric soul") as the abode for this elusive entity. Leonardo da Vinci (1452-1519) joined a long list of other explorers in the "search for the soul." The method he used to resolve this anatomical problem involved the accumulation of information from ancient and contemporary sources, careful notetaking, discussions with acknowledged experts, and his own personal search for the truth. Leonardo used a myriad of innovative methods acquired from his knowledge of painting, sculpture, and architecture to define more clearly the site of the "senso comune"--the soul. In this review the author examines the sources of this ancient question, the knowledge base tapped by Leonardo for his personal search for the soul, and the views of key individuals who followed him.

  6. [Regarding the Manuscript D " Dell' occhio " of Leonardo da Vinci].

    PubMed

    Heitz, Robert F

    2009-01-01

    Leonardo da Vinci's Manuscript D consists of five double pages sheets, which, folded in two, comprise ten folios. This document, in the old Tuscan dialect and mirror writing, reveals the ideas of Leonardo on the anatomy of the eye in relation to the formation of images and visual perception. Leonardo explains in particular the behavior of the rays in the eye in terms of refraction and reflection, and is very mechanistic in his conception of the eye and of the visual process. The most significant innovations found in these folios are the concept of the eye as a camera obscura and the intersection of light rays in the interior of the eye. His texts nevertheless show hesitation, doubts and a troubled confusion, reflecting the ideas and uncertainties of his era. He did not share his results in his lifetime, despite both printing and etching being readily available to him.

  7. Sine ars scientia nihil est: Leonardo da Vinci and beyond.

    PubMed

    Kickhöfel, Eduardo H P

    2009-01-01

    The aim of this article is to reflect on the relationship between art and science so far as it concerns a symposium on neurosciences. We undertake a historical overview of that relationship, paying particular attention to the sui generis case of Leonardo da Vinci, who very often is regarded as the man who worked on art and science with equal ease. We then explain why his idea of merging these two forms of knowledge failed, considering the clear-cut distinction between art and science in his time. With this clarification, we explore the matter today. We look at Raphael's The Transfiguration, in which the representation of the possessed boy is seen by neuroscientists as indicative of an epileptic seizure. We also look at the ideas of neuroscientists Semir Zeki and Vilayanur Ramachandran, who study particular aspects of brain function and suggest a new merging of art and science.

  8. Thinking like Leonardo da Vinci and its implications for the modern doctor.

    PubMed

    Baum, Neil

    2013-01-01

    Most people when asked to name the most creative, innovative, and multidimensional people in history would agree that Leonardo da Vinci is either at the top or very close to the number one position on that list. Wouldn't it be nice to think like da Vinci? This article shares the seven unique principles of thinking that da Vinci used that enabled him to be the greatest painter, sculptor, architect, musician, mathematician, engineer, inventor, anatomist, geologist, cartographer, botanist, and writer of his (if not of all) time. This article will take you deep into the notebooks and codices of da Vinci, and suggest ways his ideas can be used by anyone in the healthcare profession to make them a better healthcare provider.

  9. Load evaluation of the da Vinci surgical system for transoral robotic surgery.

    PubMed

    Fujiwara, Kazunori; Fukuhara, Takahiro; Niimi, Koji; Sato, Takahiro; Kitano, Hiroya

    2015-12-01

    Transoral robotic surgery, performed with the da Vinci surgical system (da Vinci), is a surgical approach for benign and malignant lesions of the oral cavity and laryngopharynx. It provides several unique advantages, which include a 3-dimensional magnified view and the ability to see and work around curves or angles. However, the current da Vinci surgical system does not provide haptic feedback. This is problematic because the potential risks specific to transoral use of the da Vinci include tooth injury, mucosal laceration, ocular injury and mandibular fracture. To assess the potential for intraoperative injuries, we measured the loads exerted by the endoscope and instrument of the da Vinci Si surgical system. We pressed the endoscope and instrument of the da Vinci Si against a load cell six times each and measured the dynamic load and the time-to-maximum load. We also struck the da Vinci Si endoscope and instrument against the load cell six times each and measured the impact load. The maximum dynamic load was 7.27 ± 1.31 kg for the endoscope and 1.90 ± 0.72 kg for the instrument. The corresponding times-to-maximum load were 1.72 ± 0.22 and 1.29 ± 0.34 s; the impact loads were significantly lower than the dynamic loads. It remains possible that a major load is exerted on adjacent structures by continuous contact with the endoscope and instrument of the da Vinci Si; however, there is a minor delay in reaching the maximum load. Careful monitoring by an on-site assistant may, therefore, help prevent injury to contiguous structures.

  10. Early clinical experience with the da Vinci Xi Surgical System in general surgery.

    PubMed

    Hagen, Monika E; Jung, Minoa K; Ris, Frederic; Fakhro, Jassim; Buchs, Nicolas C; Buehler, Leo; Morel, Philippe

    2016-12-27

    The da Vinci Xi Surgical System (Intuitive Surgical Inc., Sunnyvale, CA, USA) was released in 2014 to facilitate minimally invasive surgery. Novel features are targeted towards facilitating complex multi-quadrant procedures, but data are scarce so far. Perioperative data of patients who underwent robotic general surgery with the da Vinci Xi system within the first 6 months after installation were collected and analyzed. The gastric bypass procedures performed with the da Vinci Xi Surgical System were compared to an equal number of the most recent procedures performed with the da Vinci Si Surgical System. Thirty-one foregut procedures (28 Roux-en-Y gastric bypasses), 6 colorectal procedures and 1 revisional biliary procedure were performed. The mean operating room (OR) time was 221.8 (±69.0) minutes for gastric bypasses and 306.5 (±48.8) minutes for colorectal procedures, with a mean docking time of 9.4 (±3.8) minutes. The gastric bypass procedure was transitioned from a hybrid to a fully robotic approach. In comparison to the last 28 gastric bypass procedures performed with the da Vinci Si Surgical System, OR time was comparable (226.9 versus 230.6 min, p = 0.8094), but docking time was significantly longer with the da Vinci Xi Surgical System (8.5 versus 6.1 min, p = 0.0415). All colorectal procedures were performed with a single robotic docking. No intraoperative and two postoperative complications occurred. The da Vinci Xi might facilitate single-setup totally robotic gastric bypass and colorectal surgeries. However, further comparative research is needed to clearly determine the significance of this latest version of the da Vinci Surgical System.

  11. Multiquadrant robotic colorectal surgery: the da Vinci Xi vs Si comparison.

    PubMed

    Protyniak, Bogdan; Jorden, Jeffrey; Farmer, Russell

    2017-03-08

    The newly introduced da Vinci Xi Surgical System hopes to address the shortcomings of its predecessor, specifically robotic arm restrictions and difficulty working in multiple quadrants. We compare the two robot platforms in multiquadrant surgery at a major colorectal referral center. Forty-four patients in the da Vinci Si group and 26 patients in the Xi group underwent sigmoidectomy or low anterior resection between 2014 and 2016. Patient demographics, operative variables, and postoperative outcomes were compared using descriptive statistics. Both groups were similar in age, sex, BMI, pelvic surgeries, and ASA class. The splenic flexure was mobilized in more da Vinci Xi cases than da Vinci Si cases (p = 0.045), both for sigmoidectomy (50 vs 15.4%) and low anterior resection (60 vs 29%). There was no significant difference in operative time (219.9 vs 224.7 min; p = 0.640), blood loss (170.0 vs 188.1 mL; p = 0.289), length of stay (5.7 vs 6 days; p = 0.851), or overall complications (26.9 vs 22.7%; p = 0.692) between the da Vinci Xi and Si groups, respectively. Single-dock multiquadrant robotic surgery, measured by splenic flexure mobilization with concomitant pelvic dissection, was performed more frequently with the da Vinci Xi platform, with no increase in operative time, bleeding, or postoperative complications. The new platform provides surgeons an easier alternative to da Vinci Si dual docking or combined robotic/laparoscopic multiquadrant surgery.
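    Comparisons like the operative-time p-values above typically come from a two-sample test. A minimal Welch's t statistic (unequal variances), computed with the standard library on hypothetical per-patient times (the study does not report individual values), might look like:

```python
# Welch's t statistic for two independent samples with unequal variances.
# The operative-time lists below are invented for illustration only; they are
# not data from the study.
from statistics import mean, variance

def welch_t(a, b):
    """Return (t statistic, Welch-Satterthwaite degrees of freedom)."""
    na, nb = len(a), len(b)
    va, vb = variance(a), variance(b)       # sample variances
    se2 = va / na + vb / nb                 # squared standard error of the difference
    t = (mean(a) - mean(b)) / se2 ** 0.5
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df

# Hypothetical operative times (minutes) for the two platform groups.
xi_times = [210, 225, 218, 230, 215, 221]
si_times = [220, 228, 224, 231, 219, 226]
t_stat, dof = welch_t(xi_times, si_times)
```

The p-value would then be looked up from the t distribution with `dof` degrees of freedom (e.g. via `scipy.stats`); a |t| near zero, as in the study's operative-time comparison, corresponds to a large p.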

  12. Leonardo da Vinci: engineer, bioengineer, anatomist, and artist.

    PubMed

    West, John B

    2017-03-01

    Leonardo da Vinci (1452-1519) enjoys a reputation as one of the most talented people of all time in the history of science and the arts. However, little attention has been given to his contributions to physiology. One of his main interests was engineering, and he was fascinated by structural problems and the flow patterns of liquids. He also produced a large number of ingenious designs for warfare and a variety of highly original flying machines. But of particular interest to us are his contributions to bioengineering and how he used his knowledge of basic physical principles to throw light on physiological function. For example, he produced new insights into the mechanics of breathing including the action of the ribs and diaphragm. He was the first person to understand the different roles of the internal and external intercostal muscles. He had novel ideas about the airways including the mode of airflow in them. He also worked on the cardiovascular system and had a special interest in the pulmonary circulation. But, interestingly, he was not able to completely divorce his views from those of Galen, in that although he could not see pores in the interventricular septum of the heart, one of his drawings included them. Leonardo was a talented anatomist who made many striking drawings of the human body. Finally, his reputation for many people is based on his paintings including the Mona Lisa that apparently attracts more viewers than any other painting in the world.

  13. [Project Leonardo-da-Vinci for better nursing care].

    PubMed

    Gábor, Katalin; Csanádi, Lajosné; Helembai, Kornélia; Szögi, Zoltánné; Tulkán, Ibolya; Unginé, Kántor Katalin

    2002-08-18

    The aim of the present paper is to inform physicians about the work completed by nurses and professors of baccalaureate nursing in the framework of the Leonardo da Vinci project, organized and sponsored by the European Union. The goal of the project was to increase the effectiveness of chief nurses through a further training programme in the field of management. The Szeged team chose human resource management, since in this field it is possible to achieve the greatest improvement with the smallest financial investment. We measured nurse turnover and absenteeism, changes in the level of education, and nurses' and patients' satisfaction at the beginning and at the end of the period studied. Except for patient satisfaction, all the other parameters had improved by the end of the tested period. The project provided a unique opportunity to compare the state of Hungarian nursing with that of the countries belonging to the European Union, to exchange experience and to learn new methods. In the framework of this project, a two-volume book was prepared containing the suggestions of the EU. This book is widely available in English and in French.

  14. DaVinci-assisted laparoscopic radical prostatectomy: the learning curve

    NASA Astrophysics Data System (ADS)

    Le, Carter Q.; Ho, Khai-Linh V.; Gettman, Matthew T.

    2007-02-01

    Objective: To define the learning curve for daVinci-assisted laparoscopic radical prostatectomy (DLP) at our institution. Methods: The data from 170 patients who underwent DLP between August 2002 and December 2004 by a single surgeon (MTG) were reviewed. Operative time, hemoglobin decrease, conversion to open procedure, positive margin rates, complications, length of stay (LOS), length of catheterization, continence, and erectile function were analyzed. Results: Hemoglobin decrease (p=0.11), positive margin rates (p=0.80), and early urinary continence (p=0.17) did not significantly correlate with surgical experience. A trend towards fewer complications (p=0.07) and an earlier return of erectile function (p=0.09) was noted with increased experience with DLP. Operative time, hospital stay, catheterization time, and open conversion showed significant associations with patient sequence. Median operative time for the first 60 and the last 110 patients was 323.5 and 239.5 minutes, respectively (p<0.0001). Median LOS for the aforementioned groups was 53 and 51 hours (p=0.009). Length of catheterization declined significantly between the first 60 and the remaining 110 patients, from 14 to 11.5 days (p<0.0001). Eight open conversions occurred; six were in the first 30 patients (p=0.03). Conclusion: As an indicator of the learning curve, operative time in our series showed no correlation with sequence after the 60th patient. Thus, despite the advantages of robotics, the learning curve to efficient performance of daVinci-assisted laparoscopic radical prostatectomy is long. Oncological and functional outcomes should not be affected during the learning curve.

  15. Leonardo da Vinci and the first hemodynamic observations.

    PubMed

    Martins e Silva, J

    2008-02-01

    Leonardo da Vinci was a genius whose accomplishments and ideas come down to us today, five centuries later, with the freshness of innovation and the fascination of discovery. This brief review begins with a summary of Leonardo's life and a description of the most important works of art that he bequeathed us, and then concentrates on his last great challenge. There was a point at which Leonardo's passion for art gave way to the study of human anatomy, not only to improve his drawing but to go beyond what had been simply a representation of form to understand the underlying functioning. Among his many interests, we focus on his study of the heart and blood vessels, which he observed carefully in animals and human autopsies, and reproduced in drawings of great quality with annotations of astonishing acuteness. The experience that he had acquired from observing the flow of water in currents and around obstacles, and the conclusions that he drew concerning hydrodynamics, were central to his interpretation of the mechanisms of the heart and of blood flow, to which he devoted much of his time between 1508 and 1513. From these studies, immortalized in drawings of great clarity, come what are acknowledged to be the first hemodynamic records, in which Leonardo demonstrates the characteristics of blood flow in the aorta and great vessels and the importance of blood reflux and the formation of eddies in the aortic sinus during valve closure. Through his assiduous and careful observations, and his subsequent deductions, Leonardo put forward detailed findings on hemodynamic questions that advanced technology has only recently enabled us to confirm.

  16. Evolution of robots throughout history from Hephaestus to Da Vinci Robot.

    PubMed

    Iavazzo, Christos; Gkegke, Xanthi-Ekaterini D; Iavazzo, Paraskevi-Evangelia; Gkegkes, Ioannis D

    2014-01-01

    The Da Vinci robot is increasingly used for operations, bringing the advantages of robots to medicine. This historical article aims to present the evolution of robots in the medical area from the time of ancient myths, through the Renaissance, to the current revolutionary applications. We have collected several elegant narratives on the topic; a little imagination helps the reader to find the similarities. A trip from the Greek myths of Hephaestus, through Aristotle and Leonardo Da Vinci, to the robots of Karel Capek and Isaac Asimov, and finally to the invention of the medical robots, is presented.

  17. The Settings, Pros and Cons of the New Surgical Robot da Vinci Xi System for Transoral Robotic Surgery (TORS): A Comparison With the Popular da Vinci Si System.

    PubMed

    Kim, Da Hee; Kim, Hwan; Kwak, Sanghyun; Baek, Kwangha; Na, Gina; Kim, Ji Hoon; Kim, Se Heon

    2016-10-01

    The da Vinci system (da Vinci Surgical System; Intuitive Surgical Inc.) has rapidly developed in several years from the S system to the Si system and now the Xi System. To investigate the surgical feasibility and to provide workflow guidance for the newly released system, we used the new da Vinci Xi system for transoral robotic surgery (TORS) on a cadaveric specimen. Bilateral supraglottic partial laryngectomy, hypopharyngectomy, lateral oropharyngectomy, and base of the tongue resection were serially performed in search of the optimal procedures with the new system. The new surgical robotic system has been upgraded in all respects. The telescope and camera were incorporated into one system, with a digital end-mounted camera. Overhead boom rotation allows multiquadrant access without axis limitation, the arms are now thinner and longer with grabbing movements for easy adjustments. The patient clearance button dramatically reduces external collisions. The new surgical robotic system has been optimized for improved anatomic access, with better-equipped appurtenances. This cadaveric study of TORS offers guidance on the best protocol for surgical workflow with the new Xi system leading to improvements in the functional results of TORS.

  18. Transparency of Vocational Qualifications: The Leonardo da Vinci Approach. CEDEFOP Panorama Series.

    ERIC Educational Resources Information Center

    Bjornavold, Jens; Pettersson, Sten

    This report gives an overview of the situation of transparency of vocational qualifications by presenting measures introduced at the European Community level and by drawing attention to projects within the Leonardo da Vinci Program dealing with the issue. A 16-page executive summary appears first. Chapter 1 provides general background and aims.…

  19. Leonardo da Vinci, One Year on...a Different Look at Vocational Training in Europe.

    ERIC Educational Resources Information Center

    Le Magazine, 1996

    1996-01-01

    Discusses the success of the Leonardo da Vinci program, a European laboratory of innovation in vocational training, a priority focus of investment in human resources and intelligence, and a way to mobilize innovative forces beyond national boundaries. Trends identified by the program focus on new information and communication technologies. (JOW)

  20. Solving da Vinci stereopsis with depth-edge-selective V2 cells

    PubMed Central

    Assee, Andrew; Qian, Ning

    2007-01-01

    We propose a new model for da Vinci stereopsis based on a coarse-to-fine disparity-energy computation in V1 and disparity-boundary-selective units in V2. Unlike previous work, our model contains only binocular cells, relies on distributed representations of disparity, and has a simple V1-to-V2 feedforward structure. We demonstrate with random dot stereograms that the V2 stage of our model is able to determine the location and the eye-of-origin of monocularly occluded regions and improve disparity map computation. We also examine a few related issues. First, we argue that since monocular regions are binocularly defined, they cannot generally be detected by monocular cells. Second, we show that our coarse-to-fine V1 model for conventional stereopsis explains double matching in Panum’s limiting case. This provides computational support to the notion that the perceived depth of a monocular bar next to a binocular rectangle may not be da Vinci stereopsis per se (Gillam et al., 2003). Third, we demonstrate that some stimuli previously deemed invalid have simple, valid geometric interpretations. Our work suggests that studies of da Vinci stereopsis should focus on stimuli more general than the bar-and-rectangle type and that disparity-boundary-selective V2 cells may provide a simple physiological mechanism for da Vinci stereopsis. PMID:17698163
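    The disparity computation underlying models like the one above can be illustrated with a toy 1-D stand-in: for each candidate disparity, correlate a left-eye signal with the right-eye signal shifted by that disparity and pick the best match. This is a didactic proxy for the disparity-energy idea, not the authors' coarse-to-fine V1/V2 network; the signal sizes and shift are illustrative assumptions.

```python
# Toy 1-D disparity estimation by binocular matching (didactic proxy
# for the disparity-energy model).
import random

random.seed(0)
n = 256
true_d = 3
left = [random.gauss(0, 1) for _ in range(n)]
right = [left[(i - true_d) % n] for i in range(n)]   # shifted copy of left

def best_disparity(left, right, max_d=5):
    """Return the candidate disparity with the largest squared
    binocular matching response."""
    n = len(left)
    best, best_score = None, -1.0
    for d in range(-max_d, max_d + 1):
        # squared response of a matched binocular pair at disparity d
        s = sum(left[i] * right[(i + d) % n] for i in range(n)) ** 2
        if s > best_score:
            best, best_score = d, s
    return best

print(best_disparity(left, right))   # recovers the interocular shift of 3
```

A monocularly occluded region, by contrast, has no matching counterpart in the other eye at any disparity, which is why the model delegates its detection to boundary-selective V2 units rather than to such matched binocular responses.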

  1. Modifications of transaxillary approach in endoscopic da Vinci-assisted thyroid and parathyroid gland surgery.

    PubMed

    Al Kadah, Basel; Piccoli, Micaela; Mullineris, Barbara; Colli, Giovanni; Janssen, Martin; Siemer, Stephan; Schick, Bernhard

    2015-03-01

    Endoscopic surgery for treatment of thyroid and parathyroid pathologies is increasingly gaining attention. The da Vinci system has already been widely used in different fields of medicine and quite recently in thyroid and parathyroid surgery. Herein, we report modifications of the transaxillary approach in endoscopic surgery of thyroid and parathyroid gland pathologies using the da Vinci system. Sixteen patients, suffering from struma nodosa in 14 cases and parathyroid adenomas in two cases, were treated using the da Vinci system at the ENT Department of Homburg/Saar University in cooperation with the Department of General Surgery at New Sant'Agostino Hospital, Modena, Italy. Two different retractors, endoscopic preparation of the access and three different incision modalities were used. The endoscopic preparation of the access gave us a better view during preparation and reduced surgical time compared to the use of a headlamp. To introduce the da Vinci instruments at the end of the access preparation, the skin incisions were placed over the axilla, with one incision in eight patients, two incisions in four patients and three incisions in a further four patients. The two- and three-incision modalities allowed introduction of the da Vinci instruments without arm conflicts. The use of a new retractor (Modena retractor), compared to a self-developed retractor, eased the endoscopic preparation of the access and the repositioning of the retractor. The scar was hidden in the axilla and, independent of the incisions selected, the cosmetic results were judged by the patients to be excellent. The neurovascular structures, such as the inferior laryngeal nerve, superior laryngeal nerve and vessels, as well as the different pathologies, were clearly visualized in 3D in all 16 cases. No paralysis of the vocal cord was observed. All patients had a benign pathology on histological examination. The endoscopic surgery of the thyroid and parathyroid gland can be

  2. Battle of the bots: a comparison of the standard da Vinci and the da Vinci Surgical Skills Simulator in surgical skills acquisition.

    PubMed

    Brown, Kevin; Mosley, Natalie; Tierney, James

    2016-08-29

    Virtual reality simulators are increasingly used to gain robotic surgical skills. This study compared use of the da Vinci Surgical Skills Simulator (dVSSS) to the standard da Vinci (SdV) robot for skills acquisition in a prospective randomized study. Residents from urology, gynecology, and general surgery programs performed three virtual reality tasks (thread the ring, ring rail, and tubes) on the dVSSS. Participants were then randomized to one of the two study groups (dVSSS and SdV). Each participant then practiced on either the dVSSS or the SdV (depending on randomization) for 30 min per week over a 4-week period. The dVSSS arm was not permitted to practice ring rail (as no similar practice scenario was available for the SdV group). Following 4 weeks of practice, participants performed the same three virtual reality tasks, and the results were recorded and compared to baseline. Overall and percent improvement were recorded for all participants from pre-test to post-test. Two-way ANOVA was used to compare the dVSSS and SdV groups across the three tasks. Initially, 30 participants were identified and enrolled in the study. Randomization resulted in 15 participants in each arm. During the course of the study, four participants were unable to complete all tasks and practice sessions and were therefore excluded. This resulted in a total of 26 participants (15 in the dVSSS group and 11 in the SdV group) who completed the study. Overall total improvement score was found to be 23.23 and 23.48 for the SdV and dVSSS groups, respectively (p = 0.9245). The percent improvement was 60 % and 47 % for the SdV and dVSSS groups, respectively, a statistically significant difference between the two groups across the three tasks. Practicing on the standard da Vinci is comparable to practicing on the da Vinci simulator for acquiring robotic surgical skills. In spite of several potential advantages, the dVSSS arm performed no better than the SdV arm in the final

  3. An efficient floating-point to fixed-point conversion process for biometric algorithm on DaVinci DSP architecture

    NASA Astrophysics Data System (ADS)

    Konvalinka, Ira; Quddus, Azhar; Asraf, Daniel

    2009-05-01

    Today there is no direct path for the conversion of a floating-point algorithm implementation to an optimized fixed-point implementation. This paper proposes a novel and efficient methodology for Floating-point to Fixed-point Conversion (FFC) of a biometric Fingerprint Algorithm Library (FAL) on the fixed-point DaVinci processor. The general FFC research task is streamlined into smaller tasks which can be accomplished with lower effort and higher certainty. Formally specified in this paper is the optimization target of FFC: to preserve floating-point accuracy and to reduce execution time, while preserving the majority of the algorithm code base. A comprehensive eight-point strategy is formulated to achieve that target. Both a local optimization flow (focused on the most time-consuming routines) and a global one (optimizing across multiple routines) are used. Characteristic phases in the FFC activity are presented using data from applying the proposed FFC methodology to FAL, starting with the target optimization specification, through speed optimization breakthroughs, and ending with validation of FAL accuracy after the execution-time optimization. The FAL implementation reduced biometric verification time by over a factor of 5, with negligible impact on accuracy. Any algorithm developer facing the task of implementing a floating-point algorithm on the DaVinci DSP should benefit from this presentation.
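    The core mechanics of such a floating-point to fixed-point conversion can be sketched with Q-format arithmetic: each float is scaled to an integer, products are rescaled by a right shift, and accuracy is checked against the floating-point reference. The Q15 format, saturation policy, and routine names below are illustrative assumptions, not the paper's actual FAL code.

```python
# Minimal Q15 fixed-point conversion sketch (illustrative, not the FAL code).

Q = 15                 # fractional bits: Q15 covers [-1, 1) in a 16-bit int
SCALE = 1 << Q

def float_to_q15(x: float) -> int:
    """Quantize a float in [-1, 1) to a Q15 integer, saturating at the rails."""
    v = int(round(x * SCALE))
    return max(-SCALE, min(SCALE - 1, v))

def q15_mul(a: int, b: int) -> int:
    """Fixed-point multiply: the double-precision product is shifted back to Q15."""
    return (a * b) >> Q

def q15_to_float(x: int) -> float:
    return x / SCALE

# A float dot product and its fixed-point counterpart should agree closely.
xs, ys = [0.5, -0.25, 0.125], [0.75, 0.5, -0.5]
ref = sum(x * y for x, y in zip(xs, ys))
fx = sum(q15_mul(float_to_q15(x), float_to_q15(y)) for x, y in zip(xs, ys))
err = abs(ref - q15_to_float(fx))
print(err)   # quantization error, well below 2**-15 per term
```

Validating such an error bound over the library's real inputs is essentially the accuracy-preservation half of the optimization target stated in the abstract; the shift-based multiply is the speed half.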

  4. Visual degradation in Leonardo da Vinci's iconic self-portrait: A nanoscale study

    NASA Astrophysics Data System (ADS)

    Conte, A. Mosca; Pulci, O.; Misiti, M. C.; Lojewska, J.; Teodonio, L.; Violante, C.; Missori, M.

    2014-06-01

    The discoloration of ancient paper, due to the development of oxidized groups acting as chromophores in its chief component, cellulose, is responsible for severe visual degradation in ancient artifacts. By adopting a non-destructive approach based on the combination of optical reflectance measurements and time-dependent density functional theory ab-initio calculations, we describe and quantify the chromophores affecting Leonardo da Vinci's iconic self-portrait. Their relative concentrations are very similar to those measured in modern and ancient samples aged in humid environments. This analysis quantifies the present level of optical degradation of Leonardo da Vinci's self-portrait which, compared with future measurements, will allow assessment of its degradation rate. This is fundamental information for planning appropriate conservation strategies.

  5. Towards the Implementation of an Autonomous Camera Algorithm on the da Vinci Platform.

    PubMed

    Eslamian, Shahab; Reisner, Luke A; King, Brady W; Pandya, Abhilash K

    2016-01-01

    Camera positioning is critical for all telerobotic surgical systems. Inadequate visualization of the remote site can lead to serious errors that can jeopardize the patient. An autonomous camera algorithm has been developed on a medical robot (da Vinci) simulator. It is found to be robust in key scenarios of operation. This system behaves with predictable and expected actions for the camera arm with respect to the tool positions. The implementation of this system is described herein. The simulation closely models the methodology needed to implement autonomous camera control in a real hardware system. The camera control algorithm follows three rules: (1) keep the view centered on the tools, (2) keep the zoom level optimized such that the tools never leave the field of view, and (3) avoid unnecessary movement of the camera that may distract/disorient the surgeon. Our future work will apply this algorithm to the real da Vinci hardware.
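    The three rules listed in the abstract lend themselves to a compact sketch: center on the tools' midpoint, zoom so the farthest tool stays in the field of view, and suppress small movements. The 2-D geometry, the field-of-view half-angle, and the deadband threshold below are our own illustrative assumptions, not the authors' implementation.

```python
# Sketch of the three autonomous-camera rules (illustrative geometry).
from dataclasses import dataclass
import math

FOV_HALF = math.radians(30)   # assumed half field-of-view of the endoscope
DEADBAND = 0.5                # rule 3: ignore motions smaller than this

@dataclass
class CameraState:
    center: tuple   # (x, y) point the camera looks at
    dist: float     # stand-off distance, i.e. the zoom level

def update_camera(cam: CameraState, tools: list) -> CameraState:
    # Rule 1: keep the view centered on the tools (their midpoint).
    cx = sum(p[0] for p in tools) / len(tools)
    cy = sum(p[1] for p in tools) / len(tools)
    # Rule 2: zoom so the farthest tool never leaves the field of view.
    r = max(math.hypot(p[0] - cx, p[1] - cy) for p in tools)
    dist = max(r / math.tan(FOV_HALF), 1.0)
    # Rule 3: avoid unnecessary movement -- hold still inside a deadband.
    if (math.hypot(cx - cam.center[0], cy - cam.center[1]) < DEADBAND
            and abs(dist - cam.dist) < DEADBAND):
        return cam
    return CameraState((cx, cy), dist)

cam = CameraState((0.0, 0.0), 5.0)
cam = update_camera(cam, [(2.0, 0.0), (6.0, 2.0)])
print(cam.center)   # midpoint of the two tools: (4.0, 1.0)
```

The deadband is what keeps the view stable during fine manipulation, which is the behavior the abstract describes as avoiding camera motion that may distract or disorient the surgeon.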

  6. The Handedness of Leonardo da Vinci: A Tale of the Complexities of Lateralisation

    ERIC Educational Resources Information Center

    McManus, I. C.; Drury, Helena

    2004-01-01

    The handedness of Leonardo da Vinci is controversial. Although there is little doubt that many of his well-attributed drawings were drawn with the left hand, the hatch marks of the shading going downwards from left to right, it is not clear that he was a natural left-hander, there being some suggestion that he may have become left-handed as the…

  7. [History of robotics: from archytas of tarentum until Da Vinci robot. (Part II)].

    PubMed

    Sánchez-Martín, F M; Jiménez Schlegl, P; Millán Rodríguez, F; Salvador-Bayarri, J; Monllau Font, V; Palou Redorta, J; Villavicencio Mavrich, H

    2007-03-01

    Robotic surgery is a reality. In order to understand how new robots work it is interesting to know the history of ancient (see Part I) and modern robotics. The desire to design automatic machines imitating humans has continued for more than 4000 years. Archytas of Tarentum (around 400 B.C.), Heron of Alexandria, Hsieh-Fec, Al-Jazari, Bacon, Turriano, Leonardo da Vinci, Vaucanson and von Kempelen were robot inventors. In 1942 Asimov published the three laws of robotics. Advances in mechanics, electronics and informatics in the 20th century produced robots able to perform very complex self-governing tasks. In 1985 the robot PUMA 560 was employed to introduce a needle inside the brain. Later on, surgical robots such as World First, Robodoc, Gaspar, Acrobot, Zeus, AESOP, Probot and PAKI-RCP were designed. In 2000 the FDA approved the da Vinci Surgical System (Intuitive Surgical Inc, Sunnyvale, CA, USA), a very sophisticated robot to assist surgeons. Currently urological procedures like prostatectomy, cystectomy and nephrectomy are performed with the da Vinci, so urology has become a very suitable specialty for robotic surgery.

  8. Early assessment of feasibility and technical specificities of transoral robotic surgery using the da Vinci Xi.

    PubMed

    Gorphe, Philippe; Von Tan, Jean; El Bedoui, Sophie; Hartl, Dana M; Auperin, Anne; Qassemyar, Quentin; Moya-Plana, Antoine; Janot, François; Julieron, Morbize; Temam, Stephane

    2017-01-07

    The latest-generation da Vinci(®) Xi™ Surgical System Robot has not been evaluated to date in transoral surgery for head and neck cancers. We report here the 1-year results of a non-randomized phase II multicentric prospective trial aimed at assessing its feasibility and technical specificities. Our primary objective was to evaluate the feasibility of transoral robotic surgery using the da Vinci(®) Xi™ Surgical System Robot. The secondary objective was to assess peroperative outcomes. Twenty-seven patients, mean age 62.7 years, were included between May 2015 and June 2016, with tumors affecting the following sites: oropharynx (n = 21), larynx (n = 4), hypopharynx (n = 1), parapharyngeal space (n = 1). Eighteen patients were included for primary treatment, three for a local recurrence, and six for cancer in a previously irradiated field. Three were reconstructed with a FAMM flap and six with a free ALT flap. The mean docking time was 12 min. "Chopsticking" of surgical instruments was very rare. During hospitalization following surgery, three patients experienced significant bleeding between days 8 and 9 that required surgical transoral hemostasis (n = 1) or endovascular embolization (n = 2). Transoral robotic surgery using the da Vinci(®) Xi™ Surgical System Robot proved feasible, with technological improvements compared to previous-generation surgical system robots and a similar postoperative course. Further technological progress is expected to be of significant benefit to patients.

  9. [The Vitruvian Man: an anatomical drawing for proportions by Leonardo Da Vinci].

    PubMed

    Le Floch-Prigent, P

    2008-12-01

    The aim of the study was to find and analyse the text by Vitruvius which inspired the famous drawing by Leonardo da Vinci (circa 1490) kept in the Galleria dell'Accademia in Venice, Italy: the man inscribed in a circle and in a square. The book "De Architectura" by Vitruvius Marcus Pollio was printed several times from the Renaissance onwards, when both the Roman architecture of antiquity and this text became very popular. Since the French translation by Claude Perrault in 1684, it has been easy to find a French translation alongside the original text in Latin (Paris, 2003, Les Belles Lettres, French text by Pierre Gros). The drawing by Leonardo da Vinci illustrates with great accuracy and fidelity the quotation from Vitruvius (with the exception of two of the 12 main relationships). The genius of Leonardo da Vinci was to keep only one trunk, head and neck for the two pairs of limbs, scapular and pelvic; to make the circle tangent to the lower edge of the square; to adjust a few features of the quotation for the equilibrium of the whole figure; and of course to bring his incredible skill as a draughtsman (one of the best of his century). The drawing was made on a sheet of paper 344 × 245 mm, in black ink which has become dark brown with time; several lines complete the figure above and below; a short caption and a horizontal scale appear just under the drawing. The celebrity of the drawing, a symbol of the Renaissance, of the equilibrium of man and mankind, and of the universality of the artists and intellectuals of the time (Humanism), has made it iconic, and it has been constantly reproduced and adapted, especially for advertisements and logos, not only in the medical field.

  10. OCT structural examination of Madonna dei Fusi by Leonardo da Vinci

    NASA Astrophysics Data System (ADS)

    Targowski, Piotr; Iwanicka, Magdalena; Sylwestrzak, Marcin; Kaszewska, Ewa A.; Frosinini, Cecilia

    2013-05-01

    Madonna dei Fusi ('Madonna of the Yarnwinder') is a spectacular example of Italian Renaissance painting, attributed to Leonardo da Vinci. The aim of this study is to give an account of past restoration procedures. The evidence of a former retouching campaign will be presented with cross-sectional images obtained non-invasively with Optical Coherence Tomography (OCT). Specifically, the locations of overpaintings/retouchings with respect to the original paint layer and secondary varnishes will be given. Additionally, evidence of a former transfer of the pictorial layer to a new canvas support will be shown, detected through the presence of the canvas structure incised into the paint layer.

  11. [The anatomy of a reduced skull model--visualisation of Leonardo da Vinci's anthropology].

    PubMed

    Ahner, E

    2008-04-02

    The article focuses on a rare example of a miniature skull of unknown origin. The profoundness of the anatomical detail, combined with outstanding virtuosity, is reminiscent of Leonardo da Vinci's anatomical skull studies and invites interpretation beyond the emblematic "memento mori" character. Following the miscellaneous topics of his skull studies, an anatomical-anthropological interpretation is proposed. For such a project the merging of anthropology, history of medicine and history of art was mandatory. Concerning some discrepancies in the anatomical realism, the depiction of a pathology is discussed, as well as the visualisation of a historic concept of brain function.

  12. The Da Vinci European BioBank: A Metabolomics-Driven Infrastructure

    PubMed Central

    Carotenuto, Dario; Luchinat, Claudio; Marcon, Giordana; Rosato, Antonio; Turano, Paola

    2015-01-01

    We present here the organization of the recently-constituted da Vinci European BioBank (daVEB, https://www.davincieuropeanbiobank.org/it). The biobank was created as an infrastructure to support the activities of the Fiorgen Foundation (http://www.fiorgen.net/), a nonprofit organization that promotes research in the field of pharmacogenomics and personalized medicine. The way operating procedures concerning samples and data have been developed at daVEB largely stems from the strong metabolomics connotation of Fiorgen and from the involvement of the scientific collaborators of the foundation in international/European projects aimed to tackle the standardization of pre-analytical procedures and the promotion of data standards in metabolomics. PMID:25913579

  13. The uncatchable smile in Leonardo da Vinci's La Bella Principessa portrait.

    PubMed

    Soranzo, Alessandro; Newberry, Michelle

    2015-08-01

    A portrait of uncertain origin recently came to light which, after extensive research and examination, was shown to be that rarest of things: a newly discovered Leonardo da Vinci painting entitled La Bella Principessa. This research presents a new illusion which is similar to that identified in the Mona Lisa; La Bella Principessa's mouth appears to change slant depending on both the Viewing Distance and the Level of Blur applied to a digital version of the portrait. Through a series of psychophysics experiments, it was found that a perceived change in the slant of the La Bella Principessa's mouth influences her expression of contentment thus generating an illusion that we have coined the "uncatchable smile". The elusive quality of the Mona Lisa's smile has been previously reported (Science, 290 (2000) 1299) and so the existence of a similar illusion in a portrait painted prior to the Mona Lisa becomes more interesting. The question remains whether Leonardo da Vinci intended this illusion. In any case, it can be argued that the ambiguity created adds to the portrait's allure.

  14. Leonardo da Vinci and Andreas Vesalius; the shoulder girdle and the spine, a comparison.

    PubMed

    Ganseman, Y; Broos, P

    2008-01-01

    Leonardo Da Vinci and Andreas Vesalius were two important Renaissance figures: Vesalius was a surgeon-anatomist who delivered innovative work on the study of the human body; Leonardo da Vinci was an artist who delivered strikingly accurate and beautiful drawings of the human body. Below we compare both masters with regard to their knowledge of the working of the muscles, their method and system of dissection, and their system and presentation of the drawings. The investigation consisted of a comparison between both anatomists, in particular concerning their studies of the shoulder girdle and spine, by reviewing their original work as well as the existing literature on this subject. The investigation led to the conclusion that the drawings in question marked a change in history and were of high quality, centuries ahead of their time. Both were anatomists, both were revolutionary; only one changed history in his own time, while the other changed history centuries later. Leonardo made beautiful drawings that match or even surpass the drawings of today. Vesalius laid the foundation for medicine as a science, as it remains to this day. Their lives differed as strongly as their impact. In the light of their time, their achievement was extraordinary.

  15. Educating in the Design and Construction of Built Environments Accessible to Disabled People: The Leonardo da Vinci AWARD Project

    ERIC Educational Resources Information Center

    Frattari, Antonio; Dalpra, Michela; Bernardi, Fabio

    2013-01-01

    An interdisciplinary partnership within an European Leonardo da Vinci project has developed a new approach aimed at educating secondary school students in the creation of built environments accessible to disabled people and at sensitizing them towards the inclusion of people with disabilities in all realms of social life. The AWARD (Accessible…

  16. Virtual Mobility in Reality: A Study of the Use of ICT in Finnish Leonardo da Vinci Mobility Projects.

    ERIC Educational Resources Information Center

    Valjus, Sonja

    An e-mail survey and interviews collected data on use of information and communications technology (ICT) in Finnish Leonardo da Vinci mobility projects from 2000-02. Findings showed that the most common ICT tools used were e-mail, digital tools, and the World Wide Web; ICT was used during all project phases; the most common problems concerned…

  17. Da Vinci Xi and Si platforms have equivalent perioperative outcomes during robot-assisted partial nephrectomy: preliminary experience.

    PubMed

    Abdel Raheem, Ali; Sheikh, Abulhasan; Kim, Dae Keun; Alatawi, Atalla; Alabdulaali, Ibrahim; Han, Woong Kyu; Choi, Young Deuk; Rha, Koon Ho

    2017-03-01

    The aims of this study were to compare the perioperative outcomes of the da Vinci Xi to the Si during robot-assisted partial nephrectomy (RAPN) and to discuss the feasibility of our novel port placement scheme for the da Vinci Xi platform, devised to overcome the kinetic and technical difficulties we faced with the linear port placement in patients with a small body habitus. A retrospective analysis of patients who underwent RAPN using the da Vinci Xi (n = 18) was carried out. The outcomes of the Xi group were compared with the Si group (n = 18) selected using a case-matched methodology. For the da Vinci Xi, we applied the universal linear port placement in 12 patients and our modified port placement in the remaining 6 patients. The Xi group had a shorter mean docking time of 17.8 ± 2.6 min compared to the Si group's 20.5 ± 2.1 min (p = 0.002); otherwise, no significant difference was present with regard to the remaining perioperative variables (p > 0.05). The modified Xi port placement had a shorter mean console time of 70.8 ± 9.7 min compared to the universal linear port placement's 89.3 ± 17.2 min (p = 0.03). Moreover, it provided a broader field of vision with excellent robotic arm movement, minimizing collisions and allowing easier and more comfortable surgical assistance. The da Vinci Xi appears to be feasible and safe during RAPN, with outcomes similar to the Si. The novel Xi port placement makes surgery easier in patients with low BMI.

  18. Microbiological Analysis of Surfaces of Leonardo Da Vinci's Atlantic Codex: Biodeterioration Risk

    PubMed Central

    Moroni, Catia; Pasquariello, Giovanna; Maggi, Oriana

    2014-01-01

    Following the discovery of discoloration on some pages of the Atlantic Codex (AC) of Leonardo da Vinci kept in the Biblioteca Ambrosiana in Milan, some investigations have been carried out to verify the presence of microorganisms, such as bacteria and fungi. To verify the presence of microorganisms a noninvasive method of sampling has been used that was efficient and allowed us to highlight the microbial facies of the material that was examined using conventional microbiological techniques. The microclimatic conditions in the storage room as well as the water content of the volume were also assessed. The combined observations allowed the conclusion that the discoloration of suspected biological origin on some pages of AC is not related to the presence or current attack of microbial agents. PMID:25574171

  19. Microbiological Analysis of Surfaces of Leonardo Da Vinci's Atlantic Codex: Biodeterioration Risk.

    PubMed

    Tarsitani, Gianfranco; Moroni, Catia; Cappitelli, Francesca; Pasquariello, Giovanna; Maggi, Oriana

    2014-01-01

    Following the discovery of discoloration on some pages of the Atlantic Codex (AC) of Leonardo da Vinci kept in the Biblioteca Ambrosiana in Milan, some investigations have been carried out to verify the presence of microorganisms, such as bacteria and fungi. To verify the presence of microorganisms a noninvasive method of sampling has been used that was efficient and allowed us to highlight the microbial facies of the material that was examined using conventional microbiological techniques. The microclimatic conditions in the storage room as well as the water content of the volume were also assessed. The combined observations allowed the conclusion that the discoloration of suspected biological origin on some pages of AC is not related to the presence or current attack of microbial agents.

  20. Application of da Vinci® Robot in simple or radical hysterectomy: Tips and tricks

    PubMed Central

    Iavazzo, Christos; Gkegkes, Ioannis D.

    2016-01-01

    The first robotic simple hysterectomy was performed more than 10 years ago. These days, robotic-assisted hysterectomy is accepted as an alternative surgical approach and is applied to both benign and malignant surgical entities. Two important points should be taken into account to optimize postoperative outcomes in the early period of a surgeon's training: how to achieve optimal oncological results and how to achieve optimal functional results. Overcoming any technical challenge, as with any innovative surgical method, improves the surgical operation both in operative time and in patient safety. The standardization of the technique and recognition of critical anatomical landmarks are essential for optimal oncological and clinical outcomes in both simple and radical robotic-assisted hysterectomy. Based on our experience, our intention is to present user-friendly tips and tricks to optimize the application of the da Vinci® robot in simple or radical hysterectomies. PMID:27403078

  1. Bell's palsy: the answer to the riddle of Leonardo da Vinci's 'Mona Lisa'.

    PubMed

    Maloney, W J

    2011-05-01

    The smile of the famed portrait 'The Mona Lisa' has perplexed both art historians and researchers for the past 500 years. There has been a multitude of theories expounded to explain the nature of the model's enigmatic smile. The origin of the model's wry smile can be demonstrated through a careful analysis of both documented facts concerning the portrait--some gathered only recently through the use of modern technology--and a knowledge of the clinical presentation of Bell's palsy. Bell's palsy is more prevalent in women who are either pregnant or who have recently given birth. This paper postulates that the smile of the portrait's model was due to Leonardo da Vinci's anatomically precise representation of a new mother affected by Bell's palsy subsequent to her recent pregnancy.

  2. Urodynamics in the anatomical work of Leonardo da Vinci (1452-1519).

    PubMed

    Schultheiss, D; Grünewald, V; Jonas, U

    1999-06-01

    Leonardo da Vinci (1452-1519) incorporates the symbiosis of art and medicine and can be addressed as the founder of medical illustration in the time of the Renaissance. His anatomy studies were not published in his time, which explains why Leonardo's outstanding knowledge of anatomy, physiology, and medicine had no impact on his scientific contemporaries and is therefore primarily of retrospective importance in the history of medicine. The collection of anatomical illustrations remained unknown until their rediscovery in the eighteenth century and their wide publication at the beginning of our century. This article systematically reviews Leonardo's genitourinary drawings with regard to urodynamic aspects of the upper and lower urinary tract, highlighting topics such as vesicoureteral reflux and urinary sphincter mechanisms.

  3. Understanding the adoption dynamics of medical innovations: affordances of the da Vinci robot in the Netherlands.

    PubMed

    Abrishami, Payam; Boer, Albert; Horstman, Klasien

    2014-09-01

    This study explored the rather rapid adoption of a new surgical device - the da Vinci robot - in the Netherlands despite the high costs and its controversial clinical benefits. We used the concept 'affordances' as a conceptual-analytic tool to refer to the perceived promises, symbolic meanings, and utility values of an innovation constructed in the wider social context of use. This concept helps us empirically understand robot adoption. Data from 28 in-depth interviews with diverse purposively-sampled stakeholders, and from medical literature, policy documents, Health Technology Assessment reports, congress websites and patients' weblogs/forums between April 2009 and February 2014 were systematically analysed from the perspective of affordances. We distinguished five interrelated affordances of the robot that accounted for shaping and fulfilling its rapid adoption: 'characteristics-related' affordances such as smart nomenclature and novelty, symbolising high-tech clinical excellence; 'research-related' affordances offering medical-technical scientific excellence; 'entrepreneurship-related' affordances for performing better-than-the-competition; 'policy-related' affordances indicating the robot's liberalised provision and its reduced financial risks; and 'communication-related' affordances of the robot in shaping patients' choices and the public's expectations by resonating promising discourses while pushing uncertainties into the background. These affordances make the take-up and use of the da Vinci robot sound perfectly rational and inevitable. This Dutch case study demonstrates the fruitfulness of the affordances approach to empirically capturing the contextual dynamics of technology adoption in health care: exploring in-depth actors' interaction with the technology while considering the interpretative spaces created in situations of use. This approach can best elicit real-life value of innovations, values as defined through the eyes of (potential) users.

  4. From Leonardo to da Vinci: the history of robot-assisted surgery in urology.

    PubMed

    Yates, David R; Vaessen, Christophe; Roupret, Morgan

    2011-12-01

    What's known on the subject? and What does the study add? Numerous urological procedures can now be performed with robotic assistance. Though not definitely proven to be superior to conventional laparoscopy or traditional open surgery in the setting of a randomised trial, in experienced centres robot-assisted surgery allows for excellent surgical outcomes and is a valuable tool to augment modern surgical practice. Our review highlights the depth of history that underpins the robotic surgical platform we utilise today, whilst also detailing the current place of robot-assisted surgery in urology in 2011. The evolution of robots in general and as platforms to augment surgical practice is an intriguing story that spans cultures, continents and centuries. A timeline from Yan Shi (1023-957 bc), Archytas of Tarentum (400 bc), Aristotle (322 bc), Heron of Alexandria (10-70 ad), Leonardo da Vinci (1495), the Industrial Revolution (1790), 'telepresence' (1950) and to the da Vinci(®) Surgical System (1999), shows the incredible depth of history and development that underpins the modern surgical robot we use to treat our patients. Robot-assisted surgery is now well-established in Urology and although not currently regarded as a 'gold standard' approach for any urological procedure, it is being increasingly used for index operations of the prostate, kidney and bladder. We perceive that robotic evolution will continue infinitely, securing the place of robots in the history of Urological surgery. Herein, we detail the history of robots in general, in surgery and in Urology, highlighting the current place of robot-assisted surgery in radical prostatectomy, partial nephrectomy, pyeloplasty and radical cystectomy.

  5. [The art of Leonardo Da Vinci as a resource to science and the ideal of nursing care].

    PubMed

    Nascimento, Maria Aparecida de Luca; de Brito, Isabela Jorge; Dehoul, Marcelo da Silva

    2003-01-01

A theoretical reflection whose goal is to demonstrate the art a nursing team must exercise when performing the technical procedure of transferring solutions from a standard vial to a microdrops vial, based on the theoretical referential of Leonardo Da Vinci and inspired by his work "Vitruvian Man", so that body harmony is preserved. The authors emphasize its relationship to nursing care in the broadest sense, reflected in nursing's own motto: "Science, Art and Ideal".

  6. Realization of a single image haze removal system based on DaVinci DM6467T processor

    NASA Astrophysics Data System (ADS)

    Liu, Zhuang

    2014-10-01

Video monitoring systems (VMS) are extensively applied in target recognition, traffic management, remote sensing, autonomous navigation and national defence. However, a VMS depends strongly on the weather: in fog, the quality of the images it receives is distinctly degraded and its effective range is reduced, so it performs poorly in bad weather. Research on enhancing fog-degraded images therefore has high theoretical and practical value. This paper presents the design of a fog-degraded image enhancement system based on the TI DaVinci processor. The system captures images from digital cameras and applies image enhancement processing to obtain a clear image. The processor used is the dual-core TI DaVinci DM6467T (ARM@500 MHz + DSP@1 GHz). A MontaVista Linux operating system runs on the ARM subsystem, which handles I/O and application processing; the DSP handles signal processing and makes its results available to the ARM subsystem in shared memory. Thanks to the DaVinci processor, the system provides image processing capability equivalent to an x86 computer at lower power cost and in a smaller volume. Results show that the system can process D1-resolution images at 25 frames per second.
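The abstract does not name the dehazing algorithm used on the DM6467T; as an illustration only, the widely cited dark channel prior can be sketched in pure Python. The patch size, omega, t0 and the tiny test image below are arbitrary assumptions, not values from the paper:

```python
# Minimal single-image dehazing sketch using the dark channel prior
# (a common choice; the paper's actual algorithm is not specified).
# Image: H x W x 3 nested lists, channel values in [0, 1].

def dark_channel(img, patch=1):
    """Per-pixel min over RGB and a (2*patch+1)^2 neighbourhood (edge-clamped)."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = []
            for dy in range(-patch, patch + 1):
                for dx in range(-patch, patch + 1):
                    yy = min(max(y + dy, 0), h - 1)
                    xx = min(max(x + dx, 0), w - 1)
                    vals.append(min(img[yy][xx]))
            out[y][x] = min(vals)
    return out

def dehaze(img, omega=0.95, t0=0.1):
    h, w = len(img), len(img[0])
    dark = dark_channel(img)
    # Atmospheric light A: brightest channel at the haziest (max dark-channel) pixel.
    y, x = max(((y, x) for y in range(h) for x in range(w)),
               key=lambda p: dark[p[0]][p[1]])
    A = max(img[y][x])
    out = [[None] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            t = max(1.0 - omega * dark[y][x] / A, t0)  # transmission estimate
            # Recover scene radiance J = (I - A)/t + A, clipped to [0, 1].
            out[y][x] = [min(max((c - A) / t + A, 0.0), 1.0) for c in img[y][x]]
    return out

# A uniformly bright (hazy) toy image: dehazing stretches contrast.
hazy = [[[0.8, 0.8, 0.8], [0.6, 0.7, 0.9]],
        [[0.9, 0.85, 0.8], [0.7, 0.75, 0.8]]]
clear = dehaze(hazy)
```

The per-pixel independence of the recovery step is what makes this kind of algorithm a natural fit for an ARM-plus-DSP split, with the DSP doing the pixel loops.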

  7. Molecular studies of microbial community structure on stained pages of Leonardo da Vinci's Atlantic Codex.

    PubMed

    Principi, Pamela; Villa, Federica; Sorlini, Claudia; Cappitelli, Francesca

    2011-01-01

In 2006, after a visual inspection of Leonardo da Vinci's Atlantic Codex by a scholar, active molds were reported to be present on Codex pages showing areas of staining. In the present paper, molecular methods were used to assess the current microbiological risk to stained pages of the manuscript. Bacterial and fungal communities were sampled by a non-invasive technique employing nitrocellulose membranes. Denaturing gradient gel electrophoresis of the 16S rRNA gene and of internal transcribed spacer regions was carried out to study the structure of the bacterial and fungal communities, and band patterns were analyzed by the multivariate technique of principal component analysis. Any relationship between the presence of an active microbial community and the staining was excluded. The presence of potential biodeteriogens was evaluated by constructing bacterial and fungal clone libraries and analyzing them by an operational taxonomic unit (OTU) approach. Among the bacteria, some OTUs were associated with species found on cleanroom floors, while others were identified with human skin contamination. Some fungal OTU representatives were potential biodeteriogens that, under the right thermo-hygrometric conditions, could grow. The retrieval of these potential biodeteriogens and of microorganisms related to human skin suggests the need for continuous and rigorous monitoring of the environmental conditions, and for improved handling procedures.
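The principal component analysis of band patterns mentioned above can be sketched in a few lines of pure Python. The presence/absence matrix below is made up for illustration (not the paper's data), and the first component is found by power iteration:

```python
# Sketch of PCA on DGGE-style band patterns: rows are samples, columns are
# band positions (1 = band present). Hypothetical data, pure Python.

def pca_first_component(data, iters=200):
    n, d = len(data), len(data[0])
    means = [sum(row[j] for row in data) / n for j in range(d)]
    centered = [[row[j] - means[j] for j in range(d)] for row in data]
    # Sample covariance matrix (d x d).
    cov = [[sum(centered[i][a] * centered[i][b] for i in range(n)) / (n - 1)
            for b in range(d)] for a in range(d)]
    v = [1.0] * d
    for _ in range(iters):  # power iteration converges to the top eigenvector
        w = [sum(cov[a][b] * v[b] for b in range(d)) for a in range(d)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    # PC1 score of each sample: projection of its centered row onto v.
    scores = [sum(centered[i][j] * v[j] for j in range(d)) for i in range(n)]
    return v, scores

# Samples 0 and 1 share most bands; sample 2 has a different pattern.
bands = [[1, 1, 0, 0],
         [1, 1, 0, 1],
         [0, 0, 1, 0]]
axis, scores = pca_first_component(bands)
# Samples with similar band patterns receive similar PC1 scores.
```

Plotting the scores (here just PC1; real analyses use PC1 vs PC2) is what lets visually similar lanes cluster together.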

  8. The handedness of Leonardo da Vinci: a tale of the complexities of lateralisation.

    PubMed

    McManus, I C; Drury, Helena

    2004-07-01

    The handedness of Leonardo da Vinci is controversial. Although there is little doubt that many of his well-attributed drawings were drawn with the left hand, the hatch marks of the shading going downwards from left to right, it is not clear that he was a natural left-hander, there being some suggestion that he may have become left-handed as the result of an injury to his right hand in early adulthood. Leonardo's lateralisation may be illuminated by an obscure passage in his notebooks in which he describes crouching down to look into a dark cave, putting his left hand on his knee, and shading his eyes with his right hand. We carried out a questionnaire survey, using 33 written and photographic items, to find whether this behaviour was typical of right handers or left handers. In fact the 'Leonardo task' showed almost no direct association with handedness, meaning that it contributes little to the immediate problem of elucidating Leonardo's handedness. However, the lateralisation of the task did relate to other aspects of behavioural laterality in surprisingly complex ways. This suggests that individual differences in handedness, and behavioural laterality in general, have a structural complexity which is not fully encompassed by simple measures of direction or degree of handedness.

  9. Leonardo da Vinci's "A skull sectioned": skull and dental formula revisited.

    PubMed

    Gerrits, Peter O; Veening, Jan G

    2013-05-01

What can be learned from historical anatomical drawings, and how can they be incorporated into anatomical teaching? The drawing "A skull sectioned" (RL 19058v) by Leonardo da Vinci (1452-1519) hides more detailed information than reported earlier. A well-chosen section cut exposes the sectioned paranasal sinuses and ductus nasolacrimalis. A dissected lateral wall of the maxilla is also present. Furthermore, at the level of the foramen mentale, the drawing displays compact and spongious bony components, together with a cross-section through the foramen mentale and its connection with the canalis mandibulae. Leonardo was the first to describe a correct dental formula (6424) and made efforts to place this formula above the related dental elements. However, taking into account the morphological features of the individual elements of the maxilla, it can be suggested that Leonardo sketched a "peculiar dental element" in the position of the right maxillary premolar in the dental sketch. That he made no comment on this special element is remarkable. Leonardo could have had sufficient knowledge of the precise morphology of maxillary and mandibular premolars, since he depicted these elements in the dissected skull. That he also had access to premolars in situ corroborates our suggestion that "something went wrong" in this part of the drawing. The present study shows that historical anatomical drawings are very useful for interactive learning of detailed anatomy by students of medicine and dentistry.

  10. Early Experience in Da Vinci Robot-Assisted Partial Nephrectomy: An Australian Single Centre Series

    PubMed Central

    Ting, Francis; Savdie, Richard; Chopra, Sam; Yuen, Carlo; Brenner, Phillip

    2015-01-01

    Introduction and Objectives. To demonstrate the safety and efficacy of the robot-assisted partial nephrectomy (RAPN) technique in an Australian setting. Methods. Between November 2010 and July 2014, a total of 76 patients underwent 77 RAPN procedures using the Da Vinci Surgical System© at our institution. 58 of these procedures were performed primarily by the senior author (PB) and are described in this case series. Results. Median operative time was 4 hours (range 1.5–6) and median warm ischaemic time (WIT) was 8 minutes (range 0–30) including 11 cases with zero ischaemic time. All surgical margins were clear with the exception of one patient who had egress of intravascular microscopic tumour outside the capsule to the point of the resection margin. Complications were identified in 9 patients (15.8%). Major complications included conversion to open surgery due to significant venous bleeding (n = 1), reperfusion injury (n = 1), gluteal compartment syndrome (n = 1), DVT/PE (n = 1), and readmission for haematuria (n = 1). Conclusion. This series demonstrates the safety and efficacy of the RAPN technique in an Australian setting when performed by experienced laparoscopic surgeons in a dedicated high volume robotic centre. PMID:26167299

  11. [Leonardo da Vinci the first human body imaging specialist. A brief communication on the thorax oseum images].

    PubMed

    Cicero, Raúl; Criales, José Luis; Cardoso, Manuel

    2009-01-01

The impressive development of computed tomography (CT) techniques, such as three-dimensional helical CT, produces a spatial image of the bony thorax. At the beginning of the 16th century Leonardo da Vinci drew the thorax oseum with great precision. These drawings show an outstanding similarity to the images obtained by three-dimensional helical CT. The painstaking work of the Renaissance genius is a prime example of the careful study of human anatomy. Modern imaging techniques require perfect anatomical knowledge of the human body in order to generate exact interpretations of images. Leonardo's example lives on for anybody devoted to modern imaging studies.

  12. The LEONARDO-DA-VINCI pilot project "e-learning-assistant" - Situation-based learning in nursing education.

    PubMed

    Pfefferle, Petra Ina; Van den Stock, Etienne; Nauerth, Annette

    2010-07-01

E-learning will play an important role in the training portfolio of students in higher and vocational education. Within the LEONARDO-DA-VINCI action programme, transnational pilot projects funded by the European Union aimed to improve the usage and quality of e-learning tools in education and professional training. The overall aim of the LEONARDO-DA-VINCI pilot project "e-learning-assistant" was to create new didactical and technical e-learning tools for Europe-wide use in nursing education. Based on a new situation-oriented learning approach, nursing teachers enrolled in the project were instructed to adapt, develop and implement e-learning and blended-learning units. Nursing modules matching the training contents were developed by teachers from partner institutions, implemented in the project centres and evaluated by students. The user package "e-learning-assistant", the product of the project, includes two teacher-training units, the authoring tool "synapse" for creating situation-based e-learning units, a student learning platform containing blended-learning modules in nursing, and an open-source web-based communication centre.

  13. Realization and optimization of AES algorithm on the TMS320DM6446 based on DaVinci technology

    NASA Astrophysics Data System (ADS)

    Jia, Wen-bin; Xiao, Fu-hai

    2013-03-01

The application of the AES algorithm in digital cinema systems protects video data from theft and malicious tampering, solving its security problems. At the same time, to meet the real-time and transparency requirements for encrypting high-speed audio and video data streams in the information security field, this paper analyses the principle of the AES algorithm in depth and, based on the TMS320DM6446 hardware platform and the DaVinci software framework, proposes concrete methods for realizing the AES algorithm in a digital video system, together with optimization solutions. The test results show that digital movies encrypted by AES-128 cannot be played normally, which ensures the security of the digital movies. A comparison of the performance of the AES-128 algorithm before and after optimization verifies the correctness and validity of the improved algorithm.
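The abstract does not include the implementation itself. For readers unfamiliar with the cipher, here is a compact, unoptimized pure-Python sketch of AES-128 encryption of a single block, checked against the FIPS-197 test vector. It is a textbook illustration, not the paper's DSP-optimized code:

```python
# Illustrative pure-Python AES-128 single-block encryption.

def _gmul(a, b):                      # multiply in GF(2^8), AES polynomial 0x11b
    p = 0
    for _ in range(8):
        if b & 1:
            p ^= a
        hi = a & 0x80
        a = (a << 1) & 0xFF
        if hi:
            a ^= 0x1B
        b >>= 1
    return p

def _build_sbox():                    # S-box from GF(2^8) inverse + affine map
    sbox = [0] * 256
    for x in range(256):
        inv = next((y for y in range(256) if _gmul(x, y) == 1), 0) if x else 0
        s = inv
        for i in range(1, 5):         # s = inv ^ rotl(inv,1..4) ^ 0x63
            s ^= ((inv << i) | (inv >> (8 - i))) & 0xFF
        sbox[x] = s ^ 0x63
    return sbox

SBOX = _build_sbox()

def _expand_key(key):                 # 11 round keys of 16 bytes each
    w = [list(key[4 * i:4 * i + 4]) for i in range(4)]
    rcon = 1
    for i in range(4, 44):
        t = list(w[i - 1])
        if i % 4 == 0:
            t = [SBOX[b] for b in t[1:] + t[:1]]   # RotWord then SubWord
            t[0] ^= rcon
            rcon = _gmul(rcon, 2)
        w.append([a ^ b for a, b in zip(w[i - 4], t)])
    return [sum(w[4 * r:4 * r + 4], []) for r in range(11)]

def aes128_encrypt_block(key, block):
    rk = _expand_key(key)
    s = [b ^ k for b, k in zip(block, rk[0])]      # state, column-major flat
    for rnd in range(1, 11):
        s = [SBOX[b] for b in s]                   # SubBytes
        # ShiftRows: row r (indices r, r+4, r+8, r+12) rotates left by r
        s = [s[(i + 4 * (i % 4)) % 16] for i in range(16)]
        if rnd < 10:                               # MixColumns (skipped in round 10)
            m = []
            for c in range(4):
                col = s[4 * c:4 * c + 4]
                m += [_gmul(col[0], 2) ^ _gmul(col[1], 3) ^ col[2] ^ col[3],
                      col[0] ^ _gmul(col[1], 2) ^ _gmul(col[2], 3) ^ col[3],
                      col[0] ^ col[1] ^ _gmul(col[2], 2) ^ _gmul(col[3], 3),
                      _gmul(col[0], 3) ^ col[1] ^ col[2] ^ _gmul(col[3], 2)]
            s = m
        s = [b ^ k for b, k in zip(s, rk[rnd])]    # AddRoundKey
    return bytes(s)

key = bytes(range(16))                # 000102...0f
pt = bytes.fromhex("00112233445566778899aabbccddeeff")
ct = aes128_encrypt_block(key, pt)
print(ct.hex())                       # FIPS-197 C.1 expects 69c4e0d86a7b0430d8cdb78070b4c55a
```

A production implementation (as on the DM6446) would instead use precomputed tables or hardware-friendly fixed-point code, plus a mode of operation such as CTR or CBC for whole streams.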

  14. Leonardo da Vinci's drapery studies: characterization of lead white pigments by µ-XRD and 2D scanning XRF

    NASA Astrophysics Data System (ADS)

    Gonzalez, Victor; Calligaro, Thomas; Pichon, Laurent; Wallez, Gilles; Mottin, Bruno

    2015-11-01

This work focuses on the composition and microstructure of the lead white pigment employed in a set of works, using a combination of µ-XRD and 2D scanning XRF applied directly to five drapery studies attributed to Leonardo da Vinci (1452-1519) and conserved in the Département des Arts Graphiques, Musée du Louvre, and in the Musée des Beaux-Arts de Rennes. Trace elements present in the composition as well as in the lead white highlights were imaged by 2D scanning XRF. Mineral phases were determined in a fully noninvasive way using a special µ-XRD diffractometer, and phase proportions were estimated by Rietveld refinement. The analytical results will help differentiate lead white qualities and shed light on the artist's technique.

  15. [First 24 Japanese cases of robotic-assisted laparoscopic radical prostatectomy using the daVinci Surgical System].

    PubMed

    Yoshioka, Kunihiko; Hatano, Tadashi; Nakagami, Yoshihiro; Ozu, Choichiro; Horiguchi, Yutaka; Sakamoto, Noboru; Yonov, Hiroyuki; Ohno, Yoshio; Ohori, Makoto; Tachibana, Masaaki; Patel, Vipul R

    2008-05-01

In Japan, as of September 2007, prostatectomy was conducted by open surgery in more than 90% of cases. Following the first reported robotic prostatectomy by Binder et al. in 2000, robotic-assisted laparoscopic radical prostatectomy (RALP) using the daVinci Surgical System (Intuitive Surgical, Inc., Sunnyvale, California, USA) has been extensively used as a standard procedure with gratifying results in the United States. In the Asian region, in contrast, RALP is still in an introductory phase. Recently, we introduced RALP in Japan. A total of 24 patients received robotic surgery in the year following August 2006. RALP was completed in all patients without conversion to open surgery, except for the first patient, in whom a 2-hour limit on the operation had been imposed by the Ethical Committee. The mean operative time using the daVinci device and the mean estimated blood loss were 232.0 (range 136-405) minutes and 313.0 (range 10-1,000) ml, respectively. The training program we recently developed proved remarkably effective in reducing the learning curve of robotic surgery in Japan, where no one yet had expertise in this operating procedure. In particular, intraoperative guidance given by an expert was useful after the relevant problem points had been delineated by operators who had received comprehensive video-based image training and had actually performed robotic surgery in several cases. With direct intraoperative guidance by the mentor during cases 13 and 14, both the operation time and the estimated blood loss were markedly reduced.

  16. How did Leonardo perceive himself? Metric iconography of da Vinci's self-portraits

    NASA Astrophysics Data System (ADS)

    Tyler, Christopher W.

    2010-02-01

Some eighteen portraits of Leonardo in old age are now recognized, consolidating the impression from his best-established self-portrait of an old man with long white hair and beard. However, his appearance when younger is generally regarded as unknown, although he was described as very beautiful as a youth. Application of the principles of metric iconography, the quantitative analysis of painted images, provides an avenue for identifying other portraits that may be proposed as valid portraits of Leonardo at various stages of his life, by himself and by his contemporaries. Overall, this approach identifies portraits of Leonardo by Verrocchio, Raphael, Botticelli, and others. Beyond this physiognomic analysis, Leonardo's first known drawing provides further insight into his core motivations. Topographic considerations make clear that the drawing is of the hills behind Vinci, with a view overlooking the rocky promontory of the town and the plain stretching out before it. The outcroppings in the foreground bear a striking resemblance to those of his unique composition, 'The Virgin of the Rocks', suggesting a deep childhood appreciation of this wild terrain and an identification with that religious man of the mountains, John the Baptist, who was also the topic of Leonardo's last known painting. Following this trail leads to a line of possible self-portraits continuing the age-regression concept back to a self-view at about two years of age.

  17. Amid the possible causes of a very famous foxing: molecular and microscopic insight into Leonardo da Vinci's self‐portrait

    PubMed Central

    Tafer, Hakim; Sterflinger, Katja; Pinzari, Flavia

    2015-01-01

Leonardo da Vinci's self-portrait is affected by foxing spots. The portrait has no fungal or bacterial infections in place, but is contaminated with airborne spores and fungal material that could play a role in its disfigurement. The knowledge of the nature of the stains is of great concern because future conservation treatments should be derived from scientific investigations. The lack of reliable scientific data, due to the non-culturability of the microorganisms inhabiting the portrait, prompted the investigation of the drawing using non-invasive and micro-invasive sampling, in combination with scanning electron microscope (SEM) imaging and molecular techniques. The fungus Eurotium halophilicum was found in foxing spots using SEM analyses. Oxalates of fungal origin were also documented. Both findings are consistent with the hypothesis that tonophilic fungi germinate on paper metabolizing organic acids, oligosaccharides and proteic compounds, which react chemically with the material at a low water activity, forming brown products and oxidative reactions resulting in foxing spots. Additionally, molecular techniques enabled a screening of the fungi inhabiting the portrait and showed differences when different sampling techniques were employed. Swab samples showed a high abundance of lichenized Ascomycota, while the membrane filters showed a dominance of Acremonium sp. colonizing the drawing. PMID:26111623

  18. Amid the possible causes of a very famous foxing: molecular and microscopic insight into Leonardo da Vinci's self-portrait.

    PubMed

    Piñar, Guadalupe; Tafer, Hakim; Sterflinger, Katja; Pinzari, Flavia

    2015-12-01

Leonardo da Vinci's self-portrait is affected by foxing spots. The portrait has no fungal or bacterial infections in place, but is contaminated with airborne spores and fungal material that could play a role in its disfigurement. The knowledge of the nature of the stains is of great concern because future conservation treatments should be derived from scientific investigations. The lack of reliable scientific data, due to the non-culturability of the microorganisms inhabiting the portrait, prompted the investigation of the drawing using non-invasive and micro-invasive sampling, in combination with scanning electron microscope (SEM) imaging and molecular techniques. The fungus Eurotium halophilicum was found in foxing spots using SEM analyses. Oxalates of fungal origin were also documented. Both findings are consistent with the hypothesis that tonophilic fungi germinate on paper metabolizing organic acids, oligosaccharides and proteic compounds, which react chemically with the material at a low water activity, forming brown products and oxidative reactions resulting in foxing spots. Additionally, molecular techniques enabled a screening of the fungi inhabiting the portrait and showed differences when different sampling techniques were employed. Swab samples showed a high abundance of lichenized Ascomycota, while the membrane filters showed a dominance of Acremonium sp. colonizing the drawing.

  19. The oldest anatomical handmade skull of the world c. 1508: 'the ugliness of growing old' attributed to Leonardo da Vinci.

    PubMed

    Missinne, Stefaan J

    2014-06-01

The author discusses a previously unknown early sixteenth-century Renaissance handmade anatomical miniature skull. The small, naturalistic skull, made from an agate (calcedonia) stone mixture (mistioni), shows remarkable osteologic details. Dr. Saban was the first to link the skull to Leonardo. The three-dimensional perspective and the search for the senso comune are discussed. Anatomical errors both in Leonardo's drawings and in this skull are presented. The article ends with the issue of physiognomy, his grotesque faces, the Perspective Communis and his experimenting c. 1508 with the stone mixture and the human skull. Evidence is presented, including the Italian scale based on Crazie and Braccia, chemical analysis pointing to a mine in Volterra, and Leonardo's search for the soul in the skull. Written references in the inventory of Salai (1524), the inventory of the Villa Riposo (Raffaello Borghini, 1584) and Don Ambrogio Mazenta (1635) are reviewed. The author attributes the skull, c. 1508, to Leonardo da Vinci.

  20. The mother relationship and artistic inhibition in the lives of Leonardo da Vinci and Erik H. Erikson.

    PubMed

    Capps, Donald

    2008-12-01

    In four earlier articles, I focused on the theme of the relationship of melancholia and the mother, and suggested that the melancholic self may experience humor (Capps, 2007a), play (Capps, 2007b), dreams (Capps, 2008a), and art (Capps, 2008b) as restorative resources. I argued that Erik H. Erikson found these resources to be valuable remedies for his own melancholic condition, which had its origins in the fact that he was illegitimate and was raised solely by his mother until he was three years old, when she remarried. In this article, I focus on two themes in Freud's Leonardo da Vinci and a memory of his childhood (1964): Leonardo's relationship with his mother in early childhood and his inhibitions as an artist. I relate these two themes to Erikson's own early childhood and his failure to achieve his goal as an aspiring artist in his early twenties. The article concludes with a discussion of Erikson's frustrated aspirations to become an artist and his emphasis, in his psychoanalytic work, on children's play.

  1. 2D and 3D optical diagnostic techniques applied to Madonna dei Fusi by Leonardo da Vinci

    NASA Astrophysics Data System (ADS)

    Fontana, R.; Gambino, M. C.; Greco, M.; Marras, L.; Materazzi, M.; Pampaloni, E.; Pelagotti, A.; Pezzati, L.; Poggi, P.; Sanapo, C.

    2005-06-01

3D measurement and modelling have traditionally been applied to statues, buildings, archaeological sites and similar large structures, but rarely to paintings. Recently, however, 3D measurements have also been performed successfully on easel paintings, allowing the painting's surface to be detected and documented. We used 3D models to integrate the results of various 2D imaging techniques in a common reference frame. These applications show how 3D shape information, complemented with 2D colour maps as well as other types of sensory data, provides the most interesting information. The 3D data acquisition was carried out by means of two devices: a high-resolution laser micro-profilometer, composed of a commercial distance meter mounted on a scanning device, and a laser-line scanner. The 2D data acquisitions were carried out using a scanning device for simultaneous RGB colour imaging and IR reflectography, and a UV-fluorescence multispectral image acquisition system. We present here the results of these techniques applied to the analysis of an important painting of the Italian Renaissance: 'Madonna dei Fusi', attributed to Leonardo da Vinci.

  2. Michelangelo in Florence, Leonardo in Vinci.

    ERIC Educational Resources Information Center

    Herberholz, Barbara

    2003-01-01

    Provides background information on the lives and works of Michelangelo and Leonardo da Vinci. Focuses on the artwork of the artists and the museums where their work is displayed. Includes museum photographs of their work. (CMK)

  3. Integrating Leonardo da Vinci's principles of demonstration, uncertainty, and cultivation in contemporary nursing education.

    PubMed

    Story, Lachel; Butts, Janie

    2014-03-01

Nurses today are facing an ever-changing health care system. Stimulated by health care reform and limited resources, nursing education is being challenged to prepare nurses for this uncertain environment. Looking to the past can offer possible solutions to the issues nursing education is confronting. Seven principles of da Vincian thinking have been identified (Gelb, 2004). As a follow-up to an exploration of the curiosità principle (Butts & Story, 2013), this article explores the three principles of dimostrazione, sfumato, and corporalita. Nursing faculty can set the stage for a meaningful educational experience through these principles of demonstration (dimostrazione), uncertainty (sfumato), and cultivation (corporalita). Preparing nurses not only to manage but to flourish in the current health care environment will enhance both the nurse's and the patient's experience.

  4. Placement of ¹²⁵I implants with the da Vinci robotic system after video-assisted thoracoscopic wedge resection: A feasibility study

    SciTech Connect

    Pisch, Julianna; Belsley, Scott J.; Ashton, Robert; Wang Lin; Woode, Rudolph; Connery, Cliff

    2004-11-01

Purpose: To evaluate the feasibility of using the da Vinci robotic system for radioactive seed placement in the wedge resection margin of pigs' lungs. Methods and materials: Video-assisted thoracoscopic wedge resection was performed in the upper and lower lobes in pigs. Dummy ¹²⁵I seeds embedded in absorbable sutures were sewn into the resection margin with the aid of the da Vinci robotic system without complications. In the 'loop' technique, the seeds were placed in a cylindrical pattern; in the 'longitudinal' technique, they were placed above and lateral to the resection margin. Orthogonal radiographs were taken in the operating room. For dose calculation, Variseed 66.7 (Build 11312) software was used. Results: With looping seed placement, the dose at 1 cm from the source was 97.0 Gy in the coronal view and 107.3 Gy in the lateral view. For longitudinal seed placement, the corresponding doses were 89.5 Gy and 70.0 Gy. Conclusion: Robotic technology allows direct placement of radioactive seeds into the resection margin by endoscopic surgery, overcoming the technical difficulties of manipulating within the narrow chest cavity. With the advent of robotic technology, new options in the treatment of lung cancer, as well as other malignant tumors, will become available.

  5. Specific learning curve for port placement and docking of da Vinci(®) Surgical System: one surgeon's experience in robotic-assisted radical prostatectomy.

    PubMed

    Dal Moro, F; Secco, S; Valotto, C; Artibani, W; Zattoni, F

    2012-12-01

    Port placement and docking of the da Vinci(®) Surgical System is fundamental in robotic-assisted laparoscopic radical prostatectomy (RALP). The aim of our study was to investigate learning curves for port placement and docking of robots (PPDR) in RALP. This manuscript is a retrospective review of prospectively collected data looking at PPDR in 526 patients who underwent RALP in our institute from April 2005 to May 2010. Data included patient-factor features such as body mass index (BMI), and pre-, intra- and post-operative data. Intra-operative information included operation time, subdivided into anesthesia, PPDR and console times. 526 patients underwent RALP, but only those in whom PPDR was performed by the same surgeon without laparoscopic and robotic experience (F.D.M.) were studied, totalling 257 cases. The PPDR phase revealed an evident learning curve, comparable with other robotic phases. Efficiency improved until approximately the 60th case (P < 0.001), due more to effective port placement than to docking of robotic arms. In our experience, conversion to open surgery is so rare that statistical evaluation is not significant. Conversion due to robotic device failure is also very rare. This study on da Vinci procedures in RALP revealed a learning curve during PPDR and throughout the robotic-assisted procedure, reaching a plateau after 60 cases.

  6. A psychoanalytic understanding of the desire for knowledge as reflected in Freud's Leonardo da Vinci and a memory of his childhood.

    PubMed

    Blass, Rachel B

    2006-10-01

    The author offers an understanding of the psychoanalytic notion of the desire for knowledge and the possibility of attaining it as it finds expression in Freud's Leonardo da Vinci and a memory of his childhood. This understanding has not been explicitly articulated by Freud but may be considered integral to psychoanalysis' Weltanschauung as shaped by Freud's legacy. It emerges through an attempt to explain basic shifts, contradictions, inconsistencies and tensions that become apparent from a close reading of the text of Leonardo. Articulating this implicit understanding of knowledge provides the grounds for a stance on epistemology that is integral to psychoanalysis and relevant to contemporary psychoanalytic concerns on this topic. This epistemology focuses on the necessary involvement of passion, rather than detachment, in the search for knowledge and views the psychoanalytic aim of self-knowledge as a derivative, and most immediate expression, of a broader and more basic human drive to know.

  7. Da Vinci Coding? Using Renaissance Artists’ Depictions of the Brain to Engage Student Interest in Neuroanatomy

    PubMed Central

    Watson, Todd D.

    2013-01-01

    This report describes a pair of brief, interactive classroom exercises utilizing Renaissance artists’ depictions of the brain to help increase student interest in learning basic neuroanatomy. Undergraduate students provided anonymous quantitative evaluations of both exercises. The feedback data suggest that students found both exercises engaging. The data also suggest that the first exercise increased student interest in learning more about neuroanatomy in general, while the second provided useful practice in identifying major neuroanatomical structures. Overall, the data suggest that these exercises may be a useful addition to courses that introduce or review neuroanatomical concepts. PMID:23805058

  8. The left ventricle as a mechanical engine: from Leonardo da Vinci to the echocardiographic assessment of peak power output-to-left ventricular mass.

    PubMed

    Dini, Frank L; Guarini, Giacinta; Ballo, Piercarlo; Carluccio, Erberto; Maiello, Maria; Capozza, Paola; Innelli, Pasquale; Rosa, Gian M; Palmiero, Pasquale; Galderisi, Maurizio; Razzolini, Renato; Nodari, Savina

    2013-03-01

    The interpretation of the heart as a mechanical engine dates back to the teachings of Leonardo da Vinci, who was the first to apply the laws of mechanics to the function of the heart. Similar to any mechanical engine, whose performance is proportional to the power generated with respect to weight, the left ventricle can be viewed as a power generator whose performance can be related to left ventricular mass. Stress echocardiography may provide valuable information on the relationship between cardiac performance and recruited left ventricular mass that may be used in distinguishing between adaptive and maladaptive left ventricular remodeling. Peak power output-to-mass, obtained during exercise or pharmacological stress echocardiography, is a measure that reflects the number of watts that are developed by 100 g of left ventricular mass under maximal stimulation. Power output-to-mass may be calculated as left ventricular power output per 100 g of left ventricular mass: 100× left ventricular power output divided by left ventricular mass (W/100 g). A simplified formula to calculate power output-to-mass is as follows: 0.222 × cardiac output (l/min) × mean blood pressure (mmHg)/left ventricular mass (g). When the integrity of myocardial structure is compromised, a mismatch becomes apparent between maximal cardiac power output and left ventricular mass; when this occurs, a reduction of the peak power output-to-mass index is observed.
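The simplified formula quoted above can be sketched in a few lines. This is a minimal illustration only; the input values (cardiac output, mean blood pressure, left ventricular mass) are made up for the example and are not data from the study:

```python
# Sketch of the simplified power output-to-mass index from the abstract:
# 0.222 x cardiac output (l/min) x mean blood pressure (mmHg) / LV mass (g),
# expressed in W per 100 g of left ventricular mass.

def power_output_to_mass(cardiac_output_l_min, mean_bp_mmhg, lv_mass_g):
    """Peak power output-to-mass index (W/100 g)."""
    return 0.222 * cardiac_output_l_min * mean_bp_mmhg / lv_mass_g

# Illustrative values: CO 5 l/min, mean BP 90 mmHg, LV mass 150 g.
index = power_output_to_mass(5.0, 90.0, 150.0)
print(round(index, 3))  # 0.666
```

The 0.222 factor folds together the 100 g normalization and the conversion from mmHg·l/min to watts.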

  9. Investigating the 'Uncatchable Smile' in Leonardo da Vinci's La Bella Principessa: A Comparison with the Mona Lisa and Pollaiuolo's Portrait of a Girl.

    PubMed

    Soranzo, Alessandro; Newberry, Michelle

    2016-10-04

    This paper discusses how the 'Uncatchable Smile' illusion in Leonardo da Vinci's La Bella Principessa portrait was discovered. Kemp and Cotte(1) described the expression of the Princess as ambiguous and "subtle to an inexpressible degree". A combination of three methods was used (inter-observation, structured interviews, and psychophysical experiments) to identify what may underlie this 'ambiguity'. The inter-observation and the structured interview methods were firstly applied to generate experimental hypotheses that were successively tested by a series of psychophysical experiments. The combination of these research methods minimizes the impact of the researcher's beliefs and biases in the development of the research design. It emerged that the ambiguity in La Bella Principessa is triggered by a change in the perceived level of contentment in her facial expression and that this perceptual change is attributable to a visual illusion relating to her mouth. Moreover, it was found that a similar effect can be observed in the Mona Lisa. As the smile in La Bella Principessa disappears as soon as the viewer tries to 'catch it', we named this visual illusion the 'Uncatchable Smile'. The elusive quality of the Mona Lisa's smile(2) is probably why the portrait is so famous, and so the existence of a similar ambiguity in a portrait painted by Leonardo prior to the Mona Lisa is even more interesting.

  10. Reforming Upper Secondary Education in Europe. The Leonardo da Vinci Project Post-16 Strategies. Surveys of Strategies for Post-16 Education To Improve the Parity of Esteem for Initial Vocational Education in Eight European Educational Systems. Theory into Practice 92. Institute for Educational Research Publication Series B.

    ERIC Educational Resources Information Center

    Lasonen, Johanna, Ed.

    This book contains the following papers on the Leonardo da Vinci project: "Looking for Post-16 Education Strategies for Parity of Esteem in Europe" (Lasonen); "Improving Parity of Esteem as a Policy Goal" (Makinen, Volanen); "Alternative Strategies for Parity of Esteem between General/Academic and Vocational Education in…

  11. A second Leonardo da Vinci?

    PubMed

    Nakano, Mitsuko; Endo, Toshitaka; Tanaka, Shigeki

    2003-10-01

    We describe a young woman who suddenly began mirror writing with her right hand and has not reverted to normal writing for more than 6 years, although she writes normally with her left hand. She is ambidextrous, although she had previously used only her right hand for writing and drawing. Since it is much easier for her to use right-handed mirror writing, she uses her left hand only for writing meant to be read by others and her right hand for all other writing. Her hobbies are sculpture and painting, and her chief complaint is migraine accompanied by sensory and perceptive disturbances.

  12. What Is the Moral Imperative of Workplace Learning: Unlocking the DaVinci Code of Human Resource Development?

    ERIC Educational Resources Information Center

    Short, Tom

    2006-01-01

    In the course of the author's doctoral study, he is exploring the strategic linkages between learning activities in the modern workplace and the long-term success they bring to organisations. For many years, this challenge has been the Holy Grail of human resource (HR) development practitioners, who invest heavily on training and professional…

  13. VINCI: the VLT Interferometer commissioning instrument

    NASA Astrophysics Data System (ADS)

    Kervella, Pierre; Coudé du Foresto, Vincent; Glindemann, Andreas; Hofmann, Reiner

    2000-07-01

    The Very Large Telescope Interferometer (VLTI) is a complex system, made of a large number of separated elements. To prepare an early successful operation, it will require a period of extensive testing and verification to ensure that the many devices involved work properly together, and can produce meaningful data. This paper describes the concept chosen for the VLTI commissioning instrument, LEONARDO da VINCI, and details its functionalities. It is a fiber based two-way beam combiner, associated with an artificial star and an alignment verification unit. The technical commissioning of the VLTI is foreseen as a stepwise process: fringes will first be obtained with the commissioning instrument in an autonomous mode (no other parts of the VLTI involved); then the VLTI telescopes and optical trains will be tested in autocollimation; finally fringes will be observed on the sky.

  14. [Studies of vision by Leonardo da Vinci].

    PubMed

    Berggren, L

    2001-01-01

    Leonardo was an advocate of the intromission theory of vision. Light rays from the object to the eye caused visual perceptions which were transported to the brain ventricles via a hollow optic nerve. Leonardo introduced wax injections to explore the ventricular system. Perceptions were assumed to go to the "senso comune" in the middle (3rd) ventricle, also the seat of the soul. The processing station "imprensiva" in the anterior lateral horns, together with memory ("memoria") in the posterior (4th) ventricle, integrated the visual perceptions into visual experience. - Leonardo's sketches with circular lenses in the center of the eye reveal that his dependence on medieval optics prevailed over anatomical observations. Drawings of the anatomy of the sectioned eye are missing, although Leonardo had invented a new embedding technique: in order to dissect the eye without spilling its contents, the eye was first boiled in egg white and then cut. The procedure was repeated here and showed that the ovoid lens had become spherical after boiling. - Leonardo described how light rays were refracted and reflected in the eye, but his imperfect anatomy prevented a development of physiological optics. He was, however, the first to compare the eye with a pin-hole camera (camera obscura). Leonardo's drawings of the inverted pictures on the back wall of a camera obscura inspired its use as an instrument for artistic practice. The camera obscura was for centuries a model for explaining human vision.

  15. Tourism. Leonardo da Vinci Series: Good Practices.

    ERIC Educational Resources Information Center

    Commission of the European Communities, Brussels (Belgium). Directorate-General for Education and Culture.

    This brochure, part of a series about good practices in vocational training in the European Union, describes 10 projects that have promoted investment in human resources through training in the tourism sector to promote sustainable, or responsible, tourism. The projects and their countries of origin are as follows: (1) BEEFT, training of mobility…

  16. Automated Support for da Vinci Surgical System

    DTIC Science & Technology

    2011-05-01

    flexible and scalable client-server architecture to share data collected from different platforms (Figure 9). The system consists of • an HTTP web server with a secure web-based user interface and Simple Object Access Protocol (SOAP) end-point • a Java based application server based on the business...comparing performance data in the form of a dV-Trainer database. The web-server's interface will provide an alternative for those who do not have

  17. The PAKY, HERMES, AESOP, ZEUS, and da Vinci robotic systems.

    PubMed

    Kim, Hyung L; Schulam, Peter

    2004-11-01

    In 1965 Gordon Moore, cofounder of Intel Corporation, made his famous observation now known as Moore's law. He predicted that computing capacity will double every 18 to 24 months. Since then, Moore's law has held true; the number of transistors per integrated computer circuit has doubled every couple of years. This relentless advance in computer technology ensures future advances in robotic technology. The ultimate goal of robotics is to allow surgeons to perform difficult procedures with a level of precision and improved clinical outcomes not possible by conventional methods. Robotics has the potential to enable surgeons with various levels of surgical skill to achieve a uniform outcome. As long as urologists continue to embrace technological advances and incorporate beneficial technology into their practice, the outlook for patients remains bright.

  18. Possible role of DaVinci Robot in uterine transplantation.

    PubMed

    Iavazzo, Christos; Gkegkes, Ioannis D

    2015-01-01

    Minimally invasive surgery, specifically robotic surgery, has become a common technique used by gynecological surgeons over the last decade. The realization of the first human uterine transplantation opened new perspectives in the treatment of uterine agenesia or of infertility in women with a history of hysterectomy at a young age. A robot-assisted technique may enhance the safety of the procedure by facilitating the microvascular anastomosis, vaginal anastomosis, and ligament fixation. This study proposes the formation of a multicenter collaboration group to organize a protocol with the aim of clarifying the possible role of robotic surgery in uterine transplantation.

  19. Leonardo Da Vinci, the genius and the monsters. Casual encounters?

    PubMed

    Ciseri, Lorenzo Montemagno

    2014-01-01

    This article analyses Leonardo's interest in monsters and deformed reality, one of the lesser known aspects of his vast and multifaceted output. With the possible exception of his studies of physiognomy, relevant drawings, sketches and short stories represent a marginal aspect of his work, but they are nevertheless significant for historians of teratology. The purpose of this study is to provide a broad overview of the relationship between Leonardo and both the literature on mythological monsters and the reports on monstrous births that he either read about or witnessed personally. While aspects of his appreciation and attention to beauty and the pursuit of perfection and good proportions are the elements most emphasised in Leonardo's work, other no less interesting aspects related to deformity have been considered of marginal importance. My analysis will demonstrate that Leonardo approached the realm of monstrosity as if he considered abnormality a mirror of normality, deformity a mirror of harmony, and disease a mirror of health, as if to emphasise that, ultimately, it is the monster that gives the world the gift of normality. Two special cases of monstrosity are analysed: the famous monster of Ravenna, whose image was found among his papers, and a very rare case of parasitic conjoined twins (thoracopagus parasiticus) portrayed for the first time alive, probably in Florence, by Leonardo himself.

  20. Distance Learning. Leonardo da Vinci Series: Good Practices.

    ERIC Educational Resources Information Center

    Commission of the European Communities, Brussels (Belgium). Directorate-General for Education and Culture.

    This brochure, part of a series about good practices in vocational training in the European Union, describes 12 projects that use distance learning to promote lifelong learning in adults. The projects and their countries of origin are as follows: (1) 3D Project, training in the use of IT tools for 3D simulation and animation and practical…

  1. Scientific Aspects of Leonardo da Vinci's Drawings: An Interdisciplinary Model.

    ERIC Educational Resources Information Center

    Struthers, Sally A.

    While interdisciplinary courses can help demonstrate the relevance of learning to students and reinforce education from different fields, they can be difficult to implement and are often not cost effective. An interdisciplinary art history course at Ohio's Sinclair Community College incorporates science into the art history curriculum, making use…

  2. The Rosslyn Code: Can Physics Explain a 500-Year Old Melody Etched in the Walls of a Scottish Chapel?

    SciTech Connect

    Wilson, Chris

    2011-10-19

    For centuries, historians have puzzled over a series of 213 symbols carved into the stone of Scotland’s Rosslyn Chapel. (Disclaimer: You may recognize this chapel from The Da Vinci Code, but this is real and unrelated!) Several years ago, a composer and science enthusiast noticed that the symbols bore a striking similarity to Chladni patterns, the elegant images that form on a two-dimensional surface when it vibrates at certain frequencies. This man’s theory: A 500-year-old melody was inscribed in the chapel using the language of physics. But not everyone is convinced. Slate senior editor Chris Wilson travelled to Scotland to investigate the claims and listen to this mysterious melody, whatever it is. Come find out what he discovered, including images of the patterns and audio of the music they inspired.

  3. The rare DAT coding variant Val559 perturbs DA neuron function, changes behavior, and alters in vivo responses to psychostimulants.

    PubMed

    Mergy, Marc A; Gowrishankar, Raajaram; Gresch, Paul J; Gantz, Stephanie C; Williams, John; Davis, Gwynne L; Wheeler, C Austin; Stanwood, Gregg D; Hahn, Maureen K; Blakely, Randy D

    2014-11-04

    Despite the critical role of the presynaptic dopamine (DA) transporter (DAT, SLC6A3) in DA clearance and psychostimulant responses, evidence that DAT dysfunction supports risk for mental illness is indirect. Recently, we identified a rare, nonsynonymous Slc6a3 variant that produces the DAT substitution Ala559Val in two male siblings who share a diagnosis of attention-deficit hyperactivity disorder (ADHD), with other studies identifying the variant in subjects with bipolar disorder (BPD) and autism spectrum disorder (ASD). Previously, using transfected cell studies, we observed that although DAT Val559 displays normal total and surface DAT protein levels, and normal DA recognition and uptake, the variant transporter exhibits anomalous DA efflux (ADE) and lacks capacity for amphetamine (AMPH)-stimulated DA release. To pursue the significance of these findings in vivo, we engineered DAT Val559 knock-in mice, and here we demonstrate in this model the presence of elevated extracellular DA levels, altered somatodendritic and presynaptic D2 DA receptor (D2R) function, a blunted ability of DA terminals to support depolarization and AMPH-evoked DA release, and disruptions in basal and psychostimulant-evoked locomotor behavior. Together, our studies demonstrate an in vivo functional impact of the DAT Val559 variant, providing support for the ability of DAT dysfunction to impact risk for mental illness.

  4. Molecular cloning and sequence analysis of the gene coding for the 57kDa soluble antigen of the salmonid fish pathogen Renibacterium salmoninarum

    USGS Publications Warehouse

    Chien, Maw-Sheng; Gilbert , Teresa L.; Huang, Chienjin; Landolt, Marsha L.; O'Hara, Patrick J.; Winton, James R.

    1992-01-01

    The complete sequence coding for the 57-kDa major soluble antigen of the salmonid fish pathogen, Renibacterium salmoninarum, was determined. The gene contained an open reading frame of 1671 nucleotides coding for a protein of 557 amino acids with a calculated Mr value of 57190. The first 26 amino acids constituted a signal peptide. The deduced sequence for amino acid residues 27–61 was in agreement with the 35 N-terminal amino acid residues determined by microsequencing, suggesting the protein is synthesized as a 557-amino acid precursor and processed to produce a mature protein of Mr 54505. Two regions of the protein contained imperfect direct repeats. The first region contained two copies of an 81-residue repeat, the second contained five copies of an unrelated 25-residue repeat. Also, a perfect inverted repeat (including three in-frame UAA stop codons) was observed at the carboxyl-terminus of the gene.
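The lengths reported in the abstract are mutually consistent, as a short sketch shows. It only re-derives the codon count and the mature-protein length from the figures quoted above (1671-nucleotide reading frame, 26-residue signal peptide); nothing else is assumed:

```python
# Consistency check on the numbers quoted in the abstract:
# a 1671-nucleotide reading frame corresponds to 557 codons
# (3 nucleotides per codon), and removing the 26-residue signal
# peptide leaves a 531-residue mature protein.

orf_nt = 1671           # reading frame length from the abstract
signal_peptide_aa = 26  # signal peptide length from the abstract

codons = orf_nt // 3
mature_aa = codons - signal_peptide_aa

print(codons)     # 557
print(mature_aa)  # 531
```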

  5. Locus NMB0035 codes for a 47-kDa surface-accessible conserved antigen in Neisseria.

    PubMed

    Arenas, Jesús; Abel, Ana; Sánchez, Sandra; Alcalá, Belén; Criado, María T; Ferreirós, Carlos M

    2006-12-01

    A 47 kDa neisserial outer-membrane antigenic protein (P47) was purified to homogeneity and used to prepare polyclonal anti-P47 antisera. Protein P47 was identified by MALDI-TOF fingerprinting analysis as the hypothetical lipoprotein NMB0035. Two-dimensional diagonal SDS-PAGE results suggested that, contrary to previous findings, P47 is not strongly associated with other proteins in membrane complexes. Western blotting with the polyclonal monospecific serum showed that linear P47 epitopes were expressed in similar amounts in the 27 Neisseria meningitidis strains tested and, to a lesser extent, in commensal Neisseria, particularly N. lactamica. However, dot-blotting assays with the same serum demonstrated binding variability between meningococcal strains, indicating differences in surface accessibility or steric hindrance by other surface structures. Specific anti-P47 antibodies were bactericidal against the homologous strain but had variable activity against heterologous strains, consistent with the results from dot-blotting experiments. An in-depth study of P47 is necessary to evaluate its potential as a candidate for new vaccine designs.

  6. Novel dynamic information integration during da Vinci robotic partial nephrectomy and radical nephrectomy.

    PubMed

    Bhayani, Sam B; Snow, Devon C

    2008-07-01

    With the increasing discovery of small renal neoplasms, minimally invasive excisional approaches have become more popular. Robotic partial nephrectomy is an emerging procedure. During robotic renal surgery, the console surgeon often has a need to view images or other data during the surgical dissection. Herein, we describe the preliminary use of integrative surgical imaging in the console surgical view during 20 cases of robotic partial and radical nephrectomy. Integration of this technology, termed Tilepro, allows the surgeon to view data within the robotic console and thus prevents disengagement. The success rate of transmission was 95% and the usefulness of the transmission was 89%. Complications included delayed transmission and cabling issues. This technology is useful in robotic renal surgery and may have benefits in telepresence or other surgical fields.

  7. Back to the Drawing Board Reconstructing DaVinci's Vitruvian Man to Teach Anatomy

    ERIC Educational Resources Information Center

    Babaian, C.

    2009-01-01

    In today's high tech world, one hardly expects to see the original chalkboard or blackboard utilized in research, teaching, or scientific communication, but having spent an equal number of years doing both art and biology and dabbling in computer graphics, the author has found the simple technology of the chalkboard and chalk to have incredible…

  8. Social and Occupational Integration of Disadvantaged People. Leonardo da Vinci Good Practices Series.

    ERIC Educational Resources Information Center

    Commission of the European Communities, Brussels (Belgium). Directorate-General for Education and Culture.

    This document profiles nine European programs that exemplify good practice in social and occupational integration of disadvantaged people. The programs profiled are as follows: (1) Restaurant Venezia (a CD-ROM program to improve the reading and writing skills of young people in Luxembourg who have learning difficulties); (2) an integrated…

  9. Building Skills and Qualifications among SME Employees. Leonardo da Vinci Good Practices Series.

    ERIC Educational Resources Information Center

    Commission of the European Communities, Brussels (Belgium). Directorate-General for Education and Culture.

    This document profiles 10 European programs that exemplify good practice in building skills and qualifications among employees of small and medium enterprises (SMEs). The programs profiled are as follows: (1) TRICTSME (a program providing World Wide Web-based information and communication technologies training for SMEs in manufacturing); (2)…

  10. [From Leonardo Da Vinci to present days; from the history of antiplague costume].

    PubMed

    Kalmykov, A A; Aminev, R M; Korneev, A G; Polyakov, V S; Artebyakin, S V

    2016-01-01

    The special clothing that physicians in medieval Europe wore for protection in plague nidi can be considered a prototype of the antiplague costume. The inventor of the first antiplague costume is considered to be the French doctor Charles de Lorme (1619). Much later, in 1878, the Russian professor V.V. Pashutin proposed a costume that resembled a hermetically sealed "bag" with a special breathing device, aimed at protecting medical staff. Later, professor O.I. Dogel's respirator became well known (1889). At the beginning of the 20th century, a charcoal filter mask invented by N.D. Zelinsky was used as part of the antiplague costume. The requirements for using modern means of individual protection when working in nidi of especially dangerous infections are defined by sanitary-epidemiological rules, which cover laboratory workers' working and protective clothing, respiratory protection, and the types, operating features, and procedures for putting on, removing and disinfecting antiplague costumes, pneumocostumes, pneumohelmets, isolation suits, gas-protection boxes, etc.

  11. Moving towards Optimising Demand-Led Learning: The 2005-2007 ECUANET Leonardo Da Vinci Project

    ERIC Educational Resources Information Center

    Dealtry, Richard; Howard, Keith

    2008-01-01

    Purpose: The purpose of this paper is to present the key project learning points and outcomes as a guideline for the future quality management of demand-led learning and development. Design/methodology/approach: The research methodology was based upon a corporate university blueprint architecture and browser toolkit developed by a member of the…

  12. Depth of Monocular Elements in a Binocular Scene: The Conditions for da Vinci Stereopsis

    ERIC Educational Resources Information Center

    Cook, Michael; Gillam, Barbara

    2004-01-01

    Quantitative depth based on binocular resolution of visibility constraints is demonstrated in a novel stereogram representing an object, visible to 1 eye only, and seen through an aperture or camouflaged against a background. The monocular region in the display is attached to the binocular region, so that the stereogram represents an object which…

  13. Da Vincis Children Take Flight: Unmanned Aircraft Systems in the Homeland

    DTIC Science & Technology

    2014-03-01

    present, it is possible for UAS sensors and guidance systems to be hacked. In fact, in 2012, Iran hijacked a U.S. RQ-170 drone, claiming to have used...Yatish Yadav, "UAVs Prone to Hacking, Warn Intel Agencies," Indian Express, July 25, 2013. http...www.newindianexpress.com/nation/UAVs-prone-to-hacking-warn-intel-agencies/2013/07/25/article1700651.ece#.UwkD0mJdXhc

  14. Polar Codes

    DTIC Science & Technology

    2014-12-01

    Polar codes are compared with other forward error correction methods: a turbo code, a low density parity check (LDPC) code, a Reed–Solomon code, and three convolutional codes. Many civilian systems use LDPC FEC codes, and the Navy is planning to use LDPC for some future systems...

  15. Clinical coding. Code breakers.

    PubMed

    Mathieson, Steve

    2005-02-24

    --The advent of payment by results has seen the role of the clinical coder pushed to the fore in England. --Examinations for a clinical coding qualification began in 1999. In 2004, approximately 200 people took the qualification. --Trusts are attracting people to the role by offering training from scratch or through modern apprenticeships.

  16. Ethical coding.

    PubMed

    Resnik, Barry I

    2009-01-01

    It is ethical, legal, and proper for a dermatologist to maximize income through proper coding of patient encounters and procedures. The overzealous physician can misinterpret reimbursement requirements or receive bad advice from other physicians and cross the line from aggressive coding to coding fraud. Several of the more common problem areas are discussed.

  17. Sharing code.

    PubMed

    Kubilius, Jonas

    2014-01-01

    Sharing code is becoming increasingly important in the wake of Open Science. In this review I describe and compare two popular code-sharing utilities, GitHub and Open Science Framework (OSF). GitHub is a mature, industry-standard tool but lacks focus towards researchers. In comparison, OSF offers a one-stop solution for researchers but a lot of functionality is still under development. I conclude by listing alternative lesser-known tools for code and materials sharing.

  18. Leonardo da Vinci, visual perspective and the crystalline sphere (lens): if only Leonardo had had a freezer.

    PubMed

    Hilloowala, Rumy

    2004-06-01

    This study confirms Leonardo's claim to have experimented on the bovine eye to determine the internal anatomy of the eye. The experiment, as described by Leonardo, was repeated in our laboratory. The study further discusses Leonardo's primary interest in the study of the eye (especially the lens): to determine how the image of an object, which enters the eye in an inverted form, is righted. The study shows the evolution of Leonardo's understanding of the anatomy and the physiology of vision. Initially, in keeping with his reading of the literature, he placed the lens in the centre but made it globular. Later he promulgated two theories, reflection from the uvea and refraction within the lens, to explain reversal of the image in the eye. Subsequently he rejected the first theory and, putting credence in the second, experimented (1509) to show that the lens is globular and centrally placed. That present knowledge about the lens is at variance with his findings is not because he did not carry out the experiment, as suggested by some modern authors, but because of the limitations of the techniques available to him at the time.

  19. Sharing code

    PubMed Central

    Kubilius, Jonas

    2014-01-01

    Sharing code is becoming increasingly important in the wake of Open Science. In this review I describe and compare two popular code-sharing utilities, GitHub and Open Science Framework (OSF). GitHub is a mature, industry-standard tool but is not tailored to researchers' workflows. In comparison, OSF offers a one-stop solution for researchers, but much of its functionality is still under development. I conclude by listing alternative, lesser-known tools for code and materials sharing. PMID:25165519

  20. Speech coding

    SciTech Connect

    Ravishankar, C., Hughes Network Systems, Germantown, MD

    1998-05-08

    Speech is the predominant means of communication between human beings, and since the invention of the telephone by Alexander Graham Bell in 1876, speech services have remained the core service in almost all telecommunication systems. Original analog methods of telephony had the disadvantage that the speech signal was corrupted by noise, cross-talk and distortion. Long-haul transmissions, which use repeaters to compensate for the loss in signal strength on transmission links, also increase the associated noise and distortion. On the other hand, digital transmission is relatively immune to noise, cross-talk and distortion, primarily because the digital signal can be faithfully regenerated at each repeater purely on the basis of a binary decision. Hence end-to-end performance of the digital link essentially becomes independent of the length and operating frequency bands of the link, and from a transmission point of view digital transmission has been the preferred approach due to its higher immunity to noise. The need to carry digital speech became extremely important from a service-provision point of view as well. Modern requirements have introduced the need for robust, flexible and secure services that can carry a multitude of signal types (such as voice, data and video) without a fundamental change in infrastructure. Such a requirement could not have been easily met without the advent of digital transmission systems, thereby requiring speech to be coded digitally. The term Speech Coding often refers to techniques that represent or code speech signals either directly as a waveform or as a set of parameters obtained by analyzing the speech signal. In either case, the codes are transmitted to the distant end, where speech is reconstructed or synthesized using the received set of codes. A more generic term that is often used interchangeably with speech coding is the term voice coding. This term is more generic in the sense that the
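    The waveform-coding family described above can be illustrated with mu-law companding, the logarithmic compression curve used in G.711-style digital telephony. The sketch below shows only the continuous companding formula and a simple 8-bit rounding step, not the segmented bit format of the actual G.711 standard; the function names are illustrative.

```python
import math

MU = 255.0  # mu-law constant used in North American/Japanese PCM

def mu_compress(x: float) -> float:
    """Map a sample in [-1, 1] through the logarithmic companding curve."""
    return math.copysign(math.log1p(MU * abs(x)) / math.log1p(MU), x)

def mu_expand(y: float) -> float:
    """Invert the companding curve."""
    return math.copysign(math.expm1(abs(y) * math.log1p(MU)) / MU, y)

def quantize_8bit(x: float) -> float:
    """Compand, round to ~8 bits, expand: quiet samples keep fine resolution."""
    y = round(mu_compress(x) * 127) / 127
    return mu_expand(y)
```

Companding before quantization is why 8-bit telephone speech sounds acceptable: the logarithm allocates most of the quantizer's levels to the small amplitudes that dominate speech.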

  1. QR Codes

    ERIC Educational Resources Information Center

    Lai, Hsin-Chih; Chang, Chun-Yen; Li, Wen-Shiane; Fan, Yu-Lin; Wu, Ying-Tien

    2013-01-01

    This study presents an m-learning method that incorporates Integrated Quick Response (QR) codes. This learning method not only achieves the objectives of outdoor education, but it also increases applications of Cognitive Theory of Multimedia Learning (CTML) (Mayer, 2001) in m-learning for practical use in a diverse range of outdoor locations. When…

  2. TRACKING CODE DEVELOPMENT FOR BEAM DYNAMICS OPTIMIZATION

    SciTech Connect

    Yang, L.

    2011-03-28

    Dynamic aperture (DA) optimization with direct particle tracking is a straightforward approach when the computing power permits. It can include various realistic errors and is closer to reality than theoretical estimates. In this approach, a fast, parallel tracking code can be very helpful. In this presentation, we describe an implementation of the storage ring particle tracking code TESLA for beam dynamics optimization. It supports MPI-based parallel computing and is robust as a DA calculation engine. This code has been used in the NSLS-II dynamics optimizations and has shown promising performance.

  3. Error-correction coding

    NASA Technical Reports Server (NTRS)

    Hinds, Erold W. (Principal Investigator)

    1996-01-01

    This report describes the progress made towards the completion of a specific task on error-correcting coding. The proposed research consisted of investigating the use of modulation block codes as the inner code of a concatenated coding system in order to improve the overall space link communications performance. The study proposed to identify and analyze candidate codes that will complement the performance of the overall coding system which uses the interleaved RS (255,223) code as the outer code.

  4. Diagnostic Coding for Epilepsy.

    PubMed

    Williams, Korwyn; Nuwer, Marc R; Buchhalter, Jeffrey R

    2016-02-01

    Accurate coding is an important function of neurologic practice. This contribution to Continuum is part of an ongoing series that presents helpful coding information along with examples related to the issue topic. Tips for diagnosis coding, Evaluation and Management coding, procedure coding, or a combination are presented, depending on which is most applicable to the subject area of the issue.

  5. Model Children's Code.

    ERIC Educational Resources Information Center

    New Mexico Univ., Albuquerque. American Indian Law Center.

    The Model Children's Code was developed to provide a legally correct model code that American Indian tribes can use to enact children's codes that fulfill their legal, cultural and economic needs. Code sections cover the court system, jurisdiction, juvenile offender procedures, minor-in-need-of-care, and termination. Almost every Code section is…

  6. Phylogeny of genetic codes and punctuation codes within genetic codes.

    PubMed

    Seligmann, Hervé

    2015-03-01

    Punctuation codons (starts, stops) delimit genes and reflect translation apparatus properties. Most codon reassignments involve punctuation. Here two complementary approaches classify natural genetic codes: (A) properties of amino acids assigned to codons (classical phylogeny), coding stops as X (A1, antitermination/suppressor tRNAs insert unknown residues), or as gaps (A2, no translation, classical stop); and (B) considering only punctuation status (start, stop and other codons coded as -1, 0 and 1 (B1); 0, -1 and 1 (B2, reflects ribosomal translational dynamics); and 1, -1, and 0 (B3, starts/stops as opposites)). All methods separate most mitochondrial codes from most nuclear codes; Gracilibacteria consistently cluster with metazoan mitochondria; mitochondria co-hosted with chloroplasts cluster with nuclear codes. Method A1 clusters the euplotid nuclear code with metazoan mitochondria; A2 separates euplotids from mitochondria. Firmicute bacteria Mycoplasma/Spiroplasma and protozoan (and lower metazoan) mitochondria share codon-amino acid assignments. A1 clusters them with mitochondria; they cluster with the standard genetic code under A2: constraints on amino acid ambiguity versus punctuation-signaling produced the mitochondrial versus bacterial versions of this genetic code. Punctuation analysis B2 converges best with classical phylogenetic analyses, stressing the need for a unified theory of genetic code punctuation accounting for ribosomal constraints.
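    The punctuation-only encodings (method B) are easy to reproduce. The sketch below builds B1 vectors (start = -1, stop = 0, other = 1) for deliberately simplified versions of the standard and vertebrate mitochondrial codes and counts the codons whose punctuation status differs; the start/stop sets are reduced for illustration (real codes admit further alternative starts), so the distance is illustrative rather than the paper's actual input.

```python
from itertools import product

CODONS = ["".join(c) for c in product("TCAG", repeat=3)]  # all 64 codons

def b1_vector(starts, stops):
    """Method B1: code each codon as start=-1, stop=0, other=1."""
    return [-1 if c in starts else 0 if c in stops else 1 for c in CODONS]

# Simplified punctuation sets for two genetic codes
standard = b1_vector({"ATG"}, {"TAA", "TAG", "TGA"})
vert_mito = b1_vector({"ATG", "ATA"}, {"TAA", "TAG", "AGA", "AGG"})

# Number of codons whose punctuation status differs between the two codes;
# such per-pair distances feed the clustering described in the abstract.
distance = sum(a != b for a, b in zip(standard, vert_mito))
```

Under these sets the codes differ at TGA, ATA, AGA and AGG, the classic mitochondrial reassignments.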

  7. Accumulate repeat accumulate codes

    NASA Technical Reports Server (NTRS)

    Abbasfar, Aliazam; Divsalar, Dariush; Yao, Kung

    2004-01-01

    In this paper we propose an innovative channel coding scheme called 'Accumulate Repeat Accumulate codes' (ARA). This class of codes can be viewed as serial turbo-like codes, or as a subclass of Low Density Parity Check (LDPC) codes, so belief propagation can be used for iterative decoding of ARA codes on a graph. The encoder structure for this class can be viewed as a precoded Repeat Accumulate (RA) code or as a precoded Irregular Repeat Accumulate (IRA) code, where an accumulator is simply chosen as the precoder. Thus ARA codes have a simple and very fast encoder structure when representing LDPC codes. Based on density evolution for LDPC codes through some examples for ARA codes, we show that for maximum variable node degree 5, a minimum bit SNR as low as 0.08 dB from channel capacity can be achieved for rate 1/2 as the block size goes to infinity. Thus, for a fixed low maximum variable node degree, its threshold outperforms not only the RA and IRA codes but also the best known LDPC codes with the same maximum node degree. Furthermore, by puncturing the accumulators, any desired high-rate codes close to code rate 1 can be obtained with thresholds that stay uniformly close to the channel capacity thresholds. Iterative decoding simulation results are provided. The ARA codes also have a projected graph or protograph representation that allows for high-speed decoder implementation.
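    The encoder chain named in the abstract (accumulator precoder, repetition, interleaver, outer accumulator) is simple enough to sketch. The toy rate-1/q encoder below treats each accumulator as a running mod-2 sum; it omits the puncturing and any systematic transmission discussed in the paper, and all names are illustrative.

```python
from typing import List, Optional

def accumulate(bits: List[int]) -> List[int]:
    """Running mod-2 sum: the accumulator used twice in an ARA encoder."""
    out, acc = [], 0
    for b in bits:
        acc ^= b
        out.append(acc)
    return out

def ara_encode(info: List[int], q: int = 3,
               perm: Optional[List[int]] = None) -> List[int]:
    """Toy rate-1/q ARA encoder: accumulate, repeat, interleave, accumulate."""
    pre = accumulate(info)                    # precoder (inner accumulator)
    rep = [b for b in pre for _ in range(q)]  # repeat each bit q times
    if perm is None:
        perm = list(range(len(rep)))          # identity interleaver by default
    inter = [rep[p] for p in perm]            # interleave
    return accumulate(inter)                  # outer accumulator
```

In a real design the interleaver permutation and puncturing pattern are what the density-evolution analysis optimizes; here they are placeholders.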

  8. Concatenated Coding Using Trellis-Coded Modulation

    NASA Technical Reports Server (NTRS)

    Thompson, Michael W.

    1997-01-01

    In the late seventies and early eighties a technique known as Trellis Coded Modulation (TCM) was developed for providing spectrally efficient error correction coding. Instead of adding redundant information in the form of parity bits, redundancy is added at the modulation stage, thereby increasing bandwidth efficiency. A digital communications system can be designed to use bandwidth-efficient multilevel/phase modulation such as Amplitude Shift Keying (ASK), Phase Shift Keying (PSK), Differential Phase Shift Keying (DPSK) or Quadrature Amplitude Modulation (QAM). Performance gain can be achieved by increasing the number of signals over the corresponding uncoded system to compensate for the redundancy introduced by the code. A considerable amount of research and development has been devoted to developing good TCM codes for severely bandlimited applications. More recently, the use of TCM for satellite and deep space communications applications has received increased attention. This report describes the general approach of using a concatenated coding scheme that features TCM and RS coding. Results have indicated that substantial (6-10 dB) performance gains can be achieved with this approach with comparatively little bandwidth expansion. Since all of the bandwidth expansion is due to the RS code, TCM-based concatenated coding results in roughly 10-50% bandwidth expansion, compared to 70-150% expansion for similar concatenated schemes which use a convolutional code. We stress that combined coding and modulation optimization is important for achieving performance gains while maintaining spectral efficiency.

  9. Coset Codes Viewed as Terminated Convolutional Codes

    NASA Technical Reports Server (NTRS)

    Fossorier, Marc P. C.; Lin, Shu

    1996-01-01

    In this paper, coset codes are considered as terminated convolutional codes. Based on this approach, three new general results are presented. First, it is shown that the iterative squaring construction can equivalently be defined from a convolutional code whose trellis terminates. This convolutional code determines a simple encoder for the coset code considered, and the state and branch labelings of the associated trellis diagram become straightforward. Also, from the generator matrix of the code in its convolutional code form, much information about the trade-off between the state connectivity and complexity at each section, and the parallel structure of the trellis, is directly available. Based on this generator matrix, it is shown that the parallel branches in the trellis diagram of the convolutional code represent the same coset code C(sub 1), of smaller dimension and shorter length. Utilizing this fact, a two-stage optimum trellis decoding method is devised. The first stage decodes C(sub 1), while the second stage decodes the associated convolutional code, using the branch metrics delivered by stage 1. Finally, a bidirectional decoding of each received block starting at both ends is presented. If about the same number of computations is required, this approach remains very attractive from a practical point of view as it roughly doubles the decoding speed. This fact is particularly interesting whenever the second half of the trellis is the mirror image of the first half, since the same decoder can be implemented for both parts.

  10. Discussion on LDPC Codes and Uplink Coding

    NASA Technical Reports Server (NTRS)

    Andrews, Ken; Divsalar, Dariush; Dolinar, Sam; Moision, Bruce; Hamkins, Jon; Pollara, Fabrizio

    2007-01-01

    This slide presentation reviews the progress of the workgroup on Low-Density Parity-Check (LDPC) codes for space link coding. The workgroup is tasked with developing and recommending new error correcting codes for near-Earth, Lunar, and deep space applications. Included in the presentation is a summary of the technical progress of the workgroup. Charts that show the LDPC decoder sensitivity to symbol scaling errors are reviewed, as well as a chart showing the performance of several frame synchronizer algorithms compared to that of some good codes, and LDPC decoder tests at ESTL. Also reviewed is a study on Coding, Modulation, and Link Protocol (CMLP), and the recommended codes. A design for the Pseudo-Randomizer with LDPC Decoder and CRC is also reviewed. A chart that summarizes the three proposed coding systems is also presented.

  11. Bar Codes for Libraries.

    ERIC Educational Resources Information Center

    Rahn, Erwin

    1984-01-01

    Discusses the evolution of standards for bar codes (series of printed lines and spaces that represent numbers, symbols, and/or letters of the alphabet) and describes the two types most frequently adopted by libraries--Code-A-Bar and CODE 39. Format of the codes is illustrated. Six references and definitions of terminology are appended. (EJS)
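    Code 39's optional checksum illustrates how such symbologies work arithmetically: each of the 43 characters has a numeric value (0-9, A-Z, then "-", ".", space, "$", "/", "+", "%"), and the check character is the sum of the data values reduced mod 43. The sketch below shows only this check-digit arithmetic, not the bar/space patterns themselves.

```python
# Code 39 character set in standard value order (values 0..42)
CODE39 = "0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ-. $/+%"

def code39_check_digit(data: str) -> str:
    """Mod-43 check character: sum the character values, reduce mod 43."""
    return CODE39[sum(CODE39.index(ch) for ch in data) % 43]
```

For example, "CODE39" sums to 75, and 75 mod 43 = 32, which is the value of "W".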

  12. Manually operated coded switch

    DOEpatents

    Barnette, Jon H.

    1978-01-01

    The disclosure relates to a manually operated recodable coded switch in which a code may be inserted, tried and used to actuate a lever controlling an external device. After attempting a code, the switch's code wheels must be returned to their zero positions before another try is made.

  13. QR Codes 101

    ERIC Educational Resources Information Center

    Crompton, Helen; LaFrance, Jason; van 't Hooft, Mark

    2012-01-01

    A QR (quick-response) code is a two-dimensional scannable code, similar in function to a traditional bar code that one might find on a product at the supermarket. The main difference between the two is that, while a traditional bar code can hold a maximum of only 20 digits, a QR code can hold up to 7,089 characters, so it can contain much more…

  14. ARA type protograph codes

    NASA Technical Reports Server (NTRS)

    Divsalar, Dariush (Inventor); Abbasfar, Aliazam (Inventor); Jones, Christopher R. (Inventor); Dolinar, Samuel J. (Inventor); Thorpe, Jeremy C. (Inventor); Andrews, Kenneth S. (Inventor); Yao, Kung (Inventor)

    2008-01-01

    An apparatus and method for encoding low-density parity check codes. Together with a repeater, an interleaver and an accumulator, the apparatus comprises a precoder, thus forming accumulate-repeat-accumulate (ARA) codes. Protographs representing various types of ARA codes, including AR3A, AR4A and ARJA codes, are described. High performance is obtained when compared to the performance of current repeat-accumulate (RA) or irregular-repeat-accumulate (IRA) codes.

  15. Efficient entropy coding for scalable video coding

    NASA Astrophysics Data System (ADS)

    Choi, Woong Il; Yang, Jungyoup; Jeon, Byeungwoo

    2005-10-01

    The standardization for the scalable extension of H.264 has called for additional functionality based on the H.264 standard to support combined spatio-temporal and SNR scalability. For the entropy coding of the H.264 scalable extension, the Context-based Adaptive Binary Arithmetic Coding (CABAC) scheme has been considered so far. In this paper, we present a new context modeling scheme that uses inter-layer correlation between syntax elements. As a result, it improves the coding efficiency of entropy coding in the H.264 scalable extension. In simulations applying the proposed scheme to encoding the syntax element mb_type, the improvement in coding efficiency is up to 16% in terms of bit savings, owing to the estimation of a more adequate probability model.

  16. An Interferometric Study of the Fomalhaut Inner Debris Disk. I. Near-Infrared Detection of Hot Dust with VLTI/VINCI

    NASA Astrophysics Data System (ADS)

    Absil, Olivier; Mennesson, Bertrand; Le Bouquin, Jean-Baptiste; Di Folco, Emmanuel; Kervella, Pierre; Augereau, Jean-Charles

    2009-10-01

    The innermost parts of dusty debris disks around main-sequence stars are currently poorly known due to the high contrast and small angular separation with their parent stars. Using near-infrared interferometry, we aim to detect the signature of hot dust around the nearby A4 V star Fomalhaut, which has already been suggested to harbor a warm dust population in addition to a cold dust ring located at about 140 AU. Archival data obtained with the VINCI instrument at the VLTI are used to study the fringe visibility of the Fomalhaut system at projected baseline lengths ranging from 4 m to 140 m in the K band. A significant visibility deficit is observed at short baselines with respect to the expected visibility of the sole stellar photosphere. This is interpreted as the signature of resolved circumstellar emission, producing a relative flux of 0.88% ± 0.12% with respect to the stellar photosphere. While our interferometric data cannot directly constrain the morphology of the excess emission source, complementary data from the literature allow us to discard an off-axis point-like object as the source of circumstellar emission. We argue that the thermal emission from hot dusty grains located within 6 AU from Fomalhaut is the most plausible explanation for the detected excess. Our study also provides a revised limb-darkened diameter for Fomalhaut (θLD = 2.223 ± 0.022 mas), taking into account the effect of the resolved circumstellar emission. Based on observations made with ESO Telescopes at the Paranal Observatory (public VINCI commissioning data).

  17. [A Case of Advanced Rectal Cancer in Which Combined Prostate Removal and ISR Using the da Vinci Surgical System with Preoperative Chemotherapy Allowed Curative Resection].

    PubMed

    Kawakita, Hideaki; Katsumata, Kenji; Kasahara, Kenta; Kuwabara, Hiroshi; Shigoka, Masatoshi; Matsudo, Takaaki; Enomoto, Masanobu; Ishizaki, Tetsuo; Hisada, Masayuki; Kasuya, Kazuhiko; Tsuchida, Akihiko

    2016-11-01

    A 53-year-old male presented with a chief complaint of dyschezia. Lower gastrointestinal endoscopy confirmed the presence of a type II tumor in the lower part of the rectum, and a biopsy detected a well-differentiated adenocarcinoma. As invasion of the prostate and the levator ani muscle was suspected on diagnostic imaging, surgery was performed after preoperative chemotherapy. With no clear postoperative complications, the patient was discharged 26 days after surgery. After 24 months, the number of urinations ranged from 1 to 6, with a Wexner score of 6 and a mild desire to urinate in the absence of incontinence. At present, the patient is alive without recurrence. When combined with chemotherapy, robot-assisted surgery allows the curative resection of extensive rectal cancer with suspected invasion of other organs. In this respect, it is likely to be a useful method for conserving anal and bladder function.

  18. Imaging atherosclerosis with hybrid [18F]fluorodeoxyglucose positron emission tomography/computed tomography imaging: what Leonardo da Vinci could not see.

    PubMed

    Cocker, Myra S; Mc Ardle, Brian; Spence, J David; Lum, Cheemun; Hammond, Robert R; Ongaro, Deidre C; McDonald, Matthew A; Dekemp, Robert A; Tardif, Jean-Claude; Beanlands, Rob S B

    2012-12-01

    Prodigious efforts and landmark discoveries have led to significant advances in our understanding of atherosclerosis. Despite these efforts, atherosclerosis continues to be a leading cause of mortality and reduced quality of life globally. With surges in the prevalence of obesity and diabetes, atherosclerosis is expected to have an even more pronounced impact upon the global burden of disease. It is imperative to develop strategies for the early detection of disease. Positron emission tomography (PET) imaging utilizing [(18)F]fluorodeoxyglucose (FDG) may provide a non-invasive means of characterizing inflammatory activity within atherosclerotic plaque, thus serving as a surrogate biomarker for detecting vulnerable plaque. The aim of this review is to explore the rationale for performing FDG imaging, provide an overview of the mechanism of action, and summarize findings from the early application of FDG PET imaging in the clinical setting to evaluate vascular disease. Alternative imaging biomarkers and approaches are briefly discussed.

  19. Honesty and Honor Codes.

    ERIC Educational Resources Information Center

    McCabe, Donald; Trevino, Linda Klebe

    2002-01-01

    Explores the rise in student cheating and evidence that students cheat less often at schools with an honor code. Discusses effective use of such codes and creation of a peer culture that condemns dishonesty. (EV)

  20. Cellulases and coding sequences

    DOEpatents

    Li, Xin-Liang; Ljungdahl, Lars G.; Chen, Huizhong

    2001-02-20

    The present invention provides three fungal cellulases, their coding sequences, recombinant DNA molecules comprising the cellulase coding sequences, recombinant host cells and methods for producing same. The present cellulases are from Orpinomyces PC-2.

  1. Cellulases and coding sequences

    DOEpatents

    Li, Xin-Liang; Ljungdahl, Lars G.; Chen, Huizhong

    2001-01-01

    The present invention provides three fungal cellulases, their coding sequences, recombinant DNA molecules comprising the cellulase coding sequences, recombinant host cells and methods for producing same. The present cellulases are from Orpinomyces PC-2.

  2. QR Code Mania!

    ERIC Educational Resources Information Center

    Shumack, Kellie A.; Reilly, Erin; Chamberlain, Nik

    2013-01-01

    space, has error-correction capacity, and can be read from any direction. These codes are used in manufacturing, shipping, and marketing, as well as in education. QR codes can be created to produce…

  3. DIANE multiparticle transport code

    NASA Astrophysics Data System (ADS)

    Caillaud, M.; Lemaire, S.; Ménard, S.; Rathouit, P.; Ribes, J. C.; Riz, D.

    2014-06-01

    DIANE is the general Monte Carlo code developed at CEA-DAM. DIANE is a 3D multiparticle multigroup code. DIANE includes automated biasing techniques and is optimized for massive parallel calculations.

  4. EMF wire code research

    SciTech Connect

    Jones, T.

    1993-11-01

    This paper examines the results of previous wire code research to determine the relationship between wire codes, electromagnetic fields, and childhood cancer. The paper suggests that, in the original Savitz study, the selection procedure created biases toward producing a false positive association between high wire codes and childhood cancer.

  5. Universal Noiseless Coding Subroutines

    NASA Technical Reports Server (NTRS)

    Schlutsmeyer, A. P.; Rice, R. F.

    1986-01-01

    The software package consists of FORTRAN subroutines that perform universal noiseless coding and decoding of integer and binary data strings. The purpose of this type of coding is to achieve data compression in the sense that the coded data represent the original data perfectly (noiselessly) while taking fewer bits to do so. The routines are universal because they apply to virtually any "real-world" data source.
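    The Rice algorithm family behind these subroutines is built on a simple primitive, the Golomb-Rice code with parameter k: an integer n is written as the quotient n >> k in unary followed by k remainder bits. The sketch below shows only this split-sample primitive; the adaptive option selection of the full Rice coder, and the FORTRAN packaging, are omitted, and the names are illustrative.

```python
def rice_encode(n: int, k: int) -> str:
    """Golomb-Rice code: quotient n >> k in unary, then k remainder bits."""
    q, r = n >> k, n & ((1 << k) - 1)
    return "1" * q + "0" + (format(r, f"0{k}b") if k else "")

def rice_decode(bits: str, k: int):
    """Decode one codeword; return (value, remaining bits)."""
    i = 0
    while bits[i] == "1":   # count the unary quotient
        i += 1
    q = i
    i += 1                  # skip the terminating 0
    r = int(bits[i:i + k], 2) if k else 0
    return (q << k) | r, bits[i + k:]
```

Small k suits sources dominated by small values; larger k bounds the unary part for noisier data, which is exactly the trade-off the adaptive Rice coder manages.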

  6. Mapping Local Codes to Read Codes.

    PubMed

    Bonney, Wilfred; Galloway, James; Hall, Christopher; Ghattas, Mikhail; Tramma, Leandro; Nind, Thomas; Donnelly, Louise; Jefferson, Emily; Doney, Alexander

    2017-01-01

    Background & Objectives: Legacy laboratory test codes make it difficult to use clinical datasets for meaningful translational research, where populations are followed for disease risk and outcomes over many years. The Health Informatics Centre (HIC) at the University of Dundee hosts continuous biochemistry data from the clinical laboratories in Tayside and Fife dating back as far as 1987. However, the HIC-managed biochemistry dataset is coupled with incoherent sample types and unstandardised legacy local test codes, which increases the complexity of using the dataset for population health outcomes research. The objective of this study was to map the legacy local test codes to the Scottish 5-byte Version 2 Read Codes using biochemistry data extracted from the repository of the Scottish Care Information (SCI) Store.

  7. Software Certification - Coding, Code, and Coders

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Holzmann, Gerard J.

    2011-01-01

    We describe a certification approach for software development that has been adopted at our organization. JPL develops robotic spacecraft for the exploration of the solar system. The flight software that controls these spacecraft is considered to be mission critical. We argue that the goal of a software certification process cannot be the development of "perfect" software, i.e., software that can be formally proven to be correct under all imaginable and unimaginable circumstances. More realistically, the goal is to guarantee a software development process that is conducted by knowledgeable engineers, who follow generally accepted procedures to control known risks, while meeting agreed upon standards of workmanship. We target three specific issues that must be addressed in such a certification procedure: the coding process, the code that is developed, and the skills of the coders. The coding process is driven by standards (e.g., a coding standard) and tools. The code is mechanically checked against the standard with the help of state-of-the-art static source code analyzers. The coders, finally, are certified in on-site training courses that include formal exams.

  8. Cryptographer

    ERIC Educational Resources Information Center

    Sullivan, Megan

    2005-01-01

    For the general public, the field of cryptography has recently become famous as the method used to uncover secrets in Dan Brown's fictional bestseller, The Da Vinci Code. But the science of cryptography has been popular for centuries--secret hieroglyphics discovered in Egypt suggest that code-making dates back almost 4,000 years. In today's…

  9. [The "myologie dynamique" by Girolamo Fabrizi da Aquapendente in the scientific language in the Renaissance age (XVI-XVII)].

    PubMed

    Stroppiana, L

    1989-01-01

    Beginning in the XV century, mechanical materialism evolved within scientific doctrine into a "biological mechanics". Among the greatest exponents of this new current were two Italians, Leonardo da Vinci (1452-1519) and Girolamo Fabrizi da Acquapendente (1533-1619). Following the direction set by Leonardo, myology, instead of being a static science, took on a dynamic meaning and valence. Later, Fabrizi resumed and investigated the subject, above all in its less known expression, elaborating an original theory. With Acquapendente, anatomy lost its merely descriptive character and evolved into analysis of structure in connection with function. Moreover, he set the syllogism against mechanical language and mathematical formulation. A new scientific path would later be marked out by Galileo Galilei in the field of physics and by Giovanni Alfonso Borelli in biology.

  10. XSOR codes users manual

    SciTech Connect

    Jow, Hong-Nian; Murfin, W.B.; Johnson, J.D.

    1993-11-01

    This report describes the source term estimation codes, XSORs. The codes are written for three pressurized water reactors (Surry, Sequoyah, and Zion) and two boiling water reactors (Peach Bottom and Grand Gulf). The ensemble of codes has been named ``XSOR``. The purpose of the XSOR codes is to estimate the source terms which would be released to the atmosphere in severe accidents. A source term includes the release fractions of several radionuclide groups, the timing and duration of releases, the rates of energy release, and the elevation of releases. The codes have been developed by Sandia National Laboratories for the US Nuclear Regulatory Commission (NRC) in support of the NUREG-1150 program. The XSOR codes are fast-running parametric codes and are used as surrogates for detailed mechanistic codes. The XSOR codes also provide the capability to explore phenomena and their uncertainty which are not currently modeled by the mechanistic codes. The uncertainty distributions of input parameters may be used by an XSOR code to estimate the uncertainty of source terms.

  11. DLLExternalCode

    SciTech Connect

    Greg Flach, Frank Smith

    2014-05-14

    DLLExternalCode is a general dynamic-link library (DLL) interface for linking GoldSim (www.goldsim.com) with external codes. The overall concept is to use GoldSim as top-level modeling software with interfaces to external codes for specific calculations. The DLLExternalCode DLL that performs the linking function is designed to take a list of code inputs from GoldSim, create an input file for the external application, run the external code, and return a list of outputs, read from files created by the external application, back to GoldSim. Instructions for creating the input file, running the external code, and reading the output are contained in an instructions file that is read and interpreted by the DLL.
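    The write-inputs/run/read-outputs cycle described here is a common coupling pattern. A minimal Python sketch of the pattern follows; the file names `inputs.txt`/`outputs.txt` and the function name are hypothetical illustrations, not DLLExternalCode's actual interface, which is defined by its instructions file.

```python
import os
import subprocess
from typing import List, Sequence

def run_external(cmd: Sequence[str], inputs: Sequence[float],
                 workdir: str) -> List[float]:
    """Write an input file, run the external application, read its output file."""
    # 1. Serialize the driver's inputs for the external application
    with open(os.path.join(workdir, "inputs.txt"), "w") as f:
        f.write("\n".join(str(x) for x in inputs))
    # 2. Run the external code in its working directory
    subprocess.run(cmd, cwd=workdir, check=True)
    # 3. Read back the outputs the external application produced
    with open(os.path.join(workdir, "outputs.txt")) as f:
        return [float(line) for line in f if line.strip()]
```

File-based coupling like this trades speed for robustness: the two programs share no memory or ABI, so either side can be replaced independently.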

  12. Defeating the coding monsters.

    PubMed

    Colt, Ross

    2007-02-01

    Accuracy in coding is rapidly becoming a required skill for military health care providers. Clinic staffing, equipment purchase decisions, and even reimbursement will soon be based on the coding data that we provide. Learning the myriad rules required to code accurately can seem overwhelming. However, the majority of clinic visits in a typical outpatient clinic generally fall into two major evaluation and management codes, 99213 and 99214. If health care providers can learn the rules required to code a 99214 visit, then this will provide a 90% solution that can enable them to accurately code the majority of their clinic visits. This article demonstrates a step-by-step method to code a 99214 visit, by viewing each of the three requirements as a monster to be defeated.

  13. Mechanical code comparator

    DOEpatents

    Peter, Frank J.; Dalton, Larry J.; Plummer, David W.

    2002-01-01

    A new class of mechanical code comparators is described which have broad potential for application in safety, surety, and security applications. These devices can be implemented as micro-scale electromechanical systems that isolate a secure or otherwise controlled device until an access code is entered. This access code is converted into a series of mechanical inputs to the mechanical code comparator, which compares the access code to a pre-input combination, entered previously into the mechanical code comparator by an operator at the system security control point. These devices provide extremely high levels of robust security. Being totally mechanical in operation, an access control system properly based on such devices cannot be circumvented by software attack alone.

  14. More box codes

    NASA Technical Reports Server (NTRS)

    Solomon, G.

    1992-01-01

    A new investigation shows that, starting from the BCH (21,15;3) code represented as a 7 x 3 matrix and adding a row and column to add even parity, one obtains an 8 x 4 matrix (32,15;8) code. An additional dimension is obtained by specifying odd parity on the rows and even parity on the columns, i.e., adjoining to the 8 x 4 matrix, the matrix, which is zero except for the fourth column (of all ones). Furthermore, any seven rows and three columns will form the BCH (21,15;3) code. This box code has the same weight structure as the quadratic residue and BCH codes of the same dimensions. Whether there exists an algebraic isomorphism to either code is as yet unknown.
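    The first construction step described here, adding an even-parity row and column to a 7 x 3 array to get an 8 x 4 array, can be sketched generically. The function below extends any 7 x 3 binary matrix this way; it demonstrates the parity extension only, applied to an arbitrary matrix rather than actual BCH (21,15;3) codewords, and does not reproduce the odd/even mixed-parity dimension or the weight-structure claims.

```python
from typing import List

def extend_parity(m: List[List[int]]) -> List[List[int]]:
    """Append an even-parity bit to each row, then an even-parity row."""
    rows = [row + [sum(row) % 2] for row in m]                       # 7x3 -> 7x4
    parity_row = [sum(r[j] for r in rows) % 2                        # 7x4 -> 8x4
                  for j in range(len(rows[0]))]
    return rows + [parity_row]
```

Because every row of the extended matrix has even parity, the appended parity row itself also has even parity, so the corner bit is consistent from both directions.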

  15. Generating code adapted for interlinking legacy scalar code and extended vector code

    DOEpatents

    Gschwind, Michael K

    2013-06-04

    Mechanisms for intermixing code are provided. Source code is received for compilation using an extended Application Binary Interface (ABI) that extends a legacy ABI and uses a different register configuration than the legacy ABI. First compiled code is generated based on the source code, the first compiled code comprising code for accommodating the difference in register configurations used by the extended ABI and the legacy ABI. The first compiled code and second compiled code are intermixed to generate intermixed code, the second compiled code being compiled code that uses the legacy ABI. The intermixed code comprises at least one call instruction that is one of a call from the first compiled code to the second compiled code or a call from the second compiled code to the first compiled code. The code for accommodating the difference in register configurations is associated with the at least one call instruction.

  16. Industrial Computer Codes

    NASA Technical Reports Server (NTRS)

    Shapiro, Wilbur

    1996-01-01

    This is an overview of new and updated industrial codes for seal design and testing. GCYLT (gas cylindrical seals -- turbulent), SPIRALI (spiral-groove seals -- incompressible), KTK (knife to knife) Labyrinth Seal Code, and DYSEAL (dynamic seal analysis) are covered. GCYLT uses G-factors for Poiseuille and Couette turbulence coefficients. SPIRALI is updated to include turbulence and inertia, but maintains the narrow groove theory. The KTK labyrinth seal code handles straight or stepped seals, and DYSEAL provides dynamics for the seal geometry.

  17. Phonological coding during reading

    PubMed Central

    Leinenger, Mallorie

    2014-01-01

    The exact role that phonological coding (the recoding of written, orthographic information into a sound based code) plays during silent reading has been extensively studied for more than a century. Despite the large body of research surrounding the topic, varying theories as to the time course and function of this recoding still exist. The present review synthesizes this body of research, addressing the topics of time course and function in tandem. The varying theories surrounding the function of phonological coding (e.g., that phonological codes aid lexical access, that phonological codes aid comprehension and bolster short-term memory, or that phonological codes are largely epiphenomenal in skilled readers) are first outlined, and the time courses that each maps onto (e.g., that phonological codes come online early (pre-lexical) or that phonological codes come online late (post-lexical)) are discussed. Next the research relevant to each of these proposed functions is reviewed, discussing the varying methodologies that have been used to investigate phonological coding (e.g., response time methods, reading while eyetracking or recording EEG and MEG, concurrent articulation) and highlighting the advantages and limitations of each with respect to the study of phonological coding. In response to the view that phonological coding is largely epiphenomenal in skilled readers, research on the use of phonological codes in prelingually, profoundly deaf readers is reviewed. Finally, implications for current models of word identification (activation-verification model (Van Orden, 1987), dual-route model (e.g., Coltheart, Rastle, Perry, Langdon, & Ziegler, 2001), parallel distributed processing model (Seidenberg & McClelland, 1989)) are discussed. PMID:25150679

  18. Tokamak Systems Code

    SciTech Connect

    Reid, R.L.; Barrett, R.J.; Brown, T.G.; Gorker, G.E.; Hooper, R.J.; Kalsi, S.S.; Metzler, D.H.; Peng, Y.K.M.; Roth, K.E.; Spampinato, P.T.

    1985-03-01

    The FEDC Tokamak Systems Code calculates tokamak performance, cost, and configuration as a function of plasma engineering parameters. This version of the code models experimental tokamaks. It does not currently consider tokamak configurations that generate electrical power or incorporate breeding blankets. The code has a modular (or subroutine) structure to allow independent modeling for each major tokamak component or system. A primary benefit of modularization is that a component module may be updated without disturbing the remainder of the systems code as long as the input to or output from the module remains unchanged.

  19. Topological subsystem codes

    SciTech Connect

    Bombin, H.

    2010-03-15

    We introduce a family of two-dimensional (2D) topological subsystem quantum error-correcting codes. The gauge group is generated by two-local Pauli operators, so that two-local measurements are enough to recover the error syndrome. We study the computational power of code deformation in these codes and show that boundaries cannot be introduced in the usual way. In addition, we give a general mapping connecting suitable classical statistical mechanical models to optimal error correction in subsystem stabilizer codes that suffer from depolarizing noise.

  20. FAA Smoke Transport Code

    SciTech Connect

    Domino, Stefan; Luketa-Hanlin, Anay; Gallegos, Carlos

    2006-10-27

    FAA Smoke Transport Code, a physics-based Computational Fluid Dynamics tool, which couples heat, mass, and momentum transfer, has been developed to provide information on smoke transport in cargo compartments with various geometries and flight conditions. The software package contains a graphical user interface for specification of geometry and boundary conditions, analysis module for solving the governing equations, and a post-processing tool. The current code was produced by making substantial improvements and additions to a code obtained from a university. The original code was able to compute steady, uniform, isothermal turbulent pressurization. In addition, a preprocessor and postprocessor were added to arrive at the current software package.

  1. Transonic airfoil codes

    NASA Technical Reports Server (NTRS)

    Garabedian, P. R.

    1979-01-01

    Computer codes for the design and analysis of transonic airfoils are considered. The design code relies on the method of complex characteristics in the hodograph plane to construct shockless airfoils. The analysis code uses artificial viscosity to calculate flows with weak shock waves at off-design conditions. Comparisons with experiments show that an excellent simulation of two dimensional wind tunnel tests is obtained. The codes have been widely adopted by the aircraft industry as a tool for the development of supercritical wing technology.

  2. Fast Coding Unit Encoding Mechanism for Low Complexity Video Coding

    PubMed Central

    Wu, Yueying; Jia, Kebin; Gao, Guandong

    2016-01-01

    In high efficiency video coding (HEVC), the coding tree contributes to excellent compression performance; however, it also brings extremely high computational complexity. This paper presents work on improving the coding tree to further reduce encoding time. A novel low-complexity coding tree mechanism is proposed for fast HEVC coding unit (CU) encoding. Firstly, an in-depth study is made of the relationship among CU distribution, quantization parameter (QP), and content change (CC). Secondly, a CU coding tree probability model is proposed for modeling and predicting CU distribution. Finally, a CU coding tree probability update is proposed to address probabilistic model distortion caused by CC. Experimental results show that the proposed low-complexity CU coding tree mechanism significantly reduces encoding time: by 27% for lossy coding and 42% for visually lossless and lossless coding, improving coding performance under a variety of application conditions. PMID:26999741

  3. Fast Coding Unit Encoding Mechanism for Low Complexity Video Coding.

    PubMed

    Gao, Yuan; Liu, Pengyu; Wu, Yueying; Jia, Kebin; Gao, Guandong

    2016-01-01

    In high efficiency video coding (HEVC), the coding tree contributes to excellent compression performance; however, it also brings extremely high computational complexity. This paper presents work on improving the coding tree to further reduce encoding time. A novel low-complexity coding tree mechanism is proposed for fast HEVC coding unit (CU) encoding. Firstly, an in-depth study is made of the relationship among CU distribution, quantization parameter (QP), and content change (CC). Secondly, a CU coding tree probability model is proposed for modeling and predicting CU distribution. Finally, a CU coding tree probability update is proposed to address probabilistic model distortion caused by CC. Experimental results show that the proposed low-complexity CU coding tree mechanism significantly reduces encoding time: by 27% for lossy coding and 42% for visually lossless and lossless coding, improving coding performance under a variety of application conditions.

  4. Dress Codes for Teachers?

    ERIC Educational Resources Information Center

    Million, June

    2004-01-01

    In this article, the author discusses an e-mail survey of principals from across the country regarding whether or not their school had a formal staff dress code. The results indicate that most did not have a formal dress code, but agreed that professional dress for teachers was not only necessary, but showed respect for the school and had a…

  5. Lichenase and coding sequences

    DOEpatents

    Li, Xin-Liang; Ljungdahl, Lars G.; Chen, Huizhong

    2000-08-15

    The present invention provides a fungal lichenase, i.e., an endo-1,3-1,4-.beta.-D-glucanohydrolase, its coding sequence, recombinant DNA molecules comprising the lichenase coding sequences, recombinant host cells and methods for producing same. The present lichenase is from Orpinomyces PC-2.

  6. Legacy Code Modernization

    NASA Technical Reports Server (NTRS)

    Hribar, Michelle R.; Frumkin, Michael; Jin, Haoqiang; Waheed, Abdul; Yan, Jerry; Saini, Subhash (Technical Monitor)

    1998-01-01

    Over the past decade, high performance computing has evolved rapidly; systems based on commodity microprocessors have been introduced in quick succession from at least seven vendors/families. Porting codes to every new architecture is a difficult problem; in particular, here at NASA, there are many large CFD applications that are very costly to port to new machines by hand. The LCM ("Legacy Code Modernization") Project is the development of an integrated parallelization environment (IPE) which performs the automated mapping of legacy CFD (Fortran) applications to state-of-the-art high performance computers. While most projects to port codes focus on the parallelization of the code, we consider porting to be an iterative process consisting of several steps: 1) code cleanup, 2) serial optimization, 3) parallelization, 4) performance monitoring and visualization, 5) intelligent tools for automated tuning using performance prediction and 6) machine specific optimization. The approach for building this parallelization environment is to build the components for each of the steps simultaneously and then integrate them together. The demonstration will exhibit our latest research in building this environment: 1. Parallelizing tools and compiler evaluation. 2. Code cleanup and serial optimization using automated scripts 3. Development of a code generator for performance prediction 4. Automated partitioning 5. Automated insertion of directives. These demonstrations will exhibit the effectiveness of an automated approach for all the steps involved with porting and tuning a legacy code application for a new architecture.

  7. Synthesizing Certified Code

    NASA Technical Reports Server (NTRS)

    Whalen, Michael; Schumann, Johann; Fischer, Bernd

    2002-01-01

    Code certification is a lightweight approach to demonstrate software quality on a formal level. Its basic idea is to require producers to provide formal proofs that their code satisfies certain quality properties. These proofs serve as certificates which can be checked independently. Since code certification uses the same underlying technology as program verification, it also requires many detailed annotations (e.g., loop invariants) to make the proofs possible. However, manually adding these annotations to the code is time-consuming and error-prone. We address this problem by combining code certification with automatic program synthesis. We propose an approach to generate simultaneously, from a high-level specification, code and all annotations required to certify the generated code. Here, we describe a certification extension of AUTOBAYES, a synthesis tool which automatically generates complex data analysis programs from compact specifications. AUTOBAYES contains sufficient high-level domain knowledge to generate detailed annotations. This allows us to use a general-purpose verification condition generator to produce a set of proof obligations in first-order logic. The obligations are then discharged using the automated theorem prover E-SETHEO. We demonstrate our approach by certifying operator safety for a generated iterative data classification program without manual annotation of the code.

  8. Coding Acoustic Metasurfaces.

    PubMed

    Xie, Boyang; Tang, Kun; Cheng, Hua; Liu, Zhengyou; Chen, Shuqi; Tian, Jianguo

    2017-02-01

    Coding acoustic metasurfaces can combine simple logical bits to acquire sophisticated functions in wave control. The acoustic logical bits can achieve a phase difference of exactly π and a perfect match of the amplitudes for the transmitted waves. By programming the coding sequences, acoustic metasurfaces with various functions, including creating peculiar antenna patterns and wave focusing, have been demonstrated.

  9. Computerized mega code recording.

    PubMed

    Burt, T W; Bock, H C

    1988-04-01

    A system has been developed to facilitate recording of advanced cardiac life support mega code testing scenarios. By scanning a paper "keyboard" using a bar code wand attached to a portable microcomputer, the person assigned to record the scenario can easily generate an accurate, complete, timed, and typewritten record of the given situations and the obtained responses.

  10. Pseudonoise code tracking loop

    NASA Technical Reports Server (NTRS)

    Laflame, D. T. (Inventor)

    1980-01-01

    A delay-locked loop is presented for tracking a pseudonoise (PN) reference code in an incoming communication signal. The loop is less sensitive to gain imbalances, which can otherwise introduce timing errors in the PN reference code formed by the loop.
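
    PN reference codes of the kind such a loop tracks are conventionally generated by a maximal-length linear-feedback shift register; the delay-locked loop hardware itself is not modeled here. A minimal sketch of a 4-bit Fibonacci LFSR for the primitive polynomial x^4 + x + 1 (an illustrative choice of taps):

    ```python
    def lfsr_pn(state, taps, nbits, length):
        """Generate a PN bit sequence from a Fibonacci LFSR.
        taps are 0-indexed bit positions XORed to form the feedback bit."""
        out = []
        for _ in range(length):
            out.append(state & 1)                  # emit the LSB
            fb = 0
            for t in taps:
                fb ^= (state >> t) & 1             # feedback = XOR of tap bits
            state = (state >> 1) | (fb << (nbits - 1))
        return out

    # x^4 + x + 1 -> taps at bits 3 and 0; any nonzero seed works.
    seq = lfsr_pn(state=0b0001, taps=(3, 0), nbits=4, length=30)
    # The sequence repeats with period 2^4 - 1 = 15 and contains
    # 8 ones per period (the m-sequence balance property).
    ```

    Longer registers (e.g., 10-bit Gold-code generators) follow the same pattern; only the register width and tap positions change.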

  11. Evolving genetic code

    PubMed Central

    OHAMA, Takeshi; INAGAKI, Yuji; BESSHO, Yoshitaka; OSAWA, Syozo

    2008-01-01

    In 1985, we reported that a bacterium, Mycoplasma capricolum, used a deviant genetic code in which UGA, a “universal” stop codon, was read as tryptophan. This finding, together with the deviant nuclear genetic codes found in several organisms and in a number of mitochondria, shows that the genetic code is not universal and is in a state of evolution. To account for the changes in codon meanings, we proposed the codon capture theory, which states that all code changes are non-disruptive, occurring without accompanying changes to the amino acid sequences of proteins. Supporting evidence for the theory is presented in this review. A possible evolutionary process from the ancient to the present-day genetic code is also discussed. PMID:18941287
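
    The UGA reassignment can be illustrated with a toy translation table. Only a handful of codons from the full genetic code are included, and the RNA sequence is invented for illustration:

    ```python
    # Tiny subset of the standard genetic code (illustrative, not complete).
    STANDARD = {"AUG": "Met", "UGG": "Trp", "UGA": "Stop", "UAA": "Stop"}
    # Mycoplasma capricolum-style deviant code: UGA is read as tryptophan.
    VARIANT = {**STANDARD, "UGA": "Trp"}

    def translate(rna, table):
        """Translate an RNA string codon by codon, halting at a stop codon."""
        peptide = []
        for i in range(0, len(rna) - 2, 3):
            aa = table[rna[i:i + 3]]
            if aa == "Stop":
                break
            peptide.append(aa)
        return peptide

    rna = "AUGUGAUGG"                 # codons: AUG, UGA, UGG (toy sequence)
    translate(rna, STANDARD)          # ['Met'] -- UGA terminates translation
    translate(rna, VARIANT)           # ['Met', 'Trp', 'Trp']
    ```

    The same reading machinery applied to the same sequence yields different proteins under the two tables, which is why a code change is disruptive unless, as the codon capture theory requires, the reassigned codon has first disappeared from coding use.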

  12. Combustion chamber analysis code

    NASA Technical Reports Server (NTRS)

    Przekwas, A. J.; Lai, Y. G.; Krishnan, A.; Avva, R. K.; Giridharan, M. G.

    1993-01-01

    A three-dimensional, time dependent, Favre averaged, finite volume Navier-Stokes code has been developed to model compressible and incompressible flows (with and without chemical reactions) in liquid rocket engines. The code has a non-staggered formulation with generalized body-fitted-coordinates (BFC) capability. Higher order differencing methodologies such as MUSCL and Osher-Chakravarthy schemes are available. Turbulent flows can be modeled using any of the five turbulent models present in the code. A two-phase, two-liquid, Lagrangian spray model has been incorporated into the code. Chemical equilibrium and finite rate reaction models are available to model chemically reacting flows. The discrete ordinate method is used to model effects of thermal radiation. The code has been validated extensively against benchmark experimental data and has been applied to model flows in several propulsion system components of the SSME and the STME.

  13. AN INTERFEROMETRIC STUDY OF THE FOMALHAUT INNER DEBRIS DISK. I. NEAR-INFRARED DETECTION OF HOT DUST WITH VLTI/VINCI

    SciTech Connect

    Absil, Olivier; Mennesson, Bertrand; Le Bouquin, Jean-Baptiste; Di Folco, Emmanuel; Kervella, Pierre; Augereau, Jean-Charles

    2009-10-10

    The innermost parts of dusty debris disks around main-sequence stars are currently poorly known due to the high contrast and small angular separation with their parent stars. Using near-infrared interferometry, we aim to detect the signature of hot dust around the nearby A4 V star Fomalhaut, which has already been suggested to harbor a warm dust population in addition to a cold dust ring located at about 140 AU. Archival data obtained with the VINCI instrument at the VLTI are used to study the fringe visibility of the Fomalhaut system at projected baseline lengths ranging from 4 m to 140 m in the K band. A significant visibility deficit is observed at short baselines with respect to the expected visibility of the sole stellar photosphere. This is interpreted as the signature of resolved circumstellar emission, producing a relative flux of 0.88% ± 0.12% with respect to the stellar photosphere. While our interferometric data cannot directly constrain the morphology of the excess emission source, complementary data from the literature allow us to discard an off-axis point-like object as the source of circumstellar emission. We argue that the thermal emission from hot dusty grains located within 6 AU from Fomalhaut is the most plausible explanation for the detected excess. Our study also provides a revised limb-darkened diameter for Fomalhaut (θ_LD = 2.223 ± 0.022 mas), taking into account the effect of the resolved circumstellar emission.
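
    The interpretation rests on a standard relation: fully resolved circumstellar emission with fractional flux f dilutes the measured visibility, V_obs = (V_star + f·V_env)/(1 + f) ≈ V_star·(1 − f) when V_env ≈ 0. A sketch of that short-baseline deficit, using a uniform-disk approximation for the star (the published analysis uses a limb-darkened model, so the numbers here are only illustrative):

    ```python
    import math

    def bessel_j1(x, steps=2000):
        """J1 via its integral form: J1(x) = (1/pi) * int_0^pi cos(t - x sin t) dt."""
        h = math.pi / steps
        s = sum(math.cos((i + 0.5) * h - x * math.sin((i + 0.5) * h))
                for i in range(steps))
        return s * h / math.pi

    def uniform_disk_visibility(baseline_m, theta_mas, wavelength_m):
        """Visibility of a uniform stellar disk: V = |2 J1(x) / x|."""
        theta_rad = theta_mas * math.pi / (180 * 3600 * 1000)   # mas -> rad
        x = math.pi * baseline_m * theta_rad / wavelength_m
        return 1.0 if x == 0 else abs(2 * bessel_j1(x) / x)

    f = 0.0088                                   # 0.88% excess (from the abstract)
    v_star = uniform_disk_visibility(10.0, 2.223, 2.2e-6)   # short K-band baseline
    v_obs = (v_star + f * 0.0) / (1 + f)         # fully resolved excess: V_env ~ 0
    # v_obs sits roughly 0.9% below v_star -- the short-baseline deficit
    # that the VINCI data reveal against the photosphere-only prediction.
    ```

    At a 10 m baseline the star is barely resolved (V_star ≈ 0.997), so essentially the entire deficit traces the circumstellar flux ratio f.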

  14. Pyramid image codes

    NASA Technical Reports Server (NTRS)

    Watson, Andrew B.

    1990-01-01

    All vision systems, both human and machine, transform the spatial image into a coded representation. Particular codes may be optimized for efficiency or to extract useful image features. Researchers explored image codes based on primary visual cortex in man and other primates. Understanding these codes will advance the art in image coding, autonomous vision, and computational human factors. In cortex, imagery is coded by features that vary in size, orientation, and position. Researchers have devised a mathematical model of this transformation, called the Hexagonal oriented Orthogonal quadrature Pyramid (HOP). In a pyramid code, features are segregated by size into layers, with fewer features in the layers devoted to large features. Pyramid schemes provide scale invariance, and are useful for coarse-to-fine searching and for progressive transmission of images. The HOP Pyramid is novel in three respects: (1) it uses a hexagonal pixel lattice, (2) it uses oriented features, and (3) it accurately models most of the prominent aspects of primary visual cortex. The transform uses seven basic features (kernels), which may be regarded as three oriented edges, three oriented bars, and one non-oriented blob. Application of these kernels to non-overlapping seven-pixel neighborhoods yields six oriented, high-pass pyramid layers, and one low-pass (blob) layer.
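
    The layered, coarse-to-fine structure common to pyramid codes can be sketched with a generic one-level Laplacian-style pyramid on a 1D signal. This illustrates pyramid coding in general, not the HOP transform or its hexagonal kernels:

    ```python
    def downsample(signal):
        """Coarse layer: average neighboring pairs (assumes even length)."""
        return [(a + b) / 2 for a, b in zip(signal[::2], signal[1::2])]

    def upsample(coarse):
        """Expand the coarse layer to full resolution by repetition."""
        return [v for v in coarse for _ in range(2)]

    def analyze(signal):
        """Split a signal into a coarse layer plus a fine-detail residual."""
        coarse = downsample(signal)
        detail = [s - u for s, u in zip(signal, upsample(coarse))]
        return coarse, detail

    def synthesize(coarse, detail):
        """Exactly invert analyze(): coarse prediction plus residual."""
        return [u + d for u, d in zip(upsample(coarse), detail)]

    sig = [1, 3, 2, 8, 4, 6, 5, 7]
    coarse, detail = analyze(sig)              # few coarse values, small residuals
    assert synthesize(coarse, detail) == sig   # exact reconstruction
    ```

    Truncating after the coarse layer yields a low-resolution version; transmitting detail layers afterward refines it, which is the progressive-transmission property the abstract mentions. Repeating `analyze` on the coarse layer builds the full multi-layer pyramid.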

  15. Report number codes

    SciTech Connect

    Nelson, R.N.

    1985-05-01

    This publication lists all report number codes processed by the Office of Scientific and Technical Information. The report codes are substantially based on the American National Standards Institute standard Z39.23-1983, Standard Technical Report Number (STRN): Format and Creation. The Standard Technical Report Number (STRN) provides one of the primary methods of identifying a specific technical report. The STRN consists of two parts: the report code and the sequential number. The report code identifies the issuing organization, a specific program, or a type of document. The sequential number, which is assigned in sequence by each report issuing entity, is not included in this publication. Part I of this compilation is alphabetized by report codes followed by issuing installations. Part II lists the issuing organization followed by the assigned report code(s). In both Parts I and II, the names of issuing organizations appear for the most part in the form used at the time the reports were issued. However, for some of the more prolific installations which have had name changes, all entries have been merged under the current name.

  16. Embedded foveation image coding.

    PubMed

    Wang, Z; Bovik, A C

    2001-01-01

    The human visual system (HVS) is highly space-variant in sampling, coding, processing, and understanding. The spatial resolution of the HVS is highest around the point of fixation (foveation point) and decreases rapidly with increasing eccentricity. By taking advantage of this fact, it is possible to remove considerable high-frequency information redundancy from the peripheral regions and still reconstruct a perceptually good quality image. Great success has been obtained previously by a class of embedded wavelet image coding algorithms, such as the embedded zerotree wavelet (EZW) and the set partitioning in hierarchical trees (SPIHT) algorithms. Embedded wavelet coding not only provides very good compression performance, but also has the property that the bitstream can be truncated at any point and still be decoded to recreate a reasonably good quality image. In this paper, we propose an embedded foveation image coding (EFIC) algorithm, which orders the encoded bitstream to optimize foveated visual quality at arbitrary bit-rates. A foveation-based image quality metric, namely, foveated wavelet image quality index (FWQI), plays an important role in the EFIC system. We also developed a modified SPIHT algorithm to improve the coding efficiency. Experiments show that EFIC integrates foveation filtering with foveated image coding and demonstrates very good coding performance and scalability in terms of foveated image quality measurement.

  17. Code Disentanglement: Initial Plan

    SciTech Connect

    Wohlbier, John Greaton; Kelley, Timothy M.; Rockefeller, Gabriel M.; Calef, Matthew Thomas

    2015-01-27

    The first step to making more ambitious changes in the EAP code base is to disentangle the code into a set of independent, levelized packages. We define a package as a collection of code, most often across a set of files, that provides a defined set of functionality; a package a) can be built and tested as an entity and b) fits within an overall levelization design. Each package contributes one or more libraries, or an application that uses the other libraries. A package set is levelized if the relationships between packages form a directed, acyclic graph and each package uses only packages at lower levels of the diagram (in Fortran this relationship is often describable by the use relationship between modules). Independent packages permit independent, and therefore parallel, development. The packages form separable units for the purposes of development and testing. This is a proven path for enabling finer-grained changes to a complex code.
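
    The levelization property described here can be checked mechanically: assign each package a level one greater than the highest level among the packages it uses, and fail if the use graph contains a cycle. A minimal sketch with hypothetical package names (not the actual EAP packages):

    ```python
    def levelize(uses):
        """Map each package to level = 1 + max level of the packages it uses.
        Raises ValueError on a cycle, i.e., when the set is not levelized."""
        levels, in_progress = {}, set()

        def level(pkg):
            if pkg in levels:
                return levels[pkg]
            if pkg in in_progress:
                raise ValueError(f"cycle through {pkg}: set is not levelized")
            in_progress.add(pkg)
            levels[pkg] = 1 + max((level(d) for d in uses.get(pkg, ())),
                                  default=0)
            in_progress.discard(pkg)
            return levels[pkg]

        for pkg in uses:
            level(pkg)
        return levels

    # Hypothetical package set: acyclic, so levelization succeeds.
    uses = {"driver": {"hydro", "io"}, "hydro": {"util"},
            "io": {"util"}, "util": set()}
    levelize(uses)   # util -> 1; hydro, io -> 2; driver -> 3
    ```

    A build system can then compile level 1 first and work upward, and packages sharing a level (here `hydro` and `io`) can be built and tested in parallel.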

  18. Compressible Astrophysics Simulation Code

    SciTech Connect

    Howell, L.; Singer, M.

    2007-07-18

    This is an astrophysics simulation code involving a radiation diffusion module developed at LLNL coupled to compressible hydrodynamics and adaptive mesh infrastructure developed at LBNL. One intended application is to neutrino diffusion in core collapse supernovae.

  19. Seals Flow Code Development

    NASA Technical Reports Server (NTRS)

    1991-01-01

    In recognition of a deficiency in the current modeling capability for seals, an effort was established by NASA to develop verified computational fluid dynamic concepts, codes, and analyses for seals. The objectives were to develop advanced concepts for the design and analysis of seals, to effectively disseminate the information to potential users by way of annual workshops, and to provide experimental verification for the models and codes under a wide range of operating conditions.

  20. Robust Nonlinear Neural Codes

    NASA Astrophysics Data System (ADS)

    Yang, Qianli; Pitkow, Xaq

    2015-03-01

    Most interesting natural sensory stimuli are encoded in the brain in a form that can only be decoded nonlinearly. But despite being a core function of the brain, nonlinear population codes are rarely studied and poorly understood. Interestingly, the few existing models of nonlinear codes are inconsistent with known architectural features of the brain. In particular, these codes have information content that scales with the size of the cortical population, even if that violates the data processing inequality by exceeding the amount of information entering the sensory system. Here we provide a valid theory of nonlinear population codes by generalizing recent work on information-limiting correlations in linear population codes. Although these generalized, nonlinear information-limiting correlations bound the performance of any decoder, they also make decoding more robust to suboptimal computation, allowing many suboptimal decoders to achieve nearly the same efficiency as an optimal decoder. Although these correlations are extremely difficult to measure directly, particularly for nonlinear codes, we provide a simple, practical test by which one can use choice-related activity in small populations of neurons to determine whether decoding is suboptimal or optimal and limited by correlated noise. We conclude by describing an example computation in the vestibular system where this theory applies. QY and XP were supported by a grant from the McNair Foundation.

  1. Coded source neutron imaging

    SciTech Connect

    Bingham, Philip R; Santos-Villalobos, Hector J

    2011-01-01

    Coded aperture techniques have been applied to neutron radiography to address limitations in neutron flux and resolution of neutron detectors in a system labeled coded source imaging (CSI). By coding the neutron source, a magnified imaging system is designed with small spot size aperture holes (10 and 100 μm) for improved resolution beyond the detector limits and with many holes in the aperture (50% open) to account for flux losses due to the small pinhole size. An introduction to neutron radiography and coded aperture imaging is presented. A system design is developed for a CSI system with a development of equations for limitations on the system based on the coded image requirements and the neutron source characteristics of size and divergence. Simulation has been applied to the design using McStas to provide qualitative measures of performance with simulations of pinhole array objects followed by a quantitative measure through simulation of a tilted edge and calculation of the modulation transfer function (MTF) from the line spread function. MTF results for both 100 μm and 10 μm aperture hole diameters show resolutions matching the hole diameters.
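
    The last step described, obtaining the MTF from the line spread function, is the magnitude of the LSF's Fourier transform normalized to unity at zero frequency. A self-contained sketch with a synthetic Gaussian LSF (the McStas simulation and tilted-edge processing are not reproduced; widths and units are arbitrary):

    ```python
    import math

    # Synthetic line spread function: a Gaussian of width sigma.
    N, dx, sigma = 512, 0.01, 0.05
    xs = [(i - N // 2) * dx for i in range(N)]
    lsf = [math.exp(-x * x / (2 * sigma * sigma)) for x in xs]

    def mtf(freq):
        """|Fourier transform of the LSF| at spatial frequency `freq`."""
        re = sum(l * math.cos(2 * math.pi * freq * x) for l, x in zip(lsf, xs))
        im = sum(l * math.sin(2 * math.pi * freq * x) for l, x in zip(lsf, xs))
        return math.hypot(re, im)

    m0 = mtf(0.0)
    curve = [mtf(f) / m0 for f in (0.0, 2.0, 4.0, 8.0)]
    # For a Gaussian LSF the MTF is exp(-2 pi^2 sigma^2 f^2): contrast
    # rolls off smoothly with frequency, and the rolloff sets resolution.
    ```

    In practice the LSF comes from differentiating the measured edge spread function of the simulated tilted edge; the normalization to `m0` makes MTF(0) = 1 regardless of the LSF's absolute scale.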

  2. Error coding simulations

    NASA Technical Reports Server (NTRS)

    Noble, Viveca K.

    1993-01-01

    There are various elements such as radio frequency interference (RFI) which may induce errors in data being transmitted via a satellite communication link. When a transmission is affected by interference or other error-causing elements, the transmitted data becomes indecipherable. It becomes necessary to implement techniques to recover from these disturbances. The objective of this research is to develop software which simulates error control circuits and evaluate the performance of these modules in various bit error rate environments. The results of the evaluation provide the engineer with information which helps determine the optimal error control scheme. The Consultative Committee for Space Data Systems (CCSDS) recommends the use of Reed-Solomon (RS) and convolutional encoders and Viterbi and RS decoders for error correction. The use of forward error correction techniques greatly reduces the received signal-to-noise ratio needed for a certain desired bit error rate. The use of concatenated coding, e.g. inner convolutional code and outer RS code, provides even greater coding gain. The 16-bit cyclic redundancy check (CRC) code is recommended by CCSDS for error detection.
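
    A 16-bit CRC of the kind used for error detection can be sketched bitwise. The parameterization below (polynomial 0x1021, i.e. x^16 + x^12 + x^5 + 1, initial value 0xFFFF) is the common CRC-16/CCITT-FALSE variant, shown here as an illustration rather than as the exact CCSDS profile:

    ```python
    def crc16_ccitt(data: bytes, crc: int = 0xFFFF) -> int:
        """CRC-16/CCITT-FALSE: polynomial 0x1021, initial value 0xFFFF."""
        for byte in data:
            crc ^= byte << 8                       # fold the next byte in
            for _ in range(8):
                if crc & 0x8000:                   # MSB set: shift and reduce
                    crc = ((crc << 1) ^ 0x1021) & 0xFFFF
                else:
                    crc = (crc << 1) & 0xFFFF
        return crc

    frame = b"123456789"
    checksum = crc16_ccitt(frame)    # 0x29B1, the standard check value
    # The sender appends the checksum; the receiver recomputes it and
    # flags a transmission error on any mismatch.
    ```

    Unlike the RS and convolutional codes above, a CRC only detects errors; it carries no redundancy for correcting them, which is why CCSDS pairs it with forward error correction.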

  3. Coded source neutron imaging

    NASA Astrophysics Data System (ADS)

    Bingham, Philip; Santos-Villalobos, Hector; Tobin, Ken

    2011-03-01

    Coded aperture techniques have been applied to neutron radiography to address limitations in neutron flux and resolution of neutron detectors in a system labeled coded source imaging (CSI). By coding the neutron source, a magnified imaging system is designed with small spot size aperture holes (10 and 100 μm) for improved resolution beyond the detector limits and with many holes in the aperture (50% open) to account for flux losses due to the small pinhole size. An introduction to neutron radiography and coded aperture imaging is presented. A system design is developed for a CSI system with a development of equations for limitations on the system based on the coded image requirements and the neutron source characteristics of size and divergence. Simulation has been applied to the design using McStas to provide qualitative measures of performance with simulations of pinhole array objects followed by a quantitative measure through simulation of a tilted edge and calculation of the modulation transfer function (MTF) from the line spread function. MTF results for both 100 μm and 10 μm aperture hole diameters show resolutions matching the hole diameters.

  4. Code query by example

    NASA Astrophysics Data System (ADS)

    Vaucouleur, Sebastien

    2011-02-01

    We introduce code query by example for customisation of evolvable software products in general and of enterprise resource planning systems (ERPs) in particular. The concept is based on an initial empirical study on practices around ERP systems. We motivate our design choices based on those empirical results, and we show how the proposed solution helps with respect to the infamous upgrade problem: the conflict between the need for customisation and the need for upgrade of ERP systems. We further show how code query by example can be used as a form of lightweight static analysis, to detect automatically potential defects in large software products. Code query by example as a form of lightweight static analysis is particularly interesting in the context of ERP systems: it is often the case that programmers working in this field are not computer science specialists but rather domain experts. Hence, they require a simple language to express custom rules.

  5. Seals Code Development Workshop

    NASA Technical Reports Server (NTRS)

    Hendricks, Robert C. (Compiler); Liang, Anita D. (Compiler)

    1996-01-01

    The 1995 Seals Workshop industrial code (INDSEAL) release includes ICYL, GCYLT, IFACE, GFACE, SPIRALG, SPIRALI, DYSEAL, and KTK. The scientific code (SCISEAL) release includes conjugate heat transfer and multidomain modeling with rotordynamic capability. Several seals and bearings codes (e.g., HYDROFLEX, HYDROTRAN, HYDROB3D, FLOWCON1, FLOWCON2) are presented and results compared. Current computational and experimental emphasis includes multiple connected cavity flows with goals of reducing parasitic losses and gas ingestion. Labyrinth seals continue to play a significant role in sealing with face, honeycomb, and new sealing concepts under investigation for advanced engine concepts in view of strict environmental constraints. The clean sheet approach to engine design is advocated with program directions and anticipated percentage SFC reductions cited. Future activities center on engine applications with coupled seal/power/secondary flow streams.

  6. SAC: Sheffield Advanced Code

    NASA Astrophysics Data System (ADS)

    Griffiths, Mike; Fedun, Viktor; Mumford, Stuart; Gent, Frederick

    2013-06-01

    The Sheffield Advanced Code (SAC) is a fully non-linear MHD code designed for simulations of linear and non-linear wave propagation in gravitationally strongly stratified magnetized plasma. It was developed primarily for the forward modelling of helioseismological processes and for the coupling processes in the solar interior, photosphere, and corona; it is built on the well-known VAC platform that allows robust simulation of the macroscopic processes in gravitationally stratified (non-)magnetized plasmas. The code has no limitations of simulation length in time imposed by complications originating from the upper boundary, nor does it require implementation of special procedures to treat the upper boundaries. SAC inherited its modular structure from VAC, thereby allowing modification to easily add new physics.

  7. Autocatalysis, information and coding.

    PubMed

    Wills, P R

    2001-01-01

    Autocatalytic self-construction in macromolecular systems requires the existence of a reflexive relationship between structural components and the functional operations they perform to synthesise themselves. The possibility of reflexivity depends on formal, semiotic features of the catalytic structure-function relationship, that is, the embedding of catalytic functions in the space of polymeric structures. Reflexivity is a semiotic property of some genetic sequences. Such sequences may serve as the basis for the evolution of coding as a result of autocatalytic self-organisation in a population of assignment catalysts. Autocatalytic selection is a mechanism whereby matter becomes differentiated in primitive biochemical systems. In the case of coding self-organisation, it corresponds to the creation of symbolic information. Prions are present-day entities whose replication through autocatalysis reflects aspects of biological semiotics less obvious than genetic coding.

  8. Code inspection instructional validation

    NASA Technical Reports Server (NTRS)

    Orr, Kay; Stancil, Shirley

    1992-01-01

    The Shuttle Data Systems Branch (SDSB) of the Flight Data Systems Division (FDSD) at Johnson Space Center contracted with Southwest Research Institute (SwRI) to validate the effectiveness of an interactive video course on the code inspection process. The purpose of this project was to determine if this course could be effective for teaching NASA analysts the process of code inspection. In addition, NASA was interested in the effectiveness of this unique type of instruction (Digital Video Interactive), for providing training on software processes. This study found the Carnegie Mellon course, 'A Cure for the Common Code', effective for teaching the process of code inspection. In addition, analysts prefer learning with this method of instruction, or this method in combination with other methods. As is, the course is definitely better than no course at all; however, findings indicate changes are needed. Following are conclusions of this study. (1) The course is instructionally effective. (2) The simulation has a positive effect on student's confidence in his ability to apply new knowledge. (3) Analysts like the course and prefer this method of training, or this method in combination with current methods of training in code inspection, over the way training is currently being conducted. (4) Analysts responded favorably to information presented through scenarios incorporating full motion video. (5) Some course content needs to be changed. (6) Some content needs to be added to the course. SwRI believes this study indicates interactive video instruction combined with simulation is effective for teaching software processes. Based on the conclusions of this study, SwRI has outlined seven options for NASA to consider. SwRI recommends the option which involves creation of new source code and data files, but uses much of the existing content and design from the current course. Although this option involves a significant software development effort, SwRI believes this option

  9. Mathematical Fiction for Senior Students and Undergraduates: Novels, Plays, and Film

    ERIC Educational Resources Information Center

    Padula, Janice

    2006-01-01

Mathematical fiction has probably existed since ideas have been written down, and certainly as early as 414 BC (Kasman, 2000). Mathematical fiction is a recently rediscovered and growing literature, as sales of the novels "The Curious Incident of the Dog in the Night-time" (Haddon, 2003) and "The Da Vinci Code" (Brown, 2004) attest. Science…

  10. Polar Code Validation

    DTIC Science & Technology

    1989-09-30

[Only report-documentation and table-of-contents fragments are legible in this record.] Unclassified; approved for public release. Recoverable headings: Summary of POLAR Achievements; POLAR Code Physical Models; Structure of the Bipolar Plasma Sheath Generated by SPEAR I; The POLAR Code Wake Model: Comparison with in Situ Observations.

  11. Aeroacoustic Prediction Codes

    NASA Technical Reports Server (NTRS)

    Gliebe, P; Mani, R.; Shin, H.; Mitchell, B.; Ashford, G.; Salamah, S.; Connell, S.; Huff, Dennis (Technical Monitor)

    2000-01-01

This report describes work performed on Contract NAS3-27720AoI 13 as part of the NASA Advanced Subsonic Transport (AST) Noise Reduction Technology effort. Computer codes were developed to provide quantitative prediction, design, and analysis capability for several aircraft engine noise sources. The objective was to provide improved, physics-based tools for exploration of noise-reduction concepts and understanding of experimental results. Methods and codes focused on fan broadband and 'buzz saw' noise and on low-emissions combustor noise, and complement work done by other contractors under the NASA AST program to develop methods and codes for fan harmonic tone noise and jet noise. The methods and codes developed and reported herein employ a wide range of approaches, from the strictly empirical to the completely computational, with some being semiempirical, analytical, and/or analytical/computational. Emphasis was on capturing the essential physics while still considering method or code utility as a practical design and analysis tool for everyday engineering use. Codes and prediction models were developed for: (1) an improved empirical correlation model for fan rotor exit flow mean and turbulence properties, for use in predicting broadband noise generated by rotor exit flow turbulence interaction with downstream stator vanes; (2) fan broadband noise models for rotor and stator/turbulence interaction sources including 3D effects, noncompact-source effects, directivity modeling, and extensions to the rotor supersonic tip-speed regime; (3) fan multiple-pure-tone in-duct sound pressure prediction methodology based on computational fluid dynamics (CFD) analysis; and (4) low-emissions combustor prediction methodology and computer code based on CFD and actuator disk theory. In addition, the relative importance of dipole and quadrupole source mechanisms was studied using direct CFD source computation for a simple cascade/gust interaction problem, and an empirical combustor

  12. Securing mobile code.

    SciTech Connect

    Link, Hamilton E.; Schroeppel, Richard Crabtree; Neumann, William Douglas; Campbell, Philip LaRoche; Beaver, Cheryl Lynn; Pierson, Lyndon George; Anderson, William Erik

    2004-10-01

If software is designed so that it can issue functions that will move it from one computing platform to another, then the software is said to be 'mobile'. There are two general areas of security problems associated with mobile code. The 'secure host' problem involves protecting the host from malicious mobile code. The 'secure mobile code' problem, on the other hand, involves protecting the code from malicious hosts. This report focuses on the latter problem. We have found three distinct camps of opinion regarding how to secure mobile code: those who believe special distributed hardware is necessary, those who believe special distributed software is necessary, and those who believe neither is necessary. We examine all three camps, with a focus on the third. In the distributed software camp we examine some commonly proposed techniques including Java, D'Agents and Flask. For the specialized hardware camp, we propose a cryptographic technique for 'tamper-proofing' code over a large portion of the software/hardware life cycle by careful modification of current architectures. This method culminates by decrypting/authenticating each instruction within a physically protected CPU, thereby protecting against subversion by malicious code. Our main focus is on the camp that believes neither specialized software nor hardware is necessary. We concentrate on methods of code obfuscation to render an entire program, or a data segment on which a program depends, incomprehensible. The hope is to prevent, or at least slow down, reverse-engineering efforts and to prevent goal-oriented attacks on the software and its execution. The field of obfuscation is still in a state of development, the central problem being the lack of a basis for evaluating the protection schemes. We give a brief introduction to some of the main ideas in the field, followed by an in-depth analysis of a technique called 'white-boxing', and we put forth some new attacks and improvements.

  13. Coding for urologic office procedures.

    PubMed

    Dowling, Robert A; Painter, Mark

    2013-11-01

    This article summarizes current best practices for documenting, coding, and billing common office-based urologic procedures. Topics covered include general principles, basic and advanced urologic coding, creation of medical records that support compliant coding practices, bundled codes and unbundling, global periods, modifiers for procedure codes, when to bill for evaluation and management services during the same visit, coding for supplies, and laboratory and radiology procedures pertinent to urology practice. Detailed information is included for the most common urology office procedures, and suggested resources and references are provided. This information is of value to physicians, office managers, and their coding staff.

  14. Accumulate Repeat Accumulate Coded Modulation

    NASA Technical Reports Server (NTRS)

    Abbasfar, Aliazam; Divsalar, Dariush; Yao, Kung

    2004-01-01

    In this paper we propose an innovative coded modulation scheme called 'Accumulate Repeat Accumulate Coded Modulation' (ARA coded modulation). This class of codes can be viewed as serial turbo-like codes, or as a subclass of Low Density Parity Check (LDPC) codes that are combined with high level modulation. Thus at the decoder belief propagation can be used for iterative decoding of ARA coded modulation on a graph, provided a demapper transforms the received in-phase and quadrature samples to reliability of the bits.
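The abstract gives no encoder internals; the sketch below shows only the generic repeat-accumulate core (repeat each bit q times, interleave, then run an accumulator, i.e. a running mod-2 sum) that the ARA family builds on. ARA codes add a further precoding accumulator and specific interleavers not shown here; the identity interleaver and parameters are illustrative.

```python
def ra_encode(bits, q=3, perm=None):
    """Generic repeat-accumulate core: repeat each bit q times,
    interleave, then accumulate with a running mod-2 sum."""
    repeated = [b for b in bits for _ in range(q)]
    if perm is None:
        perm = range(len(repeated))      # identity interleaver (illustrative)
    interleaved = [repeated[p] for p in perm]
    out, acc = [], 0
    for b in interleaved:
        acc ^= b                         # accumulator: x_k = x_{k-1} XOR b_k
        out.append(acc)
    return out

print(ra_encode([1, 0, 1]))  # -> [1, 0, 1, 1, 1, 1, 0, 1, 0]
```

The accumulator is what gives the code its sparse-graph (LDPC-like) structure, which is why belief propagation applies at the decoder.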

  15. Dress Codes. Legal Brief.

    ERIC Educational Resources Information Center

    Zirkel, Perry A.

    2000-01-01

    As illustrated by two recent decisions, the courts in the past decade have demarcated wide boundaries for school officials considering dress codes, whether in the form of selective prohibitions or required uniforms. Administrators must warn the community, provide legitimate justification and reasonable clarity, and comply with state law. (MLH)

  16. Dress Codes and Uniforms.

    ERIC Educational Resources Information Center

    Lumsden, Linda; Miller, Gabriel

    2002-01-01

    Students do not always make choices that adults agree with in their choice of school dress. Dress-code issues are explored in this Research Roundup, and guidance is offered to principals seeking to maintain a positive school climate. In "Do School Uniforms Fit?" Kerry White discusses arguments for and against school uniforms and summarizes the…

  17. Building Codes and Regulations.

    ERIC Educational Resources Information Center

    Fisher, John L.

    The hazard of fire is of great concern to libraries due to combustible books and new plastics used in construction and interiors. Building codes and standards can offer architects and planners guidelines to follow but these standards should be closely monitored, updated, and researched for fire prevention. (DS)

  18. Student Dress Codes.

    ERIC Educational Resources Information Center

    Uerling, Donald F.

    School officials see a need for regulations that prohibit disruptive and inappropriate forms of expression and attire; students see these regulations as unwanted restrictions on their freedom. This paper reviews court litigation involving constitutional limitations on school authority, dress and hair codes, state law constraints, and school…

  19. Video Coding for ESL.

    ERIC Educational Resources Information Center

    King, Kevin

    1992-01-01

    Coding tasks, a valuable technique for teaching English as a Second Language, are presented that enable students to look at patterns and structures of marital communication as well as objectively evaluate the degree of happiness or distress in the marriage. (seven references) (JL)

  20. Electrical Circuit Simulation Code

    SciTech Connect

    Wix, Steven D.; Waters, Arlon J.; Shirley, David

    2001-08-09

Massively Parallel Electrical Circuit Simulation Code. CHILESPICE is a massively parallel, distributed-memory electrical circuit simulation tool that contains many enhanced radiation, time-based, and thermal features and models. It targets large-scale electronic circuit simulation, with shared-memory parallel processing and enhanced convergence, and includes Sandia-specific device models.

  1. Multiple trellis coded modulation

    NASA Technical Reports Server (NTRS)

    Simon, Marvin K. (Inventor); Divsalar, Dariush (Inventor)

    1990-01-01

A technique for designing trellis codes to minimize bit error performance for a fading channel. The invention provides a criterion which may be used in the design of such codes and which is significantly different from that used for additive white Gaussian noise channels. The method of multiple trellis coded modulation of the present invention comprises the steps of: (a) coding b bits of input data into s intermediate outputs; (b) grouping said s intermediate outputs into k groups of s_i intermediate outputs each, where the sum of all s_i is equal to s and k is equal to at least 2; (c) mapping each of said k groups of intermediate outputs into one of a plurality of symbols in accordance with a plurality of modulation schemes, one for each group, such that the first group is mapped in accordance with a first modulation scheme and the second group is mapped in accordance with a second modulation scheme; and (d) outputting each of said symbols to provide k output symbols for each b bits of input data.

  2. Coding Theory and Projective Spaces

    NASA Astrophysics Data System (ADS)

    Silberstein, Natalia

    2008-05-01

The projective space of order n over a finite field F_q is the set of all subspaces of the vector space F_q^{n}. In this work, we consider error-correcting codes in the projective space, focusing mainly on constant dimension codes. We start with the different representations of subspaces in the projective space. These representations involve matrices in reduced row echelon form, associated binary vectors, and Ferrers diagrams. Based on these representations, we provide a new formula for the computation of the distance between any two subspaces in the projective space. We examine lifted maximum rank distance (MRD) codes, which are nearly optimal constant dimension codes. We prove that a lifted MRD code can be represented in such a way that it forms a block design known as a transversal design. The incidence matrix of the transversal design derived from a lifted MRD code can be viewed as a parity-check matrix of a linear code in the Hamming space. We find the properties of these codes, which can also be viewed as LDPC codes. We present new bounds and constructions for constant dimension codes. First, we present a multilevel construction for constant dimension codes, which can be viewed as a generalization of the lifted MRD code construction. This construction is based on a new type of rank-metric codes, called Ferrers diagram rank-metric codes. Then we derive upper bounds on the size of constant dimension codes which contain the lifted MRD code, and provide a construction for two families of codes that attain these upper bounds. We generalize the well-known concept of a punctured code for a code in the projective space to obtain large codes which are not constant dimension. We present efficient enumerative encoding and decoding techniques for the Grassmannian. Finally we describe a search method for constant dimension lexicodes.
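The classical subspace distance this literature builds on is d(U, V) = dim U + dim V − 2·dim(U ∩ V); over GF(2) it can be computed from ranks alone, since dim(U + V) is the rank of the stacked generator matrices. The sketch below implements that classical computation, not the thesis's new echelon-form formula; the bitmask row representation and names are illustrative.

```python
def rank_gf2(rows):
    """Rank over GF(2) of a list of rows, each an int bitmask,
    via elimination on leading bits."""
    pivots = {}                      # leading-bit position -> pivot row
    for r in rows:
        while r:
            lead = r.bit_length() - 1
            if lead not in pivots:
                pivots[lead] = r     # r becomes a new pivot
                break
            r ^= pivots[lead]        # eliminate the leading bit
    return len(pivots)

def subspace_distance(U, V):
    """d(U, V) = dim U + dim V - 2*dim(U n V)
              = 2*dim(U + V) - dim U - dim V,
    where dim(U + V) is the rank of the stacked generator rows."""
    return 2 * rank_gf2(U + V) - rank_gf2(U) - rank_gf2(V)

# U = <100, 010>, V = <100, 001>: intersection <100>, so distance is 2
print(subspace_distance([0b100, 0b010], [0b100, 0b001]))  # -> 2
```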

  3. Binary coding for hyperspectral imagery

    NASA Astrophysics Data System (ADS)

    Wang, Jing; Chang, Chein-I.; Chang, Chein-Chi; Lin, Chinsu

    2004-10-01

Binary coding is one of the simplest ways to characterize spectral features. One commonly used method is a binary coding-based image software system called Spectral Analysis Manager (SPAM), developed for remotely sensed imagery by Mazer et al. For a given spectral signature, SPAM calculates its spectral mean and inter-band spectral difference and uses them as thresholds to generate a binary code word for that particular spectral signature. Such a coding scheme is generally effective and also very simple to implement. This paper revisits SPAM and further develops three new SPAM-based binary coding methods, called equal probability partition (EPP) binary coding, halfway partition (HP) binary coding, and median partition (MP) binary coding. These three binary coding methods, along with SPAM, will be evaluated for spectral discrimination and identification. In doing so, a new criterion, called a posteriori discrimination probability (APDP), is also introduced as a performance measure.
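The thresholding idea is simple enough to sketch: one bit per band from comparison with the spectral mean, plus one bit per adjacent band pair from the sign of the inter-band difference. This is a minimal stand-in, assuming a particular two-part word layout; the exact code word format used by SPAM is not reproduced here, and the names are invented.

```python
def spam_code(signature):
    """Binary-code a spectral signature, SPAM-style: one 'amplitude' bit
    per band (above/below the spectral mean) and one 'slope' bit per
    adjacent band pair (sign of the inter-band difference)."""
    mean = sum(signature) / len(signature)
    amplitude = [1 if v > mean else 0 for v in signature]
    slope = [1 if b - a > 0 else 0 for a, b in zip(signature, signature[1:])]
    return amplitude, slope

amp, slp = spam_code([0.2, 0.4, 0.9, 0.5])
print(amp, slp)  # mean is 0.5 -> [0, 0, 1, 0] [1, 1, 0]
```

Two signatures can then be compared by the Hamming distance between their code words, which is what makes the scheme fast for spectral matching.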

  4. Clip and Save.

    ERIC Educational Resources Information Center

    Hubbard, Guy

    2003-01-01

    Focuses on the facial expression in the "Mona Lisa" by Leonardo da Vinci. Offers background information on da Vinci as well as learning activities for students. Includes a reproduction of the "Mona Lisa" and information about the painting. (CMK)

  5. Sinusoidal transform coding

    NASA Technical Reports Server (NTRS)

    Mcaulay, Robert J.; Quatieri, Thomas F.

    1988-01-01

    It has been shown that an analysis/synthesis system based on a sinusoidal representation of speech leads to synthetic speech that is essentially perceptually indistinguishable from the original. Strategies for coding the amplitudes, frequencies and phases of the sine waves have been developed that have led to a multirate coder operating at rates from 2400 to 9600 bps. The encoded speech is highly intelligible at all rates with a uniformly improving quality as the data rate is increased. A real-time fixed-point implementation has been developed using two ADSP2100 DSP chips. The methods used for coding and quantizing the sine-wave parameters for operation at the various frame rates are described.
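The analysis/synthesis idea can be caricatured in a few lines: estimate an (amplitude, frequency, phase) triple per dominant sinusoid and resynthesize the frame as a sum of cosines. The sketch below is a toy stand-in, assuming exact-bin frequencies and a naive DFT; the real coder tracks peaks across frames and quantizes the parameters, none of which is shown here.

```python
import cmath
import math

def dft(x):
    """Naive DFT, adequate for a short illustration frame."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * math.pi * k * n / N) for n in range(N))
            for k in range(N)]

def sine_params(x, k_peaks=2):
    """Keep the k strongest positive-frequency bins; return one
    (amplitude, bin frequency, phase) triple per retained sinusoid."""
    N = len(x)
    X = dft(x)
    peaks = sorted(range(1, N // 2), key=lambda k: -abs(X[k]))[:k_peaks]
    return [(2 * abs(X[k]) / N, k, cmath.phase(X[k])) for k in sorted(peaks)]

def synthesize(params, N):
    """Rebuild the frame as a plain sum of the coded sinusoids."""
    return [sum(a * math.cos(2 * math.pi * k * n / N + ph) for a, k, ph in params)
            for n in range(N)]

N = 64
x = [math.sin(2 * math.pi * 5 * n / N) + 0.5 * math.sin(2 * math.pi * 12 * n / N)
     for n in range(N)]
params = sine_params(x)
y = synthesize(params, N)
print(max(abs(a - b) for a, b in zip(x, y)))  # near machine precision
```

Only the triples need to be transmitted, which is where the 2400-9600 bps budget in the abstract goes.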

  6. WHPA Code available

    NASA Astrophysics Data System (ADS)

    The Wellhead Protection Area code is now available for distribution by the International Ground Water Modeling Center in Indianapolis, Ind. The WHPA code is a modular, semianalytical, groundwater flow model developed for the U.S. Environmental Protection Agency, Office of Ground Water Protection, designed to assist state and local technical staff with the task of Wellhead Protection Area (WHPA) delineation. A complete news item appeared in Eos, May 1, 1990, p. 690.The model consists of four independent, semianalytical modules that may be used to identify the areal extent of groundwater contribution to one or multiple pumping wells. One module is a general particle tracking program that may be used as a post-processor for two-dimensional, numerical models of groundwater flow. One module incorporates a Monte Carlo approach to investigate the effects of uncertain input parameters on capture zones. Multiple pumping and injection wells may be present and barrier or stream boundary conditions may be investigated.

  7. WHPA Code available

    NASA Astrophysics Data System (ADS)

    The Wellhead Protection Area (WHPA) code is now available for distribution by the International Ground Water Modeling Center in Indianapolis, Ind. The WHPA code is a modular, semi-analytical, groundwater flow model developed for the U.S. Environmental Protection Agency, Office of Ground Water Protection. It is designed to assist state and local technical staff with the task of WHPA delineation.The model consists of four independent, semi-analytical modules that may be used to identify the areal extent of groundwater contribution to one or multiple pumping wells. One module is a general particle tracking program that may be used as a post-processor for two-dimensional, numerical models of groundwater flow. One module incorporates a Monte Carlo approach to investigate the effects of uncertain input parameters on capture zones. Multiple pumping and injection wells may be present and barrier or stream boundary conditions may be investigated.

  8. Confocal coded aperture imaging

    DOEpatents

    Tobin, Jr., Kenneth William; Thomas, Jr., Clarence E.

    2001-01-01

    A method for imaging a target volume comprises the steps of: radiating a small bandwidth of energy toward the target volume; focusing the small bandwidth of energy into a beam; moving the target volume through a plurality of positions within the focused beam; collecting a beam of energy scattered from the target volume with a non-diffractive confocal coded aperture; generating a shadow image of said aperture from every point source of radiation in the target volume; and, reconstructing the shadow image into a 3-dimensional image of the every point source by mathematically correlating the shadow image with a digital or analog version of the coded aperture. The method can comprise the step of collecting the beam of energy scattered from the target volume with a Fresnel zone plate.

  9. HYCOM Code Development

    DTIC Science & Technology

    2003-02-10

[Only report-documentation fragments are legible in this record.] HYCOM code development, Alan J. Wallcraft, Naval Research Laboratory; presented at the Layered Ocean Model Users' Workshop (LOM 2003), Miami, FL, February 10, 2003; approved for public release, distribution unlimited. Recoverable feature list: Kraus-Turner mixed layer; Energy-Loan (passive) ice model; high-frequency atmospheric forcing; new I/O scheme (.a and .b files); scalability via [remainder illegible].

  10. Reeds computer code

    NASA Technical Reports Server (NTRS)

    Bjork, C.

    1981-01-01

    The REEDS (rocket exhaust effluent diffusion single layer) computer code is used for the estimation of certain rocket exhaust effluent concentrations and dosages and their distributions near the Earth's surface following a rocket launch event. Output from REEDS is used in producing near real time air quality and environmental assessments of the effects of certain potentially harmful effluents, namely HCl, Al2O3, CO, and NO.

  11. Trajectory Code Studies, 1987

    SciTech Connect

    Poukey, J.W.

    1988-01-01

    The trajectory code TRAJ has been used extensively to study nonimmersed foilless electron diodes. The basic goal of the research is to design low-emittance injectors for electron linacs and propagation experiments. Systems studied during 1987 include Delphi, Recirc, and Troll. We also discuss a partly successful attempt to extend the same techniques to high currents (tens of kA). 7 refs., 30 figs.

  12. The PHARO Code.

    DTIC Science & Technology

    1981-11-24

[Only fragments of this record are legible.] Keywords include visible radiation, infrared radiation, sensors, line and band transitions, and high-altitude nuclear data. The PHARO code determines radiation (watts/sr) in arbitrary wavelength intervals; its output consists of contour plots of radiative intensity (watts/cm ster), i.e. "isophot" plots, for arbitrarily placed cameras or sensors.

  13. The Phantom SPH code

    NASA Astrophysics Data System (ADS)

    Price, Daniel; Wurster, James; Nixon, Chris

    2016-05-01

    I will present the capabilities of the Phantom SPH code for global simulations of dust and gas in protoplanetary discs. I will present our new algorithms for simulating both small and large grains in discs, as well as our progress towards simulating evolving grain populations and coupling with radiation. Finally, I will discuss our recent applications to HL Tau and the physics of dust gap opening.

  14. Status of MARS Code

    SciTech Connect

    N.V. Mokhov

    2003-04-09

Status and recent developments of the MARS 14 Monte Carlo code system for simulation of hadronic and electromagnetic cascades in shielding, accelerator, and detector components in the energy range from a fraction of an electronvolt up to 100 TeV are described. These include physics models in both the strong and electromagnetic interaction sectors, variance reduction techniques, residual dose, geometry, tracking, and histogramming, as well as the MAD-MARS Beam Line Builder and the graphical user interface.

  15. Orthopedics coding and funding.

    PubMed

    Baron, S; Duclos, C; Thoreux, P

    2014-02-01

    The French tarification à l'activité (T2A) prospective payment system is a financial system in which a health-care institution's resources are based on performed activity. Activity is described via the PMSI medical information system (programme de médicalisation du système d'information). The PMSI classifies hospital cases by clinical and economic categories known as diagnosis-related groups (DRG), each with an associated price tag. Coding a hospital case involves giving as realistic a description as possible so as to categorize it in the right DRG and thus ensure appropriate payment. For this, it is essential to understand what determines the pricing of inpatient stay: namely, the code for the surgical procedure, the patient's principal diagnosis (reason for admission), codes for comorbidities (everything that adds to management burden), and the management of the length of inpatient stay. The PMSI is used to analyze the institution's activity and dynamism: change on previous year, relation to target, and comparison with competing institutions based on indicators such as the mean length of stay performance indicator (MLS PI). The T2A system improves overall care efficiency. Quality of care, however, is not presently taken account of in the payment made to the institution, as there are no indicators for this; work needs to be done on this topic.

  16. MELCOR computer code manuals

    SciTech Connect

    Summers, R.M.; Cole, R.K. Jr.; Smith, R.C.; Stuart, D.S.; Thompson, S.L.; Hodge, S.A.; Hyman, C.R.; Sanders, R.L.

    1995-03-01

MELCOR is a fully integrated, engineering-level computer code that models the progression of severe accidents in light water reactor nuclear power plants. MELCOR is being developed at Sandia National Laboratories for the U.S. Nuclear Regulatory Commission as a second-generation plant risk assessment tool and the successor to the Source Term Code Package. A broad spectrum of severe accident phenomena in both boiling and pressurized water reactors is treated in MELCOR in a unified framework. These include: thermal-hydraulic response in the reactor coolant system, reactor cavity, containment, and confinement buildings; core heatup, degradation, and relocation; core-concrete attack; hydrogen production, transport, and combustion; fission product release and transport; and the impact of engineered safety features on thermal-hydraulic and radionuclide behavior. Current uses of MELCOR include estimation of severe accident source terms and their sensitivities and uncertainties in a variety of applications. This publication of the MELCOR computer code manuals corresponds to MELCOR 1.8.3, released to users in August, 1994. Volume 1 contains a primer that describes MELCOR's phenomenological scope, organization (by package), and documentation. The remainder of Volume 1 contains the MELCOR Users Guides, which provide the input instructions and guidelines for each package. Volume 2 contains the MELCOR Reference Manuals, which describe the phenomenological models that have been implemented in each package.

  17. Bar coded retroreflective target

    DOEpatents

    Vann, Charles S.

    2000-01-01

    This small, inexpensive, non-contact laser sensor can detect the location of a retroreflective target in a relatively large volume and up to six degrees of position. The tracker's laser beam is formed into a plane of light which is swept across the space of interest. When the beam illuminates the retroreflector, some of the light returns to the tracker. The intensity, angle, and time of the return beam is measured to calculate the three dimensional location of the target. With three retroreflectors on the target, the locations of three points on the target are measured, enabling the calculation of all six degrees of target position. Until now, devices for three-dimensional tracking of objects in a large volume have been heavy, large, and very expensive. Because of the simplicity and unique characteristics of this tracker, it is capable of three-dimensional tracking of one to several objects in a large volume, yet it is compact, light-weight, and relatively inexpensive. Alternatively, a tracker produces a diverging laser beam which is directed towards a fixed position, and senses when a retroreflective target enters the fixed field of view. An optically bar coded target can be read by the tracker to provide information about the target. The target can be formed of a ball lens with a bar code on one end. As the target moves through the field, the ball lens causes the laser beam to scan across the bar code.

  18. Suboptimum decoding of block codes

    NASA Technical Reports Server (NTRS)

    Lin, Shu; Kasami, Tadao

    1991-01-01

This paper investigates a class of decomposable codes and their distance and structural properties. It is shown that this class includes several classes of well-known and efficient codes as subclasses. Several methods for constructing decomposable codes or decomposing codes are presented. A two-stage soft-decision decoding scheme for decomposable codes, their translates, or unions of translates is devised. This two-stage soft-decision decoding is suboptimum, and provides an excellent trade-off between error performance and decoding complexity for codes of moderate and long block length.

  19. Preliminary Assessment of Turbomachinery Codes

    NASA Technical Reports Server (NTRS)

    Mazumder, Quamrul H.

    2007-01-01

This report assesses different CFD codes developed and currently being used at Glenn Research Center to predict turbomachinery fluid flow and heat transfer behavior. The following codes are considered: APNASA, TURBO, GlennHT, H3D, and SWIFT. Each code is described separately in the following section, with its current modeling capabilities, level of validation, pre/post processing, and future development and validation requirements. This report addresses only previously published validations of the codes; the codes have, however, since been further developed to extend their capabilities.

  20. Construction of new quantum MDS codes derived from constacyclic codes

    NASA Astrophysics Data System (ADS)

    Taneja, Divya; Gupta, Manish; Narula, Rajesh; Bhullar, Jaskaran

Obtaining quantum maximum distance separable (MDS) codes from dual-containing classical constacyclic codes using the Hermitian construction has paved a path to undertaking the challenges related to such constructions. Using the same technique, some new parameters of quantum MDS codes are constructed here. One set of parameters obtained in this paper achieves a much larger distance than earlier work. The remaining constructed parameters of quantum MDS codes have large minimum distance and had not been explored previously.

  1. Convolutional coding techniques for data protection

    NASA Technical Reports Server (NTRS)

    Massey, J. L.

    1975-01-01

    Results of research on the use of convolutional codes in data communications are presented. Convolutional coding fundamentals are discussed along with modulation and coding interaction. Concatenated coding systems and data compression with convolutional codes are described.

  2. Combinatorial neural codes from a mathematical coding theory perspective.

    PubMed

    Curto, Carina; Itskov, Vladimir; Morrison, Katherine; Roth, Zachary; Walker, Judy L

    2013-07-01

    Shannon's seminal 1948 work gave rise to two distinct areas of research: information theory and mathematical coding theory. While information theory has had a strong influence on theoretical neuroscience, ideas from mathematical coding theory have received considerably less attention. Here we take a new look at combinatorial neural codes from a mathematical coding theory perspective, examining the error correction capabilities of familiar receptive field codes (RF codes). We find, perhaps surprisingly, that the high levels of redundancy present in these codes do not support accurate error correction, although the error-correcting performance of receptive field codes catches up to that of random comparison codes when a small tolerance to error is introduced. However, receptive field codes are good at reflecting distances between represented stimuli, while the random comparison codes are not. We suggest that a compromise in error-correcting capability may be a necessary price to pay for a neural code whose structure serves not only error correction, but must also reflect relationships between stimuli.
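The "error correction" the abstract measures is the classical nearest-codeword notion. As a minimal illustrative sketch (not the authors' code; the toy codebook below is invented for the example), maximum-likelihood decoding of a binary code searches for the codeword at minimum Hamming distance from the received word:

```python
# Illustrative sketch: nearest-codeword (maximum-likelihood) decoding of a
# binary code under Hamming distance -- the error-correction notion the
# abstract applies to receptive field codes. The toy codebook is hypothetical.

def hamming(a, b):
    """Hamming distance between two equal-length binary tuples."""
    return sum(x != y for x, y in zip(a, b))

def decode(word, codebook):
    """Map a (possibly corrupted) word to the nearest codeword."""
    return min(codebook, key=lambda c: hamming(word, c))

# A toy 4-word code of length 5 with minimum distance 3, so it corrects
# any single bit flip (a code corrects e errors iff d_min > 2e).
codebook = [(0, 0, 0, 0, 0), (1, 1, 1, 0, 0), (0, 0, 1, 1, 1), (1, 1, 0, 1, 1)]
received = (1, 0, 1, 0, 0)            # (1,1,1,0,0) with one bit flipped
print(decode(received, codebook))     # -> (1, 1, 1, 0, 0)
```

The redundancy argument in the abstract is about how evenly codewords are spread: RF codes cluster codewords by stimulus similarity, which helps represent distances but wastes Hamming distance for correction.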

  3. New quantum MDS-convolutional codes derived from constacyclic codes

    NASA Astrophysics Data System (ADS)

    Li, Fengwei; Yue, Qin

    2015-12-01

    In this paper, we utilize a family of Hermitian dual-containing constacyclic codes to construct classical and quantum MDS convolutional codes. Our classical and quantum convolutional codes are optimal in the sense that they attain the classical (quantum) generalized Singleton bound.
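For reference, the optimality criterion invoked here is the standard quantum Singleton bound (general background, not a result of this paper):

```latex
% Quantum Singleton bound: an [[n,k,d]]_q code satisfies
k \le n - 2(d - 1),
% and a code attaining equality, i.e. d = (n-k)/2 + 1, is called quantum MDS.
```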

  4. A class of constacyclic BCH codes and new quantum codes

    NASA Astrophysics Data System (ADS)

Liu, Yang; Li, Ruihu; Lv, Liangdong; Ma, Yuena

    2017-03-01

Constacyclic BCH codes have been widely studied in the literature and have been used to construct quantum codes in recent years. However, for the class of quantum codes of length n=q^{2m}+1 over F_{q^2} with q an odd prime power, only those of distance δ ≤ 2q^2 have been obtained in the literature. In this paper, by a detailed analysis of the properties of q^2-ary cyclotomic cosets, the maximum designed distance δ_{max} of a class of Hermitian dual-containing constacyclic BCH codes with length n=q^{2m}+1 is determined; this class of constacyclic codes has characteristics analogous to those of primitive BCH codes over F_{q^2}. We then obtain a sequence of dual-containing constacyclic codes of designed distances 2q^2 < δ ≤ δ_{max}. Consequently, new quantum codes with distance d > 2q^2 can be constructed from these dual-containing codes via the Hermitian construction. These newly obtained quantum codes have better code rates than those constructed from primitive BCH codes.

  5. Dopamine neurons code subjective sensory experience and uncertainty of perceptual decisions

    PubMed Central

    de Lafuente, Victor; Romo, Ranulfo

    2011-01-01

    Midbrain dopamine (DA) neurons respond to sensory stimuli associated with future rewards. When reward is delivered probabilistically, DA neurons reflect this uncertainty by increasing their firing rates in a period between the sensory cue and reward delivery time. Probability of reward, however, has been externally conveyed by visual cues, and it is not known whether DA neurons would signal uncertainty arising internally. Here we show that DA neurons code the uncertainty associated with a perceptual judgment about the presence or absence of a vibrotactile stimulus. We observed that uncertainty modulates the activity elicited by a go cue instructing monkey subjects to communicate their decisions. That is, the same go cue generates different DA responses depending on the uncertainty level of a judgment made a few seconds before the go instruction. Easily detected suprathreshold stimuli elicit small DA responses, indicating that future reward will not be a surprising event. In contrast, the absence of a sensory stimulus generates large DA responses associated with uncertainty: was the stimulus truly absent, or did a low-amplitude vibration go undetected? In addition, the responses of DA neurons to the stimulus itself increase with vibration amplitude, but only when monkeys correctly detect its presence. This finding suggests that DA activity is not related to actual intensity but rather to perceived intensity. Therefore, in addition to their well-known role in reward prediction, DA neurons code subjective sensory experience and uncertainty arising internally from perceptual decisions. PMID:22106310

  6. Summary of 1990 Code Conference

    SciTech Connect

    Cooper, R.K.; Chan, Kwok-Chi D.

    1990-01-01

The Conference on Codes and the Linear Accelerator Community was held in Los Alamos in January 1990 and had approximately 100 participants. This conference was the second in a series whose goal is the exchange of information about codes and code practices among those writing and actually using these codes for the design and analysis of linear accelerators and their components. The first conference was held in San Diego in January 1988 and concentrated on beam dynamics codes and Maxwell solvers. This most recent conference concentrated on 3-D codes and techniques to handle the large amounts of data required for three-dimensional problems. In addition to descriptions of codes, their algorithms, and implementations, there were a number of papers describing the use of many of the codes. Proceedings of both conferences are available. 3 refs., 2 tabs.

  7. Chemical Laser Computer Code Survey,

    DTIC Science & Technology

    1980-12-01

[Garbled scan of a requirements table; recoverable entries:] Documentation: Resonator Geometry Synthesis Code Requirement (V. L. Gamiz); Incorporate General Resonator into Ray Trace Code (W. H. Southwell); Synthesis Code Development (L. R. Stidham); Trace Optimization Algorithms and Equations (W. H. Southwell). The remaining text is table residue listing categories (optics, kinetics, gas dynamics) and resonator models (simple Fabry-Perot, simple saturated gain).

  8. Energy Codes and Standards: Facilities

    SciTech Connect

    Bartlett, Rosemarie; Halverson, Mark A.; Shankle, Diana L.

    2007-01-01

    Energy codes and standards play a vital role in the marketplace by setting minimum requirements for energy-efficient design and construction. They outline uniform requirements for new buildings as well as additions and renovations. This article covers basic knowledge of codes and standards; development processes of each; adoption, implementation, and enforcement of energy codes and standards; and voluntary energy efficiency programs.

  9. Coding Issues in Grounded Theory

    ERIC Educational Resources Information Center

    Moghaddam, Alireza

    2006-01-01

    This paper discusses grounded theory as one of the qualitative research designs. It describes how grounded theory generates from data. Three phases of grounded theory--open coding, axial coding, and selective coding--are discussed, along with some of the issues which are the source of debate among grounded theorists, especially between its…

  10. IRIG Serial Time Code Formats

    DTIC Science & Technology

    2016-08-01

Telecommunications and Timing Group, IRIG Standard 200-16: IRIG Serial Time Code Formats (RCC 200-16, August 2016). Distribution A: approved for public release. [Remainder is title-page and table-of-contents residue naming participating organizations, including the Arnold Engineering Development Complex and the National Aeronautics and Space Administration.]

  11. Coding Major Fields of Study.

    ERIC Educational Resources Information Center

    Bobbitt, L. G.; Carroll, C. D.

The National Center for Education Statistics conducts surveys which require the coding of the respondent's major field of study. This paper presents a new system for the coding of major field of study. It operates on-line in a Computer Assisted Telephone Interview (CATI) environment and allows conversational checks to verify coding directly from…

  12. Improved code-tracking loop

    NASA Technical Reports Server (NTRS)

    Laflame, D. T.

    1980-01-01

Delay-locked loop tracks pseudonoise codes without introducing dc timing errors, because it is not sensitive to gain imbalance between signal-processing arms. "Early" and "late" reference codes pass in combined form through both arms, and each arm acts on both codes. The circuit accommodates 1 dB weaker input signals with tracking ability equal to that of tau-dither loops.
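The early/late principle behind such code-tracking loops can be sketched numerically (an illustrative model under simple assumptions — rectangular chips, integer-sample delays, a random ±1 code — not the circuit in the abstract):

```python
import random

# Illustrative sketch of the early/late principle behind a delay-locked
# code-tracking loop: correlate the received pseudonoise (PN) code against
# early and late local replicas; the sign of the difference tells the loop
# which way to slew its delay estimate.

random.seed(1)
OSR = 4                                        # samples per chip
chips = [random.choice([-1, 1]) for _ in range(127)]
ref = [c for c in chips for _ in range(OSR)]   # rectangular chips, oversampled

TRUE_DELAY = 10                                # samples
rx = [0] * TRUE_DELAY + ref                    # received PN code, delayed

def corr(tau):
    """Correlate the local replica shifted by tau samples against rx."""
    return sum(a * b for a, b in zip([0] * tau + ref, rx))

def discriminator(tau_hat, d=2):
    """Early minus late correlation; negative means increase the estimate."""
    return corr(tau_hat - d) - corr(tau_hat + d)

print(discriminator(8), discriminator(12))     # signs bracket the true delay
```

Because the autocorrelation of the oversampled code is triangular around the true delay, the discriminator is negative when the estimate lags the true delay and positive when it leads, which is the error signal a tau-dither or delay-locked loop drives to zero.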

  13. Validation of the BEPLATE code

    SciTech Connect

    Giles, G.E.; Bullock, J.S.

    1997-11-01

    The electroforming simulation code BEPLATE (Boundary Element-PLATE) has been developed and validated for specific applications at Oak Ridge. New areas of application are opening up and more validations are being performed. This paper reports the validation experience of the BEPLATE code on two types of electroforms and describes some recent applications of the code.

  14. Authorship Attribution of Source Code

    ERIC Educational Resources Information Center

    Tennyson, Matthew F.

    2013-01-01

    Authorship attribution of source code is the task of deciding who wrote a program, given its source code. Applications include software forensics, plagiarism detection, and determining software ownership. A number of methods for the authorship attribution of source code have been presented in the past. A review of those existing methods is…

  15. Ptolemy Coding Style

    DTIC Science & Technology

    2014-09-05

[Report-documentation-page residue (dates, contract/grant/program element number fields) precedes the excerpt; recoverable content:] The Ptolemy coding style document describes a lisp module for GNU Emacs that has appropriate indenting rules; this file works well with Emacs under both Unix and Windows. testsuite/ptspell is a … Unix. The license is much more liberal than the commonly used "GPL" or "GNU Public License," which encumbers the software and derivative works with…

  16. Structured error recovery for code-word-stabilized quantum codes

    SciTech Connect

    Li Yunfan; Dumer, Ilya; Grassl, Markus; Pryadko, Leonid P.

    2010-05-15

Code-word-stabilized (CWS) codes are, in general, nonadditive quantum codes that can correct errors by an exhaustive search of different error patterns, similar to the way that we decode classical nonlinear codes. For an n-qubit quantum code correcting errors on up to t qubits, this brute-force approach consecutively tests different errors of weight t or less and employs a separate n-qubit measurement in each test. In this article, we suggest an error grouping technique that allows one to simultaneously test large groups of errors in a single measurement. This structured error recovery technique exponentially reduces the number of measurements by about 3^t times. While it still leaves exponentially many measurements for a generic CWS code, the technique is equivalent to syndrome-based recovery for the special case of additive CWS codes.

  17. Structured error recovery for code-word-stabilized quantum codes

    NASA Astrophysics Data System (ADS)

    Li, Yunfan; Dumer, Ilya; Grassl, Markus; Pryadko, Leonid P.

    2010-05-01

Code-word-stabilized (CWS) codes are, in general, nonadditive quantum codes that can correct errors by an exhaustive search of different error patterns, similar to the way that we decode classical nonlinear codes. For an n-qubit quantum code correcting errors on up to t qubits, this brute-force approach consecutively tests different errors of weight t or less and employs a separate n-qubit measurement in each test. In this article, we suggest an error grouping technique that allows one to simultaneously test large groups of errors in a single measurement. This structured error recovery technique exponentially reduces the number of measurements by about 3^t times. While it still leaves exponentially many measurements for a generic CWS code, the technique is equivalent to syndrome-based recovery for the special case of additive CWS codes.

  18. Low Density Parity Check Codes: Bandwidth Efficient Channel Coding

    NASA Technical Reports Server (NTRS)

    Fong, Wai; Lin, Shu; Maki, Gary; Yeh, Pen-Shu

    2003-01-01

Low Density Parity Check (LDPC) codes provide near-Shannon-capacity performance for NASA missions. These codes have high coding rates, R=0.82 and 0.875, with moderate code lengths, n=4096 and 8176. Their decoders have inherently parallel structures, which allows for high-speed implementation. Two codes based on Euclidean Geometry (EG) were selected for flight ASIC implementation. These codes are cyclic and quasi-cyclic in nature and therefore have a simple encoder structure, which yields power and size benefits. These codes also have a large minimum distance, as much as d_min = 65, giving them powerful error-correcting capabilities and very low BER error floors. This paper presents the development of the LDPC flight encoder and decoder, their applications, and status.
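The parallelism the abstract attributes to LDPC decoders comes from the sparse parity-check matrix: each check and each bit can be updated independently. As a hedged sketch (NASA's flight decoder is a soft-decision design; the hard-decision bit-flipping rule and the small (7,4) Hamming check matrix below are stand-ins for illustration only):

```python
# Minimal sketch: hard-decision bit-flipping decoding with a parity-check
# matrix H. Real LDPC decoders use soft-decision message passing over a much
# sparser H; this toy shows why per-check/per-bit updates parallelize well.

H = [
    [1, 1, 1, 0, 1, 0, 0],
    [1, 1, 0, 1, 0, 1, 0],
    [1, 0, 1, 1, 0, 0, 1],
]

def syndrome(H, word):
    """One parity bit per check row (0 = satisfied)."""
    return [sum(h * c for h, c in zip(row, word)) % 2 for row in H]

def bit_flip_decode(H, word, max_iters=10):
    word = list(word)
    for _ in range(max_iters):
        s = syndrome(H, word)
        if not any(s):
            return word                   # all checks satisfied
        # flip the bit involved in the most unsatisfied checks
        votes = [sum(s[i] for i in range(len(H)) if H[i][j])
                 for j in range(len(word))]
        word[votes.index(max(votes))] ^= 1
    return word

received = [1, 0, 0, 0, 0, 0, 0]          # all-zero codeword, bit 0 flipped
print(bit_flip_decode(H, received))       # -> [0, 0, 0, 0, 0, 0, 0]
```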

  19. New quantum codes constructed from quaternary BCH codes

    NASA Astrophysics Data System (ADS)

    Xu, Gen; Li, Ruihu; Guo, Luobin; Ma, Yuena

    2016-10-01

In this paper, we first study the construction of new quantum error-correcting codes (QECCs) from three classes of quaternary imprimitive BCH codes. As a result, the improved maximal designed distance of these narrow-sense imprimitive Hermitian dual-containing quaternary BCH codes is determined to be much larger than the result given by Aly et al. (IEEE Trans Inf Theory 53:1183-1188, 2007) for each code length. Thus, families of new QECCs are obtained, and the constructed QECCs have larger distance than those in the previous literature. Second, we apply a combinatorial construction to the imprimitive BCH codes and their primitive counterparts and construct many new linear quantum codes with good parameters, some of which have parameters exceeding the finite Gilbert-Varshamov bound for linear quantum codes.

  20. Quantum Codes From Cyclic Codes Over The Ring R2

    NASA Astrophysics Data System (ADS)

    Altinel, Alev; Güzeltepe, Murat

    2016-10-01

Let R_2 denote the ring F_2 + μF_2 + υF_2 + μυF_2 + wF_2 + μwF_2 + υwF_2 + μυwF_2. In this study, we construct quantum codes from cyclic codes over the ring R_2, for arbitrary length n, with the restrictions μ^2 = 0, υ^2 = 0, w^2 = 0, μυ = υμ, μw = wμ, υw = wυ and μ(υw) = (μυ)w. We also give a necessary and sufficient condition for a cyclic code over R_2 to contain its dual. Finally, we obtain the parameters of quantum error-correcting codes from cyclic codes over R_2 and give an example of quantum error-correcting codes from cyclic codes over R_2.

  1. Particle Size Control for PIV Seeding Using Dry Ice

    DTIC Science & Technology

    2010-03-01

…in flight actually being carried out, the observations, drawings and notes of Leonardo da Vinci showed an analytical process to develop a way for… theoretical particle response: dv_p/dt = −C(v_p − U), where C = 18µ/(ρ_p d_p²). … Bibliography: 1. Linscott, R. N. and Da Vinci, L., The Notebooks of Leonardo Da Vinci

  2. Measuring Diagnoses: ICD Code Accuracy

    PubMed Central

    O'Malley, Kimberly J; Cook, Karon F; Price, Matt D; Wildes, Kimberly Raiford; Hurdle, John F; Ashton, Carol M

    2005-01-01

Objective To examine potential sources of errors at each step of the described inpatient International Classification of Diseases (ICD) coding process. Data Sources/Study Setting The use of disease codes from the ICD has expanded from classifying morbidity and mortality information for statistical purposes to diverse sets of applications in research, health care policy, and health care finance. By describing a brief history of ICD coding, detailing the process for assigning codes, identifying where errors can be introduced into the process, and reviewing methods for examining code accuracy, we help code users more systematically evaluate code accuracy for their particular applications. Study Design/Methods We summarize the inpatient ICD diagnostic coding process from patient admission to diagnostic code assignment. We examine potential sources of errors at each step and offer code users a tool for systematically evaluating code accuracy. Principal Findings Main error sources along the “patient trajectory” include amount and quality of information at admission, communication among patients and providers, the clinician's knowledge and experience with the illness, and the clinician's attention to detail. Main error sources along the “paper trail” include variance in the electronic and written records, coder training and experience, facility quality-control efforts, and unintentional and intentional coder errors, such as misspecification, unbundling, and upcoding. Conclusions By clearly specifying the code assignment process and heightening their awareness of potential error sources, code users can better evaluate the applicability and limitations of codes for their particular situations. ICD codes can then be used in the most appropriate ways. PMID:16178999

  3. Genetic code for sine

    NASA Astrophysics Data System (ADS)

    Abdullah, Alyasa Gan; Wah, Yap Bee

    2015-02-01

The computation of approximate values of the trigonometric sines was discovered by Bhaskara I (c. 600-c. 680), a seventh-century Indian mathematician, and is known as Bhaskara I's sine approximation formula. The formula is given in his treatise titled Mahabhaskariya. In the 14th century, Madhava of Sangamagrama, a Kerala mathematician-astronomer, constructed a table of trigonometric sines of various angles. Madhava's table gives the measure of angles in arcminutes, arcseconds and sixtieths of an arcsecond. The search for more accurate formulas led to the discovery of the power series expansion by Madhava of Sangamagrama (c. 1350-c. 1425), the founder of the Kerala school of astronomy and mathematics. In 1715, the Taylor series was introduced by Brook Taylor, an English mathematician. If the Taylor series is centered at zero, it is called a Maclaurin series, named after the Scottish mathematician Colin Maclaurin. Some of the important Maclaurin series expansions include trigonometric functions. This paper introduces the genetic code of the sine of an angle without using power series expansion. The genetic code using a square root approach reveals the pattern in the signs (plus, minus) and sequence of numbers in the sine of an angle. The square root approach complements the Pythagoras method, provides a better understanding of calculating an angle and will be useful for teaching the concepts of angles in trigonometry.
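Bhaskara I's approximation mentioned above has the well-known closed form sin x ≈ 16x(π − x) / (5π² − 4x(π − x)) on [0, π], which is easy to check numerically (this sketch only verifies that classical formula; it is not the paper's square-root "genetic code" construction):

```python
import math

# Bhaskara I's seventh-century sine approximation:
#   sin x ≈ 16x(π − x) / (5π² − 4x(π − x))   for 0 ≤ x ≤ π.
# Compared here against math.sin over a fine grid.

def bhaskara_sin(x):
    t = x * (math.pi - x)
    return 16 * t / (5 * math.pi ** 2 - 4 * t)

worst = max(abs(bhaskara_sin(k * math.pi / 1000) - math.sin(k * math.pi / 1000))
            for k in range(1001))
print(f"max error on [0, pi]: {worst:.5f}")   # known to stay below 0.002
```

The formula is exact at 0, π/2, and π, which is why its worst-case error is so small despite being a simple rational function.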

  4. FAST GYROSYNCHROTRON CODES

    SciTech Connect

    Fleishman, Gregory D.; Kuznetsov, Alexey A.

    2010-10-01

    Radiation produced by charged particles gyrating in a magnetic field is highly significant in the astrophysics context. Persistently increasing resolution of astrophysical observations calls for corresponding three-dimensional modeling of the radiation. However, available exact equations are prohibitively slow in computing a comprehensive table of high-resolution models required for many practical applications. To remedy this situation, we develop approximate gyrosynchrotron (GS) codes capable of quickly calculating the GS emission (in non-quantum regime) from both isotropic and anisotropic electron distributions in non-relativistic, mildly relativistic, and ultrarelativistic energy domains applicable throughout a broad range of source parameters including dense or tenuous plasmas and weak or strong magnetic fields. The computation time is reduced by several orders of magnitude compared with the exact GS algorithm. The new algorithm performance can gradually be adjusted to the user's needs depending on whether precision or computation speed is to be optimized for a given model. The codes are made available for users as a supplement to this paper.

  5. Determinate-state convolutional codes

    NASA Technical Reports Server (NTRS)

    Collins, O.; Hizlan, M.

    1991-01-01

A determinate-state convolutional code is formed from a conventional convolutional code by pruning away some of the possible state transitions in the decoding trellis. The type of staged power transfer used in determinate-state convolutional codes proves to be an extremely efficient way of enhancing the performance of a concatenated coding system. The decoder complexity is analyzed along with the free distances of these new codes, and extensive simulation results are provided for their performance at the low signal-to-noise ratios where a real communication system would operate. Concise, practical examples are provided.
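The "conventional convolutional code" these codes start from can be sketched minimally (an assumed textbook example, the rate-1/2 constraint-length-3 code with octal generators 7 and 5; the particular codes analyzed in the paper are not shown here):

```python
# Minimal sketch of a conventional rate-1/2, constraint-length-3 convolutional
# encoder (generators 7 and 5 octal). A determinate-state code would start
# from an encoder like this and prune state transitions from its trellis.

def conv_encode(bits, g1=0b111, g2=0b101):
    state = 0                        # two most recent input bits
    out = []
    for b in bits:
        reg = (b << 2) | state       # current bit plus two-bit state
        out += [bin(reg & g1).count("1") % 2,   # parity under generator 1
                bin(reg & g2).count("1") % 2]   # parity under generator 2
        state = reg >> 1             # shift: newest bit enters the state
    return out

print(conv_encode([1, 0, 1, 1]))     # -> [1, 1, 1, 0, 0, 0, 0, 1]
```

Each input bit produces two output bits, and the decoder's trellis has one state per register contents; pruning transitions fixes some state bits deterministically, which is the mechanism the abstract describes.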

  6. Circular codes, symmetries and transformations.

    PubMed

    Fimmel, Elena; Giannerini, Simone; Gonzalez, Diego Luis; Strüngmann, Lutz

    2015-06-01

Circular codes, putative remnants of primeval comma-free codes, have gained considerable attention in recent years. In fact they represent a second kind of genetic code potentially involved in detecting and maintaining the normal reading frame in protein coding sequences. The discovery of a universal code across species suggested many theoretical and experimental questions. However, there is a key aspect that relates circular codes to symmetries and transformations that remains to a large extent unexplored. In this article we aim to address this issue by studying the symmetries and transformations that connect different circular codes. The main result is that the class of 216 C3 maximal self-complementary codes can be partitioned into 27 equivalence classes defined by a particular set of transformations. We show that such transformations can be put in a group-theoretic framework with an intuitive geometric interpretation. More general mathematical results about symmetry transformations which are valid for any kind of circular code are also presented. Our results pave the way to the study of the biological consequences of the mathematical structure behind circular codes and contribute to shedding light on the evolutionary steps that led to the observed symmetries of present codes.
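One published criterion for circularity can be sketched in code. This is a hedged illustration of the graph approach of Fimmel, Michel and Strüngmann (2016) — not necessarily the machinery used in this article: a trinucleotide code is circular if and only if its representing graph, whose vertices are 1- and 2-letter words and whose edges split each codeword b1b2b3 into b1 → b2b3 and b1b2 → b3, is acyclic.

```python
# Hedged sketch of the graph criterion for circular trinucleotide codes
# (Fimmel/Michel/Strüngmann 2016, to the best of our reading): build the
# representing graph and test it for cycles by depth-first search.

def representing_graph(code):
    edges = {}
    for w in code:                       # each codeword b1 b2 b3 ...
        edges.setdefault(w[0], set()).add(w[1:])   # b1 -> b2b3
        edges.setdefault(w[:2], set()).add(w[2])   # b1b2 -> b3
    return edges

def has_cycle(edges):
    WHITE, GREY, BLACK = 0, 1, 2         # unvisited / on stack / done
    color = {}
    def visit(v):
        color[v] = GREY
        for u in edges.get(v, ()):
            if color.get(u, WHITE) == GREY or \
               (color.get(u, WHITE) == WHITE and visit(u)):
                return True
        color[v] = BLACK
        return False
    return any(visit(v) for v in list(edges) if color.get(v, WHITE) == WHITE)

def is_circular(code):
    return not has_cycle(representing_graph(code))

# {AAA} is ambiguous on the circle (A -> AA -> A is a cycle); {ACG} is not.
print(is_circular({"ACG"}), is_circular({"AAA"}))   # -> True False
```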

  7. Making your code citable with the Astrophysics Source Code Library

    NASA Astrophysics Data System (ADS)

    Allen, Alice; DuPrie, Kimberly; Schmidt, Judy; Berriman, G. Bruce; Hanisch, Robert J.; Mink, Jessica D.; Nemiroff, Robert J.; Shamir, Lior; Shortridge, Keith; Taylor, Mark B.; Teuben, Peter J.; Wallin, John F.

    2016-01-01

    The Astrophysics Source Code Library (ASCL, ascl.net) is a free online registry of codes used in astronomy research. With nearly 1,200 codes, it is the largest indexed resource for astronomy codes in existence. Established in 1999, it offers software authors a path to citation of their research codes even without publication of a paper describing the software, and offers scientists a way to find codes used in refereed publications, thus improving the transparency of the research. It also provides a method to quantify the impact of source codes in a fashion similar to the science metrics of journal articles. Citations using ASCL IDs are accepted by major astronomy journals and if formatted properly are tracked by ADS and other indexing services. The number of citations to ASCL entries increased sharply from 110 citations in January 2014 to 456 citations in September 2015. The percentage of code entries in ASCL that were cited at least once rose from 7.5% in January 2014 to 17.4% in September 2015. The ASCL's mid-2014 infrastructure upgrade added an easy entry submission form, more flexible browsing, search capabilities, and an RSS feeder for updates. A Changes/Additions form added this past fall lets authors submit links for papers that use their codes for addition to the ASCL entry even if those papers don't formally cite the codes, thus increasing the transparency of that research and capturing the value of their software to the community.

  8. Practices in Code Discoverability: Astrophysics Source Code Library

    NASA Astrophysics Data System (ADS)

    Allen, A.; Teuben, P.; Nemiroff, R. J.; Shamir, L.

    2012-09-01

    Here we describe the Astrophysics Source Code Library (ASCL), which takes an active approach to sharing astrophysics source code. ASCL's editor seeks out both new and old peer-reviewed papers that describe methods or experiments that involve the development or use of source code, and adds entries for the found codes to the library. This approach ensures that source codes are added without requiring authors to actively submit them, resulting in a comprehensive listing that covers a significant number of the astrophysics source codes used in peer-reviewed studies. The ASCL now has over 340 codes in it and continues to grow. In 2011, the ASCL has on average added 19 codes per month. An advisory committee has been established to provide input and guide the development and expansion of the new site, and a marketing plan has been developed and is being executed. All ASCL source codes have been used to generate results published in or submitted to a refereed journal and are freely available either via a download site or from an identified source. This paper provides the history and description of the ASCL. It lists the requirements for including codes, examines the advantages of the ASCL, and outlines some of its future plans.

  9. Electromagnetic particle simulation codes

    NASA Technical Reports Server (NTRS)

    Pritchett, P. L.

    1985-01-01

    Electromagnetic particle simulations solve the full set of Maxwell's equations. They thus include the effects of self-consistent electric and magnetic fields, magnetic induction, and electromagnetic radiation. The algorithms for an electromagnetic code which works directly with the electric and magnetic fields are described. The fields and current are separated into transverse and longitudinal components. The transverse E and B fields are integrated in time using a leapfrog scheme applied to the Fourier components. The particle pushing is performed via the relativistic Lorentz force equation for the particle momentum. As an example, simulation results are presented for the electron cyclotron maser instability which illustrate the importance of relativistic effects on the wave-particle resonance condition and on wave dispersion.
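The particle push described above — integrating the relativistic Lorentz force equation for the momentum — is commonly implemented with the Boris scheme. The sketch below is that standard scheme in normalized units (q = m = c = 1), offered as an illustration; the abstract does not state which pusher the code uses:

```python
import math

# Standard relativistic Boris push: half electric kick, magnetic rotation,
# half electric kick. The rotation step preserves |p| exactly when E = 0.

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def boris_push(p, E, B, dt):
    """Advance the momentum p one time step under fields E and B."""
    # half electric kick
    pm = tuple(pi + 0.5 * dt * Ei for pi, Ei in zip(p, E))
    gamma = math.sqrt(1.0 + sum(x * x for x in pm))
    # magnetic rotation about B
    t = tuple(0.5 * dt * Bi / gamma for Bi in B)
    s_fac = 2.0 / (1.0 + sum(x * x for x in t))
    pp = tuple(a + b for a, b in zip(pm, cross(pm, t)))
    pr = tuple(a + s_fac * b for a, b in zip(pm, cross(pp, t)))
    # second half electric kick
    return tuple(pi + 0.5 * dt * Ei for pi, Ei in zip(pr, E))

# Gyration in a pure magnetic field: |p| is conserved to round-off.
p = (1.0, 0.0, 0.0)
for _ in range(1000):
    p = boris_push(p, E=(0.0, 0.0, 0.0), B=(0.0, 0.0, 1.0), dt=0.1)
print(math.sqrt(sum(x * x for x in p)))   # stays 1.0 to round-off
```

The exact energy conservation of the rotation step is why Boris-type pushers matter for resonance problems like the cyclotron maser instability, where small systematic energy drifts would corrupt the wave-particle resonance condition.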

  10. Telescope Adaptive Optics Code

    SciTech Connect

    Phillion, D.

    2005-07-28

The Telescope AO Code has general adaptive optics capabilities plus specialized models for three telescopes with either adaptive optics or active optics systems. It has the capability to generate either single-layer or distributed Kolmogorov turbulence phase screens using the FFT. Missing low-order spatial frequencies are added using the Karhunen-Loeve expansion. The phase structure curve is extremely close to the theoretical. Secondly, it has the capability to simulate an adaptive optics control system. The default parameters are those of the Keck II adaptive optics system. Thirdly, it has a general wave-optics capability to model the science camera halo due to scintillation from atmospheric turbulence and the telescope optics. Although this capability was implemented for the Gemini telescopes, the only default parameter specific to the Gemini telescopes is the primary mirror diameter. Finally, it has a model for the LSST active optics alignment strategy. This last model is highly specific to the LSST.

  11. Code lock with microcircuit

    NASA Astrophysics Data System (ADS)

    Korobka, A.; May, I.

    1985-01-01

A code lock with a microcircuit was invented which contains only a very few components. Two DD-triggers control the state of two identical transistors. When both transistors are turned on simultaneously, the transistor VS1 is turned on so that the electromagnet YA1 pulls in the bolt and the door opens. This will happen only when a logic 1 appears at the inverted output of the first trigger and at the straight output of the second one. After the door is opened, a button on it resets the contactors to return both triggers to their original state. The electromagnet is designed to produce the necessary pull force and sufficient power under rectified 127 V line voltage, with the neutral wire of the lock circuit always connected to the - terminal of the power supply.

  12. Peripheral coding of taste

    PubMed Central

    Liman, Emily R.; Zhang, Yali V.; Montell, Craig

    2014-01-01

    Five canonical tastes, bitter, sweet, umami (amino acid), salty and sour (acid) are detected by animals as diverse as fruit flies and humans, consistent with a near universal drive to consume fundamental nutrients and to avoid toxins or other harmful compounds. Surprisingly, despite this strong conservation of basic taste qualities between vertebrates and invertebrates, the receptors and signaling mechanisms that mediate taste in each are highly divergent. The identification over the last two decades of receptors and other molecules that mediate taste has led to stunning advances in our understanding of the basic mechanisms of transduction and coding of information by the gustatory systems of vertebrates and invertebrates. In this review, we discuss recent advances in taste research, mainly from the fly and mammalian systems, and we highlight principles that are common across species, despite stark differences in receptor types. PMID:24607224

  13. Surface acoustic wave coding for orthogonal frequency coded devices

    NASA Technical Reports Server (NTRS)

    Malocha, Donald (Inventor); Kozlovski, Nikolai (Inventor)

    2011-01-01

Methods and systems for coding SAW OFC devices to mitigate code collisions in a wireless multi-tag system. Each device produces plural stepped frequencies as an OFC signal with a chip offset delay to increase code diversity. A method for assigning a different OFC to each device includes using a matrix based on the number of OFCs needed and the number of chips per code, populating each matrix cell with an OFC chip, and assigning the codes from the matrix to the devices. The asynchronous passive multi-tag system includes plural surface acoustic wave devices each producing a different OFC signal having the same number of chips and including a chip offset time delay, an algorithm for assigning OFCs to each device, and a transceiver to transmit an interrogation signal and receive OFC signals in response with minimal code collisions during transmission.

  14. Improved lossless intra coding for next generation video coding

    NASA Astrophysics Data System (ADS)

    Vanam, Rahul; He, Yuwen; Ye, Yan

    2016-09-01

    Recently, there have been efforts by the ITU-T VCEG and ISO/IEC MPEG to further improve the compression performance of the High Efficiency Video Coding (HEVC) standard for developing a potential next generation video coding standard. The exploratory codec software of this potential standard includes new coding tools for inter and intra coding. In this paper, we present a new intra prediction mode for lossless intra coding. Our new intra mode derives a prediction filter for each input pixel using its neighboring reconstructed pixels, and applies this filter to the nearest neighboring reconstructed pixels to generate a prediction pixel. The proposed intra mode is demonstrated to improve the performance of the exploratory software for lossless intra coding, yielding a maximum and average bitrate savings of 4.4% and 2.11%, respectively.
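
    The per-pixel filter derivation can be illustrated with a simplified stand-in: a two-tap (left, top) predictor fitted by least squares over causal context samples. The function, the two-tap choice, and the training format are our illustrative assumptions, not the codec's actual filter design:

```python
def fit_predictor(context):
    """Fit a two-tap (left, top) linear predictor by least squares:
    minimise sum((a*l + b*t - x)^2) over causal context samples.
    `context` is a list of ((left, top), actual_pixel) pairs."""
    s_ll = s_lt = s_tt = s_lx = s_tx = 0.0
    for (l, t), x in context:
        s_ll += l * l
        s_lt += l * t
        s_tt += t * t
        s_lx += l * x
        s_tx += t * x
    det = s_ll * s_tt - s_lt * s_lt  # normal-equation determinant
    a = (s_lx * s_tt - s_tx * s_lt) / det
    b = (s_tx * s_ll - s_lx * s_lt) / det
    return a, b

# Pixels generated as the average of their left and top neighbours.
context = [((2, 4), 3), ((4, 8), 6), ((6, 2), 4)]
a, b = fit_predictor(context)
```

    On this toy context the fit recovers the generating weights (0.5, 0.5); the actual mode derives a fresh filter for each pixel from its own reconstructed neighbourhood.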

  15. Transionospheric Propagation Code (TIPC)

    SciTech Connect

    Roussel-Dupre, R.; Kelley, T.A.

    1990-10-01

    The Transionospheric Propagation Code is a computer program developed at Los Alamos National Lab to perform certain tasks related to the detection of VHF signals following propagation through the ionosphere. The code is written in Fortran 77, runs interactively and was designed to be as machine independent as possible. A menu format, in which the user is prompted to supply appropriate parameters for a given task, has been adopted for the input, while the output is primarily in the form of graphics. The user has the option of selecting from five basic tasks, namely transionospheric propagation, signal filtering, signal processing, DTOA study, and DTOA uncertainty study. For the first task a specified signal is convolved against the impulse response function of the ionosphere to obtain the transionospheric signal. The user is given a choice of four analytic forms for the input pulse or of supplying a tabular form. The option of adding Gaussian-distributed white noise or spectral noise to the input signal is also provided. The deterministic ionosphere is characterized to first order in terms of a total electron content (TEC) along the propagation path. In addition, a scattering model parameterized in terms of a frequency coherence bandwidth is also available. In the second task, detection is simulated by convolving a given filter response against the transionospheric signal. The user is given a choice of a wideband filter or a narrowband Gaussian filter. It is also possible to input a filter response. The third task provides for quadrature detection, envelope detection, and three different techniques for time-tagging the arrival of the transionospheric signal at specified receivers. The latter algorithms can be used to determine a TEC and thus take out the effects of the ionosphere to first order. Task four allows the user to construct a table of delta-times-of-arrival (DTOAs) vs TECs for a specified pair of receivers.
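
    The first-order TEC characterization mentioned above implies the standard dispersive group-delay relation, delta_t = 40.3 * TEC / (c * f^2). The small sketch below uses that textbook formula to relate DTOAs to TEC; the function names are ours, and the TIPC code itself does far more than this:

```python
def iono_group_delay(tec, freq_hz, c=2.998e8):
    """First-order ionospheric group delay (seconds) for a signal at
    freq_hz (Hz) traversing a total electron content tec (electrons/m^2)."""
    return 40.3 * tec / (c * freq_hz ** 2)

def dtoa(tec_a, tec_b, freq_hz):
    """Delta-time-of-arrival between two propagation paths that differ
    only in their TEC."""
    return iono_group_delay(tec_a, freq_hz) - iono_group_delay(tec_b, freq_hz)

# A 10 TECU path difference (1 TECU = 1e16 el/m^2) observed at 100 MHz.
dt = dtoa(2e17, 1e17, 100e6)
```

    The 1/f^2 dependence is why a measured DTOA can be inverted for TEC and the ionosphere's first-order effect removed, as the abstract describes.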

  16. On multilevel block modulation codes

    NASA Technical Reports Server (NTRS)

    Kasami, Tadao; Takata, Toyoo; Fujiwara, Toru; Lin, Shu

    1991-01-01

    The multilevel (ML) technique for combining block coding and modulation is investigated. A general formulation is presented for ML modulation codes in terms of component codes with appropriate distance measures. A specific method for constructing ML block modulation codes (MLBMCs) with interdependency among component codes is proposed. Given an MLBMC C with no interdependency among the binary component codes, the proposed method gives an MLBMC C-prime that has the same rate as C, a minimum squared Euclidean distance not less than that of C, a trellis diagram with the same number of states as that of C, and a smaller number of nearest-neighbor codewords than that of C. Finally, a technique is presented for analyzing the error performance of MLBMCs for an additive white Gaussian noise channel based on soft-decision maximum-likelihood decoding.
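
    The distance behavior of a multilevel construction is often summarized by the classic lower bound d_E^2 >= min_i (d_i * delta_i^2), where d_i is the minimum Hamming distance of the level-i component code and delta_i^2 the intra-subset squared Euclidean distance at partition level i. This is a textbook bound, not a result claimed by this paper, and the numbers below (8-PSK-like partition distances, toy component codes) are purely illustrative:

```python
def multilevel_min_sqdist(hamming_dists, subset_sqdists):
    """Classic lower bound on the minimum squared Euclidean distance of a
    multilevel modulation code: min over levels i of d_i * delta_i^2."""
    return min(d * s for d, s in zip(hamming_dists, subset_sqdists))

# Assumed 8-PSK set-partitioning distances (unit-energy constellation)
# paired with illustrative component-code Hamming distances.
bound = multilevel_min_sqdist([4, 2, 1], [0.586, 2.0, 4.0])
```

    The bound makes the design trade-off visible: weaker partition levels (small delta_i^2) need stronger component codes (large d_i) to avoid dominating the minimum.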

  17. QR code for medical information uses.

    PubMed

    Fontelo, Paul; Liu, Fang; Ducut, Erick G

    2008-11-06

    We developed QR code online tools, simulated and tested QR code applications for medical information uses including scanning QR code labels, URLs and authentication. Our results show possible applications for QR code in medicine.

  18. Code Speed Measuring for VC++

    DTIC Science & Technology

    2015-10-01

    Technical Report ARWSE-TR-14025 (AD-E403 688, unclassified), "Code Speed Measuring for VC++," by Tom Nealis. It is often important to know how fast a snippet of code executes. This information allows the coder to make important decisions…

  19. Explosive Formulation Code Naming SOP

    SciTech Connect

    Martz, H. E.

    2014-09-19

    The purpose of this SOP is to provide a procedure for giving individual HME formulations code names. A code name for an individual HME formulation consists of an explosive family code, given by the classified guide, followed by a dash, -, and a number. If the formulation requires preparation such as packing or aging, these add additional groups of symbols to the X-ray specimen name.

  20. Bar-Code-Scribing Tool

    NASA Technical Reports Server (NTRS)

    Badinger, Michael A.; Drouant, George J.

    1991-01-01

    Proposed hand-held tool applies indelible bar code to small parts. Possible to identify parts for management of inventory without tags or labels. Microprocessor supplies bar-code data to impact-printer-like device. Device drives replaceable scribe, which cuts bar code on surface of part. Used to mark serially controlled parts for military and aerospace equipment. Also adapts for discrete marking of bulk items used in food and pharmaceutical processing.

  1. Upgrades to NRLMOL code

    NASA Astrophysics Data System (ADS)

    Basurto, Luis

    This project consists of performing upgrades to the massively parallel NRLMOL electronic structure code in order to enhance its performance and flexibility by: a) utilizing dynamically allocated arrays; b) executing in parallel sections of the program that were previously executed serially; c) exploring simultaneous concurrent executions of the program through the use of an already existing MPI environment, thus enabling the simulation of larger systems than it is currently capable of handling. Also developed was a graphical user interface that allows less experienced users to start performing electronic structure calculations by aiding them in configuring the necessary input files and by providing graphical tools for displaying and analyzing results. Additionally, a computational toolkit that can avail of large supercomputers and make use of various levels of approximation for atomic interactions was developed to search for stable atomic clusters and predict novel stable endohedral fullerenes. As an application of the developed computational toolkit, a search was conducted for stable isomers of the Sc3N@C80 fullerene. In this search, about 1.2 million isomers of C80 were optimized in various charged states at the PM6 level. Subsequently, using the selected optimized isomers of C80 in various charged states, about 10,000 isomers of Sc3N@C80 were constructed, which were optimized using the semi-empirical PM6 quantum chemical method. A few selected lowest isomers of Sc3N@C80 were optimized at the DFT level. The calculation confirms the lowest three isomers previously reported in the literature, but four new isomers are found within the lowest ten isomers. Using the upgraded NRLMOL code, a study was done of the electronic structure of a multichromophoric molecular complex containing two each of borondipyrromethene dye, Zn-tetraphenyl-porphyrin, bisphenyl anthracene and a fullerene. A systematic examination of the effect of

  2. The FLUKA Code: an Overview

    SciTech Connect

    Ballarini, F.; Battistoni, G.; Campanella, M.; Carboni, M.; Cerutti, F.; Empl, A.; Fasso, A.; Ferrari, A.; Gadioli, E.; Garzelli, M.V.; Lantz, M.; Liotta, M.; Mairani, A.; Mostacci, A.; Muraro, S.; Ottolenghi, A.; Pelliccioni, M.; Pinsky, L.; Ranft, J.; Roesler, S.; Sala, P.R.; /Milan U. /INFN, Milan /Pavia U. /INFN, Pavia /CERN /Siegen U. /Houston U. /SLAC /Frascati /NASA, Houston /ENEA, Frascati

    2005-11-09

    FLUKA is a multipurpose Monte Carlo code which can transport a variety of particles over a wide energy range in complex geometries. The code is a joint project of INFN and CERN: part of its development is also supported by the University of Houston and NASA. FLUKA is successfully applied in several fields, including but not limited to particle physics, cosmic ray physics, dosimetry, radioprotection, hadron therapy, space radiation, accelerator design and neutronics. The code is the standard tool used at CERN for dosimetry, radioprotection and beam-machine interaction studies. Here we give a glimpse into the code physics models with particular emphasis on the hadronic and nuclear sector.

  3. High Order Modulation Protograph Codes

    NASA Technical Reports Server (NTRS)

    Nguyen, Thuy V. (Inventor); Nosratinia, Aria (Inventor); Divsalar, Dariush (Inventor)

    2014-01-01

    Digital communication coding methods for designing protograph-based bit-interleaved code modulation that is general and applies to any modulation. The general coding framework can support not only multiple rates but also adaptive modulation. The method is a two stage lifting approach. In the first stage, an original protograph is lifted to a slightly larger intermediate protograph. The intermediate protograph is then lifted via a circulant matrix to the expected codeword length to form a protograph-based low-density parity-check code.
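
    The circulant-lifting step of the second stage can be sketched generically: each 1 in the (intermediate) protograph's base matrix is replaced by a Z x Z cyclic-shift permutation matrix, and each 0 by a Z x Z zero block. The shift values below are arbitrary illustrations, not ones from the patent:

```python
def circulant(shift, Z):
    """Z x Z identity matrix with its rows cyclically shifted by `shift`."""
    return [[1 if (r + shift) % Z == c else 0 for c in range(Z)] for r in range(Z)]

def lift(base, shifts, Z):
    """Lift a binary base (proto)matrix by a factor Z: every 1 becomes a
    circulant permutation block with the given shift, every 0 a zero block."""
    H = []
    for i, row in enumerate(base):
        for r in range(Z):
            lifted_row = []
            for j, bit in enumerate(row):
                block = (circulant(shifts[i][j], Z) if bit
                         else [[0] * Z for _ in range(Z)])
                lifted_row.extend(block[r])
            H.append(lifted_row)
    return H

base = [[1, 1], [1, 0]]        # toy protograph adjacency matrix
shifts = [[0, 1], [2, 0]]      # arbitrary illustrative shift values
H = lift(base, shifts, 3)      # 6 x 6 lifted parity-check matrix
```

    Lifting preserves each node's degree (row and column weights scale exactly), which is why the lifted LDPC code inherits the protograph's threshold behavior.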

  4. Astrophysics Source Code Library Enhancements

    NASA Astrophysics Data System (ADS)

    Hanisch, R. J.; Allen, A.; Berriman, G. B.; DuPrie, K.; Mink, J.; Nemiroff, R. J.; Schmidt, J.; Shamir, L.; Shortridge, K.; Taylor, M.; Teuben, P. J.; Wallin, J.

    2015-09-01

    The Astrophysics Source Code Library (ASCL)1 is a free online registry of codes used in astronomy research; it currently contains over 900 codes and is indexed by ADS. The ASCL has recently moved a new infrastructure into production. The new site provides a true database for the code entries and integrates the WordPress news and information pages and the discussion forum into one site. Previous capabilities are retained and permalinks to ascl.net continue to work. This improvement offers more functionality and flexibility than the previous site, is easier to maintain, and offers new possibilities for collaboration. This paper covers these recent changes to the ASCL.

  5. The FLUKA Code: An Overview

    NASA Technical Reports Server (NTRS)

    Ballarini, F.; Battistoni, G.; Campanella, M.; Carboni, M.; Cerutti, F.; Empl, A.; Fasso, A.; Ferrari, A.; Gadioli, E.; Garzelli, M. V.; Lantz, M.; Liotta, M.; Mairani, A.; Mostacci, A.; Muraro, S.; Ottolenghi, A.; Pelliccioni, M.; Pinsky, L.; Ranft, J.; Roesler, S.; Sala, P. R.; Scannicchio, D.; Trovati, S.; Villari, R.; Wilson, T.

    2006-01-01

    FLUKA is a multipurpose Monte Carlo code which can transport a variety of particles over a wide energy range in complex geometries. The code is a joint project of INFN and CERN: part of its development is also supported by the University of Houston and NASA. FLUKA is successfully applied in several fields, including but not only, particle physics, cosmic ray physics, dosimetry, radioprotection, hadron therapy, space radiation, accelerator design and neutronics. The code is the standard tool used at CERN for dosimetry, radioprotection and beam-machine interaction studies. Here we give a glimpse into the code physics models with a particular emphasis to the hadronic and nuclear sector.

  6. Golay and other box codes

    NASA Technical Reports Server (NTRS)

    Solomon, G.

    1992-01-01

    The (24,12;8) extended Golay Code can be generated as a 6 x 4 binary matrix from the (15,11;3) BCH-Hamming Code, represented as a 5 x 3 matrix, by adding a row and a column, both of odd or even parity. The odd-parity case provides the additional 12th dimension. Furthermore, any three columns and five rows of the 6 x 4 Golay form a BCH-Hamming (15,11;3) Code. Similarly a (80,58;8) code can be generated as a 10 x 8 binary matrix from the (63,57;3) BCH-Hamming Code represented as a 9 x 7 matrix by adding a row and a column, both of odd or even parity. Furthermore, any seven columns along with the top nine rows is a BCH-Hamming (63,57;3) Code. A (80,40;16) 10 x 8 matrix binary code with weight structure identical to the extended (80,40;16) Quadratic Residue Code is generated from a (63,39;7) binary cyclic code represented as a 9 x 7 matrix, by adding a row and a column, both of odd or even parity.
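
    The row-and-column parity extension used repeatedly above can be sketched generically. For even parity, appending a parity bit to every row and then a parity row makes every row and column of the extended matrix even-weight; a 5 x 3 matrix becomes the 6 x 4 shape cited for the Golay construction. This shows only the parity-extension step, not the full code construction:

```python
def extend_with_parity(M):
    """Append an even-parity column and an even-parity row to a binary
    matrix, so every row and column of the result has even weight."""
    rows = [row + [sum(row) % 2] for row in M]          # per-row parity bit
    parity_row = [sum(col) % 2 for col in zip(*rows)]   # per-column parity bits
    return rows + [parity_row]

# A 5 x 3 matrix extends to the 6 x 4 shape of the Golay box construction.
M = [[1, 0, 1], [0, 1, 1], [1, 1, 1], [0, 0, 0], [1, 0, 0]]
E = extend_with_parity(M)
```

    The corner bit is consistent with both the parity column and the parity row, which is what allows the same cell to close off both constraints.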

  8. Implementation issues in source coding

    NASA Technical Reports Server (NTRS)

    Sayood, Khalid; Chen, Yun-Chung; Hadenfeldt, A. C.

    1989-01-01

    An edge preserving image coding scheme which can be operated in both a lossy and a lossless manner was developed. The technique is an extension of the lossless encoding algorithm developed for the Mars observer spectral data. It can also be viewed as a modification of the DPCM algorithm. A packet video simulator was also developed from an existing modified packet network simulator. The coding scheme for this system is a modification of the mixture block coding (MBC) scheme described in the last report. Coding algorithms for packet video were also investigated.
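
    The DPCM view mentioned above, in its simplest lossless form, predicts each sample by the previous reconstructed sample and transmits integer residuals; decoding accumulates them. This is a minimal generic sketch, not the Mars observer algorithm itself:

```python
def dpcm_encode(samples):
    """Lossless DPCM: predict each sample by its predecessor and emit
    the integer prediction residuals."""
    prev, residuals = 0, []
    for s in samples:
        residuals.append(s - prev)
        prev = s
    return residuals

def dpcm_decode(residuals):
    """Invert the encoder by accumulating the residuals."""
    prev, out = 0, []
    for r in residuals:
        prev += r
        out.append(prev)
    return out

samples = [100, 102, 101, 105]
residuals = dpcm_encode(samples)
```

    The residuals cluster near zero for smooth signals, which is what an entropy coder then exploits; edge-preserving variants change the predictor, not this encode/decode skeleton.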

  9. The KIDTALK Behavior and Language Code: Manual and Coding Protocol.

    ERIC Educational Resources Information Center

    Delaney, Elizabeth M.; Ezell, Sara S.; Solomon, Ned A.; Hancock, Terry B.; Kaiser, Ann P.

    Developed as part of the Milieu Language Teaching Project at the John F. Kennedy Center at Vanderbilt University in Nashville, Tennessee, this KIDTALK Behavior-Language Coding Protocol and manual measures behavior occurring during adult-child interactions. The manual is divided into 5 distinct sections: (1) the adult behavior codes describe…

  10. Patched Conic Trajectory Code

    NASA Technical Reports Server (NTRS)

    Park, Brooke Anderson; Wright, Henry

    2012-01-01

    PatCon code was developed to help mission designers run trade studies on launch and arrival times for any given planet. Initially developed in Fortran, the required inputs included launch date, arrival date, and other orbital parameters of the launch planet and arrival planets at the given dates. These parameters include the position of the planets, the eccentricity, semi-major axes, argument of periapsis, ascending node, and inclination of the planets. With these inputs, a patched conic approximation is used to determine the trajectory. The patched conic approximation divides the planetary mission into three parts: (1) the departure phase, in which the two relevant bodies are Earth and the spacecraft, and where the trajectory is a departure hyperbola with Earth at the focus; (2) the cruise phase, in which the two bodies are the Sun and the spacecraft, and where the trajectory is a transfer ellipse with the Sun at the focus; and (3) the arrival phase, in which the two bodies are the target planet and the spacecraft, where the trajectory is an arrival hyperbola with the planet as the focus.
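
    The cruise phase described above can be sketched with the vis-viva equation: for a Hohmann-like transfer between circular coplanar orbits, the transfer ellipse has a = (r1 + r2)/2, and the speed at radius r is sqrt(mu * (2/r - 1/a)). This is a generic textbook illustration of the patched conic cruise phase, not PatCon's actual implementation:

```python
import math

MU_SUN = 1.32712440018e20  # Sun's gravitational parameter, m^3/s^2
AU = 1.495978707e11        # astronomical unit, m

def transfer_orbit(r1, r2, mu=MU_SUN):
    """Cruise-phase transfer ellipse between circular orbits of radii r1
    and r2 (Hohmann case): semi-major axis and vis-viva speeds at the
    departure and arrival ends."""
    a = (r1 + r2) / 2.0
    v_dep = math.sqrt(mu * (2.0 / r1 - 1.0 / a))
    v_arr = math.sqrt(mu * (2.0 / r2 - 1.0 / a))
    return a, v_dep, v_arr

# Earth (1 AU) to Mars (1.524 AU), circular coplanar approximation.
a, v_dep, v_arr = transfer_orbit(1.0 * AU, 1.524 * AU)
```

    Differencing these heliocentric speeds against the planets' orbital speeds gives the hyperbolic excess velocities that seed the departure and arrival conic legs.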

  11. Civil Code, 11 December 1987.

    PubMed

    1988-01-01

    Article 162 of this Mexican Code provides, among other things, that "Every person has the right freely, responsibly, and in an informed fashion to determine the number and spacing of his or her children." When a marriage is involved, this right is to be observed by the spouses "in agreement with each other." The civil codes of the following states contain the same provisions: 1) Baja California (Art. 159 of the Civil Code of 28 April 1972 as revised in Decree No. 167 of 31 January 1974); 2) Morelos (Art. 255 of the Civil Code of 26 September 1949 as revised in Decree No. 135 of 29 December 1981); 3) Queretaro (Art. 162 of the Civil Code of 29 December 1950 as revised in the Act of 9 January 1981); 4) San Luis Potosi (Art. 147 of the Civil Code of 24 March 1946 as revised on 13 June 1978); 5) Sinaloa (Art. 162 of the Civil Code of 18 June 1940 as revised in Decree No. 28 of 14 October 1975); 6) Tamaulipas (Art. 146 of the Civil Code of 21 November 1960 as revised in Decree No. 20 of 30 April 1975); 7) Veracruz-Llave (Art. 98 of the Civil Code of 1 September 1932 as revised in the Act of 30 December 1975); and 8) Zacatecas (Art. 253 of the Civil Code of 9 February 1965 as revised in Decree No. 104 of 13 August 1975). The Civil Codes of Puebla and Tlaxcala provide for this right only in the context of marriage, with the spouses in agreement. See Art. 317 of the Civil Code of Puebla of 15 April 1985 and Article 52 of the Civil Code of Tlaxcala of 31 August 1976 as revised in Decree No. 23 of 2 April 1984. The Family Code of Hidalgo requires as a formality of marriage a certification that the spouses are aware of methods of controlling fertility, responsible parenthood, and family planning. In addition, Article 22 of the Civil Code of the Federal District provides that the legal capacity of natural persons is acquired at birth and lost at death; however, from the moment of conception the individual comes under the protection of the law, which is valid with respect to the

  12. Coding in pigeons: Multiple-coding versus single-code/default strategies.

    PubMed

    Pinto, Carlos; Machado, Armando

    2015-05-01

    To investigate the coding strategies that pigeons may use in temporal discrimination tasks, pigeons were trained on a matching-to-sample procedure with three sample durations (2s, 6s and 18s) and two comparisons (red and green hues). One comparison was correct following 2-s samples and the other was correct following both 6-s and 18-s samples. Tests were then run to contrast the predictions of two hypotheses concerning the pigeons' coding strategies, the multiple-coding and the single-code/default. According to the multiple-coding hypothesis, three response rules are acquired, one for each sample. According to the single-code/default hypothesis, only two response rules are acquired, one for the 2-s sample and a "default" rule for any other duration. In retention interval tests, pigeons preferred the "default" key, a result predicted by the single-code/default hypothesis. In no-sample tests, pigeons preferred the key associated with the 2-s sample, a result predicted by multiple-coding. Finally, in generalization tests, when the sample duration equaled 3.5s, the geometric mean of 2s and 6s, pigeons preferred the key associated with the 6-s and 18-s samples, a result predicted by the single-code/default hypothesis. The pattern of results suggests the need for models that take into account multiple sources of stimulus control.

  13. Cracking the bioelectric code

    PubMed Central

    Tseng, AiSun; Levin, Michael

    2013-01-01

    Patterns of resting potential in non-excitable cells of living tissue are now known to be instructive signals for pattern formation during embryogenesis, regeneration and cancer suppression. The development of molecular-level techniques for tracking ion flows and functionally manipulating the activity of ion channels and pumps has begun to reveal the mechanisms by which voltage gradients regulate cell behaviors and the assembly of complex large-scale structures. A recent paper demonstrated that a specific voltage range is necessary for demarcation of eye fields in the frog embryo. Remarkably, artificially setting other somatic cells to the eye-specific voltage range resulted in formation of eyes in aberrant locations, including tissues that are not in the normal anterior ectoderm lineage: eyes could be formed in the gut, on the tail, or in the lateral plate mesoderm. These data challenge the existing models of eye fate restriction and tissue competence maps, and suggest the presence of a bioelectric code—a mapping of physiological properties to anatomical outcomes. This Addendum summarizes the current state of knowledge in developmental bioelectricity, proposes three possible interpretations of the bioelectric code that functionally maps physiological states to anatomical outcomes, and highlights the biggest open questions in this field. We also suggest a speculative hypothesis at the intersection of cognitive science and developmental biology: that bioelectrical signaling among non-excitable cells coupled by gap junctions simulates neural network-like dynamics, and underlies the information processing functions required by complex pattern formation in vivo. Understanding and learning to control the information stored in physiological networks will have transformative implications for developmental biology, regenerative medicine and synthetic bioengineering. PMID:23802040

  14. Breaking the Code of Silence.

    ERIC Educational Resources Information Center

    Halbig, Wolfgang W.

    2000-01-01

    Schools and communities must break the adolescent code of silence concerning threats of violence. Schools need character education stressing courage, caring, and responsibility; regular discussions of the school discipline code; formal security discussions with parents; 24-hour hotlines; and protocols for handling reports of potential violence.…

  15. ACCELERATION PHYSICS CODE WEB REPOSITORY.

    SciTech Connect

    WEI, J.

    2006-06-26

    In the framework of the CARE HHH European Network, we have developed a web-based dynamic accelerator-physics code repository. We describe the design, structure and contents of this repository, illustrate its usage, and discuss our future plans, with emphasis on code benchmarking.

  16. Accelerator Physics Code Web Repository

    SciTech Connect

    Zimmermann, F.; Basset, R.; Bellodi, G.; Benedetto, E.; Dorda, U.; Giovannozzi, M.; Papaphilippou, Y.; Pieloni, T.; Ruggiero, F.; Rumolo, G.; Schmidt, F.; Todesco, E.; Zotter, B.W.; Payet, J.; Bartolini, R.; Farvacque, L.; Sen, T.; Chin, Y.H.; Ohmi, K.; Oide, K.; Furman, M.; /LBL, Berkeley /Oak Ridge /Pohang Accelerator Lab. /SLAC /TRIUMF /Tech-X, Boulder /UC, San Diego /Darmstadt, GSI /Rutherford /Brookhaven

    2006-10-24

    In the framework of the CARE HHH European Network, we have developed a web-based dynamic accelerator-physics code repository. We describe the design, structure and contents of this repository, illustrate its usage, and discuss our future plans, with emphasis on code benchmarking.

  17. LFSC - Linac Feedback Simulation Code

    SciTech Connect

    Ivanov, Valentin; /Fermilab

    2008-05-01

    The computer program LFSC is a numerical tool for simulating beam-based feedback in high-performance linacs. The code LFSC is based on the earlier version developed by a collective of authors at SLAC (L. Hendrickson, R. McEwen, T. Himel, H. Shoaee, S. Shah, P. Emma, P. Schultz) during 1990-2005. That code was successively used in simulations of the SLC, TESLA, CLIC and NLC projects. It can simulate both pulse-to-pulse feedback, on timescales corresponding to 5-100 Hz, and slower feedbacks operating in the 0.1-1 Hz range in the Main Linac and Beam Delivery System. The code LFSC runs under Matlab for the MS Windows operating system. It contains about 30,000 lines of source code in more than 260 subroutines. The code uses LIAR ('Linear Accelerator Research code') for particle tracking under ground motion and technical noise perturbations. It uses the Guinea Pig code to simulate the luminosity performance. A set of input files includes the lattice description (XSIF format) and plain-text files with numerical parameters, wake fields, ground motion data, etc. The Matlab environment provides a flexible system for graphical output.

  18. Indices for Testing Neural Codes

    PubMed Central

    Victor, Jonathan D.; Nirenberg, Sheila

    2009-01-01

    One of the most critical challenges in systems neuroscience is determining the neural code. A principled framework for addressing this can be found in information theory. With this approach, one can determine whether a proposed code can account for the stimulus-response relationship. Specifically, one can compare the transmitted information between the stimulus and the hypothesized neural code with the transmitted information between the stimulus and the behavioral response. If the former is smaller than the latter (i.e., if the code cannot account for the behavior), the code can be ruled out. The information-theoretic index most widely used in this context is Shannon’s mutual information. The Shannon test, however, is not ideal for this purpose: while the codes it will rule out are truly nonviable, there will be some nonviable codes that it will fail to rule out. Here we describe a wide range of alternative indices that can be used for ruling codes out. The range includes a continuum from Shannon information to measures of the performance of a Bayesian decoder. We analyze the relationship of these indices to each other and their complementary strengths and weaknesses for addressing this problem. PMID:18533812
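
    The Shannon test described above can be illustrated with plug-in mutual information estimates: if a hypothesized code transmits less stimulus information than the behavioral response does, the code is ruled out. A minimal sketch with toy observations (not data from the paper):

```python
import math
from collections import Counter

def mutual_information(pairs):
    """Plug-in estimate of Shannon mutual information (bits) from a list
    of (x, y) observations."""
    n = len(pairs)
    pxy = Counter(pairs)
    px = Counter(x for x, _ in pairs)
    py = Counter(y for _, y in pairs)
    return sum((c / n) * math.log2(c * n / (px[x] * py[y]))
               for (x, y), c in pxy.items())

# The response tracks a binary stimulus perfectly (1 bit), while the
# candidate code is constant and transmits nothing -- so it is ruled out.
stim_resp = [(0, 0), (1, 1)] * 50
stim_code = [(0, 0), (1, 0)] * 50
mi_resp = mutual_information(stim_resp)
mi_code = mutual_information(stim_code)
```

    As the abstract notes, passing this test is necessary but not sufficient: some nonviable codes still match the response information, which motivates the paper's wider family of indices.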

  19. Population coding in somatosensory cortex.

    PubMed

    Petersen, Rasmus S; Panzeri, Stefano; Diamond, Mathew E

    2002-08-01

    Computational analyses have begun to elucidate which components of somatosensory cortical population activity may encode basic stimulus features. Recent results from rat barrel cortex suggest that the essence of this code is not synergistic spike patterns, but rather the precise timing of single neurons' first post-stimulus spikes. This may form the basis for a fast, robust population code.

  20. QPhiX Code Generator

    SciTech Connect

    Joo, Balint

    2014-09-16

    A simple code generator that produces the low-level code kernels used by the QPhiX library for lattice QCD. It generates Wilson-Dslash and Wilson-Clover kernels, and can be reused to write other optimized kernels for Intel Xeon Phi(tm), Intel Xeon(tm) and potentially other architectures.

  1. Using NAEYC's Code of Ethics.

    ERIC Educational Resources Information Center

    Young Children, 1995

    1995-01-01

    Considers how to deal with an ethical dilemma concerning a caregiver's dislike for a child. Recognizes that no statement in NAEYC's Code of Ethical Conduct requires that a professional must like each child, and presents some ideals and principles from the code that may guide professionals through similar situations. (BAC)

  2. Video coding with dynamic background

    NASA Astrophysics Data System (ADS)

    Paul, Manoranjan; Lin, Weisi; Lau, Chiew Tong; Lee, Bu-Sung

    2013-12-01

    Motion estimation (ME) and motion compensation (MC) using variable block size, sub-pixel search, and multiple reference frames (MRFs) are the major reasons for the improved coding performance of the H.264 video coding standard over other contemporary coding standards. The concept of MRFs is suitable for repetitive motion, uncovered background, non-integer pixel displacement, lighting change, etc. The requirement of index codes for the reference frames, the computational time in ME & MC, and the memory buffer for coded frames limit the number of reference frames used in practical applications. In typical video sequences, the previous frame is used as a reference frame in 68-92% of cases. In this article, we propose a new video coding method using a reference frame [i.e., the most common frame in scene (McFIS)] generated by dynamic background modeling. McFIS is more effective in terms of rate-distortion and computational time performance compared to the MRFs techniques. It also has an inherent capability of scene change detection (SCD) for adaptive group-of-pictures (GOP) size determination. As a result, we integrate SCD (for GOP determination) with reference frame generation. The experimental results show that the proposed coding scheme outperforms H.264 video coding with five reference frames and the two relevant state-of-the-art algorithms by 0.5-2.0 dB with less computational time.
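
    The paper's dynamic background modeling is more sophisticated, but the core idea of accumulating a background reference frame can be sketched with a per-pixel exponential running average; frame sizes, values and the alpha parameter below are illustrative only:

```python
def update_background(bg, frame, alpha=0.05):
    """Per-pixel exponential running average: a crude stand-in for the
    dynamic background modeling that produces a McFIS-like reference frame."""
    return [[(1 - alpha) * b + alpha * f for b, f in zip(b_row, f_row)]
            for b_row, f_row in zip(bg, frame)]

bg = [[0.0, 0.0], [0.0, 0.0]]
for _ in range(50):                      # feed 50 identical 2x2 frames
    bg = update_background(bg, [[10, 10], [10, 10]])
```

    A small alpha makes the reference converge to the stable scene content while transient foreground objects average out, which is exactly what makes such a frame useful as a single long-term reference.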

  3. PARAVT: Parallel Voronoi tessellation code

    NASA Astrophysics Data System (ADS)

    González, R. E.

    2016-10-01

    In this study, we present a new open source code for massively parallel computation of Voronoi tessellations (VT hereafter) in large data sets. The code is aimed at astrophysical applications, where VT densities and neighbor lists are widely used. There are several serial Voronoi tessellation codes; however, no open source, parallel implementations are available to handle the large number of particles/galaxies in current N-body simulations and sky surveys. Parallelization is implemented under MPI, and the VT is computed using the Qhull library. Domain decomposition takes into account consistent boundary computation between tasks, and includes periodic conditions. In addition, the code computes the neighbors list, Voronoi density, Voronoi cell volume, and density gradient for each particle, as well as densities on a regular grid. Code implementation and a user guide are publicly available at https://github.com/regonzar/paravt.
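
    As a brute-force stand-in for the Voronoi computation (not PARAVT's Qhull-based method), one can assign each grid point to its nearest site and use the per-site count as a discrete proxy for cell volume; the function name and toy data are ours:

```python
def voronoi_counts(sites, grid):
    """Brute-force Voronoi assignment: each grid point joins its nearest
    site; the per-site count is a crude proxy for Voronoi cell volume."""
    counts = [0] * len(sites)
    for px, py in grid:
        d2 = [(px - sx) ** 2 + (py - sy) ** 2 for sx, sy in sites]
        counts[d2.index(min(d2))] += 1   # ties go to the first site
    return counts

sites = [(0.0, 0.0), (10.0, 0.0)]
grid = [(float(x), 0.0) for x in range(11)]
counts = voronoi_counts(sites, grid)
```

    The inverse of the cell volume then gives a local density estimate per site; the real code obtains exact cell geometry from the tessellation instead of counting grid points.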

  4. Medical coding in clinical trials.

    PubMed

    Babre, Deven

    2010-01-01

    Data generated in all clinical trial are recorded on the data collection instrument Case report Form / Electronic Case Report Form by investigators located at various sites in various countries. In multicentric clinical trials since different investigator or medically qualified experts are from different sites / centers recording the medical term(s) uniformly is a big challenge. Medical coders from clinical data management team process these terms and perform medical coding. Medical coding is performed to categorize the medical terms reported appropriately so that they can be analyzed/reviewed. This article describes process which is used for medical coding in clinical data management and two most commonly used medical dictionaries MedDRA and WHO-DDE in brief. It is expected to help medical coders to understand the process of medical coding in clinical data management. Few common issues which the medical coder faces while performing medical coding, are also highlighted.

  5. Detecting non-coding selective pressure in coding regions

    PubMed Central

    Chen, Hui; Blanchette, Mathieu

    2007-01-01

    Background Comparative genomics approaches, where orthologous DNA regions are compared and inter-species conserved regions are identified, have proven extremely powerful for identifying non-coding regulatory regions located in intergenic or intronic regions. However, non-coding functional elements can also be located within coding regions, as is common for exonic splicing enhancers, some transcription factor binding sites, and RNA secondary structure elements affecting mRNA stability, localization, or translation. Since these functional elements are located in regions that are themselves highly conserved because they are coding for a protein, they have generally escaped detection by comparative genomics approaches. Results We introduce a comparative genomics approach for detecting non-coding functional elements located within coding regions. Codon evolution is modeled as a mixture of codon substitution models, where each component of the mixture describes the evolution of codons under a specific type of coding selective pressure. We show how to compute the posterior distribution of the entropy and parsimony scores under this null model of codon evolution. The method is applied to a set of growth hormone 1 orthologous mRNA sequences and a known exonic splicing element is detected. The analysis of a set of CORTBP2 orthologous genes reveals a region of several hundred base pairs under strong non-coding selective pressure whose function remains unknown. Conclusion Non-coding functional elements, in particular those involved in post-transcriptional regulation, are likely to be much more prevalent than is currently known. With the numerous genome sequencing projects underway, comparative genomics approaches like that proposed here are likely to become increasingly powerful at detecting such elements. PMID:17288582
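
    A minimal illustration of the entropy statistic mentioned above: the Shannon entropy of one codon column of an alignment, where unusually low entropy among synonymous codons can hint at constraint beyond protein coding. This is a simplified stand-in for the paper's posterior-based scoring, with toy codons:

```python
import math
from collections import Counter

def column_entropy(codons):
    """Shannon entropy (bits) of one codon column of an alignment; low
    entropy among synonymous codons hints at nucleotide-level constraint
    that protein-level conservation alone cannot explain."""
    n = len(codons)
    counts = Counter(codons)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

h_conserved = column_entropy(["CTG"] * 8)               # fully conserved column
h_mixed = column_entropy(["CTG"] * 4 + ["CTA"] * 4)     # two synonymous leucine codons
```

    Both columns encode the same amino acid, so the difference between 0 and 1 bit here is invisible at the protein level; the paper's method asks whether such low nucleotide entropy is lower than its codon-substitution null model predicts.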

  6. BASS Code Development

    NASA Technical Reports Server (NTRS)

    Sawyer, Scott

    2004-01-01

    The BASS computational aeroacoustic code solves the fully nonlinear Euler equations in the time domain in two dimensions. The acoustic response of the stator is determined simultaneously for the first three harmonics of the convected vortical gust of the rotor. The spatial mode generation, propagation and decay characteristics are predicted by assuming the acoustic field away from the stator can be represented as a uniform flow with small harmonic perturbations superimposed. The computed field is then decomposed using a joint temporal-spatial transform to determine the wave amplitudes as a function of rotor harmonic and spatial mode order. This report details the following technical aspects of the computations and analysis: 1) the BASS computational technique; 2) the application of periodic time-shifted boundary conditions; 3) the linear theory aspects unique to rotor-stator interactions; and 4) the joint spatial-temporal transform. The computational results presented herein are twofold. In each case, the acoustic response of the stator is determined simultaneously for the first three harmonics of the convected vortical gust of the rotor. The fan under consideration here, like modern fans, is cut-off at BPF, and propagating acoustic waves are only expected at 2BPF and 3BPF. In the first case, the computations showed excellent agreement with linear theory predictions. The frequency and spatial mode order of the acoustic field were computed and found consistent with linear theory. Further, the propagation of the generated modes was also correctly predicted. The upstream-going waves propagated from the domain without reflection from the inflow boundary. However, reflections from the outflow boundary were noticed. The amplitude of the reflected wave was approximately 5% of the incident wave. The second set of computations was used to determine the influence of steady loading on the generated noise. Toward this end, the acoustic response was determined with three steady loading

  7. Coding for effective denial management.

    PubMed

    Miller, Jackie; Lineberry, Joe

    2004-01-01

    Nearly everyone will agree that accurate and consistent coding of diagnoses and procedures is the cornerstone for operating a compliant practice. The CPT or HCPCS procedure code tells the payor what service was performed and also (in most cases) determines the amount of payment. The ICD-9-CM diagnosis code, on the other hand, tells the payor why the service was performed. If the diagnosis code does not meet the payor's criteria for medical necessity, all payment for the service will be denied. Implementation of an effective denial management program can help "stop the bleeding." Denial management is a comprehensive process that works in two ways. First, it evaluates the cause of denials and takes steps to prevent them. Second, denial management creates specific procedures for refiling or appealing claims that are initially denied. Accurate, consistent and compliant coding is key to both of these functions. The process of proactively managing claim denials also reveals a practice's administrative strengths and weaknesses, enabling radiology business managers to streamline processes, eliminate duplicated efforts and shift a larger proportion of the staff's focus from paperwork to servicing patients--all of which are sure to enhance operations and improve practice management and office morale. Accurate coding requires a program of ongoing training and education in both CPT and ICD-9-CM coding. Radiology business managers must make education a top priority for their coding staff. Front office staff, technologists and radiologists should also be familiar with the types of information needed for accurate coding. A good staff training program will also cover the proper use of Advance Beneficiary Notices (ABNs). Registration and coding staff should understand how to determine whether the patient's clinical history meets criteria for Medicare coverage, and how to administer an ABN if the exam is likely to be denied. Staff should also understand the restrictions on use of

  8. Non-Pauli observables for CWS codes

    NASA Astrophysics Data System (ADS)

    Santiago, Douglas F. G.; Portugal, Renato; Melo, Nolmar

    2013-05-01

    It is known that nonadditive quantum codes can have higher code dimensions than stabilizer codes for the same length and minimum distance. The class of codeword stabilized codes (CWS) provides tools to obtain new nonadditive quantum codes by reducing the problem to finding nonlinear classical codes. In this work, we establish some results on the kind of non-Pauli operators that can be used as observables in the decoding scheme of CWS codes and propose a procedure to obtain those observables.

  9. ETR/ITER systems code

    SciTech Connect

    Barr, W.L.; Bathke, C.G.; Brooks, J.N.; Bulmer, R.H.; Busigin, A.; DuBois, P.F.; Fenstermacher, M.E.; Fink, J.; Finn, P.A.; Galambos, J.D.; Gohar, Y.; Gorker, G.E.; Haines, J.R.; Hassanein, A.M.; Hicks, D.R.; Ho, S.K.; Kalsi, S.S.; Kalyanam, K.M.; Kerns, J.A.; Lee, J.D.; Miller, J.R.; Miller, R.L.; Myall, J.O.; Peng, Y-K.M.; Perkins, L.J.; Spampinato, P.T.; Strickler, D.J.; Thomson, S.L.; Wagner, C.E.; Willms, R.S.; Reid, R.L.

    1988-04-01

    A tokamak systems code capable of modeling experimental test reactors has been developed and is described in this document. The code, named TETRA (for Tokamak Engineering Test Reactor Analysis), consists of a series of modules, each describing a tokamak system or component, controlled by an optimizer/driver. This code development was a national effort in that the modules were contributed by members of the fusion community and integrated into a code by the Fusion Engineering Design Center. The code has been checked out on the Cray computers at the National Magnetic Fusion Energy Computing Center and has satisfactorily simulated the Tokamak Ignition/Burn Experimental Reactor II (TIBER) design. A feature of this code is the ability to perform optimization studies through the use of a numerical software package, which iterates prescribed variables to satisfy a set of prescribed equations or constraints. This code will be used to perform sensitivity studies for the proposed International Thermonuclear Experimental Reactor (ITER). 22 figs., 29 tabs.
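
    The optimizer/driver pattern described above, where subsystem modules compute quantities from a trial design and the driver iterates prescribed variables until prescribed constraints are satisfied, can be sketched with a deliberately simple stand-in. The "module" and constraint below are illustrative placeholders, not TETRA's physics.

```python
# Sketch of a systems-code driver: a subsystem module maps a design variable
# to a performance quantity, and the driver iterates that variable until a
# constraint equation is satisfied.
def module_fusion_power(minor_radius):
    """Hypothetical module: power rises steeply with machine size."""
    return 50.0 * minor_radius ** 3

def drive(target_power, lo=0.1, hi=5.0, tol=1e-6):
    """Bisection on the design variable until the power constraint holds.
    (Production codes use multi-variable numerical optimization packages.)"""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if module_fusion_power(mid) < target_power:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

r = drive(400.0)    # find the radius satisfying 50 r^3 = 400
print(round(r, 3))  # -> 2.0
```

    TETRA generalizes this idea: many coupled modules, many iterated variables, and a numerical optimization package in place of the one-dimensional bisection shown here.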

  10. A secure and efficient entropy coding based on arithmetic coding

    NASA Astrophysics Data System (ADS)

    Li, Hengjian; Zhang, Jiashu

    2009-12-01

    A novel secure arithmetic coding scheme based on a nonlinear dynamic filter (NDF) with changeable coefficients is proposed in this paper. The NDF is employed to build a pseudorandom number generator (NDF-PRNG), and its coefficients are derived from the plaintext for higher security. During the encryption process, the mapping interval in each iteration of arithmetic coding (AC) is decided by both the plaintext and the initial values of the NDF, and data compression is achieved with entropy optimality simultaneously. This modification of the arithmetic coding methodology, which also provides security, can easily be incorporated into most international image and video standards as the final entropy coding stage without changing the existing framework. Theoretical analysis and numerical simulations on both static and adaptive models show that the proposed encryption algorithm achieves high security without loss of compression efficiency or added computational burden with respect to a standard AC.

  11. CodedStream: live media streaming with overlay coded multicast

    NASA Astrophysics Data System (ADS)

    Guo, Jiang; Zhu, Ying; Li, Baochun

    2003-12-01

    Multicasting is a natural paradigm for streaming live multimedia to multiple end receivers. Since IP multicast is not widely deployed, many application-layer multicast protocols have been proposed. However, all of these schemes focus on the construction of multicast trees, where a relatively small number of links carry the multicast streaming load, while the capacity of most of the other links in the overlay network remain unused. In this paper, we propose CodedStream, a high-bandwidth live media distribution system based on end-system overlay multicast. In CodedStream, we construct a k-redundant multicast graph (a directed acyclic graph) as the multicast topology, on which network coding is applied to work around bottlenecks. Simulation results have shown that the combination of k-redundant multicast graph and network coding may indeed bring significant benefits with respect to improving the quality of live media at the end receivers.
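
    The network-coding step CodedStream applies on its redundant multicast graph can be sketched over GF(2): nodes forward random linear combinations (XORs) of packets, and a receiver decodes by Gaussian elimination once it holds enough independent combinations. The segment contents and coefficient scheme below are illustrative; practical systems typically work over larger fields such as GF(2^8).

```python
# Random linear network coding over GF(2): combine, collect, solve.
import random

def xor_bytes(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def gf2_solve(rows, k):
    """Gaussian elimination over GF(2). Each row is (coefficients, payload);
    returns the k original packets, or None if the rows are rank-deficient."""
    rows = [(list(c), bytes(p)) for c, p in rows]
    for col in range(k):
        pivot = next((i for i in range(col, len(rows)) if rows[i][0][col]), None)
        if pivot is None:
            return None
        rows[col], rows[pivot] = rows[pivot], rows[col]
        for i in range(len(rows)):
            if i != col and rows[i][0][col]:
                rows[i] = ([a ^ b for a, b in zip(rows[i][0], rows[col][0])],
                           xor_bytes(rows[i][1], rows[col][1]))
    return [rows[i][1] for i in range(k)]

src = [b"seg1", b"seg2", b"seg3"]   # original media segments
rng = random.Random(2003)
received, decoded = [], None
while decoded is None:              # collect combinations until decodable
    coeffs = [rng.randint(0, 1) for _ in src]
    payload = b"\x00" * len(src[0])
    for c, p in zip(coeffs, src):
        if c:
            payload = xor_bytes(payload, p)
    received.append((coeffs, payload))
    if len(received) >= len(src):
        decoded = gf2_solve(received, len(src))
print(decoded)
```

    Because any sufficiently large set of independent combinations suffices, a receiver can draw coded packets from several upstream links at once, which is how coding works around individual bottleneck links.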

  12. New optimal asymmetric quantum codes constructed from constacyclic codes

    NASA Astrophysics Data System (ADS)

    Xu, Gen; Li, Ruihu; Guo, Luobin; Lü, Liangdong

    2017-02-01

    In this paper, we propose the construction of asymmetric quantum codes from two families of constacyclic codes over the finite field 𝔽q² of code length n, where for the first family, q is an odd prime power of the form 4t + 1 (t ≥ 1 an integer) or 4t − 1 (t ≥ 2 an integer) and n1 = (q² + 1)/2; for the second family, q is an odd prime power of the form 10t + 3 or 10t + 7 (t ≥ 0 an integer) and n2 = (q² + 1)/5. As a result, families of new asymmetric quantum codes [[n,k,dz/dx
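
    A quick sanity check on the code lengths in the abstract: for any odd q, q² + 1 is even, so n1 = (q² + 1)/2 is an integer; and for q ≡ 3 or 7 (mod 10), q² ≡ 9 (mod 10), so 5 divides q² + 1 and n2 = (q² + 1)/5 is an integer. The sample values of q below are just illustrations.

```python
# Verify the divisibility conditions behind the two code-length formulas.
for q in (5, 7, 13):                # q of the form 4t+1 or 4t-1
    assert (q * q + 1) % 2 == 0
    print(f"q={q}: n1 = {(q * q + 1) // 2}")
for q in (3, 7, 13, 17):            # q of the form 10t+3 or 10t+7
    assert (q * q + 1) % 5 == 0
    print(f"q={q}: n2 = {(q * q + 1) // 5}")
```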

  13. State building energy codes status

    SciTech Connect

    1996-09-01

    This document contains the State Building Energy Codes Status prepared by Pacific Northwest National Laboratory for the U.S. Department of Energy under Contract DE-AC06-76RL01830 and dated September 1996. The U.S. Department of Energy's Office of Codes and Standards has developed this document to provide an information resource for individuals interested in energy efficiency of buildings and the relevant building energy codes in each state and U.S. territory. This is considered to be an evolving document and will be updated twice a year. In addition, special state updates will be issued as warranted.

  14. Facilitating Internet-Scale Code Retrieval

    ERIC Educational Resources Information Center

    Bajracharya, Sushil Krishna

    2010-01-01

    Internet-Scale code retrieval deals with the representation, storage, and access of relevant source code from a large amount of source code available on the Internet. Internet-Scale code retrieval systems support common emerging practices among software developers related to finding and reusing source code. In this dissertation we focus on some…

  15. Bandwidth efficient coding for satellite communications

    NASA Technical Reports Server (NTRS)

    Lin, Shu; Costello, Daniel J., Jr.; Miller, Warner H.; Morakis, James C.; Poland, William B., Jr.

    1992-01-01

    An error control coding scheme was devised to achieve large coding gain and high reliability by using coded modulation with reduced decoding complexity. To achieve a 3 to 5 dB coding gain and moderate reliability, the decoding complexity is quite modest. In fact, to achieve a 3 dB coding gain, the decoding complexity is quite simple, no matter whether trellis coded modulation or block coded modulation is used. However, to achieve coding gains exceeding 5 dB, the decoding complexity increases drastically, and the implementation of the decoder becomes very expensive and impractical. The use of coded modulation in conjunction with concatenated (or cascaded) coding is therefore proposed. A good short bandwidth-efficient modulation code is used as the inner code and a relatively powerful Reed-Solomon code is used as the outer code. With properly chosen inner and outer codes, a concatenated coded modulation scheme not only can achieve large coding gains and high reliability with good bandwidth efficiency but also can be practically implemented. This combination of coded modulation and concatenated coding offers a way of achieving the best of three worlds: reliability and coding gain, bandwidth efficiency, and decoding complexity.
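
    The concatenation idea can be sketched with deliberately simple stand-ins: a (3,1) repetition code plays the inner modulation code and a (7,4) Hamming code plays the outer Reed-Solomon code. The inner decoder (majority vote) removes most channel errors; the outer decoder corrects a residual one. These toy codes are assumptions for illustration, not the codes proposed in the report.

```python
# Concatenated coding sketch: outer Hamming(7,4) around an inner (3,1)
# repetition code.
def hamming_encode(d):
    """Outer (7,4) Hamming code (stand-in for the Reed-Solomon outer code)."""
    d1, d2, d3, d4 = d
    p1, p2, p3 = d1 ^ d2 ^ d4, d1 ^ d3 ^ d4, d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming_decode(c):
    """Syndrome decoding: the syndrome gives the (1-based) error position."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    pos = s1 + 2 * s2 + 4 * s3
    if pos:
        c[pos - 1] ^= 1
    return [c[2], c[4], c[5], c[6]]

def inner_encode(bits):   # (3,1) repetition code as the inner code
    return [b for b in bits for _ in range(3)]

def inner_decode(bits):   # majority vote per triple
    return [int(sum(bits[i:i + 3]) >= 2) for i in range(0, len(bits), 3)]

msg = [1, 0, 1, 1]
tx = inner_encode(hamming_encode(msg))
# Channel: one triple hit twice (the inner decoder fails there) plus one
# isolated flip (the inner decoder corrects it).
rx = list(tx)
for i in (0, 1, 20):
    rx[i] ^= 1
decoded = hamming_decode(inner_decode(rx))
print(decoded)  # -> [1, 0, 1, 1]: the outer code mops up the residual error
```

    The division of labor is the point: a cheap inner decoder handles the bulk of the errors, and a stronger outer code cleans up the rare inner-decoder failures, which is how the scheme keeps overall decoding complexity modest.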

  16. Internationalizing professional codes in engineering.

    PubMed

    Harris, Charles E

    2004-07-01

    Professional engineering societies which are based in the United States, such as the American Society of Mechanical Engineers (ASME, now ASME International), are recognizing that their codes of ethics must apply to engineers working throughout the world. An examination of the ethical code of ASME International shows that its provisions pose many problems of application, especially in societies outside the United States. In applying the codes effectively in the international environment, two principal issues must be addressed. First, some Culture Transcending Guidelines must be identified and justified; nine such guidelines are identified. Second, some methods for applying the codes to particular situations must be identified; three such methods are specification, balancing, and finding a creative middle way.

  17. FLYCHK Collisional-Radiative Code

    National Institute of Standards and Technology Data Gateway

    SRD 160 FLYCHK Collisional-Radiative Code (Web, free access)   FLYCHK provides a capability to generate atomic level populations and charge state distributions for low-Z to mid-Z elements under NLTE conditions.

  18. Seals Flow Code Development 1993

    NASA Technical Reports Server (NTRS)

    Liang, Anita D. (Compiler); Hendricks, Robert C. (Compiler)

    1994-01-01

    Seals Workshop of 1993 code releases include SPIRALI for spiral grooved cylindrical and face seal configurations; IFACE for face seals with pockets, steps, tapers, turbulence, and cavitation; GFACE for gas face seals with 'lift pad' configurations; and SCISEAL, a CFD code for research and design of seals of cylindrical configuration. GUI (graphical user interface) and code usage was discussed with hands on usage of the codes, discussions, comparisons, and industry feedback. Other highlights for the Seals Workshop-93 include environmental and customer driven seal requirements; 'what's coming'; and brush seal developments including flow visualization, numerical analysis, bench testing, T-700 engine testing, tribological pairing and ceramic configurations, and cryogenic and hot gas facility brush seal results. Also discussed are seals for hypersonic engines and dynamic results for spiral groove and smooth annular seals.

  19. NFPA's Hydrogen Technologies Code Project

    SciTech Connect

    Rivkin, C. H.

    2008-12-01

    This article discusses the development of NFPA 2, a comprehensive hydrogen safety code promulgated by the National Fire Protection Association (NFPA). It analyses the contents of this document, with particular attention focused on new requirements for siting hydrogen storage systems. These new requirements use computational fluid dynamics modeling and risk assessment procedures to develop provisions that are based on both technical analyses and defined risk criteria. The intent is to develop requirements based on procedures that can be replicated from the information provided in the code document. This code will require documentation of the modeling inputs, risk criteria, and analyses in the supporting information. The article also includes a description of the codes and standards that address hydrogen technologies in general.

  20. Property Control through Bar Coding.

    ERIC Educational Resources Information Center

    Kingma, Gerben J.

    1984-01-01

    A public utility company uses laser wands to read bar-coded labels on furniture and equipment. The system allows an 80 percent savings of the time required to create reports for inventory control. (MLF)

  1. The moving mesh code SHADOWFAX

    NASA Astrophysics Data System (ADS)

    Vandenbroucke, B.; De Rijcke, S.

    2016-07-01

    We introduce the moving mesh code SHADOWFAX, which can be used to evolve a mixture of gas, subject to the laws of hydrodynamics and gravity, and any collisionless fluid only subject to gravity, such as cold dark matter or stars. The code is written in C++ and its source code is made available to the scientific community under the GNU Affero General Public Licence. We outline the algorithm and the design of our implementation, and demonstrate its validity through the results of a set of basic test problems, which are also part of the public version. We also compare SHADOWFAX with a number of other publicly available codes using different hydrodynamical integration schemes, illustrating the advantages and disadvantages of the moving mesh technique.

  2. Edge equilibrium code for tokamaks

    SciTech Connect

    Li, Xujing; Drozdov, Vladimir V.

    2014-01-15

    The edge equilibrium code (EEC) described in this paper is developed for simulations of the near-edge plasma using the finite element method. It solves the Grad-Shafranov equation in toroidal coordinates and uses adaptive grids aligned with magnetic field lines. Hermite finite elements are chosen for the numerical scheme. A fast Newton scheme, the same as that implemented in the equilibrium and stability code (ESC), is applied here to adjust the grids.

  3. Computer-Access-Code Matrices

    NASA Technical Reports Server (NTRS)

    Collins, Earl R., Jr.

    1990-01-01

    Authorized users respond to changing challenges with changing passwords. Scheme for controlling access to computers defeats eavesdroppers and "hackers". Based on password system of challenge and password or sign, challenge, and countersign correlated with random alphanumeric codes in matrices of two or more dimensions. Codes stored on floppy disk or plug-in card and changed frequently. For even higher security, matrices of four or more dimensions used, just as cubes compounded into hypercubes in concurrent processing.
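
    The challenge/countersign scheme above can be sketched directly: both sides hold an identical secret matrix of random alphanumeric codes, the host challenges with a random coordinate, and the user answers with that cell's contents. The matrix dimensions, code length, and seeded generator below are illustrative assumptions (a real system would use secure generation and frequent regeneration).

```python
# Challenge/countersign against a shared secret code matrix.
import random
import string

def make_matrix(dim, size, seed):
    """dim-dimensional matrix of random 4-character codes (the shared secret)."""
    rng = random.Random(seed)  # stand-in for secure random generation
    codes = {}
    def fill(prefix):
        if len(prefix) == dim:
            codes[prefix] = "".join(
                rng.choices(string.ascii_uppercase + string.digits, k=4))
        else:
            for i in range(size):
                fill(prefix + (i,))
    fill(())
    return codes

host = make_matrix(dim=2, size=4, seed=99)  # host's copy
user = make_matrix(dim=2, size=4, seed=99)  # user's identical copy (e.g. on a card)

challenge = (1, 3)                          # host picks a random coordinate
assert user[challenge] == host[challenge]   # correct countersign grants access
```

    Raising `dim` from 2 to 4 or more enlarges the coordinate space the way the abstract describes, just as cubes compound into hypercubes.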

  4. Product Work Classification and Coding

    DTIC Science & Technology

    1986-06-01

    detail is much more useful in planning steel welding processes. In this regard remember that mild steel, HSLA steel, and high-yield steel (e.g. HY80 ... manufacturing facility. In Figure 2.3-2, a classification and coding system for steel parts is shown. This classification and coding system sorts steel parts ... system would provide a shop which produced steel parts with a means of organizing parts. Rather than attempting to manage all of its parts as a single

  5. The Integrated TIGER Series Codes

    SciTech Connect

    Kensek, Ronald P.; Franke, Brian C.; Laub, Thomas W.

    2006-01-15

    ITS is a powerful and user-friendly software package permitting state-of-the-art Monte Carlo solution of linear time-independent coupled electron/photon radiation transport problems, with or without the presence of macroscopic electric and magnetic fields of arbitrary spatial dependence. Our goal has been to simultaneously maximize operational simplicity and physical accuracy. Through a set of preprocessor directives, the user selects one of the many ITS codes. The ease with which the makefile system is applied combines with an input scheme based on order-independent descriptive keywords that makes maximum use of defaults and internal error checking to provide experimentalists and theorists alike with a method for the routine but rigorous solution of sophisticated radiation transport problems. Physical rigor is provided by employing accurate cross sections, sampling distributions, and physical models for describing the production and transport of the electron/photon cascade from 1.0 GeV down to 1.0 keV. The availability of source code permits the more sophisticated user to tailor the codes to specific applications and to extend the capabilities of the codes to more complex applications. Version 5.0, the latest version of ITS, contains (1) improvements to the ITS 3.0 continuous-energy codes, (2) multigroup codes with adjoint transport capabilities, (3) parallel implementations of all ITS codes, (4) a general purpose geometry engine for linking with CAD or other geometry formats, and (5) the Cholla facet geometry library. Moreover, the general user friendliness of the software has been enhanced through increased internal error checking and improved code portability.

  6. National Combustion Code: Parallel Performance

    NASA Technical Reports Server (NTRS)

    Babrauckas, Theresa

    2001-01-01

    This report discusses the National Combustion Code (NCC). The NCC is an integrated system of codes for the design and analysis of combustion systems. The advanced features of the NCC meet designers' requirements for model accuracy and turn-around time. The fundamental features at the inception of the NCC were parallel processing and unstructured mesh. The design and performance of the NCC are discussed.

  7. UNIX code management and distribution

    SciTech Connect

    Hung, T.; Kunz, P.F.

    1992-09-01

    We describe a code management and distribution system based on tools freely available for the UNIX systems. At the master site, version control is managed with CVS, which is a layer on top of RCS, and distribution is done via NFS mounted file systems. At remote sites, small modifications to CVS provide for interactive transactions with the CVS system at the master site such that remote developers are true peers in the code development process.

  8. Spaceflight Validation of Hzetrn Code

    NASA Technical Reports Server (NTRS)

    Wilson, J. W.; Shinn, J. L.; Singleterry, R. C.; Badavi, F. F.; Badhwar, G. D.; Reitz, G.; Beaujean, R.; Cucinotta, F. A.

    1999-01-01

    HZETRN is being developed as a fast deterministic radiation transport code applicable to neutrons, protons, and multiply charged ions in the space environment. It was recently applied to 50 hours of IMP8 data measured during the August 4, 1972 solar event to map the hourly exposures within the human body under several shield configurations. This calculation required only 18 hours on a VAX 4000 machine. A similar calculation using the Monte Carlo method would have required two years of dedicated computer time. The code has been benchmarked against well documented and tested Monte Carlo proton transport codes with good success. The code will allow important trade studies to be made with relative ease due to the computational speed and will be useful in assessing design alternatives in an integrated system software environment. Since there are no well tested Monte Carlo codes for HZE particles, we have been engaged in flight validation of the HZETRN results. To date we have made comparisons with TEPC, CR-39, charged particle telescopes, and Bonner spheres. This broad range of detectors allows us to test a number of functions related to differing physical processes which add to the complicated radiation fields within a spacecraft or the human body; these functions can be calculated by the HZETRN code system. In the present report we review these results.

  9. Rotating-Pump Design Code

    NASA Technical Reports Server (NTRS)

    Walker, James F.; Chen, Shu-Cheng; Scheer, Dean D.

    2006-01-01

    Pump Design (PUMPDES) is a computer program for designing a rotating pump for liquid hydrogen, liquid oxygen, liquid nitrogen, water, methane, or ethane. Using realistic properties of these fluids provided by another program called GASPAK, this code performs a station-by-station, mean-line analysis along the pump flow path, obtaining thermodynamic properties of the pumped fluid at each station and evaluating hydraulic losses along the flow path. The variables at each station are obtained under constraints that are consistent with the underlying physical principles. The code evaluates the performance of each stage and the overall pump. In addition, by judiciously choosing the givens and the unknowns, the code can perform a geometric inverse design function: that is, it can compute a pump geometry that yields a closest approximation of a given design point. The code contains two major parts: one for an axial-rotor/inducer and one for a multistage centrifugal pump. The inducer and the centrifugal pump are functionally integrated. The code can be used in designing and/or evaluating the inducer/centrifugal-pump combination or the centrifugal pump alone. The code is written in standard Fortran 77.

  10. Reflections on Post-16 Strategies in European Countries. Interim Report of the Leonardo da Vinci/Multiplier Effect Project III.3.a. Priority 2: Forging Links between Educational Establishments and Enterprises (1997-2000) ID 27009. Working Papers, No. 9.

    ERIC Educational Resources Information Center

    Stenstrom, Marja-Leena, Ed.

    This four-part publication contains 19 papers on educational practices and promises for post-16 education in European countries. Part I, the introduction, contains these three papers: "Sharpening Post-16 Education Strategies: Building on the Results of the Previous Projects" (Johanna Lasonen); "'Parity of Esteem' and 'Integrated…

  11. The Proteomic Code: a molecular recognition code for proteins

    PubMed Central

    Biro, Jan C

    2007-01-01

    Background The Proteomic Code is a set of rules by which information in genetic material is transferred into the physico-chemical properties of amino acids. It determines how individual amino acids interact with each other during folding and in specific protein-protein interactions. The Proteomic Code is part of the redundant Genetic Code. Review The 25-year-old history of this concept is reviewed from the first independent suggestions by Biro and Mekler, through the works of Blalock, Root-Bernstein, Siemion, Miller and others, followed by the discovery of a Common Periodic Table of Codons and Nucleic Acids in 2003 and culminating in the recent conceptualization of partial complementary coding of interacting amino acids as well as the theory of the nucleic acid-assisted protein folding. Methods and conclusions A novel cloning method for the design and production of specific, high-affinity-reacting proteins (SHARP) is presented. This method is based on the concept of proteomic codes and is suitable for large-scale, industrial production of specifically interacting peptides. PMID:17999762

  12. Cleanup MAC and MBA code ATP

    SciTech Connect

    Russell, V.K.

    1994-10-17

    The K Basins Materials Accounting (MAC) and Material Balance (MBA) database system had some minor cleanup performed on its code. This ATP describes how the code was to be tested to verify its correctness.

  13. Codes That Support Smart Growth Development

    EPA Pesticide Factsheets

    Provides examples of local zoning codes that support smart growth development, categorized by: unified development code, form-based code, transit-oriented development, design guidelines, street design standards, and zoning overlay.

  14. Tribal Green Building Administrative Code Example

    EPA Pesticide Factsheets

    This Tribal Green Building Administrative Code Example can be used as a template for technical code selection (i.e., building, electrical, plumbing, etc.) to be adopted as a comprehensive building code.

  15. 75 FR 19944 - International Code Council: The Update Process for the International Codes and Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-16

    ... National Institute of Standards and Technology International Code Council: The Update Process for the International Codes and Standards AGENCY: National Institute of Standards and Technology, Commerce. ACTION: Notice. SUMMARY: The International Code Council (ICC), promulgator of the International Codes...

  16. 78 FR 18321 - International Code Council: The Update Process for the International Codes and Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-26

    ... National Institute of Standards and Technology International Code Council: The Update Process for the International Codes and Standards AGENCY: National Institute of Standards and Technology, Commerce. ACTION: Notice. SUMMARY: The International Code Council (ICC), promulgator of the International Codes...

  17. Evolution of the genetic code.

    PubMed

    Davis, B K

    1999-01-01

    Comparative path lengths in amino acid biosynthesis and other molecular indicators of the timing of codon assignment were examined to reconstruct the main stages of code evolution. The codon tree obtained was rooted in the 4 N-fixing amino acids (Asp, Glu, Asn, Gln) and 16 triplets of the NAN set. This small, locally phased (commaless) code evidently arose from ambiguous translation on a poly(A) collector strand, in a surface reaction network. Copolymerisation of these amino acids yields polyanionic peptide chains, which could anchor uncharged amide residues to a positively charged mineral surface. From RNA virus structure and replication in vitro, the first genes seemed to be RNA segments spliced into tRNA. Expansion of the code reduced the risk of mutation to an unreadable codon. This step was conditional on initiation at the 5'-codon of a translated sequence. Incorporation of increasingly hydrophobic amino acids accompanied expansion. As codons of the NUN set were assigned most slowly, they received the most nonpolar amino acids. The origin of ferredoxin and Gln synthetase was traced to mid-expansion phase. Surface metabolism ceased by the end of code expansion, as cells bounded by a proteo-phospholipid membrane, with a protoATPase, had emerged. Incorporation of positively charged and aromatic amino acids followed. They entered the post-expansion code by codon capture. Synthesis of efficient enzymes with acid-base catalysis was then possible. Both types of aminoacyl-tRNA synthetases were attributed to this stage. tRNA sequence diversity and error rates in RNA replication indicate the code evolved within 20 million yr in the pre-Isuan era. These findings on the genetic code provide empirical evidence, from a contemporaneous source, that a surface reaction network, centred on C-fixing autocatalytic cycles, rapidly led to cellular life on Earth.

  18. NASA Rotor 37 CFD Code Validation: Glenn-HT Code

    NASA Technical Reports Server (NTRS)

    Ameri, Ali A.

    2010-01-01

    In order to advance the goals of NASA aeronautics programs, it is necessary to continuously evaluate and improve the computational tools used for research and design at NASA. One such code is the Glenn-HT code which is used at NASA Glenn Research Center (GRC) for turbomachinery computations. Although the code has been thoroughly validated for turbine heat transfer computations, it has not been utilized for compressors. In this work, Glenn-HT was used to compute the flow in a transonic compressor and comparisons were made to experimental data. The results presented here are in good agreement with this data. Most of the measures of performance are well within the measurement uncertainties and the exit profiles of interest agree with the experimental measurements.

  19. Coordinated design of coding and modulation systems

    NASA Technical Reports Server (NTRS)

    Massey, J. L.; Ancheta, T.; Johannesson, R.; Lauer, G.; Lee, L.

    1976-01-01

    The joint optimization of the coding and modulation systems employed in telemetry systems was investigated. Emphasis was placed on formulating inner and outer coding standards used by the Goddard Spaceflight Center. Convolutional codes were found that are nearly optimum for use with Viterbi decoding in the inner coding of concatenated coding systems. A convolutional code, the unit-memory code, was discovered and is ideal for inner system usage because of its byte-oriented structure. Simulations of sequential decoding on the deep-space channel were carried out to compare directly various convolutional codes that are proposed for use in deep-space systems.
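
    The inner convolutional codes discussed above are simple to encode even when decoding (Viterbi or sequential) carries the complexity. As an illustration, here is a minimal rate-1/2 convolutional encoder with constraint length 3 and the classic textbook generators 7 and 5 (octal); these particular generators are an assumption for the sketch, not necessarily the codes evaluated in the study.

```python
# Rate-1/2 convolutional encoder, constraint length 3, generators (7, 5) octal.
def conv_encode(bits, g1=0b111, g2=0b101):
    state = 0  # two-bit shift register holding the previous inputs
    out = []
    for b in bits:
        reg = (b << 2) | state                      # newest bit in the MSB
        out += [bin(reg & g1).count("1") % 2,       # parity against generator 1
                bin(reg & g2).count("1") % 2]       # parity against generator 2
        state = reg >> 1
    return out

print(conv_encode([1, 0, 1, 1]))  # -> [1, 1, 1, 0, 0, 0, 0, 1]
```

    Each input bit yields two output bits computed as parities over the sliding register, which is why the decoder can be organized as a search over a small state trellis.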

  20. 76 FR 77549 - Lummi Nation-Title 20-Code of Laws-Liquor Code

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-13

    ... Bureau of Indian Affairs Lummi Nation--Title 20--Code of Laws--Liquor Code AGENCY: Bureau of Indian...--Code of Laws--Liquor Code. The Code regulates and controls the possession, sale and consumption of... this Code allows for the possession and sale of alcoholic beverages within the Lummi...

  1. International assessment of PCA codes

    SciTech Connect

    Neymotin, L.; Lui, C.; Glynn, J.; Archarya, S.

    1993-11-01

    Over the past three years (1991-1993), an extensive international exercise for intercomparison of a group of six Probabilistic Consequence Assessment (PCA) codes was undertaken. The exercise was jointly sponsored by the Commission of European Communities (CEC) and OECD Nuclear Energy Agency. This exercise was a logical continuation of a similar effort undertaken by OECD/NEA/CSNI in 1979-1981. The PCA codes are currently used by different countries for predicting radiological health and economic consequences of severe accidents at nuclear power plants (and certain types of non-reactor nuclear facilities) resulting in releases of radioactive materials into the atmosphere. The codes participating in the exercise were: ARANO (Finland), CONDOR (UK), COSYMA (CEC), LENA (Sweden), MACCS (USA), and OSCAAR (Japan). In parallel with this inter-code comparison effort, two separate groups performed a similar set of calculations using two of the participating codes, MACCS and COSYMA. Results of the intercode and inter-MACCS comparisons are presented in this paper. The MACCS group included four participants: GREECE: Institute of Nuclear Technology and Radiation Protection, NCSR Demokritos; ITALY: ENEL, ENEA/DISP, and ENEA/NUC-RIN; SPAIN: Universidad Politecnica de Madrid (UPM) and Consejo de Seguridad Nuclear; USA: Brookhaven National Laboratory, US NRC and DOE.

  2. Driver Code for Adaptive Optics

    NASA Technical Reports Server (NTRS)

    Rao, Shanti

    2007-01-01

A special-purpose computer code for a deformable-mirror adaptive-optics control system transmits pixel-registered control from (1) a personal computer running software that generates the control data to (2) a circuit board with 128 digital-to-analog converters (DACs) that generate voltages to drive the deformable-mirror actuators. This program reads control-voltage codes from a text file, then sends them, via the computer's parallel port, to a circuit board with four AD5535 (or equivalent) chips. Whereas a similar prior computer program was capable of transmitting data to only one chip at a time, this program can send data to four chips simultaneously. This program is in the form of C-language code that can be compiled and linked into an adaptive-optics software system. The program as supplied includes source code for integration into the adaptive-optics software, documentation, and a component that provides a demonstration of loading DAC codes from a text file. On a standard Windows desktop computer, the software can update 128 channels in 10 ms. On Real-Time Linux with a digital I/O card, the software can update 1024 channels (8 boards in parallel) every 8 ms.
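The data flow described above can be made concrete with a hedged sketch, not the actual NASA C code: read 128 DAC channel codes from a text file and split them into per-chip banks so four AD5535-style chips can be updated together. The file format and the 32-channels-per-chip split are assumptions for illustration.

```python
# Hypothetical helper illustrating the abstract's data flow: 128 channel
# codes from a text file, grouped into one bank per chip. Names and the
# 32-channels-per-chip split are assumptions, not from the flight code.

def load_dac_codes(lines, n_chips=4, chans_per_chip=32):
    codes = [int(tok) for line in lines for tok in line.split()]
    if len(codes) != n_chips * chans_per_chip:
        raise ValueError("expected %d channel codes" % (n_chips * chans_per_chip))
    # bank[i] holds the codes destined for chip i
    return [codes[i * chans_per_chip:(i + 1) * chans_per_chip]
            for i in range(n_chips)]

demo = [" ".join(str(v) for v in range(128))]  # stands in for the text file
banks = load_dac_codes(demo)
print(len(banks), banks[3][0])  # → 4 96
```

In the real system the four banks would then be clocked out over the parallel port in lockstep, which is what allows all four chips to update simultaneously.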

  3. AEST: Adaptive Eigenvalue Stability Code

    NASA Astrophysics Data System (ADS)

    Zheng, L.-J.; Kotschenreuther, M.; Waelbroeck, F.; van Dam, J. W.; Berk, H.

    2002-11-01

An adaptive eigenvalue linear stability code is developed. The aim is, on one hand, to include non-ideal MHD effects in the global MHD stability calculation for both low and high n modes and, on the other hand, to resolve the numerical difficulty posed by the MHD singularity on the rational surfaces at marginal stability. Our code follows parts of the philosophy of DCON, abandoning relaxation methods based on radial finite element expansion in favor of an efficient shooting procedure with adaptive gridding. The δW criterion is replaced by the shooting procedure and a subsequent matrix eigenvalue problem. Since the technique of expanding a general solution into a summation of the independent solutions is employed, the rank of the matrices involved is only a few hundred. This makes it easier to solve the eigenvalue problem with non-ideal MHD effects, such as FLR or even full kinetic effects, as well as plasma rotation, taken into account. To include kinetic effects, the approach of solving for the distribution function as a local eigenvalue ω problem, as in the GS2 code, will be employed in the future. Comparison of the ideal MHD version of the code with DCON, PEST, and GATO will be discussed. The non-ideal MHD version of the code will be employed to study, as an application, transport barrier physics in tokamak discharges.

  4. A genetic scale of reading frame coding.

    PubMed

    Michel, Christian J

    2014-08-21

The reading frame coding (RFC) of codes (sets) of trinucleotides is a genetic concept which has been largely ignored during the last 50 years. A first objective is the definition of a new and simple statistical parameter PrRFC for analysing the probability (efficiency) of reading frame coding (RFC) of any trinucleotide code. A second objective is to reveal different classes and subclasses of trinucleotide codes involved in reading frame coding: the circular codes of 20 trinucleotides and the bijective genetic codes of 20 trinucleotides coding the 20 amino acids. This approach allows us to propose a genetic scale of reading frame coding which ranges from 1/3 with the random codes (RFC probability identical in the three frames) to 1 with the comma-free circular codes (RFC probability maximal in the reading frame and null in the two shifted frames). This genetic scale shows, in particular, the reading frame coding probabilities of the 12,964,440 circular codes (PrRFC=83.2% on average), the 216 C(3) self-complementary circular codes (PrRFC=84.1% on average) including the code X identified in eukaryotic and prokaryotic genes (PrRFC=81.3%) and the 339,738,624 bijective genetic codes (PrRFC=61.5% on average) including the 52 codes without permuted trinucleotides (PrRFC=66.0% on average). Otherwise, the reading frame coding probabilities of each trinucleotide code coding an amino acid with the universal genetic code are also determined. The four amino acids Gly, Lys, Phe and Pro are coded by codes (not circular) with RFC probabilities equal to 2/3, 1/2, 1/2 and 2/3, respectively. The amino acid Leu is coded by a circular code (not comma-free) with an RFC probability equal to 18/19. The 15 other amino acids are coded by comma-free circular codes, i.e. with RFC probabilities equal to 1. The identification of coding properties in some classes of trinucleotide codes studied here may bring new insights into the origin and evolution of the genetic code.
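The comma-free end of the RFC scale is easy to make concrete: a trinucleotide code is comma-free when no codeword appears in a shifted frame straddling the boundary of two in-frame codewords. A minimal sketch of that predicate (only the property check, not the paper's PrRFC statistic):

```python
# Comma-free check for a set of trinucleotides: for every in-frame pair of
# codewords w1 w2, neither of the two shifted trinucleotides crossing their
# boundary may itself be a codeword. Comma-free circular codes have
# RFC probability 1, the top of the genetic scale described above.

def is_comma_free(code):
    code = set(code)
    for w1 in code:
        for w2 in code:
            pair = w1 + w2                             # two codewords in frame
            if pair[1:4] in code or pair[2:5] in code:
                return False                           # a shifted frame re-reads a codeword
    return True

print(is_comma_free({"AAC", "AAG"}))  # → True
print(is_comma_free({"AAA"}))         # → False ("AAA" re-reads itself shifted)
```

A full PrRFC computation would additionally weigh how often each shifted frame decodes correctly, rather than just testing whether it ever can.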

  5. Roadmap to Majorana surface codes

    NASA Astrophysics Data System (ADS)

    Plugge, S.; Landau, L. A.; Sela, E.; Altland, A.; Flensberg, K.; Egger, R.

    2016-11-01

    Surface codes offer a very promising avenue towards fault-tolerant quantum computation. We argue that two-dimensional interacting networks of Majorana bound states in topological superconductor/semiconductor heterostructures hold several key advantages in that direction, concerning both the hardware realization and the actual operation of the code. We here discuss how topologically protected logical qubits in this Majorana surface code architecture can be defined, initialized, manipulated, and read out. All physical ingredients needed to implement these operations are routinely used in topologically trivial quantum devices. By means of quantum interference terms in linear conductance measurements, single-electron pumping protocols, and gate-tunable tunnel barriers, the full set of quantum gates required for universal quantum computation can be achieved. In particular, we show that designated multistep pumping sequences via tunnel-coupled quantum dots realize high-fidelity ancilla states for phase gates.

  6. ASME Code Efforts Supporting HTGRs

    SciTech Connect

    D.K. Morton

    2012-09-01

In 1999, an international collaborative initiative for the development of advanced (Generation IV) reactors was started. The idea behind this effort was to bring nuclear energy closer to the needs of sustainability, to increase proliferation resistance, and to support concepts able to produce energy (both electricity and process heat) at competitive costs. The U.S. Department of Energy has supported this effort by pursuing the development of the Next Generation Nuclear Plant, a high temperature gas-cooled reactor. This support has included research and development of pertinent data, initial regulatory discussions, and engineering support of various codes and standards development. This report discusses the various applicable American Society of Mechanical Engineers (ASME) codes and standards that are being developed to support these high temperature gas-cooled reactors during construction and operation. ASME is aggressively pursuing these codes and standards to support an international effort to build the next generation of advanced reactors so that all can benefit.

  7. A coded tracking telemetry system

    USGS Publications Warehouse

    Howey, P.W.; Seegar, W.S.; Fuller, M.R.; Titus, K.; Amlaner, Charles J.

    1989-01-01

We describe the general characteristics of an automated radio telemetry system designed to operate for prolonged periods on a single frequency. Each transmitter sends a unique coded signal to a receiving system that decodes and records only the appropriate, pre-programmed codes. A record of the time of each reception is stored on diskettes in a micro-computer. This system enables continuous monitoring of infrequent signals (e.g., one per minute or one per hour), thus extending operational life or allowing size reduction of the transmitter, compared to conventional wildlife telemetry. Furthermore, when using unique codes transmitted on a single frequency, biologists can monitor many individuals without exceeding the radio frequency allocations for wildlife.
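The receiving logic described above reduces to a filter: on the single shared frequency, keep a timestamped entry only for pre-programmed transmitter codes. A hedged sketch with illustrative data structures, not the system's actual firmware:

```python
# Illustrative receiver-side filter: of everything decoded off the air,
# log only receptions whose transmitter code was pre-programmed.
# Tuple layout and code values are assumptions for the example.

def log_receptions(receptions, programmed_codes):
    """receptions: iterable of (timestamp, code) pairs decoded off the air;
    returns only the receptions whose code is pre-programmed."""
    allowed = set(programmed_codes)
    return [(t, code) for t, code in receptions if code in allowed]

heard = [("12:00:01", 17), ("12:00:02", 99), ("12:01:01", 17)]
print(log_receptions(heard, programmed_codes={17, 23}))
# → [('12:00:01', 17), ('12:01:01', 17)]
```

Because each animal's transmitter has a unique code, one frequency allocation serves many individuals, which is the point the abstract makes about spectrum use.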

  8. FLOWTRAN-TF code benchmarking

    SciTech Connect

    Flach, G.P.

    1990-12-01

    FLOWTRAN-TF is a two-component (air-water), two-phase thermal-hydraulics code designed for performing accident analyses of SRS reactor fuel assemblies during the Emergency Cooling System (ECS) phase of a Double Ended Guillotine Break (DEGB) Loss Of Coolant Accident (LOCA). A description of the code is given by Flach et al. (1990). This report provides benchmarking results for the version of FLOWTRAN-TF used to compute the Recommended K-Reactor Restart ECS Power Limit (Smith et al., 1990a; 1990b). Individual constitutive relations are benchmarked in Sections 2 through 5 while in Sections 6 and 7 integral code benchmarking results are presented. An overall assessment of FLOWTRAN-TF for its intended use in computing the ECS power limit completes the document.

  9. FLOWTRAN-TF code description

    SciTech Connect

    Flach, G.P.

    1990-12-01

    FLOWTRAN-TF is a two-component (air-water), two-phase thermal-hydraulics code designed for performing accident analyses of SRS reactor fuel assemblies during the Emergency Cooling System (ECS) phase of a Double Ended Guillotine Break (DEGB) Loss of Coolant Accident (LOCA). This report provides a brief description of the physical models in the version of FLOWTRAN-TF used to compute the Recommended K-Reactor Restart ECS Power Limit. This document is viewed as an interim report and should ultimately be superseded by a comprehensive user/programmer manual. In general, only high level discussions of governing equations and constitutive laws are presented. Numerical implementation of these models, code architecture and user information are not generally covered. A companion document describing code benchmarking is available.

  10. FLOWTRAN-TF code description

    SciTech Connect

    Flach, G.P.

    1991-09-01

    FLOWTRAN-TF is a two-component (air-water), two-phase thermal-hydraulics code designed for performing accident analyses of SRS reactor fuel assemblies during the Emergency Cooling System (ECS) phase of a Double Ended Guillotine Break (DEGB) Loss of Coolant Accident (LOCA). This report provides a brief description of the physical models in the version of FLOWTRAN-TF used to compute the Recommended K-Reactor Restart ECS Power Limit. This document is viewed as an interim report and should ultimately be superseded by a comprehensive user/programmer manual. In general, only high level discussions of governing equations and constitutive laws are presented. Numerical implementation of these models, code architecture and user information are not generally covered. A companion document describing code benchmarking is available.

  11. ASME Code Efforts Supporting HTGRs

    SciTech Connect

    D.K. Morton

    2011-09-01

In 1999, an international collaborative initiative for the development of advanced (Generation IV) reactors was started. The idea behind this effort was to bring nuclear energy closer to the needs of sustainability, to increase proliferation resistance, and to support concepts able to produce energy (both electricity and process heat) at competitive costs. The U.S. Department of Energy has supported this effort by pursuing the development of the Next Generation Nuclear Plant, a high temperature gas-cooled reactor. This support has included research and development of pertinent data, initial regulatory discussions, and engineering support of various codes and standards development. This report discusses the various applicable American Society of Mechanical Engineers (ASME) codes and standards that are being developed to support these high temperature gas-cooled reactors during construction and operation. ASME is aggressively pursuing these codes and standards to support an international effort to build the next generation of advanced reactors so that all can benefit.

  12. ASME Code Efforts Supporting HTGRs

    SciTech Connect

    D.K. Morton

    2010-09-01

In 1999, an international collaborative initiative for the development of advanced (Generation IV) reactors was started. The idea behind this effort was to bring nuclear energy closer to the needs of sustainability, to increase proliferation resistance, and to support concepts able to produce energy (both electricity and process heat) at competitive costs. The U.S. Department of Energy has supported this effort by pursuing the development of the Next Generation Nuclear Plant, a high temperature gas-cooled reactor. This support has included research and development of pertinent data, initial regulatory discussions, and engineering support of various codes and standards development. This report discusses the various applicable American Society of Mechanical Engineers (ASME) codes and standards that are being developed to support these high temperature gas-cooled reactors during construction and operation. ASME is aggressively pursuing these codes and standards to support an international effort to build the next generation of advanced reactors so that all can benefit.

  13. Verification of FANTASTIC integrated code

    NASA Technical Reports Server (NTRS)

    Chauhan, Rajinder Singh

    1987-01-01

    FANTASTIC is an acronym for Failure Analysis Nonlinear Thermal and Structural Integrated Code. This program was developed by Failure Analysis Associates, Palo Alto, Calif., for MSFC to improve the accuracy of solid rocket motor nozzle analysis. FANTASTIC has three modules: FACT - thermochemical analysis; FAHT - heat transfer analysis; and FAST - structural analysis. All modules have keywords for data input. Work is in progress for the verification of the FAHT module, which is done by using data for various problems with known solutions as inputs to the FAHT module. The information obtained is used to identify problem areas of the code and passed on to the developer for debugging purposes. Failure Analysis Associates have revised the first version of the FANTASTIC code and a new improved version has been released to the Thermal Systems Branch.

  14. Nonlinear Simulation of Alfven Eigenmodes driven by Energetic Particles: Comparison between HMGC and TAEFL Codes

    NASA Astrophysics Data System (ADS)

    Bierwage, Andreas; Spong, Donald A.

    2009-05-01

    Hybrid-MHD-Gyrokinetic Code (HMGC) [1] and the gyrofluid code TAEFL [2,3] are used for nonlinear simulation of Alfven Eigenmodes in Tokamak plasma. We compare results obtained in two cases: (I) a case designed for cross-code benchmark of TAE excitation; (II) a case based on a dedicated DIII-D shot #132707 where RSAE and TAE activity is observed. Differences between the numerical simulation results are discussed and future directions are outlined. [1] S. Briguglio, G. Vlad, F. Zonca and C. Kar, Phys. Plasmas 2 (1995) 3711. [2] D.A. Spong, B.A. Carreras and C.L. Hedrick, Phys. Fluids B4 (1992) 3316. [3] D.A. Spong, B.A. Carreras and C.L. Hedrick, Phys. Plasmas 1 (1994) 1503.

  15. The stellar atmosphere simulation code Bifrost. Code description and validation

    NASA Astrophysics Data System (ADS)

    Gudiksen, B. V.; Carlsson, M.; Hansteen, V. H.; Hayek, W.; Leenaarts, J.; Martínez-Sykora, J.

    2011-07-01

Context. Numerical simulations of stellar convection and photospheres have been developed to the point where detailed shapes of observed spectral lines can be explained. Stellar atmospheres are very complex, and very different physical regimes are present in the convection zone, photosphere, chromosphere, transition region and corona. To understand the details of the atmosphere it is necessary to simulate the whole atmosphere since the different layers interact strongly. These physical regimes are very diverse and it takes a highly efficient massively parallel numerical code to solve the associated equations. Aims: The design, implementation and validation of the massively parallel numerical code Bifrost for simulating stellar atmospheres from the convection zone to the corona. Methods: The code is subjected to a number of validation tests, among them the Sod shock tube test, the Orszag-Tang colliding shock test, boundary condition tests and tests of how the code treats magnetic field advection, chromospheric radiation, radiative transfer in an isothermal scattering atmosphere, hydrogen ionization and thermal conduction. Results: Bifrost completes the tests with good results and shows near-linear efficiency scaling to thousands of computing cores.

  16. Hybrid codes: Methods and applications

    SciTech Connect

    Winske, D. ); Omidi, N. )

    1991-01-01

In this chapter we discuss "hybrid" algorithms used in the study of low frequency electromagnetic phenomena, where one or more ion species are treated kinetically via standard PIC methods used in particle codes and the electrons are treated as a single charge-neutralizing massless fluid. Other types of hybrid models are possible, as discussed in Winske and Quest, but hybrid codes with particle ions and massless fluid electrons have become the most common for simulating space plasma physics phenomena in the last decade, as we discuss in this paper.

  17. Sensor Authentication: Embedded Processor Code

    SciTech Connect

    Svoboda, John

    2012-09-25

Described is the C code running on the embedded Microchip 32-bit PIC32MX575F256H located on the INL-developed noise analysis circuit board. The code performs the following functions: controls the noise analysis circuit board preamplifier voltage gains of 1, 10, 100, 000; initializes the analog-to-digital conversion hardware, input channel selection, Fast Fourier Transform (FFT) function, USB communications interface, and internal memory allocations; initiates high-resolution 4096-point 200 kHz data acquisition; computes the complex 2048-point FFT and FFT magnitude; services the Host command set; transfers raw data to the Host; transfers the FFT result to the Host; and performs communication error checking.
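The acquisition-to-spectrum path described above (4096 samples at 200 kHz reduced to a 2048-bin FFT magnitude) can be sketched with NumPy. The windowing and scaling choices here are illustrative, not those of the INL firmware.

```python
# Hedged sketch of the signal path: 4096-sample acquisition at 200 kHz,
# reduced to a 2048-bin magnitude spectrum. The test tone and the lack of
# a window function are illustrative simplifications.

import numpy as np

FS = 200_000  # sample rate, Hz
N = 4096      # acquisition length, samples

t = np.arange(N) / FS
x = np.sin(2 * np.pi * 12_500 * t)       # test tone at 12.5 kHz
spectrum = np.fft.rfft(x)[:N // 2]       # keep 2048 complex bins
magnitude = np.abs(spectrum)
peak_bin = int(np.argmax(magnitude))
print(peak_bin * FS / N)                 # → 12500.0 (the tone frequency)
```

The frequency resolution is FS/N ≈ 48.8 Hz per bin, so a 12.5 kHz tone lands exactly on bin 256, which is why the peak recovers the tone frequency with no spectral leakage here.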

  18. Guidelines for Coding FORTRAN Programs.

    DTIC Science & Technology

    1982-07-01

[Garbled report-form and table-of-contents fragment] Controlling office: Naval Ocean Research and Development Activity, Code 320; report date July 1982; Program Element 63759N. Recoverable contents: 1. Introduction (1.1 Scope); 2. Coding FORTRAN Statements (2.1 General Remarks and Suggestions, 2.2 FORTRAN...); 5. Assignment statements (5.3 Masking Assignment, 5.4 Multiple Assignment); 6. Control Statements (6.1 GO TO Statement, 6.1.1 Unconditional GO TO Statement).

  19. Radio Losses for Concatenated Codes

    NASA Astrophysics Data System (ADS)

    Shambayati, S.

    2002-07-01

The advent of higher-powered spacecraft amplifiers and better ground receivers capable of tracking spacecraft carrier signals with narrower loop bandwidths requires better understanding of the carrier tracking loss (radio loss) mechanism of the concatenated codes used for deep-space missions. In this article, we present results of simulations performed for a (7,1/2), Reed-Solomon (255,223), interleaver depth-5 concatenated code in order to shed some light on this issue. Through these simulations, we obtained the performance of this code over an additive white Gaussian noise (AWGN) channel (the baseline performance) in terms of both its frame-error rate (FER) and its bit-error rate at the output of the Reed-Solomon decoder (RS-BER). After obtaining these results, we curve fitted the baseline performance curves for FER and RS-BER and calculated the high-rate radio losses for this code for an FER of 10^(-4) and its corresponding baseline RS-BER of 2.1 x 10^(-6) for a carrier loop signal-to-noise ratio (SNR) of 14.8 dB. This calculation revealed that even though over the AWGN channel the FER value and the RS-BER value correspond to each other (i.e., these values are obtained by the same bit SNR value), the RS-BER value has higher high-rate losses than does the FER value. Furthermore, this calculation contradicted the previous assumption that at high data rates concatenated codes have the same radio losses as their constituent convolutional codes. Our results showed much higher losses for the FER and the RS-BER (by as much as 2 dB) than for the corresponding baseline BER of the convolutional code. Further simulations were performed to investigate the effects of changes in the data rate on the code's radio losses. It was observed that as the data rate increased, the radio losses for both the FER and the RS-BER approached their respective calculated high-rate values. Furthermore, these simulations showed that a simple two-parameter function could model the increase in the

  20. Coding productivity in Sydney public hospitals.

    PubMed

    Dimitropoulos, Vera; Bennett, Adam; McIntosh, Jean

    2002-01-01

The aims of this study were to compare Sydney public hospitals regarding medical record coding times, to compare observed coding times with the coding times necessary to avoid backlog, and to evaluate the impact on coding time of casemix complexity, coder age, experience, job satisfaction, employment status, and salary. Coding time (in minutes) for each medical record over a two-week period was documented by 61 coders employed in 13 hospitals: six principal referral (PR), six major metropolitan (MM), and one paediatric specialist (PS) hospital. The mean coding time for each coder was estimated by averaging across coding times for all records during the two-week period. In order to compare hospital mean coding times, the hospitals were grouped into PR and MM/PS groups. The mean coding time necessary to avoid coding backlog (expected coding time) for each hospital group was based on the total number of annual separations and filled full-time equivalent coding positions. The observed mean coding time was longer in the PR group than in the MM/PS group (p = 0.019); however, the observed coding time was within the expected coding time limit in both the PR and MM/PS groups. Casemix complexity tended to influence coding time, but neither age, experience, job satisfaction, employment status nor salary had any impact. In conclusion, the expected coding times, if reliable, indicate that coders in the two hospital groups were keeping coding up-to-date. Thus, the variation between hospital groups in coding time is of little importance, given that the main objective in coding productivity is to maintain the coding workload.

  1. Multiple component codes based generalized LDPC codes for high-speed optical transport.

    PubMed

    Djordjevic, Ivan B; Wang, Ting

    2014-07-14

A class of generalized low-density parity-check (GLDPC) codes suitable for optical communications is proposed, which consists of multiple local codes. It is shown that Hamming, BCH, and Reed-Muller codes can be used as local codes, and that the maximum a posteriori probability (MAP) decoding of these local codes by the Ashikhmin-Lytsin algorithm is feasible in terms of complexity and performance. We demonstrate that record coding gains can be obtained from properly designed GLDPC codes derived from multiple component codes. We then show that several recently proposed classes of LDPC codes, such as convolutional and spatially-coupled codes, can be described using the concept of GLDPC coding, which indicates that GLDPC coding can be used as a unified platform for advanced FEC enabling ultra-high-speed optical transport. The proposed class of GLDPC codes is also suitable for code-rate adaptation, to adjust the error correction strength depending on the optical channel conditions.

  2. 49 CFR 178.702 - IBC codes.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 2 2010-10-01 2010-10-01 false IBC codes. 178.702 Section 178.702 Transportation...-Oriented Standards § 178.702 IBC codes. (a) Intermediate bulk container code designations consist of: two... the category of intermediate bulk container. (1) IBC code number designations are as follows: Type...

  3. 32 CFR 635.19 - Offense codes.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 32 National Defense 4 2010-07-01 2010-07-01 true Offense codes. 635.19 Section 635.19 National... INVESTIGATIONS LAW ENFORCEMENT REPORTING Offense Reporting § 635.19 Offense codes. (a) The offense code describes, as nearly as possible, the complaint or offense by using an alphanumeric code. Appendix C of AR...

  4. 7 CFR 201.24 - Code designation.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Code designation. 201.24 Section 201.24 Agriculture... REGULATIONS Labeling Agricultural Seeds § 201.24 Code designation. The code designation used in lieu of the... as may be designated by him for the purpose. When used, the code designation shall appear on...

  5. 7 CFR 201.28 - Code designation.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Code designation. 201.28 Section 201.28 Agriculture... REGULATIONS Labeling Vegetable Seeds § 201.28 Code designation. The code designation used in lieu of the full... as may be designated by him for the purpose. When used, the code designation shall appear on...

  6. An Interactive Concatenated Turbo Coding System

    NASA Technical Reports Server (NTRS)

    Liu, Ye; Tang, Heng; Lin, Shu; Fossorier, Marc

    1999-01-01

This paper presents a concatenated turbo coding system in which a Reed-Solomon outer code is concatenated with a binary turbo inner code. In the proposed system, the outer code decoder and the inner turbo code decoder interact to achieve both good bit error and frame error performances. The outer code decoder helps the inner turbo code decoder to terminate its decoding iteration while the inner turbo code decoder provides soft-output information to the outer code decoder to carry out a reliability-based soft-decision decoding. In the case that the outer code decoding fails, the outer code decoder instructs the inner code decoder to continue its decoding iterations until the outer code decoding is successful or a preset maximum number of decoding iterations is reached. This interaction between outer and inner code decoders reduces decoding delay. Also presented in the paper are an effective criterion for stopping the iteration process of the inner code decoder and a new reliability-based decoding algorithm for nonbinary codes.
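The decoder interaction described above is a simple control loop: the inner turbo decoder iterates, the outer Reed-Solomon decoder tries after each pass, and iteration stops early on outer-decode success. A hedged sketch with stand-in stub decoders (the real decoders are far more involved):

```python
# Sketch of the early-termination loop from the abstract. inner_iterate and
# outer_decode are hypothetical stand-ins for the turbo and Reed-Solomon
# decoders; only the control flow reflects the described system.

def decode_frame(inner_iterate, outer_decode, max_iters=8):
    """Run inner iterations until the outer decoder succeeds or the cap is hit.
    inner_iterate(i) -> soft outputs after iteration i;
    outer_decode(soft) -> decoded frame, or None on failure."""
    for i in range(1, max_iters + 1):
        soft = inner_iterate(i)
        frame = outer_decode(soft)
        if frame is not None:
            return frame, i            # early termination reduces decoding delay
    return None, max_iters             # outer decoding failed at the cap

# Toy stand-ins: inner soft output "improves" each pass; outer succeeds at 3.
result, iters = decode_frame(lambda i: i, lambda s: b"ok" if s >= 3 else None)
print(result, iters)  # → b'ok' 3
```

This is the source of the delay reduction the abstract claims: most frames terminate after a few inner iterations, and only hard frames run to the preset maximum.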

  7. Convolutional coding combined with continuous phase modulation

    NASA Technical Reports Server (NTRS)

    Pizzi, S. V.; Wilson, S. G.

    1985-01-01

    Background theory and specific coding designs for combined coding/modulation schemes utilizing convolutional codes and continuous-phase modulation (CPM) are presented. In this paper the case of r = 1/2 coding onto a 4-ary CPM is emphasized, with short-constraint length codes presented for continuous-phase FSK, double-raised-cosine, and triple-raised-cosine modulation. Coding buys several decibels of coding gain over the Gaussian channel, with an attendant increase of bandwidth. Performance comparisons in the power-bandwidth tradeoff with other approaches are made.

  8. Multichannel error correction code decoder

    NASA Technical Reports Server (NTRS)

    Wagner, Paul K.; Ivancic, William D.

    1993-01-01

    A brief overview of a processing satellite for a mesh very-small-aperture (VSAT) communications network is provided. The multichannel error correction code (ECC) decoder system, the uplink signal generation and link simulation equipment, and the time-shared decoder are described. The testing is discussed. Applications of the time-shared decoder are recommended.

  9. Tri-Coding of Information.

    ERIC Educational Resources Information Center

    Simpson, Timothy J.

Paivio's Dual Coding Theory has received widespread recognition for its connection between visual and aural channels of internal information processing. The use of only two channels, however, cannot satisfactorily explain the effects witnessed every day. This paper presents a study suggesting the presence of a third, kinesthetic channel, currently…

  10. Three-dimensional stellarator codes

    PubMed Central

    Garabedian, P. R.

    2002-01-01

    Three-dimensional computer codes have been used to develop quasisymmetric stellarators with modular coils that are promising candidates for a magnetic fusion reactor. The mathematics of plasma confinement raises serious questions about the numerical calculations. Convergence studies have been performed to assess the best configurations. Comparisons with recent data from large stellarator experiments serve to validate the theory. PMID:12140367

  11. Overview of CODE V development

    NASA Astrophysics Data System (ADS)

    Harris, Thomas I.

    1991-01-01

This paper is part of a session aimed at briefly describing some of today's optical design software packages, with emphasis on each program's philosophy and technology. CODE V is the ongoing result of a development process that began in the 1960s; it is now the result of many people's efforts. This paper summarizes the roots of the program, some of its history, and the dominant philosophies and technologies that have contributed to its usefulness and that drive its continued development. ROOTS OF CODE V: Conceived in the early 60s, at a time when there was skepticism that "automatic design" could design lenses equal to or better than "hand" methods. The concepts underlying CODE V and its predecessors were based on ten years of experience and exposure to the problems of a group of lens designers in a design-for-manufacture environment. The basic challenge was to show that lens design could be done better, easier, and faster by high-quality computer-assisted design tools. The earliest development was for our own use as an engineering services organization: an in-house tool for custom design. As a tool it had to make us efficient in providing lens design and engineering services as a self-sustaining business. PHILOSOPHY OF OPTIMIZATION IN CODE V: Error function formation. Based on experience as a designer, we felt very strongly that there should be a clear separation of

  12. Reusable State Machine Code Generator

    NASA Astrophysics Data System (ADS)

    Hoffstadt, A. A.; Reyes, C.; Sommer, H.; Andolfato, L.

    2010-12-01

    The State Machine model is frequently used to represent the behaviour of a system, allowing one to express and execute this behaviour in a deterministic way. A graphical representation such as a UML State Chart diagram tames the complexity of the system, thus facilitating changes to the model and communication between developers and domain experts. We present a reusable state machine code generator, developed by the Universidad Técnica Federico Santa María and the European Southern Observatory. The generator itself is based on the open source project architecture, and uses UML State Chart models as input. This allows for a modular design and a clean separation between generator and generated code. The generated state machine code has well-defined interfaces that are independent of the implementation artefacts such as the middle-ware. This allows using the generator in the substantially different observatory software of the Atacama Large Millimeter Array and the ESO Very Large Telescope. A project-specific mapping layer for event and transition notification connects the state machine code to its environment, which can be the Common Software of these projects, or any other project. This approach even allows tests for a generated state machine to be created automatically, using techniques from software testing, such as path-coverage.
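The transition logic such a generator emits typically reduces to a table keyed by (state, event) pairs. The sketch below is illustrative only; the states, events, and class names are invented for this example, not taken from the ESO/UTFSM generator:

```python
from enum import Enum, auto

class State(Enum):
    IDLE = auto()
    RUNNING = auto()
    ERROR = auto()

class StateMachine:
    """Table-driven state machine; the table plays the role of generated code."""
    TRANSITIONS = {
        (State.IDLE, "start"): State.RUNNING,
        (State.RUNNING, "stop"): State.IDLE,
        (State.RUNNING, "fail"): State.ERROR,
        (State.ERROR, "reset"): State.IDLE,
    }

    def __init__(self):
        self.state = State.IDLE

    def handle(self, event):
        # Undefined (state, event) pairs leave the state unchanged,
        # a common UML convention for unhandled events.
        self.state = self.TRANSITIONS.get((self.state, event), self.state)
        return self.state
```

Because the behaviour lives entirely in the table, path-coverage tests of the kind the abstract mentions can be enumerated mechanically from the table's keys.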

  13. Changing Postal ZIP Code Boundaries

    DTIC Science & Technology

    2006-06-23

    for distribution to a specific delivery post office, identified by the fourth and fifth digits. For example, the ZIP Code for Alturas, the county seat...distribution point for some California post offices such as Alturas, Cedarville (96104), Fort Bidwell (96112), and Likely (96116), distinguished by the
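The digit structure described above (a leading prefix routing mail to a distribution point, with the fourth and fifth digits selecting the delivery post office) can be sketched as a small parser; the field names here are illustrative, not official USPS terminology:

```python
def parse_zip(zip_code: str) -> dict:
    """Split a 5-digit ZIP Code into its routing components."""
    if len(zip_code) != 5 or not zip_code.isdigit():
        raise ValueError("expected a 5-digit ZIP Code")
    return {
        "national_area": zip_code[0],      # broad geographic region
        "sectional_center": zip_code[:3],  # sorting/distribution prefix
        "delivery_office": zip_code[3:],   # digits 4-5: delivery post office
    }

# Cedarville (96104) from the example above shares the 961 prefix
# with the other Modoc County offices listed.
cedarville = parse_zip("96104")
```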

  14. Coded continuous wave meteor radar

    NASA Astrophysics Data System (ADS)

    Vierinen, Juha; Chau, Jorge L.; Pfeffer, Nico; Clahsen, Matthias; Stober, Gunter

    2016-03-01

    The concept of a coded continuous wave specular meteor radar (SMR) is described. The radar uses a continuously transmitted pseudorandom phase-modulated waveform, which has several advantages compared to conventional pulsed SMRs. The coding avoids range and Doppler aliasing, which are in some cases problematic with pulsed radars. Continuous transmissions maximize pulse compression gain, allowing operation at lower peak power than a pulsed system. With continuous coding, the temporal and spectral resolution are not dependent on the transmit waveform and they can be fairly flexibly changed after performing a measurement. The low signal-to-noise ratio before pulse compression, combined with independent pseudorandom transmit waveforms, allows multiple geographically separated transmitters to be used in the same frequency band simultaneously without significantly interfering with each other. Because the same frequency band can be used by multiple transmitters, the same interferometric receiver antennas can be used to receive multiple transmitters at the same time. The principles of the signal processing are discussed, in addition to discussion of several practical ways to increase computation speed, and how to optimally detect meteor echoes. Measurements from a campaign performed with a coded continuous wave SMR are shown and compared with two standard pulsed SMR measurements. The type of meteor radar described in this paper would be suited for use in a large-scale multi-static network of meteor radar transmitters and receivers. Such a system would be useful for increasing the number of meteor detections to obtain improved meteor radar data products.
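The pulse-compression step at the heart of such a system is a cross-correlation of the received signal against the known pseudorandom code: the correlation peaks at the echo's delay. A minimal sketch with a binary (+1/−1) phase code; the code length and delay are arbitrary illustration values:

```python
import random

def correlate(code, rx):
    """Cross-correlate a +/-1 code against a received sequence at each lag."""
    n = len(code)
    return [sum(c * r for c, r in zip(code, rx[lag:lag + n]))
            for lag in range(len(rx) - n + 1)]

random.seed(1)
code = [random.choice([-1, 1]) for _ in range(128)]  # pseudorandom phase code
delay = 37
rx = [0] * delay + code + [0] * 32                   # noiseless echo delayed by 37 samples

peaks = correlate(code, rx)
best_lag = max(range(len(peaks)), key=lambda i: peaks[i])
```

At the true delay every code chip aligns, so the peak equals the code length (the pulse-compression gain); at other lags the pseudorandom products largely cancel, which is also why independent codes from separate transmitters interfere only weakly.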

  15. QR Codes: Taking Collections Further

    ERIC Educational Resources Information Center

    Ahearn, Caitlin

    2014-01-01

    With some thought and direction, QR (quick response) codes are a great tool to use in school libraries to enhance access to information. From March through April 2013, Caitlin Ahearn interned at Sanborn Regional High School (SRHS) under the supervision of Pam Harland. As a result of Harland's un-Deweying of the nonfiction collection at SRHS,…

  16. Generating Constant Weight Binary Codes

    ERIC Educational Resources Information Center

    Knight, D.G.

    2008-01-01

    The determination of bounds for A(n, d, w), the maximum possible number of binary vectors of length n, weight w, and pairwise Hamming distance no less than d, is a classic problem in coding theory. Such sets of vectors have many applications. A description is given of how the problem can be used in a first-year undergraduate computational…
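A first-year computational treatment of A(n, d, w) usually starts from a greedy lexicographic search, which yields a lower bound. Since two weight-w vectors are at Hamming distance ≥ d exactly when their supports overlap in at most w − ⌈d/2⌉ positions, the check can be done on supports; this sketch is one such exercise, not taken from the article:

```python
from itertools import combinations

def greedy_constant_weight(n, d, w):
    """Greedy lexicographic lower bound on A(n, d, w).

    Distance between weight-w vectors with supports S, T is 2*(w - |S & T|),
    so distance >= d is equivalent to |S & T| <= w - ceil(d/2).
    """
    max_overlap = w - (d + 1) // 2
    chosen = []
    for support in combinations(range(n), w):
        s = set(support)
        if all(len(s & c) <= max_overlap for c in chosen):
            chosen.append(s)
    return chosen
```

For small parameters the greedy bound can already be tight; e.g. it finds 4 codewords for (n, d, w) = (6, 4, 3), matching the known value A(6, 4, 3) = 4.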

  17. Dress Codes and Gang Activity.

    ERIC Educational Resources Information Center

    Gluckman, Ivan B.

    1996-01-01

    Concern with school violence and efforts to reduce gang visibility at school have led to controversy about students' constitutional rights to freedom of expression. This document outlines legal precedents and offers guidelines for developing a sound school policy on dress codes. It answers the following questions: (1) Are gang clothing and symbols…

  18. AEDS Property Classification Code Manual.

    ERIC Educational Resources Information Center

    Association for Educational Data Systems, Washington, DC.

    The control and inventory of property items using data processing machines requires a form of numerical description or code which will allow a maximum of description in a minimum of space on the data card. An adaptation of a standard industrial classification system is given to cover any expendable warehouse item or non-expendable piece of…

  19. Quantum rate-distortion coding

    NASA Astrophysics Data System (ADS)

    Barnum, Howard

    2000-10-01

    I introduce rate-distortion theory for the coding of quantum information, and derive a lower bound, involving the coherent information, on the rate at which qubits must be used to store or compress an entangled quantum source with a given maximum level of distortion per source emission.
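The shape of such a bound can be sketched in LaTeX. The notation below is an assumption based on standard conventions rather than a quotation from the paper, with $I_c$ the coherent information of the source state $\rho$ sent through a map $\mathcal{N}$:

```latex
% Coherent information of \rho through \mathcal{N},
% with |\psi_\rho\rangle a purification of \rho:
I_c(\rho,\mathcal{N}) = S\big(\mathcal{N}(\rho)\big)
  - S\big((\mathcal{N}\otimes\mathrm{id})(|\psi_\rho\rangle\langle\psi_\rho|)\big)

% Rate-distortion lower bound: minimize over maps whose
% average distortion per source emission does not exceed D.
R(D) \ge \min_{\mathcal{N}\,:\,d(\rho,\mathcal{N})\le D} I_c(\rho,\mathcal{N})
```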

  20. Transversal Clifford gates on folded surface codes

    SciTech Connect

    Moussa, Jonathan E.

    2016-10-12

    Surface and color codes are two forms of topological quantum error correction in two spatial dimensions with complementary properties. Surface codes have lower-depth error detection circuits and well-developed decoders to interpret and correct errors, while color codes have transversal Clifford gates and better code efficiency in the number of physical qubits needed to achieve a given code distance. A formal equivalence exists between color codes and folded surface codes, but it does not guarantee the transferability of any of these favorable properties. However, the equivalence does imply the existence of constant-depth circuit implementations of logical Clifford gates on folded surface codes. We achieve and improve this result by constructing two families of folded surface codes with transversal Clifford gates. This construction is presented generally for qudits of any dimension. Lastly, the specific application of these codes to universal quantum computation based on qubit fusion is also discussed.

  1. Transversal Clifford gates on folded surface codes

    DOE PAGES

    Moussa, Jonathan E.

    2016-10-12

    Surface and color codes are two forms of topological quantum error correction in two spatial dimensions with complementary properties. Surface codes have lower-depth error detection circuits and well-developed decoders to interpret and correct errors, while color codes have transversal Clifford gates and better code efficiency in the number of physical qubits needed to achieve a given code distance. A formal equivalence exists between color codes and folded surface codes, but it does not guarantee the transferability of any of these favorable properties. However, the equivalence does imply the existence of constant-depth circuit implementations of logical Clifford gates on folded surface codes. We achieve and improve this result by constructing two families of folded surface codes with transversal Clifford gates. This construction is presented generally for qudits of any dimension. Lastly, the specific application of these codes to universal quantum computation based on qubit fusion is also discussed.

  2. Coded Modulations for Mobile Satellite Communication Channels

    NASA Astrophysics Data System (ADS)

    Rhee, Dojun

    1995-01-01

    The mobile satellite (MSAT) channel is subject to multipath fading, shadowing, Doppler frequency shift, and adjacent channel interference (ACI). Therefore, transmitted signals face severe amplitude and phase distortions. This dissertation investigates various high performance and low decoding complexity coded modulation schemes for reliable voice and data transmissions over the shadowed mobile satellite channel and the Rayleigh fading channel. The dissertation consists of four parts. The first part presents a systematic technique for constructing MPSK trellis coded modulation (TCM) codes for voice transmission over the MSAT channel. The multilevel coding method is used for constructing TCM codes using convolutional codes with good free branch distances as the component codes or using both convolutional and block codes as the component codes. Simulation results show that these codes achieve good coding gains over the uncoded reference system and outperform existing TCM codes with the same decoding complexity. In the second part, using the multilevel coding method, multilevel block coded modulation (BCM) codes are constructed for voice transmission over the MSAT channel. Even though BCM is generally less power efficient than TCM for AWGN channels, BCM has a great potential to compete with TCM in the MSAT channel because of its shorter decoding depth and hence more effective interleaving. Binary Reed-Muller (RM) codes of length up to 32 are used as component codes. Simulation results show that these codes achieve good coding gains over the uncoded reference system and outperform TCM codes with the same decoding complexity. In the third part, a simple and systematic technique for constructing multilevel concatenated BCM schemes for data transmission over the shadowed MSAT channel and the Rayleigh fading channel is presented. These schemes are designed to achieve high-performance or large coding gain with reduced decoding complexity. Construction is based on a

  3. Accumulate-Repeat-Accumulate-Accumulate Codes

    NASA Technical Reports Server (NTRS)

    Divsalar, Dariush; Dolinar, Samuel; Thorpe, Jeremy

    2007-01-01

    Accumulate-repeat-accumulate-accumulate (ARAA) codes have been proposed, inspired by the recently proposed accumulate-repeat-accumulate (ARA) codes. These are error-correcting codes suitable for use in a variety of wireless data-communication systems that include noisy channels. ARAA codes can be regarded as serial turbolike codes or as a subclass of low-density parity-check (LDPC) codes, and, like ARA codes they have projected graph or protograph representations; these characteristics make it possible to design high-speed iterative decoders that utilize belief-propagation algorithms. The objective in proposing ARAA codes as a subclass of ARA codes was to enhance the error-floor performance of ARA codes while maintaining simple encoding structures and low maximum variable node degree.
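The "accumulate" stages that give these codes their name are running mod-2 sums (a rate-1 1/(1+D) convolutional code). A minimal sketch of a plain repeat-accumulate encoder, the simpler ancestor of the ARA/ARAA family; the repetition factor and interleaver here are arbitrary choices for illustration, not the ARAA protograph itself:

```python
import random

def accumulate(bits):
    """Rate-1 accumulator: output[i] = bits[0] ^ bits[1] ^ ... ^ bits[i]."""
    out, acc = [], 0
    for b in bits:
        acc ^= b
        out.append(acc)
    return out

def repeat_accumulate(info_bits, q=3, seed=0):
    """Repeat each bit q times, interleave, then accumulate."""
    repeated = [b for b in info_bits for _ in range(q)]
    perm = list(range(len(repeated)))
    random.Random(seed).shuffle(perm)          # fixed pseudorandom interleaver
    interleaved = [repeated[p] for p in perm]
    return accumulate(interleaved)
```

ARA and ARAA codes chain additional accumulators (and a precoder) around this core, which is what improves the error floor while keeping the encoder this simple.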

  4. Transversal Clifford gates on folded surface codes

    NASA Astrophysics Data System (ADS)

    Moussa, Jonathan E.

    2016-10-01

    Surface and color codes are two forms of topological quantum error correction in two spatial dimensions with complementary properties. Surface codes have lower-depth error detection circuits and well-developed decoders to interpret and correct errors, while color codes have transversal Clifford gates and better code efficiency in the number of physical qubits needed to achieve a given code distance. A formal equivalence exists between color codes and folded surface codes, but it does not guarantee the transferability of any of these favorable properties. However, the equivalence does imply the existence of constant-depth circuit implementations of logical Clifford gates on folded surface codes. We achieve and improve this result by constructing two families of folded surface codes with transversal Clifford gates. This construction is presented generally for qudits of any dimension. The specific application of these codes to universal quantum computation based on qubit fusion is also discussed.

  5. Letting the Data Lead the Code

    DTIC Science & Technology

    2015-08-01

    UNCLASSIFIED Page 1 of 4 UNCLASSIFIED: Distribution Statement A. Approved for public release. Letting the Data Lead the Code Darryl Bryk...used can radically simplify the code . A simple change to the way data is stored in a file can make a big difference in the code required to read it...This was made apparent recently with some Visual Basic code which would locate a list of numbers based on certain search criteria. The code was about

  6. Colour cyclic code for Brillouin distributed sensors

    NASA Astrophysics Data System (ADS)

    Le Floch, Sébastien; Sauser, Florian; Llera, Miguel; Rochat, Etienne

    2015-09-01

    For the first time, a colour cyclic coding (CCC) is theoretically and experimentally demonstrated for Brillouin optical time-domain analysis (BOTDA) distributed sensors. Compared to traditional intensity-modulated cyclic codes, the code presents an additional gain of √2 while keeping the same number of sequences as for a colour coding. A comparison with a standard BOTDA sensor is realized and validates the theoretical coding gain.

  7. The chromatin regulatory code: Beyond a histone code

    NASA Astrophysics Data System (ADS)

    Lesne, A.

    2006-03-01

    In this commentary on the contribution by Arndt Benecke in this issue, I discuss why the notion of “chromatin code” introduced and elaborated in this paper is to be preferred to that of “histone code”. Speaking of a code as regards nucleosome conformation and histone tail post-translational modifications only makes sense within the chromatin fiber, where their physico-chemical features can be translated into regulatory programs at the genome level, by means of a complex, multi-level interplay with the fiber architecture and dynamics settled in the course of Evolution. In particular, this chromatin code presumably exploits allosteric transitions of the chromatin fiber. The chromatin structure dependence of its translation suggests two alternative modes of transcription initiation regulation, also proposed in the paper by A. Benecke in this issue for interpreting strikingly bimodal micro-array data.

  8. The neuronal code(s) of the cerebellum.

    PubMed

    Heck, Detlef H; De Zeeuw, Chris I; Jaeger, Dieter; Khodakhah, Kamran; Person, Abigail L

    2013-11-06

    Understanding how neurons encode information in sequences of action potentials is of fundamental importance to neuroscience. The cerebellum is widely recognized for its involvement in the coordination of movements, which requires muscle activation patterns to be controlled with millisecond precision. Understanding how cerebellar neurons accomplish such high temporal precision is critical to understanding cerebellar function. Inhibitory Purkinje cells, the only output neurons of the cerebellar cortex, and their postsynaptic target neurons in the cerebellar nuclei, fire action potentials at high, sustained frequencies, suggesting spike rate modulation as a possible code. Yet, millisecond precise spatiotemporal spike activity patterns in Purkinje cells and inferior olivary neurons have also been observed. These results and ongoing studies suggest that the neuronal code used by cerebellar neurons may span a wide time scale from millisecond precision to slow rate modulations, likely depending on the behavioral context.

  9. Superimposed Code Theorectic Analysis of DNA Codes and DNA Computing

    DTIC Science & Technology

    2010-03-01

    that the hybridization that occurs between a DNA strand and its Watson-Crick complement can be used to perform mathematical computation. This research... Watson-Crick (WC) duplex, e.g., TCGCA TCGCA. Note that non-WC duplexes can form and such a formation is called a cross-hybridization. Cross...5’GAAAGTCGCGTA3’ Watson-Crick (WC) duplexes: TACGCGACTTTC. Cross-hybridized (CH) duplexes: ATTTTTGCGTTA, GAAAAAGAAGAA. Coding Strands for Ligation
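A perfect Watson-Crick duplex pairs a strand with the reverse complement of its partner, which is straightforward to check in code (a minimal sketch, using the example strands from the snippet above):

```python
COMPLEMENT = {"A": "T", "T": "A", "C": "G", "G": "C"}

def reverse_complement(strand):
    """Return the Watson-Crick complement of a strand, read 5' to 3'."""
    return "".join(COMPLEMENT[base] for base in reversed(strand))

def is_wc_duplex(strand_a, strand_b):
    """Two strands form a perfect WC duplex iff each is the other's reverse complement."""
    return strand_b == reverse_complement(strand_a)
```

Cross-hybridization, in these terms, is a stable pairing between strands that fail this exact-match test; DNA code design tries to pick strand sets where no such near-matches exist.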

  10. Genetic coding and gene expression - new Quadruplet genetic coding model

    NASA Astrophysics Data System (ADS)

    Shankar Singh, Rama

    2012-07-01

    Successful demonstration of the human genome project has opened the door not only for developing personalized medicine and cures for genetic diseases, but it may also answer the complex and difficult question of the origin of life. It may lead to making the 21st century a century of biological sciences as well. Based on the central dogma of biology, genetic codons in conjunction with tRNA play a key role in translating the RNA bases into a sequence of amino acids, leading to a synthesized protein. This is the most critical step in synthesizing the right protein needed for personalized medicine and curing genetic diseases. So far, only triplet codons, involving three bases of RNA transcribed from DNA bases, have been used. Since this approach has several inconsistencies and limitations, even the promise of personalized medicine has not been realized. The new quadruplet genetic coding model proposed and developed here involves all four RNA bases, which in conjunction with tRNA will synthesize the right protein. The transcription and translation process used will be the same, but the quadruplet codons will help overcome most of the inconsistencies and limitations of the triplet codes. Details of this new quadruplet genetic coding model and its subsequent potential applications, including relevance to the origin of life, will be presented.

  11. Code-excited linear predictive coding of multispectral MR images

    NASA Astrophysics Data System (ADS)

    Hu, Jian-Hong; Wang, Yao; Cahill, Patrick

    1996-02-01

    This paper reports a multispectral code-excited linear predictive coding method for the compression of well-registered multispectral MR images. Different linear prediction models and adaptation schemes have been compared. The method that uses a forward adaptive autoregressive (AR) model has proven to achieve a good compromise between performance, complexity, and robustness. This approach is referred to as the MFCELP method. Given a set of multispectral images, the linear predictive coefficients are updated over non-overlapping square macroblocks. Each macroblock is further divided into several microblocks, and the best excitation signals for each microblock are determined through an analysis-by-synthesis procedure. To satisfy the high quality requirement for medical images, the error between the original images and the synthesized ones is further encoded using a vector quantizer. The MFCELP method has been applied to 26 sets of clinical MR neuro images (20 slices/set, 3 spectral bands/slice, 256 by 256 pixels/image, 12 bits/pixel). It provides a significant improvement over the discrete cosine transform (DCT) based JPEG method, a wavelet transform based embedded zero-tree wavelet (EZW) coding method, and the MSARMA method we developed before.
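The predict-quantize-reconstruct loop underlying any such linear predictive coder can be shown in one dimension. This is a generic AR(1) sketch to illustrate the idea, not the MFCELP method; the coefficient and step size are arbitrary:

```python
def ar_predict_encode(samples, coeff=0.9, step=4):
    """1-D sketch of predictive coding: predict, quantize the residual, reconstruct.

    The encoder tracks the *decoder's* reconstruction (prev), so encoder and
    decoder stay in sync despite quantization.
    """
    recon, codes = [], []
    prev = 0.0
    for x in samples:
        pred = coeff * prev               # forward AR(1) prediction
        q = round((x - pred) / step)      # quantized prediction residual
        codes.append(q)                   # this is what gets transmitted
        prev = pred + q * step            # decoder-side reconstruction
        recon.append(prev)
    return codes, recon
```

Analysis-by-synthesis, as used in CELP, goes one step further: instead of quantizing the residual directly, the encoder tries candidate excitation signals, synthesizes the result each time, and keeps the candidate whose synthesis is closest to the original.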

  12. Amino acid codes in mitochondria as possible clues to primitive codes

    NASA Technical Reports Server (NTRS)

    Jukes, T. H.

    1981-01-01

    Differences between mitochondrial codes and the universal code indicate that an evolutionary simplification has taken place, rather than a return to a more primitive code. However, these differences make it evident that the universal code is not the only code possible, and therefore earlier codes may have differed markedly from the present code. The present universal code is probably a 'frozen accident.' The change in CUN codons from leucine to threonine (Neurospora vs. yeast mitochondria) indicates that neutral or near-neutral changes occurred in the corresponding proteins when this code change took place, caused presumably by a mutation in a tRNA gene.
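The CUN reassignment described above (leucine in the universal code, threonine in yeast mitochondria) amounts to a table override, which a short sketch makes concrete; only the CUN codon family is shown, with everything else assumed standard:

```python
# Universal (standard) assignments for the CUN codon family (N = any base).
STANDARD_CUN = {"CUU": "Leu", "CUC": "Leu", "CUA": "Leu", "CUG": "Leu"}

# The yeast mitochondrial code reassigns the whole CUN family to threonine.
YEAST_MITO_CUN = {codon: "Thr" for codon in STANDARD_CUN}

def translate_codon(codon, table):
    """Look up a codon in the given (partial) code table."""
    return table.get(codon, "unknown")
```

The same protein-coding sequence therefore reads out differently under the two tables, which is exactly why such a reassignment can only fix when the affected substitutions are neutral or nearly so.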

  13. The Mystery Behind the Code: Differentiated Instruction with Quick Response Codes in Secondary Physical Education

    ERIC Educational Resources Information Center

    Adkins, Megan; Wajciechowski, Misti R.; Scantling, Ed

    2013-01-01

    Quick response codes, better known as QR codes, are small barcodes scanned to receive information about a specific topic. This article explains QR code technology and the utility of QR codes in the delivery of physical education instruction. Consideration is given to how QR codes can be used to accommodate learners of varying ability levels as…

  14. Biological Information Transfer Beyond the Genetic Code: The Sugar Code

    NASA Astrophysics Data System (ADS)

    Gabius, H.-J.

    In the era of genetic engineering, cloning, and genome sequencing the focus of research on the genetic code has received an even further accentuation in the public eye. In attempting, however, to understand intra- and intercellular recognition processes comprehensively, the two biochemical dimensions established by nucleic acids and proteins are not sufficient to satisfactorily explain all molecular events in, for example, cell adhesion or routing. The consideration of further code systems is essential to bridge this gap. A third biochemical alphabet forming code words with an information storage capacity second to no other substance class in rather small units (words, sentences) is established by monosaccharides (letters). As hardware oligosaccharides surpass peptides by more than seven orders of magnitude in the theoretical ability to build isomers, when the total of conceivable hexamers is calculated. In addition to the sequence complexity, the use of magnetic resonance spectroscopy and molecular modeling has been instrumental in discovering that even small glycans can often reside in not only one but several distinct low-energy conformations (keys). Intriguingly, conformers can display notably different capacities to fit snugly into the binding site of nonhomologous receptors (locks). This process, experimentally verified for two classes of lectins, is termed "differential conformer selection." It adds potential for shifts of the conformer equilibrium to modulate ligand properties dynamically and reversibly to the well-known changes in sequence (including anomeric positioning and linkage points) and in pattern of substitution, for example, by sulfation. In the intimate interplay with sugar receptors (lectins, enzymes, and antibodies) the message of coding units of the sugar code is deciphered. Their recognition will trigger postbinding signaling and the intended biological response. Knowledge about the driving forces for the molecular rendezvous, i

  15. Maximal dinucleotide and trinucleotide circular codes.

    PubMed

    Michel, Christian J; Pellegrini, Marco; Pirillo, Giuseppe

    2016-01-21

    We determine here the number and the list of maximal dinucleotide and trinucleotide circular codes. We prove that there is no maximal dinucleotide circular code having strictly less than 6 elements (maximum size of dinucleotide circular codes). On the other hand, a computer calculus shows that there are maximal trinucleotide circular codes with less than 20 elements (maximum size of trinucleotide circular codes). More precisely, there are maximal trinucleotide circular codes with 14, 15, 16, 17, 18 and 19 elements and no maximal trinucleotide circular code having less than 14 elements. We give the same information for the maximal self-complementary dinucleotide and trinucleotide circular codes. The amino acid distribution of maximal trinucleotide circular codes is also determined.
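One characterization from the circular-code literature (a graph-theoretic criterion due to Fimmel, Michel, and Strüngmann) states that a trinucleotide code is circular if and only if an associated directed graph is acyclic: each codon N1N2N3 contributes the edges N1 → N2N3 and N1N2 → N3. A sketch of that check, under the assumption that this criterion applies as stated:

```python
def is_circular(code):
    """Check circularity of a trinucleotide code via acyclicity of its graph."""
    edges = {}
    for codon in code:
        edges.setdefault(codon[0], set()).add(codon[1:])   # N1 -> N2N3
        edges.setdefault(codon[:2], set()).add(codon[2])   # N1N2 -> N3

    WHITE, GRAY, BLACK = 0, 1, 2   # unvisited / on stack / done
    color = {}

    def has_cycle(v):
        color[v] = GRAY
        for w in edges.get(v, ()):
            if color.get(w, WHITE) == GRAY:
                return True
            if color.get(w, WHITE) == WHITE and has_cycle(w):
                return True
        color[v] = BLACK
        return False

    return not any(color.get(v, WHITE) == WHITE and has_cycle(v)
                   for v in list(edges))
```

A periodic codon such as AAA immediately creates the cycle A → AA → A, and two circular shifts of the same codon (ACG, CGA) likewise close a cycle, so neither can belong to a circular code.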

  16. Quantum codes with low weight stabilizers

    NASA Astrophysics Data System (ADS)

    Kovalev, Alexey A.; Dumer, Ilya; Pryadko, Leonid P.

    2012-02-01

    We study quantum cyclic stabilizer codes whose stabilizer can be always defined by one or two stabilizer generators. Our main goal is to construct low-weight stabilizer generators that can yield quantum codes with high code rate and simple error correction. To do so, we apply the classical quaternary representation of stabilizer codes and extend our recent study of one-generator cyclic codes [1]. For any stabilizer generator of weight four or five, we formulate a necessary and sufficient condition for its commutativity. We then proceed with a design of additive cyclic codes with such generators. In some cases, we also extend our commutativity condition and code design to generators of weight six. In particular, quantum cyclic codes with stabilizers of weight four are mapped to the generalized toric codes. Here we also extend the notion of toric codes using a translationally invariant generator and periodic boundary conditions on a two dimensional lattice. Some of our numerically constructed codes can be redefined by means of Code Word Stabilized (CWS) representation [1] as quantum versions of repetition codes. We particularly concentrate on codes with a fixed nonzero rate for which the minimum distance asymptotically grows as the blocklength grows. [1] arXiv:1108.5490v1

  17. Dual-code quantum computation model

    NASA Astrophysics Data System (ADS)

    Choi, Byung-Soo

    2015-08-01

    In this work, we propose the dual-code quantum computation model—a fault-tolerant quantum computation scheme which alternates between two different quantum error-correction codes. Since the chosen two codes have different sets of transversal gates, we can implement a universal set of gates transversally, thereby reducing the overall cost. We use code teleportation to convert between quantum states in different codes. The overall cost is decreased if code teleportation requires fewer resources than the fault-tolerant implementation of the non-transversal gate in a specific code. To analyze the cost reduction, we investigate two cases with different base codes, namely the Steane and Bacon-Shor codes. For the Steane code, neither the proposed dual-code model nor another variation of it achieves any cost reduction since the conventional approach is simple. For the Bacon-Shor code, the three proposed variations of the dual-code model reduce the overall cost. However, as the encoding level increases, the cost reduction decreases and becomes negative. Therefore, the proposed dual-code model is advantageous only when the encoding level is low and the cost of the non-transversal gate is relatively high.

  18. The ATLAS PanDA Monitoring System and its Evolution

    NASA Astrophysics Data System (ADS)

    Klimentov, A.; Nevski, P.; Potekhin, M.; Wenaus, T.

    2011-12-01

    The PanDA (Production and Distributed Analysis) Workload Management System is used for ATLAS distributed production and analysis worldwide. The needs of ATLAS global computing imposed challenging requirements on the design of PanDA in areas such as scalability, robustness, automation, diagnostics, and usability for both production shifters and analysis users. Through a system-wide job database, the PanDA monitor provides a comprehensive and coherent view of the system and job execution, from high level summaries to detailed drill-down job diagnostics. It is (like the rest of PanDA) an Apache-based Python application backed by Oracle. The presentation layer is HTML code generated on the fly in the Python application which is also responsible for managing database queries. However, this approach is lacking in user interface flexibility, simplicity of communication with external systems, and ease of maintenance. A decision was therefore made to migrate the PanDA monitor server to Django Web Application Framework and apply JSON/AJAX technology in the browser front end. This allows us to greatly reduce the amount of application code, separate data preparation from presentation, leverage open source for tools such as authentication and authorization mechanisms, and provide a richer and more dynamic user experience. We describe our approach, design and initial experience with the migration process.

  19. The EGS5 Code System

    SciTech Connect

    Hirayama, Hideo; Namito, Yoshihito; Bielajew, Alex F.; Wilderman, Scott J.; U., Michigan; Nelson, Walter R.; /SLAC

    2005-12-20

    In the nineteen years since EGS4 was released, it has been used in a wide variety of applications, particularly in medical physics, radiation measurement studies, and industrial development. Every new user and every new application bring new challenges for Monte Carlo code designers, and code refinements and bug fixes eventually result in a code that becomes difficult to maintain. Several of the code modifications represented significant advances in electron and photon transport physics, and required a more substantial invocation than code patching. Moreover, the arcane MORTRAN3[48] computer language of EGS4 was highest on the complaint list of the users of EGS4. The size of the EGS4 user base is difficult to measure, as there never existed a formal user registration process. However, some idea of the numbers may be gleaned from the number of EGS4 manuals that were produced and distributed at SLAC: almost three thousand. Consequently, the EGS5 project was undertaken. It was decided to employ the FORTRAN 77 compiler, yet include, as much as possible, the structural beauty and power of MORTRAN3. This report consists of four chapters and several appendices. Chapter 1 is an introduction to EGS5 and to this report in general. We suggest that you read it. Chapter 2 is a major update of similar chapters in the old EGS4 report[126] (SLAC-265) and the old EGS3 report[61] (SLAC-210), in which all the details of the old physics (i.e., models which were carried over from EGS4) and the new physics are gathered together. The descriptions of the new physics are extensive, and not for the faint of heart. Detailed knowledge of the contents of Chapter 2 is not essential in order to use EGS, but sophisticated users should be aware of its contents. In particular, details of the restrictions on the range of applicability of EGS are dispersed throughout the chapter. First-time users of EGS should skip Chapter 2 and come back to it later if necessary.
With the release of the EGS4 version

  20. Dopamine reward prediction error coding.

    PubMed

    Schultz, Wolfram

    2016-03-01

    Reward prediction errors consist of the differences between received and predicted rewards. They are crucial for basic forms of learning about rewards and make us strive for more rewards-an evolutionary beneficial trait. Most dopamine neurons in the midbrain of humans, monkeys, and rodents signal a reward prediction error; they are activated by more reward than predicted (positive prediction error), remain at baseline activity for fully predicted rewards, and show depressed activity with less reward than predicted (negative prediction error). The dopamine signal increases nonlinearly with reward value and codes formal economic utility. Drugs of addiction generate, hijack, and amplify the dopamine reward signal and induce exaggerated, uncontrolled dopamine effects on neuronal plasticity. The striatum, amygdala, and frontal cortex also show reward prediction error coding, but only in subpopulations of neurons. Thus, the important concept of reward prediction errors is implemented in neuronal hardware.
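The prediction-error scheme described maps directly onto a Rescorla-Wagner / temporal-difference style update, in which the dopamine-like signal is δ = received reward − predicted reward. A minimal sketch (the learning rate is an arbitrary illustration value):

```python
def update_value(value, reward, learning_rate=0.1):
    """One prediction-error learning step: delta = reward - prediction."""
    delta = reward - value           # positive: better than predicted
    return value + learning_rate * delta, delta

# Repeated pairings drive the prediction toward the delivered reward,
# and the prediction error decays toward zero as the reward becomes
# fully predicted -- mirroring the baseline response described above.
value = 0.0
for _ in range(100):
    value, delta = update_value(value, reward=1.0)
```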

  1. Axisymmetric generalized harmonic evolution code

    SciTech Connect

    Sorkin, Evgeny

    2010-04-15

    We describe the first axisymmetric numerical code based on the generalized harmonic formulation of the Einstein equations, which is regular at the axis. We test the code by investigating gravitational collapse of distributions of complex scalar field in a Kaluza-Klein spacetime. One of the key issues of the harmonic formulation is the choice of the gauge source functions, and we conclude that a damped-wave gauge is remarkably robust in this case. Our preliminary study indicates that evolution of regular initial data leads to formation both of black holes with spherical and cylindrical horizon topologies. Intriguingly, we find evidence that near threshold for black hole formation the number of outcomes proliferates. Specifically, the collapsing matter splits into individual pulses, two of which travel in the opposite directions along the compact dimension and one which is ejected radially from the axis. Depending on the initial conditions, a curvature singularity develops inside the pulses.

  2. Multidimensional Fuel Performance Code: BISON

    SciTech Connect

    2014-09-03

    BISON is a finite element based nuclear fuel performance code applicable to a variety of fuel forms including light water reactor fuel rods, TRISO fuel particles, and metallic rod and plate fuel (Refs. [a, b, c]). It solves the fully-coupled equations of thermomechanics and species diffusion and includes important fuel physics such as fission gas release and material property degradation with burnup. BISON is based on the MOOSE framework (Ref. [d]) and can therefore efficiently solve problems on 1-, 2- or 3-D meshes using standard workstations or large high performance computers. BISON is also coupled to a MOOSE-based mesoscale phase field material property simulation capability (Refs. [e, f]). As described here, BISON includes the code library named FOX, which was developed concurrent with BISON. FOX contains material and behavioral models that are specific to oxide fuels.

  3. CBP PHASE I CODE INTEGRATION

    SciTech Connect

    Smith, F.; Brown, K.; Flach, G.; Sarkar, S.

    2011-09-30

    The goal of the Cementitious Barriers Partnership (CBP) is to develop a reasonable and credible set of software tools to predict the structural, hydraulic, and chemical performance of cement barriers used in nuclear applications over extended time frames (greater than 100 years for operating facilities and greater than 1000 years for waste management). The simulation tools will be used to evaluate and predict the behavior of cementitious barriers used in near surface engineered waste disposal systems including waste forms, containment structures, entombments, and environmental remediation. These cementitious materials are exposed to dynamic environmental conditions that cause changes in material properties via (i) aging, (ii) chloride attack, (iii) sulfate attack, (iv) carbonation, (v) oxidation, and (vi) primary constituent leaching. A set of state-of-the-art software tools has been selected as a starting point to capture these important aging and degradation phenomena. Integration of existing software developed by the CBP partner organizations was determined to be the quickest method of meeting the CBP goal of providing a computational tool that improves the prediction of the long-term behavior of cementitious materials. These partner codes were selected based on their maturity and ability to address the problems outlined above. The GoldSim Monte Carlo simulation program (GTG 2010a, GTG 2010b) was chosen as the code integration platform (Brown & Flach 2009b). GoldSim (current Version 10.5) is a Windows based graphical object-oriented computer program that provides a flexible environment for model development (Brown & Flach 2009b). The linking of GoldSim to external codes has previously been successfully demonstrated (Eary 2007, Mattie et al. 2007). GoldSim is capable of performing deterministic and probabilistic simulations and of modeling radioactive decay and constituent transport. As part of the CBP project, a general Dynamic Link Library (DLL) interface was

  4. GeoPhysical Analysis Code

    SciTech Connect

    2011-05-21

    GPAC is a code that integrates open source libraries for element formulations, linear algebra, and I/O with two main LLNL-written components: (i) a set of standard finite element physics solvers for resolving Darcy fluid flow, explicit mechanics, implicit mechanics, and fluid-mediated fracturing, including resolution of contact both implicitly and explicitly, and (ii) an MPI-based parallelization implementation for use on generic HPC distributed memory architectures. The resultant code can be used alone for linearly elastic problems and problems involving hydraulic fracturing, where the mesh topology is dynamically changed. The key application domain is low-rate stimulation and fracture control in subsurface reservoirs (e.g., enhanced geothermal sites and unconventional shale gas stimulation). GPAC also has interfaces to call external libraries for, e.g., material models and equations of state; however, LLNL-developed EOS and material models will not be part of the current release.

  5. Dopamine reward prediction error coding

    PubMed Central

    Schultz, Wolfram

    2016-01-01

    Reward prediction errors consist of the differences between received and predicted rewards. They are crucial for basic forms of learning about rewards and make us strive for more rewards—an evolutionary beneficial trait. Most dopamine neurons in the midbrain of humans, monkeys, and rodents signal a reward prediction error; they are activated by more reward than predicted (positive prediction error), remain at baseline activity for fully predicted rewards, and show depressed activity with less reward than predicted (negative prediction error). The dopamine signal increases nonlinearly with reward value and codes formal economic utility. Drugs of addiction generate, hijack, and amplify the dopamine reward signal and induce exaggerated, uncontrolled dopamine effects on neuronal plasticity. The striatum, amygdala, and frontal cortex also show reward prediction error coding, but only in subpopulations of neurons. Thus, the important concept of reward prediction errors is implemented in neuronal hardware. PMID:27069377
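
    The error signal described above can be illustrated with a minimal sketch of prediction-error learning (a Rescorla-Wagner-style update); the learning rate and reward values below are arbitrary illustrative choices, not parameters from the paper.

```python
# Sketch of prediction-error learning: the prediction V moves toward
# received rewards, and the error (received minus predicted) decays to
# zero once the reward is fully predicted.

def learn_value(rewards, alpha=0.2):
    """Update a reward prediction V from a sequence of received rewards."""
    V = 0.0
    errors = []
    for r in rewards:
        delta = r - V          # prediction error: received minus predicted
        errors.append(delta)
        V += alpha * delta     # prediction moves toward the received reward
    return V, errors

# A reward of 1.0 delivered repeatedly: the error is large at first
# (positive prediction error) and decays toward zero as the reward
# becomes fully predicted.
V, errors = learn_value([1.0] * 50)
```

    An omitted reward after learning would produce a negative error (0.0 minus a prediction near 1.0), mirroring the depressed dopamine activity described above.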

  6. Microgravity computing codes. User's guide

    NASA Astrophysics Data System (ADS)

    1982-01-01

    Codes used in microgravity experiments to compute fluid parameters and to obtain data graphically are introduced. The computer programs are stored on two diskettes, compatible with the floppy disk drives of the Apple 2. Two versions of both disks are available (DOS-2 and DOS-3). The codes are written in BASIC and are structured as interactive programs. Interaction takes place through the keyboard of any Apple 2-48K standard system with single floppy disk drive. The programs are protected against wrong commands given by the operator. The programs are described step by step in the same order as the instructions displayed on the monitor. Most of these instructions are shown, with samples of computation and of graphics.

  7. Computer access security code system

    NASA Technical Reports Server (NTRS)

    Collins, Earl R., Jr. (Inventor)

    1990-01-01

    A security code system for controlling access to computer and computer-controlled entry situations comprises a plurality of subsets of alpha-numeric characters disposed in random order in matrices of at least two dimensions forming theoretical rectangles, cubes, etc., such that when access is desired, at least one pair of previously unused character subsets not found in the same row or column of the matrix is chosen at random and transmitted by the computer. The proper response to gain access is transmittal of subsets which complete the rectangle, and/or a parallelepiped whose opposite corners were defined by first groups of code. Once used, subsets are not used again to absolutely defeat unauthorized access by eavesdropping, and the like.
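
    The two-dimensional case of the scheme above can be sketched as follows; the matrix size, subset length, and challenge selection are illustrative choices, not values from the patent.

```python
import random
import string

# Toy sketch of the rectangle challenge-response scheme: the challenge is
# two character subsets at opposite corners of a rectangle in the matrix,
# and the valid response is the subsets at the other two corners.

def make_matrix(rows=4, cols=4, subset_len=3, seed=0):
    rng = random.Random(seed)
    chars = string.ascii_uppercase + string.digits
    return [[''.join(rng.choice(chars) for _ in range(subset_len))
             for _ in range(cols)] for _ in range(rows)]

def challenge(matrix, rng):
    rows, cols = len(matrix), len(matrix[0])
    r1, r2 = rng.sample(range(rows), 2)   # distinct rows ...
    c1, c2 = rng.sample(range(cols), 2)   # ... and distinct columns
    return (r1, c1), (r2, c2)             # two opposite corners of a rectangle

def response(matrix, corner_a, corner_b):
    (r1, c1), (r2, c2) = corner_a, corner_b
    # The other two corners complete the rectangle.
    return matrix[r1][c2], matrix[r2][c1]

m = make_matrix()
rng = random.Random(42)
a, b = challenge(m, rng)
expected = response(m, a, b)
```

    A real implementation would also track used subsets so that no pair is ever reissued, which is the property that defeats eavesdropping.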

  8. Impacts of Model Building Energy Codes

    SciTech Connect

    Athalye, Rahul A.; Sivaraman, Deepak; Elliott, Douglas B.; Liu, Bing; Bartlett, Rosemarie

    2016-10-31

    The U.S. Department of Energy (DOE) Building Energy Codes Program (BECP) periodically evaluates national and state-level impacts associated with energy codes in residential and commercial buildings. Pacific Northwest National Laboratory (PNNL), funded by DOE, conducted an assessment of the prospective impacts of national model building energy codes from 2010 through 2040. A previous PNNL study evaluated the impact of the Building Energy Codes Program; this study looked more broadly at overall code impacts. This report describes the methodology used for the assessment and presents the impacts in terms of energy savings, consumer cost savings, and reduced CO2 emissions at the state level and at aggregated levels. This analysis does not represent all potential savings from energy codes in the U.S. because it excludes several states which have codes which are fundamentally different from the national model energy codes or which do not have state-wide codes. Energy codes follow a three-phase cycle that starts with the development of a new model code, proceeds with the adoption of the new code by states and local jurisdictions, and finishes when buildings comply with the code. The development of new model code editions creates the potential for increased energy savings. After a new model code is adopted, potential savings are realized in the field when new buildings (or additions and alterations) are constructed to comply with the new code. Delayed adoption of a model code and incomplete compliance with the code’s requirements erode potential savings. The contributions of all three phases are crucial to the overall impact of codes, and are considered in this assessment.

  9. SLINGSHOT - a Coilgun Design Code

    SciTech Connect

    MARDER, BARRY M.

    2001-09-01

    The Sandia coilgun [1,2,3,4,5] is an inductive electromagnetic launcher. It consists of a sequence of powered, multi-turn coils surrounding a flyway of circular cross-section through which a conducting armature passes. When the armature is properly positioned with respect to a coil, a charged capacitor is switched into the coil circuit. The rising coil currents induce a current in the armature, producing a repulsive accelerating force. The basic numerical tool for modeling the coilgun is the SLINGSHOT code, an expanded, user-friendly successor to WARP-10 [6]. SLINGSHOT computes the currents in the coils and armature, finds the forces produced by those currents, and moves the armature through the array of coils. In this approach, the cylindrically symmetric coils and armature are subdivided into concentric hoops with rectangular cross-section, in each of which the current is assumed to be uniform. The ensemble of hoops are treated as coupled circuits. The specific heats and resistivities of the hoops are found as functions of temperature and used to determine the resistive heating. The code calculates the resistances and inductances for all hoops, and the mutual inductances for all hoop pairs. Using these, it computes the hoop currents from their circuit equations, finds the forces from the products of these currents and the mutual inductance gradient, and moves the armature. Treating the problem as a set of coupled circuits is a fast and accurate approach compared to solving the field equations. Its use, however, is restricted to problems in which the symmetry dictates the current paths. This paper is divided into three parts. The first presents a demonstration of the code. The second describes the input and output. The third part describes the physical models and numerical methods used in the code. It is assumed that the reader is familiar with coilguns.
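
    The coupled-circuit approach above can be sketched with a toy two-circuit model: one capacitor-driven drive coil and a single armature hoop, with the force taken from the product of the currents and the mutual inductance gradient. All parameter values and the exponential M(x) model are illustrative assumptions, not SLINGSHOT's actual hoop decomposition.

```python
from math import exp

# Toy coilgun step: solve the two coupled circuit equations for the
# current derivatives, then advance currents, capacitor voltage, and
# armature motion with explicit Euler.

L1, L2 = 10e-6, 2e-6      # self-inductances (H)
R1, R2 = 10e-3, 5e-3      # resistances (ohm)
C, Vc = 1e-3, 500.0       # capacitor (F) and initial voltage (V)
M0, a_len = 3e-6, 0.02    # mutual inductance scale (H) and decay length (m)
mass = 0.05               # armature mass (kg)

def M(x):    return M0 * exp(-x / a_len)
def dMdx(x): return -M0 / a_len * exp(-x / a_len)

I1 = I2 = 0.0
x, v = 0.005, 0.0
dt = 1e-7
for _ in range(1000):                     # 0.1 ms of the current rise
    m12 = M(x)
    # L1*I1' + M*I2' = Vc - R1*I1 - dM/dx * v * I2   (drive coil)
    # M*I1'  + L2*I2' = -R2*I2 - dM/dx * v * I1      (armature hoop)
    b1 = Vc - R1 * I1 - dMdx(x) * v * I2
    b2 = -R2 * I2 - dMdx(x) * v * I1
    det = L1 * L2 - m12 * m12             # positive because M < sqrt(L1*L2)
    dI1 = (L2 * b1 - m12 * b2) / det
    dI2 = (L1 * b2 - m12 * b1) / det
    F = I1 * I2 * dMdx(x)                 # force from currents and dM/dx
    I1 += dt * dI1
    I2 += dt * dI2
    Vc -= dt * I1 / C
    v += dt * F / mass
    x += dt * v
```

    During the current rise the induced armature current opposes the coil current, so the force I1*I2*dM/dx is repulsive and the armature accelerates away from the coil, as in the description above.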

  10. HADES, A Radiographic Simulation Code

    SciTech Connect

    Aufderheide, M.B.; Slone, D.M.; Schach von Wittenau, A.E.

    2000-08-18

    We describe features of the HADES radiographic simulation code. We begin with a discussion of why it is useful to simulate transmission radiography. The capabilities of HADES are described, followed by an application of HADES to a dynamic experiment recently performed at the Los Alamos Neutron Science Center. We describe quantitative comparisons between experimental data and HADES simulations using a copper step wedge. We conclude with a short discussion of future work planned for HADES.

  11. Future trends in image coding

    NASA Astrophysics Data System (ADS)

    Habibi, Ali

    1993-01-01

    The objective of this article is to present a discussion on the future of image data compression in the next two decades. It is virtually impossible to predict with any degree of certainty the breakthroughs in theory and developments, the milestones in the advancement of technology, and the success of upcoming commercial products in the marketplace, which will be the main factors in setting the stage for the future of image coding. What we propose to do, instead, is look back at the progress in image coding during the last two decades and assess the state of the art in image coding today. Then, by observing the trends in developments of theory, software, and hardware, coupled with the future needs for use and dissemination of imagery data and the constraints on the bandwidth and capacity of various networks, we predict the future state of image coding. What seems certain today is the growing need for bandwidth compression. Television uses a technology which is half a century old and is ready to be replaced by high definition television with an extremely high digital bandwidth. Smart telephones coupled with personal computers and TV monitors accommodating both printed and video data will be common in homes and businesses within the next decade. Efficient and compact digital processing modules using developing technologies will make bandwidth-compressed imagery the cheap and preferred alternative in satellite and on-board applications. In view of the above needs, we expect increased activities in the development of theory, software, special purpose chips, and hardware for image bandwidth compression in the next two decades. The following sections summarize the future trends in these areas.

  12. Predictive coding of multisensory timing

    PubMed Central

    Shi, Zhuanghua; Burr, David

    2016-01-01

    The sense of time is foundational for perception and action, yet it frequently departs significantly from physical time. In the paper we review recent progress on temporal contextual effects, multisensory temporal integration, temporal recalibration, and related computational models. We suggest that subjective time arises from minimizing prediction errors and adaptive recalibration, which can be unified in the framework of predictive coding, a framework rooted in Helmholtz’s ‘perception as inference’. PMID:27695705

  13. Computer Code for Nanostructure Simulation

    NASA Technical Reports Server (NTRS)

    Filikhin, Igor; Vlahovic, Branislav

    2009-01-01

    Due to their small size, nanostructures can have stress and thermal gradients that are larger than any macroscopic analogue. These gradients can lead to specific regions that are susceptible to failure via processes such as plastic deformation by dislocation emission, chemical debonding, and interfacial alloying. A program has been developed that rigorously simulates and predicts optoelectronic properties of nanostructures of virtually any geometrical complexity and material composition. It can be used in simulations of energy level structure, wave functions, density of states of spatially configured phonon-coupled electrons, excitons in quantum dots, quantum rings, quantum ring complexes, and more. The code can be used to calculate stress distributions and thermal transport properties for a variety of nanostructures and interfaces, transport and scattering at nanoscale interfaces and surfaces under various stress states, and alloy compositional gradients. The code allows users to perform modeling of charge transport processes through quantum-dot (QD) arrays as functions of inter-dot distance, array order versus disorder, QD orientation, shape, size, and chemical composition for applications in photovoltaics and physical properties of QD-based biochemical sensors. The code can be used to study the hot exciton formation/relaxation dynamics in arrays of QDs of different shapes and sizes at different temperatures. It also can be used to understand the relation among the deposition parameters and inherent stresses, strain deformation, heat flow, and failure of nanostructures.

  14. Sensitivity of coded mask telescopes

    SciTech Connect

    Skinner, Gerald K

    2008-05-20

    Simple formulas are often used to estimate the sensitivity of coded mask x-ray or gamma-ray telescopes, but these are strictly applicable only if a number of basic assumptions are met. Complications arise, for example, if a grid structure is used to support the mask elements, if the detector spatial resolution is not good enough to completely resolve all the detail in the shadow of the mask, or if any of a number of other simplifying conditions are not fulfilled. We derive more general expressions for the Poisson-noise-limited sensitivity of astronomical telescopes using the coded mask technique, noting explicitly in what circumstances they are applicable. The emphasis is on using nomenclature and techniques that result in simple and revealing results. Where no convenient expression is available a procedure is given that allows the calculation of the sensitivity. We consider certain aspects of the optimization of the design of a coded mask telescope and show that when the detector spatial resolution and the mask to detector separation are fixed, the best source location accuracy is obtained when the mask elements are equal in size to the detector pixels.

  15. Sparse Coding for Alpha Matting.

    PubMed

    Johnson, Jubin; Varnousfaderani, Ehsan Shahrian; Cholakkal, Hisham; Rajan, Deepu

    2016-07-01

    Existing color sampling-based alpha matting methods use the compositing equation to estimate alpha at a pixel from the pairs of foreground ( F ) and background ( B ) samples. The quality of the matte depends on the selected ( F,B ) pairs. In this paper, the matting problem is reinterpreted as a sparse coding of pixel features, wherein the sum of the codes gives the estimate of the alpha matte from a set of unpaired F and B samples. A non-parametric probabilistic segmentation provides a certainty measure on the pixel belonging to foreground or background, based on which a dictionary is formed for use in sparse coding. By removing the restriction to conform to ( F,B ) pairs, this method allows for better alpha estimation from multiple F and B samples. The same framework is extended to videos, where the requirement of temporal coherence is handled effectively. Here, the dictionary is formed by samples from multiple frames. A multi-frame graph model, as opposed to a single image as for image matting, is proposed that can be solved efficiently in closed form. Quantitative and qualitative evaluations on a benchmark dataset are provided to show that the proposed method outperforms the current state-of-the-art in image and video matting.
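
    The core idea above can be sketched in a few lines: alpha at a pixel is read off from the code of its color over a dictionary of unpaired F and B sample colors. The sketch below substitutes non-negative least squares for the paper's sparse coding, and the RGB sample values are made up for illustration.

```python
import numpy as np
from scipy.optimize import nnls

# Estimate alpha as the fraction of a pixel's non-negative code that
# lands on foreground atoms of a small F/B color dictionary.

F_samples = np.array([[0.9, 0.1, 0.1],    # reddish foreground samples
                      [0.8, 0.2, 0.1]])
B_samples = np.array([[0.1, 0.2, 0.9],    # bluish background samples
                      [0.1, 0.1, 0.8]])
D = np.vstack([F_samples, B_samples]).T   # dictionary: columns are samples
n_f = len(F_samples)

def estimate_alpha(pixel):
    code, _ = nnls(D, pixel)              # non-negative code over F and B atoms
    total = code.sum()
    if total == 0:
        return 0.0
    return code[:n_f].sum() / total       # fraction of the code on F atoms

# A 50/50 composite of an F and a B sample should give alpha near 0.5.
blend = 0.5 * F_samples[0] + 0.5 * B_samples[0]
alpha = estimate_alpha(blend)
```

    Because the code is taken over unpaired samples, a pixel is free to explain itself with any mix of F and B atoms rather than a single (F,B) pair, which is the flexibility the abstract emphasizes.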

  16. Sparse Coding for Alpha Matting.

    PubMed

    Johnson, Jubin; Varnousfaderani, Ehsan; Cholakkal, Hisham; Rajan, Deepu

    2016-04-21

    Existing color sampling based alpha matting methods use the compositing equation to estimate alpha at a pixel from pairs of foreground (F) and background (B) samples. The quality of the matte depends on the selected (F,B) pairs. In this paper, the matting problem is reinterpreted as a sparse coding of pixel features, wherein the sum of the codes gives the estimate of the alpha matte from a set of unpaired F and B samples. A non-parametric probabilistic segmentation provides a certainty measure on the pixel belonging to foreground or background, based on which a dictionary is formed for use in sparse coding. By removing the restriction to conform to (F,B) pairs, this method allows for better alpha estimation from multiple F and B samples. The same framework is extended to videos, where the requirement of temporal coherence is handled effectively. Here, the dictionary is formed by samples from multiple frames. A multi-frame graph model, as opposed to a single image as for image matting, is proposed that can be solved efficiently in closed form. Quantitative and qualitative evaluations on a benchmark dataset are provided to show that the proposed method outperforms current state-of-the-art in image and video matting.

  17. Sensitivity of coded mask telescopes.

    PubMed

    Skinner, Gerald K

    2008-05-20

    Simple formulas are often used to estimate the sensitivity of coded mask x-ray or gamma-ray telescopes, but these are strictly applicable only if a number of basic assumptions are met. Complications arise, for example, if a grid structure is used to support the mask elements, if the detector spatial resolution is not good enough to completely resolve all the detail in the shadow of the mask, or if any of a number of other simplifying conditions are not fulfilled. We derive more general expressions for the Poisson-noise-limited sensitivity of astronomical telescopes using the coded mask technique, noting explicitly in what circumstances they are applicable. The emphasis is on using nomenclature and techniques that result in simple and revealing results. Where no convenient expression is available a procedure is given that allows the calculation of the sensitivity. We consider certain aspects of the optimization of the design of a coded mask telescope and show that when the detector spatial resolution and the mask to detector separation are fixed, the best source location accuracy is obtained when the mask elements are equal in size to the detector pixels.

  18. User instructions for the CIDER Dose Code

    SciTech Connect

    Eslinger, P.W.; Lessor, K.S.; Ouderkirk, S.J.

    1994-05-01

    This document provides user instructions for the CIDER (Calculation of Individual Doses from Environmental Radionuclides) computer code. The CIDER code computes estimates of annual doses for reference individuals with a known residence and food consumption history. This document also provides user instructions for four utility codes used to build input data libraries for CIDER. These utility codes are ENVFAC (environmental factors), FOOFAC (food factors), LIFFAC (lifestyle factors), and ORGFAC (organ factors). Finally, this document provides user instructions for the EXPAND utility code. The EXPAND code processes a result file from CIDER and extracts a summary of the dose information for reporting or plotting purposes.

  19. M-Code Benefits and Availability

    DTIC Science & Technology

    2015-04-29

    M-Code Benefits and Availability. Capt Travis Mills, SMC/GPEP, 29 Apr 15. Briefing slides; topics include M-Code benefits, the process for M-Code delivery and initial operations, and a forecast of M-Code availability for use.

  20. Tandem Mirror Reactor Systems Code (Version I)

    SciTech Connect

    Reid, R.L.; Finn, P.A.; Gohar, M.Y.; Barrett, R.J.; Gorker, G.E.; Spampinaton, P.T.; Bulmer, R.H.; Dorn, D.W.; Perkins, L.J.; Ghose, S.

    1985-09-01

    A computer code was developed to model a Tandem Mirror Reactor. This is the first Tandem Mirror Reactor model to couple, in detail, the highly linked physics, magnetics, and neutronic analysis into a single code. This report describes the code architecture, provides a summary description of the modules comprising the code, and includes an example execution of the Tandem Mirror Reactor Systems Code. Results from this code for two sensitivity studies are also included. These studies are: (1) to determine the impact of center cell plasma radius, length, and ion temperature on reactor cost and performance at constant fusion power; and (2) to determine the impact of reactor power level on cost.

  1. Combined trellis coding with asymmetric modulations

    NASA Technical Reports Server (NTRS)

    Divsalar, D.; Simon, M. K.

    1985-01-01

    The use of asymmetric signal constellations combined with optimized trellis coding to improve the performance of coded systems without increasing the average or peak power, or changing the bandwidth constraints of a system is discussed. The trellis code, asymmetric signal set, and Viterbi decoder of the system model are examined. The procedures for assigning signals to state transitions of the trellis code are described; the performance of the trellis coding system is evaluated. Examples of AM, QAM, and MPSK modulations with short memory trellis codes are presented.

  2. A concatenated coding scheme for error control

    NASA Technical Reports Server (NTRS)

    Lin, S.

    1985-01-01

    A concatenated coding scheme for error control in data communications was analyzed. The inner code is used for both error correction and detection, while the outer code is used only for error detection. A retransmission is requested if either the inner code decoder fails to make a successful decoding or the outer code decoder detects the presence of errors after the inner code decoding. The probability of undetected error of the proposed scheme is derived, and an efficient method for computing this probability is presented. The throughput efficiency of the proposed error control scheme incorporated with a selective-repeat ARQ retransmission strategy is analyzed.
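
    For the simplest detecting code, the undetected-error probability the abstract refers to has a closed form that is easy to check. The sketch below works it out for a single parity check of length n over a binary symmetric channel with crossover probability p (a stand-in example, not the codes analyzed in the report): an error pattern goes undetected exactly when it flips a nonzero, even number of bits.

```python
from math import comb

# Undetected-error probability of a length-n single parity check over a
# BSC(p): sum over even-weight, nonzero error patterns.

def p_undetected(n, p):
    """Direct sum of the binomial terms for even, nonzero error weights."""
    return sum(comb(n, w) * p**w * (1 - p)**(n - w)
               for w in range(2, n + 1, 2))

def p_undetected_closed_form(n, p):
    """Equivalent closed form: (1 + (1-2p)^n)/2 - (1-p)^n."""
    return (1 + (1 - 2 * p)**n) / 2 - (1 - p)**n

pud = p_undetected(8, 0.01)
```

    The closed form follows from splitting the binomial expansion of ((1-p)+p)^n and ((1-p)-p)^n into even-weight terms and subtracting the all-zero pattern.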

  3. Random Coding Bounds for DNA Codes Based on Fibonacci Ensembles of DNA Sequences

    DTIC Science & Technology

    2008-07-01

    A random coding bound on the rate of DNA codes is proved. To obtain the bound, we use ensembles of DNA sequences which are generalizations of the Fibonacci sequences. Subject terms: DNA codes, Fibonacci ensembles, DNA computing, code optimization.

  4. An implicit Smooth Particle Hydrodynamic code

    SciTech Connect

    Knapp, Charles E.

    2000-05-01

    An implicit version of the Smooth Particle Hydrodynamic (SPH) code SPHINX has been written and is working. In conjunction with the SPHINX code, the new implicit code models fluids and solids under a wide range of conditions. SPH codes are Lagrangian and meshless, and use particles to model the fluids and solids. The implicit code makes use of Krylov iterative techniques for solving large linear systems and a Newton-Raphson method for non-linear corrections. It uses numerical derivatives to construct the Jacobian matrix, and sparse techniques to save on memory storage and to reduce the amount of computation. It is believed that this is the first implicit SPH code to use Newton-Krylov techniques, and also the first implicit SPH code to model solids. A description of SPH and the techniques used in the implicit code are presented. Then, the results of a number of test cases are discussed, including a shock tube problem, a Rayleigh-Taylor problem, a breaking dam problem, and a single jet of gas problem. The results are shown to be in very good agreement with analytic solutions, experimental results, and the explicit SPHINX code. In the case of the single jet of gas, it has been demonstrated that the implicit code can do the problem in much shorter time than the explicit code. The problem was, however, very unphysical, but it does demonstrate the potential of the implicit code. It is a first step toward a useful implicit SPH code.
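
    The Newton-Krylov combination described above (numerical derivatives plus a Krylov linear solver) can be sketched on a small system. The 2-D test system below is an arbitrary illustration, not an SPH discretization; the Jacobian is never formed, only Jacobian-vector products approximated by finite differencing.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, gmres

# Newton's method in which each linear solve uses GMRES with matrix-free
# Jacobian-vector products obtained from a finite-difference directional
# derivative.

def F(u):
    x, y = u
    return np.array([x**2 + y**2 - 4.0,   # circle of radius 2
                     x - y])              # the line x = y

def newton_krylov(F, u0, tol=1e-10, max_iter=20, eps=1e-7):
    u = u0.astype(float)
    for _ in range(max_iter):
        r = F(u)
        if np.linalg.norm(r) < tol:
            break
        # Matrix-free J(u) v ~ (F(u + eps*v) - F(u)) / eps
        def jv(v):
            return (F(u + eps * v) - r) / eps
        J = LinearOperator((len(u), len(u)), matvec=jv)
        du, _ = gmres(J, -r)              # Krylov solve of the Newton system
        u = u + du
    return u

u = newton_krylov(F, np.array([1.0, 2.0]))
```

    The point of the matrix-free formulation is that only residual evaluations are needed, which is what makes the approach attractive when the Jacobian of a large SPH system is expensive to assemble.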

  5. Accumulate-Repeat-Accumulate-Accumulate-Codes

    NASA Technical Reports Server (NTRS)

    Divsalar, Dariush; Dolinar, Sam; Thorpe, Jeremy

    2004-01-01

    Inspired by recently proposed Accumulate-Repeat-Accumulate (ARA) codes [15], in this paper we propose a channel coding scheme called Accumulate-Repeat-Accumulate-Accumulate (ARAA) codes. These codes can be seen as serial turbo-like codes or as a subclass of Low Density Parity Check (LDPC) codes, and they have a projected graph or protograph representation; this allows for a high-speed iterative decoder implementation using belief propagation. An ARAA code can be viewed as a precoded Repeat-and-Accumulate (RA) code with puncturing in concatenation with another accumulator, where simply an accumulator is chosen as the precoder; thus ARAA codes have a very fast encoder structure. Using density evolution on their associated protographs, we find examples of rate-1/2 ARAA codes with maximum variable node degree 4 for which a minimum bit-SNR as low as 0.21 dB from the channel capacity limit can be achieved as the block size goes to infinity. Such a low threshold cannot be achieved by RA or Irregular RA (IRA) or unstructured irregular LDPC codes with the same constraint on the maximum variable node degree. Furthermore, by puncturing the accumulators we can construct families of higher rate ARAA codes with thresholds that stay uniformly close to their respective channel capacity thresholds. Iterative decoding simulation results show comparable performance with the best-known LDPC codes but with very low error floor even at moderate block sizes.
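
    The accumulator building block behind RA, ARA, and ARAA codes is just a running XOR (a rate-1 1/(1+D) convolutional code), and a plain repeat-accumulate encoder chains repetition, an interleaver, and an accumulator. The repeat factor and the fixed interleaver below are illustrative choices, not the protographs from the paper.

```python
# Accumulator and a plain repeat-accumulate (RA) encoder sketch.

def accumulate(bits):
    out, acc = [], 0
    for b in bits:
        acc ^= b          # y[k] = x[k] XOR y[k-1]
        out.append(acc)
    return out

def ra_encode(info, q=3):
    repeated = [b for b in info for _ in range(q)]  # repeat each bit q times
    # A fixed pseudo-random interleaver (any permutation serves the sketch).
    perm = sorted(range(len(repeated)),
                  key=lambda i: (7 * i + 3) % len(repeated))
    interleaved = [repeated[i] for i in perm]
    return accumulate(interleaved)                  # rate-1 accumulator

code = ra_encode([1, 0, 1, 1])
```

    An ARAA encoder, per the description above, would place one more accumulator before the repetition (as precoder) and another after it, with puncturing; each stage is as cheap as the running XOR shown here, which is why the encoder is fast.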

  6. User's manual for Axisymmetric Diffuser Duct (ADD) code. Volume 1: General ADD code description

    NASA Technical Reports Server (NTRS)

    Anderson, O. L.; Hankins, G. B., Jr.; Edwards, D. E.

    1982-01-01

    This User's Manual contains a complete description of the computer codes known as the AXISYMMETRIC DIFFUSER DUCT code or ADD code. It includes a list of references which describe the formulation of the ADD code and comparisons of calculations with experimental flows. The input/output and general use of the code are described in the first volume. The second volume contains a detailed description of the code including the global structure of the code, a list of FORTRAN variables, and descriptions of the subroutines. The third volume contains a detailed description of the CODUCT code which generates coordinate systems for arbitrary axisymmetric ducts.

  7. Modular optimization code package: MOZAIK

    NASA Astrophysics Data System (ADS)

    Bekar, Kursat B.

    This dissertation addresses the development of a modular optimization code package, MOZAIK, for geometric shape optimization problems in nuclear engineering applications. MOZAIK's first mission, determining the optimal shape of the D2O moderator tank for the current and new beam tube configurations for the Penn State Breazeale Reactor's (PSBR) beam port facility, is used to demonstrate its capabilities and test its performance. MOZAIK was designed as a modular optimization sequence including three primary independent modules: the initializer, the physics and the optimizer, each having a specific task. By using fixed interface blocks among the modules, the code attains its two most important characteristics: generic form and modularity. The benefit of this modular structure is that the contents of the modules can be switched depending on the requirements of accuracy, computational efficiency, or compatibility with the other modules. Oak Ridge National Laboratory's discrete ordinates transport code TORT was selected as the transport solver in the physics module of MOZAIK, and two different optimizers, Min-max and Genetic Algorithms (GA), were implemented in the optimizer module of the code package. A distributed memory parallelism was also applied to MOZAIK via MPI (Message Passing Interface) to execute the physics module concurrently on a number of processors for various states in the same search. Moreover, dynamic scheduling was enabled to enhance load balance among the processors while running MOZAIK's physics module thus improving the parallel speedup and efficiency. In this way, the total computation time consumed by the physics module is reduced by a factor close to M, where M is the number of processors. This capability also encourages the use of MOZAIK for shape optimization problems in nuclear applications because many traditional codes related to radiation transport do not have parallel execution capability. A set of computational models based on the

  8. Block truncation signature coding for hyperspectral analysis

    NASA Astrophysics Data System (ADS)

    Chakravarty, Sumit; Chang, Chein-I.

    2008-08-01

    This paper introduces a new signature coding scheme designed based on the well-known Block Truncation Coding (BTC). It comprises bit-maps of the signature blocks generated by different threshold criteria. Two new BTC-based algorithms are developed for signature coding, called Block Truncation Signature Coding (BTSC) and 2-level BTSC (2BTSC). In order to compare the developed BTC-based algorithms with current binary signature coding schemes, such as the Spectral Program Analysis Manager (SPAM) developed by Mazer et al. and Spectral Feature-based Binary Coding (SFBC) by Qian et al., three different thresholding functions (local block mean, local block gradient, and local block correlation) are derived to improve BTSC performance, where the combined bit-maps generated by these thresholds can provide better spectral signature characterization. Experimental results reveal that the new BTC-based signature coding performs more effectively in characterizing spectral variations than currently available binary signature coding methods.
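
    The block truncation idea underlying the scheme above is simple to sketch: threshold a block at its local mean to get a bit-map, then represent the two classes by their class means (the absolute-moment BTC variant). The sample block values are arbitrary illustrative data.

```python
import numpy as np

# Encode a block as (bit-map, low level, high level) and reconstruct it.

def btc_block(block):
    mean = block.mean()
    bitmap = block >= mean                      # 1 bits: values at/above the mean
    hi = block[bitmap].mean()                   # reconstruction level for 1 bits
    lo = block[~bitmap].mean() if (~bitmap).any() else mean
    return bitmap, lo, hi

def btc_reconstruct(bitmap, lo, hi):
    return np.where(bitmap, hi, lo)

block = np.array([12.0, 3.0, 9.0, 2.0, 14.0, 4.0, 8.0, 1.0])
bitmap, lo, hi = btc_block(block)
recon = btc_reconstruct(bitmap, lo, hi)
```

    In the signature-coding setting described above, only the bit-maps (possibly from several threshold criteria, such as local mean, gradient, or correlation) are kept as the spectral signature; the reconstruction levels matter for image compression but not for signature matching.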

  9. Dinucleotide circular codes and bijective transformations.

    PubMed

    Fimmel, Elena; Giannerini, Simone; Gonzalez, Diego Luis; Strüngmann, Lutz

    2015-12-07

    The presence of circular codes in mRNA coding sequences is postulated to be involved in informational mechanisms aimed at detecting and maintaining the normal reading frame during protein synthesis. Most of the recent research is focused on trinucleotide circular codes. However, dinucleotide circular codes are also important, since dinucleotides are ubiquitous in genomes and associated with important biological functions. In this work we adopt the group-theoretic approach used for trinucleotide codes in Fimmel et al. (2015) to study dinucleotide circular codes and highlight their symmetry properties. Moreover, we characterize such codes in terms of n-circularity and provide a graph representation that allows one to visualize them geometrically. The results establish a theoretical framework for the study of the biological implications of dinucleotide circular codes in genomic sequences.

  10. Robust coding over noisy overcomplete channels.

    PubMed

    Doi, Eizaburo; Balcan, Doru C; Lewicki, Michael S

    2007-02-01

    We address the problem of robust coding in which the signal information should be preserved in spite of intrinsic noise in the representation. We present a theoretical analysis for 1- and 2-D cases and characterize the optimal linear encoder and decoder in the mean-squared error sense. Our analysis allows for an arbitrary number of coding units, thus including both under- and over-complete representations, and provides insights into optimal coding strategies. In particular, we show how the form of the code adapts to the number of coding units and to different data and noise conditions in order to achieve robustness. We also present numerical solutions of robust coding for high-dimensional image data, demonstrating that these codes are substantially more robust than other linear image coding methods such as PCA, ICA, and wavelets.
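For a fixed linear encoder with additive noise on the coding units, the MSE-optimal linear decoder takes the standard Wiener form. The sketch below illustrates only that decoder-side result under assumed Gaussian statistics; it is not the paper's full analysis, which also adapts the encoder:

```python
import numpy as np

rng = np.random.default_rng(0)

# A fixed, overcomplete linear encoder: m = 4 coding units for n = 2 inputs.
# (Illustrative stand-in; the paper optimizes the encoder as well.)
n, m = 2, 4
A = rng.normal(size=(m, n))
Cx = np.eye(n)        # unit-variance, uncorrelated source (assumption)
sigma2 = 0.5          # variance of intrinsic noise on each coding unit

# MSE-optimal linear decoder for r = A x + noise (standard Wiener form):
#   W = Cx A^T (A Cx A^T + sigma2 I)^{-1}
M = A @ Cx @ A.T + sigma2 * np.eye(m)
W = Cx @ A.T @ np.linalg.inv(M)

# Expected reconstruction error of the optimal decoder vs. a naive
# pseudoinverse decoder that ignores the representation noise.
err_wiener = np.trace(Cx - W @ A @ Cx)
P = np.linalg.pinv(A)
err_pinv = sigma2 * np.trace(P @ P.T)
print(err_wiener < err_pinv)  # the noise-aware decoder has lower MSE
```

The gap between the two errors grows with the noise variance, which is one sense in which the optimal code "adapts" to the noise conditions.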

  11. A Better Handoff for Code Officials

    SciTech Connect

    Conover, David R.; Yerkes, Sara

    2010-09-24

    The U.S. Department of Energy's Building Energy Codes Program has partnered with ICC to release the new Building Energy Codes Resource Guide: Code Officials Edition. We created this binder of practical materials for a simple reason: code officials are busy learning and enforcing several codes at once for the diverse buildings across their jurisdictions. This doesn’t leave much time to search www.energycodes.gov, www.iccsafe.org, or the range of other helpful web-based resources for the latest energy codes tools, support, and information. So, we decided to bring the most relevant materials to code officials in a way that works best with their daily routine, and point to where they can find even more. Like a coach’s game plan, the Resource Guide is an "energy playbook" for code officials.

  12. Residential construction code impacts on radon

    SciTech Connect

    Galbraith, S.; Brennan, T.; Osborne, M.C.

    1988-04-01

    The paper discusses residential construction-code impacts on radon. It references existing residential construction codes that pertain to the elements of construction that impact either the ability to seal radon out of houses or the ability to achieve good soil ventilation for radon control. Several inconsistencies in the codes that will impact radon-resistant construction are identified. Resolution of these issues is necessary before specification-style building codes can be developed to achieve radon-resistant construction.

  13. Temporal Coding of Volumetric Imagery

    NASA Astrophysics Data System (ADS)

    Llull, Patrick Ryan

    'Image volumes' refer to realizations of images in other dimensions such as time, spectrum, and focus. Recent advances in scientific, medical, and consumer applications demand improvements in image volume capture. Though image volume acquisition continues to advance, it maintains the same sampling mechanisms that have been used for decades; every voxel must be scanned and is presumed independent of its neighbors. Under these conditions, improving performance comes at the cost of increased system complexity, data rates, and power consumption. This dissertation explores systems and methods capable of efficiently improving sensitivity and performance for image volume cameras, and specifically proposes several sampling strategies that utilize temporal coding to improve imaging system performance and enhance our awareness for a variety of dynamic applications. Video cameras and camcorders sample the video volume (x,y,t) at fixed intervals to gain understanding of the volume's temporal evolution. Conventionally, one must reduce the spatial resolution to increase the framerate of such cameras. Using temporal coding via physical translation of an optical element known as a coded aperture, the coded aperture compressive temporal imaging (CACTI) camera demonstrates a method with which to embed the temporal dimension of the video volume into spatial (x,y) measurements, thereby greatly improving temporal resolution with minimal loss of spatial resolution. This technique, which is among a family of compressive sampling strategies developed at Duke University, temporally codes the exposure readout functions at the pixel level. Since video cameras nominally integrate the remaining image volume dimensions (e.g. spectrum and focus) at capture time, spectral (x,y,t,lambda) and focal (x,y,t,z) image volumes are traditionally captured via sequential changes to the spectral and focal state of the system, respectively.
The CACTI camera's ability to embed video volumes into images leads to exploration

  14. A Pseudorandom Code Modulated LIDAR

    NASA Astrophysics Data System (ADS)

    Hunt, K. P.; Eichinger, W. E.; Kruger, A.

    2009-12-01

    Typical Light Detection and Ranging (LIDAR) uses high power pulsed lasers to ensure a detectable return signal. For short ranges, modulated diode lasers offer an attractive alternative, particularly in the areas of size, weight, cost, eye safety and use of energy. Flexible electronic modulation of the laser diode allows the development of pseudorandom code (PRC) LIDAR systems that can overcome the disadvantage of low output power and thus low signal to noise ratios. Different PRCs have been proposed. For example, so called M-sequences can be generated simply, but are unbalanced: they have more ones than zeros, which results in a residual noise component. Other sequences such as the A1 and A2 sequences are balanced, but have two autocorrelation peaks, resulting in undesirable pickup of signals from different ranges. In this work, we investigate a new code, an M-sequence with a zero added at the end. The result is still easily generated and has a single autocorrelation peak, but is now balanced. We loaded these sequences into a commercial arbitrary waveform generator (ARB), an Agilent 33250A, which then modulates the laser diode. This allows sequences to be changed quickly and easily, permitting us to design and investigate a wide range of PRC sequences with desirable properties. The ARB modulates a Melles Griot 56ICS near infrared laser diode at a 10 MHz chip rate. Backscatter is collected and focused by a telescope and the detected signal is sampled and correlated with the known PRC. We have gathered data from this LIDAR system and experimentally assessed the performance of this new class of codes.
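The construction described, an M-sequence with one zero appended to restore balance, can be sketched with a small LFSR. The taps below encode one standard maximal-length recurrence (x^5 + x^3 + 1) chosen purely for illustration; the paper's actual sequence lengths and generators may differ:

```python
def m_sequence(taps, nbits):
    """Maximal-length sequence from a simple Fibonacci LFSR.
    taps=[1, 4] implements b_t = b_{t-2} XOR b_{t-5}
    (characteristic polynomial x^5 + x^3 + 1, primitive)."""
    state = [1] * nbits          # any nonzero seed works
    out = []
    for _ in range(2 ** nbits - 1):
        out.append(state[-1])    # emit the oldest bit
        fb = 0
        for t in taps:
            fb ^= state[t]
        state = [fb] + state[:-1]
    return out

seq = m_sequence(taps=[1, 4], nbits=5)
balanced = seq + [0]             # append one zero to balance the code
print(sum(seq), len(balanced))   # 16 ones in 31 chips; 32 chips after balancing
```

An M-sequence of period 2^n - 1 always has 2^(n-1) ones and 2^(n-1) - 1 zeros, so appending a single zero equalizes the counts, which is the balancing step the abstract relies on.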

  15. Power System Optimization Codes Modified

    NASA Technical Reports Server (NTRS)

    Juhasz, Albert J.

    1999-01-01

    A major modification of and addition to existing Closed Brayton Cycle (CBC) space power system optimization codes was completed. These modifications relate to the global minimum mass search driver programs containing three nested iteration loops comprising iterations on cycle temperature ratio, and three separate pressure ratio iteration loops--one loop for maximizing thermodynamic efficiency, one for minimizing radiator area, and a final loop for minimizing overall power system mass. Using the method of steepest ascent, the code sweeps through the pressure ratio space repeatedly, each time with smaller iteration step sizes, so that the three optimum pressure ratios can be obtained to any desired accuracy for each of the objective functions referred to above (i.e., maximum thermodynamic efficiency, minimum radiator area, and minimum system mass). Two separate options for the power system heat source are available: (1) A nuclear fission reactor can be used. It is provided with a radiation shield (composed of a lithium hydride (LiH) neutron shield and a tungsten (W) gamma shield). Suboptions can be used to select the type of reactor (i.e., fast spectrum liquid metal cooled or epithermal high-temperature gas reactor (HTGR)). (2) A solar heat source can be used. This option includes a parabolic concentrator and heat receiver for raising the temperature of the recirculating working fluid. A useful feature of the code modifications is that key cycle parameters are displayed, including the overall system specific mass in kilograms per kilowatt and the system specific power in watts per kilogram, as the results for each temperature ratio are computed. As the minimum mass temperature ratio is encountered, a message is printed out. Several levels of detailed information on cycle state points, subsystem mass results, and radiator temperature profiles are stored for this temperature ratio condition and can be displayed or printed by users.
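The repeated sweep with shrinking step sizes can be sketched generically. The objective and bounds below are toy stand-ins, not the actual CBC cycle model; the point is only the coarse-to-fine pattern of locating an optimum pressure ratio to any desired accuracy:

```python
def refine_sweep(f, lo, hi, passes=4, points=11):
    """Coarse-to-fine sweep: scan the interval on a grid, keep the best
    point, shrink the interval around it, and repeat with smaller steps
    (a simplified analogue of the code's pressure-ratio iteration)."""
    best = lo
    for _ in range(passes):
        step = (hi - lo) / (points - 1)
        grid = [lo + i * step for i in range(points)]
        best = min(grid, key=f)                      # best point this pass
        lo, hi = max(lo, best - step), min(hi, best + step)
    return best

# Toy objective standing in for system mass vs. compressor pressure ratio.
mass = lambda pr: (pr - 2.3) ** 2 + 5.0
pr_opt = refine_sweep(mass, 1.0, 4.0)
print(round(pr_opt, 2))
```

Each pass narrows the bracket to one grid step on either side of the current best point, so the achievable accuracy improves geometrically with the number of passes.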

  16. Design of additive quantum codes via the code-word-stabilized framework

    NASA Astrophysics Data System (ADS)

    Kovalev, Alexey A.; Dumer, Ilya; Pryadko, Leonid P.

    2011-12-01

    We consider design of the quantum stabilizer codes via a two-step, low-complexity approach based on the framework of codeword-stabilized (CWS) codes. In this framework, each quantum CWS code can be specified by a graph and a binary code. For codes that can be obtained from a given graph, we give several upper bounds on the distance of a generic (additive or nonadditive) CWS code, and a Gilbert-Varshamov lower bound for the existence of additive CWS codes. We also consider additive cyclic CWS codes and show that these codes correspond to a previously unexplored class of single-generator cyclic stabilizer codes. We present several families of simple stabilizer codes with relatively good parameters.

  17. Anticounterfeit holographic marks with secret codes

    NASA Astrophysics Data System (ADS)

    Liu, Shou; Zhang, Xiangsu; Lai, Hongkai

    1993-11-01

    The paper introduces methods of making secret codes in holograms for the purpose of anti-counterfeiting, especially the production of two kinds of visual holographic secret codes. The optical arrangements for recording are presented, and the effective results of incorporating the visual secret codes into holographic trademarks are reported.

  18. Subband coding for image data archiving

    NASA Technical Reports Server (NTRS)

    Glover, Daniel; Kwatra, S. C.

    1993-01-01

    The use of subband coding on image data is discussed. An overview of subband coding is given. Advantages of subbanding for browsing and progressive resolution are presented. Implementations for lossless and lossy coding are discussed. Algorithm considerations and simple implementations of subband systems are given.
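A minimal illustration of the subbanding idea (not the paper's implementation): a one-level integer Haar split (the S-transform) is exactly invertible, so the low band alone provides a half-resolution browse version while keeping both bands supports lossless reconstruction:

```python
import numpy as np

def haar_split(x):
    """One-level integer Haar (S-transform) split. Lossless: the low
    band is a half-resolution browse signal, and low+high reconstruct
    the input exactly."""
    a, b = x[0::2].astype(np.int64), x[1::2].astype(np.int64)
    high = a - b
    low = b + high // 2          # == floor((a + b) / 2)
    return low, high

def haar_merge(low, high):
    """Exact inverse of haar_split."""
    b = low - high // 2
    a = b + high
    out = np.empty(low.size * 2, dtype=np.int64)
    out[0::2], out[1::2] = a, b
    return out

x = np.array([12, 10, 9, 14, 200, 202, 7, 7])
low, high = haar_split(x)
assert np.array_equal(haar_merge(low, high), x)   # lossless round trip
print(low)   # half-resolution "browse" version of the signal
```

Applying the split recursively to the low band yields the progressive-resolution hierarchy the abstract describes.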

  19. 1 CFR 22.6 - Code designation.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 1 General Provisions 1 2010-01-01 2010-01-01 false Code designation. 22.6 Section 22.6 General... DOCUMENTS PREPARATION OF NOTICES AND PROPOSED RULES Proposed Rules § 22.6 Code designation. The area of the Code of Federal Regulations directly affected by a proposed regulatory action shall be identified...

  20. 75 FR 20833 - Building Energy Codes

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-21

    ... of Energy Efficiency and Renewable Energy Building Energy Codes AGENCY: Office of Energy Efficiency..., implement, enforce, and assess compliance with the current model building energy codes or their equivalent... compliance with the latest model building energy codes, Standard 90.1-2007, Energy Standard for...

  1. Ultra-narrow bandwidth voice coding

    DOEpatents

    Holzrichter, John F.; Ng, Lawrence C.

    2007-01-09

    A system of removing excess information from a human speech signal and coding the remaining signal information, transmitting the coded signal, and reconstructing the coded signal. The system uses one or more EM wave sensors and one or more acoustic microphones to determine at least one characteristic of the human speech signal.

  2. The general theory of convolutional codes

    NASA Technical Reports Server (NTRS)

    Mceliece, R. J.; Stanley, R. P.

    1993-01-01

    This article presents a self-contained introduction to the algebraic theory of convolutional codes. This introduction is partly a tutorial, but at the same time contains a number of new results which will prove useful for designers of advanced telecommunication systems. Among the new concepts introduced here are the Hilbert series for a convolutional code and the class of compact codes.

  3. State energy codes: An uphill battle

    SciTech Connect

    Bodzin, S.

    1997-03-01

    Energy codes have helped many states and counties achieve higher efficiency in new construction, but builders and efficiency advocates continue to struggle over how and when to change these codes. This article presents state-by-state residential energy codes as well as a discussion of the problems. 1 fig., 2 tabs.

  4. Subband coding for image data archiving

    NASA Technical Reports Server (NTRS)

    Glover, D.; Kwatra, S. C.

    1992-01-01

    The use of subband coding on image data is discussed. An overview of subband coding is given. Advantages of subbanding for browsing and progressive resolution are presented. Implementations for lossless and lossy coding are discussed. Algorithm considerations and simple implementations of subband systems are given.

  5. Improved Panel-Method/Potential-Flow Code

    NASA Technical Reports Server (NTRS)

    Ashby, Dale L.

    1991-01-01

    Panel code PMARC (Panel Method Ames Research Center) numerically simulates flow field around complex three-dimensional bodies, such as complete aircraft models. Based on potential-flow theory. Written in FORTRAN 77, with exception of namelist extension used for input. Structure facilitates addition of new features to code and tailoring of code to specific problems and computer hardware constraints.

  6. TRACK : the new beam dynamics code.

    SciTech Connect

    Aseev, V. N.; Ostroumov, P. N.; Lessner, E. S.; Mustapha, B.; Physics

    2005-01-01

    The new ray-tracing code TRACK originally developed to fulfill the special requirements of the RIA accelerator systems is a general beam dynamics code. It is currently being used for the design and simulation of future proton and heavy-ion linacs at several Labs. This paper presents a general description of the code TRACK emphasizing its main new features and recent updates.

  7. Neural Elements for Predictive Coding

    PubMed Central

    Shipp, Stewart

    2016-01-01

    Predictive coding theories of sensory brain function interpret the hierarchical construction of the cerebral cortex as a Bayesian, generative model capable of predicting the sensory data consistent with any given percept. Predictions are fed backward in the hierarchy and reciprocated by prediction error in the forward direction, acting to modify the representation of the outside world at increasing levels of abstraction, and so to optimize the nature of perception over a series of iterations. This accounts for many ‘illusory’ instances of perception where what is seen (heard, etc.) is unduly influenced by what is expected, based on past experience. This simple conception, the hierarchical exchange of prediction and prediction error, confronts a rich cortical microcircuitry that is yet to be fully documented. This article presents the view that, in the current state of theory and practice, it is profitable to begin a two-way exchange: that predictive coding theory can support an understanding of cortical microcircuit function, and prompt particular aspects of future investigation, whilst existing knowledge of microcircuitry can, in return, influence theoretical development. As an example, a neural inference arising from the earliest formulations of predictive coding is that the source populations of forward and backward pathways should be completely separate, given their functional distinction; this aspect of circuitry – that neurons with extrinsically bifurcating axons do not project in both directions – has only recently been confirmed. Here, the computational architecture prescribed by a generalized (free-energy) formulation of predictive coding is combined with the classic ‘canonical microcircuit’ and the laminar architecture of hierarchical extrinsic connectivity to produce a template schematic, that is further examined in the light of (a) updates in the microcircuitry of primate visual cortex, and (b) rapid technical advances made possible by

  8. Simplified Decoding of Convolutional Codes

    NASA Technical Reports Server (NTRS)

    Truong, T. K.; Reed, I. S.

    1986-01-01

    Some complicated intermediate steps shortened or eliminated. Decoding of convolutional error-correcting digital codes simplified by new error-trellis syndrome technique. In new technique, syndrome vector not computed. Instead, advantage taken of newly derived mathematical identities to simplify decision tree, folding it back on itself into form called "error trellis." This trellis is a graph of all path solutions of syndrome equations. Each path through trellis corresponds to specific set of decisions as to received digits. Existing decoding algorithms combined with new mathematical identities reduce number of combinations of errors considered and enable computation of correction vector directly from data and check bits as received.

  9. Mosaic of coded aperture arrays

    DOEpatents

    Fenimore, Edward E.; Cannon, Thomas M.

    1980-01-01

    The present invention pertains to a mosaic of coded aperture arrays which is capable of imaging off-axis sources with minimum detector size. Mosaics of the basic array pattern create a circular, or periodic, correlation of the object on a section of the picture plane. This section consists of elements of the central basic pattern as well as elements from neighboring patterns and is a cyclic version of the basic pattern. Since all object points contribute a complete cyclic version of the basic pattern, a section of the picture, which is the size of the basic aperture pattern, contains all the information necessary to image the object with no artifacts.

  10. Pump CFD code validation tests

    NASA Technical Reports Server (NTRS)

    Brozowski, L. A.

    1993-01-01

    Pump CFD code validation tests were accomplished by obtaining nonintrusive flow characteristic data at key locations in generic current liquid rocket engine turbopump configurations. Data were obtained with a laser two-focus (L2F) velocimeter at scaled design flow. Three components were surveyed: a 1970's-designed impeller, a 1990's-designed impeller, and a four-bladed unshrouded inducer. Two-dimensional velocities were measured upstream and downstream of the two impellers. Three-dimensional velocities were measured upstream, downstream, and within the blade row of the unshrouded inducer.

  11. Princeton spectral equilibrium code: PSEC

    SciTech Connect

    Ling, K.M.; Jardin, S.C.

    1984-03-01

    A fast computer code has been developed to calculate free-boundary solutions to the plasma equilibrium equation that are consistent with the currents in external coils and conductors. The free-boundary formulation is based on the minimization of a mean-square error epsilon while the fixed-boundary solution is based on a variational principle and spectral representation of the coordinates x(psi,theta) and z(psi,theta). Specific calculations using the Columbia University Torus II, the Poloidal Divertor Experiment (PDX), and the Tokamak Fusion Test Reactor (TFTR) geometries are performed.

  12. Using the DEWSBR computer code

    SciTech Connect

    Cable, G.D.

    1989-09-01

    A computer code is described which is designed to determine the fraction of time during which a given ground location is observable from one or more members of a satellite constellation in earth orbit. Ground visibility parameters are determined from the orientation and strength of an appropriate ionized cylinder (used to simulate a beam experiment) at the selected location. Satellite orbits are computed in a simplified two-body approximation computation. A variety of printed and graphical outputs is provided. 9 refs., 50 figs., 2 tabs.

  13. Euclidean Decoders for BCH Codes

    DTIC Science & Technology

    1988-04-01

    The decoding problem for BCH codes is to solve this set of 2t (nonlinear simultaneous) equations for the v unknown error locations Xv, the v unknown error magnitudes Yj, and the t unknown erasure locations.

  14. Engineering and Design: Interim Procedure for Specifying Earthquake Motions

    DTIC Science & Technology

    2007-11-02

    Earthquake Engineering, Istituto di Scienza e Tecnica delle Costruzioni, Politecnico di Milano, Piazza Leonardo da Vinci 32, 20133 Milano, Italia. 17. Seed, H. B., Murarka, R., Lysmer, J., and Idriss, I. M. 1976

  15. Improving Security in the ATLAS PanDA System

    NASA Astrophysics Data System (ADS)

    Caballero, J.; Maeno, T.; Nilsson, P.; Stewart, G.; Potekhin, M.; Wenaus, T.

    2011-12-01

    The security challenges faced by users of the grid are considerably different from those faced in previous environments. The adoption of pilot job systems by LHC experiments has mitigated many of the problems associated with the inhomogeneities found on the grid and has greatly improved job reliability; however, pilot job systems themselves must then address many security issues, including the execution of multiple users' code under a common 'grid' identity. In this paper we describe the improvements and evolution of the security model in the ATLAS PanDA (Production and Distributed Analysis) system. We describe the security in the PanDA server which is in place to ensure that only authorized members of the VO are allowed to submit work into the system and that jobs are properly audited and monitored. We discuss the security in place between the pilot code itself and the PanDA server, ensuring that only properly authenticated workload is delivered to the pilot for execution. When the code to be executed is from a 'normal' ATLAS user, as opposed to the production system or other privileged actor, then the pilot may use an EGEE-developed identity-switching tool called gLExec. This changes the grid proxy available to the job and also switches the UNIX user identity to protect the privileges of the pilot code proxy. We describe the problems in using this system and how they are overcome. Finally, we discuss security drills which have been run using PanDA and show how these improved our operational security procedures.

  16. Development of the Code RITRACKS

    NASA Technical Reports Server (NTRS)

    Plante, Ianik; Cucinotta, Francis A.

    2013-01-01

    A document discusses the code RITRACKS (Relativistic Ion Tracks), which was developed to simulate heavy ion track structure at the microscopic and nanoscopic scales. It is a Monte-Carlo code that simulates the production of radiolytic species in water, event-by-event, and which may be used to simulate tracks and also to calculate dose in targets and voxels of different sizes. The dose deposited by the radiation can be calculated in nanovolumes (voxels). RITRACKS allows simulation of radiation tracks without the need of extensive knowledge of computer programming or Monte-Carlo simulations. It is installed as a regular application on Windows systems. The main input parameters entered by the user are the type and energy of the ion, the length and size of the irradiated volume, the number of ions impacting the volume, and the number of histories. The simulation can be started after the input parameters are entered in the GUI. The number of each kind of interactions for each track is shown in the result details window. The tracks can be visualized in 3D after the simulation is complete. It is also possible to see the time evolution of the tracks and zoom on specific parts of the tracks. The software RITRACKS can be very useful for radiation scientists to investigate various problems in the fields of radiation physics, radiation chemistry, and radiation biology. For example, it can be used to simulate electron ejection experiments (radiation physics).

  17. Neural Coding for Effective Rehabilitation

    PubMed Central

    2014-01-01

    Successful neurological rehabilitation depends on accurate diagnosis, effective treatment, and quantitative evaluation. Neural coding, a technology for interpretation of functional and structural information of the nervous system, has contributed to the advancements in neuroimaging, brain-machine interface (BMI), and design of training devices for rehabilitation purposes. In this review, we summarized the latest breakthroughs in neuroimaging from microscale to macroscale levels with potential diagnostic applications for rehabilitation. We also reviewed the achievements in electrocorticography (ECoG) coding with both animal models and human beings for BMI design, electromyography (EMG) interpretation for interaction with external robotic systems, and robot-assisted quantitative evaluation on the progress of rehabilitation programs. Future rehabilitation would be more home-based, automatic, and self-served by patients. Further investigations and breakthroughs are mainly needed in aspects of improving the computational efficiency in neuroimaging and multichannel ECoG by selection of localized neuroinformatics, validation of the effectiveness in BMI guided rehabilitation programs, and simplification of the system operation in training devices. PMID:25258708

  18. Transform coding for space applications

    NASA Technical Reports Server (NTRS)

    Glover, Daniel

    1993-01-01

    Data compression coding requirements for aerospace applications differ somewhat from the compression requirements for entertainment systems. On the one hand, entertainment applications are bit rate driven with the goal of getting the best quality possible with a given bandwidth. Science applications are quality driven with the goal of getting the lowest bit rate for a given level of reconstruction quality. In the past, the required quality level has been nothing less than perfect allowing only the use of lossless compression methods (if that). With the advent of better, faster, cheaper missions, an opportunity has arisen for lossy data compression methods to find a use in science applications as requirements for perfect quality reconstruction run into cost constraints. This paper presents a review of the data compression problem from the space application perspective. Transform coding techniques are described and some simple, integer transforms are presented. The application of these transforms to space-based data compression problems is discussed. Integer transforms have an advantage over conventional transforms in computational complexity. Space applications are different from broadcast or entertainment in that it is desirable to have a simple encoder (in space) and tolerate a more complicated decoder (on the ground) rather than vice versa. Energy compaction of the new transforms is compared with that of the Walsh-Hadamard (WHT), Discrete Cosine (DCT), and Integer Cosine (ICT) transforms.
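As a small illustration of why integer transforms suit a simple onboard encoder, the unnormalized Walsh-Hadamard transform needs only additions and subtractions, yet compacts most of a smooth signal's energy into a few coefficients. The routine below is a generic textbook fast WHT, not code from the paper:

```python
import numpy as np

def wht(x):
    """Naive fast Walsh-Hadamard transform (additions and subtractions
    only). Input length must be a power of two. Unnormalized, so
    applying it twice returns len(x) times the original signal."""
    x = np.asarray(x, dtype=np.int64).copy()
    h = 1
    while h < len(x):
        for i in range(0, len(x), 2 * h):
            a = x[i:i + h].copy()
            b = x[i + h:i + 2 * h].copy()
            x[i:i + h] = a + b          # butterfly: sum band
            x[i + h:i + 2 * h] = a - b  # butterfly: difference band
        h *= 2
    return x

smooth = np.arange(8) * 3 + 10        # smooth ramp, typical of image rows
coeffs = wht(smooth)
energy = coeffs.astype(float) ** 2
print(energy[0] / energy.sum())       # most energy lands in one coefficient
```

For this ramp, roughly 90% of the total energy falls in the DC coefficient, which is the energy-compaction property that makes coarse quantization of the remaining coefficients cheap.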

  19. GeoPhysical Analysis Code

    SciTech Connect

    2012-12-21

    GPAC is a code that integrates open source libraries for element formulations, linear algebra, and I/O with two main LLNL-written components: (i) a set of standard finite, discrete, and discontinuous displacement element physics solvers for resolving Darcy fluid flow, explicit mechanics, implicit mechanics, fault rupture and earthquake nucleation, and fluid-mediated fracturing, including resolution of physical behaviors both implicitly and explicitly, and (ii) an MPI-based parallelization implementation for use on generic HPC distributed memory architectures. The resultant code can be used alone for linearly elastic problems; problems involving hydraulic fracturing, where the mesh topology is dynamically changed; fault rupture modeling and seismic risk assessment; and general granular materials behavior. The key application domain is low-rate stimulation and fracture control in subsurface reservoirs (e.g., enhanced geothermal sites and unconventional shale gas stimulation). GPAC also has interfaces to call external libraries for, e.g., material models and equations of state; however, LLNL-developed EOS and material models will not be part of the current release. GPAC's secondary applications include modeling fault evolution for predicting the statistical distribution of earthquake events and capturing granular materials behavior under different load paths.

  20. The Clawpack Community of Codes

    NASA Astrophysics Data System (ADS)

    Mandli, K. T.; LeVeque, R. J.; Ketcheson, D.; Ahmadia, A. J.

    2014-12-01

    Clawpack, the Conservation Laws Package, has long been one of the standards for solving hyperbolic conservation laws but over the years has extended well beyond this role. Today a community of open-source codes has been developed that addresses a multitude of different needs, including non-conservative balance laws, high-order accurate methods, and parallelism, while remaining extensible and easy to use, largely by the judicious use of Python and the original Fortran codes that it wraps. This talk will present some of the recent developments in projects under the Clawpack umbrella, notably the GeoClaw and PyClaw projects. GeoClaw was originally developed as a tool for simulating tsunamis using adaptive mesh refinement but has since encompassed a large number of other geophysically relevant flows, including storm surge and debris flows. PyClaw originated as a Python version of the original Clawpack algorithms but has since been both a testing ground for new algorithmic advances in the Clawpack framework and an easily extensible framework for solving hyperbolic balance laws. Some of these extensions include the addition of WENO high-order methods, massively parallel capabilities, and adaptive mesh refinement technologies, made possible largely by the flexibility of the Python language and community libraries such as NumPy and PETSc. Because of the tight integration with Python technologies, both packages have also benefited from the focus on reproducibility in the Python community, notably IPython notebooks.

  1. Decoding the LIM development code.

    PubMed

    Gill, Gordon N

    2003-01-01

    During development a vast number of distinct cell types arise from dividing progenitor cells. Concentration gradients of ligands that act via cell surface receptors signal transcriptional regulators that repress and activate particular genes. LIM homeodomain proteins are an important class of transcriptional regulators that direct cell fate. Although in C. elegans only a single LIM homeodomain protein is expressed in a particular cell type, in vertebrates combinations of LIM homeodomain proteins are expressed in cells that determine cell fates. We have investigated the molecular basis of the LIM domain "code" that determines cell fates such as wing formation in Drosophila and motor neuron formation in chicks. The basic code is a homotetramer of 2 LIM homeodomain proteins bridged by the adaptor protein, nuclear LIM interactor (NLI). A more complex molecular language consisting of a hexamer complex involving NLI and 2 LIM homeodomain proteins, Lhx3 and Isl1, determines ventral motor neuron formation. The same molecular "words" adopt different meanings depending on the context of expression of other molecular "words."

  2. DNA: Polymer and molecular code

    NASA Astrophysics Data System (ADS)

    Shivashankar, G. V.

    1999-10-01

    The thesis work focuses upon two aspects of DNA: the polymer and the molecular code. Our approach was to bring single-molecule micromanipulation methods to the study of DNA. It included a home-built optical microscope combined with an atomic force microscope and an optical tweezer. This combined approach led to a novel method to graft a single DNA molecule onto a force cantilever using the optical tweezer and local heating. With this method, a force-versus-extension assay of double-stranded DNA was realized. The resolution was about 10 pN. To improve on this force measurement resolution, a simple light backscattering technique was developed and used to probe the DNA polymer flexibility and its fluctuations. It combined the optical tweezer, to trap a DNA-tethered bead, with laser backscattering to detect the bead's Brownian fluctuations. With this technique the resolution was about 0.1 pN with a millisecond access time, and the whole entropic part of the DNA force-extension curve was measured. With this experimental strategy, we measured the polymerization of the protein RecA on an isolated double-stranded DNA. We observed the progressive decoration of RecA on the λ DNA molecule, which results in the extension of λ due to unwinding of the double helix. The dynamics of polymerization, the resulting change in the DNA entropic elasticity, and the role of ATP hydrolysis were the main parts of the study. A simple model for RecA assembly on DNA was proposed. This work presents a first step in the study of genetic recombination. Recently we have started a study of equilibrium binding which utilizes fluorescence polarization methods to probe the polymerization of RecA on single-stranded DNA. In addition to the study of material properties of DNA and DNA-RecA, we have developed experiments for which the code of the DNA is central. We studied one aspect of DNA as a molecular code, using different techniques. In particular the programmatic use of template specificity makes
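The entropic force-extension behavior measured in this work is conventionally described by the worm-like-chain model. As an illustration (not taken from the thesis itself), the standard Marko-Siggia interpolation formula can be evaluated as follows, using a typical dsDNA persistence length of 50 nm:

```python
def wlc_force(x_frac, persistence_nm=50.0, temperature_K=298.0):
    """Marko-Siggia worm-like-chain interpolation: entropic force (in pN)
    of a polymer stretched to a fraction x_frac of its contour length.
    A persistence length of ~50 nm is typical for double-stranded DNA."""
    kT_pN_nm = 0.013806 * temperature_K  # Boltzmann constant in pN*nm/K, times T
    s = x_frac
    return (kT_pN_nm / persistence_nm) * (1.0 / (4.0 * (1.0 - s) ** 2) - 0.25 + s)
```

At half extension the predicted force is of order 0.1 pN, consistent with the sub-piconewton resolution the backscattering technique was built to reach; the force diverges as the extension approaches the contour length.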

  3. Migration of ATLAS PanDA to CERN

    NASA Astrophysics Data System (ADS)

    Stewart, Graeme Andrew; Klimentov, Alexei; Koblitz, Birger; Lamanna, Massimo; Maeno, Tadashi; Nevski, Pavel; Nowak, Marcin; Emanuel De Castro Faria Salgado, Pedro; Wenaus, Torre

    2010-04-01

    The ATLAS Production and Distributed Analysis System (PanDA) is a key component of the ATLAS distributed computing infrastructure. All ATLAS production jobs, and a substantial amount of user and group analysis jobs, pass through the PanDA system, which manages their execution on the grid. PanDA also plays a key role in production task definition and the data set replication request system. PanDA has recently been migrated from Brookhaven National Laboratory (BNL) to the European Organization for Nuclear Research (CERN), a process we describe here. We discuss how the new infrastructure for PanDA, which relies heavily on services provided by CERN IT, was introduced in order to make the service as reliable as possible and to allow it to be scaled to ATLAS's increasing need for distributed computing. The migration involved changing the backend database for PanDA from MySQL to Oracle, which impacted upon the database schemas. The process by which the client code was optimised for the new database backend is discussed. We describe the procedure by which the new database infrastructure was tested and commissioned for production use. Operations during the migration had to be planned carefully to minimise disruption to ongoing ATLAS offline computing. All parts of the migration were fully tested before commissioning the new infrastructure and the gradual migration of computing resources to the new system allowed any problems of scaling to be addressed.

  4. Syndrome source coding and its universal generalization

    NASA Technical Reports Server (NTRS)

    Ancheta, T. C., Jr.

    1975-01-01

    A method of using error-correcting codes to obtain data compression, called syndrome source coding, is described in which the source sequence is treated as an error pattern whose syndrome forms the compressed data. It is shown that syndrome source coding can achieve arbitrarily small distortion with the number of compressed digits per source digit arbitrarily close to the entropy of a binary memoryless source. A universal generalization of syndrome source coding is formulated which provides robustly effective, distortionless coding of source ensembles.
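A minimal sketch of the idea, using the parity-check matrix of the (7,4) Hamming code: a sparse 7-bit source block (weight at most 1) is treated as an error pattern and compressed losslessly to its 3-bit syndrome. The matrix and helper names below are illustrative, not from the paper:

```python
import numpy as np

# Parity-check matrix of the (7,4) Hamming code; column j is the binary
# representation of j+1, so a 3-bit syndrome pinpoints any single 1-bit.
H = np.array([[1, 0, 1, 0, 1, 0, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])

def compress(source_block):
    """Treat the 7-bit source block as an 'error pattern' and keep only
    its syndrome: 7 source bits -> 3 compressed bits."""
    return H @ source_block % 2

def decompress(syndrome):
    """Recover the unique weight-<=1 source block with this syndrome."""
    block = np.zeros(7, dtype=int)
    pos = int(syndrome[0] + 2 * syndrome[1] + 4 * syndrome[2])  # 1-based position
    if pos:
        block[pos - 1] = 1
    return block
```

The mapping is lossless only for blocks sparse enough to be correctable error patterns of the chosen code, which is exactly why the achievable rate tracks the source entropy for a sufficiently biased binary source.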

  5. A (72, 36; 15) box code

    NASA Technical Reports Server (NTRS)

    Solomon, G.

    1993-01-01

    A (72,36;15) box code is constructed as a 9 x 8 matrix whose columns add to form an extended BCH-Hamming (8,4;4) code and whose rows sum to odd or even parity. The newly constructed code, due to its matrix form, is easily decodable for all seven-error and many eight-error patterns. The code comes from a slight modification in the parity (eighth) dimension of the Reed-Solomon (8,4;5) code over GF(512). Error correction uses the row sum parity information to detect errors, which then become erasures in a Reed-Solomon correction algorithm.
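The column constraint of the box code can be checked by brute force: the extended (8,4;4) Hamming code has 16 codewords and minimum distance 4. This sketch verifies only the column code, not the full 9 x 8 construction:

```python
import itertools

def hamming74_encode(d):
    """Systematic (7,4) Hamming encoder with the standard parity equations."""
    d1, d2, d3, d4 = d
    p1 = (d1 + d2 + d4) % 2
    p2 = (d1 + d3 + d4) % 2
    p3 = (d2 + d3 + d4) % 2
    return [d1, d2, d3, d4, p1, p2, p3]

def extended_hamming84(d):
    """Append an overall parity bit, extending distance 3 to 4: (8,4;4)."""
    c = hamming74_encode(d)
    return c + [sum(c) % 2]

codewords = [extended_hamming84(d) for d in itertools.product([0, 1], repeat=4)]
min_dist = min(sum(a != b for a, b in zip(u, v))
               for u in codewords for v in codewords if u != v)
```

Every extended codeword has even weight, which is the row-parity property the 9 x 8 matrix construction exploits.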

  6. Optimal Codes for the Burst Erasure Channel

    NASA Technical Reports Server (NTRS)

    Hamkins, Jon

    2010-01-01

    Deep space communications over noisy channels lead to certain packets that are not decodable. These packets leave gaps, or bursts of erasures, in the data stream. Burst erasure correcting codes overcome this problem. These are forward erasure correcting codes that allow one to recover the missing gaps of data. Much of the recent work on this topic concentrated on Low-Density Parity-Check (LDPC) codes. These are more complicated to encode and decode than Single Parity Check (SPC) codes or Reed-Solomon (RS) codes, and so far have not been able to achieve the theoretical limit for burst erasure protection. A block interleaved maximum distance separable (MDS) code (e.g., an SPC or RS code) offers near-optimal burst erasure protection, in the sense that no other scheme of equal total transmission length and code rate could improve the guaranteed correctible burst erasure length by more than one symbol. The optimality does not depend on the length of the code, i.e., a short MDS code block interleaved to a given length would perform as well as a longer MDS code interleaved to the same overall length. As a result, this approach offers lower decoding complexity with better burst erasure protection compared to other recent designs for the burst erasure channel (e.g., LDPC codes). A limitation of the design is its lack of robustness to channels that have impairments other than burst erasures (e.g., additive white Gaussian noise), making its application best suited for correcting data erasures in layers above the physical layer. The efficiency of a burst erasure code is the length of its burst erasure correction capability divided by the theoretical upper limit on this length. The inefficiency is one minus the efficiency. 
The illustration compares the inefficiency of interleaved RS codes to Quasi-Cyclic (QC) LDPC codes, Euclidean Geometry (EG) LDPC codes, extended Irregular Repeat Accumulate (eIRA) codes, array codes, and random LDPC codes previously proposed for burst erasure
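The block-interleaving idea can be sketched with the simplest MDS code, a single parity check: interleaving `depth` SPC codewords column by column spreads a burst of up to `depth` consecutive erasures so that each codeword sees at most one, which parity alone can repair. All names and parameters below are illustrative:

```python
def spc_encode(data):
    """Single-parity-check (SPC) codeword: data bits plus one even-parity bit."""
    return data + [sum(data) % 2]

def interleave(codewords):
    """Block-interleave: transmit column by column, so a burst of up to
    `depth` consecutive symbols touches each codeword at most once."""
    n = len(codewords[0])
    return [cw[i] for i in range(n) for cw in codewords]

def recover_burst(stream, n, depth, burst_start, burst_len):
    """Erase a burst, deinterleave, and repair each codeword's single
    erasure from its even-parity constraint."""
    rx = list(stream)
    for i in range(burst_start, burst_start + burst_len):
        rx[i] = None  # erased symbol
    rows = [[rx[i * depth + r] for i in range(n)] for r in range(depth)]
    for row in rows:
        if None in row:
            j = row.index(None)
            row[j] = sum(b for b in row if b is not None) % 2
    return rows

codewords = [spc_encode([1, 0, 1, 1]),
             spc_encode([0, 1, 1, 0]),
             spc_encode([1, 1, 1, 1])]
stream = interleave(codewords)
repaired = recover_burst(stream, n=5, depth=3, burst_start=4, burst_len=3)
```

With an MDS code correcting e erasures per codeword, the same layout protects a burst of length e times the interleaving depth, which is the near-optimality argument made above.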

  7. National Agenda for Hydrogen Codes and Standards

    SciTech Connect

    Blake, C.

    2010-05-01

    This paper provides an overview of hydrogen codes and standards with an emphasis on the national effort supported and managed by the U.S. Department of Energy (DOE). With the help and cooperation of standards and model code development organizations, industry, and other interested parties, DOE has established a coordinated national agenda for hydrogen and fuel cell codes and standards. With the adoption of the Research, Development, and Demonstration Roadmap and with its implementation through the Codes and Standards Technical Team, DOE helps strengthen the scientific basis for requirements incorporated in codes and standards that, in turn, will facilitate international market receptivity for hydrogen and fuel cell technologies.

  8. Understanding Mixed Code and Classroom Code-Switching: Myths and Realities

    ERIC Educational Resources Information Center

    Li, David C. S.

    2008-01-01

    Background: Cantonese-English mixed code is ubiquitous in Hong Kong society, and yet using mixed code is widely perceived as improper. This paper presents evidence of mixed code being socially constructed as bad language behavior. In the education domain, an EDB guideline bans mixed code in the classroom. Teachers are encouraged to stick to…

  9. Using Coding Apps to Support Literacy Instruction and Develop Coding Literacy

    ERIC Educational Resources Information Center

    Hutchison, Amy; Nadolny, Larysa; Estapa, Anne

    2016-01-01

    In this article the authors present the concept of Coding Literacy and describe the ways in which coding apps can support the development of Coding Literacy and disciplinary and digital literacy skills. Through detailed examples, we describe how coding apps can be integrated into literacy instruction to support learning of the Common Core English…

  10. 24 CFR 200.926c - Model code provisions for use in partially accepted code jurisdictions.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 24 Housing and Urban Development 2 2014-04-01 2014-04-01 false Model code provisions for use in... Minimum Property Standards § 200.926c Model code provisions for use in partially accepted code jurisdictions. If a lender or other interested party is notified that a State or local building code has...

  11. Quantum error-correcting codes from algebraic geometry codes of Castle type

    NASA Astrophysics Data System (ADS)

    Munuera, Carlos; Tenório, Wanderson; Torres, Fernando

    2016-10-01

    We study algebraic geometry codes producing quantum error-correcting codes by the CSS construction. We pay particular attention to the family of Castle codes. We show that many of the examples known in the literature in fact belong to this family of codes. We systematize these constructions by showing the common theory that underlies all of them.
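The CSS construction used here builds a quantum code from a dual-containing classical code. A minimal check of that precondition, using the [7,4,3] Hamming code (which yields the [[7,1,3]] Steane code, a standard example rather than the Castle codes of this paper):

```python
import numpy as np

# Parity-check matrix of the [7,4,3] Hamming code.
H = np.array([[1, 0, 1, 0, 1, 0, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])

# CSS with C1 = C2 = C requires the dual code (row space of H) to sit
# inside C itself (null space of H), i.e. H @ H.T = 0 (mod 2).
dual_containing = not (H @ H.T % 2).any()

n = H.shape[1]
rank = 3                       # the three rows of H are independent
logical_qubits = n - 2 * rank  # [[7, 1]] -- the Steane code's parameters
```

The same check, applied to the evaluation codes from Castle curves, is what certifies each classical code in the family as a valid CSS ingredient.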

  12. Quantum Codes From Negacyclic Codes over Group Ring (Fq + υFq) G

    NASA Astrophysics Data System (ADS)

    Koroglu, Mehmet E.; Siap, Irfan

    2016-10-01

    In this paper, we determine self dual and self orthogonal codes arising from negacyclic codes over the group ring (Fq + υFq) G. By taking a suitable Gray image of these codes we obtain many good parameter quantum error-correcting codes over Fq .

  13. Myelography CPT Coding Updates: Effects of 4 New Codes and Unintended Consequences.

    PubMed

    Chokshi, F H; Tu, R K; Nicola, G N; Hirsch, J A

    2016-06-01

    The Current Procedural Terminology of the American Medical Association has recently introduced coding changes for myelography with the introduction of new bundled codes. The aim of this review was to help neuroradiologists understand these code changes and their unintended consequences and to discuss various scenarios in which permutations of various codes could occur in clinical practice.

  14. Advanced coding and modulation schemes for TDRSS

    NASA Technical Reports Server (NTRS)

    Harrell, Linda; Kaplan, Ted; Berman, Ted; Chang, Susan

    1993-01-01

    This paper describes the performance of the Ungerboeck and pragmatic 8-Phase Shift Key (PSK) Trellis Code Modulation (TCM) coding techniques with and without a (255,223) Reed-Solomon outer code as they are used for Tracking Data and Relay Satellite System (TDRSS) S-Band and Ku-Band return services. The performance of these codes at high data rates is compared to uncoded Quadrature PSK (QPSK) and rate 1/2 convolutionally coded QPSK in the presence of Radio Frequency Interference (RFI), self-interference, and hardware distortions. This paper shows that the outer Reed-Solomon code is necessary to achieve a 10(exp -5) Bit Error Rate (BER) with an acceptable level of degradation in the presence of RFI. This paper also shows that the TCM codes with or without the Reed-Solomon outer code do not perform well in the presence of self-interference. In fact, the uncoded QPSK signal performs better than the TCM coded signal in the self-interference situation considered in this analysis. Finally, this paper shows that the E(sub b)/N(sub 0) degradation due to TDRSS hardware distortions is approximately 1.3 dB with a TCM coded signal or a rate 1/2 convolutionally coded QPSK signal and is 3.2 dB with an uncoded QPSK signal.

  15. The Astrophysics Source Code Library: An Update

    NASA Astrophysics Data System (ADS)

    Allen, Alice; Nemiroff, R. J.; Shamir, L.; Teuben, P. J.

    2012-01-01

    The Astrophysics Source Code Library (ASCL), founded in 1999, takes an active approach to sharing astrophysical source code. ASCL's editor seeks out both new and old peer-reviewed papers that describe methods or experiments that involve the development or use of source code, and adds entries for the found codes to the library. This approach ensures that source codes are added without requiring authors to actively submit them, resulting in a comprehensive listing that covers a significant number of the astrophysics source codes used in peer-reviewed studies. The ASCL moved to a new location in 2010; it now holds over 300 codes and continues to grow. In 2011, the ASCL (http://asterisk.apod.com/viewforum.php?f=35) has added an average of 19 new codes per month; we encourage scientists to submit their codes for inclusion. An advisory committee has been established to provide input and guide the development and expansion of the new site, and a marketing plan has been developed and is being executed. All ASCL source codes have been used to generate results published in or submitted to a refereed journal and are freely available either via a download site or from an identified source. This presentation covers the history of the ASCL, examines its current state and benefits, describes the means of and requirements for including codes, and outlines future plans.

  16. On the design of turbo codes

    NASA Technical Reports Server (NTRS)

    Divsalar, D.; Pollara, F.

    1995-01-01

    In this article, we design new turbo codes that can achieve near-Shannon-limit performance. The design criterion for random interleavers is based on maximizing the effective free distance of the turbo code, i.e., the minimum output weight of codewords due to weight-2 input sequences. An upper bound on the effective free distance of a turbo code is derived. This upper bound can be achieved if the feedback connection of convolutional codes uses primitive polynomials. We review multiple turbo codes (parallel concatenation of q convolutional codes), which increase the so-called 'interleaving gain' as q and the interleaver size increase, and a suitable decoder structure derived from an approximation to the maximum a posteriori probability decision rule. We develop new rate 1/3, 2/3, 3/4, and 4/5 constituent codes to be used in the turbo encoder structure. These codes, with 2 to 32 states, are designed by using primitive polynomials. The resulting turbo codes have rates b/n (b = 1, 2, 3, 4 and n = 2, 3, 4, 5, 6), and include random interleavers for better asymptotic performance. These codes are suitable for deep-space communications with low throughput and for near-Earth communications where high throughput is desirable. The performance of these codes is within 1 dB of the Shannon limit at a bit-error rate of 10(exp -6) for throughputs from 1/15 up to 4 bits/s/Hz.
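The design criterion described here, the effective free distance, can be computed by brute force for a small constituent code. This sketch uses the classic (7,5)-octal recursive systematic encoder as an illustrative stand-in for the codes designed in the article:

```python
from itertools import combinations

def rsc_encode(bits):
    """Rate-1/2 recursive systematic convolutional encoder with primitive
    feedback polynomial 1+D+D^2 and feedforward 1+D^2 (the (7,5)-octal code).
    Returns the parity stream and the final register state."""
    s1 = s2 = 0
    parity = []
    for u in bits:
        fb = u ^ s1 ^ s2          # feedback taps 1 + D + D^2
        parity.append(fb ^ s2)    # feedforward taps 1 + D^2
        s1, s2 = fb, s1
    return parity, (s1, s2)

def effective_free_distance(max_len=12):
    """Minimum codeword weight (systematic + parity) over weight-2 inputs
    that drive the encoder back to the all-zero state."""
    best = None
    for i, j in combinations(range(max_len), 2):
        bits = [1 if k in (i, j) else 0 for k in range(max_len)]
        parity, state = rsc_encode(bits)
        if state == (0, 0):
            w = 2 + sum(parity)
            best = w if best is None else min(best, w)
    return best
```

Because the feedback polynomial is primitive, only weight-2 inputs whose separation is a multiple of the register period terminate the trellis, and the shortest such separation gives the minimum parity weight.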

  17. Maximal dinucleotide comma-free codes.

    PubMed

    Fimmel, Elena; Strüngmann, Lutz

    2016-01-21

    The problem of retrieval and maintenance of the correct reading frame plays a significant role in RNA transcription. Circular codes, and especially comma-free codes, can help to understand the underlying mechanisms of error-detection in this process. In recent years much attention has been paid to the investigation of trinucleotide circular codes (see, for instance, Fimmel et al., 2014; Fimmel and Strüngmann, 2015a; Michel and Pirillo, 2012; Michel et al., 2012, 2008), while dinucleotide codes had been touched on only marginally, even though dinucleotides are associated to important biological functions. Recently, all maximal dinucleotide circular codes were classified (Fimmel et al., 2015; Michel and Pirillo, 2013). The present paper studies maximal dinucleotide comma-free codes and their close connection to maximal dinucleotide circular codes. We give a construction principle for such codes and provide a graphical representation that allows them to be visualized geometrically. Moreover, we compare the results for dinucleotide codes with the corresponding situation for trinucleotide maximal self-complementary C(3)-codes. Finally, the results obtained are discussed with respect to Crick's hypothesis about frame-shift-detecting codes without commas.
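The comma-free property for dinucleotide codes can be tested directly: no codeword may appear straddling the boundary of two concatenated codewords. A minimal sketch (the example codes below are illustrative, not the maximal codes classified in the paper):

```python
def is_comma_free(code):
    """A dinucleotide code is comma-free if no codeword ever appears
    across the boundary of two concatenated codewords: for all x1x2 and
    y1y2 in the code, the straddling word x2y1 must not be in the code."""
    code = set(code)
    return all(x[1] + y[0] not in code for x in code for y in code)
```

Note that a single identical-letter word such as AA already violates the property (it overlaps itself), which is one reason such words are excluded from circular codes as well.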

  18. Advanced coding and modulation schemes for TDRSS

    NASA Astrophysics Data System (ADS)

    Harrell, Linda; Kaplan, Ted; Berman, Ted; Chang, Susan

    1993-11-01

    This paper describes the performance of the Ungerboeck and pragmatic 8-Phase Shift Key (PSK) Trellis Code Modulation (TCM) coding techniques with and without a (255,223) Reed-Solomon outer code as they are used for Tracking Data and Relay Satellite System (TDRSS) S-Band and Ku-Band return services. The performance of these codes at high data rates is compared to uncoded Quadrature PSK (QPSK) and rate 1/2 convolutionally coded QPSK in the presence of Radio Frequency Interference (RFI), self-interference, and hardware distortions. This paper shows that the outer Reed-Solomon code is necessary to achieve a 10(exp -5) Bit Error Rate (BER) with an acceptable level of degradation in the presence of RFI. This paper also shows that the TCM codes with or without the Reed-Solomon outer code do not perform well in the presence of self-interference. In fact, the uncoded QPSK signal performs better than the TCM coded signal in the self-interference situation considered in this analysis. Finally, this paper shows that the E(sub b)/N(sub 0) degradation due to TDRSS hardware distortions is approximately 1.3 dB with a TCM coded signal or a rate 1/2 convolutionally coded QPSK signal and is 3.2 dB with an uncoded QPSK signal.

  19. Interval coding. II. Dendrite-dependent mechanisms.

    PubMed

    Doiron, Brent; Oswald, Anne-Marie M; Maler, Leonard

    2007-04-01

    The rich temporal structure of neural spike trains provides multiple dimensions to code dynamic stimuli. Popular examples are spike trains from sensory cells where bursts and isolated spikes can serve distinct coding roles. In contrast to analyses of neural coding, the cellular mechanics of burst mechanisms are typically elucidated from the neural response to static input. Bridging the mechanics of bursting with coding of dynamic stimuli is an important step in establishing theories of neural coding. Electrosensory lateral line lobe (ELL) pyramidal neurons respond to static inputs with a complex dendrite-dependent burst mechanism. Here we show that in response to dynamic broadband stimuli, these bursts lack some of the electrophysiological characteristics observed in response to static inputs. A simple leaky integrate-and-fire (LIF)-style model with a dendrite-dependent depolarizing afterpotential (DAP) is sufficient to match both the output statistics and coding performance of experimental spike trains. We use this model to investigate a simplification of interval coding where the burst interspike interval (ISI) codes for the scale of a canonical upstroke rather than a multidimensional stimulus feature. Using this stimulus reduction, we compute a quantization of the burst ISIs and the upstroke scale to show that the mutual information rate of the interval code is maximized at a moderate DAP amplitude. The combination of a reduced description of ELL pyramidal cell bursting and a simplification of the interval code increases the generality of ELL burst codes to other sensory modalities.
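The LIF-style model with a depolarizing afterpotential described here can be sketched in a few lines; all parameters below are illustrative and not fitted to ELL pyramidal cell data:

```python
import numpy as np

def lif_dap(current, dt=0.1, tau_m=10.0, v_thresh=1.0,
            dap_amp=0.5, tau_dap=5.0):
    """Leaky integrate-and-fire neuron augmented with a depolarizing
    afterpotential (DAP): each spike injects a decaying depolarizing
    current that pulls the next spike closer, producing burst-like
    shortening of interspike intervals."""
    v = 0.0
    dap = 0.0
    spike_times = []
    for t, i_ext in enumerate(current):
        dap -= dt * dap / tau_dap                 # DAP decays exponentially
        v += dt * (-v + i_ext + dap) / tau_m      # leaky integration
        if v >= v_thresh:                         # threshold crossing
            spike_times.append(t * dt)
            v = 0.0                               # reset
            dap += dap_amp                        # afterpotential follows the spike
    return spike_times

spikes = lif_dap(np.full(2000, 1.5))
```

With a constant suprathreshold drive, the DAP makes each interspike interval after the first no longer than the one before it, the signature used above to link the burst ISI to the stimulus upstroke.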

  20. Multiplexed coding in the human basal ganglia

    NASA Astrophysics Data System (ADS)

    Andres, D. S.; Cerquetti, D.; Merello, M.

    2016-04-01

    A classic controversy in neuroscience is whether information carried by spike trains is encoded by a time averaged measure (e.g. a rate code), or by complex time patterns (i.e. a time code). Here we apply a tool to quantitatively analyze the neural code. We make use of an algorithm based on the calculation of the temporal structure function, which makes it possible to distinguish which scales of a signal are dominated by a complex temporal organization or a randomly generated process. In terms of the neural code, this kind of analysis makes it possible to detect temporal scales at which a time patterns coding scheme or alternatively a rate code are present. Additionally, by finding the temporal scale at which the correlation between interspike intervals fades, the length of the basic information unit of the code can be established, and hence the word length of the code can be found. We apply this algorithm to neuronal recordings obtained from the Globus Pallidus pars interna from a human patient with Parkinson’s disease, and show that a time pattern coding and a rate coding scheme co-exist at different temporal scales, offering a new example of multiplexed neuronal coding.
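The second-order temporal structure function at the heart of this analysis can be sketched as follows; applied to an interspike-interval series, a flat S(k) indicates an uncorrelated (rate-code-like) process, while scale-dependent growth reveals temporal patterning. The implementation details here are assumptions, not the authors' code:

```python
import numpy as np

def structure_function(signal, max_lag):
    """Second-order temporal structure function S(k) = <(x[t+k] - x[t])^2>.
    White noise gives a flat S(k); temporally structured signals show
    growth with k until the correlation between samples fades."""
    x = np.asarray(signal, dtype=float)
    return np.array([np.mean((x[k:] - x[:-k]) ** 2)
                     for k in range(1, max_lag + 1)])
```

For example, white noise of unit variance gives S(k) close to 2 at every lag, while a random walk gives S(k) growing linearly in k, which is the kind of scale separation used to read off the code's word length.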

  1. Breeding quantum error-correcting codes

    SciTech Connect

    Dong Ying; Hu Dan; Yu Sixia

    2010-02-15

    The stabilizer code, one major family of quantum error-correcting codes (QECC), is specified by the joint eigenspace of a commuting set of Pauli observables. It turns out that noncommuting sets of Pauli observables can be used to construct more efficient QECCs, such as the entanglement-assisted QECCs, which are built directly from any linear classical codes whose detailed properties are needed to determine the parameters of the resulting quantum codes. Here we propose another family of QECCs, namely, the breeding QECCs, that also employ noncommuting sets of Pauli observables and can be built from any classical additive codes, either linear or nonlinear, with the advantage that their parameters can be read off directly from the corresponding classical codes. Besides, since nonlinear codes are generally more efficient than linear codes, our breeding codes have better parameters than those codes built from linear codes. The terminology is justified by the fact that our QECCs are related to the ordinary QECCs in exactly the same way that the breeding protocols are related to the hashing protocols in the entanglement purification.

  2. Foliated Quantum Error-Correcting Codes

    NASA Astrophysics Data System (ADS)

    Bolt, A.; Duclos-Cianci, G.; Poulin, D.; Stace, T. M.

    2016-08-01

    We show how to construct a large class of quantum error-correcting codes, known as Calderbank-Steane-Shor codes, from highly entangled cluster states. This becomes a primitive in a protocol that foliates a series of such cluster states into a much larger cluster state, implementing foliated quantum error correction. We exemplify this construction with several familiar quantum error-correction codes and propose a generic method for decoding foliated codes. We numerically evaluate the error-correction performance of a family of finite-rate Calderbank-Steane-Shor codes known as turbo codes, finding that they perform well over moderate depth foliations. Foliated codes have applications for quantum repeaters and fault-tolerant measurement-based quantum computation.

  3. From Verified Models to Verifiable Code

    NASA Technical Reports Server (NTRS)

    Lensink, Leonard; Munoz, Cesar A.; Goodloe, Alwyn E.

    2009-01-01

    Declarative specifications of digital systems often contain parts that can be automatically translated into executable code. Automated code generation may reduce or eliminate the kinds of errors typically introduced through manual code writing. For this approach to be effective, the generated code should be reasonably efficient and, more importantly, verifiable. This paper presents a prototype code generator for the Prototype Verification System (PVS) that translates a subset of PVS functional specifications into an intermediate language and subsequently to multiple target programming languages. Several case studies are presented to illustrate the tool's functionality. The generated code can be analyzed by software verification tools such as verification condition generators, static analyzers, and software model-checkers to increase the confidence that the generated code is correct.

  4. Performance of convolutionally coded unbalanced QPSK systems

    NASA Technical Reports Server (NTRS)

    Divsalar, D.; Yuen, J. H.

    1980-01-01

    An evaluation is presented of the performance of three representative convolutionally coded unbalanced quadri-phase-shift-keying (UQPSK) systems in the presence of noisy carrier reference and crosstalk. The use of a coded UQPSK system for transmitting two telemetry data streams with different rates and different powers has been proposed for the Venus Orbiting Imaging Radar mission. Analytical expressions for bit error rates in the presence of a noisy carrier phase reference are derived for three representative cases: (1) I and Q channels are coded independently; (2) I channel is coded, Q channel is uncoded; and (3) I and Q channels are coded by a common rate 1/2 code. For rate 1/2 convolutional codes, QPSK modulation can be used to reduce the bandwidth requirement.
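The effect of a noisy carrier reference can be sketched with the standard Gaussian Q-function: a static phase error shrinks the effective signal amplitude by its cosine. This is the textbook uncoded conditional-BER expression, not the coded-system analysis derived in the paper, and the function names are illustrative:

```python
import math

def q_func(x):
    """Gaussian tail probability Q(x)."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def bpsk_ber_with_phase_error(ebno_db, phase_err_rad=0.0):
    """Conditional bit error rate of a coherent BPSK/QPSK channel when the
    recovered carrier is offset by a static phase error: the effective
    amplitude is scaled by cos(phase error)."""
    ebno = 10.0 ** (ebno_db / 10.0)
    return q_func(math.sqrt(2.0 * ebno) * math.cos(phase_err_rad))
```

Averaging this conditional expression over the phase-error density of the carrier tracking loop is the basic step behind the analytical BER expressions mentioned above.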

  5. Efficient Polar Coding of Quantum Information

    NASA Astrophysics Data System (ADS)

    Renes, Joseph M.; Dupuis, Frédéric; Renner, Renato

    2012-08-01

    Polar coding, introduced in 2008 by Arıkan, is the first (very) efficiently encodable and decodable coding scheme whose information transmission rate provably achieves the Shannon bound for classical discrete memoryless channels in the asymptotic limit of large block sizes. Here, we study the use of polar codes for the transmission of quantum information. Focusing on the case of qubit Pauli channels and qubit erasure channels, we use classical polar codes to construct a coding scheme that asymptotically achieves a net transmission rate equal to the coherent information using efficient encoding and decoding operations and code construction. Our codes generally require preshared entanglement between sender and receiver, but for channels with a sufficiently low noise level we demonstrate that the rate of preshared entanglement required is zero.
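For erasure channels, the polarization phenomenon underlying polar codes can be demonstrated with the classical erasure-probability recursion (a standard textbook computation, not the quantum construction of this paper):

```python
def polarize(erasure_prob, levels):
    """Channel polarization for the binary erasure channel: one step maps
    a channel with erasure rate z into a worse synthetic channel
    (2z - z^2) and a better one (z^2). After enough levels almost every
    synthetic channel is either near-perfect or useless, while the
    average erasure rate is preserved exactly."""
    channels = [erasure_prob]
    for _ in range(levels):
        channels = [f(z) for z in channels
                    for f in (lambda z: 2 * z - z * z, lambda z: z * z)]
    return channels
```

Information is then sent only over the near-perfect synthetic channels, whose fraction approaches the channel capacity; this is the classical mechanism the paper lifts to qubit Pauli and erasure channels.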

  6. Foliated Quantum Error-Correcting Codes.

    PubMed

    Bolt, A; Duclos-Cianci, G; Poulin, D; Stace, T M

    2016-08-12

    We show how to construct a large class of quantum error-correcting codes, known as Calderbank-Steane-Shor codes, from highly entangled cluster states. This becomes a primitive in a protocol that foliates a series of such cluster states into a much larger cluster state, implementing foliated quantum error correction. We exemplify this construction with several familiar quantum error-correction codes and propose a generic method for decoding foliated codes. We numerically evaluate the error-correction performance of a family of finite-rate Calderbank-Steane-Shor codes known as turbo codes, finding that they perform well over moderate depth foliations. Foliated codes have applications for quantum repeaters and fault-tolerant measurement-based quantum computation.

  7. Efficiency of a model human image code

    NASA Technical Reports Server (NTRS)

    Watson, Andrew B.

    1987-01-01

    Hypothetical schemes for neural representation of visual information can be expressed as explicit image codes. Here, a code modeled on the simple cells of the primate striate cortex is explored. The Cortex transform maps a digital image into a set of subimages (layers) that are bandpass in spatial frequency and orientation. The layers are sampled so as to minimize the number of samples and still avoid aliasing. Samples are quantized in a manner that exploits the bandpass contrast-masking properties of human vision. The entropy of the samples is computed to provide a lower bound on the code size. Finally, the image is reconstructed from the code. Psychophysical methods are derived for comparing the original and reconstructed images to evaluate the sufficiency of the code. When each resolution is coded at the threshold for detection of artifacts, the image-code size is about 1 bit/pixel.
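The entropy lower bound on code size mentioned above is a plain Shannon entropy over the quantized samples; a minimal sketch:

```python
import math
from collections import Counter

def entropy_bits(samples):
    """Shannon entropy in bits per sample of a sequence of quantized
    code samples -- a lower bound on the achievable code size."""
    counts = Counter(samples)
    n = len(samples)
    return -sum(c / n * math.log2(c / n) for c in counts.values())
```

Summing this per-sample bound over all layers of the transform, divided by the pixel count, yields the bits-per-pixel figure quoted in the abstract.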

  8. Turbo codes for deep-space communications

    NASA Technical Reports Server (NTRS)

    Divsalar, D.; Pollara, F.

    1995-01-01

    Turbo codes were recently proposed by Berrou, Glavieux, and Thitimajshima, and it has been claimed these codes achieve near-Shannon-limit error correction performance with relatively simple component codes and large interleavers. A required E(b)/N(o) of 0.7 dB was reported for a bit error rate of 10(exp -5), using a rate 1/2 turbo code. However, some important details that are necessary to reproduce these results were omitted. This article confirms the accuracy of these claims, and presents a complete description of an encoder/decoder pair that could be suitable for deep-space applications, where lower rate codes can be used. We describe a new simple method for trellis termination, analyze the effect of interleaver choice on the weight distribution of the code, and introduce the use of unequal rate component codes, which yield better performance.

  9. Verification tests for contaminant transport codes

    SciTech Connect

    Rowe, R.K.; Nadarajah, P.

    1996-12-31

    The importance of verifying contaminant transport codes and the techniques that may be used in this verification process are discussed. Commonly used contaminant transport codes are characterized as belonging to one of several types or classes of solution, such as analytic, finite layer, boundary element, finite difference and finite element. Both the level of approximation and the solution methodology should be verified for each contaminant transport code. One powerful method that may be used in contaminant transport code verification is cross-checking (benchmarking) with other codes. This technique is used to check the results of codes from one solution class with the results of codes from another solution class. In this paper cross-checking is performed for three classes of solution: analytic, finite layer, and finite element.

  10. Evaluation of help model replacement codes

    SciTech Connect

    Whiteside, Tad; Hang, Thong; Flach, Gregory

    2009-07-01

    This work evaluates the computer codes that are proposed to be used to predict percolation of water through the closure-cap and into the waste containment zone at the Department of Energy closure sites. This work compares the currently used water-balance code (HELP) with newly developed computer codes that use unsaturated flow (Richards’ equation). It provides a literature review of the HELP model and the proposed codes, resulting in two codes recommended for further evaluation: HYDRUS-2D3D and VADOSE/W. This further evaluation involved performing actual simulations on a simple model and comparing the results of those simulations to those obtained with the HELP code and the field data. From the results of this work, we conclude that the new codes perform nearly the same, although moving forward, we recommend HYDRUS-2D3D.

  11. Genetic code, hamming distance and stochastic matrices.

    PubMed

    He, Matthew X; Petoukhov, Sergei V; Ricci, Paolo E

    2004-09-01

    In this paper we use the Gray code representation of the genetic code C=00, U=10, G=11 and A=01 (C pairs with G, A pairs with U) to generate a sequence of genetic code-based matrices. In connection with these code-based matrices, we use the Hamming distance to generate a sequence of numerical matrices. We then further investigate the properties of the numerical matrices and show that they are doubly stochastic and symmetric. We determine the frequency distributions of the Hamming distances, building blocks of the matrices, decomposition and iterations of matrices. We present an explicit decomposition formula for the genetic code-based matrix in terms of permutation matrices, which provides a hypercube representation of the genetic code. It is also observed that there is a Hamiltonian cycle in a genetic code-based hypercube.
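The construction can be reproduced directly: build the Hamming-distance matrix over the Gray-coded n-mers and check that it is symmetric with constant row and column sums, hence a scalar multiple of a doubly stochastic matrix. A minimal sketch:

```python
from itertools import product

GRAY = {"C": "00", "U": "10", "G": "11", "A": "01"}

def hamming(a, b):
    """Number of positions where two equal-length bit strings differ."""
    return sum(x != y for x, y in zip(a, b))

def code_matrix(n):
    """Matrix of Hamming distances between the Gray-code bit strings of
    all length-n oligonucleotides (C=00, U=10, G=11, A=01)."""
    mers = ["".join(m) for m in product("CUGA", repeat=n)]
    bits = ["".join(GRAY[c] for c in m) for m in mers]
    return [[hamming(a, b) for b in bits] for a in bits]

M = code_matrix(2)  # 16 x 16 matrix over the dinucleotides
row_sums = [sum(row) for row in M]
col_sums = [sum(col) for col in zip(*M)]
```

For n = 2 every row and column sums to 32 (each of the 4 bit positions differs in exactly half of the 16 strings), so dividing M by 32 gives a symmetric doubly stochastic matrix, matching the paper's observation.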

  12. New asymmetric quantum codes over Fq

    NASA Astrophysics Data System (ADS)

    Ma, Yuena; Feng, Xiaoyi; Xu, Gen

    2016-07-01

Two families of new asymmetric quantum codes are constructed in this paper. The first family is the asymmetric quantum codes with length n=q^m-1 over Fq, where q ≥ 5 is a prime power. The second one is the asymmetric quantum codes with length n=3^m-1. These asymmetric quantum codes are derived from the CSS construction and pairs of nested BCH codes. Moreover, letting the defining set T1=T2^{-q}, the real Z-distance of our asymmetric quantum codes is much larger than δ_max+1, where δ_max is the maximal designed distance of the dual-containing narrow-sense BCH code, and the parameters presented here are better than those available in the literature.

  13. CFD Code Development for Combustor Flows

    NASA Technical Reports Server (NTRS)

    Norris, Andrew

    2003-01-01

During the lifetime of this grant, work has been performed in the areas of model development, code development, code validation, and code application. Model development included the PDF combustion module, chemical kinetics based on thermodynamics, neural network storage of chemical kinetics, ILDM chemical kinetics, and assumed-PDF work. Many of these models were then implemented in the code, and many improvements were made to the code, including the addition of new chemistry integrators, property evaluation schemes, new chemistry models, and turbulence-chemistry interaction methodology. Validation of all new models and code improvements was also performed, and the code was applied to the ZCET program and the NPSS GEW combustor program. Several important items remain under development, including NOx post-processing, assumed-PDF model development, and chemical kinetics development. It is expected that this work will continue under the new grant.

  14. User's manual for Axisymmetric Diffuser Duct (ADD) code. Volume 3: ADD code coordinate generator

    NASA Technical Reports Server (NTRS)

    Anderson, O. L.; Hankins, G. B., Jr.; Edwards, D. E.

    1982-01-01

This User's Manual contains a complete description of the computer code known as the Axisymmetric Diffuser Duct (ADD) code. It includes a list of references that describe the formulation of the ADD code and comparisons of calculations with experimental flows. The input/output and general use of the code are described in the first volume. The second volume contains a detailed description of the code, including its global structure, a list of FORTRAN variables, and descriptions of the subroutines. The third volume contains a detailed description of the CODUCT code, which generates coordinate systems for arbitrary axisymmetric ducts.

  15. On the distance of stabilizer quantum codes from J-affine variety codes

    NASA Astrophysics Data System (ADS)

    Galindo, Carlos; Geil, Olav; Hernando, Fernando; Ruano, Diego

    2017-04-01

    Self-orthogonal J-affine variety codes have been successfully used to obtain quantum stabilizer codes with excellent parameters. In a previous paper we gave formulae for the dimension of this family of quantum codes, but no bound for the minimum distance was given. In this work, we show how to derive quantum stabilizer codes with designed minimum distance from J-affine variety codes and their subfield-subcodes. Moreover, this allows us to obtain new quantum codes, some of them either with better parameters, or with larger distances than the previously known codes.

  16. Interface requirements to couple thermal-hydraulic codes to 3D neutronic codes

    SciTech Connect

    Langenbuch, S.; Austregesilo, H.; Velkov, K.

    1997-07-01

The present situation of thermal-hydraulic codes and 3D neutronics codes is briefly described, and general considerations for coupling these codes are discussed. Two different basic approaches to coupling are identified and their relative advantages and disadvantages are discussed. The implementation of the coupling for 3D neutronics codes in the system ATHLET is presented. Meanwhile, this interface is used for coupling three different 3D neutronics codes.

  17. Soft Decoding of Integer Codes and Their Application to Coded Modulation

    NASA Astrophysics Data System (ADS)

    Kostadinov, Hristo; Morita, Hiroyoshi; Iijima, Noboru; Han Vinck, A. J.; Manev, Nikolai

Integer codes are very flexible and can be applied in different modulation schemes. A soft decoding algorithm for integer codes is introduced. Comparison of symbol error probability (SEP) versus signal-to-noise ratio (SNR) between soft and hard decoding using integer coded modulation shows that we can obtain at least 2 dB of coding gain. We also compare our results with trellis coded modulation (TCM) because of their similar decoding schemes and complexity.

  18. Painting Victory: A Discussion of Leadership and Its Fundamental Principles.

    DTIC Science & Technology

    2007-11-02

    accordingly is instrumental to the art of leading. Consider how Leonardo da Vinci and Pablo Picasso perceive the human form in their own different and...SARDANAPALUS. 60 6. LEONARDO DA VINCI’S VIRGIN OF THE ROCKS AND PABLO PICASSO’S PORTRAIT OF UHNE 63 7. THEO VAN DOESBURG’ S SKETCHES AND...In contrast, a leader who is less confident with details may not create accomplishments as vivid. Leonardo Da Vinci’s vivid style and realism

  19. Single-cycle Pulse Synthesis by Coherent Superposition of Ultra-broadband Optical Parametric Amplifiers

    DTIC Science & Technology

    2011-08-01

    Giulio Cerullo Politecnico di Milano Department of Physics Piazza Leonardo da Vinci 32 Milano, Italy 20133 EOARD GRANT 09-3101...UNIT NUMBER 7. PERFORMING ORGANIZATION NAME(S) AND ADDRESS(ES) Politecnico di Milano Department of Physics Piazza Leonardo da Vinci 32 Milano...Milano, Piazza L. Da Vinci 32, 20133 Milano, Italy, 4DESY-Center for Free-Electron Laser Science and Hamburg University, Notkestraße 85, D-22607 Hamburg

  20. Canonical microcircuits for predictive coding

    PubMed Central

    Bastos, Andre M.; Usrey, W. Martin; Adams, Rick A.; Mangun, George R.; Fries, Pascal; Friston, Karl J.

    2013-01-01

This review considers the influential notion of a canonical (cortical) microcircuit in light of recent theories about neuronal processing. Specifically, we reconcile quantitative studies of microcircuitry and the functional logic of neuronal computations. We revisit the established idea that message passing among hierarchical cortical areas implements a form of Bayesian inference – paying careful attention to the implications for intrinsic connections among neuronal populations. By deriving canonical forms for these computations, one can associate specific neuronal populations with specific computational roles. This analysis discloses a remarkable correspondence between the microcircuitry of the cortical column and the connectivity implied by predictive coding. Furthermore, it provides some intuitive insights into the functional asymmetries between feedforward and feedback connections and the characteristic frequencies over which they operate. PMID:23177956

  1. Numerical classification of coding sequences

    NASA Technical Reports Server (NTRS)

    Collins, D. W.; Liu, C. C.; Jukes, T. H.

    1992-01-01

    DNA sequences coding for protein may be represented by counts of nucleotides or codons. A complete reading frame may be abbreviated by its base count, e.g. A76C158G121T74, or with the corresponding codon table, e.g. (AAA)0(AAC)1(AAG)9 ... (TTT)0. We propose that these numerical designations be used to augment current methods of sequence annotation. Because base counts and codon tables do not require revision as knowledge of function evolves, they are well-suited to act as cross-references, for example to identify redundant GenBank entries. These descriptors may be compared, in place of DNA sequences, to extract homologous genes from large databases. This approach permits rapid searching with good selectivity.
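The proposed descriptors are easy to compute. A minimal sketch (using a made-up 9-base reading frame, not a sequence from the paper):

```python
from collections import Counter

# Base-count descriptor in the style of A76C158G121T74.
def base_count(seq):
    counts = Counter(seq)
    return "".join(f"{base}{counts[base]}" for base in "ACGT")

# Codon-table descriptor: counts of each codon in the reading frame.
def codon_table(seq):
    codons = [seq[i:i + 3] for i in range(0, len(seq) - len(seq) % 3, 3)]
    return Counter(codons)

seq = "ATGGCGTGA"          # hypothetical 3-codon reading frame
print(base_count(seq))     # base-count abbreviation of the frame
print(dict(codon_table(seq)))
```

Two sequences can then be compared by their descriptors alone, which is the rapid-search property the abstract describes.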

  2. NSCool: Neutron star cooling code

    NASA Astrophysics Data System (ADS)

    Page, Dany

    2016-09-01

NSCool is a 1D (i.e., spherically symmetric) neutron star cooling code written in Fortran 77. The package also contains a series of EOSs (equations of state) to build stars, a series of pre-built stars, and a TOV (Tolman-Oppenheimer-Volkoff) integrator to build stars from an EOS. It can also handle “strange stars” that have a huge density discontinuity between the quark matter and the covering thin baryonic crust. NSCool solves the heat transport and energy balance equations in full general relativity, resulting in a time sequence of temperature profiles (and, in particular, a Teff - age curve). Several heating processes are included, and more can easily be incorporated. In particular it can evolve a star undergoing accretion with the resulting deep crustal heating, under a steady or time-variable accretion rate. NSCool is robust, very fast, and highly modular, making it easy to add new subroutines for new processes.

  3. The CALOR93 code system

    SciTech Connect

    Gabriel, T.A.

    1993-12-31

The purpose of this paper is to describe a program package, CALOR93, that has been developed to design and analyze different detector systems, in particular, calorimeters which are used in high energy physics experiments to determine the energy of particles. One's ability to design a calorimeter to perform a certain task can have a strong influence upon the validity of experimental results. The validity of the results obtained with CALOR93 has been verified many times by comparison with experimental data. The codes (HETC93, SPECT93, LIGHT, EGS4, MORSE, and MICAP) are quite generalized and detailed enough so that any experimental calorimeter setup can be studied. Due to this generalization, some software development is necessary because of the wide diversity of calorimeter designs.

  4. Noiseless coding for the magnetometer

    NASA Technical Reports Server (NTRS)

    Rice, Robert F.; Lee, Jun-Ji

    1987-01-01

    Future unmanned space missions will continue to seek a full understanding of magnetic fields throughout the solar system. Severely constrained data rates during certain portions of these missions could limit the possible science return. This publication investigates the application of universal noiseless coding techniques to more efficiently represent magnetometer data without any loss in data integrity. Performance results indicated that compression factors of 2:1 to 6:1 can be expected. Feasibility for general deep space application was demonstrated by implementing a microprocessor breadboard coder/decoder using the Intel 8086 processor. The Comet Rendezvous Asteroid Flyby mission will incorporate these techniques in a buffer feedback, rate-controlled configuration. The characteristics of this system are discussed.
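A core building block of the universal noiseless coders referred to here is the Golomb-Rice code. As an illustrative sketch only (not the flight implementation, and with made-up sample values), a basic Golomb-Rice encoder/decoder for nonnegative integers with parameter k looks like:

```python
# Golomb-Rice coding with divisor 2**k: each value is split into a
# quotient (sent in unary, terminated by a 0) and a k-bit remainder.
def rice_encode(values, k):
    out = []
    for v in values:
        q, r = v >> k, v & ((1 << k) - 1)
        out.append("1" * q + "0")               # unary quotient
        if k:
            out.append(format(r, f"0{k}b"))     # k-bit remainder
    return "".join(out)

def rice_decode(bits, k):
    values, i = [], 0
    while i < len(bits):
        q = 0
        while bits[i] == "1":                   # read unary quotient
            q, i = q + 1, i + 1
        i += 1                                  # skip the terminating 0
        r = int(bits[i:i + k], 2) if k else 0   # read remainder
        i += k
        values.append((q << k) | r)
    return values

samples = [3, 0, 7, 2, 1, 4]                    # e.g. small mapped residuals
encoded = rice_encode(samples, k=2)
print(len(encoded), "bits vs", 8 * len(samples), "bits raw")
assert rice_decode(encoded, 2) == samples
```

When the data are mostly small values, as with well-predicted magnetometer samples, the encoded stream is much shorter than the raw fixed-width representation, which is where compression factors like those quoted above come from.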

  5. Advanced Code for Photocathode Design

    SciTech Connect

    Ives, Robert Lawrence; Jensen, Kevin; Montgomery, Eric; Bui, Thuc

    2015-12-15

The Phase I activity demonstrated that PhotoQE could be upgraded and modified to allow input through a graphical user interface. Platform-dependent function calls (e.g. IMSL) were removed, and Fortran 77 components were rewritten for Fortran 95 compliance. The subroutines, specifically the common block structures and shared data parameters, were reworked to allow the GUI to update material parameter data, and the system was targeted for desktop personal computer operation. The new structure overcomes the previously rigid and unmodifiable libraries by implementing new materials library data sets and moving the library values to external files. Material data may originate from published literature or experimental measurements. Further optimization and restructuring would allow custom emission models for beam codes that rely on parameterized photoemission algorithms. These would be based on simplified, parametric representations updated and extended from previous versions (e.g., modified Fowler-DuBridge, modified three-step, etc.).

  6. Multichannel Error Correction Code Decoder

    NASA Technical Reports Server (NTRS)

    1996-01-01

NASA Lewis Research Center's Digital Systems Technology Branch has an ongoing program in modulation, coding, onboard processing, and switching. Recently, NASA completed a project to incorporate a time-shared decoder into the very-small-aperture terminal (VSAT) onboard-processing mesh architecture. The primary goal was to demonstrate a time-shared decoder for a regenerative satellite that uses asynchronous, frequency-division multiple access (FDMA) uplink channels, thereby identifying hardware and power requirements and fault-tolerance issues that would have to be addressed in an operational system. A secondary goal was to integrate and test, in a system environment, two NASA-sponsored, proof-of-concept hardware deliverables: the Harris Corp. high-speed Bose Chaudhuri-Hocquenghem (BCH) codec and the TRW multichannel demultiplexer/demodulator (MCDD). A beneficial byproduct of this project was the development of flexible, multichannel-uplink signal-generation equipment.

  7. Spin glasses and error-correcting codes

    NASA Technical Reports Server (NTRS)

    Belongie, M. L.

    1994-01-01

    In this article, we study a model for error-correcting codes that comes from spin glass theory and leads to both new codes and a new decoding technique. Using the theory of spin glasses, it has been proven that a simple construction yields a family of binary codes whose performance asymptotically approaches the Shannon bound for the Gaussian channel. The limit is approached as the number of information bits per codeword approaches infinity while the rate of the code approaches zero. Thus, the codes rapidly become impractical. We present simulation results that show the performance of a few manageable examples of these codes. In the correspondence that exists between spin glasses and error-correcting codes, the concept of a thermal average leads to a method of decoding that differs from the standard method of finding the most likely information sequence for a given received codeword. Whereas the standard method corresponds to calculating the thermal average at temperature zero, calculating the thermal average at a certain optimum temperature results instead in the sequence of most likely information bits. Since linear block codes and convolutional codes can be viewed as examples of spin glasses, this new decoding method can be used to decode these codes in a way that minimizes the bit error rate instead of the codeword error rate. We present simulation results that show a small improvement in bit error rate by using the thermal average technique.
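The distinction between decoding for minimum codeword error rate and minimum bit error rate can be made concrete by exhaustive enumeration on a toy code (unrelated to the spin-glass constructions above; code, channel, and received word are all invented for illustration). Notably, the bitwise-MAP decisions need not even form a codeword:

```python
# Word-MAP vs bitwise-MAP decoding over a binary symmetric channel
# for the (3,2) even-weight parity code.  Word-MAP minimizes codeword
# error rate; bitwise MAP minimizes bit error rate.
codewords = ["000", "110", "101", "011"]
p = 0.1                                      # BSC crossover probability

def likelihood(cw, r):
    d = sum(a != b for a, b in zip(cw, r))   # Hamming distance
    return p**d * (1 - p)**(len(r) - d)

r = "111"                                    # received word (not a codeword)
post = {cw: likelihood(cw, r) for cw in codewords}
total = sum(post.values())

word_map = max(post, key=post.get)           # most likely codeword
bit_map = "".join(                           # most likely value of each bit
    "1" if sum(post[cw] for cw in codewords if cw[i] == "1") / total > 0.5
    else "0"
    for i in range(3)
)
print("word-MAP:", word_map)
print("bit-MAP :", bit_map, "in code:", bit_map in codewords)
```

Here every bit is individually more likely to be 1, so the bitwise-MAP output is 111, which is not a codeword; this is the sense in which the thermal-average technique trades codeword error rate for bit error rate.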

  8. The code of ethics for nurses.

    PubMed

    Zahedi, F; Sanjari, M; Aala, M; Peymani, M; Aramesh, K; Parsapour, A; Maddah, Ss Bagher; Cheraghi, Ma; Mirzabeigi, Gh; Larijani, B; Dastgerdi, M Vahid

    2013-01-01

Nurses are increasingly confronted with complex concerns in their practice. Codes of ethics are fundamental guidance for nursing, as for many other professions. Although there are authentic international codes of ethics for nurses, a national code provides additional assistance to clinical nurses in their complex roles in patient care, education, research, and management of parts of the health care system. A national code can provide nurses with culturally adapted guidance and help them make ethical decisions that are closer to the Iranian-Islamic background. Given the general acknowledgement of the need, the National Code of Ethics for Nurses was compiled as a joint project (2009-2011). The Code was approved by the Health Policy Council of the Ministry of Health and Medical Education and communicated to all universities, healthcare centers, hospitals, and research centers early in 2011. The focus of this article is on the course of action through which the Code was compiled, amended, and approved. The main concepts of the Code are also presented. Development of such codes should be considered an ongoing process: there is an overall responsibility to keep them current, updated with new scientific progress and emerging challenges, and pertinent to nursing practice.

  9. The trellis complexity of convolutional codes

    NASA Technical Reports Server (NTRS)

    Mceliece, R. J.; Lin, W.

    1995-01-01

    It has long been known that convolutional codes have a natural, regular trellis structure that facilitates the implementation of Viterbi's algorithm. It has gradually become apparent that linear block codes also have a natural, though not in general a regular, 'minimal' trellis structure, which allows them to be decoded with a Viterbi-like algorithm. In both cases, the complexity of the Viterbi decoding algorithm can be accurately estimated by the number of trellis edges per encoded bit. It would, therefore, appear that we are in a good position to make a fair comparison of the Viterbi decoding complexity of block and convolutional codes. Unfortunately, however, this comparison is somewhat muddled by the fact that some convolutional codes, the punctured convolutional codes, are known to have trellis representations that are significantly less complex than the conventional trellis. In other words, the conventional trellis representation for a convolutional code may not be the minimal trellis representation. Thus, ironically, at present we seem to know more about the minimal trellis representation for block than for convolutional codes. In this article, we provide a remedy, by developing a theory of minimal trellises for convolutional codes. (A similar theory has recently been given by Sidorenko and Zyablov). This allows us to make a direct performance-complexity comparison for block and convolutional codes. A by-product of our work is an algorithm for choosing, from among all generator matrices for a given convolutional code, what we call a trellis-minimal generator matrix, from which the minimal trellis for the code can be directly constructed. Another by-product is that, in the new theory, punctured convolutional codes no longer appear as a special class, but simply as high-rate convolutional codes whose trellis complexity is unexpectedly small.
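The edges-per-bit complexity measure mentioned above is easy to evaluate for the conventional trellis: a rate-k/n convolutional code with total encoder memory m has 2^m states and 2^(m+k) edges per trellis section, i.e. 2^(m+k)/n edges per encoded bit. The one-liner below evaluates that standard count (it does not construct the article's minimal trellis):

```python
# Edges per encoded bit in the conventional trellis of a rate-k/n
# convolutional code with total encoder memory m: each of the 2**m
# states has 2**k outgoing edges, and each section emits n bits.
def conventional_trellis_edges_per_bit(k, n, m):
    return 2 ** (m + k) / n

# Example: a rate-1/2, memory-6 (64-state) convolutional code.
print(conventional_trellis_edges_per_bit(k=1, n=2, m=6))
```

The article's point is that for punctured (and other) convolutional codes, a minimal trellis can have substantially fewer edges per bit than this conventional count.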

  10. Serial-Turbo-Trellis-Coded Modulation with Rate-1 Inner Code

    NASA Technical Reports Server (NTRS)

    Divsalar, Dariush; Dolinar, Sam; Pollara, Fabrizio

    2004-01-01

    Serially concatenated turbo codes have been proposed to satisfy requirements for low bit- and word-error rates and for low (in comparison with related previous codes) complexity of coding and decoding algorithms and thus low complexity of coding and decoding circuitry. These codes are applicable to such high-level modulations as octonary phase-shift keying (8PSK) and 16-state quadrature amplitude modulation (16QAM); the signal product obtained by applying one of these codes to one of these modulations is denoted, generally, as serially concatenated trellis-coded modulation (SCTCM). These codes could be particularly beneficial for communication systems that must be designed and operated subject to limitations on bandwidth and power. Some background information is prerequisite to a meaningful summary of this development. Trellis-coded modulation (TCM) is now a well-established technique in digital communications. A turbo code combines binary component codes (which typically include trellis codes) with interleaving. A turbo code of the type that has been studied prior to this development is composed of parallel concatenated convolutional codes (PCCCs) implemented by two or more constituent systematic encoders joined through one or more interleavers. The input information bits feed the first encoder and, after having been scrambled by the interleaver, enter the second encoder. A code word of a parallel concatenated code consists of the input bits to the first encoder followed by the parity check bits of both encoders. The suboptimal iterative decoding structure for such a code is modular, and consists of a set of concatenated decoding modules one for each constituent code connected through an interleaver identical to the one in the encoder side. Each decoder performs weighted soft decoding of the input sequence. PCCCs yield very large coding gains at the cost of a reduction in the data rate and/or an increase in bandwidth.

  11. Data coding tools for color-coded vector nanolithography

    NASA Astrophysics Data System (ADS)

    Lekki, Janusz; Kumar, Saveen; Parihar, Sunil S.; Grange, Sebastien; Baur, Charles; Foschia, Raphael; Kulik, Andrzej

    2004-11-01

We propose and demonstrate the ability and efficiency of using a universal file format for nanolithography patterns. A problem faced by physicists working in the field of nanolithography is the lack of flexible, preferably open-source, pattern-design software that can be used with a broad range of commercial scanning probe microscope (SPM) systems. Current nanolithography software packages are device-specific and not portable; it is therefore impossible to create a lithography pattern and share it with fellow physicists working on a networked sub-system. In this paper we describe software designed to read and interpret a nanolithography pattern stored in the Windows Metafile (WMF) standard graphic format and then draw it on a substrate using an SPM tip. Nanolithography parameters such as height, velocity, and feedback force are coded in the color of the WMF, on the RGB channels of the image, establishing a distinct relation between a graphical feature (color) and the nanolithography scheme used (voltage, height, etc.). This concept enables the preparation of complex patterns using any standard graphics software and aids intuitive recognition of the mode and parameters set for a pattern. The advantages of using WMF over other approaches and the universal scope of the software are discussed.
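A hypothetical sketch of the color-coding idea follows. The parameter names, ranges, and the linear 8-bit scaling are all invented here for illustration and are not taken from the described software:

```python
# Hypothetical mapping of three nanolithography parameters onto the
# R, G, and B channels of a pixel (each channel holds 0-255).
RANGES = {
    "height_nm": (0.0, 50.0),      # R channel (assumed range)
    "velocity_um_s": (0.0, 10.0),  # G channel (assumed range)
    "force_nN": (0.0, 100.0),      # B channel (assumed range)
}

def encode_rgb(height_nm, velocity_um_s, force_nN):
    vals = [height_nm, velocity_um_s, force_nN]
    return tuple(
        round(255 * (v - lo) / (hi - lo))
        for v, (lo, hi) in zip(vals, RANGES.values())
    )

def decode_rgb(rgb):
    return {
        name: lo + channel / 255 * (hi - lo)
        for channel, (name, (lo, hi)) in zip(rgb, RANGES.items())
    }

pixel = encode_rgb(height_nm=25.0, velocity_um_s=5.0, force_nN=10.0)
print(pixel, decode_rgb(pixel))
```

The 8-bit quantization limits each parameter to 256 levels over its range, which is the trade-off such a color-based encoding makes for compatibility with standard graphics software.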

  12. Multidimensional Trellis Coded Phase Modulation Using a Multilevel Concatenation Approach. Part 1; Code Design

    NASA Technical Reports Server (NTRS)

    Rajpal, Sandeep; Rhee, Do Jun; Lin, Shu

    1997-01-01

The first part of this paper presents a simple and systematic technique for constructing multidimensional M-ary phase shift keying (MPSK) trellis coded modulation (TCM) codes. The construction is based on a multilevel concatenation approach in which binary convolutional codes with good free branch distances are used as the outer codes and block MPSK modulation codes are used as the inner codes (or the signal spaces). Conditions on phase invariance of these codes are derived and a multistage decoding scheme for these codes is proposed. The proposed technique can be used to construct good codes for both the additive white Gaussian noise (AWGN) and fading channels as is shown in the second part of this paper.

  13. HERCULES: A Pattern Driven Code Transformation System

    SciTech Connect

    Kartsaklis, Christos; Hernandez, Oscar R; Hsu, Chung-Hsing; Ilsche, Thomas; Joubert, Wayne; Graham, Richard L

    2012-01-01

    New parallel computers are emerging, but developing efficient scientific code for them remains difficult. A scientist must manage not only the science-domain complexity but also the performance-optimization complexity. HERCULES is a code transformation system designed to help the scientist to separate the two concerns, which improves code maintenance, and facilitates performance optimization. The system combines three technologies, code patterns, transformation scripts and compiler plugins, to provide the scientist with an environment to quickly implement code transformations that suit his needs. Unlike existing code optimization tools, HERCULES is unique in its focus on user-level accessibility. In this paper we discuss the design, implementation and an initial evaluation of HERCULES.

  14. Protograph-Based Raptor-Like Codes

    NASA Technical Reports Server (NTRS)

    Divsalar, Dariush; Chen, Tsung-Yi; Wang, Jiadong; Wesel, Richard D.

    2014-01-01

Theoretical analysis has long indicated that feedback improves the error exponent but not the capacity of point-to-point memoryless channels. Analytic and empirical results indicate that in the short-blocklength regime, practical rate-compatible punctured convolutional (RCPC) codes achieve low latency with the use of noiseless feedback. In 3GPP, standard rate-compatible punctured turbo (RCPT) codes did not outperform convolutional codes in the short-blocklength regime. The reason is that convolutional codes with a low number of states can be decoded optimally using the Viterbi decoder. Despite the excellent performance of convolutional codes at very short blocklengths, their strength does not scale with blocklength for a fixed number of trellis states.

  15. An Experiment in Scientific Code Semantic Analysis

    NASA Technical Reports Server (NTRS)

    Stewart, Mark E. M.

    1998-01-01

This paper concerns a procedure that analyzes aspects of the meaning or semantics of scientific and engineering code. This procedure involves taking a user's existing code, adding semantic declarations for some primitive variables, and parsing this annotated code using multiple, distributed expert parsers. These semantic parsers are designed to recognize formulae from different disciplines, including physical and mathematical formulae and geometrical positions in a numerical scheme. The parsers will automatically recognize and document some static, semantic concepts and locate some program semantic errors. Results are shown for a subroutine test case and a collection of combustion code routines. This ability to locate some semantic errors and document semantic concepts in scientific and engineering code should reduce the time, risk, and effort of developing and using these codes.

  16. A concatenated coding scheme for error control

    NASA Technical Reports Server (NTRS)

    Kasami, T.; Fujiwara, T.; Lin, S.

    1986-01-01

    In this paper, a concatenated coding scheme for error control in data communications is presented and analyzed. In this scheme, the inner code is used for both error correction and detection; however, the outer code is used only for error detection. A retransmission is requested if either the inner code decoder fails to make a successful decoding or the outer code decoder detects the presence of errors after the inner code decoding. Probability of undetected error (or decoding error) of the proposed scheme is derived. An efficient method for computing this probability is presented. Throughput efficiency of the proposed error control scheme incorporated with a selective-repeat ARQ retransmission strategy is also analyzed. Three specific examples are presented. One of the examples is proposed for error control in the NASA Telecommand System.

  17. On Using Goldbach G0 Codes and Even-Rodeh Codes for Text Compression

    NASA Astrophysics Data System (ADS)

    Budiman, M. A.; Rachmawati, D.

    2017-03-01

This research aims to study the efficiency of two variants of variable-length codes (i.e., Goldbach G0 codes and Even-Rodeh codes) in compressing texts. The parameters examined are the compression ratio, the space savings, and the bit rate. As a benchmark, all of the original (uncompressed) texts are assumed to be encoded in the American Standard Code for Information Interchange (ASCII). Several texts, including those derived from some corpora (the Artificial corpus, the Calgary corpus, the Canterbury corpus, the Large corpus, and the Miscellaneous corpus), are tested in the experiment. The overall result shows that the Even-Rodeh codes are consistently more efficient at compressing texts than the unoptimized Goldbach G0 codes.
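The three evaluation parameters can be computed directly from the original and compressed sizes. The definitions below are one common convention (conventions vary), and the sizes are made-up ASCII examples, not results from the paper:

```python
# Compression metrics for an 8-bits-per-character (ASCII) baseline.
def metrics(original_chars, compressed_bits):
    original_bits = 8 * original_chars
    ratio = original_bits / compressed_bits          # compression ratio
    savings = 1 - compressed_bits / original_bits    # space savings
    bit_rate = compressed_bits / original_chars      # bits per input char
    return ratio, savings, bit_rate

r, s, b = metrics(original_chars=1000, compressed_bits=5200)
print(f"ratio={r:.2f}, savings={s:.1%}, bit rate={b:.2f} bits/char")
```

Under these definitions, a lower bit rate and a higher ratio and savings all indicate a more efficient code.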

  18. Wire codes, magnetic fields, and childhood cancer

    SciTech Connect

    Kheifets, L.I.; Kavet, R.; Sussman, S.S.

    1997-05-01

    Childhood cancer has been modestly associated with wire codes, an exposure surrogate for power frequency magnetic fields, but less consistently with measured fields. The authors analyzed data on the population distribution of wire codes and their relationship with several measured magnetic field metrics. In a given geographic area, there is a marked trend for decreased prevalence from low to high wire code categories, but there are differences between areas. For average measured fields, there is a positive relationship between the mean of the distributions and wire codes but a large overlap among the categories. Better discrimination is obtained for the extremes of the measurement values when comparing the highest and the lowest wire code categories. Instability of measurements, intermittent fields, or other exposure conditions do not appear to provide a viable explanation for the differences between wire codes and magnetic fields with respect to the strength and consistency of their respective association with childhood cancer.

  19. Mechanism on brain information processing: Energy coding

    NASA Astrophysics Data System (ADS)

    Wang, Rubin; Zhang, Zhikang; Jiao, Xianfa

    2006-09-01

According to the experimental result that signal transmission and neuronal energetic demands are tightly coupled to information coding in the cerebral cortex, the authors present a new scientific theory that offers a unique mechanism for brain information processing. They demonstrate that the neural coding produced by the activity of the brain is well described by the theory of energy coding. Because the energy coding model can reveal mechanisms of brain information processing based upon known biophysical properties, they can not only reproduce various experimental results of neuroelectrophysiology but also quantitatively explain the recent experimental results from neuroscientists at Yale University by means of the principle of energy coding. Because the theory of energy coding bridges the gap between functional connections within a biological neural network and energetic consumption, they estimate that the theory has very important consequences for quantitative research on cognitive function.

  20. Energy coding in biological neural networks.

    PubMed

    Wang, Rubin; Zhang, Zhikang

    2007-09-01

According to the experimental result that signal transmission and neuronal energetic demands are tightly coupled to information coding in the cerebral cortex, we present a new scientific theory that offers a unique mechanism for brain information processing. We demonstrate that the neural coding produced by the activity of the brain is well described by our theory of energy coding. Because the energy coding model can reveal mechanisms of brain information processing based upon known biophysical properties, we can not only reproduce various experimental results of neuro-electrophysiology but also quantitatively explain the recent experimental results from neuroscientists at Yale University by means of the principle of energy coding. Because the theory of energy coding bridges the gap between functional connections within a biological neural network and energetic consumption, we estimate that the theory has very important consequences for quantitative research on cognitive function.