van der Mei, Sijrike F; Dijkers, Marcel P J M; Heerkens, Yvonne F
2011-12-01
To examine to what extent the concept and the domains of participation as defined in the International Classification of Functioning, Disability and Health (ICF) are represented in general cancer-specific health-related quality of life (HRQOL) instruments. Using the ICF linking rules, two coders independently extracted the meaningful concepts of ten instruments and linked these to ICF codes. The proportion of concepts that could be linked to ICF codes ranged from 68 to 95%. Although all instruments contained concepts linked to Participation (Chapters d7-d9 of the classification of 'Activities and Participation'), the instruments covered only a small part of all available ICF codes. The proportion of ICF codes in the instruments that were participation related ranged from 3 to 35%. 'Major life areas' (d8) was the most frequently used Participation Chapter, with d850 'remunerative employment' as the most used ICF code. The number of participation-related ICF codes covered in the instruments is limited. General cancer-specific HRQOL instruments only assess social life of cancer patients to a limited degree. This study's information on the content of these instruments may guide researchers in selecting the appropriate instrument for a specific research purpose.
A Concept for Run-Time Support of the Chapel Language
NASA Technical Reports Server (NTRS)
James, Mark
2006-01-01
A document presents a concept for run-time implementation of other concepts embodied in the Chapel programming language. (Now undergoing development, Chapel is intended to become a standard language for parallel computing that would surpass older such languages in both computational performance and the efficiency with which pre-existing code can be reused and new code written.) The aforementioned other concepts are those of distributions, domains, allocations, and access, as defined in a separate document called "A Semantic Framework for Domains and Distributions in Chapel" and linked to a language specification defined in another separate document called "Chapel Specification 0.3." The concept presented in the instant report is the recognition that a data domain that was invented for Chapel offers a novel approach to distributing and processing data in a massively parallel environment. The concept is offered as a starting point for development of working descriptions of functions and data structures that would be necessary to implement interfaces to a compiler for transforming the aforementioned other concepts from their representations in Chapel source code to their run-time implementations.
Zhang, Yinsheng; Zhang, Guoming
2018-01-01
A terminology (or coding system) is a formal set of controlled vocabulary in a specific domain. With a well-defined terminology, each concept in the target domain is assigned a unique code, which can be identified and processed across different medical systems in an unambiguous way. Although there are many well-known biomedical terminologies, there is currently no domain-specific terminology for ROP (retinopathy of prematurity). Based on a collection of historical ROP patients' data in the electronic medical record system, we extracted the most frequent terms in the domain and organized them into a hierarchical coding system, the ROP Minimal Standard Terminology, which contains 62 core concepts in 4 categories. This terminology has been successfully used to provide highly structured and semantic-rich clinical data in several ROP-related applications.
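To make the idea of a small hierarchical coding system concrete, here is a minimal Python sketch; the category and concept names are invented for illustration and do not reproduce the actual ROP Minimal Standard Terminology.

```python
# A minimal sketch of a hierarchical domain terminology. Category and
# concept names are hypothetical placeholders, not the real ROP codes.
TERMINOLOGY = {
    "01": {"name": "Examination findings", "concepts": {
        "01.01": "zone I disease",
        "01.02": "plus disease",
    }},
    "02": {"name": "Treatment", "concepts": {
        "02.01": "laser photocoagulation",
    }},
}

def lookup(code: str) -> str:
    """Resolve a category or concept code to its preferred term."""
    category, _, _ = code.partition(".")
    entry = TERMINOLOGY[category]
    return entry["concepts"][code] if "." in code else entry["name"]

print(lookup("01.02"))  # -> plus disease
```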
Automatic programming of simulation models
NASA Technical Reports Server (NTRS)
Schroer, Bernard J.; Tseng, Fan T.; Zhang, Shou X.; Dwan, Wen S.
1990-01-01
The concepts of software engineering were used to improve the simulation modeling environment. Emphasis was placed on the application of an element of rapid prototyping, or automatic programming, to assist the modeler in defining the problem specification. Then, once the problem specification has been defined, an automatic code generator is used to write the simulation code. The following two domains were selected for evaluating the concepts of software engineering for discrete event simulation: a manufacturing domain and a spacecraft countdown network sequence. The specific tasks were to: (1) define the software requirements for a graphical user interface to the Automatic Manufacturing Programming System (AMPS) system; (2) develop a graphical user interface for AMPS; and (3) compare the AMPS graphical interface with the AMPS interactive user interface.
The effect of multiple internal representations on context-rich instruction
NASA Astrophysics Data System (ADS)
Lasry, Nathaniel; Aulls, Mark W.
2007-11-01
We discuss n-coding, a theoretical model of multiple internal mental representations. The n-coding construct is developed from a review of cognitive and imaging data that demonstrates the independence of information processed along different modalities such as verbal, visual, kinesthetic, logico-mathematic, and social modalities. A study testing the effectiveness of the n-coding construct in classrooms is presented. Four sections differing in the level of n-coding opportunities were compared. Besides a traditional-instruction section used as a control group, each of the remaining three sections was given context-rich problems, which differed by the level of n-coding opportunities designed into their laboratory environment. To measure the effectiveness of the construct, problem-solving skills were assessed, as was conceptual learning using the force concept inventory. We also developed several new measures that take students' confidence in concepts into account. Our results show that the n-coding construct is useful in designing context-rich environments and can be used to increase learning gains in problem solving, conceptual knowledge, and concept confidence. Specifically, when using props in designing context-rich problems, we find n-coding to be a useful construct in guiding which additional dimensions need to be attended to.
Supporting Source Code Comprehension during Software Evolution and Maintenance
ERIC Educational Resources Information Center
Alhindawi, Nouh
2013-01-01
This dissertation addresses the problems of program comprehension to support the evolution of large-scale software systems. The research concerns how software engineers locate features and concepts along with categorizing changes within very large bodies of source code along with their versioned histories. More specifically, advanced Information…
A Thermal Management Systems Model for the NASA GTX RBCC Concept
NASA Technical Reports Server (NTRS)
Traci, Richard M.; Farr, John L., Jr.; Laganelli, Tony; Walker, James (Technical Monitor)
2002-01-01
The Vehicle Integrated Thermal Management Analysis Code (VITMAC) was further developed to aid the analysis, design, and optimization of propellant and thermal management concepts for advanced propulsion systems. The computational tool is based on engineering level principles and models. A graphical user interface (GUI) provides a simple and straightforward method to assess and evaluate multiple concepts before undertaking more rigorous analysis of candidate systems. The tool incorporates the Chemical Equilibrium and Applications (CEA) program and the RJPA code to permit heat transfer analysis of both rocket and air breathing propulsion systems. Key parts of the code have been validated with experimental data. The tool was specifically tailored to analyze rocket-based combined-cycle (RBCC) propulsion systems being considered for space transportation applications. This report describes the computational tool and its development and verification for NASA GTX RBCC propulsion system applications.
Computational Infrastructure for Engine Structural Performance Simulation
NASA Technical Reports Server (NTRS)
Chamis, Christos C.
1997-01-01
Select computer codes developed over the years to simulate specific aspects of engine structures are described. These codes include blade impact integrated multidisciplinary analysis and optimization, progressive structural fracture, quantification of uncertainties for structural reliability and risk, benefits estimation of new technology insertion, and hierarchical simulation of engine structures made from metal matrix and ceramic matrix composites. Collectively, these codes constitute a unique infrastructure ready to credibly evaluate new and future engine structural concepts throughout the development cycle, from initial concept, to design and fabrication, to service performance and maintenance and repairs, to retirement for cause, and even to possible recycling. Stated differently, they provide 'virtual' concurrent engineering for the total life-cycle cost of engine structures.
The ICF and Postsurgery Occupational Therapy after Traumatic Hand Injury
ERIC Educational Resources Information Center
Fitinghoff, Helene; Lindqvist, Birgitta; Nygard, Louise; Ekholm, Jan; Schult, Marie-Louise
2011-01-01
Recent studies have examined the effectiveness of hand rehabilitation programmes and have linked the outcomes to the concept of ICF but not to specific ICF category codes. The objective of this study was to gain experience using ICF concepts to describe occupational therapy interventions during postsurgery hand rehabilitation, and to describe…
[Global aspects of medical ethics: conditions and possibilities].
Neitzke, G
2001-01-01
A global or universal code of medical ethics seems paradoxical in the era of pluralism and postmodernism. A different conception of globalisation will be developed in terms of a "procedural universality". According to this philosophical concept, a code of medical ethics does not oblige physicians to accept certain specific, preset, universal values and rules. It rather obliges every culture and society to start a culture-sensitive, continuous, and active discourse on specific issues, mentioned in the codex. This procedure might result in regional, intra-cultural consensus, which should be presented to an inter-cultural dialogue. To exemplify this procedure, current topics of medical ethics (spiritual foundations of medicine, autonomy, definitions concerning life and death, physicians' duties, conduct within therapeutic teams) will be discussed from the point of view of western medicine.
Kavuluru, Ramakanth; Han, Sifei; Harris, Daniel
2017-01-01
Diagnosis codes are extracted from medical records for billing and reimbursement and for secondary uses such as quality control and cohort identification. In the US, these codes come from the standard terminology ICD-9-CM derived from the international classification of diseases (ICD). ICD-9 codes are generally extracted by trained human coders by reading all artifacts available in a patient’s medical record following specific coding guidelines. To assist coders in this manual process, this paper proposes an unsupervised ensemble approach to automatically extract ICD-9 diagnosis codes from textual narratives included in electronic medical records (EMRs). Earlier attempts at automatic extraction focused on individual documents such as radiology reports and discharge summaries. Here we use a more realistic dataset and extract ICD-9 codes from EMRs of 1000 inpatient visits at the University of Kentucky Medical Center. Using named entity recognition (NER), graph-based concept-mapping of medical concepts, and extractive text summarization techniques, we achieve an example-based average recall of 0.42 with average precision 0.47; compared with a baseline of using only NER, we notice a 12% improvement in recall with the graph-based approach and a 7% improvement in precision using the extractive text summarization approach. Although diagnosis codes are complex concepts often expressed in text with significant long range non-local dependencies, our present work shows the potential of unsupervised methods in extracting a portion of codes. As such, our findings are especially relevant for code extraction tasks where obtaining large amounts of training data is difficult. PMID:28748227
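For readers unfamiliar with the evaluation metric, the following is a minimal Python sketch of example-based (per-visit) average recall and precision for multi-label code extraction; the gold and predicted code sets are toy values, not data from the study.

```python
# Example-based recall/precision: average per-example set overlap.
def example_based_scores(gold: list[set[str]], pred: list[set[str]]):
    recalls, precisions = [], []
    for g, p in zip(gold, pred):
        hits = len(g & p)
        recalls.append(hits / len(g) if g else 1.0)
        precisions.append(hits / len(p) if p else 1.0)
    n = len(gold)
    return sum(recalls) / n, sum(precisions) / n

# Toy ICD-9 code sets for two hypothetical visits.
gold = [{"428.0", "401.9"}, {"250.00"}]
pred = [{"428.0"}, {"250.00", "276.2"}]
print(example_based_scores(gold, pred))  # -> (0.75, 0.75)
```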
ERIC Educational Resources Information Center
Meyer, Linda A.; And Others
This manual describes the model--specifically the observation procedures and coding systems--used in a longitudinal study of how children learn to comprehend what they read, with particular emphasis on science texts. Included are procedures for the following: identifying students; observing--recording observations and diagraming the room; writing…
NASA Technical Reports Server (NTRS)
Hamilton, M.
1973-01-01
The entry guidance software functional requirements (requirements design phase), its architectural requirements (specifications design phase), and the entry guidance software verified code are discussed. It was found that the proper integration of designs at both the requirements and specifications levels is a high-priority consideration.
Douglas, Danielle; Newsome, Rachel N; Man, Louisa LY
2018-01-01
A significant body of research in cognitive neuroscience is aimed at understanding how object concepts are represented in the human brain. However, it remains unknown whether and where the visual and abstract conceptual features that define an object concept are integrated. We addressed this issue by comparing the neural pattern similarities among object-evoked fMRI responses with behavior-based models that independently captured the visual and conceptual similarities among these stimuli. Our results revealed evidence for distinctive coding of visual features in lateral occipital cortex, and conceptual features in the temporal pole and parahippocampal cortex. By contrast, we found evidence for integrative coding of visual and conceptual object features in perirhinal cortex. The neuroanatomical specificity of this effect was highlighted by results from a searchlight analysis. Taken together, our findings suggest that perirhinal cortex uniquely supports the representation of fully specified object concepts through the integration of their visual and conceptual features. PMID:29393853
Tutorial on Reed-Solomon error correction coding
NASA Technical Reports Server (NTRS)
Geisel, William A.
1990-01-01
This tutorial attempts to provide a frank, step-by-step approach to Reed-Solomon (RS) error correction coding. RS encoding and RS decoding both with and without erasing code symbols are emphasized. There is no need to present rigorous proofs and extreme mathematical detail. Rather, the simple concepts of groups and fields, specifically Galois fields, are presented with a minimum of complexity. Before RS codes are presented, other block codes are presented as a technical introduction into coding. A primitive (15, 9) RS coding example is then completely developed from start to finish, demonstrating the encoding and decoding calculations and a derivation of the famous error-locator polynomial. The objective is to present practical information about Reed-Solomon coding in a manner such that it can be easily understood.
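As a companion to the tutorial's worked example, here is a compact Python sketch of systematic RS(15, 9) encoding over GF(16) with primitive polynomial x^4 + x + 1; generator-root conventions vary between treatments, so this shows one concrete construction rather than necessarily the report's exact one.

```python
# Build log/antilog tables for GF(16), primitive polynomial x^4 + x + 1.
EXP, LOG = [0] * 30, [0] * 16
x = 1
for i in range(15):
    EXP[i] = x
    LOG[x] = i
    x <<= 1
    if x & 0x10:              # reduce modulo x^4 + x + 1 (0b10011)
        x ^= 0x13
for i in range(15, 30):       # duplicate so exponent sums index directly
    EXP[i] = EXP[i - 15]

def gf_mul(a: int, b: int) -> int:
    return 0 if a == 0 or b == 0 else EXP[LOG[a] + LOG[b]]

def poly_mul(p, q):
    """Multiply polynomials over GF(16), coefficients high-order first."""
    out = [0] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            out[i + j] ^= gf_mul(a, b)
    return out

def rs_encode_15_9(msg):
    """Systematic RS(15,9): append the remainder of x^6*m(x) mod g(x)."""
    g = [1]
    for i in range(1, 7):                 # roots alpha^1 .. alpha^6
        g = poly_mul(g, [1, EXP[i]])
    buf = list(msg) + [0] * 6
    for i in range(9):                    # polynomial long division
        coef = buf[i]
        if coef:
            for j in range(1, len(g)):
                buf[i + j] ^= gf_mul(g[j], coef)
    return list(msg) + buf[9:]            # 15-symbol codeword

print(rs_encode_15_9([1, 2, 3, 4, 5, 6, 7, 8, 9]))
```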
Trajectories for High Specific Impulse High Specific Power Deep Space Exploration
NASA Technical Reports Server (NTRS)
Polsgrove, Tara; Adams, Robert B.; Brady, Hugh J. (Technical Monitor)
2002-01-01
Flight times and deliverable masses for electric and fusion propulsion systems are difficult to approximate. Numerical integration is required for these continuous thrust systems. Many scientists are not equipped with the tools and expertise to conduct interplanetary and interstellar trajectory analysis for their concepts. Several charts plotting the results of well-known trajectory simulation codes were developed and are contained in this paper. These charts illustrate the dependence of time of flight and payload ratio on jet power, initial mass, specific impulse and specific power. These charts are intended to be a tool by which people in the propulsion community can explore the possibilities of their propulsion system concepts. Trajectories were simulated using the tools VARITOP and IPOST. VARITOP is a well known trajectory optimization code that involves numerical integration based on calculus of variations. IPOST has several methods of trajectory simulation; the one used in this paper is Cowell's method for full integration of the equations of motion. An analytical method derived in the companion paper was also evaluated. The accuracy of this method is discussed in the paper.
Hypochondria as withdrawal and comedy as cure in Dr. Willibald's Der Hypochondrist (1824).
Potter, Edward T
2012-01-01
Balthasar von Ammann's comedy Der Hypochondrist, published in 1824 under the pseudonym Dr. Willibald, foregrounds the social, sexual, and political implications of hypochondria. The play engages with early nineteenth-century medical and popular conceptions of hypochondria to co-opt potentially subversive elements and to promote a specific social, sexual, and political agenda. The text promotes literature — specifically comedic drama — as a cure for hypochondria. Hypochondria functions as a code for withdrawal. The hypochondriac withdraws medically from healthy society, gaining exceptional status. He withdraws sexually from society by remaining a bachelor, possibly engaged in non-normative sexual behaviour. Furthermore, the politically disenfranchised protagonist voices his political frustrations via a coded medical metaphor. The hypochondriac poses a threefold challenge to the social, sexual, and political order, and the play engages with contemporary conceptions of the disease to provide the solution: comedy. The text, presented as a cure for hypochondria, replaces the coded questioning of the social order via hypochondria with the less threatening code of heraldry. A comedy-within-the-comedy uses the hypochondriac's love of heraldry to cure him, resulting in the elimination of his medical problems and exceptional status, in the purification of his bachelorhood from non-normative elements, and in the pre-emption of political frustrations.
Crew interface specifications preparation for in-flight maintenance and stowage functions
NASA Technical Reports Server (NTRS)
Parker, F. W.; Carlton, B. E.
1972-01-01
The findings and data products developed during the Phase 2 crew interface specification study are presented. Five new NASA general specifications were prepared: operations location coding system for crew interfaces; loose equipment and stowage management requirements; loose equipment and stowage data base information requirements; spacecraft loose equipment stowage drawing requirements; and inflight stowage management data requirements. Additional data was developed defining inflight maintenance processes and related data concepts for inflight troubleshooting, remove/repair/replace and scheduled maintenance activities. The process of maintenance task and equipment definition during spacecraft design and development was also defined and related data concepts were identified for further development into formal NASA specifications during future follow-on study phases of the contract.
Decoding the neural representation of fine-grained conceptual categories.
Ghio, Marta; Vaghi, Matilde Maria Serena; Perani, Daniela; Tettamanti, Marco
2016-05-15
Neuroscientific research on conceptual knowledge based on the grounded cognition framework has shed light on the organization of concrete concepts into semantic categories that rely on different types of experiential information. Abstract concepts have traditionally been investigated as an undifferentiated whole, and have only recently been addressed in a grounded cognition perspective. The present fMRI study investigated the involvement of brain systems coding for experiential information in the conceptual processing of fine-grained semantic categories along the abstract-concrete continuum. These categories consisted of mental state-, emotion-, mathematics-, mouth action-, hand action-, and leg action-related meanings. Thirty-five sentences for each category were used as stimuli in a 1-back task performed by 36 healthy participants. A univariate analysis failed to reveal category-specific activations. Multivariate pattern analyses, in turn, revealed that fMRI data contained sufficient information to disentangle all six fine-grained semantic categories across participants. However, the category-specific activity patterns showed no overlap with the regions coding for experiential information. These findings demonstrate the possibility of detecting specific patterns of neural representation associated with the processing of fine-grained conceptual categories, crucially including abstract ones, though bearing no anatomical correspondence with regions coding for experiential information as predicted by the grounded cognition hypothesis. Copyright © 2016 Elsevier Inc. All rights reserved.
Spontaneous self-descriptions and ethnic identities in individualistic and collectivistic cultures.
Rhee, E; Uleman, J S; Lee, H K; Roman, R J
1995-07-01
The Twenty Statements Test (TST) was administered in Seoul and New York, to 454 students from 2 cultures that emphasize collectivism and individualism, respectively. Responses, coded into 33 categories, were classified as either abstract or specific and as either autonomous or social. These 2 dichotomies were more independent in Seoul than in New York. The New York sample included Asian Americans whose spontaneous social identities differed. They either never listed ethnicity-nationality on the TST, or listed it once or twice. Unidentified Asian Americans' self-concepts resembled Euro-Americans' self-concepts, and twice identified Asian Americans' self-concepts resembled Koreans' self-concepts, in both abstractness-specificity and autonomy-sociality. Differential acculturation did not account for these results. Implications for social identity, self-categorization, and acculturation theory are discussed.
Variation of SNOMED CT coding of clinical research concepts among coding experts.
Andrews, James E; Richesson, Rachel L; Krischer, Jeffrey
2007-01-01
To compare consistency of coding among professional SNOMED CT coders representing three commercial providers of coding services when coding clinical research concepts with SNOMED CT. A sample of clinical research questions from case report forms (CRFs) generated by the NIH-funded Rare Disease Clinical Research Network (RDCRN) were sent to three coding companies with instructions to code the core concepts using SNOMED CT. The sample consisted of 319 question/answer pairs from 15 separate studies. The companies were asked to select SNOMED CT concepts (in any form, including post-coordinated) that capture the core concept(s) reflected in the question. Also, they were asked to state their level of certainty, as well as how precise they felt their coding was. Basic frequencies were calculated to determine raw level agreement among the companies and other descriptive information. Krippendorff's alpha was used to determine a statistical measure of agreement among the coding companies for several measures (semantic, certainty, and precision). No significant level of agreement among the experts was found. There is little semantic agreement in coding of clinical research data items across coders from 3 professional coding services, even using a very liberal definition of agreement.
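Since the study's agreement statistic may be unfamiliar, the following Python sketch computes nominal Krippendorff's alpha via a coincidence matrix; the reliability data (rows are coders, columns are items, None marks a missing code) are toy values, not the study's data.

```python
# Nominal Krippendorff's alpha: 1 - D_o / D_e, computed from a
# coincidence matrix built over all pairable values within each unit.
from collections import Counter
from itertools import permutations

def krippendorff_alpha(data):
    units = [[v for v in col if v is not None] for col in zip(*data)]
    units = [u for u in units if len(u) >= 2]       # need pairable values
    coincidences = Counter()
    for u in units:
        for a, b in permutations(u, 2):             # ordered value pairs
            coincidences[(a, b)] += 1 / (len(u) - 1)
    n = sum(coincidences.values())
    totals = Counter()
    for (a, _), w in coincidences.items():
        totals[a] += w
    d_o = sum(w for (a, b), w in coincidences.items() if a != b) / n
    d_e = sum(totals[a] * totals[b]
              for a, b in permutations(totals, 2)) / (n * (n - 1))
    return 1 - d_o / d_e

coders = [["A", "B", "A", "A"],
          ["A", "B", "C", "A"],
          ["A", "B", "A", None]]
print(round(krippendorff_alpha(coders), 3))  # -> 0.677
```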
Research into language concepts for the mission control center
NASA Technical Reports Server (NTRS)
Dellenback, Steven W.; Barton, Timothy J.; Ratner, Jeremiah M.
1990-01-01
A final report is given on research into language concepts for the Mission Control Center (MCC). The Specification Driven Language research is described. The state of the image processing field and how image processing techniques could be applied toward automating the generation of the language known as COmputation Development Environment (CODE or Comp Builder) are discussed. Also described is the development of a flight certified compiler for Comps.
2014-01-01
The linear algebraic concept of subspace plays a significant role in recent techniques of spectrum estimation. In this article, the authors have utilized the noise subspace concept for finding hidden periodicities in DNA sequences. With the vast growth of genomic sequences, the demand to accurately identify protein-coding regions in DNA is rising. Several techniques of DNA feature extraction involving various cross fields have come up in the recent past, among which the application of digital signal processing tools is of prime importance. It is known that coding segments have a 3-base periodicity, while non-coding regions do not have this unique feature. One of the most important spectrum analysis techniques based on the concept of subspace is the least-norm method. The least-norm estimator developed in this paper shows sharp period-3 peaks in coding regions, completely eliminating background noise. Comparison of the proposed method with the existing sliding discrete Fourier transform (SDFT) method, popularly known as the modified periodogram method, has been drawn on several genes from various organisms, and the results show that the proposed method offers a better and more effective approach to gene prediction. Resolution, quality factor, sensitivity, specificity, miss rate, and wrong rate are used to establish the superiority of the least-norm gene prediction method over the existing method. PMID:24386895
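A minimal Python sketch of a noise-subspace (minimum-norm) pseudospectrum evaluated at the period-3 frequency of a base-indicator sequence follows; the window length, subspace dimension, and test sequence are illustrative choices, not the paper's parameters.

```python
# Minimum-norm pseudospectrum at f = 1/3 for a binary base-indicator
# sequence; m (window length) and p (signal subspace size) are assumed.
import numpy as np

def min_norm_p3(x, m=12, p=2):
    x = np.asarray(x, dtype=float)
    # Windowed data matrix -> sample correlation matrix (m x m).
    rows = np.array([x[i:i + m] for i in range(len(x) - m + 1)])
    R = rows.T @ rows / rows.shape[0]
    w, v = np.linalg.eigh(R)              # eigenvalues ascending
    En = v[:, : m - p]                    # noise subspace (smallest m-p)
    u1 = np.zeros(m); u1[0] = 1.0
    proj = En @ (En.T @ u1)
    d = proj / (u1 @ proj)                # min-norm vector, d[0] = 1
    a = np.exp(-2j * np.pi * (1 / 3) * np.arange(m))   # steering vector
    return 1.0 / np.abs(a.conj() @ d) ** 2

seq = "ATGGCCATTGTAATGGGCCGCTGAAAGGGTGCCCGA"   # toy sequence
indicator = [1.0 if c == "G" else 0.0 for c in seq]
print(min_norm_p3(indicator))
```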
Comparison of procedure coding systems for level 1 and 2 hospitals in South Africa.
Montewa, Lebogang; Hanmer, Lyn; Reagon, Gavin
2013-01-01
The ability of three procedure coding systems to reflect the procedure concepts extracted from patient records from six hospitals was compared, in order to inform decision making about a procedure coding standard for South Africa. A convenience sample of 126 procedure concepts was extracted from patient records at three level 1 hospitals and three level 2 hospitals. Each procedure concept was coded using ICPC-2, ICD-9-CM, and CCSA-2001. The extent to which each code assigned actually reflected the procedure concept was evaluated (between 'no match' and 'complete match'). For the study sample, CCSA-2001 was found to reflect the procedure concepts most completely, followed by ICD-9-CM and then ICPC-2. In practice, decision making about procedure coding standards would depend on multiple factors in addition to coding accuracy.
Simulation of Trajectories for High Specific Impulse Deep Space Exploration
NASA Technical Reports Server (NTRS)
Polsgrove, Tara; Adams, Robert B.; Brady, Hugh J. (Technical Monitor)
2002-01-01
Difficulties in approximating flight times and deliverable masses for continuous thrust propulsion systems have complicated comparison and evaluation of proposed propulsion concepts. These continuous thrust propulsion systems are of interest to many groups, not the least of which are the electric propulsion and fusion communities. Several charts plotting the results of well-known trajectory simulation codes were developed and are contained in this paper. These charts illustrate the dependence of time of flight and payload ratio on jet power, initial mass, specific impulse and specific power. These charts are intended to be a tool by which people in the propulsion community can explore the possibilities of their propulsion system concepts. Trajectories were simulated using the tools VARITOP and IPOST. VARITOP is a well known trajectory optimization code that involves numerical integration based on calculus of variations. IPOST has several methods of trajectory simulation; the one used in this paper is Cowell's method for full integration of the equations of motion. The analytical method derived in the companion paper was also used to simulate the trajectory. The accuracy of this method is discussed in the paper.
The Role of Ontologies in Schema-based Program Synthesis
NASA Technical Reports Server (NTRS)
Bures, Tomas; Denney, Ewen; Fischer, Bernd; Nistor, Eugen C.
2004-01-01
Program synthesis is the process of automatically deriving executable code from (non-executable) high-level specifications. It is more flexible and powerful than conventional code generation techniques that simply translate algorithmic specifications into lower-level code or only create code skeletons from structural specifications (such as UML class diagrams). Key to building a successful synthesis system is specializing to an appropriate application domain. The AUTOBAYES and AUTOFILTER systems, under development at NASA Ames, operate in the two domains of data analysis and state estimation, respectively. The central concept of both systems is the schema, a representation of reusable computational knowledge. This can take various forms, including high-level algorithm templates, code optimizations, datatype refinements, or architectural information. A schema also contains applicability conditions that are used to determine when it can be applied safely. These conditions can refer to the initial specification, to intermediate results, or to elements of the partially-instantiated code. Schema-based synthesis uses AI technology to recursively apply schemas to gradually refine a specification into executable code. This process proceeds in two main phases. A front-end gradually transforms the problem specification into a program represented in an abstract intermediate code. A backend then compiles this further down into a concrete target programming language of choice. A core engine applies schemas to the initial problem specification, then uses the output of those schemas as the input for other schemas, until the full implementation is generated. Since there might be different schemas that implement different solutions to the same problem, this process can generate an entire solution tree. AUTOBAYES and AUTOFILTER have reached the level of maturity where they enable users to solve interesting application problems, e.g., the analysis of Hubble Space Telescope images. They are large (in total around 100kLoC Prolog), knowledge-intensive systems that employ complex symbolic reasoning to generate a wide range of non-trivial programs for complex application domains. Their schemas can have complex interactions, which make it hard to change them in isolation or even understand what an existing schema actually does. Adding more capabilities by increasing the number of schemas will only worsen this situation, ultimately leading to the entropy death of the synthesis system. The root cause of this problem is that the domain knowledge is scattered throughout the entire system and only represented implicitly in the schema implementations. In our current work, we are addressing this problem by making explicit the knowledge from different parts of the synthesis system. Here, we discuss how Gruber's definition of an ontology as an explicit specification of a conceptualization matches our efforts in identifying and explicating the domain-specific concepts. We outline the roles ontologies play in schema-based synthesis and argue that they address different audiences and serve different purposes. Their first role is descriptive: they serve as explicit documentation, and help to understand the internal structure of the system. Their second role is prescriptive: they provide the formal basis against which the other parts of the system (e.g., schemas) can be checked.
Their final role is referential: ontologies also provide semantically meaningful "hooks" which allow schemas and tools to access the internal state of the program derivation process (e.g., fragments of the generated code) in domain-specific rather than language-specific terms, and thus to modify it in a controlled fashion. For discussion purposes we use AUTOLINEAR, a small synthesis system we are currently experimenting with, which can generate code for solving a system of linear equations, Az = b.
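The recursive schema-application idea can be illustrated with a toy Python sketch; the schema names, applicability conditions, and refinement rules below are invented for illustration and do not reproduce AUTOBAYES or AUTOFILTER internals.

```python
# Toy schema-based refinement: schemas carry applicability conditions
# and are applied recursively until a spec bottoms out in code strings.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Schema:
    name: str
    applies: Callable[[dict], bool]     # applicability condition
    refine: Callable[[dict], list]      # spec -> subproblems and/or code

def synthesize(spec, schemas):
    for s in schemas:
        if s.applies(spec):
            return [synthesize(sub, schemas) if isinstance(sub, dict) else sub
                    for sub in s.refine(spec)]
    raise ValueError(f"no schema applies to {spec}")

schemas = [
    Schema("gauss-solve",
           lambda sp: sp.get("task") == "solve-linear",
           lambda sp: [{"task": "factor", "A": sp["A"]},
                       "x = backsubstitute(L, U, b)"]),
    Schema("lu-factor",
           lambda sp: sp.get("task") == "factor",
           lambda sp: ["L, U = lu_decompose(A)"]),
]
print(synthesize({"task": "solve-linear", "A": "A"}, schemas))
```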
DOE Office of Scientific and Technical Information (OSTI.GOV)
Naqvi, S
2014-06-15
Purpose: Most medical physics programs emphasize proficiency in routine clinical calculations and QA. The formulaic aspect of these calculations and the prescriptive nature of measurement protocols obviate the need to frequently apply basic physical principles, which, therefore, gradually decay away from memory. For example, few students appreciate the role of electron transport in photon dose, making it difficult to understand key concepts such as dose buildup, electronic disequilibrium effects, and Bragg-Gray theory. These conceptual deficiencies manifest when the physicist encounters a new system, requiring knowledge beyond routine activities. Methods: Two interactive computer simulation tools are developed to facilitate deeper learning of physical principles. One is a Monte Carlo code written with a strong educational aspect. The code can “label” regions and interactions to highlight specific aspects of the physics; e.g., certain regions can be designated as “starters” or “crossers,” and any interaction type can be turned on and off. Full 3D tracks with specific portions highlighted further enhance the visualization of radiation transport problems. The second code calculates and displays trajectories of a collection of electrons under an arbitrary space/time dependent Lorentz force using relativistic kinematics. Results: Using the Monte Carlo code, the student can interactively study photon and electron transport through visualization of dose components, particle tracks, and interaction types. The code can, for instance, be used to study the kerma-dose relationship, explore electronic disequilibrium near interfaces, or visualize kernels by using interaction forcing. The electromagnetic simulator enables the student to explore accelerating mechanisms and particle optics in devices such as cyclotrons and linacs. Conclusion: The proposed tools are designed to enhance understanding of abstract concepts by highlighting various aspects of the physics. The simulations serve as virtual experiments that give deeper and longer-lasting understanding of core principles. The student can then make sound judgements in novel situations encountered beyond routine clinical activities.
Base heating methodology improvements, volume 1
NASA Technical Reports Server (NTRS)
Bender, Robert L.; Reardon, John E.; Somers, Richard E.; Fulton, Michael S.; Smith, Sheldon D.; Pergament, Harold
1992-01-01
This document is the final report for NASA MSFC Contract NAS8-38141. The contracted effort had the broad objective of improving the launch vehicle ascent base heating methodology to improve and simplify the determination of that environment for Advanced Launch System (ALS) concepts. It was pursued as an Advanced Development Plan (ADP) for the Joint DoD/NASA ALS program office with project management assigned to NASA/MSFC. The original study was to be completed in 26 months beginning Sep. 1989. Because of several program changes and emphasis on evolving launch vehicle concepts, the period of performance was extended to the current completion date of Nov. 1992. A computer code incorporating the methodology improvements into a quick prediction tool was developed and is operational for basic configuration and propulsion concepts. The code and its user's guide are also provided as part of the contract documentation. Background information describing the specific objectives, limitations, and goals of the contract is summarized. A brief chronology of the ALS/NLS program history is also presented to provide the reader with an overview of the many variables influencing the development of the code over the past three years.
Assessing Teachers' Science Content Knowledge: A Strategy for Assessing Depth of Understanding
NASA Astrophysics Data System (ADS)
McConnell, Tom J.; Parker, Joyce M.; Eberhardt, Jan
2013-06-01
One of the characteristics of effective science teachers is a deep understanding of science concepts. The ability to identify, explain and apply concepts is critical in designing, delivering and assessing instruction. Because some teachers have not completed extensive courses in some areas of science, especially in middle and elementary grades, many professional development programs attempt to strengthen teachers' content knowledge. Assessing this content knowledge is challenging. Concept inventories are reliable and efficient, but do not reveal depth of knowledge. Interviews and observations are time-consuming. The Problem Based Learning Project for Teachers implemented a strategy that includes pre-post instruments in eight content strands that permits blind coding of responses and comparison across teachers and groups of teachers. The instruments include two types of open-ended questions that assess both general knowledge and the ability to apply Big Ideas related to specific science topics. The coding scheme is useful in revealing patterns in prior knowledge and learning, and identifying ideas that are challenging or not addressed by learning activities. The strengths and limitations of the scoring scheme are identified through comparison of the findings to case studies of four participating teachers from middle and elementary schools. The cases include examples of coded pre- and post-test responses to illustrate some of the themes seen in teacher learning. The findings raise questions for future investigation that can be conducted using analyses of the coded responses.
Automated Translation of Safety Critical Application Software Specifications into PLC Ladder Logic
NASA Technical Reports Server (NTRS)
Leucht, Kurt W.; Semmel, Glenn S.
2008-01-01
The numerous benefits of automatic application code generation are widely accepted within the software engineering community. A few of these benefits include raising the abstraction level of application programming, shorter product development time, lower maintenance costs, and increased code quality and consistency. Surprisingly, code generation concepts have not yet found wide acceptance and use in the field of programmable logic controller (PLC) software development. Software engineers at the NASA Kennedy Space Center (KSC) recognized the need for PLC code generation while developing their new ground checkout and launch processing system. They developed a process and a prototype software tool that automatically translates a high-level representation or specification of safety critical application software into ladder logic that executes on a PLC. This process and tool are expected to increase the reliability of the PLC code over that which is written manually, and may even lower life-cycle costs and shorten the development schedule of the new control system at KSC. This paper examines the problem domain and discusses the process and software tool that were prototyped by the KSC software engineers.
New French Regulation for NPPs and Code Consequences
DOE Office of Scientific and Technical Information (OSTI.GOV)
Faidy, Claude
2006-07-01
In December 2005, the French regulator issued a new regulation for French nuclear power plants, in particular for pressure equipment (PE). This regulation first needs to agree with the non-nuclear PE regulation and adds some specific requirements, in particular radiation protection requirements. This proposal has several advantages: it is more qualitatively risk-oriented, and it establishes an important link with non-nuclear industry. Only a few components are nuclear specific. However, the general philosophy of the existing Codes (RCC-M [15], KTA [16] or ASME [17]) has to be improved. For foreign Codes, the plan is to define the differences in the user specifications. In parallel, a new safety classification has been developed by the French utility. The consequence is the need to cross all these specifications to define a minimum quality level for each component or system. At the same time, a new concept has been developed to replace the well-known 'Leak Before Break' methodology: the 'Break Exclusion' methodology. This paper summarizes the key aspects of these different topics. (authors)
The Trouble with Legal Ethics.
ERIC Educational Resources Information Center
Simon, William H.
1991-01-01
Two conceptions of legal ethics or professional responsibility prevail: (1) as disciplinary rules or codes, and (2) as the personal moralities of individual lawyers. However, it is the application of general norms to specific circumstances through complex, creative judgment that is the ethical component of the ideal of legal professionalism. (MSE)
High density arrays of micromirrors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Folta, J. M.; Decker, J. Y.; Kolman, J.
We established and achieved our goal to (1) fabricate and evaluate test structures based on the micromirror design optimized for maskless lithography applications, (2) perform system analysis and code development for the maskless lithography concept, and (3) identify specifications for micromirror arrays (MMAs) for LLNL's adaptive optics (AO) applications and conceptualize new devices.
Concept of a photon-counting camera based on a diffraction-addressed Gray-code mask
NASA Astrophysics Data System (ADS)
Morel, Sébastien
2004-09-01
A new concept of photon counting camera for fast and low-light-level imaging applications is introduced. The possible spectrum covered by this camera ranges from visible light to gamma rays, depending on the device used to transform an incoming photon into a burst of visible photons (photo-event spot) localized in an (x,y) image plane. It is actually an evolution of the existing "PAPA" (Precision Analog Photon Address) camera that was designed for visible photons. This improvement comes from simplified optics. The new camera transforms, by diffraction, each photo-event spot from an image intensifier or a scintillator into a cross-shaped pattern, which is projected onto a specific Gray code mask. The photo-event position is then extracted from the signal given by an array of avalanche photodiodes (or photomultiplier tubes, alternatively) downstream of the mask. After a detailed explanation of this camera concept that we have called "DIAMICON" (DIffraction Addressed Mask ICONographer), we briefly discuss technical solutions to build such a camera.
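The mask design relies on the defining property of Gray codes: adjacent positions differ in exactly one bit, so a photo-event pattern that straddles two neighboring mask cells corrupts at most one address bit. A short Python sketch of the standard binary/Gray conversions:

```python
# Binary <-> Gray code conversions (standard reflected binary code).
def to_gray(n: int) -> int:
    return n ^ (n >> 1)

def from_gray(g: int) -> int:
    n = 0
    while g:            # fold successive shifts back into the result
        n ^= g
        g >>= 1
    return n

for pos in range(8):    # neighboring positions differ in one bit
    print(pos, format(to_gray(pos), "03b"), from_gray(to_gray(pos)))
```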
NASA Technical Reports Server (NTRS)
Manson, S. S.
1972-01-01
The strainrange partitioning concept divides the imposed strain into four basic ranges involving time-dependent and time-independent components. It is shown that some of the results presented at the symposium can be better correlated on the basis of this concept than by alternative methods. It is also suggested that methods of data generation and analysis can be helpfully guided by this approach. Potential applicability of the concept to the treatment of frequency and hold-time effects, environmental influence, crack initiation and growth, thermal fatigue, and code specifications is briefly considered.
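For context, the life estimate usually associated with strainrange partitioning in the wider literature (not stated in this abstract) is the interaction damage rule, where the four subscripts pair plastic (p, time-independent) and creep (c, time-dependent) strain in tension and compression:

```latex
% Interaction damage rule of strainrange partitioning: F_ij is the
% fraction of the inelastic strainrange of type ij, and N_ij the life
% if the entire strainrange were of that type.
\[
  \frac{1}{N_f} \;=\; \frac{F_{pp}}{N_{pp}} + \frac{F_{pc}}{N_{pc}}
                  + \frac{F_{cp}}{N_{cp}} + \frac{F_{cc}}{N_{cc}}
\]
```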
Coding and traceability: cells and tissues in North America.
Brubaker, Scott A; Wilson, Diane
2010-11-01
Cell and tissue banking professionals in North America have long understood the value of labeling their allografts with descriptive names that make them easily recognized. They have also understood that advantages exist in possessing the capability to track them internally and externally to better understand tissue handling from donation through distribution. An added insight that can assist with strategic planning is to know who uses them, how many, and for what purpose or application. Uniquely coding allografts naturally aids tracking in the event of a recall or the rare need to link them if implicated in an adverse outcome report. These values relate to an ability or inability to sufficiently track specific cell/tissue types throughout the allograft's lifetime. These concepts easily fit into the functions of a Quality Program and promote recipient safety. It is management oversight that drives the direction taken and either optimizes this knowledge or limits it. How concepts related to coding and tracing human cells and tissues for transplantation have evolved in North America, and where they may be headed, are described in this manuscript. Many protocols are in place but they exist in numerous operational silos. Quality Management System concepts should drive decision-making and include considerations for future planning beyond our own professional lifetimes.
An Idealized, Single Radial Swirler, Lean-Direct-Injection (LDI) Concept Meshing Script
NASA Technical Reports Server (NTRS)
Iannetti, Anthony C.; Thompson, Daniel
2008-01-01
To easily study combustor design parameters using computational fluid dynamics codes (CFD), a Gridgen Glyph-based macro (based on the Tcl scripting language) dubbed BladeMaker has been developed for the meshing of an idealized, single radial swirler, lean-direct-injection (LDI) combustor. BladeMaker is capable of taking in a number of parameters, such as blade width, blade tilt with respect to the perpendicular, swirler cup radius, and grid densities, and producing a three-dimensional meshed radial swirler with a can-annular (canned) combustor. This complex script produces a data format suitable for but not specific to the National Combustion Code (NCC), a state-of-the-art CFD code developed for reacting flow processes.
A lncRNA Perspective into (Re)Building the Heart.
Frank, Stefan; Aguirre, Aitor; Hescheler, Juergen; Kurian, Leo
2016-01-01
Our conception of the human genome, long focused on the 2% that codes for proteins, has profoundly changed since its first draft assembly in 2001. Since then, an unanticipatedly expansive functionality and convolution has been attributed to the majority of the genome that is transcribed in a cell-type/context-specific manner into transcripts with no apparent protein coding ability. While the majority of these transcripts, currently annotated as long non-coding RNAs (lncRNAs), are functionally uncharacterized, their prominent role in embryonic development and tissue homeostasis, especially in the context of the heart, is emerging. In this review, we summarize and discuss the latest advances in understanding the relevance of lncRNAs in (re)building the heart.
Evolvability Is an Evolved Ability: The Coding Concept as the Arch-Unit of Natural Selection.
Janković, Srdja; Ćirković, Milan M
2016-03-01
Physical processes that characterize living matter are qualitatively distinct in that they involve encoding and transfer of specific types of information. Such information plays an active part in the control of events that are ultimately linked to the capacity of the system to persist and multiply. This algorithmicity of life is a key prerequisite for its Darwinian evolution, driven by natural selection acting upon stochastically arising variations of the encoded information. The concept of evolvability attempts to define the total capacity of a system to evolve new encoded traits under appropriate conditions, i.e., the accessible section of total morphological space. Since this is dependent on previously evolved regulatory networks that govern information flow in the system, evolvability itself may be regarded as an evolved ability. The way information is physically written, read and modified in living cells (the "coding concept") has not changed substantially during the whole history of the Earth's biosphere. This biosphere, be it alone or one of many, is, accordingly, itself a product of natural selection, since the overall evolvability conferred by its coding concept (nucleic acids as information carriers with the "rulebook of meanings" provided by codons, as well as all the subsystems that regulate various conditional information-reading modes) certainly played a key role in enabling this biosphere to survive up to the present, through alterations of planetary conditions, including at least five catastrophic events linked to major mass extinctions. We submit that, whatever the actual prebiotic physical and chemical processes may have been on our home planet, or may, in principle, occur at some time and place in the Universe, a particular coding concept, with its respective potential to give rise to a biosphere, or class of biospheres, of a certain evolvability, may itself be regarded as a unit (indeed the arch-unit) of natural selection.
Coding and decoding in a point-to-point communication using the polarization of the light beam.
Kavehvash, Z; Massoumian, F
2008-05-10
A new technique for coding and decoding of optical signals through the use of polarization is described. In this technique the concept of coding is translated to polarization. In other words, coding is done in such a way that each code represents a unique polarization. This is done by implementing a binary pattern on a spatial light modulator in such a way that the reflected light has the required polarization. Decoding is done by the detection of the received beam's polarization. By linking the concept of coding to polarization we can use each of these concepts in measuring the other one, attaining some gains. In this paper the construction of a simple point-to-point communication link, where coding and decoding are done through polarization, will be discussed.
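A conceptual Python sketch of the idea, not the paper's optical setup: two-bit symbols map to four linear polarization angles, and decoding scores candidate analyzer angles with Malus's law (transmitted intensity proportional to the squared cosine of the angle difference).

```python
# Hypothetical symbol-to-polarization mapping; angles in degrees.
import math

ANGLES = {0b00: 0.0, 0b01: 45.0, 0b10: 90.0, 0b11: 135.0}

def encode(symbol: int) -> float:
    return ANGLES[symbol]

def decode(polarization_deg: float) -> int:
    # Pick the symbol whose analyzer angle transmits the most intensity.
    def transmitted(analyzer_deg):
        return math.cos(math.radians(polarization_deg - analyzer_deg)) ** 2
    return max(ANGLES, key=lambda sym: transmitted(ANGLES[sym]))

for sym in ANGLES:
    assert decode(encode(sym) + 5.0) == sym   # tolerates small angle noise
print("round-trip with small angular noise OK")
```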
ERIC Educational Resources Information Center
Vallesi, Antonino; Binns, Malcolm A.; Shallice, Tim
2008-01-01
The present study addresses the question of how such an abstract concept as time is represented by our cognitive system. Specifically, the aim was to assess whether temporal information is cognitively represented through left-to-right spatial coordinates, as already shown for other ordered sequences (e.g., numbers). In Experiment 1, the…
Increasing productivity through Total Reuse Management (TRM)
NASA Technical Reports Server (NTRS)
Schuler, M. P.
1991-01-01
Total Reuse Management (TRM) is a new concept currently being promoted by the NASA Langley Software Engineering and Ada Lab (SEAL). It uses concepts similar to those promoted in Total Quality Management (TQM). Both technical and management personnel are continually encouraged to think in terms of reuse. Reuse is not something that is aimed for after a product is completed, but rather it is built into the product from inception through development. Lowering software development costs, reducing risk, and increasing code reliability are the more prominent goals of TRM. Procedures and methods used to adopt and apply TRM are described. Reuse is frequently thought of as only being applicable to code. However, reuse can apply to all products and all phases of the software life cycle. These products include management and quality assurance plans, designs, and testing procedures. Specific examples of successfully reused products are given and future goals are discussed.
Assigning clinical codes with data-driven concept representation on Dutch clinical free text.
Scheurwegs, Elyne; Luyckx, Kim; Luyten, Léon; Goethals, Bart; Daelemans, Walter
2017-05-01
Clinical codes are used for public reporting purposes, are fundamental to determining public financing for hospitals, and form the basis for reimbursement claims to insurance providers. They are assigned to a patient stay to reflect the diagnosis and performed procedures during that stay. This paper aims to enrich algorithms for automated clinical coding by taking a data-driven approach and by using unsupervised and semi-supervised techniques for the extraction of multi-word expressions that convey a generalisable medical meaning (referred to as concepts). Several methods for extracting concepts from text are compared, two of which are constructed from a large unannotated corpus of clinical free text. A distributional semantic model (in this case, the word2vec skip-gram model) is used to generalize over concepts and retrieve relations between them. These methods are validated on three sets of patient stay data, in the disease areas of urology, cardiology, and gastroenterology. The datasets are in Dutch, which introduces a limitation on available concept definitions from expert-based ontologies (e.g. UMLS). The results show that when expert-based knowledge in ontologies is unavailable, concepts derived from raw clinical texts are a reliable alternative. Both concepts derived from raw clinical texts and concepts derived from expert-created dictionaries outperform a bag-of-words approach in clinical code assignment. Adding features based on tokens that appear in a semantically similar context has a positive influence on predicting diagnostic codes. Furthermore, the experiments indicate that a distributional semantics model can find relations between semantically related concepts in texts but also introduces erroneous and redundant relations, which can undermine clinical coding performance. Copyright © 2017. Published by Elsevier Inc.
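A minimal sketch of the distributional step, assuming the gensim library (version 4+ API) and a toy tokenized corpus; the paper's Dutch clinical corpus and full concept-extraction pipeline are not reproduced here.

```python
# Skip-gram word2vec on a toy corpus; sg=1 selects the skip-gram
# architecture named in the abstract. Corpus and terms are invented.
from gensim.models import Word2Vec

corpus = [
    ["acute", "cystitis", "treated", "with", "antibiotics"],
    ["chronic", "cystitis", "treated", "with", "antibiotics"],
    ["acute", "prostatitis", "treated", "with", "antibiotics"],
]

model = Word2Vec(corpus, vector_size=50, window=3,
                 min_count=1, sg=1, epochs=50)

# Tokens occurring in similar contexts get nearby vectors; this is the
# signal used to relate semantically similar concepts (noisy on a
# corpus this small).
print(model.wv.most_similar("cystitis", topn=2))
```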
Some Practical Universal Noiseless Coding Techniques
NASA Technical Reports Server (NTRS)
Rice, Robert F.
1994-01-01
Report discusses noiseless data-compression-coding algorithms, performance characteristics, and practical considerations in implementation of algorithms in coding modules composed of very-large-scale integrated circuits. Report also has value as tutorial document on data-compression-coding concepts. Coding techniques and concepts in question are "universal" in sense that, in principle, they are applicable to streams of data from variety of sources. However, discussion oriented toward compression of high-rate data generated by spaceborne sensors for lower-rate transmission back to earth.
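As a flavor of the techniques discussed (though not the report's exact module specifications), here is a short Python sketch of Rice coding with parameter k: a nonnegative integer n is encoded as a unary quotient (n >> k) followed by a k-bit remainder.

```python
# Rice coding: unary quotient, '0' separator, k-bit binary remainder.
def rice_encode(n: int, k: int) -> str:
    q, r = n >> k, n & ((1 << k) - 1)
    return "1" * q + "0" + format(r, f"0{k}b")

def rice_decode(bits: str, k: int) -> int:
    q = bits.index("0")                  # unary part ends at first '0'
    r = int(bits[q + 1 : q + 1 + k], 2)
    return (q << k) | r

for n in range(10):                      # round-trip check
    assert rice_decode(rice_encode(n, k=2), k=2) == n
print(rice_encode(9, k=2))               # -> "11001" (quotient 2, remainder 01)
```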
Human Research Initiative (HRI)
NASA Technical Reports Server (NTRS)
Motil, Brian
2003-01-01
A code U initiative starting in the FY04 budget includes specific funding for 'Phase Change' and 'Multiphase Flow Research' on the ISS. NASA GRC developed a concept for two facilities based on funding/schedule constraints: 1) Two Phase Flow Facility (TphiFFy) which assumes integrating into FIR; 2) Contact Line Dynamics Experiment Facility (CLiDE) which assumes integration into MSG. Each facility will accommodate multiple experiments conducted by NRA selected PIs with an overall goal of enabling specific NASA strategic objectives. There may also be a significant ground-based component.
Automated UMLS-Based Comparison of Medical Forms
Dugas, Martin; Fritz, Fleur; Krumm, Rainer; Breil, Bernhard
2013-01-01
Medical forms are very heterogeneous: on a European scale there are thousands of data items in several hundred different systems. To enable data exchange for clinical care and research purposes there is a need to develop interoperable documentation systems with harmonized forms for data capture. A prerequisite in this harmonization process is comparison of forms. So far – to our knowledge – an automated method for comparison of medical forms is not available. A form contains a list of data items with corresponding medical concepts. An automatic comparison needs data types, item names and, especially, items with unique concept codes from medical terminologies. The scope of the proposed method is a comparison of these items by comparing their concept codes (coded in UMLS). Each data item is represented by item name, concept code and value domain. Two items are called identical, if item name, concept code and value domain are the same. Two items are called matching, if only concept code and value domain are the same. Two items are called similar, if their concept codes are the same, but the value domains are different. Based on these definitions an open-source implementation for automated comparison of medical forms in ODM format with UMLS-based semantic annotations was developed. It is available as package compareODM from http://cran.r-project.org. To evaluate this method, it was applied to a set of 7 real medical forms with 285 data items from a large public ODM repository with forms for different medical purposes (research, quality management, routine care). Comparison results were visualized with grid images and dendrograms. Automated comparison of semantically annotated medical forms is feasible. Dendrograms allow a view on clustered similar forms. The approach is scalable for a large set of real medical forms. PMID:23861827
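The identical/matching/similar rules stated above translate directly into code; the following Python sketch uses toy item tuples rather than real ODM or UMLS data (the CUI shown is only illustrative).

```python
# Three-way item comparison per the definitions above: identical =
# same name, concept code, and value domain; matching = same code and
# value domain; similar = same code only.
from typing import NamedTuple

class Item(NamedTuple):
    name: str
    concept_code: str   # UMLS CUI
    value_domain: str

def compare(a: Item, b: Item) -> str:
    if a.concept_code != b.concept_code:
        return "different"
    if a.value_domain != b.value_domain:
        return "similar"
    return "identical" if a.name == b.name else "matching"

x = Item("Body weight", "C0005910", "float")
y = Item("Weight", "C0005910", "float")
z = Item("Weight class", "C0005910", "codelist")
print(compare(x, y), compare(x, z))  # -> matching similar
```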
New Millenium Inflatable Structures Technology
NASA Technical Reports Server (NTRS)
Mollerick, Ralph
1997-01-01
Specific applications where inflatable technology can enable or enhance future space missions are tabulated. The applicability of the inflatable technology to large aperture infra-red astronomy missions is discussed. Space flight validation and risk reduction are emphasized along with the importance of analytical tools in deriving structurally sound concepts and performing optimizations using compatible codes. Deployment dynamics control, fabrication techniques, and system testing are addressed.
Coordinated crew performance in commercial aircraft operations
NASA Technical Reports Server (NTRS)
Murphy, M. R.
1977-01-01
A specific methodology is proposed for an improved system of coding and analyzing crew member interaction. The complexity and lack of precision of many crew and task variables suggest the usefulness of fuzzy linguistic techniques for modeling and computer simulation of the crew performance process. Other research methodologies and concepts that have promise for increasing the effectiveness of research on crew performance are identified.
Continuation of research into language concepts for the mission support environment: Source code
NASA Technical Reports Server (NTRS)
Barton, Timothy J.; Ratner, Jeremiah M.
1991-01-01
Research into language concepts for the Mission Control Center is presented. Source code is provided; the file contains the routines which allow source code files to be created and compiled. The build process assumes that all elements and the COMP exist in the current directory, and it places as much of the code generation as possible on the preprocessor. A summary is given of the source files as used and/or manipulated by the build routine.
Information coding with frequency of oscillations in Belousov-Zhabotinsky encapsulated disks
NASA Astrophysics Data System (ADS)
Gorecki, J.; Gorecka, J. N.; Adamatzky, Andrew
2014-04-01
Information processing with an excitable chemical medium, like the Belousov-Zhabotinsky (BZ) reaction, is typically based on information coding in the presence or absence of excitation pulses. Here we present a new concept of Boolean coding that can be applied to an oscillatory medium. A medium represents the logical TRUE state if a selected region oscillates with a high frequency. If the frequency falls below a specified value, it represents the logical FALSE state. We consider a medium composed of disks encapsulating an oscillatory mixture of reagents, as related to our recent experiments with lipid-coated BZ droplets. We demonstrate that by using specific geometrical arrangements of disks containing the oscillatory medium one can perform logical operations on variables coded in oscillation frequency. Realizations of a chemical signal diode and of a single-bit memory with oscillatory disks are also discussed.
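A minimal sketch of this readout convention (the threshold value and observation window are illustrative, not taken from the paper):

```python
def read_bit(pulse_times, window_s, threshold_hz=0.1):
    """Frequency-based Boolean coding: estimate the oscillation frequency
    of a disk from the excitation pulses observed during a time window and
    map it to a logical value (high frequency -> TRUE, low -> FALSE)."""
    frequency_hz = len(pulse_times) / window_s
    return frequency_hz >= threshold_hz

# e.g. 12 pulses observed in 60 s -> 0.2 Hz -> TRUE
print(read_bit(pulse_times=range(12), window_s=60.0))
```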
Epigenetics: a new frontier in dentistry.
Williams, S D; Hughes, T E; Adler, C J; Brook, A H; Townsend, G C
2014-06-01
In 2007, only four years after the completion of the Human Genome Project, the journal Science announced that epigenetics was the 'breakthrough of the year'. Time magazine placed it second in the top 10 discoveries of 2009. While our genetic code (i.e. our DNA) contains all of the information to produce the elements we require to function, our epigenetic code determines when and where genes in the genetic code are expressed. Without the epigenetic code, the genetic code is like an orchestra without a conductor. Although there is now a substantial amount of published research on epigenetics in medicine and biology, epigenetics in dental research is in its infancy. However, epigenetics promises to become increasingly relevant to dentistry because of the role it plays in gene expression during development and subsequently potentially influencing oral disease susceptibility. This paper provides a review of the field of epigenetics aimed specifically at oral health professionals. It defines epigenetics, addresses the underlying concepts and provides details about specific epigenetic molecular mechanisms. Further, we discuss some of the key areas where epigenetics is implicated, and review the literature on epigenetics research in dentistry, including its relevance to clinical disciplines. This review considers some implications of epigenetics for the future of dental practice, including a 'personalized medicine' approach to the management of common oral diseases. © 2014 Australian Dental Association.
An Infrastructure for UML-Based Code Generation Tools
NASA Astrophysics Data System (ADS)
Wehrmeister, Marco A.; Freitas, Edison P.; Pereira, Carlos E.
The use of Model-Driven Engineering (MDE) techniques in the domain of distributed embedded real-time systems is gaining importance in order to cope with the increasing design complexity of such systems. This paper discusses an infrastructure created to build GenERTiCA, a flexible tool that supports an MDE approach, which uses aspect-oriented concepts to handle non-functional requirements from the embedded and real-time systems domain. GenERTiCA generates source code from UML models, and also performs weaving of aspects, which have been specified within the UML model. Additionally, this paper discusses the Distributed Embedded Real-Time Compact Specification (DERCS), a PIM created to support UML-based code generation tools. Some heuristics to transform UML models into DERCS, which have been implemented in GenERTiCA, are also discussed.
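A loose sketch of the general idea of template-based generation plus aspect weaving (this is not GenERTiCA's actual mapping-script mechanism; the template, class name and aspect below are hypothetical):

```python
# Each model class is rendered from a template; aspect adaptations are then
# applied to the generated text, mimicking how woven aspects address
# non-functional requirements.

CLASS_TEMPLATE = "class {name} {{\n{members}\n}}\n"

def generate_class(name, attributes, aspects=()):
    members = "\n".join(f"    {typ} {attr};" for attr, typ in attributes)
    code = CLASS_TEMPLATE.format(name=name, members=members)
    for weave in aspects:
        code = weave(code)
    return code

def timing_aspect(code):
    # hypothetical aspect adaptation: annotate the class with a deadline
    return "// @deadline(5ms)\n" + code

print(generate_class("SensorTask", [("period", "int")], [timing_aspect]))
```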
Use of General-purpose Negation Detection to Augment Concept Indexing of Medical Documents
Mutalik, Pradeep G.; Deshpande, Aniruddha; Nadkarni, Prakash M.
2001-01-01
Objectives: To test the hypothesis that most instances of negated concepts in dictated medical documents can be detected by a strategy that relies on tools developed for the parsing of formal (computer) languages—specifically, a lexical scanner (“lexer”) that uses regular expressions to generate a finite state machine, and a parser that relies on a restricted subset of context-free grammars, known as LALR(1) grammars. Methods: A diverse training set of 40 medical documents from a variety of specialties was manually inspected and used to develop a program (Negfinder) that contained rules to recognize a large set of negated patterns occurring in the text. Negfinder's lexer and parser were developed using tools normally used to generate programming language compilers. The input to Negfinder consisted of medical narrative that was preprocessed to recognize UMLS concepts: the text of a recognized concept had been replaced with a coded representation that included its UMLS concept ID. The program generated an index with one entry per instance of a concept in the document, where the presence or absence of negation of that concept was recorded. This information was used to mark up the text of each document by color-coding it to make it easier to inspect. The parser was then evaluated in two ways: 1) a test set of 60 documents (30 discharge summaries, 30 surgical notes) marked-up by Negfinder was inspected visually to quantify false-positive and false-negative results; and 2) a different test set of 10 documents was independently examined for negatives by a human observer and by Negfinder, and the results were compared. Results: In the first evaluation using marked-up documents, 8,358 instances of UMLS concepts were detected in the 60 documents, of which 544 were negations detected by the program and verified by human observation (true-positive results, or TPs). Thirteen instances were wrongly flagged as negated (false-positive results, or FPs), and the program missed 27 instances of negation (false-negative results, or FNs), yielding a sensitivity of 95.3 percent and a specificity of 97.7 percent. In the second evaluation using independent negation detection, 1,869 concepts were detected in 10 documents, with 135 TPs, 12 FPs, and 6 FNs, yielding a sensitivity of 95.7 percent and a specificity of 91.8 percent. One of the words “no,” “denies/denied,” “not,” or “without” was present in 92.5 percent of all negations. Conclusions: Negation of most concepts in medical narrative can be reliably detected by a simple strategy. The reliability of detection depends on several factors, the most important being the accuracy of concept matching. PMID:11687566
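A drastically simplified sketch of the trigger-plus-scope idea (not Negfinder's lexer/LALR(1) parser; the inline concept mark-up and window size are invented, while the trigger words are the four that the study found in 92.5% of negations):

```python
import re

NEGATION = re.compile(r"\b(no|not|without|denies|denied)\b", re.IGNORECASE)
CONCEPT = re.compile(r"\[CUI:(C\d{7})\]")   # assumed inline concept mark-up

def negated_concepts(sentence, window=40):
    """Flag concepts appearing within a short window after a negation
    trigger as negated."""
    hits = set()
    for trigger in NEGATION.finditer(sentence):
        scope = sentence[trigger.end():trigger.end() + window]
        hits.update(CONCEPT.findall(scope))
    return hits

text = "Patient denies [CUI:C0008031] and has no [CUI:C0013404]."
print(negated_concepts(text))   # -> {'C0008031', 'C0013404'}
```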
Gene and genon concept: coding versus regulation
2007-01-01
We analyse here the definition of the gene in order to distinguish, on the basis of modern insight in molecular biology, what the gene is coding for, namely a specific polypeptide, and how its expression is realized and controlled. Before the coding role of the DNA was discovered, a gene was identified with a specific phenotypic trait, from Mendel through Morgan up to Benzer. Subsequently, however, molecular biologists ventured to define a gene at the level of the DNA sequence in terms of coding. As is becoming ever more evident, the relations between information stored at DNA level and functional products are very intricate, and the regulatory aspects are as important and essential as the information coding for products. This approach led, thus, to a conceptual hybrid that confused coding, regulation and functional aspects. In this essay, we develop a definition of the gene that once again starts from the functional aspect. A cellular function can be represented by a polypeptide or an RNA. In the case of the polypeptide, its biochemical identity is determined by the mRNA prior to translation, and that is where we locate the gene. The steps from specific, but possibly separated sequence fragments at DNA level to that final mRNA then can be analysed in terms of regulation. For that purpose, we coin the new term “genon”. In that manner, we can clearly separate product and regulative information while keeping the fundamental relation between coding and function without the need to introduce a conceptual hybrid. In mRNA, the program regulating the expression of a gene is superimposed onto and added to the coding sequence in cis - we call it the genon. The complementary external control of a given mRNA by trans-acting factors is incorporated in its transgenon. A consequence of this definition is that, in eukaryotes, the gene is, in most cases, not yet present at DNA level. Rather, it is assembled by RNA processing, including differential splicing, from various pieces, as steered by the genon. It emerges finally as an uninterrupted nucleic acid sequence at mRNA level just prior to translation, in faithful correspondence with the amino acid sequence to be produced as a polypeptide. After translation, the genon has fulfilled its role and expires. The distinction between the protein coding information as materialised in the final polypeptide and the processing information represented by the genon allows us to set up a new information theoretic scheme. The standard sequence information determined by the genetic code expresses the relation between coding sequence and product. Backward analysis asks from which coding region in the DNA a given polypeptide originates. The (more interesting) forward analysis asks in how many polypeptides of how many different types a given DNA segment is expressed. This concerns the control of the expression process for which we have introduced the genon concept. Thus, the information theoretic analysis can capture the complementary aspects of coding and regulation, of gene and genon. PMID:18087760
Common spaceborne multicomputer operating system and development environment
NASA Technical Reports Server (NTRS)
Craymer, L. G.; Lewis, B. F.; Hayes, P. J.; Jones, R. L.
1994-01-01
A preliminary technical specification for a multicomputer operating system is developed. The operating system is targeted for spaceborne flight missions and provides a broad range of real-time functionality, dynamic remote code-patching capability, and system fault tolerance and long-term survivability features. Dataflow concepts are used for representing application algorithms. Functional features are included to ensure real-time predictability for a class of algorithms which require data-driven execution on an iterative steady state basis. The development environment supports the development of algorithm code, design of control parameters, performance analysis, simulation of real-time dataflow applications, and compiling and downloading of the resulting application.
Tucker, Carole A; Escorpizo, Reuben; Cieza, Alarcos; Lai, Jin Shei; Stucki, Gerold; Ustun, T. Bedirhan; Kostanjsek, Nenad; Cella, David; Forrest, Christopher B.
2014-01-01
Background The Patient Reported Outcomes Measurement Information System (PROMIS®) is a U.S. National Institutes of Health initiative that has produced self-reported item banks for physical, mental, and social health. Objective To describe the content of PROMIS at the item level using the World Health Organization’s International Classification of Functioning, Disability and Health (ICF). Methods All PROMIS adult items (publicly available as of 2012) were assigned to relevant ICF concepts. The content of the PROMIS adult item banks was then described using the mapped ICF code descriptors. Results The 1006 items in the PROMIS instruments could all be mapped to ICF concepts at the second level of classification, with the exception of 3 items of global or general health that mapped across the first-level classification of the ICF activity and participation component (d categories). Individual PROMIS item banks mapped to 1 to 5 separate ICF codes, indicating one-to-one, one-to-many and many-to-one mappings between PROMIS item banks and ICF second level classification codes. PROMIS supports measurement of the majority of major concepts in the ICF Body Functions (b) and Activity & Participation (d) components using PROMIS item banks or subsets of PROMIS items that could, with care, be used to develop customized instruments. Given that the focus of PROMIS is on measurement of person health outcomes, concepts in body structures (s) and some body functions (b), as well as many ICF environmental factors, have minimal coverage in PROMIS. Discussion The PROMIS-ICF mapped items provide a basis for users to evaluate the ICF-related content of specific PROMIS instruments, and to select PROMIS instruments in ICF-based measurement applications. PMID:24760532
Working research codes into fluid dynamics education: a science gateway approach
NASA Astrophysics Data System (ADS)
Mason, Lachlan; Hetherington, James; O'Reilly, Martin; Yong, May; Jersakova, Radka; Grieve, Stuart; Perez-Suarez, David; Klapaukh, Roman; Craster, Richard V.; Matar, Omar K.
2017-11-01
Research codes are effective for illustrating complex concepts in educational fluid dynamics courses: compared to textbook examples, an interactive three-dimensional visualisation can bring a problem to life! Various barriers, however, prevent the adoption of research codes in teaching: codes are typically created for highly-specific 'once-off' calculations and, as such, have no user interface and a steep learning curve. Moreover, a code may require access to high-performance computing resources that are not readily available in the classroom. This project allows academics to rapidly work research codes into their teaching via a minimalist 'science gateway' framework. The gateway is a simple, yet flexible, web interface allowing students to construct and run simulations, as well as view and share their output. Behind the scenes, the common operations of job configuration, submission, monitoring and post-processing are customisable at the level of shell scripting. In this talk, we demonstrate the creation of an example teaching gateway connected to the Code BLUE fluid dynamics software. Student simulations can be run via a third-party cloud computing provider or a local high-performance cluster. EPSRC, UK, MEMPHIS program Grant (EP/K003976/1), RAEng Research Chair (OKM).
A Strategy for Reusing the Data of Electronic Medical Record Systems for Clinical Research.
Matsumura, Yasushi; Hattori, Atsushi; Manabe, Shiro; Tsuda, Tsutomu; Takeda, Toshihiro; Okada, Katsuki; Murata, Taizo; Mihara, Naoki
2016-01-01
There is a great need to reuse data stored in electronic medical records (EMR) databases for clinical research. We previously reported the development of a system in which progress notes and case report forms (CRFs) were simultaneously recorded using a template in the EMR in order to exclude redundant data entry. To make the data collection process more efficient, we are developing a system in which the data originally stored in the EMR database can be populated within a frame in a template. We developed interface plugin modules that retrieve data from the databases of other EMR applications. A universal keyword written in a template master is converted to a local code using a data conversion table, then the objective data is retrieved from the corresponding database. The template element data, which are entered by a template, are stored in the template element database. To retrieve the data entered by other templates, the objective data is designated by the template element code with the template code, or by the concept code if it is written for the element. When the application systems in the EMR generate documents, they also generate a PDF file and a corresponding document profile XML, which includes important data, and send them to the document archive server and the data sharing server, respectively. In the data sharing server, the data are represented by an item with an item code with a document class code and its value. By linking a concept code to an item identifier, the objective data can be retrieved by designating a concept code. We employed a flexible strategy in which a unique identifier for a hospital is initially attached to all of the data that the hospital generates. The identifier is secondarily linked with concept codes. The data that are not linked with a concept code can also be retrieved using the unique identifier of the hospital. This strategy makes it possible to reuse any of a hospital's data.
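A minimal sketch of the keyword-to-local-code conversion described above (the table contents, site names and in-memory database are invented stand-ins for the real conversion table and EMR databases):

```python
CONVERSION_TABLE = {              # universal keyword -> hospital-local code
    ("WBC", "hospital_A"): "LAB0123",
    ("WBC", "hospital_B"): "L-00077",
}

def retrieve(keyword, site, local_db):
    """Convert a universal keyword from the template master into the local
    code for this site, then fetch the stored value from the site's EMR."""
    local_code = CONVERSION_TABLE[(keyword, site)]
    return local_db[local_code]

emr_db = {"LAB0123": 6.4}         # stand-in for one hospital's EMR database
print(retrieve("WBC", "hospital_A", emr_db))   # -> 6.4
```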
Miller, Andrew D
2015-02-01
A sense peptide can be defined as a peptide whose sequence is coded by the nucleotide sequence (read 5' → 3') of the sense (positive) strand of DNA. Conversely, an antisense (complementary) peptide is coded by the corresponding nucleotide sequence (read 5' → 3') of the antisense (negative) strand of DNA. Research has been accumulating steadily to suggest that sense peptides are capable of specific interactions with their corresponding antisense peptides. Unfortunately, although more and more examples of specific sense-antisense peptide interactions are emerging, the very idea of such interactions does not conform to standard biology dogma, and so there remains a sizeable challenge to lift this concept from being perceived as a peripheral phenomenon, if not worse, into becoming part of the scientific mainstream. Specific interactions have now been exploited for the inhibition of a number of widely different protein-protein and protein-receptor interactions in vitro and in vivo. Further, antisense peptides have also been used to induce the production of antibodies targeted to specific receptors or else the production of anti-idiotypic antibodies targeted against auto-antibodies. Such illustrations of utility would seem to suggest that observed sense-antisense peptide interactions are not just the consequence of a sequence of coincidental 'lucky-hits'. Indeed, at the very least, one might conclude that sense-antisense peptide interactions represent a potentially new and different source of leads for drug discovery. But could there be more to come from studies in this area? Studies on the potential mechanism of sense-antisense peptide interactions suggest that interactions may be driven by amino acid residue interactions specified from the genetic code. If so, such specified amino acid residue interactions could form the basis for an even wider amino acid residue interaction code (proteomic code) that links gene sequences to actual protein structure and function, even entire genomes to entire proteomes. The possibility that such a proteomic code should exist is discussed. So too the potential implications for biology and pharmaceutical science are discussed, were such a code to exist.
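Since the definitions of sense and antisense peptides above are purely operational, they can be written down directly; the following sketch (standard genetic code, hypothetical nine-base fragment) derives both peptides from a given sense strand:

```python
from itertools import product

BASES = "TCAG"   # standard codon table, bases ordered T, C, A, G
AMINO = "FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRRIIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG"
CODON_TABLE = {"".join(c): aa for c, aa in zip(product(BASES, repeat=3), AMINO)}
COMPLEMENT = str.maketrans("ACGT", "TGCA")

def translate_dna(strand):
    """Translate a DNA strand read 5' -> 3' into a one-letter peptide."""
    return "".join(CODON_TABLE[strand[i:i + 3]]
                   for i in range(0, len(strand) - 2, 3))

def antisense_strand(sense):
    """Complementary strand, also read 5' -> 3' (reverse complement)."""
    return sense.translate(COMPLEMENT)[::-1]

sense = "ATGGCTGAA"                             # hypothetical fragment
print(translate_dna(sense))                     # sense peptide: MAE
print(translate_dna(antisense_strand(sense)))   # antisense peptide: FSH
```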
Zydziak, Nicolas; Konrad, Waldemar; Feist, Florian; Afonin, Sergii; Weidner, Steffen; Barner-Kowollik, Christopher
2016-11-30
Designing artificial macromolecules with absolute sequence order represents a considerable challenge. Here we report an advanced light-induced avenue to monodisperse sequence-defined functional linear macromolecules up to decamers via a unique photochemical approach. The versatility of the synthetic strategy-combining sequential and modular concepts-enables the synthesis of perfect macromolecules varying in chemical constitution and topology. Specific functions are placed at arbitrary positions along the chain via the successive addition of monomer units and blocks, leading to a library of functional homopolymers, alternating copolymers and block copolymers. The in-depth characterization of each sequence-defined chain confirms the precision nature of the macromolecules. Decoding of the functional information contained in the molecular structure is achieved via tandem mass spectrometry without recourse to their synthetic history, showing that the sequence information can be read. We submit that the presented photochemical strategy is a viable and advanced concept for coding individual monomer units along a macromolecular chain.
Literature-based concept profiles for gene annotation: the issue of weighting.
Jelier, Rob; Schuemie, Martijn J; Roes, Peter-Jan; van Mulligen, Erik M; Kors, Jan A
2008-05-01
Text-mining has been used to link biomedical concepts, such as genes or biological processes, to each other for annotation purposes or the generation of new hypotheses. To relate two concepts to each other several authors have used the vector space model, as vectors can be compared efficiently and transparently. Using this model, a concept is characterized by a list of associated concepts, together with weights that indicate the strength of the association. The associated concepts in the vectors and their weights are derived from a set of documents linked to the concept of interest. An important issue with this approach is the determination of the weights of the associated concepts. Various schemes have been proposed to determine these weights, but no comparative studies of the different approaches are available. Here we compare several weighting approaches in a large scale classification experiment. Three different techniques were evaluated: (1) weighting based on averaging, an empirical approach; (2) the log likelihood ratio, a test-based measure; (3) the uncertainty coefficient, an information-theory based measure. The weighting schemes were applied in a system that annotates genes with Gene Ontology codes. As the gold standard for our study we used the annotations provided by the Gene Ontology Annotation project. Classification performance was evaluated by means of the receiver operating characteristics (ROC) curve using the area under the curve (AUC) as the measure of performance. All methods performed well with median AUC scores greater than 0.84, and scored considerably higher than a binary approach without any weighting. Especially for the more specific Gene Ontology codes excellent performance was observed. The differences between the methods were small when considering the whole experiment. However, the number of documents that were linked to a concept proved to be an important variable. When larger amounts of texts were available for the generation of the concepts' vectors, the performance of the methods diverged considerably, with the uncertainty coefficient then outperforming the two other methods.
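In outline, the vector space comparison described above looks like the sketch below; the cosine measure is standard, while the example profiles and weights are invented placeholders for whichever scheme (averaging, log likelihood ratio, or uncertainty coefficient) produced them:

```python
import math

def cosine(p, q):
    """Cosine similarity between two weighted concept profiles, each a
    mapping {associated concept -> weight}."""
    num = sum(w * q[c] for c, w in p.items() if c in q)
    den = math.sqrt(sum(w * w for w in p.values())) * \
          math.sqrt(sum(w * w for w in q.values()))
    return num / den if den else 0.0

gene_a = {"apoptosis": 2.1, "kinase": 0.9, "membrane": 0.2}
gene_b = {"apoptosis": 1.7, "kinase": 1.2}
print(round(cosine(gene_a, gene_b), 3))
```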
2012-01-01
Background Procedures documented by general practitioners in primary care have not been studied in relation to procedure coding systems. We aimed to describe procedures documented by Swedish general practitioners in electronic patient records and to compare them to the Swedish Classification of Health Interventions (KVÅ) and SNOMED CT. Methods Procedures in 200 record entries were identified, coded, assessed in relation to two procedure coding systems and analysed. Results 417 procedures found in the 200 electronic patient record entries were coded with 36 different Classification of Health Interventions categories and 148 different SNOMED CT concepts. 22.8% of the procedures could not be coded with any Classification of Health Interventions category and 4.3% could not be coded with any SNOMED CT concept. 206 procedure-concept/category pairs were assessed as a complete match in SNOMED CT compared to 10 in the Classification of Health Interventions. Conclusions Procedures documented by general practitioners were present in nearly all electronic patient record entries. Almost all procedures could be coded using SNOMED CT. Classification of Health Interventions covered the procedures to a lesser extent and with a much lower degree of concordance. SNOMED CT is a more flexible terminology system that can be used for different purposes for procedure coding in primary care. PMID:22230095
Coherent concepts are computed in the anterior temporal lobes.
Lambon Ralph, Matthew A; Sage, Karen; Jones, Roy W; Mayberry, Emily J
2010-02-09
In his Philosophical Investigations, Wittgenstein famously noted that the formation of semantic representations requires more than a simple combination of verbal and nonverbal features to generate conceptually based similarities and differences. Classical and contemporary neuroscience has tended to focus upon how different neocortical regions contribute to conceptualization through the summation of modality-specific information. The additional yet critical step of computing coherent concepts has received little attention. Some computational models of semantic memory are able to generate such concepts by the addition of modality-invariant information coded in a multidimensional semantic space. By studying patients with semantic dementia, we demonstrate that this aspect of semantic memory becomes compromised following atrophy of the anterior temporal lobes and, as a result, the patients become increasingly influenced by superficial rather than conceptual similarities.
Reliability of SNOMED-CT Coding by Three Physicians using Two Terminology Browsers
Chiang, Michael F.; Hwang, John C.; Yu, Alexander C.; Casper, Daniel S.; Cimino, James J.; Starren, Justin
2006-01-01
SNOMED-CT has been promoted as a reference terminology for electronic health record (EHR) systems. Many important EHR functions are based on the assumption that medical concepts will be coded consistently by different users. This study is designed to measure agreement among three physicians using two SNOMED-CT terminology browsers to encode 242 concepts from five ophthalmology case presentations in a publicly-available clinical journal. Inter-coder reliability, based on exact coding match by each physician, was 44% using one browser and 53% using the other. Intra-coder reliability testing revealed that a different SNOMED-CT code was obtained up to 55% of the time when the two browsers were used by one user to encode the same concept. These results suggest that the reliability of SNOMED-CT coding is imperfect, and may be a function of browsing methodology. A combination of physician training, terminology refinement, and browser improvement may help increase the reproducibility of SNOMED-CT coding. PMID:17238317
An Experiment in Scientific Code Semantic Analysis
NASA Technical Reports Server (NTRS)
Stewart, Mark E. M.
1998-01-01
This paper concerns a procedure that analyzes aspects of the meaning or semantics of scientific and engineering code. This procedure involves taking a user's existing code, adding semantic declarations for some primitive variables, and parsing this annotated code using multiple, distributed expert parsers. These semantic parsers are designed to recognize formulae in different disciplines, including physical and mathematical formulae and geometrical position in a numerical scheme. The parsers will automatically recognize and document some static, semantic concepts and locate some program semantic errors. Results are shown for a subroutine test case and a collection of combustion code routines. This ability to locate some semantic errors and document semantic concepts in scientific and engineering code should reduce the time, risk, and effort of developing and using these codes.
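As one concrete example of what a semantic check on declared primitive variables can do, consider dimensional consistency; the declaration format and the check below are invented for illustration, not the paper's notation:

```python
# Declared dimensions for primitive variables, as unit-exponent maps;
# e.g. a velocity is m^1 s^-1. A parser can then verify that expressions
# such as velocity * time reduce to the dimensions of a length.

DECLARATIONS = {
    "velocity": {"m": 1, "s": -1},
    "time":     {"s": 1},
    "length":   {"m": 1},
}

def multiply(d1, d2):
    """Combine the unit exponents of two quantities being multiplied."""
    out = dict(d1)
    for unit, exp in d2.items():
        out[unit] = out.get(unit, 0) + exp
    return {u: e for u, e in out.items() if e != 0}

assert multiply(DECLARATIONS["velocity"], DECLARATIONS["time"]) \
       == DECLARATIONS["length"]   # velocity * time is a length: consistent
```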
2015-02-01
with other nations. By May 2012, General Raymond T. Odierno, Chief of Staff of the Army (CSA), had expanded the concept of Regionally Aligned Brigades... includes the Reserve Components (RC), which consist of the Army Reserve (AR) and the National Guard (NG). Army civilians may also be included. Each force component may deal in different ways with various general and specific authorizations. Title 10 of the United States Code provides the legal basis
1993-04-01
specify what happens if an error is encountered. This is most useful, for example, in the specification of a user interface.
Musick, Charles R [Castro Valley, CA; Critchlow, Terence [Livermore, CA; Ganesh, Madhaven [San Jose, CA; Slezak, Tom [Livermore, CA; Fidelis, Krzysztof [Brentwood, CA
2006-12-19
A system and method is disclosed for integrating and accessing multiple data sources within a data warehouse architecture. The metadata formed by the present method provide a way to declaratively present domain specific knowledge, obtained by analyzing data sources, in a consistent and useable way. Four types of information are represented by the metadata: abstract concepts, databases, transformations and mappings. A mediator generator automatically generates data management computer code based on the metadata. The resulting code defines a translation library and a mediator class. The translation library provides a data representation for domain specific knowledge represented in a data warehouse, including "get" and "set" methods for attributes that call transformation methods and derive a value of an attribute if it is missing. The mediator class defines methods that take "distinguished" high-level objects as input and traverse their data structures and enter information into the data warehouse.
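A toy sketch of the generated translation-library behavior described above, where a "get" method derives a missing attribute value by calling a transformation method; the class and attribute names are hypothetical:

```python
class GeneRecord:
    """Toy stand-in for a generated mediator/translation-library class."""

    def __init__(self, start=None, end=None, length=None):
        self._attrs = {"start": start, "end": end, "length": length}

    def get(self, name):
        value = self._attrs[name]
        if value is None:                    # derive a missing attribute
            value = getattr(self, f"_derive_{name}")()
            self._attrs[name] = value
        return value

    def _derive_length(self):                # transformation method
        return self.get("end") - self.get("start")

record = GeneRecord(start=100, end=460)      # length absent in the source
print(record.get("length"))                  # -> 360, derived on access
```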
Performance (Off-Design) Cycle Analysis for a Turbofan Engine With Interstage Turbine Burner
NASA Technical Reports Server (NTRS)
Liew, K. H.; Urip, E.; Yang, S. L.; Mattingly, J. D.; Marek, C. J.
2005-01-01
This report presents the performance of a steady-state, dual-spool, separate-exhaust turbofan engine, with an interstage turbine burner (ITB) serving as a secondary combustor. The ITB, which is located in the transition duct between the high- and the low-pressure turbines, is a relatively new concept for increasing specific thrust and lowering pollutant emissions in modern jet-engine propulsion. A detailed off-design performance analysis of ITB engines is written in Microsoft(Registered Trademark) Excel (Redmond, Washington) macro code with Visual Basic for Applications to calculate engine performance over the entire operating envelope. Several design-point engine cases are pre-selected, using a parametric cycle-analysis code developed previously in Microsoft(Registered Trademark) Excel, for off-design analysis. The off-design code calculates engine performance (i.e., thrust and thrust specific fuel consumption) at various flight conditions and throttle settings.
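For orientation, the two figures of merit named above have simple standard definitions; a small sketch (the operating-point numbers are invented, not results from the report):

```python
def specific_thrust(thrust_n, inlet_airflow_kg_s):
    """Thrust per unit of inlet air mass flow, N/(kg/s)."""
    return thrust_n / inlet_airflow_kg_s

def tsfc(fuel_flow_kg_s, thrust_n):
    """Thrust specific fuel consumption, kg/(N*s)."""
    return fuel_flow_kg_s / thrust_n

F, mdot0, mdot_fuel = 50_000.0, 80.0, 1.4   # invented operating point
print(specific_thrust(F, mdot0))            # -> 625.0 N/(kg/s)
print(tsfc(mdot_fuel, F))                   # -> 2.8e-05 kg/(N*s)
```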
Generating Safety-Critical PLC Code From a High-Level Application Software Specification
NASA Technical Reports Server (NTRS)
2008-01-01
The benefits of automatic-application code generation are widely accepted within the software engineering community. These benefits include raised abstraction level of application programming, shorter product development time, lower maintenance costs, and increased code quality and consistency. Surprisingly, code generation concepts have not yet found wide acceptance and use in the field of programmable logic controller (PLC) software development. Software engineers at Kennedy Space Center recognized the need for PLC code generation while developing the new ground checkout and launch processing system, called the Launch Control System (LCS). Engineers developed a process and a prototype software tool that automatically translates a high-level representation or specification of application software into ladder logic that executes on a PLC. All the computer hardware in the LCS is planned to be commercial off the shelf (COTS), including industrial controllers or PLCs that are connected to the sensors and end items out in the field. Most of the software in LCS is also planned to be COTS, with only small adapter software modules that must be developed in order to interface between the various COTS software products. A domain-specific language (DSL) is a programming language designed to perform tasks and to solve problems in a particular domain, such as ground processing of launch vehicles. The LCS engineers created a DSL for developing test sequences of ground checkout and launch operations of future launch vehicle and spacecraft elements, and they are developing a tabular specification format that uses the DSL keywords and functions familiar to the ground and flight system users. The tabular specification format, or tabular spec, allows most ground and flight system users to document how the application software is intended to function and requires little or no software programming knowledge or experience. A small sample from a prototype tabular spec application is shown.
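To make the idea concrete, here is a hypothetical sketch of spec-table-to-controller-code generation; the tabular format, signal names, and the structured-text output (standing in for the ladder logic the LCS tool emits) are all invented:

```python
# Each row of a hypothetical tabular spec names a step, an output signal,
# and the condition under which the output is asserted; the generator
# emits one structured-text statement per row.

SPEC = [
    {"step": "open_valve", "output": "PV101", "when": "PRESS_OK"},
    {"step": "start_pump", "output": "PMP01", "when": "PV101_OPEN"},
]

def generate_structured_text(spec):
    lines = []
    for row in spec:
        lines.append(f"(* {row['step']} *)")
        lines.append(f"IF {row['when']} THEN {row['output']} := TRUE; END_IF;")
    return "\n".join(lines)

print(generate_structured_text(SPEC))
```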
ERIC Educational Resources Information Center
Lynch, Beth Eloise
This study was conducted to determine whether the filmic coding elements of split screen, slow motion, generated line cues, the zoom of a camera, and rotation could aid in the development of the Euclidean space concepts of horizontality and verticality, and to explore presence and development of spatial skills involving these two concepts in…
Decree No. 2737 issuing the Code of Minors, 27 November 1989.
1989-01-01
This document contains major provisions of the 1989 Code of Minors of Colombia. This Code spells out the rights of minors to protection, care, and adequate physical, mental, and social development. These rights go into force from the moment of conception. Minors have a specified right to life; to a defined filiation; to grow up within a family; to receive an education (compulsory to the ninth grade and free of charge); to be protected from abuse; to health care; to freedom of speech and to know their rights; to liberty of thought, conscience, and religion; to rest, recreation, and play; to participate in sports and the arts; and to be protected from labor exploitation. Handicapped minors have the right to care, education, and special training. Minors also have the right to be protected from the use of dependency-creating drugs. Any minor in an "irregular situation" will receive protective services. The Code defines abandoned minors and those in danger and provides specific protective measures which can be taken. Rules and procedures covering adoption are included in the Code, because adoption is viewed as primarily a protective measure.
Analyses in support of risk-informed natural gas vehicle maintenance facility codes and standards :
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ekoto, Isaac W.; Blaylock, Myra L.; LaFleur, Angela Christine
2014-03-01
Safety standards development for maintenance facilities of liquid and compressed gas fueled large-scale vehicles is required to ensure proper facility design and operation envelopes. Standards development organizations are utilizing risk-informed concepts to develop natural gas vehicle (NGV) codes and standards so that maintenance facilities meet acceptable risk levels. The present report summarizes Phase I work for existing NGV repair facility code requirements and highlights inconsistencies that need quantitative analysis into their effectiveness. A Hazardous and Operability study was performed to identify key scenarios of interest. Finally, scenario analyses were performed using detailed simulations and modeling to estimate the overpressure hazards from HAZOP defined scenarios. The results from Phase I will be used to identify significant risk contributors at NGV maintenance facilities, and are expected to form the basis for follow-on quantitative risk analysis work to address specific code requirements and identify effective accident prevention and mitigation strategies.
Hypersonic vehicle simulation model: Winged-cone configuration
NASA Technical Reports Server (NTRS)
Shaughnessy, John D.; Pinckney, S. Zane; Mcminn, John D.; Cruz, Christopher I.; Kelley, Marie-Louise
1990-01-01
Aerodynamic, propulsion, and mass models for a generic, horizontal-takeoff, single-stage-to-orbit (SSTO) configuration are presented which are suitable for use in point mass as well as batch and real-time six degree-of-freedom simulations. The simulations can be used to investigate ascent performance issues and to allow research, refinement, and evaluation of integrated guidance/flight/propulsion/thermal control systems, design concepts, and methodologies for SSTO missions. Aerodynamic force and moment coefficients are given as functions of angle of attack, Mach number, and control surface deflections. The model data were estimated by using a subsonic/supersonic panel code and a hypersonic local surface inclination code. Thrust coefficient and engine specific impulse were estimated using a two-dimensional forebody, inlet, nozzle code and a one-dimensional combustor code and are given as functions of Mach number, dynamic pressure, and fuel equivalence ratio. Rigid-body mass moments of inertia and center of gravity location are functions of vehicle weight which is in turn a function of fuel flow.
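In a simulation, model data of this kind are typically consumed as interpolated table lookups; a sketch with placeholder breakpoints and coefficients (not the report's data):

```python
import numpy as np

alpha_deg = np.array([0.0, 5.0, 10.0])          # placeholder breakpoints
mach      = np.array([2.0, 6.0, 10.0])
CL_table  = np.array([[0.00, 0.00, 0.00],       # rows: alpha; cols: Mach
                      [0.12, 0.08, 0.06],
                      [0.26, 0.18, 0.13]])

def lift_coefficient(a, m):
    """Bilinear table lookup: interpolate along Mach within each alpha
    row, then along alpha."""
    cl_vs_alpha = [np.interp(m, mach, row) for row in CL_table]
    return float(np.interp(a, alpha_deg, cl_vs_alpha))

print(lift_coefficient(7.5, 4.0))
```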
Sonea, Laura; Buse, Mihail; Gulei, Diana; Onaciu, Anca; Simon, Ioan; Braicu, Cornelia; Berindan-Neagoe, Ioana
2018-05-01
Lung cancer remains the leading cause of cancer mortality worldwide; it needs to be further investigated to reduce these dramatic, unfavorable statistics. Non-coding RNAs (ncRNAs) have been shown to be important cellular regulatory factors, and the alteration of their expression levels has been correlated with an extensive number of pathologies. Specifically, their expression profiles are correlated with the development and progression of lung cancer, generating great interest for further investigation. This review focuses on the complex role of non-coding RNAs, namely miRNAs, piwi-interacting RNAs, small nucleolar RNAs, long non-coding RNAs and circular RNAs, in the process of developing novel biomarkers for diagnostic and prognostic factors that can then be utilized for personalized therapies toward this devastating disease. To support the concept of personalized medicine, we will focus on the roles of miRNAs in lung cancer tumorigenesis, their use as diagnostic and prognostic biomarkers and their application for patient therapy.
NASA Technical Reports Server (NTRS)
1991-01-01
In recognition of a deficiency in the current modeling capability for seals, an effort was established by NASA to develop verified computational fluid dynamic concepts, codes, and analyses for seals. The objectives were to develop advanced concepts for the design and analysis of seals, to effectively disseminate the information to potential users by way of annual workshops, and to provide experimental verification for the models and codes under a wide range of operating conditions.
NASA Technical Reports Server (NTRS)
McGuire, Tim
1998-01-01
In this paper, we report the results of our recent research on the application of a multiprocessor Cray T916 supercomputer in modeling super-thermal electron transport in the earth's magnetic field. In general, this mathematical model requires numerical solution of a system of partial differential equations. The code we use for this model is moderately vectorized. By using Amdahl's Law for vector processors, it can be verified that the code is about 60% vectorized on a Cray computer. Speedup factors on the order of 2.5 were obtained compared to the unvectorized code. In the following sections, we discuss the methodology of improving the code. In addition to our goal of optimizing the code for solution on the Cray computer, we had the goal of scalability in mind. Scalability combines the concepts of portability with near-linear speedup. Specifically, a scalable program is one whose performance is portable across many different architectures with differing numbers of processors for many different problem sizes. Though we have access to a Cray at this time, the goal was to also have code which would run well on a variety of architectures.
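The 60%/2.5x figures above are mutually consistent under Amdahl's law, as this small check shows (the formula is standard; only the illustrative call is ours):

```python
def amdahl_vector_speedup(f, v):
    """Overall speedup when a fraction f of the work is vectorized and the
    vector unit runs that fraction v times faster than scalar code."""
    return 1.0 / ((1.0 - f) + f / v)

# As v grows, the speedup approaches 1/(1 - f); for f = 0.6 that limit is
# 2.5, matching the ~60% vectorization and ~2.5x speedup quoted above.
print(amdahl_vector_speedup(0.6, 1e6))  # -> ~2.5
```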
NASA Astrophysics Data System (ADS)
Braun, Walter; Eglin, Peter; Abello, Ricard
1993-02-01
Spread Spectrum Code Division Multiplex is an attractive scheme for the transmission of multiple signals over a satellite transponder. By using orthogonal or quasi-orthogonal spreading codes, the interference between the users can be virtually eliminated. However, the acquisition and tracking of the spreading code phase cannot take advantage of the code orthogonality, since sequential acquisition and delay-locked loop tracking depend on correlation with code phases other than the optimal despreading phase. Hence, synchronization is a critical issue in such a system. Demonstration hardware for the verification of the orthogonal CDM synchronization and data transmission concept is being designed and implemented. The system concept, the synchronization scheme, and the implementation are described. The performance of the system is discussed based on computer simulations.
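A small sketch of why aligned orthogonal codes null out inter-user interference while a misaligned code does not (Walsh codes from a Hadamard matrix; the bit pattern is arbitrary):

```python
import numpy as np

def hadamard(n):
    """Build an n x n Hadamard matrix (n a power of two); its rows are
    mutually orthogonal Walsh codes."""
    H = np.array([[1]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

codes = hadamard(8)                     # one spreading code per user
bits = np.array([1, -1, 1, 1, -1, 1, -1, -1])
tx = bits @ codes                       # all 8 users superposed, chip by chip

# Despreading with user 3's aligned code recovers that user's bit exactly:
print((tx @ codes[3]) / 8)              # -> 1.0 (equals bits[3])

# A one-chip misalignment destroys the orthogonality, so the other users
# leak in -- which is why code-phase synchronization is the critical issue:
print((tx @ np.roll(codes[3], 1)) / 8)  # generally non-zero interference
```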
Kaiser Permanente's Convergent Medical Terminology.
Dolin, Robert H; Mattison, John E; Cohn, Simon; Campbell, Keith E; Wiesenthal, Andrew M; Hochhalter, Brad; LaBerge, Diane; Barsoum, Rita; Shalaby, James; Abilla, Alan; Clements, Robert J; Correia, Carol M; Esteva, Diane; Fedack, John M; Goldberg, Bruce J; Gopalarao, Sridhar; Hafeza, Eza; Hendler, Peter; Hernandez, Enrique; Kamangar, Ron; Kahn, Rafique A; Kurtovich, Georgina; Lazzareschi, Gerry; Lee, Moon H; Lee, Tracy; Levy, David; Lukoff, Jonathan Y; Lundberg, Cyndie; Madden, Michael P; Ngo, Trongtu L; Nguyen, Ben T; Patel, Nikhilkumar P; Resneck, Jim; Ross, David E; Schwarz, Kathleen M; Selhorst, Charles C; Snyder, Aaron; Umarji, Mohamed I; Vilner, Max; Zer-Chen, Roy; Zingo, Chris
2004-01-01
This paper describes Kaiser Permanente's (KP) enterprise-wide medical terminology solution, referred to as our Convergent Medical Terminology (CMT). Initially developed to serve the needs of a regional electronic health record, CMT has evolved into a core KP asset, serving as the common terminology across all applications. CMT serves as the definitive source of concept definitions for the organization, provides a consistent structure and access method to all codes used by the organization, and is KP's language of interoperability, with cross-mappings to regional ancillary systems and administrative billing codes. The core of CMT comprises SNOMED CT, laboratory LOINC, and First DataBank drug terminology. These are integrated into a single poly-hierarchically structured knowledge base. Cross map sets provide bi-directional translations between CMT and ancillary applications and administrative billing codes. Context sets provide subsets of CMT for use in specific contexts. Our experience with CMT has led us to conclude that a successful terminology solution requires that: (1) usability considerations are an organizational priority; (2) "interface" terminology is differentiated from "reference" terminology; (3) it be easy for clinicians to find the concepts they need; (4) the immediate value of coded data be apparent to the clinician user; (5) there be a well defined approach to terminology extensions. Over the past several years, there has been substantial progress made in the domain coverage and standardization of medical terminology. KP has learned to exploit that terminology in ways that are clinician-acceptable and that provide powerful options for data analysis and reporting.
Inclusion of pressure and flow in a new 3D MHD equilibrium code
NASA Astrophysics Data System (ADS)
Raburn, Daniel; Fukuyama, Atsushi
2012-10-01
Flow and nonsymmetric effects can play a large role in plasma equilibria and energy confinement. A concept for a 3D equilibrium code incorporating such effects was developed and presented in 2011. The code is called the Kyoto ITerative Equilibrium Solver (KITES) [1], and the concept is based largely on the PIES code [2]. More recently, the work-in-progress KITES code was used to calculate force-free equilibria. Here, progress and results on the inclusion of pressure and flow in the code are presented. [1] Daniel Raburn and Atsushi Fukuyama, Plasma and Fusion Research: Regular Articles, 7:240381 (2012). [2] H. S. Greenside, A. H. Reiman, and A. Salas, J. Comput. Phys., 81(1):102-136 (1989).
Fedakar, Recep; Aydiner, Ahmet Hüsamettin; Ercan, Ilker
2007-07-01
To compare the accuracy and suitability of the Glasgow Coma Scale (GCS), the Revised Trauma Score (RTS), the Injury Severity Score (ISS), the New Injury Severity Score (NISS) and the Trauma and Injury Severity Score (TRISS) (scoring systems widely used in international trauma studies) in evaluating the "life threatening injury" concept established by the Turkish Penal Code. The age, sex, type of trauma, type and localization of wounds, GCS, RTS, ISS, NISS and TRISS values, and the decision of life threatening injury of 627 trauma patients admitted to the Emergency Department of the Uludag University Medical School Hospital in 2003 were examined. A life-threatening injury was present in 35.2% of the cases examined. GCS, RTS, ISS, NISS and TRISS confirmed the decision of life threatening injury in 74.8%, 76.9%, 88.7%, 86.6% and 68.6% of cases, respectively. The best cut-off point, 14, was found for the ISS, with 79.6% sensitivity and 93.6% specificity. All of the cases with a sole linear skull fracture officially judged as life threatening had an ISS of 5, a NISS of 6 and the best scores of GCS (15), RTS (7.8408) and TRISS (100%). ISS and NISS appeared to be the best trauma scoring systems for the decision of life threatening injury, compared with GCS, RTS and TRISS. Thus, ISS and NISS are acceptable for evaluating the life threatening injury concept established by the Turkish Penal Code.
NASA Astrophysics Data System (ADS)
Zhao, Hui; Wei, Jingxuan
2014-09-01
The key to the concept of tunable wavefront coding lies in detachable phase masks. Ojeda-Castaneda et al. (Progress in Electronics Research Symposium Proceedings, Cambridge, USA, July 5-8, 2010) described a typical design in which two components with cosinusoidal phase variation operate together to make defocus sensitivity tunable. The present study proposes an improved design and makes three contributions: (1) A mathematical derivation based on the stationary phase method explains why the detachable phase mask of Ojeda-Castaneda et al. tunes the defocus sensitivity. (2) The mathematical derivations show that the effective bandwidth of the wavefront coded imaging system is also tunable, by making each component of the detachable phase mask move asymmetrically. An improved Fisher information-based optimization procedure was also designed to ascertain the optimal mask parameters corresponding to a specific bandwidth. (3) Possible applications of the tunable bandwidth are demonstrated by simulated imaging.
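The tuning mechanism can be hinted at with a one-line trigonometric identity (our notation, not the paper's derivation): if the two cosinusoidal components are laterally displaced so that they contribute phases shifted by +s and -s, then

```latex
\varphi_{\text{total}}(x) = a\cos(kx+s) + a\cos(kx-s) = 2a\cos(s)\,\cos(kx),
```

so the joint modulation depth, and with it the defocus sensitivity, varies continuously with the relative displacement s.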
Formal proof of the AVM-1 microprocessor using the concept of generic interpreters
NASA Technical Reports Server (NTRS)
Windley, P.; Levitt, K.; Cohen, G. C.
1991-01-01
A microprocessor designated AVM-1 was designed to demonstrate the use of generic interpreters in verifying hierarchically decomposed microprocessor specifications. This report is intended to document the high-order language (HOL) code verifying AVM-1. The organization of the proof is discussed and some technical details concerning the execution of the proof scripts in HOL are presented. The proof scripts used to verify AVM-1 are also presented.
Ecologic study of children's use of a computer nutrition education program.
Matheson, D; Achterberg, C
2001-01-01
The purpose of this research was to describe the context created by students as they worked in groups on a nutrition computer-assisted instruction (CAI) program. Students worked on the program in groups of three. Observational methods were used to collect data from students in two sixth-grade classrooms that were part of an experimental program designed to restructure the educational process. Thirty-two students, from 12 groups, were observed as they completed the program. The groups were assigned by the teachers according to standard principles of cooperative learning. Students completed "Ship to Shore," a program designed specifically for this research. The program required three to five 50-minute classroom periods to complete. The objectives of the program were to change children's knowledge structure of basic nutrition concepts and to increase children's critical thinking skills related to nutrition concepts. We collected observational data focused on three domains: (1) student-computer interaction, (2) student-student interaction, and (3) students' thinking and learning skills. Grounded theory methods were used to analyze the data. Specifically, the constant-comparative method was used to develop open coding categories, defined by properties and described by dimensions. The open coding categories were in turn used in axial coding to differentiate students' learning styles. Five styles of student interaction were defined. These included (1) dominant directors (n = 6; 19%), (2) passive actors (n = 5; 16%), (3) action-oriented students (n = 7; 22%), (4) content-oriented students (n = 8; 25%), and (5) problem solvers (n = 5; 16%). The "student style" groups were somewhat gender specific. The dominant directors and passive actors were girls and the action-oriented and content-oriented students were boys. The problem solvers group was mixed gender. Children's responses to computer-based nutrition education are highly variable. Based on the results of this research, nutrition educators may recommend that nutrition CAI programs be implemented in mixed gender groups.
Cipriano, Andrea; Ballarino, Monica
2018-01-01
The completion of the human genome sequence together with advances in sequencing technologies have shifted the paradigm of the genome, as composed of discrete and hereditable coding entities, and have shown the abundance of functional noncoding DNA. This part of the genome, previously dismissed as “junk” DNA, increases proportionally with organismal complexity and contributes to gene regulation beyond the boundaries of known protein-coding genes. Different classes of functionally relevant nonprotein-coding RNAs are transcribed from noncoding DNA sequences. Among them are the long noncoding RNAs (lncRNAs), which are thought to participate in the basal regulation of protein-coding genes at both transcriptional and post-transcriptional levels. Although knowledge of this field is still limited, the ability of lncRNAs to localize in different cellular compartments, to fold into specific secondary structures and to interact with different molecules (RNA or proteins) endows them with multiple regulatory mechanisms. It is becoming evident that lncRNAs may play a crucial role in most biological processes such as the control of development, differentiation and cell growth. This review places the evolution of the concept of the gene in its historical context, from Darwin's hypothetical mechanism of heredity to the post-genomic era. We discuss how the original idea of protein-coding genes as unique determinants of phenotypic traits has been reconsidered in light of the existence of noncoding RNAs. We summarize the technological developments which have been made in the genome-wide identification and study of lncRNAs and emphasize the methodologies that have aided our understanding of the complexity of lncRNA-protein interactions in recent years. PMID:29560353
Mission Analysis for High Specific Impulse Deep Space Exploration
NASA Technical Reports Server (NTRS)
Adams, Robert B.; Polsgrove, Tara; Brady, Hugh J. (Technical Monitor)
2002-01-01
This paper describes trajectory calculations for high specific impulse engines. Specific impulses on the order of 10,000 to 100,000 sec are predicted in a variety of fusion powered propulsion systems. This paper and its companion paper seek to build on analyses in the literature to yield an analytical routine for determining time of flight and payload fraction to a predetermined destination. The companion paper will compare the results of this analysis to the trajectories determined by several trajectory codes. The major parameters that affect time of flight and payload fraction will be identified and their sensitivities quantified. A review of existing fusion propulsion concepts and their capabilities will also be tabulated.
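For context, the leverage of such specific impulses follows directly from the Tsiolkovsky rocket equation; a small check with an assumed 100 km/s delta-v (illustrative, not a mission figure from the paper):

```python
import math

G0 = 9.80665  # standard gravity, m/s^2

def mass_ratio(delta_v_m_s, isp_s):
    """Tsiolkovsky rocket equation: fraction of initial mass remaining
    (structure plus payload) after a burn of the given delta-v."""
    return math.exp(-delta_v_m_s / (isp_s * G0))

# Assumed 100 km/s mission delta-v, spanning the quoted Isp range:
for isp in (10_000, 100_000):
    print(isp, round(mass_ratio(100_000.0, isp), 3))
# Isp = 10,000 s  -> ~0.361 of initial mass is not propellant
# Isp = 100,000 s -> ~0.903
```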
Code Mixing and Modernization across Cultures.
ERIC Educational Resources Information Center
Kamwangamalu, Nkonko M.
A review of recent studies addressed the functional uses of code mixing across cultures. Expressions of code mixing (CM) are not random; in fact, a number of functions of code mixing can easily be delineated, for example, the concept of "modernization.""Modernization" is viewed with respect to how bilingual code mixers perceive…
Design and construction of functional AAV vectors.
Gray, John T; Zolotukhin, Serge
2011-01-01
Using the basic principles of molecular biology and laboratory techniques presented in this chapter, researchers should be able to create a wide variety of AAV vectors for both clinical and basic research applications. Basic vector design concepts are covered for both protein coding gene expression and small non-coding RNA gene expression cassettes. AAV plasmid vector backbones (available via AddGene) are described, along with critical sequence details for a variety of modular expression components that can be inserted as needed for specific applications. Protocols are provided for assembling the various DNA components into AAV vector plasmids in Escherichia coli, as well as for transferring these vector sequences into baculovirus genomes for large-scale production of AAV in the insect cell production system.
Photoactivatable Mussel-Based Underwater Adhesive Proteins by an Expanded Genetic Code.
Hauf, Matthias; Richter, Florian; Schneider, Tobias; Faidt, Thomas; Martins, Berta M; Baumann, Tobias; Durkin, Patrick; Dobbek, Holger; Jacobs, Karin; Möglich, Andreas; Budisa, Nediljko
2017-09-19
Marine mussels exhibit potent underwater adhesion abilities under hostile conditions by employing 3,4-dihydroxyphenylalanine (DOPA)-rich mussel adhesive proteins (MAPs). However, their recombinant production is a major biotechnological challenge. Herein, a novel strategy based on genetic code expansion has been developed by engineering efficient aminoacyl-transfer RNA synthetases (aaRSs) for the photocaged noncanonical amino acid ortho-nitrobenzyl DOPA (ONB-DOPA). The engineered ONB-DOPARS enables in vivo production of MAP type 5 site-specifically equipped with multiple instances of ONB-DOPA to yield photocaged, spatiotemporally controlled underwater adhesives. Upon exposure to UV light, these proteins feature elevated wet adhesion properties. This concept offers new perspectives for the production of recombinant bioadhesives. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Astrophysics Data System (ADS)
Soponyono, Eko; Deva Bernadhi, Brav
2017-04-01
The development of national legal systems aims to establish public welfare and the protection of the public. Many attempts have been carried out to renew material criminal law, and those efforts have resulted in the formulation of a draft criminal code (the concept of the draft Law Book of the Law of Criminal Law). The basic ideas in drafting rules and regulations based on the values of the ideology of Pancasila are balance among the various norms and rules in society. The design concept of the new Criminal Code Act is anticipatory and proactive in formulating provisions on crime in cyberspace and crime in information and electronic transactions. The issues addressed in this paper are whether the policy on the formulation of cyber crime is embodied in the provisions of the current legislation, and how cyber crime is formulated in the recent draft criminal code.
Wall interference assessment and corrections
NASA Technical Reports Server (NTRS)
Newman, P. A.; Kemp, W. B., Jr.; Garriz, J. A.
1989-01-01
Wind tunnel wall interference assessment and correction (WIAC) concepts, applications, and typical results are discussed in terms of several nonlinear transonic codes and one panel method code developed for and being implemented at NASA-Langley. Contrasts between 2-D and 3-D transonic testing factors which affect WIAC procedures are illustrated using airfoil data from the 0.3 m Transonic Cryogenic Tunnel and Pathfinder 1 data from the National Transonic Facility. Initial results from the 3-D WIAC codes are encouraging; research on and implementation of WIAC concepts continue.
Using Coding Apps to Support Literacy Instruction and Develop Coding Literacy
ERIC Educational Resources Information Center
Hutchison, Amy; Nadolny, Larysa; Estapa, Anne
2016-01-01
In this article the authors present the concept of Coding Literacy and describe the ways in which coding apps can support the development of Coding Literacy and disciplinary and digital literacy skills. Through detailed examples, we describe how coding apps can be integrated into literacy instruction to support learning of the Common Core English…
A Semantic Analysis Method for Scientific and Engineering Code
NASA Technical Reports Server (NTRS)
Stewart, Mark E. M.
1998-01-01
This paper develops a procedure to statically analyze aspects of the meaning or semantics of scientific and engineering code. The analysis involves adding semantic declarations to a user's code and parsing this semantic knowledge with the original code using multiple expert parsers. These semantic parsers are designed to recognize formulae in different disciplines including physical and mathematical formulae and geometrical position in a numerical scheme. In practice, a user would submit code with semantic declarations of primitive variables to the analysis procedure, and its semantic parsers would automatically recognize and document some static, semantic concepts and locate some program semantic errors. A prototype implementation of this analysis procedure is demonstrated. Further, the relationship between the fundamental algebraic manipulations of equations and the parsing of expressions is explained. This ability to locate some semantic errors and document semantic concepts in scientific and engineering code should reduce the time, risk, and effort of developing and using these codes.
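The core idea, declaring the physical meaning of primitive variables and letting a parser propagate and check that meaning through expressions, can be illustrated with a toy dimensional checker. The sketch below is hypothetical Python, not the paper's expert parsers; the declaration table and helper names are invented for illustration.

    # Semantic declarations: variable -> physical unit exponents (m, s, kg)
    DECLS = {
        "distance": {"m": 1},
        "time":     {"s": 1},
        "velocity": {"m": 1, "s": -1},
    }

    def mul_units(a, b):
        """Units of a product: add exponents, dropping zeros."""
        out = dict(a)
        for dim, exp in b.items():
            out[dim] = out.get(dim, 0) + exp
            if out[dim] == 0:
                del out[dim]
        return out

    def div_units(a, b):
        """Units of a quotient: subtract exponents."""
        return mul_units(a, {d: -e for d, e in b.items()})

    def check_assign(lhs, rhs_units):
        """Flag a semantic error when declared and derived units disagree."""
        status = "ok" if DECLS[lhs] == rhs_units else "semantic error"
        print(f"{status}: {lhs} <- {rhs_units} (declared {DECLS[lhs]})")

    check_assign("velocity", div_units(DECLS["distance"], DECLS["time"]))  # ok
    check_assign("velocity", mul_units(DECLS["distance"], DECLS["time"]))  # flagged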
Thermomechanical analysis of fast-burst reactors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, J.D.
1994-08-01
Fast-burst reactors are designed to provide intense, short-duration pulses of neutrons. The fission reaction also produces extreme time-dependent heating of the nuclear fuel. An existing transient-dynamic finite element code was modified specifically to compute the time-dependent stresses and displacements due to thermal shock loads of reactors. Thermomechanical analysis was then applied to determine structural feasibility of various concepts for an EDNA-type reactor and to optimize the mechanical design of the new SPR III-M reactor.
[Representation of knowledge in respiratory medicine: ontology should help the coding process].
Blanc, F-X; Baneyx, A; Charlet, J; Housset, B
2010-09-01
Access to medical knowledge is a major issue for health professionals and requires the development of terminologies. The objective of the reported work was to construct an ontology of respiratory medicine, i.e. an organized and formalized terminology composed of specific knowledge. The purpose is to help the medico-economical coding process and to represent the relevant knowledge about the patient. Our research covers the whole life cycle of an ontology, from the development of a methodology, to building the ontology from texts, to its use in an operational system. A computerized tool, based on the ontology, allows both medico-economical coding and graphical medical coding; the latter will be used to index hospital reports. Our ontology comprises 1913 concepts and contains all the knowledge included in the PMSI part of the SPLF thesaurus. Our tool has been evaluated and showed a recall of 80% and an accuracy of 85% for the medico-economical coding. The work presented in this paper justifies the approach that has been used. It must be continued on a large scale to validate our coding principles and the possibility of querying patient reports for clinical research. Copyright © 2010. Published by Elsevier Masson SAS.
The Use of a Code-generating System for the Derivation of the Equations for Wind Turbine Dynamics
NASA Astrophysics Data System (ADS)
Ganander, Hans
2003-10-01
For many reasons the size of wind turbines on the rapidly growing wind energy market is increasing. Relations between aeroelastic properties of these new large turbines change. Modifications of turbine designs and control concepts are also influenced by growing size. All these trends require the development of computer codes for design and certification. Moreover, there is a strong desire for design optimization procedures, which require fast codes. General codes, e.g. finite element codes, normally allow such modifications and improvements of existing wind turbine models to be made relatively easily. However, the calculation times of such codes are unfavourably long, certainly for optimization use. An automatic code-generating system is an alternative that addresses both key issues, the code and the design optimization: the technique can be used for rapid generation of codes for particular wind turbine simulation models. These ideas have been followed in the development of new versions of the wind turbine simulation code VIDYN. The equations of the simulation model were derived according to the Lagrange equation using Mathematica®, which was directed to output the results in Fortran code format. In this way the simulation code is automatically adapted to an actual turbine model, in terms of subroutines containing the equations of motion, definitions of parameters and degrees of freedom. Since the start in 1997, these methods, constituting a systematic way of working, have been used to develop specific, efficient calculation codes. The experience with this technique has been very encouraging, inspiring the continued development of new versions of the simulation code as the need has arisen, and interest in design optimization is growing.
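The derive-then-generate workflow can be sketched with SymPy standing in for Mathematica®. The toy system below is a simple pendulum, not a turbine model: its Euler-Lagrange equation is derived symbolically and emitted as Fortran, in the same spirit as the automatically generated VIDYN subroutines.

    import sympy as sp

    t = sp.symbols("t")
    g, l, m = sp.symbols("g l m", positive=True)
    theta = sp.Function("theta")(t)

    # Lagrangian of a simple pendulum: L = T - V
    T = m * l**2 * theta.diff(t) ** 2 / 2
    V = -m * g * l * sp.cos(theta)
    L = T - V

    # Euler-Lagrange equation: d/dt(dL/dtheta') - dL/dtheta = 0
    eom = sp.diff(L.diff(theta.diff(t)), t) - L.diff(theta)
    thetadd = sp.solve(sp.Eq(eom, 0), theta.diff(t, 2))[0]

    # Emit Fortran for the angular acceleration, as a codegen system would
    expr = thetadd.subs(theta, sp.Symbol("theta"))
    print(sp.fcode(expr, assign_to="thetadd", standard=95))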
NASA Technical Reports Server (NTRS)
Denney, Ewen W.; Fischer, Bernd
2009-01-01
Model-based development and automated code generation are increasingly used for production code in safety-critical applications, but since code generators are typically not qualified, the generated code must still be fully tested, reviewed, and certified. This is particularly arduous for mathematical and control engineering software which requires reviewers to trace subtle details of textbook formulas and algorithms to the code, and to match requirements (e.g., physical units or coordinate frames) not represented explicitly in models or code. Both tasks are complicated by the often opaque nature of auto-generated code. We address these problems by developing a verification-driven approach to traceability and documentation. We apply the AUTOCERT verification system to identify and then verify mathematical concepts in the code, based on a mathematical domain theory, and then use these verified traceability links between concepts, code, and verification conditions to construct a natural language report that provides a high-level structured argument explaining why and how the code uses the assumptions and complies with the requirements. We have applied our approach to generate review documents for several sub-systems of NASA's Project Constellation.
NASA Astrophysics Data System (ADS)
Rua, Melissa Jo
The present study examined the understandings held by 5th, 8th, and 11th-grade students, their teachers and medical professionals about germs. Specifically, this study describes the content and structure of students' and adults' conceptions in the areas of germ contraction, transmission, and treatment of infectious and non-infectious diseases caused by microorganisms. Naturalistic and empirical research methods were used to investigate participants' conceptions. Between- and within-group similarities were found using data from concept maps on the topic "flu," drawings of germs, a 20-word card sort related to germs and illness, and a semi-structured interview. Concept maps were coded according to techniques by Novak and Gowan (1984). Drawings of germs were coded into four main categories (bacteria, viruses, animal cell, other) and five subcategories (disease, caricature, insect, protozoa, unclassified). Cluster patterns for the card sorts of each group were found using multidimensional scaling techniques. Six coding categories emerged from the interview transcripts: (a) transmission, (b) treatment, (c) effect of weather on illness, (d) immune response, (e) location of germs, and (f) similarities and differences between bacteria and viruses. The findings showed that students, teachers and medical professionals have different understandings about bacteria and viruses and that the structures of those understandings vary. Gaps or holes in the participants' knowledge were found in areas such as: (a) how germs are transmitted, (b) where germs are found, (c) how the body transports and uses medicine, (d) how the immune system functions, (e) the difference between vaccines and non-prescription medicines, (f) differences that exist between bacteria and viruses, and (g) bacterial resistance to medication. The youngest students relied heavily upon personal experiences with germs rather than formal instruction when explaining their conceptions. As a result, the influence of media was evident in the students' understandings and images of microbes. Students also viewed germs as a human problem rather than seeing microorganisms as independent members of the ecosystem. Teachers' explanations about germs varied in explicitness based on the grade level they taught, while medical professionals based their understandings on formal knowledge and tended to use explicit technical language in their explanations of the phenomena.
A Partial Least Squares Based Procedure for Upstream Sequence Classification in Prokaryotes.
Mehmood, Tahir; Bohlin, Jon; Snipen, Lars
2015-01-01
The upstream region of coding genes is important for several reasons, for instance locating transcription factor binding sites and start site initiation in genomic DNA. Motivated by a recently conducted study, where a multivariate approach was successfully applied to coding sequence modeling, we have introduced a partial least squares (PLS) based procedure for the classification of true upstream prokaryotic sequences from background upstream sequences. The upstream sequences of conserved coding genes across genomes were considered in the analysis, where conserved coding genes were found by using the pan-genomics concept for each prokaryotic species considered. PLS uses a position specific scoring matrix (PSSM) to study the characteristics of the upstream region. Results obtained by the PLS based method were compared with the Gini importance of random forest (RF) and with a support vector machine (SVM), which is a much used method for sequence classification. The upstream sequence classification performance was evaluated by using cross validation, and the suggested approach identifies the prokaryotic upstream region significantly better than RF (p-value < 0.01) and SVM (p-value < 0.01). Further, the proposed method also produced results that concurred with known biological characteristics of the upstream region.
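A minimal sketch of the classification setup, with synthetic sequences and a position-wise one-hot encoding standing in for the paper's PSSM-derived features; the data, window length, and component count are illustrative only.

    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(0)
    BASES = "ACGT"
    WIN = 20  # fixed upstream window length

    def random_seq(bias=False):
        # Toy signal: "true" upstream windows are slightly A-enriched
        p = [0.4, 0.2, 0.2, 0.2] if bias else [0.25] * 4
        return "".join(rng.choice(list(BASES), p=p) for _ in range(WIN))

    def one_hot(seq):
        x = np.zeros((WIN, 4))
        for i, b in enumerate(seq):
            x[i, BASES.index(b)] = 1.0
        return x.ravel()

    X = np.array([one_hot(random_seq(bias=(i < 100))) for i in range(200)])
    y = np.array([1.0] * 100 + [0.0] * 100)  # 1 = true upstream, 0 = background

    pls = PLSRegression(n_components=5).fit(X, y)
    pred = (pls.predict(X).ravel() > 0.5).astype(int)
    print("training accuracy:", (pred == y).mean())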
The MIMIC Code Repository: enabling reproducibility in critical care research.
Johnson, Alistair Ew; Stone, David J; Celi, Leo A; Pollard, Tom J
2018-01-01
Lack of reproducibility in medical studies is a barrier to the generation of a robust knowledge base to support clinical decision-making. In this paper we outline the Medical Information Mart for Intensive Care (MIMIC) Code Repository, a centralized code base for generating reproducible studies on an openly available critical care dataset. Code is provided to load the data into a relational structure, create extractions of the data, and reproduce entire analysis plans including research studies. Concepts extracted include severity of illness scores, comorbid status, administrative definitions of sepsis, physiologic criteria for sepsis, organ failure scores, treatment administration, and more. Executable documents are used for tutorials and reproduce published studies end-to-end, providing a template for future researchers to replicate. The repository's issue tracker enables community discussion about the data and concepts, allowing users to collaboratively improve the resource. The centralized repository provides a platform for users of the data to interact directly with the data generators, facilitating greater understanding of the data. It also provides a location for the community to collaborate on necessary concepts for research progress and share them with a larger audience. Consistent application of the same code for underlying concepts is a key step in ensuring that research studies on the MIMIC database are comparable and reproducible. By providing open source code alongside the freely accessible MIMIC-III database, we enable end-to-end reproducible analysis of electronic health records. © The Author 2017. Published by Oxford University Press on behalf of the American Medical Informatics Association.
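To make the idea of a shared, executable concept definition concrete, here is a hedged sketch that computes ICU length of stay from the icustays table documented for MIMIC-III; the connection string is a placeholder, and the query is an illustration rather than code taken from the repository.

    import pandas as pd
    from sqlalchemy import create_engine

    # Placeholder connection to a local PostgreSQL load of MIMIC-III
    engine = create_engine("postgresql://user:pass@localhost:5432/mimic")

    # ICU length of stay per stay, from the documented icustays table
    query = """
    SELECT subject_id, hadm_id, icustay_id,
           EXTRACT(EPOCH FROM (outtime - intime)) / 3600.0 AS icu_los_hours
    FROM icustays;
    """
    los = pd.read_sql(query, engine)
    print(los["icu_los_hours"].describe())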
Kindlmann, Gordon; Chiw, Charisee; Seltzer, Nicholas; Samuels, Lamont; Reppy, John
2016-01-01
Many algorithms for scientific visualization and image analysis are rooted in the world of continuous scalar, vector, and tensor fields, but are programmed in low-level languages and libraries that obscure their mathematical foundations. Diderot is a parallel domain-specific language that is designed to bridge this semantic gap by providing the programmer with a high-level, mathematical programming notation that allows direct expression of mathematical concepts in code. Furthermore, Diderot provides parallel performance that takes advantage of modern multicore processors and GPUs. The high-level notation allows a concise and natural expression of the algorithms and the parallelism allows efficient execution on real-world datasets.
The World in a Tomato: Revisiting the Use of "Codes" in Freire's Problem-Posing Education.
ERIC Educational Resources Information Center
Barndt, Deborah
1998-01-01
Gives examples of the use of Freire's notion of codes or generative themes in problem-posing literacy education. Describes how these applications expand Freire's conceptions by involving students in code production, including multicultural perspectives, and rethinking codes as representations. (SK)
A Physicist's view on Chopin's Études
NASA Astrophysics Data System (ADS)
Blasone, Massimo
2017-07-01
We propose the use of specific dynamical processes and, more generally, of ideas from Physics to model the evolution in time of musical structures. We apply this approach to two Études by F. Chopin, namely Op.10 n.3 and Op.25 n.1, proposing an original description based on concepts of symmetry breaking/restoration and quantum coherence, which could be useful for interpretation. In this analysis, we take advantage of colored musical scores, obtained by applying Scriabin's color code for sounds to the musical notation.
Semantic enrichment of medical forms - semi-automated coding of ODM-elements via web services.
Breil, Bernhard; Watermann, Andreas; Haas, Peter; Dziuballe, Philipp; Dugas, Martin
2012-01-01
Semantic interoperability is an unsolved problem which occurs while working with medical forms from different information systems or institutions. Standards like ODM or CDA assure structural homogenization, but in order to compare elements from different data models it is necessary to use semantic concepts and codes at the item level of those structures. We developed and implemented a web-based tool which enables a domain expert to perform semi-automated coding of ODM files. For each item it is possible to query web services which return unique concept codes without leaving the context of the document. Although it was not feasible to perform totally automated coding, we have implemented a dialog-based method to perform efficient coding of all data elements in the context of the whole document. The proportion of codable items was comparable to results from previous studies.
Optical encryption and QR codes: secure and noise-free information retrieval.
Barrera, John Fredy; Mira, Alejandro; Torroba, Roberto
2013-03-11
We introduce for the first time the concept of an information "container" used before a standard optical encrypting procedure. The "container" selected is a QR code, which offers the main advantage of being tolerant to pollutant speckle noise. Besides, the QR code can be read by smartphones, a massively used device. Additionally, the QR code adds another secure step to the benefits the optical encrypting methods provide. The QR is generated by means of software freely available worldwide. The concept development proves that the speckle noise polluting the outcomes of normal optical encrypting procedures can be avoided, making the adoption of these techniques more attractive. Actual smartphone-collected results are shown to validate our proposal.
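A rough sketch of the container idea: the message is wrapped in a QR code and then corrupted with multiplicative speckle-like noise of the kind a decrypted optical field carries. The noise model and threshold are illustrative only, and actual readout would use a QR reader; the third-party qrcode package supplies the container.

    import numpy as np
    import qrcode

    # Build the information container with the strongest error correction
    qr = qrcode.QRCode(error_correction=qrcode.constants.ERROR_CORRECT_H)
    qr.add_data("secret message")
    qr.make(fit=True)
    arr = np.array(qr.get_matrix(), dtype=float)  # True/False module grid

    # Multiplicative, fully developed speckle as a stand-in for the noise
    # that survives optical decryption (illustrative model only)
    rng = np.random.default_rng(1)
    noisy = arr * rng.exponential(scale=1.0, size=arr.shape)

    # Re-binarize; a QR reader (e.g., pyzbar) applied to this grid would
    # still recover the message for moderate noise thanks to its ECC
    rebinarized = noisy > 0.1
    print("container:", arr.shape,
          "corrupted modules:", int((rebinarized != arr.astype(bool)).sum()))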
Conceptualisations of infinity by primary pre-service teachers
NASA Astrophysics Data System (ADS)
Date-Huxtable, Elizabeth; Cavanagh, Michael; Coady, Carmel; Easey, Michael
2018-05-01
As part of the Opening Real Science: Authentic Mathematics and Science Education for Australia project, an online mathematics learning module embedding conceptual thinking about infinity in science-based contexts, was designed and trialled with a cohort of 22 pre-service teachers during 1 week of intensive study. This research addressed the question: "How do pre-service teachers conceptualise infinity mathematically?" Participants argued the existence of infinity in a summative reflective task, using mathematical and empirical arguments that were coded according to five themes: definition, examples, application, philosophy and teaching; and 17 codes. Participants' reflections were differentiated as to whether infinity was referred to as an abstract (A) or a real (R) concept or whether both (B) codes were used. Principal component analysis of the reflections, using frequency of codings, revealed that A and R codes occurred at different frequencies in three groups of reflections. Distinct methods of argument were associated with each group of reflections: mathematical numerical examples and empirical measurement comparisons characterised arguments for infinity as an abstract concept, geometric and empirical dynamic examples and belief statements characterised arguments for infinity as a real concept and empirical measurement and mathematical examples and belief statements characterised arguments for infinity as both an abstract and a real concept. An implication of the results is that connections between mathematical and empirical applications of infinity may assist pre-service teachers to contrast finite with infinite models of the world.
Metasurfaced Reverberation Chamber.
Sun, Hengyi; Li, Zhuo; Gu, Changqing; Xu, Qian; Chen, Xinlei; Sun, Yunhe; Lu, Shengchen; Martin, Ferran
2018-01-25
The concept of a metasurfaced reverberation chamber (RC) is introduced in this paper. It is shown that by coating the chamber wall with a rotating 1-bit random coding metasurface, it is possible to enlarge the test zone of the RC while maintaining field uniformity as good as that in a traditional RC with mechanical stirrers. A 1-bit random coding diffusion metasurface is designed to obtain all-direction backscattering under normal incidence. Three specific cases are studied for comparison, including a (traditional) mechanical stirrer RC, a mechanical stirrer RC with a fixed diffusion metasurface, and an RC with a rotating diffusion metasurface. Simulation results show that the compact rotating diffusion metasurface can act as a stirrer with good stirring efficiency. By using such a rotating diffusion metasurface, the test region of the RC can be greatly extended.
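The stirring mechanism rests on the fact that a random 1-bit (0/pi) phase pattern destroys the coherent specular lobe of a uniform reflector. A one-dimensional array-factor toy model, with illustrative element count and spacing, shows the effect.

    import numpy as np

    N, d = 64, 0.5          # elements and spacing in wavelengths (illustrative)
    theta = np.linspace(-90, 90, 721)
    k_sin = 2 * np.pi * d * np.sin(np.radians(theta))
    n = np.arange(N)

    rng = np.random.default_rng(7)
    bits = rng.integers(0, 2, N)        # random 1-bit coding pattern
    for name, phase in [("uniform", np.zeros(N)), ("random 1-bit", np.pi * bits)]:
        # Normalized array factor over all observation angles
        af = np.abs(np.exp(1j * (np.outer(k_sin, n) + phase)).sum(axis=1)) / N
        print(f"{name:13s} peak |AF| = {af.max():.2f}")  # ~1.0 vs much lower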
Code Switching and Code Superimposition in Music. Working Papers in Sociolinguistics, No. 63.
ERIC Educational Resources Information Center
Slobin, Mark
This paper illustrates how the sociolinguistic concept of code switching applies to the use of different styles of music. The two bases for the analogy are Labov's definition of code-switching as "moving from one consistent set of co-occurring rules to another," and the finding of sociolinguistics that code switching tends to be part of…
NASA Technical Reports Server (NTRS)
Benedetto, S.; Divsalar, D.; Montorsi, G.; Pollara, F.
1998-01-01
Soft-input soft-output building blocks (modules) are presented to construct code networks and to decode them iteratively in a distributed fashion; code networks are a new concept that includes, and generalizes, various forms of concatenated coding schemes.
GRC RBCC Concept Multidisciplinary Analysis
NASA Technical Reports Server (NTRS)
Suresh, Ambady
2001-01-01
This report outlines the GRC RBCC Concept for Multidisciplinary Analysis. The multidisciplinary coupling procedure is presented, along with technique validations and axisymmetric multidisciplinary inlet and structural results. The NPSS (Numerical Propulsion System Simulation) test bed developments and code parallelization are also presented. These include milestones and accomplishments, a discussion of running R4 fan application on the PII cluster as compared to other platforms, and the National Combustor Code speedup.
A Fault-Tolerant Radiation-Robust Mass Storage Concept for Highly Scaled Flash Memory
NASA Astrophysics Data System (ADS)
Fuchs, Cristian M.; Trinitis, Carsten; Appel, Nicolas; Langer, Martin
2015-09-01
Future space missions will require vast amounts of data to be stored and processed aboard spacecraft. While satisfying operational mission requirements, storage systems must guarantee data integrity and recover damaged data throughout the mission. NAND-flash memories have become popular for space-borne high performance mass memory scenarios, though future storage concepts will rely upon highly scaled flash or other memory technologies. With modern flash memory, single-bit erasure coding and RAID-based concepts are insufficient. Thus, a fully run-time configurable, high performance, dependable storage concept requiring only a minimal set of logic or software is needed. The solution is based on composite erasure coding and can be adjusted for altered mission duration or changing environmental conditions.
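For contrast with the composite scheme, the baseline the authors deem insufficient, single-erasure XOR parity as used in RAID, fits in a few lines:

    import os

    blocks = [bytearray(os.urandom(16)) for _ in range(4)]  # data blocks
    parity = bytearray(16)
    for b in blocks:
        parity = bytearray(x ^ y for x, y in zip(parity, b))

    lost = blocks[2]                 # pretend block 2 is erased
    recovered = bytearray(parity)
    for i, b in enumerate(blocks):
        if i != 2:
            recovered = bytearray(x ^ y for x, y in zip(recovered, b))

    assert recovered == lost         # a single erasure is recoverable
    print("recovered block 2:", recovered.hex())

A second simultaneous erasure defeats this code, which is one reason highly scaled flash, with its correlated multi-bit failures, pushes toward the composite approach described above.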
Time for change: a roadmap to guide the implementation of the World Anti-Doping Code 2015
Dvorak, Jiri; Baume, Norbert; Botré, Francesco; Broséus, Julian; Budgett, Richard; Frey, Walter O; Geyer, Hans; Harcourt, Peter Rex; Ho, Dave; Howman, David; Isola, Victor; Lundby, Carsten; Marclay, François; Peytavin, Annie; Pipe, Andrew; Pitsiladis, Yannis P; Reichel, Christian; Robinson, Neil; Rodchenkov, Grigory; Saugy, Martial; Sayegh, Souheil; Segura, Jordi; Thevis, Mario; Vernec, Alan; Viret, Marjolaine; Vouillamoz, Marc; Zorzoli, Mario
2014-01-01
A medical and scientific multidisciplinary consensus meeting was held from 29 to 30 November 2013 on Anti-Doping in Sport at the Home of FIFA in Zurich, Switzerland, to create a roadmap for the implementation of the 2015 World Anti-Doping Code. The consensus statement and accompanying papers set out the priorities for the antidoping community in research, science and medicine. The participants achieved consensus on a strategy for the implementation of the 2015 World Anti-Doping Code. Key components of this strategy include: (1) sport-specific risk assessment, (2) prevalence measurement, (3) sport-specific test distribution plans, (4) storage and reanalysis, (5) analytical challenges, (6) forensic intelligence, (7) psychological approach to optimise the most deterrent effect, (8) the Athlete Biological Passport (ABP) and confounding factors, (9) data management system (Anti-Doping Administration & Management System, ADAMS), (10) education, (11) research needs and necessary advances, (12) inadvertent doping and (13) management and ethics: biological data. True implementation of the 2015 World Anti-Doping Code will depend largely on the ability to align thinking around these core concepts and strategies. FIFA, jointly with all other engaged International Federations of sport (IFs), the International Olympic Committee (IOC) and the World Anti-Doping Agency (WADA), are ideally placed to lead transformational change with the unwavering support of the wider antidoping community. The outcome of the consensus meeting was the creation of the ad hoc Working Group charged with the responsibility of moving this agenda forward. PMID:24764550
Processing Motion: Using Code to Teach Newtonian Physics
NASA Astrophysics Data System (ADS)
Massey, M. Ryan
Prior to instruction, students often possess a common-sense view of motion, which is inconsistent with Newtonian physics. Effective physics lessons therefore involve conceptual change. To provide a theoretical explanation for concepts and how they change, the triangulation model brings together key attributes of prototypes, exemplars, theories, Bayesian learning, ontological categories, and the causal model theory. The triangulation model provides a theoretical rationale for why coding is a viable method for physics instruction. As an experiment, thirty-two adolescent students participated in summer coding academies to learn how to design Newtonian simulations. Conceptual and attitudinal data was collected using the Force Concept Inventory and the Colorado Learning Attitudes about Science Survey. Results suggest that coding is an effective means for teaching Newtonian physics.
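A minimal example of the kind of simulation students coded, sketched here in plain Python rather than any specific app environment: explicit Euler integration of projectile motion under gravity.

    G = 9.81          # gravitational acceleration, m/s^2
    dt = 0.01         # time step, s

    x, y = 0.0, 0.0
    vx, vy = 12.0, 15.0   # initial velocity components, m/s

    while y >= 0.0:
        # F = ma with only gravity acting: ax = 0, ay = -G
        vy -= G * dt
        x += vx * dt
        y += vy * dt

    print(f"range = {x:.1f} m")   # analytic answer: 2*vx*vy/G = 36.7 m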
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blaylock, Myra L.; LaFleur, Chris Bensdotter; Muna, Alice Baca
Safety standards development for maintenance facilities of liquid and compressed natural gas fueled vehicles is required to ensure proper facility design and operating procedures. Standard development organizations are utilizing risk-informed concepts to develop natural gas vehicle (NGV) codes and standards so that maintenance facilities meet acceptable risk levels. The present report summarizes Phase II work for existing NGV repair facility code requirements and highlights inconsistencies that need quantitative analysis of their effectiveness. A Hazardous and Operability study was performed to identify key scenarios of interest using risk ranking. Detailed simulations and modeling were performed to estimate the location and behavior of natural gas releases based on these scenarios. Specific code conflicts were identified, and ineffective code requirements were highlighted and resolutions proposed. These include ventilation rate basis on area or volume, as well as a ceiling offset which seems ineffective at protecting against flammable gas concentrations. ACKNOWLEDGEMENTS The authors gratefully acknowledge Bill Houf (SNL -- Retired) for his assistance with the set-up and post-processing of the numerical simulations. The authors also acknowledge Doug Horne (retired) for his helpful discussions. We would also like to acknowledge the support from the Clean Cities program of DOE's Vehicle Technology Office.
Controlling Energy Radiations of Electromagnetic Waves via Frequency Coding Metamaterials.
Wu, Haotian; Liu, Shuo; Wan, Xiang; Zhang, Lei; Wang, Dan; Li, Lianlin; Cui, Tie Jun
2017-09-01
Metamaterials are artificial structures composed of subwavelength unit cells to control electromagnetic (EM) waves. The spatial coding representation of a metamaterial has the ability to describe the material in a digital way. Spatial coding metamaterials are typically constructed from unit cells that have similar shapes with fixed functionality. Here, the concept of frequency coding metamaterial is proposed, which achieves different controls of EM energy radiation with a fixed spatial coding pattern when the frequency changes. In this case, not only different phase responses of the unit cells but also different phase sensitivities are required. Due to the different frequency sensitivities of unit cells, two units with the same phase response at the initial frequency may have different phase responses at a higher frequency. To describe the frequency coding property of a unit cell, digitalized frequency sensitivity is proposed, in which the units are encoded with the digits "0" and "1" to represent low and high phase sensitivities, respectively. By this merit, two degrees of freedom, spatial coding and frequency coding, are obtained to control EM energy radiation by a new class of frequency-spatial coding metamaterials. The above concepts and physical phenomena are confirmed by numerical simulations and experiments.
Basic concepts of quantum interference and electron transport in single-molecule electronics.
Lambert, C J
2015-02-21
This tutorial outlines the basic theoretical concepts and tools which underpin the fundamentals of phase-coherent electron transport through single molecules. The key quantity of interest is the transmission coefficient T(E), which yields the electrical conductance, current-voltage relations, the thermopower S and the thermoelectric figure of merit ZT of single-molecule devices. Since T(E) is strongly affected by quantum interference (QI), three manifestations of QI in single-molecules are discussed, namely Mach-Zehnder interferometry, Breit-Wigner resonances and Fano resonances. A simple MATLAB code is provided, which allows the novice reader to explore QI in multi-branched structures described by a tight-binding (Hückel) Hamiltonian. More generally, the strengths and limitations of materials-specific transport modelling based on density functional theory are discussed.
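The tutorial's published code is MATLAB; the sketch below is an independent Python analogue that computes T(E) for an N-site tight-binding chain coupled to two 1-D leads via the standard retarded surface Green's function, recovering T = 1 inside the band of a perfect chain. Chain length and energies are illustrative, in units of the hopping t = 1.

    import numpy as np

    def transmission(E, N=6, eps=0.0):
        # Retarded surface Green's function of a semi-infinite 1-D lead
        # (t = 1); valid inside the band |E| < 2
        g = (E - 1j * np.sqrt(4.0 - E**2)) / 2.0
        H = eps * np.eye(N) - (np.eye(N, k=1) + np.eye(N, k=-1))
        sigma = np.zeros((N, N), complex)
        sigma[0, 0] = g           # left lead self-energy
        sigma[-1, -1] = g         # right lead self-energy
        G = np.linalg.inv((E + 1e-12j) * np.eye(N) - H - sigma)
        gamma = -2.0 * g.imag     # level broadening, same for both leads
        return gamma**2 * abs(G[0, -1])**2

    for E in (-1.0, 0.0, 1.0):
        print(f"T({E:+.1f}) = {transmission(E):.3f}")   # ~1 for a perfect chain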
Developments in the safe design of LNG tanks
NASA Astrophysics Data System (ADS)
Fulford, N. J.; Slatter, M. D.
The objective of this paper is to discuss how the gradual development of design concepts for liquefied natural gas (LNG) storage systems has helped to enhance storage safety and economy. The experience in the UK is compared with practice in other countries with similar LNG storage requirements. Emphasis is placed on the excellent record of safety and reliability exhibited by tanks with a primary metal container designed and constructed to approved standards. The work carried out to promote the development of new materials, fire protection, and monitoring systems for use in LNG storage is also summarized, and specific examples described from British Gas experience. Finally, the trends in storage tank design world-wide and options for future design concepts are discussed, bearing in mind planned legislation and design codes governing hazardous installations.
Flight dynamics software in a distributed network environment
NASA Technical Reports Server (NTRS)
Jeletic, J.; Weidow, D.; Boland, D.
1995-01-01
As with all NASA facilities, the announcement of reduced budgets, reduced staffing, and the desire to implement smaller/quicker/cheaper missions has required the Agency's organizations to become more efficient in what they do. To accomplish these objectives, the FDD has initiated the development of the Flight Dynamics Distributed System (FDDS). The underlying philosophy of FDDS is to build an integrated system that breaks down the traditional barriers of attitude, mission planning, and navigation support software to provide a uniform approach to flight dynamics applications. Through the application of open systems concepts and state-of-the-art technologies, including object-oriented specification concepts, object-oriented software, and common user interface, communications, data management, and executive services, the FDD will reengineer most of its six million lines of code.
Multilingual natural language generation as part of a medical terminology server.
Wagner, J C; Solomon, W D; Michel, P A; Juge, C; Baud, R H; Rector, A L; Scherrer, J R
1995-01-01
Re-usable, sharable, and therefore language-independent concept models are of increasing importance in the medical domain. The GALEN project (Generalized Architecture for Languages Encyclopedias and Nomenclatures in Medicine) aims at developing language-independent concept representation systems as the foundations for the next generation of multilingual coding systems. For use within clinical applications, the content of the model has to be mapped to natural language. A so-called Multilingual Information Module (MM) establishes the link between the language-independent concept model and different natural languages. This text generation software must be versatile enough to cope at the same time with different languages and with different parts of a compositional model. It has to meet, on the one hand, the properties of the language as used in the medical domain and, on the other hand, the specific characteristics of the underlying model and its representation formalism. We propose a semantics-oriented approach to natural language generation that is based on linguistic annotations to a concept model. This approach is realized as an integral part of a Terminology Server, built around the concept model and offering different terminological services for clinical applications.
Expert system for maintenance management of a boiling water reactor power plant
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hong Shen; Liou, L.W.; Levine, S.
1992-01-01
An expert system code has been developed for the maintenance of two boiling water reactor units in Berwick, Pennsylvania, that are operated by the Pennsylvania Power and Light Company (PP and L). The objective of this expert system code, in which the knowledge of experienced operators and engineers is captured and implemented, is to support decisions regarding which components can be safely and reliably removed from service for maintenance. It can also serve as a query-answering facility for checking the plant system status and for training purposes. The operating and maintenance information of a large number of support systems, which must be available for emergencies and/or in the event of an accident, is stored in the database of the code. It identifies the relevant technical specifications and management rules for shutting down any one of the systems or removing a component from service to support maintenance. Because of the complexity and time needed to incorporate a large number of systems and their components, the first phase of the expert system develops a prototype code, which includes only the reactor core isolation coolant system, the high-pressure core injection system, the instrument air system, the service water system, and the plant electrical system. The next phase is scheduled to expand the code to include all other systems. This paper summarizes the prototype code and the design concept of the complete expert system code for maintenance management of all plant systems and components.
The general theory of convolutional codes
NASA Technical Reports Server (NTRS)
Mceliece, R. J.; Stanley, R. P.
1993-01-01
This article presents a self-contained introduction to the algebraic theory of convolutional codes. This introduction is partly a tutorial, but at the same time contains a number of new results which will prove useful for designers of advanced telecommunication systems. Among the new concepts introduced here are the Hilbert series for a convolutional code and the class of compact codes.
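To make the basic object concrete, here is a standard rate-1/2, constraint-length-3 encoder (the classic octal generators 7 and 5); the article's new algebraic machinery, such as the Hilbert series, is not reproduced here.

    def conv_encode(bits, g1=(1, 1, 1), g2=(1, 0, 1)):
        """Feed bits through a constraint-length-3 shift register,
        emitting two coded bits (one per generator) per input bit."""
        state = [0, 0]
        out = []
        for b in bits:
            window = [b] + state
            out.append(sum(x & y for x, y in zip(g1, window)) % 2)
            out.append(sum(x & y for x, y in zip(g2, window)) % 2)
            state = [b, state[0]]
        return out

    print(conv_encode([1, 0, 1, 1]))  # -> [1, 1, 1, 0, 0, 0, 0, 1]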
NASA Astrophysics Data System (ADS)
Tarditi, Alfonso G.; Shebalin, John V.
2002-11-01
A simulation study with the NIMROD code [1] is being carried out to investigate the efficiency of the thrust generation process and the properties of plasma detachment in a magnetic nozzle. In the simulation, hot plasma is injected into the magnetic nozzle, modeled as a 2D, axi-symmetric domain. NIMROD has two-fluid, 3D capabilities, but the present runs are being conducted within the MHD, 2D approximation. As the plasma travels through the magnetic field, part of its thermal energy is converted into longitudinal kinetic energy along the axis of the nozzle. The plasma eventually detaches from the magnetic field at a certain distance from the nozzle throat, where the kinetic energy becomes larger than the magnetic energy. Preliminary NIMROD 2D runs have been benchmarked against a particle trajectory code with satisfactory results [2]. Further testing is reported here, with emphasis on the analysis of the diffusion rate across field lines and of the overall nozzle efficiency. These simulation runs are specifically designed to enable comparisons with laboratory measurements from the VASIMR experiment, by looking at the evolution of the radial plasma density and temperature profiles in the nozzle. VASIMR (Variable Specific Impulse Magnetoplasma Rocket, [3]) is an advanced space propulsion concept currently under experimental development at the Advanced Space Propulsion Laboratory, NASA Johnson Space Center. A plasma (typically ionized Hydrogen or Helium) is generated by a RF (Helicon) discharge and heated by an Ion Cyclotron Resonance Heating antenna. The heated plasma is then guided into a magnetic nozzle to convert the thermal plasma energy into effective thrust. The VASIMR system has no electrodes, and a solenoidal magnetic field produced by an asymmetric mirror configuration ensures magnetic insulation of the plasma from the material surfaces. By powering the plasma source and the heating antenna at different levels it is possible to vary the thrust-to-specific-impulse ratio smoothly while maintaining maximum power utilization. [1] http://www.nimrodteam.org [2] A. V. Ilin et al., Proc. 40th AIAA Aerospace Sciences Meeting, Reno, NV, Jan. 2002 [3] F. R. Chang-Diaz, Scientific American, p. 90, Nov. 2000
Nevada Administrative Code for Special Education Programs.
ERIC Educational Resources Information Center
Nevada State Dept. of Education, Carson City. Special Education Branch.
This document presents excerpts from Chapter 388 of the Nevada Administrative Code, which concerns definitions, eligibility, and programs for students who are disabled or gifted/talented. The first section gathers together 36 relevant definitions from the Code for such concepts as "adaptive behavior," "autism," "gifted and…
The Effects of a Concept Map-Based Support Tool on Simulation-Based Inquiry Learning
ERIC Educational Resources Information Center
Hagemans, Mieke G.; van der Meij, Hans; de Jong, Ton
2013-01-01
Students often need support to optimize their learning in inquiry learning environments. In 2 studies, we investigated the effects of adding concept-map-based support to a simulation-based inquiry environment on kinematics. The concept map displayed the main domain concepts and their relations, while dynamic color coding of the concepts displayed…
Korošec, Peter; Eftimov, Tome; Ocke, Marga; van der Laan, Jan; Roe, Mark; Berry, Rachel; Turrini, Aida; Krems, Carolin; Slimani, Nadia; Finglas, Paul
2018-01-01
This paper identifies the requirements for computer-supported food matching, in order to address current needs at not only the national and European but also the international level, and represents an integrated research contribution of the FP7 EuroDISH project. The available classification and coding systems and the specific problems of food matching are summarized, and a new concept for food matching based on optimization methods and machine learning is proposed. To illustrate and test this concept, a study was conducted in four European countries (i.e., Germany, The Netherlands, Italy and the UK) using different classification and coding systems. This real case study enabled us to evaluate the new food matching concept and provide further recommendations for future work. In the first stage of the study, we prepared subsets of food consumption data described and classified using different systems, which had already been manually matched with national food composition data. Once the food matching algorithm was trained using these data, testing was performed on another subset of food consumption data. Experts from different countries validated food matching between consumption and composition data by selecting best matches from the options given by the matching algorithm without seeing the result of the previously made manual match. The evaluation of the study results stressed the importance of the role and quality of the food composition database as compared to the selected classification and/or coding systems, and the need to continue compiling national food composition data as eating habits and national dishes still vary between countries. Although some countries have managed to collect extensive sets of food consumption data, these cannot be easily matched with food composition data if either the food consumption or the food composition data are not properly classified and described using a classification and coding system. The study also showed that the level of human expertise played an important role, at least in the training stage. Both sets of data require continuous development to improve their quality in dietary assessment. PMID:29601516
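A deliberately naive sketch of machine-assisted matching: rank candidate composition entries for each consumed-food description by string similarity. Real matching, as the paper stresses, leans on classification and coding systems and expert validation rather than names alone; the food lists here are invented.

    from difflib import get_close_matches

    composition_foods = [
        "bread, wheat, wholemeal", "milk, cow, semi-skimmed",
        "apple, raw, with skin", "pasta, durum wheat, boiled",
    ]
    consumed = ["wholemeal bread", "semi skimmed milk", "boiled pasta"]

    for item in consumed:
        # Offer the expert a ranked shortlist instead of a single answer
        options = get_close_matches(item, composition_foods, n=3, cutoff=0.1)
        print(f"{item!r} -> {options}")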
Interdependence, Reflexivity, Fidelity, Impedance Matching, and the Evolution of Genetic Coding
Carter, Charles W; Wills, Peter R
2018-01-01
Genetic coding is generally thought to have required ribozymes whose functions were taken over by polypeptide aminoacyl-tRNA synthetases (aaRS). Two discoveries about aaRS and their interactions with tRNA substrates now furnish a unifying rationale for the opposite conclusion: that the key processes of the Central Dogma of molecular biology emerged simultaneously and naturally from simple origins in a peptide•RNA partnership, eliminating the epistemological utility of a prior RNA world. First, the two aaRS classes likely arose from opposite strands of the same ancestral gene, implying a simple genetic alphabet. The resulting inversion symmetries in aaRS structural biology would have stabilized the initial and subsequent differentiation of coding specificities, rapidly promoting diversity in the proteome. Second, amino acid physical chemistry maps onto tRNA identity elements, establishing reflexive, nanoenvironmental sensing in protein aaRS. Bootstrapping of increasingly detailed coding is thus intrinsic to polypeptide aaRS, but impossible in an RNA world. These notions underline the following concepts that contradict gradual replacement of ribozymal aaRS by polypeptide aaRS: 1) aaRS enzymes must be interdependent; 2) reflexivity intrinsic to polypeptide aaRS production dynamics promotes bootstrapping; 3) takeover of RNA-catalyzed aminoacylation by enzymes will necessarily degrade specificity; and 4) the Central Dogma’s emergence is most probable when replication and translation error rates remain comparable. These characteristics are necessary and sufficient for the essentially de novo emergence of a coupled gene–replicase–translatase system of genetic coding that would have continuously preserved the functional meaning of genetically encoded protein genes whose phylogenetic relationships match those observed today. PMID:29077934
Parallelising a molecular dynamics algorithm on a multi-processor workstation
NASA Astrophysics Data System (ADS)
Müller-Plathe, Florian
1990-12-01
The Verlet neighbour-list algorithm is parallelised for a multi-processor Hewlett-Packard/Apollo DN10000 workstation. The implementation makes use of memory shared between the processors. It is a genuine master-slave approach by which most of the computational tasks are kept in the master process and the slaves are only called to do part of the nonbonded forces calculation. The implementation features elements of both fine-grain and coarse-grain parallelism. Apart from three calls to library routines, two of which are standard UNIX calls, and two machine-specific language extensions, the whole code is written in standard Fortran 77. Hence, it may be expected that this parallelisation concept can be transferred in part or as a whole to other multi-processor shared-memory computers. The parallel code is routinely used in production work.
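The master-slave split can be mimicked with Python's multiprocessing in place of shared-memory Fortran: the master owns the positions and farms out slices of the pair list for the nonbonded (here Lennard-Jones) force evaluation. Particle counts and parameters are illustrative, not taken from the paper.

    import numpy as np
    from multiprocessing import Pool

    def pair_forces(args):
        """Slave task: accumulate LJ forces for one slice of the pair list."""
        pos, pairs = args
        f = np.zeros_like(pos)
        for i, j in pairs:
            r = pos[i] - pos[j]
            r2 = np.dot(r, r)
            inv6 = 1.0 / r2**3
            fmag = 24.0 * (2.0 * inv6**2 - inv6) / r2   # eps = sigma = 1
            f[i] += fmag * r
            f[j] -= fmag * r
        return f

    if __name__ == "__main__":
        rng = np.random.default_rng(3)
        pos = rng.uniform(0, 5, size=(50, 3))
        pairs = [(i, j) for i in range(50) for j in range(i + 1, 50)]
        chunks = [pairs[k::4] for k in range(4)]        # 4 slave processes
        with Pool(4) as pool:
            forces = sum(pool.map(pair_forces, [(pos, c) for c in chunks]))
        print("net force (should be ~0):", forces.sum(axis=0))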
Concepts of Integration for UAS Operations in the NAS
NASA Technical Reports Server (NTRS)
Consiglio, Maria C.; Chamberlain, James P.; Munoz, Cesar A.; Hoffler, Keith D.
2012-01-01
One of the major challenges facing the integration of Unmanned Aircraft Systems (UAS) in the National Airspace System (NAS) is the lack of an onboard pilot that can comply with the legal requirement identified in the US Code of Federal Regulations (CFR) that pilots see and avoid other aircraft. UAS will be expected to demonstrate the means to perform the function of see and avoid while preserving the safety level of the airspace and the efficiency of the air traffic system. This paper introduces a Sense and Avoid (SAA) concept for integration of UAS into the NAS that is currently being developed by the National Aeronautics and Space Administration (NASA) and identifies areas that require additional experimental evaluation to further inform various elements of the concept. The concept design rests on interoperability principles that take into account both the Air Traffic Control (ATC) environment as well as existing systems such as the Traffic Alert and Collision Avoidance System (TCAS). Specifically, the concept addresses the determination of well clear values that are large enough to avoid issuance of TCAS corrective Resolution Advisories, undue concern by pilots of proximate aircraft and issuance of controller traffic alerts. The concept also addresses appropriate declaration times for projected losses of well clear conditions and maneuvers to regain well clear separation.
Swept Impact Seismic Technique (SIST)
Park, C.B.; Miller, R.D.; Steeples, D.W.; Black, R.A.
1996-01-01
A coded seismic technique is developed that can result in a higher signal-to-noise ratio than a conventional single-pulse method does. The technique is cost-effective and time-efficient and therefore well suited for shallow-reflection surveys where high resolution and cost-effectiveness are critical. A low-power impact source transmits a few to several hundred high-frequency broad-band seismic pulses during several seconds of recording time according to a deterministic coding scheme. The coding scheme consists of a time-encoded impact sequence in which the rate of impact (cycles/s) changes linearly with time providing a broad range of impact rates. Impact times used during the decoding process are recorded on one channel of the seismograph. The coding concept combines the vibroseis swept-frequency and the Mini-Sosie random impact concepts. The swept-frequency concept greatly improves the suppression of correlation noise with much fewer impacts than normally used in the Mini-Sosie technique. The impact concept makes the technique simple and efficient in generating high-resolution seismic data especially in the presence of noise. The transfer function of the impact sequence simulates a low-cut filter with the cutoff frequency the same as the lowest impact rate. This property can be used to attenuate low-frequency ground-roll noise without using an analog low-cut filter or a spatial source (or receiver) array as is necessary with a conventional single-pulse method. Because of the discontinuous coding scheme, the decoding process is accomplished by a "shift-and-stacking" method that is much simpler and quicker than cross-correlation. The simplicity of the coding allows the mechanical design of the source to remain simple. Several different types of mechanical systems could be adapted to generate a linear impact sweep. In addition, the simplicity of the coding also allows the technique to be used with conventional acquisition systems, with only minor modifications.
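The shift-and-stack decoder is simple enough to demonstrate end to end on synthetic data: impacts follow a linear rate sweep, each impact injects a delayed reflection spike, and stacking trace segments aligned at the recorded impact times recovers the reflection delay. Sweep parameters, wavelet, and delay are invented for illustration.

    import numpy as np

    fs, T = 1000, 5.0                      # sample rate (Hz), record length (s)
    t = np.arange(int(fs * T)) / fs

    # Impact times: rate sweeps linearly from 5 to 20 impacts per second
    impacts, tau = [], 0.0
    while tau < T - 0.5:
        impacts.append(tau)
        tau += 1.0 / (5 + (20 - 5) * tau / T)

    delay = 0.120                          # a single reflection at 120 ms
    trace = np.zeros_like(t)
    for ti in impacts:
        idx = int((ti + delay) * fs)
        if idx < trace.size:
            trace[idx] += 1.0              # spike wavelet for clarity

    # Decode: align the trace at each recorded impact time and stack
    win = int(0.3 * fs)
    stack = np.zeros(win)
    for ti in impacts:
        k = int(ti * fs)
        if k + win <= trace.size:
            stack += trace[k:k + win]

    print("reflection recovered at %.3f s" % (np.argmax(stack) / fs))  # ~0.120

Because the impact intervals keep changing, the other impacts' spikes land at incoherent offsets and stack destructively, which is exactly the correlation-noise suppression the sweep provides.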
NASA Technical Reports Server (NTRS)
Murthy, P. L. N.; Chamis, C. C.
1992-01-01
A generic unit cell model which includes a unique fiber substructuring concept is proposed for the development of micromechanics equations for continuous-fiber-reinforced ceramic composites. The unit cell consists of three constituents: fiber, matrix, and an interphase. In the present approach, the unit cell is further subdivided into several slices and the equations of micromechanics are derived for each slice. These are subsequently integrated to obtain ply-level properties. A stand-alone computer code containing the micromechanics model as a module is currently being developed specifically for the analysis of ceramic matrix composites. Towards this development, equivalent ply property results for a SiC/Ti-15-3 composite with a 0.5 fiber volume ratio are presented and compared with those obtained from customary micromechanics models to illustrate the concept. Also, comparisons with limited experimental data for the ceramic matrix composite SiC/RBSN (Reaction Bonded Silicon Nitride) with a 0.3 fiber volume ratio are given to validate the concepts.
Seals Code Development Workshop
NASA Technical Reports Server (NTRS)
Hendricks, Robert C. (Compiler); Liang, Anita D. (Compiler)
1996-01-01
The 1995 Seals Workshop industrial code (INDSEAL) release includes ICYL, GCYLT, IFACE, GFACE, SPIRALG, SPIRALI, DYSEAL, and KTK. The scientific code (SCISEAL) release includes conjugate heat transfer and multidomain with rotordynamic capability. Several seals and bearings codes (e.g., HYDROFLEX, HYDROTRAN, HYDROB3D, FLOWCON1, FLOWCON2) are presented and their results compared. Current computational and experimental emphasis includes multiple connected cavity flows, with goals of reducing parasitic losses and gas ingestion. Labyrinth seals continue to play a significant role in sealing, with face, honeycomb, and new sealing concepts under investigation for advanced engine concepts in view of strict environmental constraints. The clean-sheet approach to engine design is advocated, with program directions and anticipated percentage SFC reductions cited. Future activities center on engine applications with coupled seal/power/secondary flow streams.
Cluster Analysis of Rat Olfactory Bulb Responses to Diverse Odorants
Falasconi, Matteo; Leon, Michael; Johnson, Brett A.; Marco, Santiago
2012-01-01
In an effort to deepen our understanding of mammalian olfactory coding, we have used an objective method to analyze a large set of odorant-evoked activity maps collected systematically across the rat olfactory bulb to determine whether such an approach could identify specific glomerular regions that are activated by related odorants. To that end, we combined fuzzy c-means clustering methods with a novel validity approach based on cluster stability to evaluate the significance of the fuzzy partitions on a data set of glomerular layer responses to a large diverse group of odorants. Our results confirm the existence of glomerular response clusters to similar odorants. They further indicate a partial hierarchical chemotopic organization wherein larger glomerular regions can be subdivided into smaller areas that are rather specific in their responses to particular functional groups of odorants. These clusters bear many similarities to, as well as some differences from, response domains previously proposed for the glomerular layer of the bulb. These data also provide additional support for the concept of an identity code in the mammalian olfactory system. PMID:22459165
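The clustering engine, fuzzy c-means, is compact enough to sketch directly; the study's stability-based validity analysis is not reproduced. Data here are synthetic 2-D Gaussian blobs, and the cluster count and fuzzifier are illustrative.

    import numpy as np

    def fcm(X, c=3, m=2.0, iters=100, seed=0):
        """Bare-bones fuzzy c-means: alternate center and membership updates."""
        rng = np.random.default_rng(seed)
        U = rng.random((c, len(X)))
        U /= U.sum(axis=0)                              # memberships sum to 1
        for _ in range(iters):
            Um = U ** m
            centers = Um @ X / Um.sum(axis=1, keepdims=True)
            d = np.linalg.norm(X[None, :] - centers[:, None], axis=2) + 1e-12
            # u_ik = 1 / sum_j (d_ik / d_jk)^(2/(m-1))
            ratio = (d[:, None, :] / d[None, :, :]) ** (2.0 / (m - 1.0))
            U = 1.0 / ratio.sum(axis=1)
        return centers, U

    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(mu, 0.3, (40, 2))
                   for mu in ((0, 0), (3, 0), (0, 3))])
    centers, U = fcm(X)
    print(np.round(centers, 2))        # three centers near the true means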
Two-Dimensional Parson's Puzzles: The Concept, Tools, and First Observations
ERIC Educational Resources Information Center
Ihantola, Petri; Karavirta, Ville
2011-01-01
Parson's programming puzzles are a family of code construction assignments where lines of code are given, and the task is to form the solution by sorting and possibly selecting the correct code lines. We introduce a novel family of Parson's puzzles where the lines of code need to be sorted in two dimensions. The vertical dimension is used to order…
A European mobile satellite system concept exploiting CDMA and OBP
NASA Technical Reports Server (NTRS)
Vernucci, A.; Craig, A. D.
1993-01-01
This paper describes a novel Land Mobile Satellite System (LMSS) concept applicable to networks allowing access to a large number of gateway stations ('Hubs'), utilizing low-cost Very Small Aperture Terminals (VSAT's). Efficient operation of the Forward-Link (FL) repeater can be achieved by adopting a synchronous Code Division Multiple Access (CDMA) technique, whereby inter-code interference (self-noise) is virtually eliminated by synchronizing orthogonal codes. However, with a transparent FL repeater, the requirements imposed by the highly decentralized ground segment can lead to significant efficiency losses. The adoption of a FL On-Board Processing (OBP) repeater is proposed as a means of largely recovering this efficiency impairment. The paper describes the network architecture, the system design and performance, the OBP functions and impact on implementation. The proposed concept, applicable to a future generation of the European LMSS, was developed in the context of a European Space Agency (ESA) study contract.
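Why synchronism kills self-noise is easy to see with orthogonal Walsh codes: a chip-aligned superposition despreads with exactly zero cross-terms. The sketch assumes eight hypothetical Hub signals sharing one repeater.

    import numpy as np
    from scipy.linalg import hadamard

    codes = hadamard(8)                      # 8 orthogonal Walsh codes (+/-1 chips)
    data = np.array([1, -1, 1, 1, -1, 1, -1, -1])   # one symbol per Hub

    # Synchronous (chip-aligned) superposition at the repeater
    composite = (data[:, None] * codes).sum(axis=0)

    # Despread each user: orthogonality makes all cross-terms vanish
    recovered = composite @ codes.T / 8
    print(recovered.astype(int))             # exactly recovers 'data'

Any chip misalignment breaks the orthogonality and the cross-terms reappear, which is why the FL concept hinges on synchronizing the codes.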
Water NSTF Design, Instrumentation, and Test Planning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lisowski, Darius D.; Gerardi, Craig D.; Hu, Rui
The following report serves as a formal introduction to the water-based Natural convection Shutdown heat removal Test Facility (NSTF) program at Argonne. Since 2005, this US Department of Energy (DOE) sponsored program has conducted large scale experimental testing to generate high-quality and traceable validation data for guiding design decisions of the Reactor Cavity Cooling System (RCCS) concept for advanced reactor designs. The most recent facility iteration, and focus of this report, is the operation of a 1/2 scale model of a water-RCCS concept. Several features of the NSTF prototype align with the conceptual design that has been publicly released for the AREVA 625 MWt SC-HTGR. The design of the NSTF also retains all aspects common to a fundamental boiling water thermosiphon, and thus is well poised to provide necessary experimental data to advance basic understanding of natural circulation phenomena and contribute to computer code validation. Overall, the NSTF program operates to support the DOE vision of aiding US vendors in design choices of future reactor concepts, advancing the maturity of codes for licensing, and ultimately developing safe and reliable reactor technologies. In this report, the top-level program objectives, testing requirements, and unique considerations for the water cooled test assembly are discussed, and presented in sufficient depth to support defining the program's overall scope and purpose. A discussion of the proposed 6-year testing program is then introduced, which outlines the specific strategy and testing plan for facility operations. The proposed testing plan has been developed to meet the top-level objective of conducting high-quality test operations that span a broad range of single- and two-phase operating conditions. Details of characterization, baseline test cases, accident scenarios, and parametric variations are provided, including discussions of later-stage test cases that examine the influence of geometric variations and off-normal configurations. The facility design follows, including as-built dimensions and specifications of the various mechanical and liquid systems, design choices for the test section, water storage tank, and network piping. Specifications of the instrumentation suite are then presented, along with specific information on performance windows, measurement uncertainties, and installation locations. Finally, descriptions of the control systems and heat removal networks are provided, which have been engineered to support precise quantification of energy balances and facilitate well-controlled test operations.
Error-Rate Bounds for Coded PPM on a Poisson Channel
NASA Technical Reports Server (NTRS)
Moision, Bruce; Hamkins, Jon
2009-01-01
Equations for computing tight bounds on error rates for coded pulse-position modulation (PPM) on a Poisson channel at high signal-to-noise ratio have been derived. These equations and elements of the underlying theory are expected to be especially useful in designing codes for PPM optical communication systems. The equations and the underlying theory apply, more specifically, to a case in which a) at the transmitter, a linear outer code is concatenated with an inner code that includes an accumulator and a bit-to-PPM-symbol mapping [this concatenation is known in the art as "accumulate-PPM" (abbreviated "APPM")]; b) the transmitted signal propagates on a memoryless binary-input Poisson channel; and c) at the receiver, near-maximum-likelihood (ML) decoding is effected through an iterative process. Such a coding/modulation/decoding scheme is a variation on the concept of turbo codes, which have complex structures, such that an exact analytical expression for the performance of a particular code is intractable. However, techniques for accurately estimating the performance of turbo codes have been developed. The performance of a typical turbo code includes (1) a "waterfall" region consisting of a steep decrease of error rate with increasing signal-to-noise ratio (SNR) at low to moderate SNR, and (2) an "error floor" region with a less steep decrease of error rate with increasing SNR at moderate to high SNR. The techniques used heretofore for estimating performance in the waterfall region have differed from those used for estimating performance in the error-floor region. For coded PPM, prior to the present derivations, equations for accurate prediction of performance at high SNR did not exist, so it was necessary to resort to time-consuming simulations to make such predictions. The present derivation makes such simulations unnecessary.
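For contrast with the closed-form bounds, the brute-force alternative the derivation replaces looks like this in miniature: a Monte Carlo estimate of uncoded PPM symbol-error rate on a Poisson channel. The signal and background count parameters below are arbitrary, and the sketch omits the coding entirely:

```python
import numpy as np

def ppm_ser(M=16, ns=5.0, nb=0.2, trials=200_000, seed=1):
    """Monte Carlo symbol-error rate for uncoded M-ary PPM on a Poisson channel.
    The pulsed slot ~ Poisson(ns + nb); the other M-1 slots ~ Poisson(nb).
    ML detection picks the slot with the most counts (ties broken at random)."""
    rng = np.random.default_rng(seed)
    counts = rng.poisson(nb, size=(trials, M))
    counts[:, 0] += rng.poisson(ns, size=trials)   # slot 0 carries the pulse
    best = counts.max(axis=1)
    ties = (counts == best[:, None]).sum(axis=1)
    win = (counts[:, 0] == best) / ties            # probability of a correct pick
    return 1.0 - win.mean()

print(ppm_ser())
```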
Understanding network concepts in modules
2007-01-01
Background: Network concepts are increasingly used in biology and genetics. For example, the clustering coefficient has been used to understand network architecture; the connectivity (also known as degree) has been used to screen for cancer targets; and the topological overlap matrix has been used to define modules and to annotate genes. Dozens of potentially useful network concepts are known from graph theory. Results: Here we study network concepts in special types of networks, which we refer to as approximately factorizable networks. In these networks, the pairwise connection strength (adjacency) between 2 network nodes can be factored into node specific contributions, named node 'conformity'. The node conformity turns out to be highly related to the connectivity. To provide a formalism for relating network concepts to each other, we define three types of network concepts: fundamental, conformity-based, and approximate conformity-based concepts. Fundamental concepts include the standard definitions of connectivity, density, centralization, heterogeneity, clustering coefficient, and topological overlap. The approximate conformity-based analogs of fundamental network concepts have several theoretical advantages. First, they allow one to derive simple relationships between seemingly disparate network concepts. For example, we derive simple relationships between the clustering coefficient, the heterogeneity, the density, the centralization, and the topological overlap. The second advantage of approximate conformity-based network concepts is that they allow one to show that fundamental network concepts can be approximated by simple functions of the connectivity in module networks. Conclusion: Using protein-protein interaction, gene co-expression, and simulated data, we show that a) many networks composed of module nodes are approximately factorizable and b) in these types of networks, simple relationships exist between seemingly disparate network concepts. Our results are implemented in freely available R software code, which can be downloaded from the following webpage: http://www.genetics.ucla.edu/labs/horvath/ModuleConformity/ModuleNetworks PMID:17547772
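Several of the fundamental concepts named above reduce to short computations on an adjacency matrix. A NumPy sketch on a toy undirected network (standard graph-theory definitions; heterogeneity taken as the coefficient of variation of connectivity):

```python
import numpy as np

A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 1],
              [1, 1, 0, 1],
              [0, 1, 1, 0]], dtype=float)   # toy undirected network

k = A.sum(axis=1)                            # connectivity (degree)
n = len(A)
density = A.sum() / (n * (n - 1))
triangles = np.trace(A @ A @ A) / 6          # closed 3-cycles
triples = (k * (k - 1)).sum() / 2            # connected triples
clustering = 3 * triangles / triples         # global clustering coefficient
heterogeneity = k.std() / k.mean()           # variation of connectivity
print(k, density, clustering, heterogeneity)
```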
A code of ethics for evidence-based research with ancient human remains.
Kreissl Lonfat, Bettina M; Kaufmann, Ina Maria; Rühli, Frank
2015-06-01
As clinical research constantly advances and the concept of evolution becomes a strong and influential part of basic medical research, the absence of a discourse that deals with the use of ancient human remains in evidence-based research is becoming unbearable. While topics such as the exhibition and excavation of human remains are established fields of ethical discourse, when faced with the instrumentalization of ancient human remains for research (e.g., ancient DNA extraction for disease-marker analyses), answers from traditional ethics, or even from the more practical fields of bioethics or biomedical ethics, are rare to non-existent. The Centre for Evolutionary Medicine at the University of Zurich addressed its need for discursive action by writing its own code of ethics, developed in dialogue with the researchers at the institute and published online in September 2011: http://evolutionäremedizin.ch/coe/. The philosophico-ethical basis for this code of conduct and ethics, and the methods behind it, are published in this article. © 2015 Wiley Periodicals, Inc.
Sequential Syndrome Decoding of Convolutional Codes
NASA Technical Reports Server (NTRS)
Reed, I. S.; Truong, T. K.
1984-01-01
The algebraic structure of convolutional codes is reviewed and sequential syndrome decoding is applied to these codes. These concepts are then used to realize actual sequential decoding by example, using the stack algorithm. The Fano metric for use in sequential decoding is modified so that it can be used to sequentially find the minimum-weight error sequence.
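A schematic of the stack algorithm for a toy rate-1/2, constraint-length-3 convolutional code (generators 7 and 5 octal). The integer branch metric below is a simplified stand-in for the Fano metric, not the modified metric of the paper:

```python
import heapq

def branch(state, u):
    """One encoder step: state holds the two most recent input bits."""
    s1, s0 = state
    return (u ^ s1 ^ s0, u ^ s0), (u, s1)      # generators 111 and 101

def encode(bits):
    state, out = (0, 0), []
    for u in bits:
        o, state = branch(state, u)
        out.extend(o)
    return out

def stack_decode(received, bias=1, penalty=9):
    """Stack algorithm: always extend the best partial path first.
    Per coded bit: +bias on agreement, -penalty on disagreement."""
    L = len(received) // 2
    stack = [(0, 0, (0, 0), ())]                # (-metric, depth, state, path)
    while stack:
        neg_m, depth, state, path = heapq.heappop(stack)
        if depth == L:
            return path                          # first full path popped wins
        r = received[2 * depth: 2 * depth + 2]
        for u in (0, 1):
            out, nxt = branch(state, u)
            m = -neg_m + sum(bias if o == x else -penalty
                             for o, x in zip(out, r))
            heapq.heappush(stack, (-m, depth + 1, nxt, path + (u,)))

msg = [1, 0, 1, 1, 0, 0]                         # two tail zeros included
rx = encode(msg)
rx[3] ^= 1                                       # single channel error
print(stack_decode(rx), "vs", tuple(msg))        # error is corrected
```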
Proceduracy: Computer Code Writing in the Continuum of Literacy
ERIC Educational Resources Information Center
Vee, Annette
2010-01-01
This dissertation looks at computer programming through the lens of literacy studies, building from the concept of code as a written text with expressive and rhetorical power. I focus on the intersecting technological and social factors of computer code writing as a literacy--a practice I call "proceduracy". Like literacy, proceduracy is a human…
Source Code Plagiarism--A Student Perspective
ERIC Educational Resources Information Center
Joy, M.; Cosma, G.; Yau, J. Y.-K.; Sinclair, J.
2011-01-01
This paper considers the problem of source code plagiarism by students within the computing disciplines and reports the results of a survey of students in Computing departments in 18 institutions in the U.K. This survey was designed to investigate how well students understand the concept of source code plagiarism and to discover what, if any,…
NASA Astrophysics Data System (ADS)
Ahmed, Ali; Hasan, Rafiq; Pekau, Oscar A.
2016-12-01
Two recent developments have come to the forefront with reference to updating the seismic design provisions of codes: (1) the publication of new seismic hazard maps for Canada by the Geological Survey of Canada (GSC), and (2) the emergence of a new spectral format that outdates the conventional standardized spectral format. The fourth-generation seismic hazard maps are based on enriched seismic data, enhanced knowledge of regional seismicity and improved seismic hazard modeling techniques. The new maps are therefore more accurate and need to be incorporated into the next edition of the Canadian Highway Bridge Design Code (CHBDC), as has been done for its building counterpart, the National Building Code of Canada (NBCC). In fact, the code writers expressed similar intentions in the commentary of CHBDC 2006. During their code-updating processes, the NBCC and the AASHTO Guide Specifications for LRFD Seismic Bridge Design (American Association of State Highway and Transportation Officials, Washington, 2009) lowered the probability level from 10% to 2% and from 10% to 5%, respectively. This study investigates five sets of hazard maps developed by the GSC, corresponding to 2%, 5% and 10% probability of exceedance in 50 years. To support sound statistical inference, 389 Canadian cities are selected. This study shows the implications of the new hazard maps for the design process (i.e., the extent of magnification or reduction of the design forces).
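The quoted probability levels map to return periods under the standard Poisson hazard assumption; a two-line check reproduces the familiar 475-, 975- and 2475-year values:

```python
import math

def return_period(p_exceed, t_years=50):
    """Return period T such that P(exceedance in t years) = 1 - exp(-t/T)."""
    return -t_years / math.log(1.0 - p_exceed)

for p in (0.10, 0.05, 0.02):
    print(f"{p:.0%} in 50 years ~ 1 in {return_period(p):.0f} years")
```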
Rodrigues, J M; Trombert-Paviot, B; Baud, R; Wagner, J; Meusnier-Carriot, F
1998-01-01
GALEN has developed a language-independent common reference model based on a medically oriented ontology, together with practical tools and techniques for managing healthcare terminology, including natural language processing. GALEN-IN-USE is the current phase, which applies the modelling and the tools to the development or updating of coding systems for surgical procedures in different national coding centres co-operating within the European Federation of Coding Centres (EFCC), to create a language-independent knowledge repository for multicultural Europe. We used an integrated set of artificial intelligence terminology tools, the CLAssification Manager workbench, to process French professional medical language rubrics into intermediate dissections and then into the GRAIL reference ontology representation. From this language-independent concept-model representation we generate controlled French natural language. The French national coding centre is then able to retrieve the initial professional rubrics with different categories of concepts, to compare the professional language proposed by expert clinicians with the French generated controlled vocabulary, and to finalize the linguistic labels of the coding system in relation to the meanings of the conceptual system structure.
Upgrades to the NESS (Nuclear Engine System Simulation) Code
NASA Technical Reports Server (NTRS)
Fittje, James E.
2007-01-01
In support of the President's Vision for Space Exploration, the Nuclear Thermal Rocket (NTR) concept is being evaluated as a potential propulsion technology for human expeditions to the moon and Mars. The need for exceptional propulsion system performance in these missions has been documented in numerous studies, and was the primary focus of a considerable effort undertaken during the 1960s and 1970s. The NASA Glenn Research Center is leveraging this past NTR investment in its vehicle concepts and mission analysis studies with the aid of the Nuclear Engine System Simulation (NESS) code. This paper presents the additional capabilities and upgrades made to this code in order to perform higher fidelity NTR propulsion system analysis and design.
The DRG shift: a new twist for ICD-10 preparation.
Long, Peri L
2012-06-01
Analysis of your specific business is a key component of ICD-10 implementation. An understanding of your organization's current reimbursement trends will go a long way to assessing and preparing for the impact of ICD-10 in your environment. If you cannot be prepared for each detailed scenario, remember that much of the analysis and resolution requires familiar coding, DRG analysis, and claims processing best practices. Now, they simply have the new twist of researching new codes and some new concepts. The news of a delay in the implementation compliance date, along with the release of grouper Version 29, should encourage your educational and business analysis efforts. This is a great opportunity to maintain open communication with the Centers for Medicare & Medicaid Services, Department of Health and Human Services, and Centers for Disease Control. This is also a key time to report any unusual or discrepant findings in order to provide input to the final rule.
tRNA-Derived Small RNA: A Novel Regulatory Small Non-Coding RNA.
Li, Siqi; Xu, Zhengping; Sheng, Jinghao
2018-05-10
Deep analysis of next-generation sequencing data unveils numerous small non-coding RNAs with distinct functions. Recently, fragments derived from tRNA, termed tRNA-derived small RNAs (tsRNAs), have attracted broad attention. There are two main types of tsRNA: tRNA-derived stress-induced RNA (tiRNA) and tRNA-derived fragments (tRF), which differ in the cleavage position within the precursor or mature tRNA transcript. Emerging evidence has shown that tsRNAs are not merely tRNA degradation debris but play regulatory roles in many specific physiological and pathological processes. In this review, we summarize the biogenesis of various tsRNAs, present the emerging concepts regarding their functions and mechanisms of action, highlight the potential application of tsRNAs in human diseases, and put forward the current problems and future research directions.
A crystallographic model for nickel base single crystal alloys
NASA Technical Reports Server (NTRS)
Dame, L. T.; Stouffer, D. C.
1988-01-01
The purpose of this research is to develop a tool for the mechanical analysis of nickel-base single-crystal superalloys, specifically Rene N4, used in gas turbine engine components. This objective is achieved by developing a rate-dependent anisotropic constitutive model and implementing it in a nonlinear three-dimensional finite-element code. The constitutive model is developed from metallurgical concepts utilizing a crystallographic approach. An extension of Schmid's law is combined with the Bodner-Partom equations to model the inelastic tension/compression asymmetry and orientation-dependence in octahedral slip. Schmid's law is used to approximate the inelastic response of the material in cube slip. The constitutive equations model the tensile behavior, creep response and strain-rate sensitivity of the single-crystal superalloys. Methods for deriving the material constants from standard tests are also discussed. The model is implemented in a finite-element code, and the computed and experimental results are compared for several orientations and loading conditions.
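The octahedral-slip ingredient rests on Schmid's law. A minimal sketch of the resolved shear stress computation (generic crystallography; not the Bodner-Partom coupling developed in the paper):

```python
import numpy as np

def resolved_shear_stress(sigma, load_dir, slip_plane_normal, slip_dir):
    """Schmid's law: tau = sigma * cos(phi) * cos(lambda)."""
    l = load_dir / np.linalg.norm(load_dir)
    n = slip_plane_normal / np.linalg.norm(slip_plane_normal)
    s = slip_dir / np.linalg.norm(slip_dir)
    return sigma * abs(l @ n) * abs(l @ s)

# Octahedral slip system (111)[-101] under 100 MPa tension along [001]:
tau = resolved_shear_stress(100.0, np.array([0, 0, 1.0]),
                            np.array([1, 1, 1.0]), np.array([-1, 0, 1.0]))
print(tau)   # Schmid factor 1/sqrt(6) ~ 0.408, so tau ~ 40.8 MPa
```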
Computer Graphics and Metaphorical Elaboration for Learning Science Concepts.
ERIC Educational Resources Information Center
ChanLin, Lih-Juan; Chan, Kung-Chi
This study explores the instructional impact of using computer multimedia to integrate metaphorical verbal information into graphical representations of biotechnology concepts. The combination of text and graphics into a single metaphor makes concepts dual-coded, and therefore more comprehensible and memorable for the student. Visual stimuli help…
Subsumption principles underlying medical concept systems and their formal reconstruction.
Bernauer, J.
1994-01-01
Conventional medical concept systems represent generic concept relations by hierarchical coding principles. Often, these coding principles constrain the concept system and reduce the potential for automatic derivation of subsumption. Formal reconstruction of medical concept systems is an approach based on the conceptual representation of meanings that allows for the application of formal criteria for subsumption. Those criteria must reflect the intuitive principles of subordination underlying conventional medical concept systems. In particular, these are: the subordinate concept results (1) from adding a specializing criterion to the superordinate concept, (2) from refining the primary category, or a criterion of the superordinate concept, by a concept that is less general, (3) from adding a partitive criterion to a criterion of the superordinate, (4) from refining a criterion by a concept that is less comprehensive, and finally (5) from coordinating the superordinate concept, or one of its criteria. This paper introduces a formalism called BERNWARD that aims at the formal reconstruction of medical concept systems according to these intuitive principles. The automatic derivation of hierarchical relations is supported primarily by explicit generic and explicit partitive hierarchies of concepts and, secondly, by two formal criteria based on the structure of concept descriptions and explicit hierarchical relations between their elements, namely formal subsumption and part-sensitive subsumption. Formal subsumption takes only generic relations into account; part-sensitive subsumption additionally regards partitive relations between criteria. This approach seems to be flexible enough to cope with unforeseeable effects of partitive criteria on subsumption. PMID:7949907
SEADYN Analysis of a Tow Line for a High Altitude Towed Glider
NASA Technical Reports Server (NTRS)
Colozza, Anthony J.
1996-01-01
The concept of using a system, consisting of a tow aircraft, glider and tow line, which would enable subsonic flight at altitudes above 24 km (78 kft) has previously been investigated. The preliminary results from these studies seem encouraging, and under certain conditions they indicate the concept is feasible. However, the previous studies did not accurately take into account the forces acting on the tow line. Therefore, in order to investigate the concept further, a more detailed analysis was needed. The code selected was the SEADYN cable dynamics computer program, which was developed at the Naval Facilities Engineering Service Center. The program is a finite element based structural analysis code developed over a period of 10 years, and its results have been validated by the Navy in both laboratory and actual sea conditions. The code was used to simulate arbitrarily-configured cable structures subjected to excitations encountered in real-world operations. The Navy's interest was mainly in modeling underwater tow lines; however, the code is also usable for tow lines in air when the change in fluid properties is taken into account. For underwater applications the fluid properties are essentially constant over the length of the tow line, whereas for the tow aircraft/glider application the change in fluid properties along the tow line is considerable. Therefore the code had to be modified to take into account the variation in atmospheric properties encountered in this application. This modification consisted of adding a variable fluid density based on the altitude of the node being calculated. This change in the way the code handles the fluid density had no effect on the method of calculation or any other factor related to the code's validation.
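The density modification can be pictured as follows; this isothermal scale-height model is a deliberate simplification of the standard-atmosphere profile an analysis code would actually interpolate per node:

```python
import math

RHO0 = 1.225       # sea-level air density, kg/m^3
H_SCALE = 8500.0   # approximate atmospheric scale height, m

def air_density(altitude_m):
    """Isothermal-atmosphere approximation of density versus altitude."""
    return RHO0 * math.exp(-altitude_m / H_SCALE)

for h in (0, 12_000, 24_000):
    print(h, round(air_density(h), 4))   # density falls by ~94% at 24 km
```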
Thomson, Oliver P; Petty, Nicola J; Moore, Ann P
2014-02-01
How practitioners conceive clinical practice influences many aspects of their clinical work including how they view knowledge, clinical decision-making, and their actions. Osteopaths have relied upon the philosophical and theoretical foundations upon which the profession was built to guide clinical practice. However, it is currently unknown how osteopaths conceive clinical practice, and how these conceptions develop and influence their clinical work. This paper reports the conceptions of practice of experienced osteopaths in the UK. A constructivist grounded theory approach was taken in this study. The constant comparative method of analysis was used to code and analyse data. Purposive sampling was employed to initially select participants. Subsequent theoretical sampling, informed by data analysis, allowed specific participants to be sampled. Data collection methods involved semi-structured interviews and non-participant observation of practitioners during a patient appointment, which was video-recorded and followed by a video-prompted reflective interview. Participants' conception of practice lay on a continuum, from technical rationality to professional artistry and the development of which was influenced by their educational experience, view of health and disease, epistemology of practice knowledge, theory-practice relationship and their perceived therapeutic role. The findings from this study provide the first theoretical insight of osteopaths' conceptions of clinical practice and the factors which influence such conceptions. Copyright © 2013 Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Butler, Madeline J.; Sonneborn, George; Perkins, Dorothy C.
1994-01-01
The Mission Operations and Data Systems Directorate (MO&DSD, Code 500), the Space Sciences Directorate (Code 600), and the Flight Projects Directorate (Code 400) have developed a new approach to combine the science and mission operations for the FUSE mission. FUSE, the last of the Delta-class Explorer missions, will obtain high resolution far ultraviolet spectra (910-1220 Å) of stellar and extragalactic sources to study the evolution of galaxies and conditions in the early universe. FUSE will be launched in 2000 into a 24-hour highly eccentric orbit. Science operations will be conducted in real time for 16-18 hours per day, in a manner similar to the operations performed today for the International Ultraviolet Explorer. In a radical departure from previous missions, the operations concept combines spacecraft and science operations and data processing functions in a single facility to be housed in the Laboratory for Astronomy and Solar Physics (Code 680). A small mission operations team will provide the spacecraft control, telescope operations and data handling functions in a facility designated the Science and Mission Operations Center (SMOC). This approach will utilize the Transportable Payload Operations Control Center (TPOCC) architecture for both spacecraft and instrument commanding. Other concepts of integrated operations being developed by the Code 500 Renaissance Project will also be employed for the FUSE SMOC. The primary objective of this approach is to reduce development and mission operations costs. The operations concept, integration of mission and science operations, and extensive use of existing hardware and software tools will decrease both development and operations costs substantially. This paper describes the FUSE operations concept, discusses the systems engineering approach used for its development, and the software, hardware and management tools that will make its implementation feasible.
Local structure preserving sparse coding for infrared target recognition
Han, Jing; Yue, Jiang; Zhang, Yi; Bai, Lianfa
2017-01-01
Sparse coding performs well in image classification. However, robust target recognition requires a lot of comprehensive template images and the sparse learning process is complex. We incorporate sparsity into a template matching concept to construct a local sparse structure matching (LSSM) model for general infrared target recognition. A local structure preserving sparse coding (LSPSc) formulation is proposed to simultaneously preserve the local sparse and structural information of objects. By adding a spatial local structure constraint into the classical sparse coding algorithm, LSPSc can improve the stability of sparse representation for targets and inhibit background interference in infrared images. Furthermore, a kernel LSPSc (K-LSPSc) formulation is proposed, which extends LSPSc to the kernel space to weaken the influence of the linear structure constraint in nonlinear natural data. Because of the anti-interference and fault-tolerant capabilities, both LSPSc- and K-LSPSc-based LSSM can implement target identification based on a simple template set, which just needs several images containing enough local sparse structures to learn a sufficient sparse structure dictionary of a target class. Specifically, this LSSM approach has stable performance in the target detection with scene, shape and occlusions variations. High performance is demonstrated on several datasets, indicating robust infrared target recognition in diverse environments and imaging conditions. PMID:28323824
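For orientation, a minimal ISTA-style sparse coding sketch. The optional quadratic coupling term is a crude stand-in for a local structure constraint, not the LSPSc formulation itself, and all parameters are illustrative:

```python
import numpy as np

def soft(x, t):
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def sparse_code(D, x, lam=0.1, mu=0.0, L_nb=None, n_iter=200):
    """ISTA for 0.5*||x - D @ a||^2 + lam*||a||_1, optionally plus a smooth
    coupling mu * a @ L_nb @ a standing in for a structure constraint."""
    step = 1.0 / np.linalg.norm(D, 2) ** 2
    a = np.zeros(D.shape[1])
    for _ in range(n_iter):
        grad = D.T @ (D @ a - x)
        if L_nb is not None:
            grad = grad + mu * (L_nb @ a)
        a = soft(a - step * grad, step * lam)
    return a

rng = np.random.default_rng(0)
D = rng.standard_normal((64, 128))
D /= np.linalg.norm(D, axis=0)            # unit-norm dictionary atoms
x = 2.0 * D[:, 3] - 1.5 * D[:, 40]        # signal built from two atoms
a = sparse_code(D, x)
print(np.flatnonzero(np.abs(a) > 0.5))    # dominant atoms: expect 3 and 40
```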
Clinical laboratory sciences data transmission: the NPU coding system
PONTET, Françoise; PETERSEN, Ulla MAGDAL; FUENTES-ARDERIU, Xavier; NORDIN, Gunnar; BRUUNSHUUS, Ivan; IHALAINEN, Jarkko; KARLSSON, Daniel; FORSUM, Urban; DYBKAER, René; SCHADOW, Gunther; KUELPMANN, Wolf; FÉRARD, Georges; KANG, Dongchon; McDONALD, Clement; HILL, Gilbert
2011-01-01
Introduction: In health care services, technology requires that correct information be duly available to professionals, citizens and authorities, worldwide. Thus, clinical laboratory sciences require standardized electronic exchanges for results of laboratory examinations. Methods: The NPU (Nomenclature, Properties and Units) coding system provides a terminology for identification of result values (property values). It is structured according to BIPM, ISO, IUPAC and IFCC recommendations. It uses standard terms for established concepts and structured definitions describing: which part of the universe is examined, which component of relevance in that part, which kind-of-property is relevant. Unit and specifications can be added where relevant [System(spec)-Component(spec); kind-of-property(spec) = ? unit]. Results: The English version of this terminology is freely accessible at http://dior.imt.liu.se/cnpu/ and http://www.labterm.dk, directly or through the IFCC and IUPAC websites. It has been nationally used for more than 10 years in Denmark and Sweden and has been translated into 6 other languages. Conclusions: The NPU coding system provides a terminology for dedicated kinds-of-property following the international recommendations. It fits well in the health network and is freely accessible. Clinical laboratory professionals worldwide will find many advantages in using the NPU coding system, notably with regards to an accreditation process. PMID:19745311
A Method to Determine the Impact of Patient-Centered Care Interventions in Primary Care
Daaleman, Timothy P.; Shea, Christopher M.; Halladay, Jacqueline; Reed, David
2014-01-01
INTRODUCTION The implementation of patient-centered care (PCC) innovations continues to be poorly understood. We used the implementation effectiveness framework to pilot a method for measuring the impact of a PCC innovation in primary care practices. METHODS We analyzed data from a prior study that assessed the implementation of an electronic geriatric quality-of-life (QOL) module in 3 primary care practices in central North Carolina in 2011–12. Patients responded to the items and the subsequent patient-provider encounter was coded using the Roter Interaction Analysis System (RIAS). We developed an implementation effectiveness measure specific to the QOL module (i.e., frequency of usage during the encounter) using RIAS and then tested for differences in RIAS codes using analysis of variance. RESULTS A total of 60 patient-provider encounters were examined for differences between the uptake of the QOL module (i.e., the implementation-effectiveness measure) and the frequency of RIAS codes during the encounter (i.e., the patient-centeredness measure). There was a significant association between the effectiveness measure and patient-centered RIAS codes. CONCLUSION The concept of implementation effectiveness provided a useful framework to determine the impact of a PCC innovation. PRACTICE IMPLICATIONS A method that captures real-time interactions between patients and care staff over time can meaningfully evaluate PCC innovations. PMID:25269410
Clinical laboratory sciences data transmission: the NPU coding system.
Pontet, Françoise; Magdal Petersen, Ulla; Fuentes-Arderiu, Xavier; Nordin, Gunnar; Bruunshuus, Ivan; Ihalainen, Jarkko; Karlsson, Daniel; Forsum, Urban; Dybkaer, René; Schadow, Gunther; Kuelpmann, Wolf; Férard, Georges; Kang, Dongchon; McDonald, Clement; Hill, Gilbert
2009-01-01
In health care services, technology requires that correct information be duly available to professionals, citizens and authorities, worldwide. Thus, clinical laboratory sciences require standardized electronic exchanges for results of laboratory examinations. The NPU (Nomenclature, Properties and Units) coding system provides a terminology for identification of result values (property values). It is structured according to BIPM, ISO, IUPAC and IFCC recommendations. It uses standard terms for established concepts and structured definitions describing: which part of the universe is examined, which component of relevance in that part, which kind-of-property is relevant. Unit and specifications can be added where relevant [System(spec)-Component(spec); kind-of-property(spec) = ? unit]. The English version of this terminology is freely accessible at http://dior.imt.liu.se/cnpu/ and http://www.labterm.dk, directly or through the IFCC and IUPAC websites. It has been nationally used for more than 10 years in Denmark and Sweden and has been translated into 6 other languages. The NPU coding system provides a terminology for dedicated kinds-of-property following the international recommendations. It fits well in the health network and is freely accessible. Clinical laboratory professionals worldwide will find many advantages in using the NPU coding system, notably with regards to an accreditation process.
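The bracketed template lends itself to mechanical parsing. A sketch with an invented record (not an official NPU entry; real NPU records also carry a code number, omitted here):

```python
import re

# Pattern follows the abstract's template:
#   System(spec)-Component(spec); kind-of-property(spec) = ? unit
NPU_SHAPE = re.compile(
    r"(?P<system>[^;-]+)-(?P<component>[^;]+);\s*"
    r"(?P<kind_of_property>[^=]+?)\s*=\s*\?\s*(?P<unit>.+)")

m = NPU_SHAPE.match("Plasma-Sodium ion; substance concentration = ? mmol/L")
print(m.groupdict())
```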
Spatial transform coding of color images.
NASA Technical Reports Server (NTRS)
Pratt, W. K.
1971-01-01
The application of the transform-coding concept to the coding of color images represented by three primary color planes of data is discussed. The principles of spatial transform coding are reviewed and the merits of various methods of color-image representation are examined. A performance analysis is presented for the color-image transform-coding system. Results of a computer simulation of the coding system are also given. It is shown that, by transform coding, the chrominance content of a color image can be coded with an average of 1.0 bits per element or less without serious degradation. If luminance coding is also employed, the average rate reduces to about 2.0 bits per element or less.
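The core transform-coding move (transform, keep a few coefficients, invert) takes only a few lines of NumPy on a stand-in chrominance block. The paper's coder additionally quantizes and allocates bits, which this sketch skips:

```python
import numpy as np

def dct_matrix(n=8):
    """Orthonormal DCT-II matrix: rows are frequencies, columns samples."""
    k = np.arange(n)
    C = np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
    C[0] *= 1 / np.sqrt(2)
    return C * np.sqrt(2 / n)

C = dct_matrix()
block = np.random.default_rng(0).random((8, 8))   # stand-in chrominance block
coef = C @ block @ C.T                            # 2-D transform
mask = np.add.outer(range(8), range(8)) < 4       # keep 10 low-frequency coefs
approx = C.T @ (coef * mask) @ C                  # reconstruct from 10/64 coefs
print(np.abs(block - approx).mean())              # mean reconstruction error
```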
Legislative recognition in France of psychological harassment at work.
Graser, M; Manaouil, C; Verrier, A; Doutrellot-Phillipon, C; Jardé, O
2003-01-01
The recent French Law on Social Modernisation of 17 January 2002 introduced the concept of "moral" harassment into the French Labour Code and the French Criminal Code. The law adopts quite a broad conception of the notion of psychological harassment. The legislator has established a means for "friendly" settlement of disputes: mediation. When it has not been possible to settle a dispute internally, the courts have a number of sanctions available to them. The French Labour Code provides that any termination of the contract of employment resulting from a situation of psychological harassment is automatically null and void; such nullification should therefore apply whatever the nature of the termination: dismissal, resignation or negotiated departure. The Labour Code punishes psychological harassment at work by imprisonment for one year and a fine of 3,750 Euros, while the French Criminal Code prescribes penalties of one year's imprisonment and a fine of 15,000 Euros.
EXPERIENCES FROM THE SOURCE-TERM ANALYSIS OF A LOW AND INTERMEDIATE LEVEL RADWASTE DISPOSAL FACILITY
DOE Office of Scientific and Technical Information (OSTI.GOV)
Park,Jin Beak; Park, Joo-Wan; Lee, Eun-Young
2003-02-27
Enhancement of the computer code SAGE for evaluation of the Korean concept for a LILW disposal facility is discussed. Several features of source term analysis are embedded into SAGE to analyze: (1) effects of the degradation mode of an engineered barrier, (2) effects of dispersion phenomena in the unsaturated zone, and (3) effects of a time-dependent sorption coefficient in the unsaturated zone. IAEA's Vault Safety Case (VSC) approach is used to demonstrate the ability of this assessment code. Results of MASCOT are used for comparison purposes. These enhancements of the safety assessment code SAGE can contribute to realistic evaluation of the Korean concept of the LILW disposal project in the near future.
Stagich, Brooke H; Moore, Kelsey R; Newton, Joseph R; Dixon, Kenneth L; Jannik, G Timothy
2017-04-01
Most U.S. Department of Energy (DOE) facilities with radiological airborne releases use the U.S. Environmental Protection Agency's (EPA) environmental dosimetry code CAP88-PC to demonstrate compliance with regulations in 40CFR61, subpart H [National Emission Standards for Hazardous Air Pollutants: Radiological (NESHAP)]. In 2015, EPA released Version 4 of CAP88-PC, which included significant modifications that improved usability and introduced age-dependent dose coefficients and usage factors for six age groups (infant, 1 y, 5 y, 10 y, 15 y, and adult). However, EPA has not yet provided specific guidance on how to use these age-dependent factors. For demonstrating compliance with DOE public dose regulations, the Savannah River Site (SRS) recently changed from using the maximally exposed individual (MEI) concept (adult male) to the representative person concept (age- and gender-averaged reference person). In this study, dose comparisons are provided between the MEI and a SRS-specific representative person using the age-specific dose coefficients and usage factors in CAP88-PC V.4. Dose comparisons also are provided for each of the six age groups using five radionuclides of interest at SRS (tritium oxide, Cs, Sr, Pu, and I). In general, the total effective dose increases about 11% for the representative person as compared to the current NESHAP MEI because of the inclusion of the more radiosensitive age groups.
Simulation of the space station information system in Ada
NASA Technical Reports Server (NTRS)
Spiegel, James R.
1986-01-01
The Flexible Ada Simulation Tool (FAST) is a discrete event simulation language written in Ada. FAST has been used to simulate a number of options for ground data distribution of Space Station payload data. The fact that the Ada language is used for implementation has allowed a number of useful interactive features to be built into FAST and has facilitated quick enhancement of its capabilities to support new modeling requirements. General simulation concepts are discussed, along with how these concepts are implemented in FAST. The FAST design is discussed, and it is pointed out how the use of the Ada language enabled the development of some significant advantages over classical FORTRAN-based simulation languages. The advantages discussed are in the areas of efficiency, ease of debugging, and ease of integrating user code. The specific Ada language features that enable these advantages are discussed.
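FAST itself is Ada, but the discrete-event core it implements is compact in any language. A minimal Python sketch of the clock-plus-event-queue pattern (illustrative only, not FAST's design):

```python
import heapq

class Simulator:
    """Minimal discrete-event core: a clock plus a time-ordered event queue."""
    def __init__(self):
        self.now, self._q, self._n = 0.0, [], 0
    def schedule(self, delay, action):
        self._n += 1                        # tie-breaker keeps insertion order
        heapq.heappush(self._q, (self.now + delay, self._n, action))
    def run(self, until):
        while self._q and self._q[0][0] <= until:
            self.now, _, action = heapq.heappop(self._q)
            action()

sim = Simulator()
def packet_arrival():
    print(f"{sim.now:6.2f}  packet arrives")
    sim.schedule(2.5, packet_arrival)       # periodic traffic source
sim.schedule(0.0, packet_arrival)
sim.run(until=10.0)
```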
Error-correction coding for digital communications
NASA Astrophysics Data System (ADS)
Clark, G. C., Jr.; Cain, J. B.
This book is written for the design engineer who must build the coding and decoding equipment and for the communication system engineer who must incorporate this equipment into a system. It is also suitable as a senior-level or first-year graduate text for an introductory one-semester course in coding theory. Fundamental concepts of coding are discussed along with group codes, taking into account basic principles, practical constraints, performance computations, coding bounds, generalized parity check codes, polynomial codes, and important classes of group codes. Other topics explored are related to simple nonalgebraic decoding techniques for group codes, soft decision decoding of block codes, algebraic techniques for multiple error correction, the convolutional code structure and Viterbi decoding, syndrome decoding techniques, and sequential decoding techniques. System applications are also considered, giving attention to concatenated codes, coding for the white Gaussian noise channel, interleaver structures for coded systems, and coding for burst noise channels.
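As a taste of the book's material on parity-check codes and syndrome decoding, here is a Hamming(7,4) encoder/decoder in NumPy (a textbook construction, not an excerpt from the book):

```python
import numpy as np

# Hamming(7,4): a single-error-correcting parity-check code.
P = np.array([[1, 1, 0], [1, 0, 1], [0, 1, 1], [1, 1, 1]])
G = np.hstack([np.eye(4, dtype=int), P])       # generator, systematic form
H = np.hstack([P.T, np.eye(3, dtype=int)])     # parity-check matrix

def encode(u):
    return (u @ G) % 2

def decode(r):
    s = (H @ r) % 2                            # syndrome
    if s.any():                                # nonzero -> locate the error
        err = int(np.flatnonzero((H.T == s).all(axis=1))[0])
        r = r.copy(); r[err] ^= 1
    return r[:4]                               # data bits are systematic

u = np.array([1, 0, 1, 1])
r = encode(u); r[5] ^= 1                       # inject one channel error
print(decode(r), "vs", u)                      # error corrected
```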
Campbell, J R; Carpenter, P; Sneiderman, C; Cohn, S; Chute, C G; Warren, J
1997-01-01
To compare three potential sources of controlled clinical terminology (READ codes version 3.1, SNOMED International, and Unified Medical Language System (UMLS) version 1.6) relative to attributes of completeness, clinical taxonomy, administrative mapping, term definitions and clarity (duplicate coding rate). The authors assembled 1929 source concept records from a variety of clinical information taken from four medical centers across the United States. The source data included medical as well as ample nursing terminology. The source records were coded in each scheme by an investigator and checked by the coding scheme owner. The codings were then scored by an independent panel of clinicians for acceptability. Codes were checked for definitions provided with the scheme. Codes for a random sample of source records were analyzed by an investigator for "parent" and "child" codes within the scheme. Parent and child pairs were scored by an independent panel of medical informatics specialists for clinical acceptability. Administrative and billing code mappings from the published schemes were reviewed for all coded records and analyzed by independent reviewers for accuracy. The investigator for each scheme exhaustively searched a sample of coded records for duplications. SNOMED was judged to be significantly more complete in coding the source material than the other schemes (SNOMED* 70%; READ 57%; UMLS 50%; *p < .00001). SNOMED also had a richer clinical taxonomy judged by the number of acceptable first-degree relatives per coded concept (SNOMED* 4.56, UMLS 3.17; READ 2.14, *p < .005). Only the UMLS provided any definitions; these were found for 49% of records which had a coding assignment. READ and UMLS had better administrative mappings (composite score: READ* 40.6%; UMLS* 36.1%; SNOMED 20.7%, *p < .00001), and SNOMED had substantially more duplications of coding assignments (duplication rate: READ 0%; UMLS 4.2%; SNOMED* 13.9%, *p < .004) associated with a loss of clarity. No major terminology source can lay claim to being the ideal resource for a computer-based patient record. However, based upon this analysis of releases for April 1995, SNOMED International is considerably more complete, has a compositional nature and a richer taxonomy. It suffers from reduced clarity, resulting from a lack of syntax and evolutionary changes in its coding scheme. READ has greater clarity and better mapping to administrative schemes (ICD-10 and OPCS-4), is rapidly changing and is less complete. UMLS is a rich lexical resource, with mappings to many source vocabularies. It provides definitions for many of its terms. However, due to the varying granularities and purposes of its source schemes, it has limitations for representation of clinical concepts within a computer-based patient record.
Semi-Automated Annotation of Biobank Data Using Standard Medical Terminologies in a Graph Database.
Hofer, Philipp; Neururer, Sabrina; Goebel, Georg
2016-01-01
Data describing biobank resources frequently contains unstructured free-text information or insufficient coding standards. (Bio-) medical ontologies like Orphanet Rare Diseases Ontology (ORDO) or the Human Disease Ontology (DOID) provide a high number of concepts, synonyms and entity relationship properties. Such standard terminologies increase quality and granularity of input data by adding comprehensive semantic background knowledge from validated entity relationships. Moreover, cross-references between terminology concepts facilitate data integration across databases using different coding standards. In order to encourage the use of standard terminologies, our aim is to identify and link relevant concepts with free-text diagnosis inputs within a biobank registry. Relevant concepts are selected automatically by lexical matching and SPARQL queries against a RDF triplestore. To ensure correctness of annotations, proposed concepts have to be confirmed by medical data administration experts before they are entered into the registry database. Relevant (bio-) medical terminologies describing diseases and phenotypes were identified and stored in a graph database which was tied to a local biobank registry. Concept recommendations during data input trigger a structured description of medical data and facilitate data linkage between heterogeneous systems.
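The lexical-matching step can be pictured with a miniature terminology. The dictionary and codes below are invented, and the paper's actual pipeline queries an RDF triplestore via SPARQL before a curator confirms each proposal:

```python
import re

# Invented term -> code map standing in for ORDO/DOID concepts and synonyms.
TERMS = {
    "retinopathy of prematurity": "CODE:0001",
    "rop": "CODE:0001",
    "diabetes mellitus": "CODE:0002",
}

def propose_annotations(free_text):
    """Whole-word lexical matching; hits are proposals, not final annotations."""
    text = " " + re.sub(r"[^a-z0-9]+", " ", free_text.lower()) + " "
    return sorted({code for term, code in TERMS.items() if f" {term} " in text})

print(propose_annotations("Suspected ROP, follow-up in 2 weeks"))
```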
NASA Astrophysics Data System (ADS)
Koval, Viacheslav
The seismic design provisions of the CSA-S6 Canadian Highway Bridge Design Code and the AASHTO LRFD Seismic Bridge Design Specifications have been developed primarily based on historical earthquake events that have occurred along the west coast of North America. For the design of seismic isolation systems, these codes include simplified analysis and design methods. The appropriateness and range of application of these methods are investigated through extensive parametric nonlinear time history analyses in this thesis. It was found that there is a need to adjust existing design guidelines to better capture the expected nonlinear response of isolated bridges. For isolated bridges located in eastern North America, new damping coefficients are proposed. The applicability limits of the code-based simplified methods have been redefined to ensure that the modified method will lead to conservative results and that a wider range of seismically isolated bridges can be covered by this method. The possibility of further improving current simplified code methods was also examined. By transforming the quantity of allocated energy into a displacement contribution, an idealized analytical solution is proposed as a new simplified design method. This method realistically reflects the effects of ground-motion and system design parameters, including the effects of a drifted oscillation center. The proposed method is therefore more appropriate than current simplified methods and is applicable to isolation systems exhibiting a wider range of properties. A multi-level-hazard performance matrix has been adopted by different seismic provisions worldwide and will be incorporated into the new edition of the Canadian CSA-S6-14 Bridge Design code. However, the combined effect and optimal use of isolation and supplemental damping devices in bridges have not yet been fully exploited to achieve enhanced performance under different levels of seismic hazard. A novel Dual-Level Seismic Protection (DLSP) concept is proposed and developed in this thesis, which permits optimum seismic performance to be achieved with combined isolation and supplemental damping devices in bridges. This concept is shown to represent an attractive design approach for both the upgrade of existing seismically deficient bridges and the design of new isolated bridges.
Multipath search coding of stationary signals with applications to speech
NASA Astrophysics Data System (ADS)
Fehn, H. G.; Noll, P.
1982-04-01
This paper deals with the application of multipath search coding (MSC) concepts to the coding of stationary memoryless and correlated sources, and of speech signals, at a rate of one bit per sample. Use is made of three MSC classes: (1) codebook coding, or vector quantization, (2) tree coding, and (3) trellis coding. The paper describes the performance of these coders and compares them both with conventional coders and with rate-distortion bounds. The potential of MSC strategies is demonstrated with illustrations. The paper also reports results of MSC coding of speech, where both adaptive quantization and adaptive prediction were included in the coder design.
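Of the three classes, codebook coding (vector quantization) is the simplest to sketch. The following trains a 1-bit-per-sample codebook on a memoryless Gaussian source with Lloyd-style iterations (parameters arbitrary; not the paper's coder designs):

```python
import numpy as np

def train_codebook(blocks, n_codes, n_iter=20, seed=0):
    """Lloyd-style training: alternate nearest-codeword assignment
    and centroid update."""
    rng = np.random.default_rng(seed)
    book = blocks[rng.choice(len(blocks), n_codes, replace=False)].copy()
    for _ in range(n_iter):
        idx = np.argmin(((blocks[:, None] - book[None]) ** 2).sum(-1), axis=1)
        for k in range(n_codes):
            if (idx == k).any():
                book[k] = blocks[idx == k].mean(axis=0)
    return book

rng = np.random.default_rng(1)
src = rng.standard_normal(20_000)       # memoryless Gaussian source
L = 2                                   # block length; 2**L codewords = 1 bit/sample
blocks = src.reshape(-1, L)
book = train_codebook(blocks, 2 ** L)
idx = np.argmin(((blocks[:, None] - book[None]) ** 2).sum(-1), axis=1)
mse = ((blocks - book[idx]) ** 2).mean()
print(f"SNR = {10 * np.log10(1.0 / mse):.2f} dB at 1 bit/sample")
```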
ERIC Educational Resources Information Center
Bowers, Jeffrey S.
2009-01-01
A fundamental claim associated with parallel distributed processing (PDP) theories of cognition is that knowledge is coded in a distributed manner in mind and brain. This approach rejects the claim that knowledge is coded in a localist fashion, with words, objects, and simple concepts (e.g. "dog"), that is, coded with their own dedicated…
Innovation and Standardization in School Building: A Proposal for the National Code in Italy.
ERIC Educational Resources Information Center
Ridolfi, Giuseppe
This document discusses the University of Florence's experience and concepts as it developed the research to define a proposal for designing a new national school building code. Section 1 examines the current school building code and the Italian Reform Process in Education between 1960 and 2000. Section 2 details and explains the new school…
Neuronal Reward and Decision Signals: From Theories to Data
Schultz, Wolfram
2015-01-01
Rewards are crucial objects that induce learning, approach behavior, choices, and emotions. Whereas emotions are difficult to investigate in animals, the learning function is mediated by neuronal reward prediction error signals which implement basic constructs of reinforcement learning theory. These signals are found in dopamine neurons, which emit a global reward signal to striatum and frontal cortex, and in specific neurons in striatum, amygdala, and frontal cortex projecting to select neuronal populations. The approach and choice functions involve subjective value, which is objectively assessed by behavioral choices eliciting internal, subjective reward preferences. Utility is the formal mathematical characterization of subjective value and a prime decision variable in economic choice theory. It is coded as utility prediction error by phasic dopamine responses. Utility can incorporate various influences, including risk, delay, effort, and social interaction. Appropriate for formal decision mechanisms, rewards are coded as object value, action value, difference value, and chosen value by specific neurons. Although all reward, reinforcement, and decision variables are theoretical constructs, their neuronal signals constitute measurable physical implementations and as such confirm the validity of these concepts. The neuronal reward signals provide guidance for behavior while constraining the free will to act. PMID:26109341
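The reward prediction error at the heart of these signals is the temporal-difference error of reinforcement learning. A tabular TD(0) sketch on a three-state cue-delay-reward chain (illustrative parameters, not a model from the review):

```python
import numpy as np

# TD(0): the update is driven by the reward prediction error
#   delta = r + gamma * V(s') - V(s),
# the quantity dopamine responses are reported to encode.
gamma, alpha = 0.9, 0.1
V = np.zeros(3)                        # states: 0 = cue, 1 = delay, 2 = reward
episode = [(0, 1, 0.0), (1, 2, 1.0)]   # (s, s_next, reward)

for _ in range(200):
    for s, s_next, r in episode:
        delta = r + gamma * V[s_next] - V[s]
        V[s] += alpha * delta

print(V)   # value propagates backward from the rewarded state to the cue
```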
Electromagnetic reprogrammable coding-metasurface holograms.
Li, Lianlin; Jun Cui, Tie; Ji, Wei; Liu, Shuo; Ding, Jun; Wan, Xiang; Bo Li, Yun; Jiang, Menghua; Qiu, Cheng-Wei; Zhang, Shuang
2017-08-04
Metasurfaces have enabled a plethora of emerging functions within an ultrathin dimension, paving the way towards flat and highly integrated photonic devices. Despite the rapid progress in this area, simultaneous realization of reconfigurability, high efficiency, and full control over the phase and amplitude of scattered light remains a great challenge. Here, we tackle this challenge by introducing the concept of a reprogrammable hologram based on 1-bit coding metasurfaces. The state of each unit cell of the coding metasurface can be switched between '1' and '0' by electrically controlling the loaded diodes. Our proof-of-concept experiments show that multiple desired holographic images can be realized in real time with only a single coding metasurface. The proposed reprogrammable hologram may be a key in enabling future intelligent devices with reconfigurable and programmable functionalities, leading to advances in a variety of applications such as microscopy, display, security, data storage, and information processing.
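In an idealized array-factor picture, a 1-bit coding pattern maps to a +1/-1 aperture field whose far-field scattering is approximated by a 2-D FFT. A NumPy sketch (it ignores element patterns, mutual coupling and feed illumination, all of which real metasurface design must handle):

```python
import numpy as np

rng = np.random.default_rng(0)
coding = rng.integers(0, 2, size=(32, 32))      # a 1-bit coding pattern
aperture = np.exp(1j * np.pi * coding)          # digit 0 -> +1, digit 1 -> -1
far_field = np.fft.fftshift(np.fft.fft2(aperture, s=(256, 256)))
intensity = np.abs(far_field) ** 2
print(intensity.max() / intensity.mean())       # crude scattering contrast
```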
Panel methods: An introduction
NASA Technical Reports Server (NTRS)
Erickson, Larry L.
1990-01-01
Panel methods are numerical schemes for solving the Prandtl-Glauert equation for linear, inviscid, irrotational flow about aircraft flying at subsonic or supersonic speeds. The tools at the panel-method user's disposal are (1) surface panels of source-doublet-vorticity distributions that can represent nearly arbitrary geometry, and (2) extremely versatile boundary condition capabilities that can frequently be used for creative modeling. Panel-method capabilities and limitations, basic concepts common to all panel-method codes, different choices that were made in the implementation of these concepts into working computer programs, and various modeling techniques involving boundary conditions, jump properties, and trailing wakes are discussed. An approach for extending the method to nonlinear transonic flow is also presented. Three appendices supplement the main text. In appendix 1, additional detail is provided on how the basic concepts are implemented into a specific computer program (PANAIR). In appendix 2, it is shown how to evaluate analytically the fundamental surface integral that arises in the expressions for influence coefficients, and how to evaluate its jump property. In appendix 3, a simple example is used to illustrate the so-called finite part of the improper integrals.
Culture and self in South Africa: individualism-collectivism predictions.
Eaton, L; Louw, J
2000-04-01
People from collectivist cultures may have more concrete and interdependent self-concepts than do people from individualist cultures (G. Hofstede, 1980). African cultures are considered collectivist (H. C. Triandis, 1989), but research on self-concept and culture has neglected this continent. The authors attempted a partial replication in an African context of cross-cultural findings on the abstract-concrete and independent-interdependent dimensions of self-construal (referred to as the abstract-specific and the autonomous-social dimensions, respectively, by E. Rhee, J. S. Uleman, H. K. Lee, & R. J. Roman, 1995). University students in South Africa took the 20 Statements Test (M. Kuhn & T. S. McPartland, 1954; Rhee et al.); home languages were rough indicators of cultural identity. The authors used 3 coding schemes to analyze the content of 78 protocols from African-language speakers and 77 protocols from English speakers. In accord with predictions from individualism-collectivism theory, the African-language speakers produced more interdependent and concrete self-descriptions than did the English speakers. Additional findings concerned the orthogonality of the 2 dimensions and the nature and assessment of the social self-concept.
NASA Technical Reports Server (NTRS)
Elrad, Tzilla (Editor); Filman, Robert E. (Editor); Bader, Atef (Editor)
2001-01-01
Computer science has experienced an evolution in programming languages and systems from the crude assembly and machine codes of the earliest computers through concepts such as formula translation, procedural programming, structured programming, functional programming, logic programming, and programming with abstract data types. Each of these steps in programming technology has advanced our ability to achieve clear separation of concerns at the source code level. Currently, the dominant programming paradigm is object-oriented programming - the idea that one builds a software system by decomposing a problem into objects and then writing the code of those objects. Such objects abstract together behavior and data into a single conceptual and physical entity. Object-orientation is reflected in the entire spectrum of current software development methodologies and tools - we have OO methodologies, analysis and design tools, and OO programming languages. Writing complex applications such as graphical user interfaces, operating systems, and distributed applications while maintaining comprehensible source code has been made possible with OOP. Success at developing simpler systems leads to aspirations for greater complexity. Object orientation is a clever idea, but has certain limitations. We are now seeing that many requirements do not decompose neatly into behavior centered on a single locus. Object technology has difficulty localizing concerns involving global constraints and pandemic behaviors, appropriately segregating concerns, and applying domain-specific knowledge. Post-object programming (POP) mechanisms that look to increase the expressiveness of the OO paradigm are a fertile arena for current research. Examples of POP technologies include domain-specific languages, generative programming, generic programming, constraint languages, reflection and metaprogramming, feature-oriented development, views/viewpoints, and asynchronous message brokering. (Czarnecki and Eisenecker's book includes a good survey of many of these technologies.)
NASA Technical Reports Server (NTRS)
Larson, William E.; Lueck, Dale E.; Parrish, Clyde F.; Sanders, Gerald B.; Trevathan, Joseph R.; Baird, R. Scott; Simon, Tom; Peters, T.; Delgado, H. (Technical Monitor)
2001-01-01
As we look forward into the new millennium, the extension of human presence beyond Low-Earth Orbit (LEO) looms large in the plans of NASA. The Agency's Strategic Plan specifically calls out the need to identify and develop technologies for 100- and 1000-day class missions beyond LEO. To meet the challenge of these extended duration missions, it is important that we learn how to utilize the indigenous resources available to us on extraterrestrial bodies. This concept, known as In-Situ Resource Utilization (ISRU), can greatly reduce the launch mass and cost of human missions while reducing the risk. These technologies may also pave the way for the commercial development of space. While no specific target beyond LEO is identified in NASA's Strategic Plan, mission architecture studies have been on-going for the Moon, Mars, Near-Earth Asteroids and Earth/Moon & Earth/Sun Libration Points. As a result of these studies, the NASA Office of Space Flight (Code M), through the Johnson and Kennedy Space Centers, is leading the effort to develop ISRU technologies and systems to meet the current and future needs of human missions beyond LEO and on to Mars. This effort also receives support from the NASA Office of Biological and Physical Research (Code U), the Office of Space Science (Code S), and the Office of Aerospace Technology (Code R). This paper will present unique developments in the area of fuel and oxidizer production, breathing air production, water production, CO2 collection, separation of atmospheric gases, and gas liquefaction and storage. A technology overview will be provided for each topic along with the results achieved to date, future development plans, and the mission architectures that these technologies support.
Resources for comparing the speed and performance of medical autocoders.
Berman, Jules J
2004-06-15
Concept indexing is a popular method for characterizing medical text, and is one of the most important early steps in many data mining efforts. Concept indexing differs from simple word or phrase indexing because concepts are typically represented by a nomenclature code that binds a medical concept to all equivalent representations. A concept search on the term 'renal cell carcinoma' would be expected to find occurrences of hypernephroma and renal carcinoma (concept equivalents). The purpose of this study is to provide freely available resources to compare speed and performance among different autocoders. These tools consist of: 1) a public domain autocoder written in Perl (a free and open source programming language that installs on any operating system); 2) a nomenclature database derived from the unencumbered subset of the publicly available Unified Medical Language System; 3) a large corpus of autocoded output derived from a publicly available medical text. A simple lexical autocoder was written that parses plain text into a listing of all 1-, 2-, 3-, and 4-word strings contained in the text, assigning a nomenclature code to text strings that match terms in the nomenclature. The nomenclature used is the unencumbered subset of the 2003 Unified Medical Language System (UMLS). The unencumbered subset of UMLS was reduced to exclude homonymous one-word terms and proper names, resulting in a term/code data dictionary containing about half a million medical terms. The Online Mendelian Inheritance in Man (OMIM), a 92+ Megabyte publicly available medical opus, was used as sample medical text for the autocoder. The autocoding Perl script is remarkably short, consisting of just 38 command lines. The 92+ Megabyte OMIM file was completely autocoded in 869 seconds on a 2.4 GHz processor (less than 10 seconds per Megabyte of text). The autocoded output file (9,540,442 bytes) contains 367,963 coded terms from OMIM and is distributed with this manuscript. A public domain Perl script is provided that can parse through plain-text files of any length, matching concepts against an external nomenclature. The script and associated files can be used freely to compare the speed and performance of autocoding software.
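The matching strategy is simple enough to sketch directly. Below is a Python rendering of the described n-gram lookup (the published autocoder is a Perl script; this port and its three-entry nomenclature are illustrative only):

import re

# Hypothetical term/code dictionary standing in for the UMLS-derived subset.
NOMENCLATURE = {
    "renal cell carcinoma": "C0007134",
    "renal carcinoma": "C0007134",
    "hypernephroma": "C0007134",
}

def autocode(text, max_words=4):
    # Return (phrase, code) pairs for every 1- to max_words-word string
    # in the text that matches a term in the nomenclature.
    words = re.findall(r"[a-z]+", text.lower())
    hits = []
    for i in range(len(words)):
        for n in range(1, max_words + 1):
            phrase = " ".join(words[i:i + n])
            if phrase in NOMENCLATURE:
                hits.append((phrase, NOMENCLATURE[phrase]))
    return hits

print(autocode("Biopsy confirmed hypernephroma (renal cell carcinoma)."))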
NASA Astrophysics Data System (ADS)
Leclaire, N.; Cochet, B.; Le Dauphin, F. X.; Haeck, W.; Jacquet, O.
2014-06-01
The present paper aims to provide experimental validation for the use of the MORET 5 code for advanced reactor concepts involving thorium and heavy water. It therefore constitutes an opportunity to test and improve the thermal-scattering data of heavy water and also to test the recent implementation of probability tables in the MORET 5 code.
Ascent Aerodynamic Pressure Distributions on WB001
NASA Technical Reports Server (NTRS)
Vu, B.; Ruf, J.; Canabal, F.; Brunty, J.
1996-01-01
To support the reusable launch vehicle concept study, the aerodynamic data and surface pressures for WB001 were predicted using three computational fluid dynamics (CFD) codes at several flow conditions; the results were compared code to code, against the aerodynamic database, and against the available experimental data. A set of particular solutions has been selected and recommended for use in preliminary conceptual designs. These CFD results have also been provided to the structure group for wing loading analysis.
Concept Inventory Development Reveals Common Student Misconceptions about Microbiology
Briggs, Amy G.; Hughes, Lee E.; Brennan, Robert E.; Buchner, John; Horak, Rachel E. A.; Amburn, D. Sue Katz; McDonald, Ann H.; Primm, Todd P.; Smith, Ann C.; Stevens, Ann M.; Yung, Sunny B.; Paustian, Timothy D.
2017-01-01
Misconceptions, or alternative conceptions, are incorrect understandings that students have incorporated into their prior knowledge. The goal of this study was the identification of misconceptions in microbiology held by undergraduate students upon entry into an introductory, general microbiology course. This work was the first step in developing a microbiology concept inventory based on the American Society for Microbiology’s Recommended Curriculum Guidelines for Undergraduate Microbiology. Responses to true/false (T/F) questions accompanied by written explanations by undergraduate students at a diverse set of institutions were used to reveal misconceptions for fundamental microbiology concepts. These data were analyzed to identify the most difficult core concepts, misalignment between explanations and answer choices, and the most common misconceptions for each core concept. From across the core concepts, nineteen misconception themes found in at least 5% of the coded answers for a given question were identified. The top five misconceptions, with coded responses ranging from 19% to 43% of the explanations, are described, along with suggested classroom interventions. Identification of student misconceptions in microbiology provides a foundation upon which to understand students’ prior knowledge and to design appropriate tools for improving instruction in microbiology. PMID:29854046
NASA Technical Reports Server (NTRS)
Clement, J. D.; Kirby, K. D.
1973-01-01
Exploratory calculations were performed for several gas core breeder reactor configurations. The computational method involved the use of the MACH-1 one-dimensional diffusion theory code and the THERMOS integral transport theory code for thermal cross sections. Computations were performed to analyze thermal breeder concepts and nonbreeder concepts. Analysis of breeders was restricted to the (U-233)-Th breeding cycle, and computations were performed to examine a range of parameters. These parameters include the U-233 to hydrogen atom ratio in the gaseous cavity, the carbon to thorium atom ratio in the breeding blanket, cavity size, and blanket size.
Phase II Evaluation of Clinical Coding Schemes
Campbell, James R.; Carpenter, Paul; Sneiderman, Charles; Cohn, Simon; Chute, Christopher G.; Warren, Judith
1997-01-01
Objective: To compare three potential sources of controlled clinical terminology (READ codes version 3.1, SNOMED International, and Unified Medical Language System (UMLS) version 1.6) relative to attributes of completeness, clinical taxonomy, administrative mapping, term definitions and clarity (duplicate coding rate). Methods: The authors assembled 1929 source concept records from a variety of clinical information taken from four medical centers across the United States. The source data included medical as well as ample nursing terminology. The source records were coded in each scheme by an investigator and checked by the coding scheme owner. The codings were then scored by an independent panel of clinicians for acceptability. Codes were checked for definitions provided with the scheme. Codes for a random sample of source records were analyzed by an investigator for “parent” and “child” codes within the scheme. Parent and child pairs were scored by an independent panel of medical informatics specialists for clinical acceptability. Administrative and billing code mappings from the published schemes were reviewed for all coded records and analyzed by independent reviewers for accuracy. The investigator for each scheme exhaustively searched a sample of coded records for duplications. Results: SNOMED was judged to be significantly more complete in coding the source material than the other schemes (SNOMED* 70%; READ 57%; UMLS 50%; *p < .00001). SNOMED also had a richer clinical taxonomy as judged by the number of acceptable first-degree relatives per coded concept (SNOMED* 4.56; UMLS 3.17; READ 2.14; *p < .005). Only the UMLS provided any definitions; these were found for 49% of records that had a coding assignment. READ and UMLS had better administrative mappings (composite score: READ* 40.6%; UMLS* 36.1%; SNOMED 20.7%; *p < .00001), and SNOMED had substantially more duplications of coding assignments (duplication rate: READ 0%; UMLS 4.2%; SNOMED* 13.9%; *p < .004), associated with a loss of clarity. Conclusion: No major terminology source can lay claim to being the ideal resource for a computer-based patient record. However, based upon this analysis of the April 1995 releases, SNOMED International is considerably more complete, has a compositional nature, and has a richer taxonomy. It suffers from reduced clarity, resulting from a lack of syntax and evolutionary changes in its coding scheme. READ has greater clarity and better mapping to administrative schemes (ICD-10 and OPCS-4), is rapidly changing, and is less complete. UMLS is a rich lexical resource, with mappings to many source vocabularies. It provides definitions for many of its terms. However, due to the varying granularities and purposes of its source schemes, it has limitations for representation of clinical concepts within a computer-based patient record. PMID:9147343
NASA Astrophysics Data System (ADS)
Amsden, Jason J.; Herr, Philip J.; Landry, David M. W.; Kim, William; Vyas, Raul; Parker, Charles B.; Kirley, Matthew P.; Keil, Adam D.; Gilchrist, Kristin H.; Radauscher, Erich J.; Hall, Stephen D.; Carlson, James B.; Baldasaro, Nicholas; Stokes, David; Di Dona, Shane T.; Russell, Zachary E.; Grego, Sonia; Edwards, Steven J.; Sperline, Roger P.; Denton, M. Bonner; Stoner, Brian R.; Gehm, Michael E.; Glass, Jeffrey T.
2018-02-01
Despite many potential applications, miniature mass spectrometers have had limited adoption in the field due to the tradeoff between throughput and resolution that limits their performance relative to laboratory instruments. Recently, a solution to this tradeoff has been demonstrated by using spatially coded apertures in magnetic sector mass spectrometers, enabling throughput and signal-to-background improvements of greater than an order of magnitude with no loss of resolution. This paper describes a proof of concept demonstration of a cycloidal coded aperture miniature mass spectrometer (C-CAMMS) demonstrating use of spatially coded apertures in a cycloidal sector mass analyzer for the first time. C-CAMMS also incorporates a miniature carbon nanotube (CNT) field emission electron ionization source and a capacitive transimpedance amplifier (CTIA) ion array detector. Results confirm the cycloidal mass analyzer's compatibility with aperture coding. A >10× increase in throughput was achieved without loss of resolution compared with a single slit instrument. Several areas where additional improvement can be realized are identified.
Motion-adaptive model-assisted compatible coding with spatiotemporal scalability
NASA Astrophysics Data System (ADS)
Lee, JaeBeom; Eleftheriadis, Alexandros
1997-01-01
We introduce the concept of motion-adaptive spatio-temporal model-assisted compatible (MA-STMAC) coding, a technique to selectively encode areas of different importance to the human eye in terms of space and time in moving images, taking the motion of objects into consideration. Previous STMAC was proposed based on the fact that human 'eye contact' and 'lip synchronization' are very important in person-to-person communication. Several areas, including the eyes and lips, need different types of quality, since different areas have different perceptual significance to human observers. The approach provides a better rate-distortion tradeoff than conventional image coding techniques based on MPEG-1, MPEG-2, H.261, as well as H.263. STMAC coding is applied on top of an encoder, taking full advantage of its core design. Model motion tracking in our previous STMAC approach was not automatic. The proposed MA-STMAC coding considers the motion of the human face within the STMAC concept using automatic area detection. Experimental results are given using ITU-T H.263, addressing very low bit-rate compression.
Wickert, Natasha M; Wong Riff, Karen W Y; Mansour, Mark; Forrest, Christopher R; Goodacre, Timothy E E; Pusic, Andrea L; Klassen, Anne F
2018-01-01
Objective: The aim of this systematic review was to identify patient-reported outcome (PRO) instruments used in research with children/youth with conditions associated with facial differences to identify the health concepts measured. Design: MEDLINE, EMBASE, CINAHL, and PsycINFO were searched from 2004 to 2016 to identify PRO instruments used in acne vulgaris, birthmarks, burns, ear anomalies, facial asymmetries, and facial paralysis patients. We performed a content analysis whereby the items were coded to identify concepts and categorized as positive or negative content or phrasing. Results: A total of 7,835 articles were screened; 6 generic and 11 condition-specific PRO instruments were used in 96 publications. Condition-specific instruments were for acne (four), oral health (two), dermatology (one), facial asymmetries (two), microtia (one), and burns (one). The PRO instruments provided 554 items (295 generic; 259 condition specific) that were sorted into 4 domains, 11 subdomains, and 91 health concepts. The most common domain was psychological (n = 224 items). Of the identified items, 76% had negative content or phrasing (e.g., "Because of the way my face looks I wish I had never been born"). Given the small number of items measuring facial appearance (n = 19) and function (n = 22), the PRO instruments reviewed lacked content validity for patients whose condition impacted facial function and/or appearance. Conclusions: Treatments can change facial appearance and function. This review draws attention to a problem with content validity in existing PRO instruments. Our team is now developing a new PRO instrument called FACE-Q Kids to address this problem.
Quality Scalability Aware Watermarking for Visual Content.
Bhowmik, Deepayan; Abhayaratne, Charith
2016-11-01
Scalable coding-based content adaptation poses serious challenges to traditional watermarking algorithms, which do not consider the scalable coding structure and hence cannot guarantee correct watermark extraction in the media consumption chain. In this paper, we propose a novel concept of scalable blind watermarking that ensures more robust watermark extraction at various compression ratios without affecting the visual quality of the host media. The proposed algorithm generates a scalable and robust watermarked image code-stream that allows the user to constrain embedding distortion for target content adaptations. The watermarked image code-stream consists of hierarchically nested joint distortion-robustness coding atoms. The code-stream is generated by a new wavelet domain blind watermarking algorithm guided by a quantization-based binary tree. The code-stream can be truncated at any distortion-robustness atom to generate the watermarked image with the desired distortion-robustness requirements. A blind extractor is capable of extracting the watermark data from the watermarked images. The algorithm is further extended to incorporate a bit-plane discarding-based quantization model used in scalable coding-based content adaptation, e.g., JPEG2000. This improves the robustness against quality scalability of JPEG2000 compression. The simulation results verify the feasibility of the proposed concept, its applications, and its improved robustness against quality scalable content adaptation. Our proposed algorithm also outperforms existing methods, showing a 35% improvement. In terms of robustness to quality scalable video content adaptation using Motion JPEG2000 and wavelet-based scalable video coding, the proposed method shows major improvement for video watermarking.
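The quantization-based binary tree is specific to the paper, but the blind-embedding primitive underneath this family of methods can be sketched. The following Python example shows plain quantization index modulation (QIM), a generic stand-in for, not a reproduction of, the proposed algorithm; STEP and the coefficient values are arbitrary:

import numpy as np

STEP = 8.0  # quantization step: larger means more robustness, more distortion

def embed(coeffs, bits):
    # Quantize each coefficient to an even (bit 0) or odd (bit 1) multiple of
    # STEP/2; extraction needs no original image, i.e., the scheme is blind.
    out = coeffs.copy()
    for i, b in enumerate(bits):
        q = np.round(coeffs[i] / STEP) * STEP
        out[i] = q + (STEP / 2.0 if b else 0.0)
    return out

def extract(coeffs):
    # Recover each bit by checking which lattice the coefficient lies nearer.
    rel = np.mod(coeffs, STEP)
    return [1 if abs(r - STEP / 2.0) < min(r, STEP - r) else 0 for r in rel]

host = np.array([13.2, -7.9, 40.1, 3.3])          # e.g., wavelet coefficients
marked = embed(host, [1, 0, 1, 1])
noisy = marked + np.random.uniform(-1.5, 1.5, 4)  # mild requantization noise
print(extract(noisy))                    # [1, 0, 1, 1] while noise < STEP/4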
Evaluation of Aeroelastically Tailored Small Wind Turbine Blades Final Project Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Griffin, Dayton A.
2005-09-29
Evaluation of Aeroelastically Tailored Small Wind Turbine Blades Final Report. Global Energy Concepts, LLC (GEC) has performed a conceptual design study concerning aeroelastic tailoring of small wind turbine blades. The primary objectives were to evaluate ways that blade/rotor geometry could be used to enable cost-of-energy reductions by enhancing energy capture while constraining or mitigating blade costs, system loads, and related component costs. This work builds on insights developed in ongoing adaptive-blade programs but with a focus on application to small turbine systems with isotropic blade material properties and with combined blade sweep and pre-bending/pre-curving to achieve the desired twist coupling. Specific goals of this project are to: (A) Evaluate and quantify the extent to which rotor geometry can be used to realize load-mitigating small wind turbine rotors. Primary aspects of the load mitigation are: (1) Improved overspeed safety, effected by blades twisting toward stall in response to speed increases. (2) Reduced fatigue loading, effected by blades twisting toward feather in response to turbulent gusts. (B) Illustrate trade-offs and design sensitivities for this concept. (C) Provide the technical basis for small wind turbine manufacturers to evaluate this concept and commercialize it if the technology appears favorable. The SolidWorks code was used to rapidly develop solid models of blades with varying shapes and material properties. Finite element analyses (FEA) were performed using the COSMOS code, modeling tip loads and centripetal accelerations. This tool set was used to investigate the potential for aeroelastic tailoring with combined planform sweep and pre-curve. An extensive matrix of design variables was investigated, including aerodynamic design, magnitude and shape of planform sweep, magnitude and shape of blade pre-curve, material stiffness, and rotor diameter. The FEA simulations resulted in substantial insights into the structural response of these blades. The trends were used to identify geometries and rotor configurations that showed the greatest promise for achieving beneficial aeroelastic response. The ADAMS code was used to perform complete aeroelastic simulations of selected rotor configurations; however, the results of these simulations were not satisfactory. This report documents the challenges encountered with the ADAMS simulations and presents recommendations for further development of this concept for aeroelastically tailored small wind turbine blades.
Evaluation of Recent Upgrades to the NESS (Nuclear Engine System Simulation) Code
NASA Technical Reports Server (NTRS)
Fittje, James E.; Schnitzler, Bruce G.
2008-01-01
The Nuclear Thermal Rocket (NTR) concept is being evaluated as a potential propulsion technology for exploratory expeditions to the moon, Mars, and beyond. The need for exceptional propulsion system performance in these missions has been documented in numerous studies, and was the primary focus of a considerable effort undertaken during the Rover/NERVA program from 1955 to 1973. The NASA Glenn Research Center is leveraging this past NTR investment in their vehicle concepts and mission analysis studies with the aid of the Nuclear Engine System Simulation (NESS) code. This paper presents the additional capabilities and upgrades made to this code in order to perform higher fidelity NTR propulsion system analysis and design, and a comparison of its results to the Small Nuclear Rocket Engine (SNRE) design.
Anguera, M Teresa; Portell, Mariona; Chacón-Moscoso, Salvador; Sanduvete-Chaves, Susana
2018-01-01
Indirect observation is a recent concept in systematic observation. It largely involves analyzing textual material generated either indirectly from transcriptions of audio recordings of verbal behavior in natural settings (e.g., conversation, group discussions) or directly from narratives (e.g., letters of complaint, tweets, forum posts). It may also feature seemingly unobtrusive objects that can provide relevant insights into daily routines. All these materials constitute an extremely rich source of information for studying everyday life, and they are continuously growing with the burgeoning of new technologies for data recording, dissemination, and storage. Narratives are an excellent vehicle for studying everyday life, and quantitization is proposed as a means of integrating qualitative and quantitative elements. However, this analysis requires a structured system that enables researchers to analyze varying forms and sources of information objectively. In this paper, we present a methodological framework detailing the steps and decisions required to quantitatively analyze a set of data that was originally qualitative. We provide guidelines on study dimensions, text segmentation criteria, ad hoc observation instruments, data quality controls, and coding and preparation of text for quantitative analysis. The quality control stage is essential to ensure that the code matrices generated from the qualitative data are reliable. We provide examples of how an indirect observation study can produce data for quantitative analysis and also describe the different software tools available for the various stages of the process. The proposed method is framed within a specific mixed methods approach that involves collecting qualitative data and subsequently transforming these into matrices of codes (not frequencies) for quantitative analysis to detect underlying structures and behavioral patterns. The data collection and quality control procedures fully meet the requirement of flexibility and provide new perspectives on data integration in the study of biopsychosocial aspects in everyday contexts.
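As a minimal illustration of the quantitizing step described above, the sketch below (in Python; the dimensions and code labels are invented for the example) turns coded text segments into a matrix of codes, not frequencies, ready for pattern detection:

# Each observed segment gets one code per dimension of the ad hoc instrument.
segments = [
    {"speaker": "A", "tone": "POS", "topic": "WORK"},
    {"speaker": "B", "tone": "NEG", "topic": "FAMILY"},
    {"speaker": "A", "tone": "NEU", "topic": "WORK"},
]

dimensions = ["speaker", "tone", "topic"]
code_matrix = [[seg[d] for d in dimensions] for seg in segments]

for row in code_matrix:
    print(row)   # rows of codes (not frequencies), in observed order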
The R package 'Luminescence': a history of unexpected complexity and concepts to deal with it
NASA Astrophysics Data System (ADS)
Kreutzer, Sebastian; Burow, Christoph; Dietze, Michael; Fuchs, Margret C.; Friedrich, Johannes; Fischer, Manfred; Schmidt, Christoph
2017-04-01
Overcoming limitations in the standard software used so far, developing an efficient, lightweight solution for a very specific task, or creating graphs of high quality: the reasons that may have initially led a scientist to work with R are manifold. And as long as developed solutions, e.g., R scripts, are needed for personal use only, code can remain unstructured and documentation is not compulsory. However, this changes with the first friendly request for help after the code has been reused by others. In contrast to single scripts written without any intention of ever being published, the CRAN policy demands for R packages a more structured and elaborate approach, including a minimum of documentation. Nevertheless, growing projects with thousands of lines of code that need to be maintained can become overwhelming, in particular as researchers are not by definition experts in managing software projects. The R package 'Luminescence' (Kreutzer et al., 2017), a collection of tools dealing with the analysis of luminescence data in a geoscientific, geochronological context, started as one single R script but quickly evolved into a comprehensive solution connected with various other R packages. We present (1) a very brief development history of the package 'Luminescence', before we (2) sketch technical challenges encountered over time and the solutions that have been found to deal with them using various open source tools. Our presentation is intended as a collection of concepts and approaches for setting up R projects in the geosciences. References. Kreutzer, S., Dietze, M., Burow, C., Fuchs, M. C., Schmidt, C., Fischer, M., Friedrich, J., 2017. Luminescence: Comprehensive Luminescence Dating Data Analysis. R package version 0.6.4. https://CRAN.R-project.org/package=Luminescence
Masuda, Naoki
2009-12-01
Selective attention is often accompanied by gamma oscillations in local field potentials and spike-field coherence in brain areas related to visual, motor, and cognitive information processing. Gamma oscillations are thought to play an important role in, for example, visual tasks including object search, shape perception, and speed detection. However, the mechanism by which gamma oscillations enhance cognitive and behavioral performance of attentive subjects is still elusive. Using feedforward fan-in networks composed of spiking neurons, we examine a possible role for gamma oscillations in selective attention and population rate coding of external stimuli. We implement the concept proposed by Fries (2005) that under dynamic stimuli, neural populations effectively communicate with each other only when there is a good phase relationship among the associated gamma oscillations. We show that the downstream neural population selects a specific dynamic stimulus received by an upstream population and represents it by population rate coding. The encoded stimulus is the one for which the gamma rhythm in the corresponding upstream population is resonant with the downstream gamma rhythm. The proposed role for gamma oscillations in stimulus selection is to enable top-down control, a neural version of the time division multiple access used in communication engineering.
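The communication-through-coherence idea admits a one-screen numerical illustration. In the Python toy below (not the paper's spiking-network model; the numbers are arbitrary), an upstream rate modulated at gamma frequency drives a downstream population whose gain is gated at the same frequency, and the mean transfer peaks when the two rhythms are in phase:

import numpy as np

f = 40.0                                   # gamma frequency, Hz
t = np.linspace(0.0, 1.0, 10000)           # one second of time

for lag in [0.0, np.pi / 2, np.pi]:
    upstream_rate = 1.0 + np.cos(2 * np.pi * f * t)          # input rate
    downstream_gain = 1.0 + np.cos(2 * np.pi * f * t - lag)  # gamma gating
    transfer = np.mean(upstream_rate * downstream_gain)      # mean drive
    print(f"phase lag {lag:.2f} rad -> mean transfer {transfer:.2f}")

# Analytically the mean transfer is 1 + cos(lag)/2: 1.5 in phase, 0.5 in
# antiphase, so the downstream population preferentially "hears" the
# upstream source whose gamma rhythm is resonant with its own.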
Decision Making and the IACUC: Part 1—Protocol Information Discussed at Full-Committee Reviews
Silverman, Jerald; Lidz, Charles W; Clayfield, Jonathan C; Murray, Alexandra; Simon, Lorna J; Rondeau, Richard G
2015-01-01
IACUC protocols can be reviewed by either the full committee or designated members. Both review methods use the principles of the 3 Rs (reduce, refine, replace) as the overarching paradigm, with federal regulations and policies providing more detailed guidance. The primary goal of this study was to determine the frequency of topics discussed by IACUC during full-committee reviews and whether the topics included those required for consideration by IACUC (for example, pain and distress, number of animals used, availability of alternatives, skill and experience of researchers). We recorded and transcribed 87 protocol discussions undergoing full-committee review at 10 academic institutions. Each transcript was coded to capture the key concepts of the discussion and analyzed for the frequency of the codes mentioned. Pain and distress was the code mentioned most often, followed by the specific procedures performed, the study design, and the completeness of the protocol form. Infrequently mentioned topics were alternatives to animal use or painful or distressful procedures, the importance of the research, and preliminary data. Not all of the topics required to be considered by the IACUC were openly discussed for all protocols, and many of the discussions were limited in their depth. PMID:26224439
Genetic evidence for conserved non-coding element function across species–the ears have it
Turner, Eric E.; Cox, Timothy C.
2014-01-01
Comparison of genomic sequences from diverse vertebrate species has revealed numerous highly conserved regions that do not appear to encode proteins or functional RNAs. Often these “conserved non-coding elements,” or CNEs, can direct gene expression to specific tissues in transgenic models, demonstrating they have regulatory function. CNEs are frequently found near “developmental” genes, particularly transcription factors, implying that these elements have essential regulatory roles in development. However, actual examples demonstrating CNE regulatory functions across species have been few, and recent loss-of-function studies of several CNEs in mice have shown relatively minor effects. In this Perspectives article, we discuss new findings in “fancy” rats and Highland cattle demonstrating that function of a CNE near the Hmx1 gene is crucial for normal external ear development and when disrupted can mimic loss-of function Hmx1 coding mutations in mice and humans. These findings provide important support for conserved developmental roles of CNEs in divergent species, and reinforce the concept that CNEs should be examined systematically in the ongoing search for genetic causes of human developmental disorders in the era of genome-scale sequencing. PMID:24478720
A new encoding scheme for visible light communications with applications to mobile connections
NASA Astrophysics Data System (ADS)
Benton, David M.; St. John Brittan, Paul
2017-10-01
A novel and unconventional encoding scheme called concurrent coding has recently been demonstrated and shown to offer interesting features and benefits in comparison to conventional techniques, such as robustness against burst errors and improved efficiency of transmitted power. Free space optical communications can suffer particularly from issues of alignment, which require stable, fixed links to be established, and from beam wander, which can interrupt communications. Concurrent coding has the potential to help ease these difficulties and enable mobile, flexible optical communications to be implemented through the use of a source encoding technique. This concept has been applied for the first time to optical communications, where standard light emitting diodes (LEDs) have been used to transmit information encoded with concurrent coding. The technique successfully transmits and decodes data despite unpredictable interruptions to the transmission causing significant drop-outs in the detected signal. The technique also shows how it is possible to send a single block of data in isolation with no pre-synchronisation required between transmitter and receiver, and no specific synchronisation sequence appended to the transmission. Such systems are robust against interference - intentional or otherwise - as well as intermittent beam blockage.
Baneyx, Audrey; Charlet, Jean; Jaulent, Marie-Christine
2007-01-01
Pathologies and acts are classified in thesauri to help physicians code their activity. In practice, the use of thesauri is not sufficient to reduce variability in coding, and thesauri are not suitable for computer processing. We think the automation of the coding task requires a conceptual modeling of medical items: an ontology. Our task is to help lung specialists code acts and diagnoses with software that represents the medical knowledge of the specialty concerned by means of an ontology. The objective of the reported work was to build an ontology of pulmonary diseases dedicated to the coding process. To carry out this objective, we developed a precise methodological process for the knowledge engineer for building various types of medical ontologies. This process is based on the need to express precisely in natural language the meaning of each concept using differential semantics principles. A differential ontology is a hierarchy of concepts and relationships organized according to their similarities and differences. Our main research hypothesis is to apply natural language processing tools to corpora to develop the resources needed to build the ontology. We consider two corpora, one composed of patient discharge summaries and the other being a teaching book. We propose to combine two approaches to enrich the ontology building: (i) a method that consists of building terminological resources through distributional analysis and (ii) a method based on the observation of corpus sequences in order to reveal semantic relationships. Our ontology currently includes 1550 concepts, and the software implementing the coding process is still under development. Results show that the proposed approach is operational and indicate that the combination of these methods and the comparison of the resulting terminological structures give interesting clues to a knowledge engineer for the building of an ontology.
The global public good concept: a means of promoting good veterinary governance.
Eloit, M
2012-08-01
At the outset, the concept of a 'public good' was associated with economic policies. However, it has now evolved not only from a national to a global concept (global public good), but also from a concept applying solely to the production of goods to one encompassing societal issues (education, environment, etc.) and fundamental rights, including the right to health and food. Through their actions, Veterinary Services, as defined by the Terrestrial Animal Health Code (Terrestrial Code) of the World Organisation for Animal Health (OIE), help to improve animal health and reduce production losses. In this way they contribute directly and indirectly to food security and to safeguarding human health and economic resources. The organisation and operating procedures of Veterinary Services are therefore key to the efficient governance required to achieve these objectives. The OIE is a major player in global cooperation and governance in the fields of animal and public health through the implementation of its strategic standardisation mission and other programmes for the benefit of Veterinary Services and OIE Member Countries. Thus, the actions of Veterinary Services and the OIE deserve to be recognised as a global public good, backed by public investment to ensure that all Veterinary Services are in a position to apply the principles of good governance and to comply with the international standards for the quality of Veterinary Services set out in the OIE Terrestrial Code (Section 3 on Quality of Veterinary Services) and Aquatic Animal Health Code (Section 3 on Quality of Aquatic Animal Health Services).
Progress in The Semantic Analysis of Scientific Code
NASA Technical Reports Server (NTRS)
Stewart, Mark
2000-01-01
This paper concerns a procedure that analyzes aspects of the meaning or semantics of scientific and engineering code. This procedure involves taking a user's existing code, adding semantic declarations for some primitive variables, and parsing this annotated code using multiple, independent expert parsers. These semantic parsers encode domain knowledge and recognize formulae in different disciplines including physics, numerical methods, mathematics, and geometry. The parsers will automatically recognize and document some static, semantic concepts and help locate some program semantic errors. These techniques may apply to a wider range of scientific codes. If so, the techniques could reduce the time, risk, and effort required to develop and modify scientific codes.
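The flavor of such semantic declarations can be conveyed with a small example. The sketch below (Python, hypothetical; the paper's expert parsers are far richer) tags primitive variables with physical dimensions and lets arithmetic propagate and check them, flagging one class of semantic error the paper targets:

class Quantity:
    # A value tagged with dimension exponents (length, time, mass).
    def __init__(self, value, dims):
        self.value, self.dims = value, dims

    def __mul__(self, other):
        return Quantity(self.value * other.value,
                        tuple(a + b for a, b in zip(self.dims, other.dims)))

    def __add__(self, other):
        if self.dims != other.dims:          # dimensionally inconsistent
            raise TypeError(f"cannot add {self.dims} to {other.dims}")
        return Quantity(self.value + other.value, self.dims)

velocity = Quantity(3.0, (1, -1, 0))         # declared as m/s
dt = Quantity(2.0, (0, 1, 0))                # declared as s
position = velocity * dt                     # fine: dims (1, 0, 0), metres
try:
    position + velocity                      # flagged, as a checker would
except TypeError as err:
    print("semantic error:", err)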
Clinical physiology grand rounds.
Richards, Jeremy; Schwartzstein, Richard; Irish, Julie; Almeida, Jacqueline; Roberts, David
2013-04-01
Clinical Physiology Grand Rounds (CPGR) is an interactive, case-based conference for medical students designed to: (1) integrate preclinical and clinical learning; (2) promote inductive clinical reasoning; and (3) emphasise students as peer teachers. CPGR specifically encourages mixed learning level student interactions and emphasises the use of concept mapping. We describe the theoretical basis and logistical considerations for an interactive, integrative, mixed-learner environment such as CPGR. In addition, we report qualitative data regarding students' attitudes towards and perceptions of CPGR. Medical students from first to fourth year participate in a monthly, interactive conference. The CPGR was designed to bridge gaps and reinforce linkages between basic science and clinical concepts, and to incorporate interactive vertical integration between preclinical and clinical students. Medical education and content experts use Socratic, interactive teaching methods to develop real-time concept maps to emphasise the presence and importance of linkages across curricula. Student focus groups were held to assess attitudes towards and perceptions of the mixed-learner environment and concept maps in CPGR. Qualitative analyses of focus group transcripts were performed to develop themes and codes describing the students' impressions of CPGR. CPGR is a case-based, interactive conference designed to help students gain an increased appreciation of linkages between basic science and clinical medicine concepts, and an increased awareness of clinical reasoning thought processes. Success is dependent upon explicit attention being given to goals for students' integrated learning.
van Rensburg, Elsie S Janse; Poggenpoel, Marie; Myburgh, Chris
2015-11-25
Student nurses (SNs) experience emotional discomfort during placement in the clinical psychiatric learning environment. This may negatively influence their mental health. Limited support is available to assist both SNs working with persons with intellectual disabilities and nurse educators during clinical accompaniment. This article aims to discuss the generation of a framework to enhance student support. A theory-generative, qualitative, exploratory, descriptive, contextual design was utilised to develop the framework by applying four steps. In step 1, concept analysis identified the central concept through fieldwork. Data were collected from 13 SNs purposively selected from a specific higher educational institution in Gauteng through two focus group interviews, reflective journals, a reflective letter, naïve sketches, drawings and field notes, and analysed with thematic coding. The central concept was identified from the results, supported by a literature review and defined by essential attributes. The central concept was classified through a survey list and demonstrated in a model case. In step 2, the central concepts were placed into relationships with each other. The conceptual framework was described and evaluated in step 3, and guidelines for implementation were described in step 4. The focus of this article is on generating the conceptual framework. The central concept was 'the facilitation of engagement on a deeper emotional level of SNs'. The conceptual framework was described and evaluated. The conceptual framework can enhance the educational practices of nurse educators and SNs' practices of care for persons with intellectual disabilities.
Standardization of Terminology in Laboratory Medicine II
Lee, Kap No; Yoon, Jong-Hyun; Min, Won Ki; Lim, Hwan Sub; Song, Junghan; Chae, Seok Lae; Jang, Seongsoo; Ki, Chang-Seok; Bae, Sook Young; Kim, Jang Su; Kwon, Jung-Ah; Lee, Chang Kyu
2008-01-01
Standardization of medical terminology is essential in data transmission between health care institutes and in maximizing the benefits of information technology. The purpose of this study was to standardize medical terms for laboratory observations. During the second year of the study, a standard database of concept names for laboratory terms was developed that covered those used in tertiary health care institutes and reference laboratories. The laboratory terms in the Logical Observation Identifier Names and Codes (LOINC) database were adopted and matched with the electronic data interchange (EDI) codes in Korea. A public hearing and a workshop for clinical pathologists were held to collect the opinions of experts. The Korean standard laboratory terminology database, containing six axial concept names (component, property, time aspect, system (specimen), scale type, and method type), was established for 29,340 test observations. Short names and mapping tables for EDI codes and UMLS were added. Synonym tables were prepared to help match concept names to common terms used in the field. We herein describe the Korean standard laboratory terminology database for test names, result description terms, and result units encompassing most of the laboratory tests in Korea. PMID:18756062
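The six-axis structure maps naturally onto a small record type. The Python sketch below follows the LOINC axis layout named above; the example values and synonym entries are hypothetical:

from dataclasses import dataclass

@dataclass(frozen=True)
class LabConcept:
    component: str    # what is measured
    prop: str         # kind of property, e.g., mass concentration
    time_aspect: str  # point in time vs. interval
    system: str       # specimen
    scale: str        # quantitative, ordinal, nominal, ...
    method: str       # measurement method, if specified

glucose = LabConcept(component="Glucose", prop="MCnc", time_aspect="Pt",
                     system="Ser/Plas", scale="Qn", method="")

# A synonym table then maps local test names onto the standard concept:
synonyms = {"blood sugar": glucose, "serum glucose": glucose}
print(synonyms["blood sugar"])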
A multimission three-axis stabilized spacecraft flight dynamics ground support system
NASA Technical Reports Server (NTRS)
Langston, J.; Krack, K.; Reupke, W.
1993-01-01
The Multimission Three-Axis Stabilized Spacecraft (MTASS) Flight Dynamics Support System (FDSS) has been developed in an effort to minimize the costs of ground support systems. Unlike single-purpose ground support systems, which attempt to reduce costs by reusing software specifically developed for previous missions, the multimission support system is an intermediate step in the progression to a fully generalized mission support system in which numerous missions may be served by one general system. The benefits of multimission attitude ground support systems extend not only to the software design and coding process, but to the entire system environment, from specification through testing, simulation, operations, and maintenance. This paper reports the application of an MTASS FDSS to multiple scientific satellite missions. The satellites are the Upper Atmosphere Research Satellite (UARS), the Extreme Ultraviolet Explorer (EUVE), and the Solar Anomalous Magnetospheric Particle Explorer (SAMPEX). Both UARS and EUVE use the multimission modular spacecraft (MMS) concept. SAMPEX is part of the Small Explorer (SMEX) series and uses a much simpler set of attitude sensors. This paper centers on algorithm and design concepts for a multimission system and discusses flight experience from UARS.
Convergence acceleration of the Proteus computer code with multigrid methods
NASA Technical Reports Server (NTRS)
Demuren, A. O.; Ibraheem, S. O.
1992-01-01
Presented here is the first part of a study to implement convergence acceleration techniques based on the multigrid concept in the Proteus computer code. A review is given of previous studies on the implementation of multigrid methods in computer codes for compressible flow analysis. Also presented is a detailed stability analysis of upwind and central-difference based numerical schemes for solving the Euler and Navier-Stokes equations. Results are given of a convergence study of the Proteus code on computational grids of different sizes. The results presented here form the foundation for the implementation of multigrid methods in the Proteus code.
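For readers unfamiliar with the multigrid concept, a toy two-grid V-cycle makes the mechanism concrete. The Python sketch below solves the 1-D Poisson problem -u'' = f with damped Jacobi smoothing and a direct coarse-grid correction; it illustrates only the general idea, not the Proteus implementation for compressible flow:

import numpy as np

def jacobi(u, f, h, sweeps, w=2/3):
    for _ in range(sweeps):                  # damped Jacobi smoother
        u[1:-1] = (1-w)*u[1:-1] + w*0.5*(u[:-2] + u[2:] + h*h*f[1:-1])
    return u

def residual(u, f, h):
    r = np.zeros_like(u)
    r[1:-1] = f[1:-1] - (2*u[1:-1] - u[:-2] - u[2:])/h**2
    return r

def two_grid(u, f, h):
    u = jacobi(u, f, h, 3)                   # pre-smooth
    rc = residual(u, f, h)[::2].copy()       # restrict residual to coarse grid
    m = rc.size - 2                          # interior coarse points
    A = (np.diag(2.0*np.ones(m)) - np.diag(np.ones(m-1), 1)
         - np.diag(np.ones(m-1), -1)) / (2*h)**2
    ec = np.zeros(rc.size)
    ec[1:-1] = np.linalg.solve(A, rc[1:-1])  # exact coarse-grid correction
    u += np.interp(np.arange(u.size), np.arange(0, u.size, 2), ec)  # prolong
    return jacobi(u, f, h, 3)                # post-smooth

n = 65
h = 1.0/(n - 1)
x = np.linspace(0.0, 1.0, n)
f = np.pi**2 * np.sin(np.pi*x)               # exact solution: sin(pi x)
u = np.zeros(n)
for cycle in range(5):
    u = two_grid(u, f, h)
    print(cycle, np.max(np.abs(residual(u, f, h))))  # residual drops per cycle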
Multiple component codes based generalized LDPC codes for high-speed optical transport.
Djordjevic, Ivan B; Wang, Ting
2014-07-14
A class of generalized low-density parity-check (GLDPC) codes suitable for optical communications is proposed, which consists of multiple local codes. It is shown that Hamming, BCH, and Reed-Muller codes can be used as local codes, and that the maximum a posteriori probability (MAP) decoding of these local codes by the Ashikhmin-Lytsin algorithm is feasible in terms of complexity and performance. We demonstrate that record coding gains can be obtained from properly designed GLDPC codes derived from multiple component codes. We then show that several recently proposed classes of LDPC codes, such as convolutional and spatially-coupled codes, can be described using the concept of GLDPC coding, which indicates that GLDPC coding can be used as a unified platform for advanced FEC enabling ultra-high-speed optical transport. The proposed class of GLDPC codes is also suitable for code-rate adaptation, to adjust the error correction strength depending on the optical channel conditions.
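The defining idea, that each generalized check node constrains its attached bits to form a codeword of a local code rather than satisfying a single parity bit, fits in a few lines. In the Python sketch below the local code is Hamming(7,4) and the two-node connection pattern is a toy chosen for brevity, not a designed GLDPC code:

import numpy as np

H_LOCAL = np.array([[1, 0, 1, 0, 1, 0, 1],   # Hamming(7,4) parity-check rows
                    [0, 1, 1, 0, 0, 1, 1],
                    [0, 0, 0, 1, 1, 1, 1]])

# Each generalized check node lists the global bit positions it constrains.
CHECK_NODES = [[0, 1, 2, 3, 4, 5, 6],
               [7, 8, 2, 9, 10, 5, 11]]      # the nodes share bits 2 and 5

def satisfies(word):
    # True iff every local sub-word is a Hamming(7,4) codeword.
    w = np.asarray(word)
    return all(not ((H_LOCAL @ w[idx]) % 2).any() for idx in CHECK_NODES)

word = np.zeros(12, dtype=int)
word[[0, 1, 2]] = 1     # node 0 sees 1110000, a valid Hamming codeword
word[[7, 8]] = 1        # node 1 then also sees 1110000 (bit 2 is shared)
print(satisfies(word))  # True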
NASA Technical Reports Server (NTRS)
Stricker, L. T.
1975-01-01
The LOVES computer program was employed to analyze the geosynchronous portion of NASA's 1973 automated satellite mission model from 1980 to 1990. The objectives of the analyses were: (1) to demonstrate the capability of the LOVES code to provide the depth and accuracy of data required to support the analyses; and (2) to trade off the concept of space servicing automated satellites composed of replaceable modules against the concept of replacing expendable satellites upon failure. The computer code proved to be an invaluable tool in analyzing the logistic requirements of the various test cases required in the tradeoff. The results indicate that the concept of space servicing offers the potential for substantial savings in the cost of operating automated satellite systems.
Parametric Studies of the Ejector Process within a Turbine-Based Combined-Cycle Propulsion System
NASA Technical Reports Server (NTRS)
Georgiadis, Nicholas J.; Walker, James F.; Trefny, Charles J.
1999-01-01
Performance characteristics of the ejector process within a turbine-based combined-cycle (TBCC) propulsion system are investigated using the NPARC Navier-Stokes code. The TBCC concept integrates a turbine engine with a ramjet into a single propulsion system that may efficiently operate from takeoff to high Mach number cruise. At the operating point considered, corresponding to a flight Mach number of 2.0, an ejector serves to mix flow from the ramjet duct with flow from the turbine engine. The combined flow then passes through a diffuser where it is mixed with hydrogen fuel and burned. Three sets of fully turbulent Navier-Stokes calculations are compared with predictions from a cycle code developed specifically for the TBCC propulsion system. A baseline ejector system is investigated first. The Navier-Stokes calculations indicate that the flow leaving the ejector is not completely mixed, which may adversely affect the overall system performance. Two additional sets of calculations are presented; one set that investigated a longer ejector region (to enhance mixing) and a second set which also utilized the longer ejector but replaced the no-slip surfaces of the ejector with slip (inviscid) walls in order to resolve discrepancies with the cycle code. The three sets of Navier-Stokes calculations and the TBCC cycle code predictions are compared to determine the validity of each of the modeling approaches.
Minozzi, Silvia; Armaroli, Paola; Espina, Carolina; Villain, Patricia; Wiseman, Martin; Schüz, Joachim; Segnan, Nereo
2015-12-01
The European Code Against Cancer is a set of recommendations to give advice on cancer prevention. Its 4th edition is an update of the 3rd edition, from 2003. Working Groups of independent experts from different fields of cancer prevention were appointed to review the recommendations, supported by a Literature Group to provide scientific and technical support in the assessment of the scientific evidence, through systematic reviews of the literature. Common procedures were developed to guide the experts in identifying, retrieving, assessing, interpreting and summarizing the scientific evidence in order to revise the recommendations. The Code strictly followed the concept of providing advice to European Union citizens based on the current best available science. The advice, if followed, would be expected to reduce cancer risk, referring both to avoiding or reducing exposure to carcinogenic agents or changing behaviour related to cancer risk and to participating in medical interventions able to avert specific cancers or their consequences. The information sources and procedures for the review of the scientific evidence are described here in detail. The 12 recommendations of the 4th edition of the European Code Against Cancer were ultimately approved by a Scientific Committee of leading European cancer and public health experts.
Kaltner, H; Gabius, H-J
2012-04-01
Lectin histochemistry has revealed cell-type-selective glycosylation. It is under dynamic and spatially controlled regulation. Since their chemical properties allow carbohydrates to reach unsurpassed structural diversity in oligomers, they are ideal for high-density information coding. Consequently, the concept of the sugar code assigns a functional dimension to the glycans of cellular glycoconjugates. Indeed, multifarious cell processes depend on specific recognition of glycans by their receptors (lectins), which translate the sugar-encoded information into effects. Duplication of ancestral genes and the following divergence of sequences account for the evolutionary dynamics in lectin families. Differences in gene number can appear even among closely related species. The adhesion/growth-regulatory galectins are selected as an instructive example to trace the phylogenetic diversification in several animals, most of them popular models in developmental and tumor biology. Chicken galectins are identified as a set of low complexity, and are thus singled out for further detailed analysis. The various operative means for establishing protein diversity among the chicken galectins are delineated, and individual characteristics in expression profiles discerned. Applying this galectin-fingerprinting approach in histopathology has potential for refining differential diagnosis and for obtaining prognostic assessments. In vitro work with tumor cells reveals a strategically orchestrated co-regulation of galectin expression with the presentation of cognate glycans. This coordination epitomizes the far-reaching physiological significance of sugar coding.
Investigation of Moving Belt Radiator Technology Issues
NASA Technical Reports Server (NTRS)
Teagan, W. Peter; Aguilar, Jerry L.
1994-01-01
The development of an advanced spacecraft radiator technology is reported. The moving belt radiator is a thermal radiator concept with the promise of lower specific mass (per kW rejected) than that afforded by existing technologies. The results of a parametric study to estimate radiator mass for future space power systems is presented. It is shown that this technology can be scaled up to 200 MW for higher rejection temperatures. Several aspects of the design concept are discussed, including the dynamics of a large rotating belt in microgravity. The results of a computer code developed to model the belt dynamics are presented. A series of one-g experiments to investigate the dynamics of small belts is described. A comprehensive test program to investigate belt dynamics in microgravity aboard the NASA KC-135 aircraft is discussed. It was found that the desired circular shape can readily be achieved in microgravity. It is also shown that a rotating belt is stable when subjected to simulated attitude control maneuvers. Heat exchanger design is also investigated. Several sealing concepts were examined experimentally, and are discussed. Overall heat transfer coefficients to the rotating belt are presented. Material properties for various belt materials, including screen meshes, are also presented. The results presented in this report indicate that the moving belt radiator concept is technically feasible.
GOATS 2008 Autonomous, Adaptive Multistatic Acoustic Sensing
2008-09-30
To develop net-centric, autonomous underwater vehicle sensing concepts for littoral MCM and ASW, exploiting collaborative and environmentally … of autonomous underwater vehicle networks as platforms for new sonar concepts exploring the full 3-D acoustic environment of shallow water (SW) …
Nodes and Codes: The Reality of Cyber Warfare
2012-05-17
Nodes and Codes explores the reality of cyber warfare through the story of Stuxnet, a string of weaponized code that reached through a domain...nodes. Stuxnet served as a proof-of-concept for cyber weapons and provided a comparative laboratory to study the reality of cyber warfare from the...military powers most often associated with advanced, offensive cyber attack capabilities. The reality of cyber warfare holds significant operational
Concept and performance study of turbocharged solid propellant ramjet
NASA Astrophysics Data System (ADS)
Li, Jiang; Liu, Kai; Liu, Yang; Liu, Shichang
2018-06-01
This study proposes a turbocharged solid propellant ramjet (TSPR) propulsion system that integrates a turbocharged system consisting of a solid propellant (SP) air turbo rocket (ATR) and the fuel-rich gas generator of a solid propellant ramjet (SPR). First, a suitable propellant scheme was determined for the TSPR. A solid hydrocarbon propellant is used to generate gas for driving the turbine, and a boron-based fuel-rich propellant is used to provide fuel-rich gas to the afterburner. An appropriate TSPR structure was also determined. The TSPR's thermodynamic cycle was analysed to prove its theoretical feasibility. The results showed that the TSPR's specific cycle power was larger than those of the SP-ATR and SPR, and its thermal efficiency was slightly less than that of the SP-ATR. Overall, the TSPR showed optimal performance over a wide flight envelope. The specific impulses and specific thrusts of the TSPR, SP-ATR, and SPR in the flight envelope were calculated and compared. The TSPR's flight envelope roughly overlapped that of the SP-ATR, its specific impulse was larger than that of the SP-ATR, and its specific thrust was larger than those of the SP-ATR and SPR. To improve the TSPR's off-design performance, we propose a control plan for off-design operation in which both the turbocharger corrected speed and the combustor excess gas coefficient are kept constant. An off-design performance model was established by analysing the TSPR working process. By calculating the performance of the off-design TSPR under different control plans, we conclude that a TSPR with a constant corrected speed has a wider flight envelope, higher thrust, and higher specific impulse than a TSPR with a constant physical speed. The results of this study can provide a reference for further studies on TSPRs.
Zydziak, Nicolas; Konrad, Waldemar; Feist, Florian; Afonin, Sergii; Weidner, Steffen; Barner-Kowollik, Christopher
2016-01-01
Designing artificial macromolecules with absolute sequence order represents a considerable challenge. Here we report an advanced light-induced avenue to monodisperse sequence-defined functional linear macromolecules up to decamers via a unique photochemical approach. The versatility of the synthetic strategy—combining sequential and modular concepts—enables the synthesis of perfect macromolecules varying in chemical constitution and topology. Specific functions are placed at arbitrary positions along the chain via the successive addition of monomer units and blocks, leading to a library of functional homopolymers, alternating copolymers and block copolymers. The in-depth characterization of each sequence-defined chain confirms the precision nature of the macromolecules. Decoding of the functional information contained in the molecular structure is achieved via tandem mass spectrometry without recourse to their synthetic history, showing that the sequence information can be read. We submit that the presented photochemical strategy is a viable and advanced concept for coding individual monomer units along a macromolecular chain. PMID:27901024
Habs, H
1981-01-01
After the International Committee for Bacteriological Nomenclature was renamed the International Committee on Systematic Bacteriology in 1970, the latter also had to reflect upon the objects of taxonomy. An approach thereto is recognizable in the 1975 revision of the International Code of Nomenclature of Bacteria. Consideration is being given to whether a classification of bacteria does justice to the laws of homogeneity, specification and continuity as laid down by Kant in his transcendental dialectic. Most important of all are the definition and determination of the taxon species. In terms of content the latter is not possible from the biological point of view, but it is applicable to its range when the regulations of the code are applied. Within the priorities of taxa, the species adopts a preferential position because conceptions of applied bacteriology are contained therein. The variety of infra-subspecific subdivisions is taken into consideration; as far as the formae speciales are concerned, the same considerations apply as were made with regard to species.
NASA Technical Reports Server (NTRS)
Hall, Edward J.; Delaney, Robert A.; Bettner, James L.
1991-01-01
The primary objective was the development of a time dependent 3-D Euler/Navier-Stokes aerodynamic analysis to predict unsteady compressible transonic flows about ducted and unducted propfan propulsion systems at angle of attack. The resulting computer codes are referred to as Advanced Ducted Propfan Analysis Codes (ADPAC). A computer program user's manual is presented for the ADPAC. Aerodynamic calculations were based on a four stage Runge-Kutta time marching finite volume solution technique with added numerical dissipation. A time accurate implicit residual smoothing operator was used for unsteady flow predictions. For unducted propfans, a single H-type grid was used to discretize each blade passage of the complete propeller. For ducted propfans, a coupled system of five grid blocks utilizing an embedded C grid about the cowl leading edge was used to discretize each blade passage. Grid systems were generated by a combined algebraic/elliptic algorithm developed specifically for ducted propfans. Numerical calculations were compared with experimental data for both ducted and unducted flows.
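The four-stage Runge-Kutta time march at the heart of such solvers has a compact generic form, sketched below on a toy scalar residual (a Jameson-style scheme with standard stage coefficients; the actual ADPAC residual includes flux balances and added numerical dissipation, which are abstracted into R here).

    import numpy as np

    # Minimal sketch of a four-stage Runge-Kutta time march for du/dt = R(u).
    ALPHAS = (0.25, 1.0 / 3.0, 0.5, 1.0)  # standard four-stage coefficients

    def rk4_stage_march(u, residual, dt, steps):
        for _ in range(steps):
            u0 = u.copy()
            for a in ALPHAS:
                u = u0 + a * dt * residual(u)  # each stage restarts from u0
        return u

    # Toy residual: linear decay (stands in for the discretized flux balance).
    u = rk4_stage_march(np.ones(8), lambda v: -v, dt=0.1, steps=100)
    print(u)  # decays towards 0, illustrating the time march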
Yick, Alice G; Oomen-Early, Jody
2008-08-01
Until recently, research studies have implied that domestic violence does not affect Asian American and immigrant communities, or even Asians abroad, because ethnicity or culture has not been addressed. In this content analysis, the authors examined trends in publications in leading scholarly journals on violence relating to Asian women and domestic violence. A coding schema was developed, with two raters coding the data with high interrater reliability. Sixty articles were published over the 16 years studied, most atheoretical and focusing on individual levels of analysis. The terms used in discussing domestic violence reflected a feminist perspective. Three quarters of the studies were empirical, with most guided by logical positivism using quantitative designs. Most targeted specific Asian subgroups (almost a third focused on Asian Indians) rather than categorizing Asians as a general ethnic category. The concept of "Asian culture" was most often assessed by discussing Asian family structure. Future research is discussed in light of the findings.
Mise en Scene: Conversion of Scenarios to CSP Traces for the Requirements-to-Design-to-Code Project
NASA Technical Reports Server (NTRS)
Carter, John D.; Gardner, William B.; Rash, James L.; Hinchey, Michael G.
2007-01-01
The "Requirements-to-Design-to-Code" (R2D2C) project at NASA's Goddard Space Flight Center is based on deriving a formal specification expressed in Communicating Sequential Processes (CSP) notation from system requirements supplied in the form of CSP traces. The traces, in turn, are to be extracted from scenarios, a user-friendly medium often used to describe the required behavior of computer systems under development. This work, called Mise en Scene, defines a new scenario medium (Scenario Notation Language, SNL) suitable for control-dominated systems, coupled with a two-stage process for automatic translation of scenarios to a new trace medium (Trace Notation Language, TNL) that encompasses CSP traces. Mise en Scene is offered as an initial solution to the problem of the scenarios-to-traces "D2" phase of R2D2C. A survey of the "scenario" concept and some case studies are also provided.
Simulator platform for fast reactor operation and safety technology demonstration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vilim, R. B.; Park, Y. S.; Grandy, C.
2012-07-30
A simulator platform for visualization and demonstration of innovative concepts in fast reactor technology is described. The objective is to make more accessible the workings of fast reactor technology innovations and to do so in a human factors environment that uses state-of-the-art visualization technologies. In this work the computer codes in use at Argonne National Laboratory (ANL) for the design of fast reactor systems are being integrated to run on this platform. This includes linking reactor systems codes with mechanical structures codes and using advanced graphics to depict the thermo-hydraulic-structure interactions that give rise to an inherently safe response to upsets. It also includes visualization of mechanical systems operation including advanced concepts that make use of robotics for operations, in-service inspection, and maintenance.
Tumor taxonomy for the developmental lineage classification of neoplasms
Berman, Jules J
2004-01-01
Background: The new "Developmental lineage classification of neoplasms" was described in a prior publication. The classification is simple (the entire hierarchy is described with just 39 classifiers), comprehensive (providing a place for every tumor of man), and consistent with recent attempts to characterize tumors by cytogenetic and molecular features. A taxonomy is a list of the instances that populate a classification. The taxonomy of neoplasia attempts to list every known term for every known tumor of man. Methods: The taxonomy provides each concept with a unique code and groups synonymous terms under the same concept. A Perl script validated successive drafts of the taxonomy ensuring that: 1) each term occurs only once in the taxonomy; 2) each term occurs in only one tumor class; 3) each concept code occurs in one and only one hierarchical position in the classification; and 4) the file containing the classification and taxonomy is a well-formed XML (eXtensible Markup Language) document. Results: The taxonomy currently contains 122,632 different terms encompassing 5,376 neoplasm concepts. Each concept has, on average, 23 synonyms. The taxonomy populates "The developmental lineage classification of neoplasms," and is available as an XML file, currently 9+ Megabytes in length. A representation of the classification/taxonomy listing each term followed by its code, followed by its full ancestry, is available as a flat-file, 19+ Megabytes in length. The taxonomy is the largest nomenclature of neoplasms, with more than twice the number of neoplasm names found in other medical nomenclatures, including the 2004 version of the Unified Medical Language System, the Systematized Nomenclature of Medicine Clinical Terminology, the National Cancer Institute's Thesaurus, and the International Classification of Diseases for Oncology. Conclusions: This manuscript describes a comprehensive taxonomy of neoplasia that collects synonymous terms under a unique code number and assigns each tumor to a single class within the tumor hierarchy. The entire classification and taxonomy are available as open access files (in XML and flat-file formats) with this article. PMID:15571625
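The four validation checks lend themselves to a compact illustration. Below is a Python analogue of the checks the authors describe (their original script was written in Perl); the XML layout assumed here, with each concept element carrying a code attribute and containing term children, is hypothetical and may differ from the published file, and concept codes stand in for tumor classes.

    import xml.etree.ElementTree as ET
    from collections import Counter

    def validate(path):
        tree = ET.parse(path)            # raises if XML is malformed (check 4)
        term_counts = Counter()
        term_codes = {}
        code_counts = Counter()
        for concept in tree.iter("concept"):
            code = concept.get("code")
            code_counts[code] += 1       # a count > 1 would violate check 3
            for term in concept.iter("term"):
                t = (term.text or "").strip().lower()
                term_counts[t] += 1      # a count > 1 would violate check 1
                term_codes.setdefault(t, set()).add(code)  # >1 code: check 2
        return ([t for t, n in term_counts.items() if n > 1],
                [t for t, c in term_codes.items() if len(c) > 1],
                [c for c, n in code_counts.items() if n > 1])

    # Hypothetical usage, assuming a file name:
    # dup_terms, multi_class, dup_codes = validate("taxonomy.xml")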
Research Prototype: Automated Analysis of Scientific and Engineering Semantics
NASA Technical Reports Server (NTRS)
Stewart, Mark E. M.; Follen, Greg (Technical Monitor)
2001-01-01
Physical and mathematical formulae and concepts are fundamental elements of scientific and engineering software. These classical equations and methods are time tested, universally accepted, and relatively unambiguous. The existence of this classical ontology suggests an ideal problem for automated comprehension. This problem is further motivated by the pervasive use of scientific code and high code development costs. To investigate code comprehension in this classical knowledge domain, a research prototype has been developed. The prototype incorporates scientific domain knowledge to recognize code properties (including units and physical and mathematical quantities). The procedure also implements programming language semantics to propagate these properties through the code. This prototype's ability to elucidate code and detect errors will be demonstrated with state-of-the-art scientific codes.
Performance Cycle Analysis of a Two-Spool, Separate-Exhaust Turbofan With Interstage Turbine Burner
NASA Technical Reports Server (NTRS)
Liew, K. H.; Urip, E.; Yang, S. L.; Mattingly, J. D.; Marek, C. J.
2005-01-01
This paper presents the performance cycle analysis of a dual-spool, separate-exhaust turbofan engine with an Interstage Turbine Burner (ITB) serving as a secondary combustor. The ITB, which is located in the transition duct between the high- and the low-pressure turbines, is a relatively new concept for increasing specific thrust and lowering pollutant emissions in modern jet engine propulsion. A detailed performance analysis of this engine has been conducted for steady-state engine performance prediction. A code has been written that is capable of predicting engine performance (i.e., thrust and thrust-specific fuel consumption) at varying flight conditions and throttle settings. Two design-point engines were studied to reveal trends in performance at both full and partial throttle operation. A mission analysis is also presented to confirm the fuel-saving advantage of adding the ITB.
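As a point of reference for the two performance measures such a code reports, here is a minimal sketch (illustrative values only, not results from the paper) of how net thrust and the fuel flows of the main burner and the ITB combine into thrust-specific fuel consumption.

    # Thrust-specific fuel consumption with two fuel-consuming combustors.
    # All numbers are hypothetical placeholders.

    def tsfc(thrust_n, mdot_fuel_main, mdot_fuel_itb):
        """TSFC in kg/(N*s): total fuel flow divided by net thrust."""
        return (mdot_fuel_main + mdot_fuel_itb) / thrust_n

    F = 80_000.0              # net thrust, N (hypothetical)
    print(tsfc(F, 1.6, 0.4))  # 2.5e-05 kg/(N*s)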
Automatic programming of simulation models
NASA Technical Reports Server (NTRS)
Schroer, Bernard J.; Tseng, Fan T.; Zhang, Shou X.; Dwan, Wen S.
1988-01-01
The objective of automatic programming is to improve the overall environment for describing the program. This improved environment is realized by a reduction in the amount of detail that the programmer needs to know and is exposed to. Furthermore, this improved environment is achieved by a specification language that is more natural to the user's problem domain and to the user's way of thinking and looking at the problem. The goal of this research is to apply the concepts of automatic programming (AP) to the modeling of discrete event simulation systems. Specific emphasis is on the design and development of simulation tools to assist the modeler in defining or constructing a model of the system and then automatically writing the corresponding simulation code in the target simulation language, GPSS/PC. A related goal is to evaluate the feasibility of various languages for constructing automatic programming simulation tools.
ERIC Educational Resources Information Center
Budoff, Milton; And Others
This three volume report presents findings from an interview study with 103 children and adults regarding their awareness and conceptions of handicapping conditions and from a followup study of preschool handicapped and nonhandicapped students. Volume I details the design and results of the interview study focusing on Ss in five age groups:…
Advanced technology development for image gathering, coding, and processing
NASA Technical Reports Server (NTRS)
Huck, Friedrich O.
1990-01-01
Three overlapping areas of research activities are presented: (1) Information theory and optimal filtering are extended to visual information acquisition and processing. The goal is to provide a comprehensive methodology for quantitatively assessing the end-to-end performance of image gathering, coding, and processing. (2) Focal-plane processing techniques and technology are developed to combine effectively image gathering with coding. The emphasis is on low-level vision processing akin to the retinal processing in human vision. (3) A breadboard adaptive image-coding system is being assembled. This system will be used to develop and evaluate a number of advanced image-coding technologies and techniques as well as research the concept of adaptive image coding.
Personalisation: The Emerging "Revised" Code of Education?
ERIC Educational Resources Information Center
Hartley, David
2007-01-01
In England, a "revised" educational code appears to be emerging. It centres upon the concept of "personalisation". Its basis is less in educational theory, more in contemporary marketing theory. Personalisation can be regarded in two ways. First, it provides the rationale for a new mode of public-service delivery, one which…
Enhancing Undergraduate Mathematics Curriculum via Coding Theory and Cryptography
ERIC Educational Resources Information Center
Aydin, Nuh
2009-01-01
The theory of error-correcting codes and cryptography are two relatively recent applications of mathematics to information and communication systems. The mathematical tools used in these fields generally come from algebra, elementary number theory, and combinatorics, including concepts from computational complexity. It is possible to introduce the…
Beacon- and Schema-Based Method for Recognizing Algorithms from Students' Source Code
ERIC Educational Resources Information Center
Taherkhani, Ahmad; Malmi, Lauri
2013-01-01
In this paper, we present a method for recognizing algorithms from students' programming submissions coded in Java. The method is based on the concept of "programming schemas" and "beacons". Schemas are high-level programming knowledge with detailed knowledge abstracted out, and beacons are statements that imply specific…
Chiang, Michael F; Casper, Daniel S; Cimino, James J; Starren, Justin
2005-02-01
To assess the adequacy of 5 controlled medical terminologies (International Classification of Diseases 9, Clinical Modification [ICD9-CM]; Current Procedural Terminology 4 [CPT-4]; Systematized Nomenclature of Medicine, Clinical Terms [SNOMED-CT]; Logical Observation Identifiers Names and Codes [LOINC]; Medical Entities Dictionary [MED]) for representing concepts in ophthalmology. Noncomparative case series. Twenty complete ophthalmology case presentations were sequentially selected from a publicly available ophthalmology journal. Each of the 20 cases was parsed into discrete concepts, and each concept was classified along 2 axes: (1) diagnosis, finding, or procedure and (2) ophthalmic or medical concept. Electronic or paper browsers were used to assign a code for every concept in each of the 5 terminologies. Adequacy of assignment for each concept was scored on a 3-point scale. Findings from all 20 case presentations were combined and compared based on a coverage score, which was the average score for all concepts in that terminology. Adequacy of assignment for concepts in each terminology, based on a 3-point Likert scale (0, no match; 1, partial match; 2, complete match). Cases were parsed into 1603 concepts. SNOMED-CT had the highest mean overall coverage score (1.625+/-0.667), followed by MED (0.974+/-0.764), LOINC (0.781+/-0.929), ICD9-CM (0.280+/-0.619), and CPT-4 (0.082+/-0.337). SNOMED-CT also had higher coverage scores than any of the other terminologies for concepts in the diagnosis, finding, and procedure categories. Average coverage scores for ophthalmic concepts were lower than those for medical concepts. Controlled terminologies are required for electronic representation of ophthalmology data. SNOMED-CT had significantly higher content coverage than any other terminology in this study.
Construction of self-dual codes in the Rosenbloom-Tsfasman metric
NASA Astrophysics Data System (ADS)
Krisnawati, Vira Hari; Nisa, Anzi Lina Ukhtin
2017-12-01
A linear code is a very basic code and very useful in coding theory. Generally, a linear code is a code over a finite field with the Hamming metric. Among the most interesting families of codes, the family of self-dual codes is a very important one, because it contains some of the best known error-correcting codes. The concept of the Hamming metric has been developed into the Rosenbloom-Tsfasman metric (RT-metric). The inner product in the RT-metric is different from the Euclidean inner product that is used to define duality in the Hamming metric. Most of the codes which are self-dual in the Hamming metric are not so in the RT-metric. Moreover, the generator matrix is very important in constructing a code because it contains a basis of the code. Therefore, in this paper we give some theorems and methods for constructing self-dual codes in the RT-metric by considering properties of the inner product and the generator matrix. We also illustrate examples for every kind of construction.
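To make the metric concrete, here is a small didactic sketch (not code from the paper) of the RT weight and RT distance for vectors over a prime field, i.e. the single-row case of the metric; the general matrix case sums this row weight over all rows.

    # Rosenbloom-Tsfasman (RT) weight and distance for vectors over F_p.

    def rt_weight(x):
        """RT weight: 1-based index of the last nonzero entry (0 for zero)."""
        for i in range(len(x) - 1, -1, -1):
            if x[i] != 0:
                return i + 1
        return 0

    def rt_distance(x, y, p):
        """RT distance over F_p: RT weight of the coordinatewise difference."""
        return rt_weight([(a - b) % p for a, b in zip(x, y)])

    print(rt_weight([1, 0, 1, 0]))                     # 3
    print(rt_distance([1, 0, 1, 0], [1, 0, 1, 1], 2))  # 4 (last slot differs)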
Techniques for the analysis of data from coded-mask X-ray telescopes
NASA Technical Reports Server (NTRS)
Skinner, G. K.; Ponman, T. J.; Hammersley, A. P.; Eyles, C. J.
1987-01-01
Several techniques useful in the analysis of data from coded-mask telescopes are presented. Methods of handling changes in the instrument pointing direction are reviewed and ways of using FFT techniques to do the deconvolution considered. Emphasis is on techniques for optimally-coded systems, but it is shown that the range of systems included in this class can be extended through the new concept of 'partial cycle averaging'.
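As a toy illustration of the FFT-based deconvolution mentioned above (one-dimensional, single source, random mask; real instruments are two-dimensional and use balanced decoding arrays and pointing corrections), the sketch below forms a detector image by circular convolution with the mask and recovers the source position by cross-correlation.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 64
    mask = rng.integers(0, 2, n).astype(float)  # random open/closed pattern
    sky = np.zeros(n)
    sky[10] = 1.0                               # a single point source

    # Forward model: circular convolution of sky with mask.
    detector = np.real(np.fft.ifft(np.fft.fft(sky) * np.fft.fft(mask)))

    # Decoding: circular cross-correlation of detector data with the mask.
    recon = np.real(np.fft.ifft(np.fft.fft(detector) * np.conj(np.fft.fft(mask))))
    print(int(np.argmax(recon)))                # 10: the source position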
Asymmetric Memory Circuit Would Resist Soft Errors
NASA Technical Reports Server (NTRS)
Buehler, Martin G.; Perlman, Marvin
1990-01-01
Some nonlinear error-correcting codes are more efficient in the presence of asymmetry. A combination of circuit-design and coding concepts is expected to make integrated-circuit random-access memories more resistant to "soft" errors (temporary bit errors, also called "single-event upsets", due to ionizing radiation). An integrated circuit of the new type is made deliberately more susceptible to one kind of bit error than to the other, and the associated error-correcting code is adapted to exploit this asymmetry in error probabilities.
Jamoulle, Marc; Resnick, Melissa; Grosjean, Julien; Ittoo, Ashwin; Cardillo, Elena; Vander Stichele, Robert; Darmoni, Stefan; Vanmeerbeek, Marc
2018-12-01
While documentation of clinical aspects of General Practice/Family Medicine (GP/FM) is assured by the International Classification of Primary Care (ICPC), there is no taxonomy for the professional aspects (context and management) of GP/FM. Objective: To present the development, dissemination, applications, and resulting face validity of the Q-Codes taxonomy specifically designed to describe contextual features of GP/FM, proposed as an extension to the ICPC. Development: The Q-Codes taxonomy was developed from Lamberts' seminal idea for indexing contextual content (1987) by a multi-disciplinary team of knowledge engineers, linguists, and general practitioners, through a qualitative and iterative analysis of 1702 abstracts from six GP/FM conferences using Atlas.ti software. A total of 182 concepts, called Q-Codes, representing professional aspects of GP/FM were identified and organized in a taxonomy. Dissemination: The taxonomy is published as an online terminological resource, using semantic web techniques and web ontology language (OWL) (http://www.hetop.eu/Q). Each Q-Code is identified with a unique resource identifier (URI) and provided with preferred terms and scope notes in ten languages (Portuguese, Spanish, English, French, Dutch, Korean, Vietnamese, Turkish, Georgian, German), as well as search filters for MEDLINE and web searches. Applications: This taxonomy has already been used to support queries in bibliographic databases (e.g., MEDLINE), to facilitate indexing of grey literature in GP/FM such as congress abstracts, master theses, and websites, and as an educational tool in vocational teaching. Conclusions: The rapidly growing list of practical applications provides face validity for the usefulness of this freely available new terminological resource.
Internalism, Externalism and Coding
ERIC Educational Resources Information Center
Carr, Philip
2007-01-01
I examine some of the issues connected with the internalist/externalist distinction in work on the ontology of language. I note that Chomskyan radical internalism necessarily leads to a passive conception of child language acquisition. I reject that passive conception, and support current versions of constructivism [Tomasello, M., 2001. "The…
ERIC Educational Resources Information Center
Booth, Tony
1994-01-01
This article looks at two concepts in the British 1993 draft Code of Practice concerning students with special needs: the concepts of a "continuum of needs" and a "continuum of provision." Issues involved in connecting the two continua are addressed, including whether service delivery decisions should be based on severity of…
Promoting Transfer of Ecosystems Concepts
ERIC Educational Resources Information Center
Yu, Yawen; Hmelo-Silver, Cindy E.; Jordan, Rebecca; Eberbach, Catherine; Sinha, Suparna
2016-01-01
This study examines to what extent students transferred their knowledge from a familiar aquatic ecosystem to an unfamiliar rainforest ecosystem after participating in a technology-rich inquiry curriculum. We coded students' drawings for components of important ecosystems concepts at pre- and posttest. Our analysis examined the extent to which each…
A Third Approach to Gene Prediction Suggests Thousands of Additional Human Transcribed Regions
Glusman, Gustavo; Qin, Shizhen; El-Gewely, M. Raafat; Siegel, Andrew F; Roach, Jared C; Hood, Leroy; Smit, Arian F. A
2006-01-01
The identification and characterization of the complete ensemble of genes is a main goal of deciphering the digital information stored in the human genome. Many algorithms for computational gene prediction have been described, ultimately derived from two basic concepts: (1) modeling gene structure and (2) recognizing sequence similarity. Successful hybrid methods combining these two concepts have also been developed. We present a third orthogonal approach to gene prediction, based on detecting the genomic signatures of transcription, accumulated over evolutionary time. We discuss four algorithms based on this third concept: Greens and CHOWDER, which quantify mutational strand biases caused by transcription-coupled DNA repair, and ROAST and PASTA, which are based on strand-specific selection against polyadenylation signals. We combined these algorithms into an integrated method called FEAST, which we used to predict the location and orientation of thousands of putative transcription units not overlapping known genes. Many of the newly predicted transcriptional units do not appear to code for proteins. The new algorithms are particularly adept at detecting genes with long introns and lacking sequence conservation. They therefore complement existing gene prediction methods and will help identify functional transcripts within many apparent “genomic deserts.” PMID:16543943
Lam, Raymond; Kruger, Estie; Tennant, Marc
2014-12-01
One disadvantage of the remarkable achievements in dentistry is that treatment options have never been more varied or confusing. This has made the concept of Evidence-Based Dentistry more applicable to modern dental practice. Despite merit in the concept, whereby clinical decisions are guided by scientific evidence, there are problems with establishing a scientific base. Nowhere is this more challenging than in modern dentistry, where the gap between rapidly developing products/procedures and their evidence base is widening. Furthermore, the burden of oral disease remains high at the population level. These problems have prompted new approaches to enhancing research. The aim of this paper is to outline how a modified approach to dental coding may benefit clinical and population-level research. Using publicly accessible data obtained from the Australian Chronic Disease Dental Scheme (CDDS) and item codes contained within the Australian Schedule of Dental Services and Glossary, a suggested approach to dental informatics is illustrated. A selection of item codes is expanded with the addition of suffixes; these suffixes provide circumstantial information that will assist in assessing clinical outcomes such as success rates and prognosis. The use of item codes in administering the CDDS yielded a large database of item codes. These codes are amenable to dental informatics, which has been shown to enhance research at both the clinical and population levels. This is a cost-effective method to supplement existing research methods. Copyright © 2014 Elsevier Inc. All rights reserved.
An Experiment in Scientific Program Understanding
NASA Technical Reports Server (NTRS)
Stewart, Mark E. M.; Owen, Karl (Technical Monitor)
2000-01-01
This paper concerns a procedure that analyzes aspects of the meaning or semantics of scientific and engineering code. This procedure involves taking a user's existing code, adding semantic declarations for some primitive variables, and parsing this annotated code using multiple, independent expert parsers. These semantic parsers encode domain knowledge and recognize formulae in different disciplines including physics, numerical methods, mathematics, and geometry. The parsers will automatically recognize and document some static, semantic concepts and help locate some program semantic errors. Results are shown for three intensively studied codes and seven blind test cases; all test cases are state of the art scientific codes. These techniques may apply to a wider range of scientific codes. If so, the techniques could reduce the time, risk, and effort required to develop and modify scientific codes.
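To make the property-propagation idea concrete, here is a toy sketch (not the prototype itself) in which primitive variables carry declared units and simple language semantics propagate or reject them; the real parsers handle far richer physics, numerics, and geometry knowledge.

    # Toy semantic property propagation: multiplication combines unit
    # exponents, and addition demands matching units, so a unit mismatch
    # surfaces as a detectable semantic error.

    def combine(u1, u2):
        out = dict(u1)
        for base, exp in u2.items():
            out[base] = out.get(base, 0) + exp
            if out[base] == 0:
                del out[base]
        return out

    class Quantity:
        def __init__(self, units):
            self.units = dict(units)       # e.g. {"m": 1, "s": -1}
        def __mul__(self, other):
            return Quantity(combine(self.units, other.units))
        def __add__(self, other):
            if self.units != other.units:  # the semantic error case
                raise TypeError(f"unit mismatch: {self.units} vs {other.units}")
            return Quantity(self.units)

    velocity = Quantity({"m": 1, "s": -1})
    dt = Quantity({"s": 1})
    print((velocity * dt).units)           # {'m': 1}: units propagate
    try:
        velocity + dt                      # flagged, as a parser would flag it
    except TypeError as e:
        print(e)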
2002 CNA Code of Ethics: some recommendations.
Kikuchi, June F
2004-07-01
The Canadian Nurses Association (CNA) recently revised its 1997 Code of Ethics for Registered Nurses to reflect the context within which nurses practise today. Given the unprecedented changes that have taken place within the profession, healthcare and society, it was timely for the CNA to review and revise its Code. But the revisions were relatively minor; important problematic, substantive aspects of the Code were essentially left untouched and persist in the updated 2002 Code. In this paper, three of those aspects are examined and discussed: the 2002 Code's (a) definition of health and well-being, (b) notion of respect and (c) conception of justice. Recommendations are made. It is hoped that these comments will encourage nurse leaders in Canada to initiate discussion of the Code now, in preparation for its next planned revision in 2007.
With or without you: predictive coding and Bayesian inference in the brain
Aitchison, Laurence; Lengyel, Máté
2018-01-01
Two theoretical ideas have emerged recently with the ambition to provide a unifying functional explanation of neural population coding and dynamics: predictive coding and Bayesian inference. Here, we describe the two theories and their combination into a single framework: Bayesian predictive coding. We clarify how the two theories can be distinguished, despite sharing core computational concepts and addressing an overlapping set of empirical phenomena. We argue that predictive coding is an algorithmic / representational motif that can serve several different computational goals of which Bayesian inference is but one. Conversely, while Bayesian inference can utilize predictive coding, it can also be realized by a variety of other representations. We critically evaluate the experimental evidence supporting Bayesian predictive coding and discuss how to test it more directly. PMID:28942084
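A minimal numerical sketch of the shared motif may help: a unit keeps a prediction of its input, transmits only the prediction error, and nudges the prediction by a learning rate; when that rate is set from prior and likelihood precisions, the identical update performs exact Gaussian Bayesian inference. This is a didactic reduction, not a model from the review.

    # Predictive coding update that doubles as Gaussian posterior inference.

    def predictive_coding_update(mu, x, lr):
        error = x - mu          # only the prediction error is propagated
        return mu + lr * error  # prediction moves towards the input

    # Gaussian example: prior N(mu0, 1/p_prior), likelihood N(x, 1/p_like).
    mu0, p_prior, x, p_like = 0.0, 1.0, 2.0, 3.0
    lr = p_like / (p_prior + p_like)  # precision-weighted step size
    posterior_mean = predictive_coding_update(mu0, x, lr)
    print(posterior_mean)  # 1.5 == (p_prior*mu0 + p_like*x) / (p_prior + p_like)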
SYMTRAN - A Time-dependent Symmetric Tandem Mirror Transport Code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hua, D; Fowler, T
2004-06-15
A time-dependent version of the steady-state radial transport model in symmetric tandem mirrors in Ref. [1] has been coded up and first tests performed. Our code, named SYMTRAN, is an adaptation of the earlier SPHERE code for spheromaks, now modified for tandem mirror physics. Motivated by Post's new concept of kinetic stabilization of symmetric mirrors, it is an extension of the earlier TAMRAC rate-equation code omitting radial transport [2], which successfully accounted for experimental results in TMX. The SYMTRAN code differs from the earlier tandem mirror radial transport code TMT in that our code is focused on axisymmetric tandem mirrors and classical diffusion, whereas TMT emphasized non-ambipolar transport in TMX and MFTF-B due to yin-yang plugs and non-symmetric transitions between the plugs and axisymmetric center cell. Both codes exhibit interesting but different non-linear behavior.
Emergence of biological organization through thermodynamic inversion.
Kompanichenko, Vladimir
2014-01-01
Biological organization arises under thermodynamic inversion in prebiotic systems, which provides the prevalence of the free energy and information contributions over the entropy contribution. The inversion might occur under specific far-from-equilibrium conditions in prebiotic systems oscillating around a bifurcation point. At the moment of inversion, the (physical) information characteristic of non-biological systems acquires new features: functionality, purposefulness, and control over life processes, which transform it into biological information. Random sequences of amino acids and nucleotides, spontaneously synthesized in the prebiotic microsystem, re-assemble in the primary living unit (probiont) into functional sequences that become involved in bioinformation circulation through nucleoprotein interactions, resulting in the emergence of the genetic code. According to the proposed concept, oscillating three-dimensional prebiotic microsystems transformed into probionts in the changeable hydrothermal medium of the early Earth. The inversion concept states that spontaneous (accidental, random) transformations in prebiotic systems cannot produce life; it is only non-spontaneous (perspective, purposeful) transformations, which are the result of thermodynamic inversion, that lead to the negentropic conversion of prebiotic systems into initial living units.
Mastery of Content Representation (CoRes) Related TPACK High School Biology Teacher
NASA Astrophysics Data System (ADS)
Nasution, W. R.; Sriyati, S.; Riandi, R.; Safitri, M.
2017-09-01
The purpose of this study was to determine teachers' mastery of Content Representation (CoRes) related to the integration of technology and pedagogy in teaching Biology (TPACK). This research uses a descriptive method. Data were collected using the CoRes instrument as the primary source and semi-structured interviews as supporting data. The subjects were biology teachers of class X MIA from four schools in Bandung. The CoRes produced by the teachers were analyzed using a CoRes scoring rubric with codes 1-3 and then categorized into upper, middle, or lower groups. The results showed that two teachers were in the lower category, meaning that their mastery of defining the essential concepts in the CoRes was not yet detailed and specific. Meanwhile, the two other teachers were in the middle category, meaning that their ability to determine the essential concepts in the CoRes is still inadequate and needs to be improved.
NASA Technical Reports Server (NTRS)
Mularz, Edward J.; Sockol, Peter M.
1987-01-01
Future aerospace propulsion concepts involve the combination of liquid or gaseous fuels in a highly turbulent internal air stream. Accurate predictive computer codes which can simulate the fluid mechanics, chemistry, and turbulence combustion interaction of these chemical reacting flows will be a new tool that is needed in the design of these future propulsion concepts. Experimental and code development research is being performed at Lewis to better understand chemical reacting flows with the long term goal of establishing these reliable computer codes. The approach to understanding chemical reacting flows is to look at separate simple parts of this complex phenomena as well as to study the full turbulent reacting flow process. As a result research on the fluid mechanics associated with chemical reacting flows was initiated. The chemistry of fuel-air combustion is also being studied. Finally, the phenomena of turbulence-combustion interaction is being investigated. This presentation will highlight research, both experimental and analytical, in each of these three major areas.
NASA Technical Reports Server (NTRS)
Mularz, Edward J.; Sockol, Peter M.
1990-01-01
Future aerospace propulsion concepts involve the combustion of liquid or gaseous fuels in a highly turbulent internal airstream. Accurate predictive computer codes which can simulate the fluid mechanics, chemistry, and turbulence-combustion interaction of these chemical reacting flows will be a new tool that is needed in the design of these future propulsion concepts. Experimental and code development research is being performed at LeRC to better understand chemical reacting flows with the long-term goal of establishing these reliable computer codes. Our approach to understand chemical reacting flows is to look at separate, more simple parts of this complex phenomenon as well as to study the full turbulent reacting flow process. As a result, we are engaged in research on the fluid mechanics associated with chemical reacting flows. We are also studying the chemistry of fuel-air combustion. Finally, we are investigating the phenomenon of turbulence-combustion interaction. Research, both experimental and analytical, is highlighted in each of these three major areas.
NetCoDer: A Retransmission Mechanism for WSNs Based on Cooperative Relays and Network Coding
Valle, Odilson T.; Montez, Carlos; Medeiros de Araujo, Gustavo; Vasques, Francisco; Moraes, Ricardo
2016-01-01
Some of the most difficult problems to deal with when using Wireless Sensor Networks (WSNs) are related to the unreliable nature of communication channels. In this context, the use of cooperative diversity techniques and the application of network coding concepts may be promising solutions to improve the communication reliability. In this paper, we propose the NetCoDer scheme to address this problem. Its design is based on merging cooperative diversity techniques and network coding concepts. We evaluate the effectiveness of the NetCoDer scheme through both an experimental setup with real WSN nodes and a simulation assessment, comparing NetCoDer performance against state-of-the-art TDMA-based (Time Division Multiple Access) retransmission techniques: BlockACK, Master/Slave and Redundant TDMA. The obtained results highlight that the proposed NetCoDer scheme clearly improves the network performance when compared with other retransmission techniques. PMID:27258280
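The benefit of combining relaying with coding can be seen in the textbook XOR example below; this is a generic illustration of the network coding idea, not the NetCoDer protocol itself. One coded retransmission from a relay repairs two different losses at once.

    # XOR network coding: one relay transmission repairs two distinct losses.

    def xor_bytes(a, b):
        return bytes(x ^ y for x, y in zip(a, b))

    p1, p2 = b"packet-1", b"packet-2"
    coded = xor_bytes(p1, p2)        # the single coded retransmission

    # Node A holds p2 but lost p1; node B holds p1 but lost p2.
    recovered_p1 = xor_bytes(coded, p2)
    recovered_p2 = xor_bytes(coded, p1)
    print(recovered_p1 == p1, recovered_p2 == p2)  # True True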
Life is physics and chemistry and communication.
Witzany, Guenther
2015-04-01
Manfred Eigen extended Erwin Schroedinger's concept of "life is physics and chemistry" through the introduction of information theory and cybernetic systems theory into "life is physics and chemistry and information." Based on this assumption, Eigen developed the concepts of quasispecies and hypercycles, which have been dominant in molecular biology and virology ever since. He insisted that the genetic code is not just used metaphorically: it represents a real natural language. However, the basics of scientific knowledge changed dramatically within the second half of the 20th century. Unfortunately, Eigen ignored the results of the philosophy of science discourse on essential features of natural languages and codes: a natural language or code emerges from populations of living agents that communicate. This contribution will look at some of the highlights of this historical development and the results relevant for biological theories about life. © 2014 New York Academy of Sciences.
Semantic Interoperability of Health Risk Assessments
Rajda, Jay; Vreeman, Daniel J.; Wei, Henry G.
2011-01-01
The health insurance and benefits industry has administered Health Risk Assessments (HRAs) at an increasing rate. These are used to collect data on modifiable health risk factors for wellness and disease management programs. However, there is significant variability in the semantics of these assessments, making it difficult to compare data sets from the output of 2 different HRAs. There is also an increasing need to exchange this data with Health Information Exchanges and Electronic Medical Records. To standardize the data and concepts from these tools, we outline a process to determine the presence of certain common elements of modifiable health risk extracted from these surveys. This information is coded using concept identifiers, which allows cross-survey comparison and analysis. We propose that using LOINC codes or other universal coding schema may allow semantic interoperability of a variety of HRA tools across the industry, research, and clinical settings. PMID:22195174
Getting Started in Classroom Computing.
ERIC Educational Resources Information Center
Ahl, David H.
Written for secondary students, this booklet provides an introduction to several computer-related concepts through a set of six classroom games, most of which can be played with little more than a sheet of paper and a pencil. The games are: 1) SECRET CODES--introduction to binary coding, punched cards, and paper tape; 2) GUESS--efficient methods…
Personalized Guideline-Based Treatment Recommendations Using Natural Language Processing Techniques.
Becker, Matthias; Böckmann, Britta
2017-01-01
Clinical guidelines and clinical pathways are accepted and proven instruments for quality assurance and process optimization. Today, electronic representations of clinical guidelines exist as unstructured text, but they are not well integrated with patient-specific information from electronic health records. Consequently, the generic content of the clinical guidelines is accessible, but it is not possible to visualize the position of the patient on the clinical pathway, and decision support cannot be provided by personalized guidelines for the next treatment step. The Systematized Nomenclature of Medicine - Clinical Terms (SNOMED CT) provides a common reference terminology as well as the semantic link for combining the pathways and the patient-specific information. This paper proposes a model-based approach to support the development of guideline-compliant pathways combined with patient-specific structured and unstructured information using SNOMED CT. To identify SNOMED CT concepts, software was developed that extracts SNOMED CT codes from structured and unstructured German data and maps them to clinical pathways annotated in accordance with the systematized nomenclature.
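At its simplest, such extraction can be pictured as dictionary lookup over the text, as in the naive sketch below; the German terms and the code values are illustrative placeholders rather than a vetted SNOMED CT subset, and the real system uses proper NLP techniques rather than exact matching.

    import re

    # Naive dictionary-based concept extraction: look up known term strings
    # and return their codes. Terms and codes are placeholders.

    LEXICON = {
        "mammakarzinom": "254837009",   # placeholder code for illustration
        "chemotherapie": "367336001",   # placeholder code for illustration
    }

    def extract_concepts(text):
        found = []
        for term, code in LEXICON.items():
            if re.search(r"\b" + re.escape(term) + r"\b", text.lower()):
                found.append((term, code))
        return found

    print(extract_concepts("Patientin mit Mammakarzinom, Chemotherapie geplant."))
    # [('mammakarzinom', '254837009'), ('chemotherapie', '367336001')]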
Ribeiro, Aridiane Alves; Arantes, Cássia Irene Spinelli; Gualda, Dulce Maria Rosa; Rossi, Lídia Aparecida
2017-06-01
This case study aimed to interpret the underlying historical and cultural aspects of the provision of care at an indigenous healthcare service facility. This is interpretive, case-study research with a qualitative approach, conducted in 2012 at the Indigenous Health Support Center (CASAI) of the State of Mato Grosso do Sul, Brazil. Data were collected by means of systematic observation, documentary analyses, and semi-structured interviews with ten health professionals. Data review was performed according to an approach based on social anthropology and health anthropology. The anthropological concepts of social code and ethnocentrism underpinned the interpretation of outcomes. Two categories were identified: CASAI, a space between street and village; and ethnocentrism and indigenous health care. Healthcare practice and the current social code influence each other. The street social code prevails in the social environment under study. The institutional organization and the professionals' appreciation of the indigenous biological body are decisive to the provision of care from the street social code perspective. The professionals' concepts evidence ethnocentrism in healthcare. Workers, however, try to adopt a relativized view vis-à-vis indigenous people at CASAI.
Advanced Modulation and Coding Technology Conference
NASA Technical Reports Server (NTRS)
1992-01-01
The objectives, approach, and status of all current LeRC-sponsored industry contracts and university grants are presented. The following topics are covered: (1) the LeRC Space Communications Program, and Advanced Modulation and Coding Projects; (2) the status of four contracts for development of proof-of-concept modems; (3) modulation and coding work done under three university grants, two small business innovation research contracts, and two demonstration model hardware development contracts; and (4) technology needs and opportunities for future missions.
Music Handbook for Primary Grades.
ERIC Educational Resources Information Center
Bowman, Doris; And Others
GRADES OR AGES: Primary grades (1, 2, and 3). SUBJECT MATTER: Music. ORGANIZATION AND PHYSICAL APPEARANCE: This guide contains a detailed outline of the basic music concepts for elementary grades with suggestions for activities which may develop understanding of the concepts. The pages of activities are color coded by grade level. There are three…
Changing the Latitudes and Attitudes about Content Analysis Research
ERIC Educational Resources Information Center
Brank, Eve M.; Fox, Kathleen A.; Youstin, Tasha J.; Boeppler, Lee C.
2008-01-01
The current research employs the use of content analysis to teach research methods concepts among students enrolled in an upper division research methods course. Students coded and analyzed Jimmy Buffett song lyrics rather than using a downloadable database or collecting survey data. Students' knowledge of content analysis concepts increased after…
[The concept of mental health deterioration in light of decisions by higher judicial bodies].
Kaya, Ahsen; Aktaş, Ekin Özgür
2014-01-01
Important arrangements were made in the Turkish Penal Code to protect individuals' sexual safety. During the judgment of sexual crimes, the testimony of medical experts is usually used for evidence collection and for determining whether a crime should be aggravated. For this reason, judicial authorities frequently request reports from physicians in all fields of medicine in their daily clinical practice. Following the implementation of the new Turkish Penal Code, the concept of mental health deterioration was frequently discussed, and it remains a topic of discussion in both law and medicine with regard to crimes against sexual immunity. It is believed that the subjects discussed in this article will provide important information for adult, child, and adolescent mental health professionals, by drawing attention to the importance of the medicolegal evaluations frequently requested from psychiatrists in their daily clinical practice and by providing an evaluation of the concept of mental health deterioration in light of judicial decisions. Over the period from the code's introduction to the present, precedents have reduced questions about how the concept must be evaluated and what it means. In this study, the decisions of Higher Judicial Bodies were reviewed, and the current meaning of the concept of mental health deterioration, and how it must be evaluated in accordance with these precedents, is presented.
Flexible Generation of Kalman Filter Code
NASA Technical Reports Server (NTRS)
Richardson, Julian; Wilson, Edward
2006-01-01
Domain-specific program synthesis can automatically generate high quality code in complex domains from succinct specifications, but the range of programs which can be generated by a given synthesis system is typically narrow. Obtaining code which falls outside this narrow scope necessitates either 1) extension of the code generator, which is usually very expensive, or 2) manual modification of the generated code, which is often difficult and which must be redone whenever changes are made to the program specification. In this paper, we describe adaptations and extensions of the AUTOFILTER Kalman filter synthesis system which greatly extend the range of programs which can be generated. Users augment the input specification with a specification of code fragments and how those fragments should interleave with or replace parts of the synthesized filter. This allows users to generate a much wider range of programs without their needing to modify the synthesis system or edit generated code. We demonstrate the usefulness of the approach by applying it to the synthesis of a complex state estimator which combines code from several Kalman filters with user-specified code. The work described in this paper allows the complex design decisions necessary for real-world applications to be reflected in the synthesized code. When executed on simulated input data, the generated state estimator was found to produce comparable estimates to those produced by a hand-coded estimator.
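For orientation, the kind of artifact being synthesized looks like the hand-written scalar sketch below (a minimal one-dimensional random-walk Kalman filter; it is illustrative and is neither AUTOFILTER output nor the paper's estimator).

    # 1-D Kalman filter: scalar random-walk state, noisy measurements z_seq.

    def kalman_1d(z_seq, x0, p0, q, r):
        x, p = x0, p0
        for z in z_seq:
            p = p + q            # predict: state is a random walk
            k = p / (p + r)      # Kalman gain
            x = x + k * (z - x)  # update with the measurement residual
            p = (1 - k) * p
        return x, p

    x, p = kalman_1d([1.1, 0.9, 1.05, 0.98], x0=0.0, p0=1.0, q=0.01, r=0.1)
    print(round(x, 3))  # estimate near 1.0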
NASA Technical Reports Server (NTRS)
Dame, L. T.; Stouffer, D. C.
1986-01-01
A tool for the mechanical analysis of nickel-base single-crystal superalloys, specifically Rene N4, used in gas turbine engine components is developed. This is achieved by a rate-dependent anisotropic constitutive model implemented in a nonlinear three-dimensional finite element code. The constitutive model is developed from metallurgical concepts utilizing a crystallographic approach. A non-Schmid's-law formulation is used to model the tension/compression asymmetry and orientation dependence in octahedral slip. Schmid's law is a good approximation to the inelastic response of the material in cube slip. The constitutive equations model the tensile behavior, creep response, and strain rate sensitivity of these alloys. Methods for deriving the material constants from standard tests are presented. The finite element implementation utilizes an initial strain method and twenty-noded isoparametric solid elements. The ability to model piecewise linear load histories is included in the finite element code. The constitutive equations are accurately and economically integrated using a second-order Adams-Moulton predictor-corrector method with a dynamic time incrementing procedure. Computed results from the finite element code are compared with experimental data for tensile, creep, and cyclic tests at 760 deg C. The strain rate sensitivity and stress relaxation capabilities of the model are evaluated.
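For reference, a second-order Adams-Moulton predictor-corrector step has the following generic shape, shown on a toy scalar problem; the paper's dynamic time-step control and the actual constitutive equations are omitted.

    import math

    # Predictor: two-step Adams-Bashforth; corrector: Adams-Moulton
    # (trapezoidal rule) evaluated at the predicted value.

    def am2_step(f, t, y, y_prev, dt):
        y_pred = y + dt * (1.5 * f(t, y) - 0.5 * f(t - dt, y_prev))
        return y + 0.5 * dt * (f(t, y) + f(t + dt, y_pred))

    f = lambda s, v: -v            # toy problem dy/dt = -y, y(0) = 1
    dt, t = 0.1, 0.0
    y, y_prev = 1.0, math.exp(dt)  # exact values at t = 0 and t = -dt
    for _ in range(10):
        y, y_prev = am2_step(f, t, y, y_prev, dt), y
        t += dt
    print(y, math.exp(-t))         # both near 0.368 at t = 1.0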
Shahraz, Saeid; Lagu, Tara; Ritter, Grant A; Liu, Xiadong; Tompkins, Christopher
2017-03-01
Selection of International Classification of Diseases (ICD)-based coded information for complex conditions such as severe sepsis is a subjective process and the results are sensitive to the codes selected. We use an innovative data exploration method to guide ICD-based case selection for severe sepsis. Using the Nationwide Inpatient Sample, we applied Latent Class Analysis (LCA) to determine if medical coders follow any uniform and sensible coding for observations with severe sepsis. We examined whether ICD-9 codes specific to sepsis (038.xx for septicemia, a subset of 995.9 codes representing Systemic Inflammatory Response Syndrome, and 785.52 for septic shock) could all be members of the same latent class. Hospitalizations coded with sepsis-specific codes could be assigned to a latent class of their own. This class constituted 22.8% of all potential sepsis observations. The probability of an observation with any sepsis-specific codes being assigned to the residual class was near 0. The chance of an observation in the residual class having a sepsis-specific code as the principal diagnosis was close to 0. Validity of the sepsis class assignment is supported by empirical results, which indicated that in-hospital deaths in the sepsis-specific class were around 4 times as likely as those in the residual class. The conventional methods of defining severe sepsis cases in observational data substantially misclassify sepsis cases. We suggest a methodology that helps reliable selection of ICD codes for conditions that require complex coding.
Genomics-Based Security Protocols: From Plaintext to Cipherprotein
NASA Technical Reports Server (NTRS)
Shaw, Harry; Hussein, Sayed; Helgert, Hermann
2011-01-01
The evolving nature of the internet will require continual advances in authentication and confidentiality protocols. Nature provides some clues as to how this can be accomplished in a distributed manner through molecular biology. Cryptography and molecular biology share certain aspects and operations that allow for a set of unified principles to be applied to problems in either venue. A concept for developing security protocols that can be instantiated at the genomics level is presented. A DNA (Deoxyribonucleic acid) inspired hash code system is presented that utilizes concepts from molecular biology. It is a keyed-Hash Message Authentication Code (HMAC) capable of being used in secure mobile Ad hoc networks. It is targeted for applications without an available public key infrastructure. Mechanics of creating the HMAC are presented as well as a prototype HMAC protocol architecture. Security concepts related to the implementation differences between electronic domain security and genomics domain security are discussed.
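The underlying keyed-hash operation that the paper's DNA-inspired construction instantiates can be illustrated with an ordinary HMAC, as in the sketch below; SHA-256 stands in for the genomics-based hash, and the key and message are placeholders.

    import hmac, hashlib

    key = b"shared-secret-key"
    message = b"telemetry frame 42"

    # Sender computes the authentication tag over the message.
    tag = hmac.new(key, message, hashlib.sha256).hexdigest()

    # Receiver recomputes the tag and compares in constant time.
    ok = hmac.compare_digest(tag, hmac.new(key, message, hashlib.sha256).hexdigest())
    print(ok)  # True: the message is authenticated under the shared key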
Link performance optimization for digital satellite broadcasting systems
NASA Astrophysics Data System (ADS)
de Gaudenzi, R.; Elia, C.; Viola, R.
The authors introduce the concept of digital direct satellite broadcasting (D-DBS), which allows unprecedented flexibility by providing a large number of audiovisual services. The concept assumes an information rate of 40 Mb/s, which is compatible with practically all present-day transponders. After discussion of the general system concept, the results of transmission system optimization are presented. Channel and interference effects are taken into account. Numerical results show that the scheme with the best performance is trellis-coded 8-PSK (phase shift keying) modulation concatenated with a Reed-Solomon block code. For a net data rate of 40 Mb/s, a bit error rate of 10^-10 can be achieved with an equivalent bit-energy-to-noise-density ratio (Eb/N0) of 9.5 dB, including channel, interference, and demodulator impairments. A link budget analysis shows how a medium-power direct-to-home TV satellite can provide multimedia services to users equipped with small (60-cm) dish antennas.
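The arithmetic connecting the quoted data rate and the Eb/N0 threshold is simple enough to sketch; the carrier-to-noise-density value below is an illustrative placeholder, not a figure from the paper.

    import math

    # Eb/N0 [dB] = C/N0 [dB-Hz] - 10*log10(bit rate)

    def ebn0_db(cn0_dbhz, bit_rate):
        return cn0_dbhz - 10.0 * math.log10(bit_rate)

    R = 40e6                           # net data rate, b/s
    print(round(ebn0_db(85.5, R), 2))  # 9.48 dB, near the 9.5 dB threshold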
Fitting perception in and to cognition.
Goldstone, Robert L; de Leeuw, Joshua R; Landy, David H
2015-02-01
Perceptual modules adapt at evolutionary, lifelong, and moment-to-moment temporal scales to better serve the informational needs of cognizers. Perceptual learning is a powerful way for an individual to become tuned to frequently recurring patterns in its specific local environment that are pertinent to its goals without requiring costly executive control resources to be deployed. Mechanisms like predictive coding, categorical perception, and action-informed vision allow our perceptual systems to interface well with cognition by generating perceptual outputs that are systematically guided by how they will be used. In classic conceptions of perceptual modules, people have access to the modules' outputs but no ability to adjust their internal workings. However, humans routinely and strategically alter their perceptual systems via training regimes that have predictable and specific outcomes. In fact, employing a combination of strategic and automatic devices for adapting perception is one of the most promising approaches to improving cognition. Copyright © 2014 Elsevier B.V. All rights reserved.
Błażej, Paweł; Wnȩtrzak, Małgorzata; Mackiewicz, Paweł
2016-12-01
One of the theories explaining the present structure of the canonical genetic code assumes that it was optimized to minimize the harmful effects of amino acid replacements resulting from nucleotide substitutions and translational errors. A way to test this concept is to find the optimal code under given criteria and compare it with the canonical genetic code. Unfortunately, the huge number of possible alternatives makes it impossible to find the optimal code using exhaustive methods in a sensible time. Therefore, heuristic methods should be applied to search the space of possible solutions. Evolutionary algorithms (EAs) seem to be one such promising approach. This class of methods is founded on both mutation and crossover operators, which are responsible for creating and maintaining the diversity of candidate solutions. These operators possess dissimilar characteristics and consequently play different roles in the process of finding the best solutions under given criteria. Therefore, the effective search for potential solutions can be improved by applying both of them, especially when these operators are devised specifically for a given problem. To study this subject, we analyze the effectiveness of algorithms for various combinations of mutation and crossover probabilities under three models of the genetic code assuming different restrictions on its structure. To achieve that, we adapt the position-based crossover operator for the most restricted model and develop a new type of crossover operator for the more general models. The applied fitness function describes the costs of amino acid replacements with regard to their polarity. Our results indicate that the usage of crossover operators can significantly improve the quality of the solutions. Moreover, the simulations with the crossover operator optimize the fitness function in a smaller number of generations than simulations without this operator. The optimal genetic codes without restrictions on their structure minimize the costs about 2.7 times better than the canonical genetic code. Interestingly, the optimal codes are dominated by amino acids characterized by polarity close to its average value over all amino acids. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
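For readers unfamiliar with the two operator families, the sketch below shows a swap mutation and a position-based crossover on permutation-encoded candidates (a generic textbook form under the assumption of a permutation encoding; it is not the authors' implementation, and their polarity-based fitness function is omitted).

    import random

    def swap_mutation(perm):
        p = perm[:]
        i, j = random.sample(range(len(p)), 2)
        p[i], p[j] = p[j], p[i]           # exchange two assignments
        return p

    def position_based_crossover(p1, p2):
        n = len(p1)
        keep = set(random.sample(range(n), n // 2))  # positions taken from p1
        child = [p1[i] if i in keep else None for i in range(n)]
        kept_genes = {p1[i] for i in keep}
        fill = iter(g for g in p2 if g not in kept_genes)  # p2's order
        return [g if g is not None else next(fill) for g in child]

    random.seed(1)
    parent1 = list(range(8))              # stand-ins for code assignments
    parent2 = parent1[::-1]
    child = position_based_crossover(parent1, parent2)
    print(child, sorted(child) == parent1)  # a valid permutation, True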
Langner, Ingo; Mikolajczyk, Rafael; Garbe, Edeltraut
2011-08-17
Health insurance claims data are increasingly used for health services research in Germany. Hospital diagnoses in these data are coded according to the International Classification of Diseases, German modification (ICD-10-GM). Due to the historical division into West and East Germany, different coding practices might persist in both former parts. Additionally, the introduction of Diagnosis Related Groups (DRGs) in Germany in 2003/2004 might have changed the coding. The aim of this study was to investigate regional and temporal variations in coding of hospitalisation diagnoses in Germany. We analysed hospitalisation diagnoses for oesophageal bleeding (OB) and upper gastrointestinal bleeding (UGIB) from the official German Hospital Statistics provided by the Federal Statistical Office. Bleeding diagnoses were classified as "specific" (origin of bleeding provided) or "unspecific" (origin of bleeding not provided) coding. We studied regional (former East versus West Germany) differences in incidence of hospitalisations with specific or unspecific coding for OB and UGIB and temporal variations between 2000 and 2005. For each year, incidence ratios of hospitalisations for former East versus West Germany were estimated with log-linear regression models adjusting for age, gender and population density. Significant differences in specific and unspecific coding between East and West Germany and over time were found for both OB and UGIB hospitalisation diagnoses. For example in 2002, incidence ratios of hospitalisations for East versus West Germany were 1.24 (95% CI 1.16-1.32) for specific and 0.67 (95% CI 0.60-0.74) for unspecific OB diagnoses and 1.43 (95% CI 1.36-1.51) for specific and 0.83 (95% CI 0.80-0.87) for unspecific UGIB. Regional differences nearly disappeared and time trends were less marked when using combined specific and unspecific diagnoses of OB or UGIB. During the study period, there were substantial regional and temporal variations in the coding of OB and UGIB diagnoses in hospitalised patients. Possible explanations for the observed regional variations are different coding preferences, further influenced by changes in coding and reimbursement rules. Analysing groups of diagnoses including specific and unspecific codes reduces the influence of varying coding practices.
Three-Dimensional Algebraic Models of the tRNA Code and 12 Graphs for Representing the Amino Acids.
José, Marco V; Morgado, Eberto R; Guimarães, Romeu Cardoso; Zamudio, Gabriel S; de Farías, Sávio Torres; Bobadilla, Juan R; Sosa, Daniela
2014-08-11
Three-dimensional algebraic models, also called Genetic Hotels, are developed to represent the Standard Genetic Code, the Standard tRNA Code (S-tRNA-C), and the Human tRNA code (H-tRNA-C). New algebraic concepts are introduced to be able to describe these models, to wit, the generalization of the 2^n-Klein Group and the concept of a subgroup coset with a tail. We found that the H-tRNA-C displayed broken symmetries in regard to the S-tRNA-C, which is highly symmetric. We also show that there are only 12 ways to represent each of the corresponding phenotypic graphs of amino acids. The averages of statistical centrality measures of the 12 graphs for each of the three codes are carried out and they are statistically compared. The phenotypic graphs of the S-tRNA-C display a common triangular prism of amino acids in 10 out of the 12 graphs, whilst the corresponding graphs for the H-tRNA-C display only two triangular prisms. The graphs exhibit disjoint clusters of amino acids when their polar requirement values are used. We contend that the S-tRNA-C is in a frozen-like state, whereas the H-tRNA-C may be in an evolving state.
Schütz, U; Reichel, H; Dreinhöfer, K
2007-01-01
We introduce a grouping system for clinical practice which allows the separation of DRG coding into specific orthopaedic groups based on anatomic regions, operative procedures, therapeutic interventions and morbidity-equivalent diagnosis groups. With this, a differentiated, goal-oriented analysis of the hospital's internal DRG data becomes possible. The group-specific difference in coding quality between the DRG groups after primary coding by the orthopaedic surgeon and final coding by the medical controlling department is analysed. In a consecutive series of 1600 patients, parallel documentation and group-specific comparison of the relevant DRG parameters were carried out in every case after primary and final coding. Analysing the group-specific share in the additional CaseMix coding, the group "spine surgery" dominated, closely followed by the groups "arthroplasty" and "surgery due to infection, tumours, diabetes". Altogether, additional cost-weight-relevant coding was necessary most frequently in the latter group (84%), followed by the group "spine surgery" (65%). In DRGs representing conservative orthopaedic treatment, documented procedures had nearly no influence on the cost weight. The introduced system of case group analysis in internal DRG documentation can lead to the detection of specific problems in primary coding and of cost-weight-relevant changes in the case mix. As an instrument for internal process control in the orthopaedic field, it can serve as a communicative interface between an economically oriented classification of hospital performance and specific problem solving by the medical staff involved in department management.
The development of an intelligent interface to a computational fluid dynamics flow-solver code
NASA Technical Reports Server (NTRS)
Williams, Anthony D.
1988-01-01
Researchers at NASA Lewis are currently developing an 'intelligent' interface to aid in the development and use of large, computational fluid dynamics flow-solver codes for studying the internal fluid behavior of aerospace propulsion systems. This paper discusses the requirements, design, and implementation of an intelligent interface to Proteus, a general purpose, 3-D, Navier-Stokes flow solver. The interface is called PROTAIS to denote its introduction of artificial intelligence (AI) concepts to the Proteus code.
Patients' Conceptions of Terms Related to Sexual Interest, Desire, and Arousal.
DeLamater, John D; Weinfurt, Kevin P; Flynn, Kathryn E
2017-11-01
Measurement of sexual function typically uses self-report, which, to work as intended, must use language that is understood consistently by diverse respondents. Commonly used measures employ multiple terms, primarily (sexual) interest, desire, and arousal, that might not be understood in the same way by laypeople and professionals. To inform self-reported measurement efforts for research and clinical settings by examining how US men and women recruited from a health care setting understand and interpret different terms. We conducted 10 focus groups in Durham, NC (N = 57). Discussions were audio-recorded and transcribed, and the content of the discussions was systematically analyzed in 2 phases of coding by the research team, facilitated by Nvivo qualitative analysis software (QSR International, Doncaster, VIC, Australia). Patient focus group discussions about the meanings and connotations of multiple terms related to sexual function, especially interest, desire, and arousal. 5 groups included male participants and 5 included female participants. Participants characterized (sexual) interest as a cognitive phenomenon and a situational response to a specific person. Similarly, they characterized (sexual) desire as a situational person-specific experience with some support for it as a cognitive phenomenon but more support for it as a physical phenomenon. In contrast, participants characterized sexual arousal as a physical phenomenon occurring in response to physical or visual stimulation and not related to a specific person. These results can help us understand how laypeople are using and responding to these terms when they are used in clinical and research settings. Patient participants in these groups were diverse in age, gender, sexual orientation, and health, with the potential to voice diverse perspectives on sexual functioning; however, the sample was limited to a single city in the southeastern United States. The meanings of interest, desire, and arousal were defined, compared, and contrasted in the context of patient focus groups. Qualitative coding showed that interest was considered the most "cognitive," arousal the most "physical," and desire somewhere in between. DeLamater JD, Weinfurt KP, Flynn KE. Patients' Conceptions of Terms Related to Sexual Interest, Desire, and Arousal. J Sex Med 2017;14:1327-1335. Copyright © 2017 International Society for Sexual Medicine. Published by Elsevier Inc. All rights reserved.
The accuracy of burn diagnosis codes in health administrative data: A validation study.
Mason, Stephanie A; Nathens, Avery B; Byrne, James P; Fowler, Rob; Gonzalez, Alejandro; Karanicolas, Paul J; Moineddin, Rahim; Jeschke, Marc G
2017-03-01
Health administrative databases may provide rich sources of data for the study of outcomes following burn. We aimed to determine the accuracy of International Classification of Diseases diagnoses codes for burn in a population-based administrative database. Data from a regional burn center's clinical registry of patients admitted between 2006-2013 were linked to administrative databases. Burn total body surface area (TBSA), depth, mechanism, and inhalation injury were compared between the registry and administrative records. The sensitivity, specificity, and positive and negative predictive values were determined, and coding agreement was assessed with the kappa statistic. 1215 burn center patients were linked to administrative records. TBSA codes were highly sensitive and specific for ≥10 and ≥20% TBSA (89/93% sensitive and 95/97% specific), with excellent agreement (κ, 0.85/κ, 0.88). Codes were weakly sensitive (68%) in identifying ≥10% TBSA full-thickness burn, though highly specific (86%) with moderate agreement (κ, 0.46). Codes for inhalation injury had limited sensitivity (43%) but high specificity (99%) with moderate agreement (κ, 0.54). Burn mechanism had excellent coding agreement (κ, 0.84). Administrative data diagnosis codes accurately identify burn by burn size and mechanism, while identification of inhalation injury or full-thickness burns is less sensitive but highly specific. Copyright © 2016 Elsevier Ltd and ISBI. All rights reserved.
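The reported quantities follow from a standard 2x2 comparison of administrative codes against the registry as reference; a minimal sketch with invented counts:

    # Validation metrics from a 2x2 table (counts below are invented).
    def validation_metrics(tp, fp, fn, tn):
        n = tp + fp + fn + tn
        po = (tp + tn) / n  # observed agreement
        # chance agreement for Cohen's kappa
        pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n ** 2
        return {
            "sensitivity": tp / (tp + fn),
            "specificity": tn / (tn + fp),
            "PPV": tp / (tp + fp),
            "NPV": tn / (tn + fn),
            "kappa": (po - pe) / (1 - pe),
        }

    print(validation_metrics(tp=180, fp=45, fn=22, tn=968))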
CFD validation needs for advanced concepts at Northrop Corporation
NASA Technical Reports Server (NTRS)
George, Michael W.
1987-01-01
Information is given in viewgraph form on the Computational Fluid Dynamics (CFD) Workshop held July 14 - 16, 1987. Topics covered include the philosophy of CFD validation, current validation efforts, the wing-body-tail Euler code, F-20 Euler simulated oil flow, and Euler Navier-Stokes code validation for 2D and 3D nozzle afterbody applications.
Functions of Code-Switching among Iranian Advanced and Elementary Teachers and Students
ERIC Educational Resources Information Center
Momenian, Mohammad; Samar, Reza Ghafar
2011-01-01
This paper reports on the findings of a study carried out on the advanced and elementary teachers' and students' functions and patterns of code-switching in Iranian English classrooms. This concept has been examined less thoroughly in L2 (second language) classroom contexts than in natural contexts outside the classroom. Therefore, besides reporting on the…
ERIC Educational Resources Information Center
Freeman, Nancy; Feeney, Stephanie; Moravcik, Eva
2003-01-01
Proposes an addendum to the National Association for the Education of Young Children's Code of Ethical Conduct concerning the unique ethical challenges facing teacher educators. Presents a conception of professional responsibility in six areas: children and families, adult students, programs hosting practicum students and programs' staffs and…
ERIC Educational Resources Information Center
Salisbury, Amy L.; Fallone, Melissa Duncan; Lester, Barry
2005-01-01
This review provides an overview and definition of the concept of neurobehavior in human development. Two neurobehavioral assessments used by the authors in current fetal and infant research are discussed: the NICU Network Neurobehavioral Assessment Scale and the Fetal Neurobehavior Coding System. This review will present how the two assessments…
78 FR 47028 - Exchange Traded Concepts, LLC, et al.; Notice of Application
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-02
..., and receive securities from, the series in connection with the purchase and redemption of Creation... similar Inside Information Policy. In accordance with the Code of Ethics [12] and Inside Information... code of ethics pursuant to rule 17j-1 under the Act and Rule 204A-1 under the Advisers Act, which...
Technical Support Document for Version 3.6.1 of the COMcheck Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bartlett, Rosemarie; Connell, Linda M.; Gowri, Krishnan
2009-09-29
This technical support document (TSD) is designed to explain the technical basis for the COMcheck software as originally developed based on the ANSI/ASHRAE/IES Standard 90.1-1989 (Standard 90.1-1989). Documentation for other national model codes and standards and specific state energy codes supported in COMcheck has been added to this report as appendices. These appendices are intended to provide technical documentation for features specific to the supported codes and for any changes made for state-specific codes that differ from the standard features that support compliance with the national model codes and standards.
Choi, Jeungok; Jenkins, Melinda L.; Cimino, James J.; White, Thomas M.; Bakken, Suzanne
2005-01-01
Objective: The authors aimed to (1) formally represent OASIS-B1 concepts using the Logical Observation Identifiers, Names, and Codes (LOINC) semantic structure; (2) demonstrate integration of OASIS-B1 concepts into a concept-oriented terminology, the Medical Entities Dictionary (MED); (3) examine potential hierarchical structures within LOINC among OASIS-B1 and other nursing terms; and (4) illustrate a Web-based implementation for OASIS-B1 data entry using Dialogix, a software tool with a set of functions that supports complex data entry. Design and Measurements: Two hundred nine OASIS-B1 items were dissected into the six elements of the LOINC semantic structure and then integrated into the MED hierarchy. Each OASIS-B1 term was matched to LOINC-coded nursing terms, Home Health Care Classification, the Omaha System, and the Sign and Symptom Check-List for Persons with HIV, and the extent of the match was judged based on a scale of 0 (no match) to 4 (exact match). OASIS-B1 terms were implemented as a Web-based survey using Dialogix. Results: Of 209 terms, 204 were successfully dissected into the elements of the LOINC semantics structure and integrated into the MED with minor revisions of MED semantics. One hundred fifty-one OASIS-B1 terms were mapped to one or more of the LOINC-coded nursing terms. Conclusion: The LOINC semantic structure offers a standard way to add home health care data to a comprehensive patient record to facilitate data sharing for monitoring outcomes across sites and to further terminology management, decision support, and accurate information retrieval for evidence-based practice. The cross-mapping results support the possibility of a hierarchical structure of the OASIS-B1 concepts within nursing terminologies in the LOINC database. PMID:15802480
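For readers unfamiliar with the dissection step, the sketch below holds one OASIS-B1 item along the six axes of a LOINC name (component, property, time aspect, system, scale, method); the values for the example item are illustrative assumptions, not the published mapping.

    # One OASIS-B1 item dissected into the six LOINC axes (values assumed).
    from dataclasses import dataclass

    @dataclass
    class LoincName:
        component: str  # what is observed
        property: str   # kind of property observed
        timing: str     # time aspect
        system: str     # system or sample, here the patient
        scale: str      # scale type (ordinal, nominal, ...)
        method: str     # method of observation

    item = LoincName(
        component="Ability to dress upper body",
        property="Find",      # a finding
        timing="Pt",          # point in time
        system="^Patient",
        scale="Ord",          # ordinal response
        method="OASIS-B1",
    )
    print(item)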
"The City Snuffs out Nature": Young People's Conceptions of and Relationship with Nature
ERIC Educational Resources Information Center
Pointon, Pam
2014-01-01
This paper reports a study of 384 13-14-year olds' written responses to open-ended questions about their understanding of and relationship with "nature." Using constant comparative method the responses were coded, categorised and themed. Most students held scientific conceptions of nature (excluding humans) and a utilitarian relationship…
Concepts of Healthful Food among Low-Income African American Women
ERIC Educational Resources Information Center
Lynch, Elizabeth B.; Holmes, Shane; Keim, Kathryn; Koneman, Sylvia A.
2012-01-01
Objective: Describe beliefs about what makes foods healthful among low-income African American women. Methods: In one-on-one interviews, 28 low-income African American mothers viewed 30 pairs of familiar foods and explained which food in the pair was more healthful and why. Responses were grouped into codes describing concepts of food…
Contagious Ideas: Vulnerability, Epistemic Injustice and Counter-Terrorism in Education
ERIC Educational Resources Information Center
O'Donnell, Aislinn
2018-01-01
The article addresses the implications of Prevent and Channel for epistemic justice. The first section outlines the background of Prevent. It draws upon Moira Gatens and Genevieve Lloyd's concept of the collective imaginary, alongside Lorraine Code's concept of epistemologies of mastery, in order to outline some of the images and imaginaries that…
Effect of sexed semen on conception rate for Holsteins in the United States
USDA-ARS?s Scientific Manuscript database
Effect of sexed-semen breedings on conception rate was investigated using US Holstein field data from January 2006 through October 2008. Sexed-semen breeding status was determined by a National Association of Animal Breeders’ 500-series marketing code or by individual breeding information in a cow o...
How to identify up to 30 colors without training: color concept retrieval by free color naming
NASA Astrophysics Data System (ADS)
Derefeldt, Gunilla A. M.; Swartling, Tiina
1994-05-01
Used as a redundant code, color is shown to be advantageous in visual search tasks. It enhances attention, detection, and recall of information. Neuropsychological and neurophysiological findings have shown color and spatial perception to be interrelated functions. Studies on eye movements show that colored symbols are easier to detect and that eye fixations are more correctly directed to color-coded symbols. Usually between 5 and 15 colors have been found useful in classification tasks, but this number can be increased to between 20 and 30 by careful selection of colors, and by a subject's practice with the identification task and familiarity with the particular colors. Recent neurophysiological findings concerning the language-concept connection in color suggest that color concept retrieval would be enhanced by free color naming or by the use of natural associations between color concepts and color words. To test this hypothesis, we had subjects give their own free associations to a set of 35 colors presented on a display. They were able to identify as many as 30 colors without training.
Parametric Weight Comparison of Current and Proposed Thermal Protection System (TPS) Concepts
NASA Technical Reports Server (NTRS)
Myers, David E.; Martin, Carl J.; Blosser, Max L.
1999-01-01
A parametric weight assessment of advanced metallic panel, ceramic blanket, and ceramic tile thermal protection systems (TPS) was conducted using an implicit, one-dimensional (1-D) thermal finite element sizing code. This sizing code contained models to account for coatings, fasteners, adhesives, and strain isolation pads. Atmospheric entry heating profiles for two vehicles, the Access to Space (ATS) rocket-powered single-stage-to-orbit (SSTO) vehicle and a proposed Reusable Launch Vehicle (RLV), were used to ensure that the trends were not unique to a particular trajectory. Eight TPS concepts were compared for a range of applied heat loads and substructural heat capacities to identify general trends. This study found the blanket TPS concepts have the lightest weights over the majority of their applicable ranges, and current technology ceramic tiles and metallic TPS concepts have similar weights. A proposed, state-of-the-art metallic system which uses a higher temperature alloy and efficient multilayer insulation was predicted to be significantly lighter than the ceramic tile systems and approaches blanket TPS weights for higher integrated heat loads.
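A minimal sketch of the kind of 1-D transient conduction calculation such a sizing code performs: an explicit finite-difference slab with an applied surface heat flux and an adiabatic back face. Material properties and the heating pulse below are placeholders, and the real code's coating, fastener, adhesive, and strain-isolation-pad models are omitted.

    # Illustrative 1-D explicit conduction solver (placeholder inputs).
    import numpy as np

    def peak_backface_temp(k, rho, cp, thickness, q_profile, dt, n_nodes=21):
        """Surface flux history q_profile [W/m^2 each step] on a slab with an
        adiabatic back face; returns the peak back-face temperature [K]."""
        dx = thickness / (n_nodes - 1)
        alpha = k / (rho * cp)
        r = alpha * dt / dx ** 2
        assert r <= 0.5, "explicit stability limit violated"
        T = np.full(n_nodes, 300.0)  # initial temperature, K
        peak = T[-1]
        for q in q_profile:
            Tn = T.copy()
            Tn[1:-1] = T[1:-1] + r * (T[2:] - 2 * T[1:-1] + T[:-2])
            # heated face: half-cell energy balance with applied flux q
            Tn[0] = T[0] + 2 * r * (T[1] - T[0]) + q * dt / (rho * cp * dx / 2)
            Tn[-1] = T[-1] + 2 * r * (T[-2] - T[-1])  # adiabatic back face
            T = Tn
            peak = max(peak, T[-1])
        return peak

    q_hist = [5.0e4] * 600  # 50 kW/m^2 pulse, placeholder entry heating
    print(peak_backface_temp(k=0.05, rho=100.0, cp=800.0, thickness=0.05,
                             q_profile=q_hist, dt=0.05))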
Vector processing efficiency of plasma MHD codes by use of the FACOM 230-75 APU
NASA Astrophysics Data System (ADS)
Matsuura, T.; Tanaka, Y.; Naraoka, K.; Takizuka, T.; Tsunematsu, T.; Tokuda, S.; Azumi, M.; Kurita, G.; Takeda, T.
1982-06-01
In the framework of pipelined vector architecture, the efficiency of vector processing is assessed with respect to plasma MHD codes in nuclear fusion research. By using a vector processor, the FACOM 230-75 APU, the limit of the enhancement factor due to parallelism of current vector machines is examined for three numerical codes based on a fluid model. Reasonable speed-up factors of approximately 6, 6, and 4 times faster than the highly optimized scalar version are obtained for ERATO (linear stability code), AEOLUS-R1 (nonlinear stability code) and APOLLO (1-1/2D transport code), respectively. Problems of the pipelined vector processors are discussed from the viewpoint of restructuring, optimization and choice of algorithms. In conclusion, the important concept of "concurrency within pipelined parallelism" is emphasized.
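The gap between such speed-up factors and the raw vector rate is essentially Amdahl's law: the non-vectorizable fraction of the run time bounds the overall gain. A back-of-envelope sketch with invented fractions, not measured FACOM figures:

    # Amdahl-style bound on vector speed-up (numbers are illustrative).
    def vector_speedup(vector_fraction, vector_rate_ratio):
        """Overall speed-up when vector_fraction of the scalar run time
        vectorizes and runs vector_rate_ratio times faster."""
        scalar_part = 1.0 - vector_fraction
        return 1.0 / (scalar_part + vector_fraction / vector_rate_ratio)

    for f in (0.85, 0.90, 0.95):
        print(f, round(vector_speedup(f, vector_rate_ratio=20.0), 1))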
Wong, Alex W K; Lau, Stephen C L; Fong, Mandy W M; Cella, David; Lai, Jin-Shei; Heinemann, Allen W
2018-04-03
To determine the extent to which the content of the Quality of Life in Neurological Disorders (Neuro-QoL) covers the International Classification of Functioning, Disability and Health (ICF) Core Sets for multiple sclerosis (MS), stroke, spinal cord injury (SCI), and traumatic brain injury (TBI) using summary linkage indicators. Content analysis by linking content of the Neuro-QoL to corresponding ICF codes of each Core Set for MS, stroke, SCI, and TBI. Setting: three academic centers. Participants: none. Interventions: none. Four summary linkage indicators proposed by MacDermid et al were estimated to compare the content coverage between Neuro-QoL and the ICF codes of Core Sets for MS, stroke, SCI, and TBI. Neuro-QoL represented 20% to 30% of Core Set codes for the different conditions; more codes in the Core Sets for MS (29%), stroke (28%), and TBI (28%) were covered than in those for SCI in the long-term (20%) and early postacute (19%) contexts. Neuro-QoL represented nearly half of the unique Activity and Participation codes (43%-49%) and less than one third of the unique Body Function codes (12%-32%). It represented fewer Environmental Factors codes (2%-6%) and no Body Structures codes. Absolute linkage indicators showed that at least 60% of Neuro-QoL items were linked to Core Set codes (63%-95%), but many items covered the same codes, as revealed by unique linkage indicators (7%-13%), suggesting high concept redundancy among items. The Neuro-QoL links more closely to ICF Core Sets for stroke, MS, and TBI than to those for SCI, and primarily covers activity and participation ICF domains. Other instruments are needed to address concepts not measured by the Neuro-QoL when a comprehensive health assessment is needed. Copyright © 2018 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
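A hedged sketch of set-based summary linkage indicators of the kind used above (the exact MacDermid et al definitions may differ from this reading):

    # Coverage-style linkage indicators from item-to-ICF-code links.
    def linkage_indicators(item_links, core_set):
        """item_links: dict of instrument item -> set of linked ICF codes;
        core_set: set of ICF codes in the Core Set."""
        linked = set().union(*item_links.values())
        covered = linked & core_set
        linked_items = [i for i, codes in item_links.items() if codes & core_set]
        return {
            "core_set_coverage_%": 100 * len(covered) / len(core_set),
            "items_linked_%": 100 * len(linked_items) / len(item_links),
        }

    # Toy example: three items against a five-code Core Set (codes invented).
    items = {"item1": {"d450"}, "item2": {"d450", "b130"}, "item3": {"e310"}}
    core = {"d450", "b130", "d840", "b280", "d550"}
    print(linkage_indicators(items, core))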
NASA Astrophysics Data System (ADS)
Pulsifer, P. L.; Parsons, M. A.; Duerr, R. E.; Fox, P. A.; Khalsa, S. S.; McCusker, J. P.; McGuinness, D. L.
2012-12-01
To address interoperability, we first need to understand how human perspectives and worldviews influence the way people conceive of and describe geophysical phenomena. There is never a single, unambiguous description of a phenomenon - the terminology used is based on the relationship people have with it and what their interests are. So how can these perspectives be reconciled in a way that is not only clear to different people but also formally described so that information systems can interoperate? In this paper we explore conceptions of Arctic sea ice as a means of exploring these issues. We examine multiple conceptions of sea ice and related processes as fundamental components of the Earth system. Arctic sea ice is undergoing rapid and dramatic decline. This will have huge impact on climate and biological systems as well as on shipping, exploration, human culture, and geopolitics. Local hunters, operational shipping forecasters, global climate researchers, and others have critical needs for sea ice data and information, but they conceive of, and describe sea ice phenomena in very different ways. Our hypothesis is that formally representing these diverse conceptions in a suite of formal ontologies can help facilitate sharing of information across communities and enhance overall Arctic data interoperability. We present initial work to model operational, research, and Indigenous (Iñupiat and Yup'ik) concepts of sea ice phenomena and data. Our results illustrate important and surprising differences in how these communities describe and represent sea ice, and we describe our approach to resolving incongruities and inconsistencies. We begin by exploring an intriguing information artifact, the World Meteorological Organization "egg code". The egg code is a compact, information rich way of illustrating detailed ice conditions that has been used broadly for a century. There is much agreement on construction and content encoding, but there are important regional differences in its application. Furthermore, it is an analog encoding scheme whose meaning has evolved over time. By semantically modeling the egg code, its subtle variations, and how it connects to other data, we illustrate a mechanism for translating across data formats and representations. But there are limits to what semantically modeling the egg-code can achieve. The egg-code and common operational sea ice formats do not address community needs, notably the timing and processes of sea ice freeze-up and break-up which have profound impact on local hunting, shipping, oil exploration, and safety. We work with local experts from four very different Indigenous communities and scientific creators of sea ice forecasts to establish an understanding of concepts and terminology related to fall freeze-up and spring break up from the individually represented regions. This helps expand our conceptions of sea ice while also aiding in understanding across cultures and communities, and in passing knowledge to younger generations. This is an early step to expanding concepts of interoperability to very different ways of knowing to make data truly relevant and locally useful.
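To make the semantic-modelling target concrete, the sketch below renders one egg-code observation as a typed record using the conventional grouping of total concentration, partial concentrations, stages of development, and forms of ice; the labels are simplified descriptive strings, not WMO code-table values.

    # Simplified structured rendering of a WMO egg-code observation.
    from dataclasses import dataclass
    from typing import Optional, Tuple

    @dataclass
    class EggCode:
        total_concentration: int                  # Ct, in tenths (0-10)
        partial_concentrations: Tuple[int, ...]   # Ca, Cb, Cc per ice type
        stages_of_development: Tuple[str, ...]    # Sa, Sb, Sc
        forms_of_ice: Tuple[Optional[str], ...]   # Fa, Fb, Fc (floe size)

    obs = EggCode(
        total_concentration=9,
        partial_concentrations=(6, 2, 1),
        stages_of_development=("first-year", "young", "new"),  # simplified
        forms_of_ice=("medium floe", "small floe", None),      # simplified
    )
    print(obs)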
Computational Fluid Dynamics Analysis Method Developed for Rocket-Based Combined Cycle Engine Inlet
NASA Technical Reports Server (NTRS)
1997-01-01
Renewed interest in hypersonic propulsion systems has led to research programs investigating combined cycle engines that are designed to operate efficiently across the flight regime. The Rocket-Based Combined Cycle Engine is a propulsion system under development at the NASA Lewis Research Center. This engine integrates a high specific impulse, low thrust-to-weight, airbreathing engine with a low-impulse, high thrust-to-weight rocket. From takeoff to Mach 2.5, the engine operates as an air-augmented rocket. At Mach 2.5, the engine becomes a dual-mode ramjet; and beyond Mach 8, the rocket is turned back on. One Rocket-Based Combined Cycle Engine variation known as the "Strut-Jet" concept is being investigated jointly by NASA Lewis, the U.S. Air Force, Gencorp Aerojet, General Applied Science Labs (GASL), and Lockheed Martin Corporation. Work thus far has included wind tunnel experiments and computational fluid dynamics (CFD) investigations with the NPARC code. The CFD method was initiated by modeling the geometry of the Strut-Jet with the GRIDGEN structured grid generator. Grids representing a subscale inlet model and the full-scale demonstrator geometry were constructed. These grids modeled one-half of the symmetric inlet flow path, including the precompression plate, diverter, center duct, side duct, and combustor. After the grid generation, full Navier-Stokes flow simulations were conducted with the NPARC Navier-Stokes code. The Chien low-Reynolds-number k-e turbulence model was employed to simulate the high-speed turbulent flow. Finally, the CFD solutions were postprocessed with a Fortran code. This code provided wall static pressure distributions, pitot pressure distributions, mass flow rates, and internal drag. These results were compared with experimental data from a subscale inlet test for code validation; then they were used to help evaluate the demonstrator engine net thrust.
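The postprocessing step reduces CFD solutions to a few scalars; a minimal Python analogue of one of them, the mass flow through a cross-flow grid plane (the original Fortran code also produced wall and pitot pressure distributions and internal drag), with placeholder values:

    # Mass flow rate integrated over one grid plane: mdot = sum(rho * u * dA).
    import numpy as np

    def mass_flow_rate(rho, u, cell_areas):
        """rho [kg/m^3], axial velocity u [m/s], cell areas [m^2] -> kg/s."""
        return float(np.sum(rho * u * cell_areas))

    rho = np.array([[1.1, 1.0, 0.9], [1.1, 1.0, 0.9]])            # placeholder
    u = np.array([[250.0, 260.0, 270.0], [250.0, 255.0, 265.0]])  # placeholder
    dA = np.full((2, 3), 4.0e-4)
    print(mass_flow_rate(rho, u, dA))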
Motomura, Kenta; Nakamura, Morikazu; Otaki, Joji M.
2013-01-01
Protein structure and function information is coded in amino acid sequences. However, the relationship between primary sequences and three-dimensional structures and functions remains enigmatic. Our approach to this fundamental biochemistry problem is based on the frequencies of short constituent sequences (SCSs) or words. A protein amino acid sequence is considered analogous to an English sentence, where SCSs are equivalent to words. Availability scores, which are defined as real SCS frequencies in the non-redundant amino acid database relative to their probabilistically expected frequencies, demonstrate the biological usage bias of SCSs. As a result, this frequency-based linguistic approach is expected to have diverse applications, such as secondary structure specifications by structure-specific SCSs and immunological adjuvants with rare or non-existent SCSs. Linguistic similarities (e.g., wide ranges of scale-free distributions) and dissimilarities (e.g., behaviors of low-rank samples) between proteins and the natural English language have been revealed in the rank-frequency relationships of SCSs or words. We have developed a web server, the SCS Package, which contains five applications for analyzing protein sequences based on the linguistic concept. These tools have the potential to assist researchers in deciphering structurally and functionally important protein sites, species-specific sequences, and functional relationships between SCSs. The SCS Package also provides researchers with a tool to construct amino acid sequences de novo based on the idiomatic usage of SCSs. PMID:24688703
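The availability score can be sketched directly from its definition: the observed frequency of each short constituent sequence divided by the frequency expected under independent residue usage. The SCS Package's actual normalization against the non-redundant database may differ; the sequences below are toys.

    # Availability scores for k-length SCSs (toy sequences, k = 2).
    from collections import Counter

    def availability_scores(sequences, k=2):
        aa_counts = Counter(aa for s in sequences for aa in s)
        total_aa = sum(aa_counts.values())
        kmer_counts = Counter(s[i:i + k] for s in sequences
                              for i in range(len(s) - k + 1))
        total_kmers = sum(kmer_counts.values())
        scores = {}
        for kmer, n in kmer_counts.items():
            p = 1.0
            for aa in kmer:  # expected probability under independence
                p *= aa_counts[aa] / total_aa
            scores[kmer] = n / (p * total_kmers)
        return scores

    seqs = ["MKTAYIAKQR", "MKKLLPTAYA"]
    print(sorted(availability_scores(seqs).items())[:5])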
Anguera, M. Teresa; Portell, Mariona; Chacón-Moscoso, Salvador; Sanduvete-Chaves, Susana
2018-01-01
Indirect observation is a recent concept in systematic observation. It largely involves analyzing textual material generated either indirectly from transcriptions of audio recordings of verbal behavior in natural settings (e.g., conversation, group discussions) or directly from narratives (e.g., letters of complaint, tweets, forum posts). It may also feature seemingly unobtrusive objects that can provide relevant insights into daily routines. All these materials constitute an extremely rich source of information for studying everyday life, and they are continuously growing with the burgeoning of new technologies for data recording, dissemination, and storage. Narratives are an excellent vehicle for studying everyday life, and quantitization is proposed as a means of integrating qualitative and quantitative elements. However, this analysis requires a structured system that enables researchers to analyze varying forms and sources of information objectively. In this paper, we present a methodological framework detailing the steps and decisions required to quantitatively analyze a set of data that was originally qualitative. We provide guidelines on study dimensions, text segmentation criteria, ad hoc observation instruments, data quality controls, and coding and preparation of text for quantitative analysis. The quality control stage is essential to ensure that the code matrices generated from the qualitative data are reliable. We provide examples of how an indirect observation study can produce data for quantitative analysis and also describe the different software tools available for the various stages of the process. The proposed method is framed within a specific mixed methods approach that involves collecting qualitative data and subsequently transforming these into matrices of codes (not frequencies) for quantitative analysis to detect underlying structures and behavioral patterns. The data collection and quality control procedures fully meet the requirement of flexibility and provide new perspectives on data integration in the study of biopsychosocial aspects in everyday contexts. PMID:29441028
NDARC NASA Design and Analysis of Rotorcraft. Appendix 5; Theory
NASA Technical Reports Server (NTRS)
Johnson, Wayne
2017-01-01
The NASA Design and Analysis of Rotorcraft (NDARC) software is an aircraft system analysis tool that supports both conceptual design efforts and technology impact assessments. The principal tasks are to design (or size) a rotorcraft to meet specified requirements, including vertical takeoff and landing (VTOL) operation, and then analyze the performance of the aircraft for a set of conditions. For broad and lasting utility, it is important that the code have the capability to model general rotorcraft configurations, and estimate the performance and weights of advanced rotor concepts. The architecture of the NDARC code accommodates configuration flexibility, a hierarchy of models, and ultimately multidisciplinary design, analysis, and optimization. Initially the software is implemented with low-fidelity models, typically appropriate for the conceptual design environment. An NDARC job consists of one or more cases, each case optionally performing design and analysis tasks. The design task involves sizing the rotorcraft to satisfy specified design conditions and missions. The analysis tasks can include off-design mission performance calculation, flight performance calculation for point operating conditions, and generation of subsystem or component performance maps. For analysis tasks, the aircraft description can come from the sizing task, from a previous case or a previous NDARC job, or be independently generated (typically the description of an existing aircraft). The aircraft consists of a set of components, including fuselage, rotors, wings, tails, and propulsion. For each component, attributes such as performance, drag, and weight can be calculated; and the aircraft attributes are obtained from the sum of the component attributes. Description and analysis of conventional rotorcraft configurations is facilitated, while retaining the capability to model novel and advanced concepts. Specific rotorcraft configurations considered are single-main-rotor and tail-rotor helicopter, tandem helicopter, coaxial helicopter, and tiltrotor. The architecture of the code accommodates addition of new or higher-fidelity attribute models for a component, as well as addition of new components.
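The component-sum architecture is simple to sketch: aircraft attributes are the sums of component attributes, and the sizing task iterates on top of such an evaluation. Names and numbers below are illustrative, not NDARC's actual interfaces.

    # Aircraft attributes as sums over component models (illustrative).
    from dataclasses import dataclass

    @dataclass
    class Component:
        name: str
        weight: float     # lb
        drag_area: float  # D/q contribution, ft^2

    class Aircraft:
        def __init__(self, components):
            self.components = components

        def weight(self):
            return sum(c.weight for c in self.components)

        def drag_area(self):
            return sum(c.drag_area for c in self.components)

    heli = Aircraft([
        Component("fuselage", 2500.0, 6.5),
        Component("main rotor", 1800.0, 2.0),
        Component("tail rotor", 250.0, 0.4),
        Component("propulsion", 1200.0, 1.1),
    ])
    print(heli.weight(), heli.drag_area())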
NDARC NASA Design and Analysis of Rotorcraft - Input, Appendix 2
NASA Technical Reports Server (NTRS)
Johnson, Wayne
2016-01-01
The NASA Design and Analysis of Rotorcraft (NDARC) software is an aircraft system analysis tool that supports both conceptual design efforts and technology impact assessments. The principal tasks are to design (or size) a rotorcraft to meet specified requirements, including vertical takeoff and landing (VTOL) operation, and then analyze the performance of the aircraft for a set of conditions. For broad and lasting utility, it is important that the code have the capability to model general rotorcraft configurations, and estimate the performance and weights of advanced rotor concepts. The architecture of the NDARC code accommodates configuration flexibility, a hierarchy of models, and ultimately multidisciplinary design, analysis, and optimization. Initially the software is implemented with low-fidelity models, typically appropriate for the conceptual design environment. An NDARC job consists of one or more cases, each case optionally performing design and analysis tasks. The design task involves sizing the rotorcraft to satisfy specified design conditions and missions. The analysis tasks can include off-design mission performance calculation, flight performance calculation for point operating conditions, and generation of subsystem or component performance maps. For analysis tasks, the aircraft description can come from the sizing task, from a previous case or a previous NDARC job, or be independently generated (typically the description of an existing aircraft). The aircraft consists of a set of components, including fuselage, rotors, wings, tails, and propulsion. For each component, attributes such as performance, drag, and weight can be calculated; and the aircraft attributes are obtained from the sum of the component attributes. Description and analysis of conventional rotorcraft configurations is facilitated, while retaining the capability to model novel and advanced concepts. Specific rotorcraft configurations considered are single-main-rotor and tail-rotor helicopter, tandem helicopter, coaxial helicopter, and tiltrotor. The architecture of the code accommodates addition of new or higher-fidelity attribute models for a component, as well as addition of new components.
NDARC NASA Design and Analysis of Rotorcraft. Appendix 6; Input
NASA Technical Reports Server (NTRS)
Johnson, Wayne
2017-01-01
The NASA Design and Analysis of Rotorcraft (NDARC) software is an aircraft system analysis tool that supports both conceptual design efforts and technology impact assessments. The principal tasks are to design (or size) a rotorcraft to meet specified requirements, including vertical takeoff and landing (VTOL) operation, and then analyze the performance of the aircraft for a set of conditions. For broad and lasting utility, it is important that the code have the capability to model general rotorcraft configurations, and estimate the performance and weights of advanced rotor concepts. The architecture of the NDARC code accommodates configuration flexibility, a hierarchy of models, and ultimately multidisciplinary design, analysis, and optimization. Initially the software is implemented with low-fidelity models, typically appropriate for the conceptual design environment. An NDARC job consists of one or more cases, each case optionally performing design and analysis tasks. The design task involves sizing the rotorcraft to satisfy specified design conditions and missions. The analysis tasks can include off-design mission performance calculation, flight performance calculation for point operating conditions, and generation of subsystem or component performance maps. For analysis tasks, the aircraft description can come from the sizing task, from a previous case or a previous NDARC job, or be independently generated (typically the description of an existing aircraft). The aircraft consists of a set of components, including fuselage, rotors, wings, tails, and propulsion. For each component, attributes such as performance, drag, and weight can be calculated; and the aircraft attributes are obtained from the sum of the component attributes. Description and analysis of conventional rotorcraft configurations is facilitated, while retaining the capability to model novel and advanced concepts. Specific rotorcraft configurations considered are single-main-rotor and tail-rotor helicopter, tandem helicopter, coaxial helicopter, and tiltrotor. The architecture of the code accommodates addition of new or higher-fidelity attribute models for a component, as well as addition of new components.
NDARC NASA Design and Analysis of Rotorcraft
NASA Technical Reports Server (NTRS)
Johnson, Wayne R.
2009-01-01
The NASA Design and Analysis of Rotorcraft (NDARC) software is an aircraft system analysis tool intended to support both conceptual design efforts and technology impact assessments. The principal tasks are to design (or size) a rotorcraft to meet specified requirements, including vertical takeoff and landing (VTOL) operation, and then analyze the performance of the aircraft for a set of conditions. For broad and lasting utility, it is important that the code have the capability to model general rotorcraft configurations, and estimate the performance and weights of advanced rotor concepts. The architecture of the NDARC code accommodates configuration flexibility; a hierarchy of models; and ultimately multidisciplinary design, analysis, and optimization. Initially the software is implemented with low-fidelity models, typically appropriate for the conceptual design environment. An NDARC job consists of one or more cases, each case optionally performing design and analysis tasks. The design task involves sizing the rotorcraft to satisfy specified design conditions and missions. The analysis tasks can include off-design mission performance calculation, flight performance calculation for point operating conditions, and generation of subsystem or component performance maps. For analysis tasks, the aircraft description can come from the sizing task, from a previous case or a previous NDARC job, or be independently generated (typically the description of an existing aircraft). The aircraft consists of a set of components, including fuselage, rotors, wings, tails, and propulsion. For each component, attributes such as performance, drag, and weight can be calculated; and the aircraft attributes are obtained from the sum of the component attributes. Description and analysis of conventional rotorcraft configurations is facilitated, while retaining the capability to model novel and advanced concepts. Specific rotorcraft configurations considered are single-main-rotor and tail-rotor helicopter, tandem helicopter, coaxial helicopter, and tiltrotor. The architecture of the code accommodates addition of new or higher-fidelity attribute models for a component, as well as addition of new components.
NDARC - NASA Design and Analysis of Rotorcraft
NASA Technical Reports Server (NTRS)
Johnson, Wayne
2015-01-01
The NASA Design and Analysis of Rotorcraft (NDARC) software is an aircraft system analysis tool that supports both conceptual design efforts and technology impact assessments. The principal tasks are to design (or size) a rotorcraft to meet specified requirements, including vertical takeoff and landing (VTOL) operation, and then analyze the performance of the aircraft for a set of conditions. For broad and lasting utility, it is important that the code have the capability to model general rotorcraft configurations, and estimate the performance and weights of advanced rotor concepts. The architecture of the NDARC code accommodates configuration flexibility, a hierarchy of models, and ultimately multidisciplinary design, analysis, and optimization. Initially the software is implemented with low-fidelity models, typically appropriate for the conceptual design environment. An NDARC job consists of one or more cases, each case optionally performing design and analysis tasks. The design task involves sizing the rotorcraft to satisfy specified design conditions and missions. The analysis tasks can include off-design mission performance calculation, flight performance calculation for point operating conditions, and generation of subsystem or component performance maps. For analysis tasks, the aircraft description can come from the sizing task, from a previous case or a previous NDARC job, or be independently generated (typically the description of an existing aircraft). The aircraft consists of a set of components, including fuselage, rotors, wings, tails, and propulsion. For each component, attributes such as performance, drag, and weight can be calculated; and the aircraft attributes are obtained from the sum of the component attributes. Description and analysis of conventional rotorcraft configurations is facilitated, while retaining the capability to model novel and advanced concepts. Specific rotorcraft configurations considered are single-main-rotor and tail-rotor helicopter, tandem helicopter, coaxial helicopter, and tiltrotor. The architecture of the code accommodates addition of new or higher-fidelity attribute models for a component, as well as addition of new components.
NDARC NASA Design and Analysis of Rotorcraft Theory Appendix 1
NASA Technical Reports Server (NTRS)
Johnson, Wayne
2016-01-01
The NASA Design and Analysis of Rotorcraft (NDARC) software is an aircraft system analysis tool that supports both conceptual design efforts and technology impact assessments. The principal tasks are to design (or size) a rotorcraft to meet specified requirements, including vertical takeoff and landing (VTOL) operation, and then analyze the performance of the aircraft for a set of conditions. For broad and lasting utility, it is important that the code have the capability to model general rotorcraft configurations, and estimate the performance and weights of advanced rotor concepts. The architecture of the NDARC code accommodates configuration flexibility, a hierarchy of models, and ultimately multidisciplinary design, analysis, and optimization. Initially the software is implemented with low-fidelity models, typically appropriate for the conceptual design environment. An NDARC job consists of one or more cases, each case optionally performing design and analysis tasks. The design task involves sizing the rotorcraft to satisfy specified design conditions and missions. The analysis tasks can include off-design mission performance calculation, flight performance calculation for point operating conditions, and generation of subsystem or component performance maps. For analysis tasks, the aircraft description can come from the sizing task, from a previous case or a previous NDARC job, or be independently generated (typically the description of an existing aircraft). The aircraft consists of a set of components, including fuselage, rotors, wings, tails, and propulsion. For each component, attributes such as performance, drag, and weight can be calculated; and the aircraft attributes are obtained from the sum of the component attributes. Description and analysis of conventional rotorcraft configurations is facilitated, while retaining the capability to model novel and advanced concepts. Specific rotorcraft configurations considered are single-main-rotor and tail-rotor helicopter, tandem helicopter, coaxial helicopter, and tiltrotor. The architecture of the code accommodates addition of new or higher-fidelity attribute models for a component, as well as addition of new components.
National Combustion Code Validated Against Lean Direct Injection Flow Field Data
NASA Technical Reports Server (NTRS)
Iannetti, Anthony C.
2003-01-01
Most combustion processes have, in some way or another, a recirculating flow field. This recirculation stabilizes the reaction zone, or flame, but an unnecessarily large recirculation zone can result in high nitrogen oxide (NOx) values for combustion systems. The size of this recirculation zone is crucial to the performance of state-of-the-art, low-emissions hardware. If this is a large-scale combustion process, the flow field will probably be turbulent and, therefore, three-dimensional. This research dealt primarily with flow fields resulting from lean direct injection (LDI) concepts, as described in Research & Technology 2001. LDI is a concept that depends heavily on the design of the swirler. The LDI concept has the potential to reduce NOx values from 50 to 70 percent of current values, with good flame stability characteristics. It is cost effective and (hopefully) beneficial to do most of the design work for an LDI swirler using computer-aided design (CAD) and computer-aided engineering (CAE) tools. Computational fluid dynamics (CFD) codes are CAE tools that can calculate three-dimensional flows in complex geometries. However, CFD codes are only beginning to correctly calculate the flow fields for complex devices, and the related combustion models usually remove a large portion of the flow physics.
Trombert-Paviot, B; Rodrigues, J M; Rogers, J E; Baud, R; van der Haring, E; Rassinoux, A M; Abrial, V; Clavel, L; Idir, H
1999-01-01
GALEN has developed a new generation of terminology tools based on a language-independent concept reference model that uses a compositional formalism, allowing computer processing and multiple reuses. During the 4th Framework Programme project Galen-In-Use we applied the modelling and the tools to the development of a new multipurpose coding system for surgical procedures (CCAM) in France. On the one hand, we contributed to a language-independent knowledge repository for multicultural Europe. On the other hand, we support the traditional process of creating a new coding system in medicine, which is very labour-intensive, with artificial intelligence tools that use a medically oriented recursive ontology and natural language processing. We used integrated software named CLAW to process the French professional medical language rubrics produced by the national colleges of surgeons into intermediate dissections and then into the Grail reference ontology model representation. From this language-independent concept model representation we generate, on the one hand, controlled French natural language to support finalization of the linguistic labels in relation to the meanings of the conceptual system structure. On the other hand, the third-generation classification manager proves very powerful for retrieving the initial professional rubrics with different categories of concepts within a semantic network.
Overview and Current Status of Analyses of Potential LEU Design Concepts for TREAT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Connaway, H. M.; Kontogeorgakos, D. C.; Papadias, D. D.
2015-10-01
Neutronic and thermal-hydraulic analyses have been performed to evaluate the performance of different low-enriched uranium (LEU) fuel design concepts for the conversion of the Transient Reactor Test Facility (TREAT) from its current high-enriched uranium (HEU) fuel. TREAT is an experimental reactor developed to generate high neutron flux transients for the testing of nuclear fuels. The goal of this work was to identify an LEU design which can maintain the performance of the existing HEU core while continuing to operate safely. A wide variety of design options were considered, with a focus on minimizing peak fuel temperatures and optimizing the power coupling between the TREAT core and test samples. Designs were also evaluated to ensure that they provide sufficient reactivity and shutdown margin for each control rod bank. Analyses were performed using the core loading and experiment configuration of historic M8 Power Calibration experiments (M8CAL). The Monte Carlo code MCNP was utilized for steady-state analyses, and transient calculations were performed with the point kinetics code TREKIN. Thermal analyses were performed with the COMSOL multi-physics code. Using the results of this study, a new LEU Baseline design concept is being established, which will be evaluated in detail in a future report.
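As a generic illustration of the transient model class a point kinetics code such as TREKIN implements, the sketch below integrates the one-delayed-group point kinetics equations for a small reactivity step; all constants are illustrative, not TREAT parameters.

    # One-delayed-group point kinetics, illustrative constants only.
    from scipy.integrate import solve_ivp

    BETA, LAMBDA_GEN, DECAY = 0.007, 1.0e-4, 0.08  # beta, Lambda [s], lambda [1/s]

    def point_kinetics(t, y, rho):
        n, c = y  # relative power, precursor concentration
        dn = (rho(t) - BETA) / LAMBDA_GEN * n + DECAY * c
        dc = BETA / LAMBDA_GEN * n - DECAY * c
        return [dn, dc]

    rho_step = lambda t: 0.003 if t >= 0.1 else 0.0  # +300 pcm step at 0.1 s
    y0 = [1.0, BETA / (LAMBDA_GEN * DECAY)]          # precursor equilibrium
    sol = solve_ivp(point_kinetics, (0.0, 1.0), y0, args=(rho_step,),
                    max_step=1e-3, rtol=1e-8)
    print(sol.y[0, -1])  # relative power at t = 1 s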
Development and application of unified algorithms for problems in computational science
NASA Technical Reports Server (NTRS)
Shankar, Vijaya; Chakravarthy, Sukumar
1987-01-01
A framework is presented for developing computationally unified numerical algorithms for solving nonlinear equations that arise in modeling various problems in mathematical physics. The concept of computational unification is an attempt to encompass efficient solution procedures for computing various nonlinear phenomena that may occur in a given problem. For example, in Computational Fluid Dynamics (CFD), a unified algorithm will be one that allows for solutions to subsonic (elliptic), transonic (mixed elliptic-hyperbolic), and supersonic (hyperbolic) flows for both steady and unsteady problems. The objectives are: development of superior unified algorithms emphasizing accuracy and efficiency aspects; development of codes based on selected algorithms leading to validation; application of mature codes to realistic problems; and extension/application of CFD-based algorithms to problems in other areas of mathematical physics. The ultimate objective is to achieve integration of multidisciplinary technologies to enhance synergism in the design process through computational simulation. Specific unified algorithms are presented for a hierarchy of gas dynamics equations, together with their applications to two other areas: electromagnetic scattering, and laser-materials interaction accounting for melting.
Department of Energy's Virtual Lab Infrastructure for Integrated Earth System Science Data
NASA Astrophysics Data System (ADS)
Williams, D. N.; Palanisamy, G.; Shipman, G.; Boden, T.; Voyles, J.
2014-12-01
The U.S. Department of Energy (DOE) Office of Biological and Environmental Research (BER) Climate and Environmental Sciences Division (CESD) produces a diversity of data, information, software, and model codes across its research and informatics programs and facilities. This information includes raw and reduced observational and instrumentation data, model codes, model-generated results, and integrated data products. Currently, most of this data and information are prepared and shared for program specific activities, corresponding to CESD organization research. A major challenge facing BER CESD is how best to inventory, integrate, and deliver these vast and diverse resources for the purpose of accelerating Earth system science research. This talk provides a concept for a CESD Integrated Data Ecosystem and an initial roadmap for its implementation to address this integration challenge in the "Big Data" domain. Towards this end, a new BER Virtual Laboratory Infrastructure will be presented, which will include services and software connecting the heterogeneous CESD data holdings, and constructed with open source software based on industry standards, protocols, and state-of-the-art technology.
Venco, Paola; Dusi, Sabrina; Valletta, Lorella; Tiranti, Valeria
2014-08-01
NBIA (neurodegeneration with brain iron accumulation) comprises a heterogeneous group of neurodegenerative diseases having as a common denominator iron overload in specific brain areas, mainly the basal ganglia and globus pallidus. In the past decade a number of disease genes have been identified, but NBIA pathomechanisms are still not completely clear. PKAN (pantothenate kinase-associated neurodegeneration), an autosomal recessive disorder with progressive impairment of movement, vision and cognition, is the most common form of NBIA. It is caused by mutations in the PANK2 (pantothenate kinase 2) gene, coding for a mitochondrial enzyme that phosphorylates vitamin B5 in the first reaction of the CoA (coenzyme A) biosynthetic pathway. A distinct form of NBIA, denominated CoPAN (CoA synthase protein-associated neurodegeneration), is caused by mutations in the CoASY (CoA synthase) gene coding for a bifunctional mitochondrial enzyme, which catalyses the final steps of CoA biosynthesis. These two inborn errors of CoA metabolism further support the concept that dysfunctions in CoA synthesis may play a crucial role in the pathogenesis of NBIA.
The Nuremberg Code and the Nuremberg Trial. A reappraisal.
Katz, J
1996-11-27
The Nuremberg Code includes 10 principles to guide physician-investigators in experiments involving human subjects. These principles, particularly the first principle on "voluntary consent," primarily were based on legal concepts because medical codes of ethics existent at the time of the Nazi atrocities did not address consent and other safeguards for human subjects. The US judges who presided over the proceedings did not intend the Code to apply only to the case before them, to be a response to the atrocities committed by the Nazi physicians, or to be inapplicable to research as it is customarily carried on in medical institutions. Instead, a careful reading of the judgment suggests that they wrote the Code for the practice of human experimentation whenever it is being conducted.
Noussa-Yao, Joseph; Heudes, Didier; Escudie, Jean-Baptiste; Degoulet, Patrice
2016-01-01
Short-stay MSO (Medicine, Surgery, Obstetrics) hospitalization activities in public and private hospitals providing public services are funded through charges for the services provided (T2A in French). Coding must be well matched to the severity of the patient's condition, to ensure that appropriate funding is provided to the hospital. We propose the use of an autocompletion process and a multidimensional matrix to help physicians improve the expression of information and optimize clinical coding. With this approach, physicians without knowledge of the encoding rules begin from a rough concept, which is gradually refined through semantic proximity, using information on the associated codes stemming from optimized knowledge bases of diagnosis codes.
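A minimal sketch of the autocompletion idea: candidate diagnosis labels are ranked by prefix overlap with the physician's rough input. The three-entry dictionary and the scoring rule are invented for illustration; the authors' system refines candidates through semantic proximity against optimized ICD-10 knowledge bases.

    # Toy prefix-based autocompletion over a tiny diagnosis dictionary.
    def autocomplete(query, dictionary, limit=5):
        words = query.lower().split()

        def score(label):
            tokens = label.lower().split()
            return sum(any(t.startswith(w) for t in tokens) for w in words)

        ranked = sorted(dictionary.items(), key=lambda kv: score(kv[1]),
                        reverse=True)
        return [(code, label) for code, label in ranked if score(label) > 0][:limit]

    codes = {
        "I21.0": "acute transmural myocardial infarction of anterior wall",
        "I21.9": "acute myocardial infarction, unspecified",
        "I50.0": "congestive heart failure",
    }
    print(autocomplete("acute myocard", codes))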
Flexible digital modulation and coding synthesis for satellite communications
NASA Technical Reports Server (NTRS)
Vanderaar, Mark; Budinger, James; Hoerig, Craig; Tague, John
1991-01-01
An architecture and a hardware prototype of a flexible trellis modem/codec (FTMC) transmitter are presented. The theory of operation is built upon a pragmatic approach to trellis-coded modulation that emphasizes power and spectral efficiency. The system incorporates programmable modulation formats, variations of trellis-coding, digital baseband pulse-shaping, and digital channel precompensation. The modulation formats examined include (uncoded and coded) binary phase shift keying (BPSK), quaternary phase shift keying (QPSK), octal phase shift keying (8PSK), 16-ary quadrature amplitude modulation (16-QAM), and quadrature quadrature phase shift keying (Q squared PSK) at programmable rates up to 20 megabits per second (Mbps). The FTMC is part of the developing test bed to quantify modulation and coding concepts.
NASA Astrophysics Data System (ADS)
Kraljić, K.; Strüngmann, L.; Fimmel, E.; Gumbel, M.
2018-01-01
The genetic code is degenerate, and it is assumed that this redundancy provides error detection and correction mechanisms in the translation process. However, the biological meaning of the code's structure is still under investigation. This paper presents the Genetic Code Analysis Toolkit (GCAT), which provides workflows and algorithms for the analysis of the structure of nucleotide sequences. In particular, sets or sequences of codons can be transformed and tested for circularity, comma-freeness, dichotomic partitions and other properties. GCAT comes with an editor custom-built for working with the genetic code and a batch mode for multi-sequence processing. With the ability to read FASTA files or load sequences from GenBank, the tool can be used for the mathematical and statistical analysis of existing sequence data. GCAT is Java-based and provides a plug-in concept for extensibility. Availability: open source. Homepage: http://www.gcat.bio/
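Comma-freeness, one of the structural properties GCAT tests, has a compact operational definition: no codon of the set may appear in a shifted reading frame of the concatenation of any two codons of the set. A minimal Python sketch of that check (our illustration; GCAT itself is Java-based and its actual algorithms may differ):

    def is_comma_free(codons):
        """A codon set is comma-free if no set codon occurs at offset 1 or 2
        of the concatenation of any two set codons (illustrative check)."""
        s = set(codons)
        for x in codons:
            for y in codons:
                w = x + y  # six letters; inspect the two shifted frames
                if w[1:4] in s or w[2:5] in s:
                    return False
        return True

    print(is_comma_free(["ACG", "TCG"]))  # True
    print(is_comma_free(["ACG", "CGA"]))  # False: "CGA" appears shifted in "ACGACG"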
DOE Office of Scientific and Technical Information (OSTI.GOV)
Epiney, A.; Canepa, S.; Zerkak, O.
The STARS project at the Paul Scherrer Institut (PSI) has adopted the TRACE thermal-hydraulic (T-H) code for best-estimate system transient simulations of the Swiss Light Water Reactors (LWRs). For analyses involving interactions between system and core, a coupling of TRACE with the SIMULATE-3K (S3K) LWR core simulator has also been developed. In this configuration, the TRACE code and associated nuclear power reactor simulation models play a central role to achieve a comprehensive safety analysis capability. Thus, efforts have now been undertaken to consolidate the validation strategy by implementing a more rigorous and structured assessment approach for TRACE applications involving either only system T-H evaluations or requiring interfaces to e.g. detailed core or fuel behavior models. The first part of this paper presents the preliminary concepts of this validation strategy. The principle is to systematically track the evolution of a given set of predicted physical Quantities of Interest (QoIs) over a multidimensional parametric space where each of the dimensions represents the evolution of specific analysis aspects, including e.g. code version, transient-specific simulation methodology and model "nodalisation". If properly set up, such an environment should provide code developers and code users with persistent (less affected by user effect) and quantified information (sensitivity of QoIs) on the applicability of a simulation scheme (codes, input models, methodology) for steady-state and transient analysis of full LWR systems. Through this, for each given transient/accident, critical paths of the validation process can be identified that could then translate into defining reference schemes to be applied for downstream predictive simulations. In order to illustrate this approach, the second part of this paper presents a first application of this validation strategy to an inadvertent blowdown event that occurred in a Swiss BWR/6. The transient was initiated by the spurious actuation of the Automatic Depressurization System (ADS). The validation approach progresses through a number of dimensions here: first, the same BWR system simulation model is assessed for different versions of the TRACE code, up to the most recent one. The second dimension is the "nodalisation" dimension, where changes to the input model are assessed. The third dimension is the "methodology" dimension; in this case imposed power and an updated TRACE core model are investigated. For each step in each validation dimension, a common set of QoIs is investigated. For the steady-state results, these include fuel temperature distributions. For the transient part of the present study, the evaluated QoIs include the system pressure evolution and water carry-over into the steam line.
ERIC Educational Resources Information Center
Henning, Elizabeth
2012-01-01
From the field of developmental psycholinguistics and from conceptual development theory there is evidence that excessive linguistic "code-switching" in early school education may pose some hazards for the learning of young multilingual children. In this article the author addresses the issue, invoking post-Piagetian and neo-Vygotskian…
ERIC Educational Resources Information Center
Adamo-Villani, Nicoletta; Oania, Marcus; Cooper, Stephen
2013-01-01
We report the development and initial evaluation of a serious game that, in conjunction with appropriately designed matching laboratory exercises, can be used to teach secure coding and Information Assurance (IA) concepts across a range of introductory computing courses. The IA Game is a role-playing serious game (RPG) in which the student travels…
Playing Music, Playing with Music: A Proposal for Music Coding in Primary School
ERIC Educational Resources Information Center
Baratè, Adriano; Ludovico, Luca Andrea; Mangione, Giuseppina Rita; Rosa, Alessia
2015-01-01
In this work we will introduce the concept of "music coding," namely a new discipline that employs basic music activities and simplified languages to teach the computational way of thinking to musically untrained children who attend primary school. In this context, music represents both a means and a goal: in fact, from one side…
Brauer, Cletus S
2013-09-01
Should environmental, social, and economic sustainability be of primary concern to engineers? Should social justice be among these concerns? Although the deterioration of our natural environment and the increase in social injustices are among today's most pressing and important issues, engineering codes of ethics and their paramountcy clause, which contains those values most important to engineering and to what it means to be an engineer, do not yet put either concept on a par with the safety, health, and welfare of the public. This paper addresses a recent proposal by Michelfelder and Jones (2011) to include sustainability in the paramountcy clause as a way of rectifying the current disregard for social justice issues in the engineering codes. That proposal builds on a certain notion of sustainability that includes social justice as one of its dimensions and claims that social justice is a necessary condition for sustainability, not vice versa. The relationship between these concepts is discussed, and the original proposal is rejected. Drawing on insights developed throughout the paper, some suggestions are made as to how one should address the different requirements that theory and practice demand of the value taxonomy of professional codes of ethics.
[Conflicts between nursing ethics and health care legislation in Spain].
Gea-Sánchez, Montserrat; Terés-Vidal, Lourdes; Briones-Vozmediano, Erica; Molina, Fidel; Gastaldo, Denise; Otero-García, Laura
2016-01-01
To identify the ethical conflicts that may arise between the nursing codes of ethics and the Royal Decree-law 16/2012 modifying Spanish health regulations. We conducted a review and critical analysis of the discourse of five nursing codes of ethics (Barcelona, Catalonia, Spain, Europe and international), and of the discourse of the Spanish legislation in force in 2013. Language structures referring to five different concepts of the theoretical framework of care were identified in the texts: equity, human rights, right to healthcare, access to care, and continuity of care. The codes of ethics define the function of nursing according to equity, acknowledgement of human rights, right to healthcare, access to care and continuity of care, while the legal discourse hinges on the concept of the beneficiary or insured person. The divergence between the codes of ethics and the legal discourse may produce ethical conflicts that negatively affect nursing practice. The application of RDL 16/2012 promotes a framework of action that prevents nursing professionals from providing care to uninsured groups, which violates human rights and the principles of care ethics. Copyright © 2016 SESPAS. Published by Elsevier España. All rights reserved.
Code of Federal Regulations, 2014 CFR
2014-10-01
... standards of safety, decency, and sanitation and in conformity with applicable codes, specifications and standards. (b) Applicable codes, specifications, and standards shall include any disaster resistant building code that meets the minimum requirements of the National Flood Insurance Program (NFIP) as well as...
Code of Federal Regulations, 2011 CFR
2011-10-01
... standards of safety, decency, and sanitation and in conformity with applicable codes, specifications and standards. (b) Applicable codes, specifications, and standards shall include any disaster resistant building code that meets the minimum requirements of the National Flood Insurance Program (NFIP) as well as...
Code of Federal Regulations, 2012 CFR
2012-10-01
... standards of safety, decency, and sanitation and in conformity with applicable codes, specifications and standards. (b) Applicable codes, specifications, and standards shall include any disaster resistant building code that meets the minimum requirements of the National Flood Insurance Program (NFIP) as well as...
Code of Federal Regulations, 2013 CFR
2013-10-01
... standards of safety, decency, and sanitation and in conformity with applicable codes, specifications and standards. (b) Applicable codes, specifications, and standards shall include any disaster resistant building code that meets the minimum requirements of the National Flood Insurance Program (NFIP) as well as...
Code of Federal Regulations, 2010 CFR
2010-10-01
... standards of safety, decency, and sanitation and in conformity with applicable codes, specifications and standards. (b) Applicable codes, specifications, and standards shall include any disaster resistant building code that meets the minimum requirements of the National Flood Insurance Program (NFIP) as well as...
Application of a Database System for Korean Military Personnel Management.
1987-03-01
[Abstract not legible in the source scan; the recoverable report-documentation and table-of-contents fragments refer to database concepts, tree (hierarchical) relationships, and the relationships between relational and data-processing concepts.]
Cobweb/3: A portable implementation
NASA Technical Reports Server (NTRS)
Mckusick, Kathleen; Thompson, Kevin
1990-01-01
An algorithm is examined for data clustering and incremental concept formation. An overview is given of the Cobweb/3 system and the algorithm on which it is based, as well as the practical details of obtaining and running the system code. The implementation features a flexible user interface which includes a graphical display of the concept hierarchies that the system constructs.
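Cobweb-family systems grow their concept hierarchy by greedily maximizing category utility over nominal attributes. The sketch below implements the standard category-utility formula as a point of reference; it is our illustration, not code taken from Cobweb/3.

    from collections import Counter

    def category_utility(partition):
        """Category utility of a partition of instances with nominal
        attributes; partition is a list of clusters, each a list of
        attribute->value dicts. Plug-in probabilities throughout."""
        all_items = [x for cluster in partition for x in cluster]
        n = len(all_items)
        attrs = all_items[0].keys()

        def sq_sum(items):
            # sum over attributes of sum_j P(attr = value_j)^2
            total = 0.0
            for a in attrs:
                counts = Counter(x[a] for x in items)
                total += sum((c / len(items)) ** 2 for c in counts.values())
            return total

        base = sq_sum(all_items)
        gain = sum(len(c) / n * (sq_sum(c) - base) for c in partition)
        return gain / len(partition)

    clusters = [[{"color": "red", "size": "s"}, {"color": "red", "size": "m"}],
                [{"color": "blue", "size": "l"}]]
    print(category_utility(clusters))  # ~0.39: clustering beats the base rates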
NASA Astrophysics Data System (ADS)
Naumov, D.; Fischer, T.; Böttcher, N.; Watanabe, N.; Walther, M.; Rink, K.; Bilke, L.; Shao, H.; Kolditz, O.
2014-12-01
OpenGeoSys (OGS) is a scientific open-source code for the numerical simulation of thermo-hydro-mechanical-chemical processes in porous and fractured media. Its basic concept is to provide a flexible numerical framework for solving multi-field problems for applications in geoscience and hydrology, e.g. CO2 storage, geothermal power plant forecast simulation, salt water intrusion, and water resources management. Advances in computational mathematics have revolutionized the variety and nature of the problems that environmental scientists and engineers can address today, and intensive code development in recent years now enables the solution of much larger numerical problems and applications. However, solving environmental processes along the water cycle at large scales, such as for complete catchments or reservoirs, remains a computationally challenging task. Therefore, we started a new OGS code development with a focus on execution speed and parallelization. In the new version, a local data structure concept improves instruction and data cache performance by tightly bundling data with an element-wise numerical integration loop. Dedicated analysis methods enable the investigation of memory-access patterns in the local and global assembler routines, which leads to further data structure optimization for an additional performance gain. The concept is presented together with a technical code analysis of the recent development and a large case study including transient flow simulation in the unsaturated/saturated zone of the Thuringian Syncline, Germany. The analysis is performed on a high-resolution mesh (up to 50M elements) with embedded fault structures.
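The local data structure idea can be made concrete with the classic element-wise assembly loop: each element's coordinates and material value are gathered into local variables before the arithmetic runs, so the hot loop touches a small contiguous working set. The 1D sketch below is ours and far simpler than the actual OGS data structures:

    import numpy as np

    def assemble_stiffness(coords, elements, conductivity):
        """Element-wise assembly of a 1D diffusion stiffness matrix with
        linear elements (schematic illustration of the loop structure)."""
        K = np.zeros((len(coords), len(coords)))
        for e, (a, b) in enumerate(elements):
            xa, xb = coords[a], coords[b]      # gather local element data
            k = conductivity[e] / (xb - xa)    # local 2x2 element matrix
            K[a, a] += k; K[b, b] += k
            K[a, b] -= k; K[b, a] -= k         # scatter-add into global K
        return K

    coords = np.linspace(0.0, 1.0, 5)
    elements = [(i, i + 1) for i in range(4)]
    print(assemble_stiffness(coords, elements, np.ones(4)))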
Vo, Elaine; Davila, Jessica A; Hou, Jason; Hodge, Krystle; Li, Linda T; Suliburk, James W; Kao, Lillian S; Berger, David H; Liang, Mike K
2013-08-01
Large databases provide a wealth of information for researchers, but identifying patient cohorts often relies on the use of current procedural terminology (CPT) codes. In particular, studies of stoma surgery have been limited by the accuracy of CPT codes in identifying and differentiating ileostomy procedures from colostomy procedures. It is important to make this distinction because the prevalence of complications associated with stoma formation and reversal differs dramatically between types of stoma. Natural language processing (NLP) is a process that allows text-based searching. The Automated Retrieval Console is NLP-based software that allows investigators to design and perform NLP-assisted document classification. In this study, we evaluated the role of CPT codes and NLP in differentiating ileostomy from colostomy procedures. Using CPT codes, we conducted a retrospective study that identified all patients undergoing a stoma-related procedure at a single institution between January 2005 and December 2011. All operative reports during this time were reviewed manually to abstract the following variables: formation or reversal and ileostomy or colostomy. Sensitivity and specificity for validation of the CPT codes against the master surgery schedule were calculated. Operative reports were evaluated by use of NLP to differentiate ileostomy- from colostomy-related procedures. Sensitivity and specificity for identifying patients with ileostomy or colostomy procedures were calculated for CPT codes and NLP for the entire cohort. CPT codes performed well in identifying stoma procedures (sensitivity 87.4%, specificity 97.5%). A total of 664 stoma procedures were identified by CPT codes between 2005 and 2011. The CPT codes were adequate in identifying stoma formation (sensitivity 97.7%, specificity 72.4%) and stoma reversal (sensitivity 74.1%, specificity 98.7%), but they were inadequate in identifying ileostomy (sensitivity 35.0%, specificity 88.1%) and colostomy (sensitivity 75.2%, specificity 80.9%). NLP performed with greater sensitivity, specificity, and accuracy than CPT codes in identifying stoma procedures and stoma types. Major differences where NLP outperformed CPT included identifying ileostomy (specificity 95.8%, sensitivity 88.3%, and accuracy 91.5%) and colostomy (97.6%, 90.5%, and 92.8%, respectively). CPT codes can effectively identify patients who have had stoma procedures and are adequate in distinguishing between formation and reversal; however, CPT codes cannot differentiate ileostomy from colostomy. NLP can be used to differentiate between ileostomy- and colostomy-related procedures. The role of NLP in conjunction with electronic medical records in data retrieval warrants further investigation. Published by Mosby, Inc.
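The validation statistics quoted above reduce to 2x2 contingency-table arithmetic against the manual chart review; a minimal helper, with purely hypothetical counts (the abstract does not report the raw tables):

    def sens_spec(tp, fp, tn, fn):
        """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
        return tp / (tp + fn), tn / (tn + fp)

    # Hypothetical counts for one code-vs-gold-standard comparison:
    sens, spec = sens_spec(tp=40, fp=5, tn=45, fn=10)
    print(f"sensitivity {sens:.1%}, specificity {spec:.1%}")  # 80.0%, 90.0%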
Billing, coding, and documentation in the critical care environment.
Fakhry, S M
2000-06-01
Optimal conduct of modern-day physician practices involves a thorough understanding and application of the principles of documentation, coding, and billing. Physicians' role in these activities can no longer be secondary. Surgeons practicing critical care must be well versed in these concepts and their effective application to ensure that they are competitive in an increasingly difficult and demanding environment. Health care policies and regulations continue to evolve, mandating constant education of practicing physicians and their staffs and surgical residents who also will have to function in this environment. Close, collaborative relationships between physicians and individuals well versed in the concepts of documentation, coding, and billing are indispensable. Similarly, ongoing educational and review processes (whether internal or consultative from outside sources) not only can decrease the possibility of unfavorable outcomes from audit but also will likely enhance practice efficiency and cash flow. A financially viable practice is certainly a prerequisite for a surgical critical care practice to achieve its primary goal of excellence in patient care.
Inference in the brain: Statistics flowing in redundant population codes
Pitkow, Xaq; Angelaki, Dora E
2017-01-01
It is widely believed that the brain performs approximate probabilistic inference to estimate causal variables in the world from ambiguous sensory data. To understand these computations, we need to analyze how information is represented and transformed by the actions of nonlinear recurrent neural networks. We propose that these probabilistic computations function by a message-passing algorithm operating at the level of redundant neural populations. To explain this framework, we review its underlying concepts, including graphical models, sufficient statistics, and message-passing, and then describe how these concepts could be implemented by recurrently connected probabilistic population codes. The relevant information flow in these networks will be most interpretable at the population level, particularly for redundant neural codes. We therefore outline a general approach to identify the essential features of a neural message-passing algorithm. Finally, we argue that to reveal the most important aspects of these neural computations, we must study large-scale activity patterns during moderately complex, naturalistic behaviors. PMID:28595050
NRA8-21 Cycle 2 RBCC Turbopump Risk Reduction
NASA Technical Reports Server (NTRS)
Ferguson, Thomas V.; Williams, Morgan; Marcu, Bogdan
2004-01-01
This project was composed of three sub-tasks. The objective of the first task was to use the CFD code INS3D to generate both on- and off-design predictions for the consortium-optimized impeller flowfield. The results of the flow simulations are given in the first section. The objective of the second task was to construct a turbomachinery testing database comprising measurements made on several different impellers, an inducer, and a diffuser. The data were in the form of static pressure measurements as well as laser velocimeter measurements of velocities and flow angles within the stated components. Several databases with this information were created for these components. The third subtask objective was twofold: first, to validate the Enigma CFD code for pump diffuser analysis, and second, to perform steady and unsteady analyses on some wide-flow-range diffuser concepts using Enigma. The code was validated using the consortium-optimized impeller database and then applied to two different concepts for wide-flow diffusers.
The IHMC CmapTools software in research and education: a multi-level use case in Space Meteorology
NASA Astrophysics Data System (ADS)
Messerotti, Mauro
2010-05-01
The IHMC (Institute for Human and Machine Cognition, Florida University System, USA) CmapTools software is a powerful multi-platform tool for knowledge modelling in graphical form based on concept maps. In this work we present its application to the high-level development of a set of multi-level concept maps in the framework of Space Meteorology, to act as the kernel of a space meteorology domain ontology. This is an example of a research use case, as a domain ontology coded in machine-readable form via e.g. OWL (Web Ontology Language) is suitable to be an active layer of any knowledge management system embedded in a Virtual Observatory (VO). Apart from being manageable at machine level, concept maps developed via CmapTools are intrinsically human-readable and can embed hyperlinks and objects of many kinds. Therefore they are suitable to be published on the web: the coded knowledge can be exploited for educational purposes by students and the public, as the level of information can be naturally organized among linked concept maps in progressively increasing complexity levels. Hence CmapTools and its advanced version COE (Concept-map Ontology Editor) represent effective and user-friendly software tools for high-level knowledge representation in research and education.
Pragmatic turn in biology: From biological molecules to genetic content operators.
Witzany, Guenther
2014-08-26
Erwin Schrödinger's question "What is life?" was for decades answered with "physics + chemistry". The concepts of Alan Turing and John von Neumann introduced a third term: "information". This led to the understanding of nucleic acid sequences as a natural code. Manfred Eigen adapted Hamming's concept of "sequence space". Similar to Hilbert space, in which every ontological entity can be defined by an unequivocal point in a mathematical axiomatic system, in the abstract "sequence space" concept each point represents a unique syntactic structure, and the value of the separation between points represents their dissimilarity. In this concept, molecular features of the genetic code evolve by means of self-organisation of matter. Biological selection determines the fittest types among varieties of replication errors of quasi-species. The quasi-species concept dominated evolution theory for many decades. In contrast to this, recent empirical data on the evolution of DNA and its forerunners, the RNA world and viruses, indicate cooperative agent-based interactions. Group behaviour of quasi-species consortia constitutes de novo and arranges available genetic content for adaptational purposes within real-life contexts that determine epigenetic markings. This review focuses on some fundamental changes in biology, discarding its traditional status as a subdiscipline of physics and chemistry.
Telemetry: Summary of concept and rationale
NASA Astrophysics Data System (ADS)
1987-12-01
This report presents the concept and supporting rationale for the telemetry system developed by the Consultative Committee for Space Data Systems (CCSDS). The concepts, protocols and data formats developed for the telemetry system are designed for flight and ground data systems supporting conventional, contemporary free-flyer spacecraft. Data formats are designed with efficiency as a primary consideration, i.e., format overhead is minimized. The results reflect the consensus of experts from many space agencies. An overview of the CCSDS telemetry system introduces the notion of architectural layering to achieve transparent and reliable delivery of scientific and engineering sensor data (generated aboard space vehicles) to users located in space or on Earth. The system is broken down into two major conceptual categories: a packet telemetry concept and a telemetry channel coding concept. Packet telemetry facilitates data transmission from source to user in a standardized and highly automated manner. It provides a mechanism for implementing common data structures and protocols which can enhance the development and operation of space mission systems. Telemetry channel coding is a method by which data can be sent from a source to a destination by processing it in such a way that distinct messages are created which are easily distinguishable from one another. This allows reconstruction of the data with low error probability, thus improving the performance of the channel.
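Channel coding in this sense maps short source words to longer channel words that are mutually far apart, so a corrupted word can still be resolved to the nearest valid one. A toy illustration with the textbook Hamming(7,4) code (a generic example of the principle, not one of the CCSDS-recommended coding schemes):

    import numpy as np

    # Generator matrix of the (7,4) Hamming code: 4 data bits -> 7 channel
    # bits; any two codewords differ in at least 3 positions, so any
    # single-bit error can be corrected.
    G = np.array([[1, 0, 0, 0, 1, 1, 0],
                  [0, 1, 0, 0, 1, 0, 1],
                  [0, 0, 1, 0, 0, 1, 1],
                  [0, 0, 0, 1, 1, 1, 1]])

    def encode(data4):
        return np.mod(np.array(data4) @ G, 2)

    print(encode([1, 0, 1, 1]))  # -> [1 0 1 1 0 1 0], one of 16 distant codewords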
DOE Office of Scientific and Technical Information (OSTI.GOV)
Palmer, M.E.
1997-12-05
This V&V report includes analysis of two revisions of the DMS [data management system] System Requirements Specification (SRS) and the Preliminary System Design Document (PSDD); the source code for the DMS Communication Module (DMSCOM) messages; the source code for selected DMS screens; and the code for the BWAS simulator. BDM Federal analysts used a series of matrices to: compare the requirements in the System Requirements Specification (SRS) to the specifications found in the System Design Document (SDD), to ensure the design supports the business functions; compare the discrete parts of the SDD with each other, to ensure that the design is consistent and cohesive; compare the source code of the DMS Communication Module with the specifications, to ensure that the resultant messages will support the design; compare the source code of selected screens to the specifications, to ensure that the resultant system screens will support the design; and compare the source code of the BWAS simulator with the requirements to interface with DMS messages and data transfers relating to the BWAS operations.
Decoding "us" and "them": Neural representations of generalized group concepts.
Cikara, Mina; Van Bavel, Jay J; Ingbretsen, Zachary A; Lau, Tatiana
2017-05-01
Humans form social coalitions in every society on earth, yet we know very little about how the general concepts us and them are represented in the brain. Evolutionary psychologists have argued that the human capacity for group affiliation is a byproduct of adaptations that evolved for tracking coalitions in general. These theories suggest that humans possess a common neural code for the concepts in-group and out-group, regardless of the category by which group boundaries are instantiated. The authors used multivoxel pattern analysis to identify the neural substrates of generalized group concept representations. They trained a classifier to encode how people represented the most basic instantiation of a specific social group (i.e., arbitrary teams created in the lab with no history of interaction or associated stereotypes) and tested how well the neural data decoded membership along an objectively orthogonal, real-world category (i.e., political parties). The dorsal anterior cingulate cortex/middle cingulate cortex and anterior insula were associated with representing groups across multiple social categories. Restricting the analyses to these regions in a separate sample of participants performing an explicit categorization task, the authors replicated cross-categorization classification in anterior insula. Classification accuracy across categories was driven predominantly by the correct categorization of in-group targets, consistent with theories indicating in-group preference is more central than out-group derogation to group perception and cognition. These findings highlight the extent to which social group concepts rely on domain-general circuitry associated with encoding stimuli's functional significance. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
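The cross-categorization logic (train a pattern classifier on labels from one group dimension, test it on an orthogonal one) is easy to sketch with scikit-learn on synthetic data; every array below is a stand-in, and the shared signal axis is built in by construction:

    import numpy as np
    from sklearn.svm import LinearSVC

    rng = np.random.default_rng(0)
    n_trials, n_voxels = 80, 200
    shared_axis = rng.normal(size=n_voxels)   # common "us vs. them" code
    team = rng.integers(0, 2, n_trials)       # trained dimension
    party = rng.integers(0, 2, n_trials)      # orthogonal test dimension
    X_team = rng.normal(size=(n_trials, n_voxels)) + np.outer(2 * team - 1, shared_axis)
    X_party = rng.normal(size=(n_trials, n_voxels)) + np.outer(2 * party - 1, shared_axis)

    clf = LinearSVC().fit(X_team, team)
    print("cross-category accuracy:", clf.score(X_party, party))  # well above 0.5

Above-chance transfer here follows from the shared axis we injected; in the study, finding such transfer in anterior insula is what licenses the claim of a generalized group code.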
Maharaj, S I; Rodin, G M; Olmsted, M P; Connolly, J A; Daneman, D
2003-04-01
This study examined the relative contribution of adolescent self-concept, maternal weight and shape concerns (WSC), and mother-daughter relationships to eating disturbances among girls with type 1 diabetes mellitus (DM). Eighty-eight adolescent girls (mean = 15.0 years, S.D. = 2.2) and their mothers completed self-report measures of disordered eating and weight control behaviours, with teens also reporting on disturbed eating and body attitudes. Based on reported symptoms, adolescents were classified as highly (N = 18), mildly (N = 30) and non-eating disturbed (N = 40). Self-concept was assessed by adolescent self-report. Mother-daughter relationships were assessed by adolescent self-report and by observed mother-daughter interactions that were rated using a macroanalytic coding system that assesses intimacy and autonomy in these relationships. Hierarchical regressions illustrated that adolescent self-concept deficits, maternal WSC, and impaired mother-daughter relationships significantly predicted eating disturbances in girls with DM, accounting for 57% of the variance. Mothers who engaged in dieting and binge-eating were more impaired in their ability to support their daughters' emerging autonomy. The quality of mother-daughter relationships partly mediated the influence of maternal WSC on adolescent eating disturbances. Moreover, the impact of maternal WSC and mother-daughter relationships on eating disturbances was mediated by adolescent self-concept. Findings illustrate two pathways through which mother-daughter relationships may impact upon risk of eating disturbances in girls with DM and highlight the need to evaluate family-based interventions specifically tailored for this high-risk population.
Green, Nancy
2005-04-01
We developed a Bayesian network coding scheme for annotating biomedical content in layperson-oriented clinical genetics documents. The coding scheme supports the representation of probabilistic and causal relationships among concepts in this domain, at a high enough level of abstraction to capture commonalities among genetic processes and their relationship to health. We are using the coding scheme to annotate a corpus of genetic counseling patient letters as part of the requirements analysis and knowledge acquisition phase of a natural language generation project. This paper describes the coding scheme and presents an evaluation of intercoder reliability for its tag set. In addition to giving examples of use of the coding scheme for analysis of discourse and linguistic features in this genre, we suggest other uses for it in analysis of layperson-oriented text and dialogue in medical communication.
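Intercoder reliability for a tag set is commonly summarized with chance-corrected agreement such as Cohen's kappa; the abstract does not name the statistic used, so take the following only as a generic illustration:

    from collections import Counter

    def cohens_kappa(coder_a, coder_b):
        """Chance-corrected agreement between two coders' tag sequences."""
        n = len(coder_a)
        p_obs = sum(a == b for a, b in zip(coder_a, coder_b)) / n
        ca, cb = Counter(coder_a), Counter(coder_b)
        p_exp = sum(ca[t] * cb[t] for t in set(ca) | set(cb)) / n ** 2
        return (p_obs - p_exp) / (1 - p_exp)

    # Two coders tagging five text segments (toy tags):
    print(cohens_kappa(list("AABBC"), list("AABCC")))  # ~0.71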
A proto-code of ethics and conduct for European nurse directors.
Stievano, Alessandro; De Marinis, Maria Grazia; Kelly, Denise; Filkins, Jacqueline; Meyenburg-Altwarg, Iris; Petrangeli, Mauro; Tschudin, Verena
2012-03-01
The proto-code of ethics and conduct for European nurse directors was developed as a strategic and dynamic document for nurse managers in Europe. It invites critical dialogue, reflective thinking about different situations, and the development of specific codes of ethics and conduct by nursing associations in different countries. The term proto-code is used for this document so that specifically country-orientated or organization-based and practical codes can be developed from it to guide professionals in more particular or situation-explicit reflection and values. The proto-code of ethics and conduct for European nurse directors was designed and developed by the European Nurse Directors Association's (ENDA) advisory team. This article gives short explanations of the code's preamble and two main parts: Nurse directors' ethical basis, and Principles of professional practice, which is divided into six specific points: competence, care, safety, staff, life-long learning and multi-sectorial working.
Methodology, status and plans for development and assessment of TUF and CATHENA codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Luxat, J.C.; Liu, W.S.; Leung, R.K.
1997-07-01
An overview is presented of the Canadian two-fluid computer codes TUF and CATHENA, with specific focus on the constraints imposed during development of these codes and the areas of application for which they are intended. Additionally, a process for systematic assessment of these codes is described, which is part of a broader, industry-based initiative for validation of computer codes used in all major disciplines of safety analysis. This is intended to provide both the licensee and the regulator in Canada with an objective basis for assessing the adequacy of codes for use in specific applications. Although focused specifically on CANDU reactors, Canadian experience in developing advanced two-fluid codes to meet wide-ranging application needs while maintaining past investment in plant modelling provides a useful contribution to international efforts in this area.
Rate-compatible punctured convolutional codes (RCPC codes) and their applications
NASA Astrophysics Data System (ADS)
Hagenauer, Joachim
1988-04-01
The concept of punctured convolutional codes is extended by puncturing a low-rate 1/N code periodically with period P to obtain a family of codes with rate P/(P + l), where l can be varied between 1 and (N - 1)P. A rate-compatibility restriction on the puncturing tables ensures that all code bits of the high-rate codes are used by the lower-rate codes. This allows transmission of incremental redundancy in ARQ/FEC (automatic repeat request/forward error correction) schemes and continuous rate variation to change from low to high error protection within a data frame. Families of RCPC codes with rates between 8/9 and 1/4 are given for memories M from 3 to 6 (8 to 64 trellis states), together with the relevant distance spectra. These codes are almost as good as the best known general convolutional codes of the respective rates. It is shown that the same Viterbi decoder can be used for all RCPC codes of the same M. The application of RCPC codes to hybrid ARQ/FEC schemes is discussed for Gaussian and Rayleigh fading channels using channel-state information to optimize throughput.
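A puncturing table is just a 0/1 mask over one period of the mother code's output; the rate-compatibility restriction means every bit kept by a high-rate code is also kept by all lower-rate codes. A minimal sketch (the tables below are illustrative stand-ins, not Hagenauer's published RCPC tables, and transmission ordering is ignored):

    import numpy as np

    # Mother code: rate 1/2 (N = 2), puncturing period P = 4.
    # Ones mark transmitted bits; each higher-rate table's ones are a
    # subset of the next lower-rate table's ones (rate compatibility).
    tables = {
        "4/5": np.array([[1, 1, 1, 1], [1, 0, 0, 0]]),  # keep 5 bits/period
        "4/6": np.array([[1, 1, 1, 1], [1, 0, 1, 0]]),  # keep 6 -> rate 2/3
        "4/8": np.array([[1, 1, 1, 1], [1, 1, 1, 1]]),  # mother code, 1/2
    }

    def puncture(coded, table):
        """coded: (2, k) mother-code output; returns the surviving bits."""
        mask = np.tile(table, (1, coded.shape[1] // table.shape[1]))
        return coded[mask.astype(bool)]

    coded = np.random.default_rng(1).integers(0, 2, size=(2, 8))
    print(puncture(coded, tables["4/6"]))  # 12 of 16 bits survive: rate 2/3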
Toward Right-Fidelity Rotorcraft Conceptual Design
NASA Technical Reports Server (NTRS)
Sinsay, Jeffrey D.; Johnson, Wayne
2010-01-01
The aviation Advanced Design Office (ADO) of the US Army Aeroflightdynamics Directorate (AMRDEC) performs conceptual design of advanced Vertical Takeoff and Landing (VTOL) concepts in support of the Army's development and acquisition of new aviation systems. In particular, ADO engages in system synthesis to assess the impact of new technologies and their application to satisfy emerging warfighter needs and requirements. Fundamental to ADO's success in accomplishing its role is the ability to evaluate a wide array of proposed air vehicle concepts and independently synthesize new concepts to inform Army and DoD decision makers about the tradespace in which decisions will be made (Figure 1). ADO utilizes a conceptual design (CD) process in the execution of its role. Benefiting from co-location with NASA rotorcraft researchers at the Ames Research Center, ADO and NASA have surveyed current rotorcraft CD practices and begun the process of improving those capabilities to enable effective design and development of the next generation of VTOL systems. A unique aspect of CD in ADO is the fact that actual designs developed in-house are not intended to move forward in the development process. Rather, they are used as reference points in discussions about requirements development and technology impact. The ultimate products of ADO CD efforts are technology impact assessments and specifications which guide industry design activity. The fact that both the requirement and the design are variables in the tradespace adds to the complexity of the CD process. A frequent need is the ability to assess the relative "cost" of variations in requirements for a diverse set of VTOL configurations. Each of these configurations may have fundamentally different response characteristics to a given requirement variation, and insight into how different requirements drive different designs is a critical insight ADO attempts to provide decision makers. The processes and tools utilized are driven by the timeline in which questions must be answered. This can range from quick "back-of-the-envelope" assessments of a configuration made in an afternoon, to more detailed tradespace explorations that can take upwards of a year to complete. A variety of spreadsheet-based tools and conceptual design codes are currently in use. The in-house developed conceptual sizing code RC (Rotorcraft) has been the preferred tool for CD activity for a number of years. Figure 2 illustrates the long-standing coupling between RC and solid modeling tools for layout, as well as a number of ad hoc interfaces with external analyses. RC contains a sizing routine that is built around the use of momentum theory for rotors, classic finite wing theory, a referred-parameter engine model, and semi-empirical weight estimation techniques. These methods lend themselves to rapid solutions, measured in seconds and minutes. The successful use of RC, however, requires careful consideration of model input parameters and judicious comparison with existing aircraft to avoid unjustified extrapolation of results. RC is in fact the legacy of a series of codes whose development started in the early 1970s, and is best suited to the study of conventional helicopters and XV-15-style tiltrotors. Other concepts have been analyzed with RC, but typically it became necessary to modify the source code and methods for each unique configuration. Recent activity has led to the development of a new code, NASA Design and Analysis of Rotorcraft (NDARC).
NDARC uses a level of analytical fidelity similar to that of RC, but is built on a new framework intended to improve modularity and the ability to rapidly model a wider array of concepts. Critical to achieving this capability is the decomposition of the aircraft system into a series of fundamental components which can then be assembled to form a wide array of configurations. The paper will provide an overview of NDARC and its capabilities.
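The "momentum theory for rotors" that anchors such sizing codes boils down, in hover, to the ideal induced power divided by a figure of merit. A back-of-the-envelope Python sketch (the formula is the standard one; the default figure of merit and the example numbers are illustrative assumptions, not RC or NDARC values):

    import math

    def hover_power_kw(gross_mass_kg, rotor_radius_m, rho=1.225, fom=0.75):
        """Momentum-theory hover power: P = T^1.5 / sqrt(2*rho*A) / FM."""
        thrust_n = gross_mass_kg * 9.81
        disk_area = math.pi * rotor_radius_m ** 2
        p_ideal_w = thrust_n ** 1.5 / math.sqrt(2.0 * rho * disk_area)
        return p_ideal_w / fom / 1000.0

    # e.g. a 2200 kg helicopter with a 5.3 m rotor radius:
    print(round(hover_power_kw(2200, 5.3)))  # roughly 290 kW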
Introduction to the Natural Anticipator and the Artificial Anticipator
NASA Astrophysics Data System (ADS)
Dubois, Daniel M.
2010-11-01
This short communication introduces the concept of the anticipator, one who anticipates, in the framework of computing anticipatory systems. The definition of anticipation deals with the concept of a program. Indeed, the word "program" comes from "pro-gram", meaning "to write before", by anticipation, and denotes a plan for the programming of a mechanism, or a sequence of coded instructions that can be inserted into a mechanism, or a sequence of coded instructions, such as genes or behavioural responses, that is part of an organism. Any natural or artificial program is thus related to anticipatory rewriting systems, as shown in this paper. All the cells in the body, and the neurons in the brain, are programmed by the anticipatory genetic code, DNA, in a low-level language with four signs. The programs in computers are also computing anticipatory systems. It will be shown, on the one hand, that the genetic code DNA is a natural anticipator. As demonstrated by Nobel laureate McClintock [8], genomes are programmed. The fundamental program deals with the DNA genetic code. The properties of DNA consist in self-replication and self-modification. The self-replicating process leads to reproduction of the species, while the self-modifying process leads to new species, i.e., evolution, and to adaptation in existing ones. The genetic code DNA keeps its instructions in memory in the DNA coding molecule. The genetic code DNA is a rewriting system, from the DNA coding molecule to the DNA template molecule. The DNA template molecule is a rewriting system to the messenger RNA molecule. The information is not destroyed during the execution of the rewriting program. On the other hand, it will be demonstrated that the Turing machine is an artificial anticipator. The Turing machine is a rewriting system: the head reads and writes, modifying the content of the tape. The information is destroyed during the execution of the program. This is an irreversible process. The input data are lost.
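The contrast drawn between the two rewriting systems can be made tangible with a few lines of code: a Turing-machine step overwrites the tape symbol it read, whereas transcription leaves the DNA template intact. The toy machine below (our own, purely illustrative) flips a bit string and halts at a marker:

    def run_turing(tape, rules, state="s", pos=0, max_steps=100):
        """rules: (state, symbol) -> (new_state, new_symbol, move).
        Writing new_symbol over the old one irreversibly discards the
        previous tape content, in contrast to DNA-to-mRNA rewriting,
        which leaves the DNA molecule intact."""
        tape = list(tape)
        for _ in range(max_steps):
            if state == "halt":
                break
            state, tape[pos], move = rules[(state, tape[pos])]
            pos += move
        return "".join(tape)

    rules = {("s", "0"): ("s", "1", 1),     # flip bits left to right...
             ("s", "1"): ("s", "0", 1),
             ("s", "#"): ("halt", "#", 0)}  # ...halt at the marker
    print(run_turing("0110#", rules))  # -> "1001#"; the input "0110" is gone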
Local active information storage as a tool to understand distributed neural information processing
Wibral, Michael; Lizier, Joseph T.; Vögler, Sebastian; Priesemann, Viola; Galuske, Ralf
2013-01-01
Every act of information processing can in principle be decomposed into the component operations of information storage, transfer, and modification. Yet, while this is easily done for today's digital computers, the application of these concepts to neural information processing has been hampered by the lack of proper mathematical definitions of these operations on information. Recently, definitions were given for the dynamics of these information processing operations on a local scale in space and time in a distributed system, and the specific concept of local active information storage (LAIS) was successfully applied to the analysis and optimization of artificial neural systems. However, no attempt to measure the space-time dynamics of local active information storage in neural data has been made to date. Here we measure local active information storage on a local scale in time and space in voltage-sensitive dye imaging data from area 18 of the cat. We show that storage reflects neural properties such as stimulus preferences and surprise upon unexpected stimulus change, and in area 18 reflects the abstract concept of an ongoing stimulus despite the locally random nature of this stimulus. We suggest that LAIS will be a useful quantity to test theories of cortical function, such as predictive coding. PMID:24501593
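For discrete data, local active information storage at one time point is the log-ratio of the history-conditioned next-symbol probability to its marginal probability. A plug-in estimator is only a few lines; this generic sketch (ours) uses simple counting and is biased for short series, unlike the more careful estimators such studies require:

    import math
    from collections import Counter

    def local_ais(x, k=2):
        """a(n) = log2[ p(x_n | x_{n-k}..x_{n-1}) / p(x_n) ], plug-in estimate."""
        pasts = [tuple(x[i - k:i]) for i in range(k, len(x))]
        nexts = [x[i] for i in range(k, len(x))]
        joint, hist, marg = Counter(zip(pasts, nexts)), Counter(pasts), Counter(nexts)
        n = len(nexts)
        return [math.log2((joint[(p, s)] / hist[p]) / (marg[s] / n))
                for p, s in zip(pasts, nexts)]

    # A perfectly alternating series is fully predictable from its past:
    print(local_ais([0, 1, 0, 1, 0, 1, 0, 1, 0, 1]))  # 1.0 bit everywhere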
Solid Geometric Modeling - The Key to Improved Materiel Acquisition from Concept to Deployment
1984-09-01
M. J. Reisinger, "The GIFT Code User Manual; Volume I, Introduction and Input Requirements (U)," BRL Report No. 1802, July 1975 (AD# A078364). G. G. Kuehl, L. W. Bain, Jr., M. J. Reisinger, "The GIFT Code User Manual; Volume II, The Output Options (U)," USA ARRADCOM Report No. 02189, September 1979 (AD# A078364). These results are plotted by a code called RunShot, written by L. M. Rybak, which takes input from GIFT and plots color shotlines on a…
NASA Technical Reports Server (NTRS)
Hall, Edward J.; Topp, David A.; Heidegger, Nathan J.; Delaney, Robert A.
1994-01-01
The focus of this task was to validate the ADPAC code for heat transfer calculations. To accomplish this goal, the ADPAC code was modified to allow for a Cartesian coordinate system capability and to add boundary conditions to handle spanwise periodicity and transpiration boundaries. This user's manual describes how to use the ADPAC code as developed in Task 5, NAS3-25270, including the modifications made to date in Tasks 7 and 8, NAS3-25270.
Three-Dimensional Algebraic Models of the tRNA Code and 12 Graphs for Representing the Amino Acids
José, Marco V.; Morgado, Eberto R.; Guimarães, Romeu Cardoso; Zamudio, Gabriel S.; de Farías, Sávio Torres; Bobadilla, Juan R.; Sosa, Daniela
2014-01-01
Three-dimensional algebraic models, also called Genetic Hotels, are developed to represent the Standard Genetic Code, the Standard tRNA Code (S-tRNA-C), and the Human tRNA code (H-tRNA-C). New algebraic concepts are introduced to be able to describe these models, to wit, the generalization of the 2^n-Klein Group and the concept of a subgroup coset with a tail. We found that the H-tRNA-C displayed broken symmetries in regard to the S-tRNA-C, which is highly symmetric. We also show that there are only 12 ways to represent each of the corresponding phenotypic graphs of amino acids. The averages of statistical centrality measures of the 12 graphs for each of the three codes are computed and statistically compared. The phenotypic graphs of the S-tRNA-C display a common triangular prism of amino acids in 10 out of the 12 graphs, whilst the corresponding graphs for the H-tRNA-C display only two triangular prisms. The graphs exhibit disjoint clusters of amino acids when their polar requirement values are used. We contend that the S-tRNA-C is in a frozen-like state, whereas the H-tRNA-C may be in an evolving state. PMID:25370377
Death of a dogma: eukaryotic mRNAs can code for more than one protein
Mouilleron, Hélène; Delcourt, Vivian; Roucou, Xavier
2016-01-01
mRNAs carry the genetic information that is translated by ribosomes. The traditional view of a mature eukaryotic mRNA is a molecule with three main regions, the 5′ UTR, the protein coding open reading frame (ORF) or coding sequence (CDS), and the 3′ UTR. This concept assumes that ribosomes translate one ORF only, generally the longest one, and produce one protein. As a result, in the early days of genomics and bioinformatics, one CDS was associated with each protein-coding gene. This fundamental concept of a single CDS is being challenged by increasing experimental evidence indicating that annotated proteins are not the only proteins translated from mRNAs. In particular, mass spectrometry (MS)-based proteomics and ribosome profiling have detected productive translation of alternative open reading frames. In several cases, the alternative and annotated proteins interact. Thus, the expression of two or more proteins translated from the same mRNA may offer a mechanism to ensure the co-expression of proteins which have functional interactions. Translational mechanisms already described in eukaryotic cells indicate that the cellular machinery is able to translate different CDSs from a single viral or cellular mRNA. In addition to summarizing data showing that the protein coding potential of eukaryotic mRNAs has been underestimated, this review aims to challenge the single translated CDS dogma. PMID:26578573
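The claim is easy to appreciate mechanically: scanning the three reading frames of even a short sequence turns up multiple, overlapping start-to-stop stretches. A toy ORF scanner (illustrative only; it ignores Kozak context, minimum-length conventions, and everything else real annotation pipelines use):

    def find_orfs(seq, min_codons=2):
        """Return (frame, start, end) for each ATG...stop stretch."""
        stops = {"TAA", "TAG", "TGA"}
        orfs = []
        for frame in range(3):
            for i in range(frame, len(seq) - 2, 3):
                if seq[i:i + 3] == "ATG":
                    for j in range(i + 3, len(seq) - 2, 3):
                        if seq[j:j + 3] in stops:
                            if (j - i) // 3 >= min_codons:
                                orfs.append((frame, i, j + 3))
                            break
        return orfs

    mrna = "GGATGAAATGGCCTTAGCCATGTTTTGA"  # made-up sequence
    print(find_orfs(mrna))  # overlapping ORFs in two different frames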
DeVincenzo, John P
2009-10-01
A revolution in the understanding of RNA biological processing and control is leading to revolutionary new concepts in human therapeutics. It has become increasingly clear that so-called "non-coding RNA" exerts specific and profound functional control on the regulation of protein production and indeed controls the expression of all genes. Harnessing this naturally occurring RNA-mediated regulation of protein production has immense human therapeutic potential. These processes are collectively known as RNA interference (RNAi). RNAi is a recently discovered, naturally occurring intracellular process that regulates gene expression through the silencing of specific mRNAs. Methods of harnessing this natural pathway are being developed that allow the catalytic degradation of targeted mRNAs using specifically designed complementary small inhibitory RNAs (siRNA). siRNAs are being chemically modified to acquire drug-like properties. Numerous recent high-profile publications have provided proofs of concept that RNA interference may be useful therapeutically. Much of the design of these siRNAs can be accomplished bioinformatically, thus potentially expediting drug discovery and opening new avenues of therapy for many uncommon, orphan, or emerging diseases. This makes the approach very attractive for developing therapies targeting orphan diseases, including neonatal diseases. Theoretically, any disease that can be ameliorated through knockdown of any endogenous or exogenous protein is a potential therapeutic target for RNAi-based therapeutics. Lung diseases are particularly attractive targets for RNAi therapeutics since the affected cells' location increases their accessibility to topical administration of siRNA, for example by aerosol. Respiratory viral infections and chronic lung disease are examples of such diseases. RNAi therapeutics have been shown to be active against RSV, parainfluenza and human metapneumoviruses in vitro and in vivo, resulting in profound antiviral effects. The first proof-of-concept test of efficacy of an RNAi-based therapeutic in man has been initiated. A discussion of the science behind RNA interference is followed by a presentation of the potential practical issues in applying this technology to neonatal respiratory viral diseases. RNAi may offer new strategies for the treatment of a variety of orphan diseases including neonatal diseases, RSV infections, and other respiratory viruses.
Gap Analysis of Material Properties Data for Ferritic/Martensitic HT-9 Steel
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Neil R.; Serrano De Caro, Magdalena; Rodriguez, Edward A.
2012-08-28
The US Department of Energy (DOE), Office of Nuclear Energy (NE), is supporting the development of an ASME Code Case for adoption of 12Cr-1Mo-VW ferritic/martensitic (F/M) steel, commonly known as HT-9, primarily for use in elevated-temperature design of liquid-metal fast reactors (LMFR) and components. In 2011, Los Alamos National Laboratory (LANL) nuclear engineering staff began assisting in the development of a small modular reactor (SMR) design concept, previously known as the Hyperion Module, now called the Gen4 Module. LANL staff immediately proposed HT-9 for the reactor vessel and components, as well as fuel clad and ducting, due to its superior thermal qualities. Although the ASME material Code Case, for adoption of HT-9 as an approved elevated-temperature material for LMFR service, is the ultimate goal of this project, there are several key deliverables that must first be successfully accomplished. The most important key deliverable is the research, accumulation, and documentation of specific material parameters (physical, mechanical, and environmental), which become the basis for an ASME Code Case. Time-independent tensile and ductility data and time-dependent creep and creep-rupture behavior are some of the material properties required for a successful ASME Code Case. Although this report provides a cursory review of the available data, a much more comprehensive study of open-source data would be necessary. This report serves three purposes: (a) it provides a list of already existing material data information that could ultimately be made available to the ASME Code, (b) it determines the HT-9 material properties data missing from available sources that would be required, and (c) it estimates the necessary material testing required to close the gap. Ultimately, the gap analysis demonstrates that certain material properties testing will be required to fulfill the necessary information package for an ASME Code Case.
Improving the accuracy of operation coding in surgical discharge summaries
Martinou, Eirini; Shouls, Genevieve; Betambeau, Nadine
2014-01-01
Procedural coding in surgical discharge summaries is extremely important; as well as communicating to healthcare staff which procedures have been performed, it also provides information that is used by the hospital's coding department. The OPCS code (Office of Population, Censuses and Surveys Classification of Surgical Operations and Procedures) is used to generate the tariff that allows the hospital to be reimbursed for the procedure. We felt that the OPCS coding on discharge summaries was often incorrect within our breast and endocrine surgery department. A baseline measurement over two months demonstrated that 32% of operations had been incorrectly coded, resulting in an incorrect tariff being applied and an estimated loss to the Trust of £17,000. We developed a simple but specific OPCS coding table in collaboration with the clinical coding team and breast surgeons that summarised all operations performed within our department. This table was disseminated across the team, specifically to the junior doctors who most frequently complete the discharge summaries. Re-audit showed 100% of operations were accurately coded, demonstrating the effectiveness of the coding table. We suggest that specifically designed coding tables be introduced across each surgical department to ensure accurate OPCS codes are used to produce better quality surgical discharge summaries and to ensure correct reimbursement to the Trust. PMID:26734286
Neural correlates of concreteness in semantic categorization.
Pexman, Penny M; Hargreaves, Ian S; Edwards, Jodi D; Henry, Luke C; Goodyear, Bradley G
2007-08-01
In some contexts, concrete words (CARROT) are recognized and remembered more readily than abstract words (TRUTH). This concreteness effect has historically been explained by two theories of semantic representation: dual-coding [Paivio, A. Dual coding theory: Retrospect and current status. Canadian Journal of Psychology, 45, 255-287, 1991] and context-availability [Schwanenflugel, P. J. Why are abstract concepts hard to understand? In P. J. Schwanenflugel (Ed.), The psychology of word meanings (pp. 223-250). Hillsdale, NJ: Erlbaum, 1991]. Past efforts to adjudicate between these theories using functional magnetic resonance imaging have produced mixed results. Using event-related functional magnetic resonance imaging, we reexamined this issue with a semantic categorization task that allowed for uniform semantic judgments of concrete and abstract words. The participants were 20 healthy adults. Functional analyses contrasted activation associated with concrete and abstract meanings of ambiguous and unambiguous words. Results showed that for both ambiguous and unambiguous words, abstract meanings were associated with more widespread cortical activation than concrete meanings in numerous regions associated with semantic processing, including temporal, parietal, and frontal cortices. These results are inconsistent with both dual-coding and context-availability theories, as these theories propose that the representations of abstract concepts are relatively impoverished. Our results suggest, instead, that semantic retrieval of abstract concepts involves a network of association areas. We argue that this finding is compatible with a theory of semantic representation such as Barsalou's [Barsalou, L. W. Perceptual symbol systems. Behavioral & Brain Sciences, 22, 577-660, 1999] perceptual symbol systems, whereby concrete and abstract concepts are represented by similar mechanisms but with differences in focal content.
Network analysis for the visualization and analysis of qualitative data.
Pokorny, Jennifer J; Norman, Alex; Zanesco, Anthony P; Bauer-Wu, Susan; Sahdra, Baljinder K; Saron, Clifford D
2018-03-01
We present a novel manner in which to visualize the coding of qualitative data that enables representation and analysis of connections between codes using graph theory and network analysis. Network graphs are created from codes applied to a transcript or audio file using the code names and their chronological location. The resulting network is a representation of the coding data that characterizes the interrelations of codes. This approach enables quantification of qualitative codes using network analysis and facilitates examination of associations of network indices with other quantitative variables using common statistical procedures. Here, as a proof of concept, we applied this method to a set of interview transcripts that had been coded in 2 different ways and the resultant network graphs were examined. The creation of network graphs allows researchers an opportunity to view and share their qualitative data in an innovative way that may provide new insights and enhance transparency of the analytical process by which they reach their conclusions. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
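The construction itself is a few lines with a graph library: codes become nodes, chronological adjacency becomes weighted edges, and standard network indices then apply. A toy sketch (hypothetical codes, not the study's data; networkx assumed available):

    import networkx as nx

    # Chronologically ordered codes from one (toy) interview transcript:
    codes = ["stress", "family", "coping", "family", "stress", "coping"]

    G = nx.Graph()
    for a, b in zip(codes, codes[1:]):
        if G.has_edge(a, b):
            G[a][b]["weight"] += 1   # repeated transitions strengthen ties
        else:
            G.add_edge(a, b, weight=1)

    # Indices that can be correlated with quantitative variables:
    print(nx.degree_centrality(G))
    print(nx.density(G))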
DOE Office of Scientific and Technical Information (OSTI.GOV)
Raimondo, E.; Capman, J.L.; Herovard, M.
1985-05-01
Requirements for qualification of electrical equipment used in French-built nuclear power plants are stated in a national code, the RCC-E (Regles de Conception et de Construction des Materiels Electriques). Under the RCC-E, safety-related equipment is assigned to one of three different categories, according to location in the plant and anticipated normal, accident and post-accident behavior. Qualification tests differ for each category, and procedures range in scope from the standard seismic test to the highly stringent VISA program, which specifies a predetermined sequence of aging, radiation, seismic and simulated-accident testing. A network of official French test facilities was developed specifically to meet RCC-E requirements.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arndt, S.A.
1997-07-01
The real-time reactor simulation field is currently at a crossroads in terms of the capability to perform real-time analysis using the most sophisticated computer codes. Current-generation safety analysis codes are being modified to replace simplified codes that were specifically designed to meet the competing requirements of real-time applications. The next generation of thermal-hydraulic codes will need to include in their specifications the specific requirement for use in a real-time environment. Use of the codes in real-time applications imposes much stricter requirements on robustness, reliability and repeatability than do design and analysis applications. In addition, the need for code use by a variety of users is a critical issue for real-time users, trainers and emergency planners who currently use real-time simulation, and PRA practitioners who will increasingly use real-time simulation for evaluating PRA success criteria in near real-time to validate PRA results for specific configurations and plant system unavailabilities.
Di Rosa, Elisa; Bardi, Lara; Umiltà, Carlo; Masina, Fabio; Forgione, Margherita; Mapelli, Daniela
2017-08-01
The concept of stimulus-response compatibility (SRC) refers to the existence of a privileged association between a specific stimulus feature and a specific response feature. Two examples of SRC are the Spatial Numerical Association of Response Codes (SNARC) and the Markedness Association of Response Codes (MARC) effects. According to the polarity correspondence principle, these two SRC effects occur because of a match between the most salient dimensions of stimulus and response. Specifically, the SNARC effect would be caused by a match between right-sided responses and large numbers, while a match between right-sided responses and even numbers would give rise to the MARC effect. The aim of the present study was to test the validity of the polarity correspondence principle in explaining these two SRC effects. To this end, we applied transcranial direct current stimulation (tDCS) over left and right posterior parietal cortex (PPC), which is thought to be the neural basis of salience processing, during a parity judgement task. Results showed that cathodal tDCS over the PPC significantly reduced the MARC effect but did not affect the SNARC effect, suggesting a dissociation between the two effects. That is, the MARC effect would rely on a salience processing mechanism, whereas the SNARC effect would not. Although this interpretation needs further experimental confirmation (e.g., testing different tasks or using different tDCS montages), our results suggest that the polarity correspondence principle is a plausible explanation only for the MARC effect, not for the SNARC effect. Copyright © 2017 Elsevier Ltd. All rights reserved.
Research on Ajax and Hibernate technology in the development of E-shop system
NASA Astrophysics Data System (ADS)
Yin, Luo
2011-12-01
Hibernate is an open-source object-relational mapping framework that provides a lightweight object encapsulation of JDBC, letting Java programmers use object-oriented programming concepts to manipulate a database at will. The advent of Ajax (asynchronous JavaScript and XML) opened the era of partial page refresh, enabling developers to build web applications with richer interaction. This paper illustrates the concrete application of Ajax and Hibernate to the development of an E-shop in detail, using them to divide the entire program code into relatively independent parts that nonetheless cooperate with one another. In this way, the program is easier to maintain and extend.
Do prominent quality measurement surveys capture the concerns of persons with disability?
Iezzoni, Lisa I; Marsella, Sarah A; Lopinsky, Tiffany; Heaphy, Dennis; Warsett, Kimberley S
2017-04-01
Demonstration programs nationwide aim to control costs and improve care for people dually eligible for Medicare and Medicaid, including many persons with disability. Ensuring these initiatives maintain or improve care quality requires comprehensive evaluation of quality of care. To examine whether the common quality measures being used to evaluate the Massachusetts One Care duals demonstration program comprehensively address the concerns of persons with disability, we drew upon existing conceptual frameworks and developed a model of the interrelationships of personal, health care, and environmental factors for achieving wellness for persons with disability. Based on this model, we specified a scheme to code individual quality measurement items and coded the items contained in 12 measures being used to assess Massachusetts One Care, which exclusively enrolls non-elderly adults with disability. Across these 12 measures, we assigned 376 codes to 302 items; some items received two codes. Taken together, the 12 measures contain items addressing most factors in our conceptual model that affect health care quality for persons with disability, including long-term services and supports. Some important gaps exist. No items examine sexual or reproductive health care, peer support, housing security, disability stigmatization, or specific services obtained outside the home like adult day care. Certain key concepts are covered by only one or a few of the 12 quality measures. Common quality metrics cover most, although not all, health care quality concerns of persons with disability. However, multiple different quality measures are required for this comprehensive coverage, raising questions about respondent burden. Copyright © 2017 Elsevier Inc. All rights reserved.
A feasibility study of reactor-based deep-burn concepts.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, T. K.; Taiwo, T. A.; Hill, R. N.
2005-09-16
A systematic assessment of the General Atomics (GA) proposed Deep-Burn concept based on the Modular Helium-Cooled Reactor design (DB-MHR) has been performed. Preliminary benchmarking of deterministic physics codes was done by comparing code results to those from MONTEBURNS (MCNP-ORIGEN) calculations. Detailed fuel cycle analyses were performed in order to provide an independent evaluation of the physics and transmutation performance of the one-pass and two-pass concepts. Key performance parameters such as transuranic consumption, reactor performance, and spent fuel characteristics were analyzed. This effort has been undertaken in close collaboration with the General Atomics design team and Brookhaven National Laboratory evaluation team. The study was performed primarily for a 600 MWt reference DB-MHR design having a power density of 4.7 MW/m³. Based on a parametric and sensitivity study, it was determined that the maximum burnup (TRU consumption) can be obtained using optimum values of 200 µm and 20% for the fuel kernel diameter and fuel packing fraction, respectively. These values were retained for most of the one-pass and two-pass design calculations; variation of the packing fraction was necessary for the second stage of the two-pass concept. Using a four-batch fuel management scheme for the one-pass DB-MHR core, it was possible to obtain a TRU consumption of 58% and a cycle length of 286 EFPD. By increasing the core power to 800 MWt and the power density to 6.2 MW/m³, it was possible to increase the TRU consumption to 60%, although the cycle length decreased by approximately 64 days. The higher TRU consumption (burnup) is due to the reduction of the in-core decay of fissile Pu-241 to Am-241 relative to fission, arising from the higher power density (specific power), which made the fuel more reactive over time. It was also found that the TRU consumption can be improved by utilizing axial fuel shuffling or by operating with lower material temperatures (colder core). Results also showed that the transmutation performance of the one-pass deep-burn concept is sensitive to the initial TRU vector, primarily because longer cooling time reduces the fissile content (specifically Pu-241). With a cooling time of 5 years, the TRU consumption increases to 67%, while conversely, with 20-year cooling the TRU consumption is about 58%. For the two-pass DB-MHR (TRU recycling option), a fuel packing fraction of about 30% is required in the second pass (the recycled TRU). It was found that using a heterogeneous core (homogeneous fuel element) concept, the TRU consumption is dependent on the cooling interval before the 2nd pass, again due to Pu-241 decay during the time lag between the first-pass fuel discharge and the second-pass fuel charge. With a cooling interval of 7 years (5 and 2 years before and after reprocessing) a TRU consumption of 55% is obtained. With an assumed "no cooling" interval, the TRU consumption is 63%. By using a cylindrical core to reduce neutron leakage, the TRU consumption of the case with a 7-year cooling interval increases to 58%. For a two-pass concept using a heterogeneous fuel element (and homogeneous core) with a first- and second-pass volume ratio of 2:1, the TRU consumption is 62.4%. Finally, the repository loading benefits arising from the deep-burn and Inert Matrix Fuel (IMF) concepts were estimated and compared for the same initial TRU vector. The DB-MHR concept resulted in slightly higher TRU consumption and repository loading benefit compared to the IMF concept (58.1% versus 55.1% for TRU consumption and 2.0 versus 1.6 for estimated repository loading benefit).
Number theoretical foundations in cryptography
NASA Astrophysics Data System (ADS)
Atan, Kamel Ariffin Mohd
2017-08-01
In recent times the hazards in relationships among entities in different establishments worldwide have generated exciting developments in cryptography. Central to this is the theory of numbers. This area of mathematics provides a very rich source of fundamental materials for constructing secret codes. Some number theoretical concepts that have been very actively used in designing cryptosystems will be highlighted in this presentation. This paper will begin with an introduction to basic number theoretical concepts which for many years were thought to have no practical applications. This will include several theoretical assertions that were discovered much earlier in the historical development of number theory. This will be followed by a discussion of the "hidden" properties of these assertions that were later exploited by designers of cryptosystems in their quest for developing secret codes. This paper also highlights some earlier and existing cryptosystems and the role played by number theoretical concepts in their constructions. The role played by cryptanalysts in detecting weaknesses in the systems developed by cryptographers concludes this presentation.
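As a concrete instance of the kind of number-theoretic construction surveyed here, the following is a toy textbook-RSA sketch built on modular exponentiation and Euler's theorem. The tiny primes are purely illustrative and not taken from the paper; real systems use large random primes and padding.

```python
from math import gcd

p, q = 61, 53                 # two (toy) primes
n, phi = p * q, (p - 1) * (q - 1)
e = 17                        # public exponent, chosen coprime to phi
assert gcd(e, phi) == 1
d = pow(e, -1, phi)           # private exponent: modular inverse of e mod phi

m = 42                        # message encoded as an integer < n
c = pow(m, e, n)              # encryption: c = m^e mod n
assert pow(c, d, n) == m      # decryption recovers m, by Euler's theorem
```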
Viscous diffusion of vorticity in unsteady wall layers using the diffusion velocity concept
DOE Office of Scientific and Technical Information (OSTI.GOV)
Strickland, J.H.; Kempka, S.N.; Wolfe, W.P.
1995-03-01
The primary purpose of this paper is to provide a careful evaluation of the diffusion velocity concept with regard to its ability to predict the diffusion of vorticity near a moving wall. A computer code BDIF has been written which simulates the evolution of the vorticity field near a wall of infinite length which is moving in an arbitrary fashion. The simulations generated by this code are found to give excellent results when compared to several exact solutions. We also outline a two-dimensional unsteady viscous boundary layer model which utilizes the diffusion velocity concept and is compatible with vortex methods. A primary goal of this boundary layer model is to minimize the number of vortices generated on the surface at each time step while achieving good resolution of the vorticity field near the wall. Preliminary results have been obtained for simulating a simple two-dimensional laminar boundary layer.
NASA Technical Reports Server (NTRS)
Myers, David E.; Martin, Carl J.; Blosser, Max L.
2000-01-01
A parametric weight assessment of advanced metallic panel, ceramic blanket, and ceramic tile thermal protection systems (TPS) was conducted using an implicit, one-dimensional (1-D) finite element sizing code. This sizing code contained models to account for coatings, fasteners, adhesives, and strain isolation pads. Atmospheric entry heating profiles for two vehicles, the Access to Space (ATS) vehicle and a proposed Reusable Launch Vehicle (RLV), were used to ensure that the trends were not unique to a certain trajectory. Ten TPS concepts were compared for a range of applied heat loads and substructural heat capacities to identify general trends. This study found the blanket TPS concepts have the lightest weights over the majority of their applicable ranges, and current technology ceramic tiles and metallic TPS concepts have similar weights. A proposed, state-of-the-art metallic system which uses a higher temperature alloy and efficient multilayer insulation was predicted to be significantly lighter than the ceramic tile systems and approaches blanket TPS weights for higher integrated heat loads.
Liu, Shuo; Cui, Tie Jun; Zhang, Lei; Xu, Quan; Wang, Qiu; Wan, Xiang; Gu, Jian Qiang; Tang, Wen Xuan; Qing Qi, Mei; Han, Jia Guang; Zhang, Wei Li; Zhou, Xiao Yang; Cheng, Qiang
2016-10-01
The concept of the coding metasurface makes a link between physical metamaterial particles and digital codes, and hence it is possible to perform digital signal processing on the coding metasurface to realize unusual physical phenomena. Here, this study performs Fourier operations on coding metasurfaces and proposes a principle called scattering-pattern shift, based on the convolution theorem, which allows steering of the scattering pattern to an arbitrarily predesigned direction. Owing to the constant reflection amplitude of coding particles, the required coding pattern can be simply achieved by the modulus of two coding matrices. This study demonstrates that the scattering patterns directly calculated from the coding pattern using the Fourier transform show excellent agreement with numerical simulations based on realistic coding structures, providing an efficient method for optimizing coding patterns to achieve predesigned scattering beams. The most important advantage of this approach over previous schemes for producing anomalous single-beam scattering is its flexible and continuous control of the beam toward arbitrary directions. This work opens a new route to study metamaterials from a fully digital perspective, predicting the possibility of combining conventional theorems in digital signal processing with the coding metasurface to realize more powerful manipulations of electromagnetic waves.
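A minimal numerical sketch of the scattering-pattern-shift principle, under simplifying assumptions not taken from the paper (a 1-bit 32 x 32 coding pattern with unit-amplitude particles): the far field is approximated by the 2D FFT of the aperture phase, and adding a gradient coding matrix modulo the number of phase states displaces the beam, as the convolution theorem predicts.

```python
import numpy as np

N, states = 32, 2                              # 32x32 particles, 1-bit coding (0 / pi)
checker = np.indices((N, N)).sum(axis=0) % 2   # checkerboard "0101..." coding pattern
gradient = np.tile(np.arange(N) % 2, (N, 1))   # periodic gradient coding sequence

def far_field(code):
    phase = 2 * np.pi * code / states          # digital code -> reflection phase
    aperture = np.exp(1j * phase)              # constant-amplitude coding particles
    return np.abs(np.fft.fftshift(np.fft.fft2(aperture)))

f0 = far_field(checker)
f1 = far_field((checker + gradient) % states)  # modulus of two coding matrices
# The main-beam peak of f1 is displaced relative to f0: the added gradient
# coding acts as a linear phase, steering the scattering pattern.
print(np.unravel_index(f0.argmax(), f0.shape),
      np.unravel_index(f1.argmax(), f1.shape))
```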
Review of finite fields: Applications to discrete Fourier, transforms and Reed-Solomon coding
NASA Technical Reports Server (NTRS)
Wong, J. S. L.; Truong, T. K.; Benjauthrit, B.; Mulhall, B. D. L.; Reed, I. S.
1977-01-01
An attempt is made to provide a step-by-step approach to the subject of finite fields. Rigorous proofs and highly theoretical materials are avoided. The simple concepts of groups, rings, and fields are discussed and developed more or less heuristically. Examples are used liberally to illustrate the meaning of definitions and theories. Applications include discrete Fourier transforms and Reed-Solomon coding.
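In the same heuristic spirit as the review, the short sketch below illustrates two finite-field facts on which discrete Fourier transforms over GF(p) and Reed-Solomon codes rest: a primitive element generates all nonzero field elements, and every nonzero element is invertible. GF(7) with generator 3 is an illustrative choice.

```python
p = 7                                   # prime modulus defines the field GF(7)
g = 3                                   # a primitive element (generator) of GF(7)*

powers = [pow(g, k, p) for k in range(p - 1)]
assert sorted(powers) == list(range(1, p))   # powers of g sweep all nonzero elements

# Every nonzero a has a multiplicative inverse, so division is always defined:
a = 5
inv_a = pow(a, p - 2, p)                # Fermat's little theorem: a^(p-2) = a^(-1)
assert (a * inv_a) % p == 1
```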
Barriers Against Implementing Blunt Abdominal Trauma Guidelines in a Hospital: A Qualitative Study.
Zaboli, Rouhollah; Tofighi, Shahram; Aghighi, Ali; Shokouh, Seyyed Javad Hosaini; Naraghi, Nader; Goodarzi, Hassan
2016-08-01
Clinical practice guidelines are structured recommendations that help physicians and patients to make proper decisions when dealing with a specific clinical condition. Because blunt abdominal trauma causes a various range of mild, single-system, and multisystem injuries, early detection will help to reduce mortality and resulting disability. Emergency treatment should be initiated based on CPGs. This study aimed to determine the variables affecting implementing blunt abdominal trauma CPGs in an Iranian hospital. This study was conducted as a qualitative and phenomenology study in the Family Hospital in Tehran (Iran) in 2015. The research population included eight experts and key people in the area of blunt abdominal trauma clinical practice guidelines. Sampling was based on purposive and nonrandom methods. A semistructured interview was done for the data collection. A framework method was applied for the data analysis by using Atlas.ti software. After framework analyzing and various reviewing and deleting and combining the codes from 251 codes obtained, 15 families and five super families were extracted, including technical knowledge barriers, economical barriers, barriers related to deployment and monitoring, political will barriers, and managing barriers. Structural reform is needed for eliminating the defects available in the healthcare system. As with most of the codes, subconcepts and concepts are classified into the field of human resources; it seems that the education and knowledge will be more important than other resources such as capital and equipment.
Challenges in assessing college students' conception of duality: the case of infinity
NASA Astrophysics Data System (ADS)
Babarinsa-Ochiedike, Grace Olutayo
Interpreting students' views of infinity poses a challenge for researchers due to the dynamic nature of the conception. There is diversity and variation among students' process-object perceptions. The fluctuations between students' views, however, reveal an undeveloped duality conception. This study examined college students' conception of duality in understanding and representing infinity with the intent to design strategies that could guide researchers in categorizing students' views of infinity into different levels. Data for the study were collected from N=238 college students enrolled in Calculus sequence courses (Pre-Calculus, Calculus I through Calculus III) at one of the southwestern universities in the U.S. using self-report questionnaires and semi-structured individual task-based interviews. Data were triangulated using multiple measures analyzed by three independent experts using self-designed coding sheets to assess students' externalization of the duality conception of infinity. Results of this study reveal that college students' experiences in traditional Calculus sequence courses are not supportive of the development of the duality conception. On the contrary, they strengthen the singularity perspective on fundamental ideas of mathematics such as infinity. The study also found that coding and assessing college students' conception of duality is a challenging and complex process due to the dynamic nature of the conception, which is task-dependent and context-dependent. The practical significance of the study is that it helps to recognize misconceptions and start addressing them so students will have a more comprehensive view of fundamental mathematical ideas as they progress through the Calculus coursework sequence. The duality concept development framework that was developed, Action-Process-Object-Duality (APOD), adapted from APOS theory, could guide educators and researchers as they engage in assessing students' conception of duality. The results of this study could serve as a facilitating instrument to further analyze cognitive obstacles in college students' understanding of the infinity concept.
Beyond Molecular Codes: Simple Rules to Wire Complex Brains
Hassan, Bassem A.; Hiesinger, P. Robin
2015-01-01
Molecular codes, like postal zip codes, are generally considered a robust way to ensure the specificity of neuronal target selection. However, a code capable of unambiguously generating complex neural circuits is difficult to conceive. Here, we re-examine the notion of molecular codes in the light of developmental algorithms. We explore how molecules and mechanisms that have been considered part of a code may alternatively implement simple pattern formation rules sufficient to ensure wiring specificity in neural circuits. This analysis delineates a pattern-based framework for circuit construction that may contribute to our understanding of brain wiring. PMID:26451480
Dual coding theory, word abstractness, and emotion: a critical review of Kousta et al. (2011).
Paivio, Allan
2013-02-01
Kousta, Vigliocco, Del Campo, Vinson, and Andrews (2011) questioned the adequacy of dual coding theory and the context availability model as explanations of representational and processing differences between concrete and abstract words. They proposed an alternative approach that focuses on the role of emotional content in the processing of abstract concepts. Their dual coding critique is, however, based on impoverished and, in some respects, incorrect interpretations of the theory and its implications. This response corrects those gaps and misinterpretations and summarizes research findings that show predicted variations in the effects of dual coding variables in different tasks and contexts. Especially emphasized is an empirically supported dual coding theory of emotion that goes beyond the Kousta et al. emphasis on emotion in abstract semantics. 2013 APA, all rights reserved
Pulsed Ejector Wave Propagation Test Program
NASA Technical Reports Server (NTRS)
Fernandez, Rene; Slater, John W.; Paxson, Daniel E.
2003-01-01
The development of, and initial test data from, a nondetonating Pulse Detonation Engine (PDE) simulator tested in the NASA Glenn 1 x 1 foot Supersonic Wind Tunnel (SWT) is presented in this paper. The concept is a pulsed ejector driven by the simulated exhaust of a PDE. This program is applicable to a PDE embedded in a ramjet flowpath, i.e., a PDE combined-cycle propulsion system. The ejector primary flow is a pulsed, underexpanded, supersonic nozzle simulating the supersonic waves emanating from a PDE, while the ejector secondary flow is the 1 x 1 foot SWT test section operated at subsonic Mach numbers. The objective is not to study the detonation details, but the wave physics, including the starting vortices, the extent of propagation of the wave front, the reflection of the wave from the secondary flowpath walls, and the timing of these events for a pulsed ejector, and to correlate these with Computational Fluid Dynamics (CFD) code predictions. Pulsed ejectors have been shown to result in a 3 to 1 improvement in L/D (length-to-diameter) and a near 2 to 1 improvement in thrust augmentation over a steady ejector. This program will also explore the extent of upstream interactions between an inlet and large, periodically applied backpressures to the inlet, as would be present due to combustion tube detonations in a PDE. These interactions could result in inlet unstart or buzz for a supersonic mixed-compression inlet. The design of the present experiment entailed the use of an x-t diagram characteristics code to study the nozzle filling and purging timescales, as well as a series of CFD analyses conducted using the WIND code. The WIND code is a general-purpose CFD code for solution of the Reynolds-averaged Navier-Stokes equations and can be applied to both steady-state and time-accurate calculations. The pressure distributions from the first, proof-of-concept test entry (spring 2001) shown here indicate the simulation concept was successful and therefore the experimental approach is sound.
From Verified Models to Verifiable Code
NASA Technical Reports Server (NTRS)
Lensink, Leonard; Munoz, Cesar A.; Goodloe, Alwyn E.
2009-01-01
Declarative specifications of digital systems often contain parts that can be automatically translated into executable code. Automated code generation may reduce or eliminate the kinds of errors typically introduced through manual code writing. For this approach to be effective, the generated code should be reasonably efficient and, more importantly, verifiable. This paper presents a prototype code generator for the Prototype Verification System (PVS) that translates a subset of PVS functional specifications into an intermediate language and subsequently to multiple target programming languages. Several case studies are presented to illustrate the tool's functionality. The generated code can be analyzed by software verification tools such as verification condition generators, static analyzers, and software model-checkers to increase the confidence that the generated code is correct.
Conversion of the agent-oriented domain-specific language ALAS into JavaScript
NASA Astrophysics Data System (ADS)
Sredojević, Dejan; Vidaković, Milan; Okanović, Dušan; Mitrović, Dejan; Ivanović, Mirjana
2016-06-01
This paper shows generation of JavaScript code from code written in agent-oriented domain-specific language ALAS. ALAS is an agent-oriented domain-specific language for writing software agents that are executed within XJAF middleware. Since the agents can be executed on various platforms, they must be converted into a language of the target platform. We also try to utilize existing tools and technologies to make the whole conversion process as simple as possible, as well as faster and more efficient. We use the Xtext framework that is compatible with Java to implement ALAS infrastructure - editor and code generator. Since Xtext supports Java, generation of Java code from ALAS code is straightforward. To generate a JavaScript code that will be executed within the target JavaScript XJAF implementation, Google Web Toolkit (GWT) is used.
Prati, Gabriele; Pietrantoni, Luca
2013-01-01
The aim of the present study was to examine the comprehension of gesture in a situation in which the communicator cannot (or can only with difficulty) use verbal communication. Based on theoretical considerations, we expected to obtain higher semantic comprehension for emblems (gestures with a direct verbal definition or translation that is well known by all members of a group, or culture) compared to illustrators (gestures regarded as spontaneous and idiosyncratic and that do not have a conventional definition). Based on the extant literature, we predicted higher semantic specificity associated with arbitrarily coded and iconically coded emblems compared to intrinsically coded illustrators. Using a scenario of emergency evacuation, we tested the difference in semantic specificity between different categories of gestures. 138 participants saw 10 videos each illustrating a gesture performed by a firefighter. They were requested to imagine themselves in a dangerous situation and to report the meaning associated with each gesture. The results showed that intrinsically coded illustrators were more successfully understood than arbitrarily coded emblems, probably because the meaning of intrinsically coded illustrators is immediately comprehensible without recourse to symbolic interpretation. Furthermore, there was no significant difference between the comprehension of iconically coded emblems and that of both arbitrarily coded emblems and intrinsically coded illustrators. It seems that the difference between the latter two types of gestures was supported by their difference in semantic specificity, although in a direction opposite to that predicted. These results are in line with those of Hadar and Pinchas-Zamir (2004), which showed that iconic gestures have higher semantic specificity than conventional gestures.
Applying the Landscape Model to Comprehending Discourse from TV News Stories
ERIC Educational Resources Information Center
Lee, Mina; Roskos-Ewoldsen, Beverly; Roskos-Ewoldsen, David R.
2008-01-01
The Landscape Model of text comprehension was extended to the comprehension of audiovisual discourse from text and video TV news stories. Concepts from the story were coded for activation after each sequence, creating a matrix of activations that was reduced to a vector of the degree of total activation for each concept. In Study 1, the degree…
Concept For Generation Of Long Pseudorandom Sequences
NASA Technical Reports Server (NTRS)
Wang, C. C.
1990-01-01
Conceptual very-large-scale integrated (VLSI) digital circuit performs exponentiation in finite field. Algorithm that generates unusually long sequences of pseudorandom numbers executed by digital processor that includes such circuits. Concepts particularly advantageous for such applications as spread-spectrum communications, cryptography, and generation of ranging codes, synthetic noise, and test data, where usually desirable to make pseudorandom sequences as long as possible.
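A software analogue of the concept described in this brief: generating a long pseudorandom sequence by repeated exponentiation (successive powers of a primitive element) in a finite field. The small Mersenne prime is an illustrative assumption; a VLSI realization would use a much larger field such as GF(2^m), and the maximal period p - 1 is only attained when g is primitive.

```python
def power_sequence(g, p, length):
    """Yield successive powers of g modulo the prime p; for a primitive
    element g the sequence has the maximal period p - 1."""
    x = 1
    for _ in range(length):
        x = (x * g) % p
        yield x

# 2**13 - 1 = 8191 is a (small) Mersenne prime, used here for illustration.
seq = list(power_sequence(g=3, p=2**13 - 1, length=10))
print(seq)
```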
The Development of the Concept of "Matter": A Cross-Age Study of How Children Describe Materials
ERIC Educational Resources Information Center
Krnel, Dusan; Watson, Rod; Glazar, Sasa A.
2005-01-01
The development of the concept of matter was explored by interviewing 84 children aged 3-13 in Slovenia. Children were asked to describe objects and substances placed in front of them. Children's responses were coded and explored for patterns indicating development with age. The patterns of responses indicate that by acting on objects and…
Halftoning Algorithms and Systems.
1996-08-01
[Report documentation page; only fragments survived extraction.] Subject terms: halftoning algorithms; error diffusion; color printing; topographic maps. The recoverable fragments describe a calibration procedure for error diffusion algorithms based on a novel centering concept for overlapping correction on paper/transparency (patent applied for 5/94), with applications to error diffusion and dithering.
Exploring Students' Conceptions of Science Learning via Drawing: A Cross-Sectional Analysis
ERIC Educational Resources Information Center
Hsieh, Wen-Min; Tsai, Chin-Chung
2017-01-01
This cross-sectional study explored students' conceptions of science learning via drawing analysis. A total of 906 Taiwanese students in 4th, 6th, 8th, 10th, and 12th grade were asked to use drawing to illustrate how they conceptualise science learning. Students' drawings were analysed using a coding checklist to determine the presence or absence…
ERIC Educational Resources Information Center
Hsieh, Wen-Min; Tsai, Chin-Chung
2018-01-01
Using the draw-a-picture technique, the authors explored the learning conceptions held by students across grade levels. A total of 1,067 Taiwanese students in Grades 2, 4, 6, 8, 10, and 12 participated in this study. Participants were asked to use drawing to illustrate how they conceptualize learning. A coding checklist was developed to analyze…
In their own words? A terminological analysis of e-mail to a cancer information service.
Smith, Catherine Arnott; Stavri, P. Zoë; Chapman, Wendy Webber
2002-01-01
OBJECTIVE: To better understand the terms used by consumers to describe their health information needs and determine if this "consumer terminology" differs from that used by health care professionals. METHODS: Features and findings identified in 139 e-mail messages to the University of Pittsburgh Cancer Institute's Cancer Information and Referral Service were coded and matched against the 2001 Unified Medical Language System Metathesaurus. RESULTS: 504 unique terms were identified. 185 (36%) were exact matches to concepts in the 2001 UMLS Metathesaurus (MTH). 179 (35%) were partial string matches; 119 (24%) were known synonyms for MTH concepts; and 2 (<1%) were lexical variants. Only 19, or 4%, of the total terms were not found in the 2001 MTH. CONCLUSION: 96% of the clinical findings and features mentioned in e-mail by correspondents who did not self-identify as healthcare professionals were described using terms from controlled healthcare terminologies. The notion of a paradigmatic "consumer" who uses a particular vocabulary specific to her "consumer" status may be ill-founded. PMID:12463914
Green, Courtney A; Chern, Hueylan; O'Sullivan, Patricia S
2018-02-01
Current robot surgery curricula developed by industry were designed for expert surgeons. We sought to identify the robotic curricula that currently exist in general surgery residencies and describe their components. We identified 12 residency programs with robotic curricula. Using a structured coding form to identify themes including sequence, duration, emphasis and assessment, we generated a descriptive summary. Curricula followed a similar sequence: learners started with online modules and simulation exercises, followed by bedside experience during R2-R3 training years, and then operative opportunities on the console in the final years of training. Consistent portions of the curricula reflect a device-dependent training paradigm; they defined the sequence of instruction. Most curricula lacked specifics on duration and content of training activities. None clearly described cognitive or psychomotor skills needed by residents and none required a proficiency assessment before graduation. Resident-specific robotic curricula remain grounded in initial industrial efforts to train experienced surgeons, are non-specific regarding the type and nature of hands on experience, and do not include discussion of operative technique and surgical concepts. Copyright © 2017 Elsevier Inc. All rights reserved.
Formal Safety Certification of Aerospace Software
NASA Technical Reports Server (NTRS)
Denney, Ewen; Fischer, Bernd
2005-01-01
In principle, formal methods offer many advantages for aerospace software development: they can help to achieve ultra-high reliability, and they can be used to provide evidence of the reliability claims which can then be subjected to external scrutiny. However, despite years of research and many advances in the underlying formalisms of specification, semantics, and logic, formal methods are not much used in practice. In our opinion this is related to three major shortcomings. First, the application of formal methods is still expensive because they are labor- and knowledge-intensive. Second, they are difficult to scale up to complex systems because they are based on deep mathematical insights about the behavior of the systems (i.e., they rely on the "heroic proof"). Third, the proofs can be difficult to interpret, and typically stand in isolation from the original code. In this paper, we describe a tool for formally demonstrating safety-relevant aspects of aerospace software, which largely circumvents these problems. We focus on safety properties because it has been observed that safety violations such as out-of-bounds memory accesses or use of uninitialized variables constitute the majority of the errors found in the aerospace domain. In our approach, safety means that the program will not violate a set of rules that can range from simple memory-access rules to high-level flight rules. These different safety properties are formalized as different safety policies in Hoare logic, which are then used by a verification condition generator along with the code and logical annotations in order to derive formal safety conditions; these are then proven using an automated theorem prover. Our certification system is currently integrated into a model-based code generation toolset that generates the annotations together with the code. However, this automated formal certification technology is not exclusively constrained to our code generator and could, in principle, also be integrated with other code generators such as Real-Time Workshop or even applied to legacy code. Our approach circumvents the historical problems with formal methods by increasing the degree of automation on all levels. The restriction to safety policies (as opposed to arbitrary functional behavior) results in simpler proof problems that can generally be solved by fully automatic theorem provers. An automated linking mechanism between the safety conditions and the code provides some of the traceability mandated by process standards such as DO-178B. An automated explanation mechanism uses semantic markup added by the verification condition generator to produce natural-language explanations of the safety conditions and thus supports their interpretation in relation to the code. An automatically generated certification browser lets users inspect the (generated) code along with the safety conditions (including textual explanations), and uses hyperlinks to automate tracing between the two levels. Here, the explanations reflect the logical structure of the safety obligation, but the mechanism can in principle be customized using different sets of domain concepts. The interface also provides some limited control over the certification process itself.
Our long-term goal is a seamless integration of certification, code generation, and manual coding that results in a "certified pipeline" in which specifications are automatically transformed into executable code, together with the supporting artifacts necessary for achieving and demonstrating the high level of assurance needed in the aerospace domain.
Administrative database code accuracy did not vary notably with changes in disease prevalence.
van Walraven, Carl; English, Shane; Austin, Peter C
2016-11-01
Previous mathematical analyses of diagnostic tests based on the categorization of a continuous measure have found that test sensitivity and specificity vary significantly by disease prevalence. This study determined if the accuracy of diagnostic codes varied by disease prevalence. We used data from two previous studies in which the true status of renal disease and primary subarachnoid hemorrhage, respectively, had been determined. In multiple stratified random samples from the two previous studies having varying disease prevalence, we measured the accuracy of diagnostic codes for each disease using sensitivity, specificity, and positive and negative predictive value. Diagnostic code sensitivity and specificity did not change notably within clinically sensible disease prevalence. In contrast, positive and negative predictive values changed significantly with disease prevalence. Disease prevalence had no important influence on the sensitivity and specificity of diagnostic codes in administrative databases. Copyright © 2016 Elsevier Inc. All rights reserved.
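The study's central observation can be reproduced with a few lines of arithmetic: holding sensitivity and specificity fixed, positive predictive value (PPV) swings widely as prevalence changes. The 0.80/0.95 operating point below is an assumed example, not taken from the study.

```python
def ppv(sens, spec, prev):
    """Positive predictive value from sensitivity, specificity, prevalence."""
    tp = sens * prev                 # true-positive fraction of the population
    fp = (1 - spec) * (1 - prev)     # false-positive fraction
    return tp / (tp + fp)

for prev in (0.01, 0.10, 0.50):
    print(f"prevalence {prev:.2f}: PPV = {ppv(0.80, 0.95, prev):.2f}")
# At 1% prevalence the PPV is only about 0.14 even though sensitivity and
# specificity are unchanged, which is the paper's contrast in miniature.
```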
Optical image encryption using QR code and multilevel fingerprints in gyrator transform domains
NASA Astrophysics Data System (ADS)
Wei, Yang; Yan, Aimin; Dong, Jiabin; Hu, Zhijuan; Zhang, Jingtao
2017-11-01
A new concept of GT encryption scheme is proposed in this paper. We present a novel optical image encryption method by using quick response (QR) code and multilevel fingerprint keys in gyrator transform (GT) domains. In this method, an original image is firstly transformed into a QR code, which is placed in the input plane of cascaded GTs. Subsequently, the QR code is encrypted into the cipher-text by using multilevel fingerprint keys. The original image can be obtained easily by reading the high-quality retrieved QR code with hand-held devices. The main parameters used as private keys are GTs' rotation angles and multilevel fingerprints. Biometrics and cryptography are integrated with each other to improve data security. Numerical simulations are performed to demonstrate the validity and feasibility of the proposed encryption scheme. In the future, the method of applying QR codes and fingerprints in GT domains possesses much potential for information security.
On the validation of a code and a turbulence model appropriate to circulation control airfoils
NASA Technical Reports Server (NTRS)
Viegas, J. R.; Rubesin, M. W.; Maccormack, R. W.
1988-01-01
A computer code for calculating flow about a circulation control airfoil within a wind tunnel test section has been developed. This code is being validated for eventual use as an aid to design such airfoils. The concept of code validation being used is explained. The initial stages of the process have been accomplished. The present code has been applied to a low-subsonic, 2-D flow about a circulation control airfoil for which extensive data exist. Two basic turbulence models and variants thereof have been successfully introduced into the algorithm, the Baldwin-Lomax algebraic and the Jones-Launder two-equation models of turbulence. The variants include adding a history of the jet development for the algebraic model and adding streamwise curvature effects for both models. Numerical difficulties and difficulties in the validation process are discussed. Turbulence model and code improvements to proceed with the validation process are also discussed.
Formal specification and verification of Ada software
NASA Technical Reports Server (NTRS)
Hird, Geoffrey R.
1991-01-01
The use of formal methods in software development achieves levels of quality assurance unobtainable by other means. The Larch approach to specification is described, and the specification of avionics software designed to implement the logic of a flight control system is given as an example. Penelope is described which is an Ada-verification environment. The Penelope user inputs mathematical definitions, Larch-style specifications and Ada code and performs machine-assisted proofs that the code obeys its specifications. As an example, the verification of a binary search function is considered. Emphasis is given to techniques assisting the reuse of a verification effort on modified code.
Research and Development Roadmaps for Liquid Metal Cooled Fast Reactors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, T. K.; Grandy, C.; Natesan, K.
The United States Department of Energy (DOE) commissioned the development of technology roadmaps for advanced (non-light water reactor) reactor concepts to help focus research and development funding over the next five years. The roadmaps show the research and development needed to support demonstration of an advanced (non-LWR) concept by the early 2030s, consistent with DOE's Vision and Strategy for the Development and Deployment of Advanced Reactors. The intent is only to convey the technical steps that would be required to achieve such a goal; the means by which DOE will determine whether to invest in specific tasks will be treated separately. The starting point for the roadmaps is the Technical Readiness Assessment performed as part of an Advanced Test and Demonstration Reactor study released in 2016. The roadmaps were developed based upon a review of technical reports and vendor literature summarizing the technical maturity of each concept and the outstanding research and development needs. Critical path tasks for specific systems were highlighted on the basis of time and resources needed to complete the tasks and the importance of the system to the performance of the reactor concept. The roadmaps are generic, i.e., not specific to a particular vendor's design, but vendor design information may have been used as representative of the concept family. In the event that both near-term and more advanced versions of a concept are being developed, either a single roadmap with multiple branches or separate roadmaps for each version were developed. In each case, roadmaps point to a demonstration reactor (engineering or commercial) and show the activities that must be completed in parallel to support that demonstration in the 2030-2035 window. This report provides the roadmaps for two fast reactor concepts, the Sodium-cooled Fast Reactor (SFR) and the Lead-cooled Fast Reactor (LFR). The SFR technology is mature enough for commercial demonstration by the early 2030s, and the remaining critical paths and R&D needs are generally related to the completion of qualification of fuel and structural materials, validation of reactor design codes and methods, and support of the licensing frameworks. The LFR technology is less mature than the SFR's and will be at the engineering demonstration stage by the early 2030s. Key LFR technology development activities will focus on resolving remaining design challenges and demonstrating the viability of systems and components in the integral system, which will be done in parallel with addressing the gaps shared with SFR technology.
Kwon, Inchan; Choi, Eun Sil
2016-01-01
Multiple-site-specific incorporation of a noncanonical amino acid into a recombinant protein would be a very useful technique to generate multiple chemical handles for bioconjugation and multivalent binding sites for enhanced interaction. Previously, a combination of a mutant yeast phenylalanyl-tRNA synthetase variant and the yeast phenylalanyl-tRNA containing the AAA anticodon was used to incorporate a noncanonical amino acid into multiple UUU phenylalanine (Phe) codons in a site-specific manner. However, due to the less selective codon recognition of the AAA anticodon, there was significant misincorporation of a noncanonical amino acid into unwanted UUC Phe codons. To enhance codon selectivity, we explored degenerate leucine (Leu) codons instead of Phe degenerate codons. Combined use of the mutant yeast phenylalanyl-tRNA containing the CAA anticodon and the yPheRS_naph variant allowed incorporation of a phenylalanine analog, 2-naphthylalanine, into murine dihydrofolate reductase in response to multiple UUG Leu codons, but not to other Leu codon sites. Despite the moderate UUG codon occupancy by 2-naphthylalanine, these results successfully demonstrated that the concept of forced ambiguity of the genetic code can be achieved for the Leu codons, available for multiple-site-specific incorporation. PMID:27028506
Coding conventions and principles for a National Land-Change Modeling Framework
Donato, David I.
2017-07-14
This report establishes specific rules for writing computer source code for use with the National Land-Change Modeling Framework (NLCMF). These specific rules consist of conventions and principles for writing code primarily in the C and C++ programming languages. Collectively, these coding conventions and coding principles create an NLCMF programming style. In addition to detailed naming conventions, this report provides general coding conventions and principles intended to facilitate the development of high-performance software implemented with code that is extensible, flexible, and interoperable. Conventions for developing modular code are explained in general terms and also enabled and demonstrated through the appended templates for C++ base source-code and header files. The NLCMF limited-extern approach to module structure, code inclusion, and cross-module access to data is both explained in the text and then illustrated through the module templates. Advice on the use of global variables is provided.
NASA Technical Reports Server (NTRS)
Wang, Yongli; Benson, Robert F.
2011-01-01
Two software applications have been produced specifically for the analysis of some million digital topside ionograms produced by a recent analog-to-digital conversion effort of selected analog telemetry tapes from the Alouette-2, ISIS-1 and ISIS-2 satellites. One, TOPIST (TOPside Ionogram Scalar with True-height algorithm) from the University of Massachusetts Lowell, is designed for the automatic identification of the topside-ionogram ionospheric-reflection traces and their inversion into vertical electron-density profiles Ne(h). TOPIST also has the capability of manual intervention. The other application, from the Goddard Space Flight Center based on the FORTRAN code of John E. Jackson from the 1960s, is designed as an IDL-based interactive program for the scaling of selected digital topside-sounder ionograms. The Jackson code has also been modified, with some effort, so as to run on modern computers. This modification was motivated by the need to scale selected ionograms from the millions of Alouette/ISIS topside-sounder ionograms that only exist on 35-mm film. During this modification, it became evident that it would be more efficient to design a new code, based on the capabilities of present-day computers, than to continue to modify the old code. Such a new code has been produced and here we will describe its capabilities and compare Ne(h) profiles produced from it with those produced by the Jackson code. The concept of the new code is to assume an initial Ne(h) and derive a final Ne(h) through an iteration process that makes the resulting apparent-height profile fit the scaled values within a certain error range. The new code can be used on the X-, O-, and Z-mode traces. It does not assume any predefined profile shape between two contiguous points, like the exponential rule used in Jackson's program. Instead, Monotone Piecewise Cubic Interpolation is applied to the global profile to keep the monotone nature of the profile, which also ensures better smoothness in the final profile than in Jackson's program. The new code uses the complete refractive index expression for a cold collisionless plasma and can accommodate the IGRF, T96, and other geomagnetic field models.
Extracting and standardizing medication information in clinical text - the MedEx-UIMA system.
Jiang, Min; Wu, Yonghui; Shah, Anushi; Priyanka, Priyanka; Denny, Joshua C; Xu, Hua
2014-01-01
Extraction of medication information embedded in clinical text is important for research using electronic health records (EHRs). However, most current medication information extraction systems identify drug and signature entities without mapping them to a standard representation. In this study, we introduced the open source Java implementation of MedEx, an existing high-performance medication information extraction system, based on the Unstructured Information Management Architecture (UIMA) framework. In addition, we developed new encoding modules in the MedEx-UIMA system, which mapped an extracted drug name/dose/form to both generalized and specific RxNorm concepts and translated drug frequency information to the ISO standard. We processed 826 documents by both systems and verified that MedEx-UIMA and MedEx (the Python version) performed similarly by comparing both results. Using two manually annotated test sets that contained 300 drug entries from medication lists and 300 drug entries from narrative reports, the MedEx-UIMA system achieved F-measures of 98.5% and 97.5% respectively for encoding drug names to corresponding RxNorm generic drug ingredients, and F-measures of 85.4% and 88.1% respectively for mapping drug names/dose/form to the most specific RxNorm concepts. It also achieved an F-measure of 90.4% for normalizing frequency information to the ISO standard. The open source MedEx-UIMA system is freely available online at http://code.google.com/p/medex-uima/.
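A minimal sketch of the encoding step described above, with a hypothetical mini-lexicon standing in for RxNorm; the entries and codes below are invented for illustration and are not drawn from MedEx-UIMA or the real RxNorm tables.

```python
# Hypothetical lookup tables: drug name -> generic-ingredient code, and
# (name, dose, form) -> a more specific concept code.
RXNORM_INGREDIENT = {"lisinopril": "C-GENERIC-1"}
RXNORM_SPECIFIC = {("lisinopril", "10 mg", "tablet"): "C-SPECIFIC-1"}

def encode(name, dose=None, form=None):
    """Map an extracted drug mention to the most specific available concept,
    falling back to the generic ingredient when dose/form do not match."""
    specific = RXNORM_SPECIFIC.get((name.lower(), dose, form))
    generic = RXNORM_INGREDIENT.get(name.lower())
    return specific or generic

print(encode("Lisinopril", "10 mg", "tablet"))  # specific concept
print(encode("Lisinopril"))                     # falls back to the ingredient
```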
[Standardization of terminology in laboratory medicine I].
Yoon, Soo Young; Yoon, Jong Hyun; Min, Won Ki; Lim, Hwan Sub; Song, Junghan; Chae, Seok Lae; Lee, Chang Kyu; Kwon, Jung Ah; Lee, Kap No
2007-04-01
Standardization of medical terminology is essential for data transmission between health-care institutions or clinical laboratories and for maximizing the benefits of information technology. The purpose of our study was to standardize the medical terms used in the clinical laboratory, such as test names, units, terms used in result descriptions, etc. During the first year of the study, we developed a standard database of concept names for laboratory terms, which covered the terms used in government health care centers, their branch offices, and primary health care units. Laboratory terms were collected from the electronic data interchange (EDI) codes from the National Health Insurance Corporation (NHIC), the Logical Observation Identifier Names and Codes (LOINC) database, community health centers and their branch offices, and clinical laboratories of representative university medical centers. For standard expression, we referred to the English-Korean/Korean-English medical dictionary of the Korean Medical Association and the rules for foreign language translation. Programs for mapping between the LOINC DB and EDI codes and for translating English to Korean were developed. A Korean standard laboratory terminology database containing six axial concept names such as components, property, time aspect, system (specimen), scale type, and method type was established for 7,508 test observations. Short names and a mapping table for EDI codes and the Unified Medical Language System (UMLS) were added. Synonym tables for concept names, words used in the database, and six axial terms were prepared to make it easier to find the standard terminology with common terms used in the field of laboratory medicine. Here we report for the first time a Korean standard laboratory terminology database for test names, result description terms, and result units covering most laboratory tests in primary healthcare centers.
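A sketch of how such a six-axis concept record and synonym table might look in code; the example entry and synonyms are illustrative assumptions, not taken from the Korean standard database or LOINC.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class LabConcept:
    component: str    # analyte, e.g. glucose
    property: str     # kind of quantity, e.g. mass concentration
    time_aspect: str  # point in time vs. timed collection, etc.
    system: str       # specimen, e.g. serum/plasma
    scale: str        # quantitative, ordinal, nominal, ...
    method: str       # measurement method, when distinguishing

glucose = LabConcept("glucose", "mass concentration", "point",
                     "serum/plasma", "quantitative", "")

# Synonym table: common field terms resolve to the same standard concept.
SYNONYMS = {"blood sugar": glucose, "glucose": glucose}
print(SYNONYMS["blood sugar"])
```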
Color inference in visual communication: the meaning of colors in recycling.
Schloss, Karen B; Lessard, Laurent; Walmsley, Charlotte S; Foley, Kathleen
2018-01-01
People interpret abstract meanings from colors, which makes color a useful perceptual feature for visual communication. This process is complicated, however, because there is seldom a one-to-one correspondence between colors and meanings. One color can be associated with many different concepts (one-to-many mapping) and many colors can be associated with the same concept (many-to-one mapping). We propose that to interpret color-coding systems, people perform assignment inference to determine how colors map onto concepts. We studied assignment inference in the domain of recycling. Participants saw images of colored but unlabeled bins and were asked to indicate which bins they would use to discard different kinds of recyclables and trash. In Experiment 1, we tested two hypotheses for how people perform assignment inference. The local assignment hypothesis predicts that people simply match objects with their most strongly associated color. The global assignment hypothesis predicts that people also account for the association strengths between all other objects and colors within the scope of the color-coding system. Participants discarded objects in bins that optimized the color-object associations of the entire set, which is consistent with the global assignment hypothesis. This sometimes resulted in discarding objects in bins whose colors were weakly associated with the object, even when there was a more strongly associated option available. In Experiment 2, we tested different methods for encoding color-coding systems and found that people were better at assignment inference when color sets simultaneously maximized the association strength between assigned color-object pairings while minimizing associations between unassigned pairings. Our study provides an approach for designing intuitive color-coding systems that facilitate communication through visual media such as graphs, maps, signs, and artifacts.
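Global assignment inference can be framed as an optimal assignment problem. The sketch below, with an invented association matrix, uses the Hungarian algorithm to maximize total association strength, reproducing the globally optimal (rather than locally greedy) behavior the study reports.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

objects = ["paper", "glass", "trash"]
colors = ["white", "green", "black"]
# Hypothetical color-object association strengths (rows: objects).
assoc = np.array([[0.9, 0.4, 0.3],   # paper:  white strongest
                  [0.5, 0.8, 0.2],   # glass:  green strongest
                  [0.6, 0.3, 0.5]])  # trash:  white strongest, but white is "taken"

rows, cols = linear_sum_assignment(assoc, maximize=True)
for i, j in zip(rows, cols):
    print(f"{objects[i]} -> {colors[j]} (strength {assoc[i, j]:.1f})")
# trash is assigned to black even though its white association is stronger,
# because the set-wide optimum assigns white to paper.
```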
Turbine Internal and Film Cooling Modeling For 3D Navier-Stokes Codes
NASA Technical Reports Server (NTRS)
DeWitt, Kenneth; Garg Vijay; Ameri, Ali
2005-01-01
The aim of this research project is to make use of NASA Glenn on-site computational facilities in order to develop, validate and apply aerodynamic, heat transfer, and turbine cooling models for use in advanced 3D Navier-Stokes Computational Fluid Dynamics (CFD) codes such as the Glenn-HT code. Specific areas of effort include: application of the Glenn-HT code to specific configurations made available under the Turbine Based Combined Cycle (TBCC) and Ultra Efficient Engine Technology (UEET) projects, and validating the use of a multi-block code for the time-accurate computation of the detailed flow and heat transfer of cooled turbine airfoils. The goal of the current research is to improve the predictive ability of the Glenn-HT code. This will enable one to design more efficient turbine components for both aviation and power generation. The models will be tested against specific configurations provided by NASA Glenn.
Nurse prescribing ethics and medical marketing.
Adams, J
This article suggests that nurse prescribers require an awareness of key concepts in ethics, such as deontology and utilitarianism, to reflect on current debates and contribute to them. The principles of biomedical ethics have also been influential in the development of professional codes of conduct. Attention is drawn to the importance of the Association of the British Pharmaceutical Industry's code of practice for the pharmaceutical industry in regulating marketing aimed at prescribers.
Relativistic Klystron Amplifiers Driven by Modulated Intense Relativistic Electron Beams
1990-04-11
electrical parameters of the cavity were calculated using the SUPERFISH computer code. We found: (1) that the gap voltage, V, was half as high as the...SUPERFISH computer code and experimenting with various cavities we found the best cavity geometry that fulfilled the above conditions. For this cavity...paths. Experiments along this line are being planned (T. Godlove and F. Mako, private communication). A somewhat different concept which also
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-20
... (Code 324), Field Border (Code 386), Filter Strip (Code 393), Land Smoothing (Code 466), Livestock... the implementation requirement document to the specifications and plans. Filter Strip (Code 393)--The...
Water cycle algorithm: A detailed standard code
NASA Astrophysics Data System (ADS)
Sadollah, Ali; Eskandar, Hadi; Lee, Ho Min; Yoo, Do Guen; Kim, Joong Hoon
Inspired by observation of the water cycle process and the movement of rivers and streams toward the sea, a population-based metaheuristic algorithm, the water cycle algorithm (WCA), has recently been proposed. Lately, an increasing number of WCA applications have appeared, and the WCA has been utilized in different optimization fields. This paper provides detailed open source code for the WCA, whose performance and efficiency have been demonstrated in solving optimization problems. The WCA is based on an interesting and simple concept, and this paper uses its source code to provide a step-by-step explanation of the process the algorithm follows.
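A heavily simplified sketch of the WCA loop is given below; it is an illustration of the concept, not the authors' published reference code, and parameter names such as n_sr (sea plus rivers) and d_max (evaporation threshold) follow common descriptions of the algorithm.

```python
import numpy as np

def wca_minimize(f, dim, lb, ub, n_pop=50, n_sr=4, d_max=1e-6, iters=200, seed=0):
    """Simplified water cycle algorithm: streams flow to rivers, rivers to the sea."""
    rng = np.random.default_rng(seed)
    pop = rng.uniform(lb, ub, size=(n_pop, dim))
    for _ in range(iters):
        cost = np.apply_along_axis(f, 1, pop)
        pop = pop[np.argsort(cost)]           # pop[0] is the sea, pop[1:n_sr] rivers
        for i in range(n_sr, n_pop):          # streams flow toward their river
            river = pop[1 + (i % (n_sr - 1))]
            pop[i] += rng.random(dim) * 2.0 * (river - pop[i])
        for i in range(1, n_sr):              # rivers flow toward the sea
            pop[i] += rng.random(dim) * 2.0 * (pop[0] - pop[i])
            if np.linalg.norm(pop[0] - pop[i]) < d_max:
                pop[i] = rng.uniform(lb, ub, dim)   # evaporation, then raining
        pop = np.clip(pop, lb, ub)
    cost = np.apply_along_axis(f, 1, pop)
    return pop[np.argmin(cost)], float(cost.min())

# Toy usage: minimize the sphere function in five dimensions
best_x, best_f = wca_minimize(lambda x: float(np.sum(x**2)), dim=5, lb=-10, ub=10)
```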
Flowgen: Flowchart-based documentation for C++ codes
NASA Astrophysics Data System (ADS)
Kosower, David A.; Lopez-Villarejo, J. J.
2015-11-01
We present the Flowgen tool, which generates flowcharts from annotated C++ source code. The tool generates a set of interconnected high-level UML activity diagrams, one for each function or method in the C++ sources. It provides a simple and visual overview of complex implementations of numerical algorithms. Flowgen is complementary to the widely-used Doxygen documentation tool. The ultimate aim is to render complex C++ computer codes accessible, and to enhance collaboration between programmers and algorithm or science specialists. We describe the tool and a proof-of-concept application to the VINCIA plug-in for simulating collisions at CERN's Large Hadron Collider.
Imitation Learning Errors Are Affected by Visual Cues in Both Performance and Observation Phases.
Mizuguchi, Takashi; Sugimura, Ryoko; Shimada, Hideaki; Hasegawa, Takehiro
2017-08-01
Mechanisms of action imitation were examined. Previous studies have suggested that success or failure of imitation is determined at the point of observing an action; in other words, that cognitive processing after observation is not related to the success of imitation. Twenty university students participated in each of three experiments in which they observed a series of object manipulations consisting of four elements (hands, tools, object, and end points) and then imitated the manipulations. In Experiment 1, a specific initially observed element was color coded, and the specific manipulated object at the imitation stage was identically color coded; participants accurately imitated the color-coded element. In Experiment 2, a specific element was color coded at the observation stage but not at the imitation stage, and there were no effects of color coding on imitation. In Experiment 3, participants were verbally instructed to attend to a specific element at the imitation stage, but the verbal instructions had no effect. Thus, the success of imitation may not be determined at the stage of observing an action, and color coding can provide a clue for imitation at the imitation stage.
The ASSERT Virtual Machine Kernel: Support for Preservation of Temporal Properties
NASA Astrophysics Data System (ADS)
Zamorano, J.; de la Puente, J. A.; Pulido, J. A.; Urueña
2008-08-01
A new approach to building embedded real-time software has been developed in the ASSERT project. One of its key elements is the concept of a virtual machine preserving the non-functional properties of the system, and especially real-time properties, all the way from high-level design models down to executable code. The paper describes one instance of the virtual machine concept that provides support for the preservation of temporal properties both at the source code level (by accepting only "legal" entities, i.e. software components with statically analysable real-time behaviour) and at run time (by monitoring the temporal behaviour of the system). The virtual machine has been validated on several pilot projects carried out by aerospace companies in the framework of the ASSERT project.
ERIC Educational Resources Information Center
Fulmer, Gavin W.; Liang, Ling L.; Liu, Xiufeng
2014-01-01
This exploratory study applied a proposed force and motion learning progression (LP) to high-school and university students and to content involving both one- and two-dimensional force and motion situations. The Force Concept Inventory (FCI) was adapted, based on a previous content analysis and coding of the questions in the FCI in terms of the…
ERIC Educational Resources Information Center
Erduran, Sibel
Eight physical science textbooks were analyzed for coverage on acids, bases, and neutralization. At the level of the text, clarity and coherence of statements were investigated. The conceptual framework for this topic was represented in a concept map which was used as a coding tool for tracing concepts and links present in textbooks. Cognitive…
Death of a dogma: eukaryotic mRNAs can code for more than one protein.
Mouilleron, Hélène; Delcourt, Vivian; Roucou, Xavier
2016-01-08
mRNAs carry the genetic information that is translated by ribosomes. The traditional view of a mature eukaryotic mRNA is a molecule with three main regions, the 5' UTR, the protein coding open reading frame (ORF) or coding sequence (CDS), and the 3' UTR. This concept assumes that ribosomes translate one ORF only, generally the longest one, and produce one protein. As a result, in the early days of genomics and bioinformatics, one CDS was associated with each protein-coding gene. This fundamental concept of a single CDS is being challenged by increasing experimental evidence indicating that annotated proteins are not the only proteins translated from mRNAs. In particular, mass spectrometry (MS)-based proteomics and ribosome profiling have detected productive translation of alternative open reading frames. In several cases, the alternative and annotated proteins interact. Thus, the expression of two or more proteins translated from the same mRNA may offer a mechanism to ensure the co-expression of proteins which have functional interactions. Translational mechanisms already described in eukaryotic cells indicate that the cellular machinery is able to translate different CDSs from a single viral or cellular mRNA. In addition to summarizing data showing that the protein coding potential of eukaryotic mRNAs has been underestimated, this review aims to challenge the single translated CDS dogma. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
Micromagnetic Code Development of Advanced Magnetic Structures Final Report CRADA No. TC-1561-98
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cerjan, Charles J.; Shi, Xizeng
The specific goals of this project were to: (1) further develop the previously written micromagnetic code DADIMAG (DOE code release number 980017); and (2) validate the code. The resulting code was expected to be more realistic and useful for simulations of magnetic structures of specific interest to Read-Rite programs. We also planned to further develop the code for use in internal LLNL programs. This project complemented LLNL CRADA TC-840-94 between LLNL and Read-Rite, which allowed for simulations of the advanced magnetic head development completed under the CRADA. TC-1561-98 was effective concurrently with LLNL non-exclusive copyright license (TL-1552-98) to Read-Rite for DADIMAG Version 2 executable code.
On (scientific) integrity: conceptual clarification.
Patrão Neves, Maria do Céu
2018-06-01
The notion of "integrity" is currently quite common and broadly recognized as complex, mostly due to its recurring and diverse application in various distinct domains such as the physical, psychic or moral, the personal or professional, that of the human being or of the totality of beings. Nevertheless, its adjectivation imprints a specific meaning, as happens in the case of "scientific integrity". This concept has been defined mostly by via negativa, by pointing out what goes against integrity, that is, through the identification of its infringements, which has also not facilitated the elaboration of an overarching and consensual code of scientific integrity. In this context, it is deemed necessary to clarify the notion of "integrity", first etymologically, recovering the original meaning of the term, and then in a specifically conceptual way, through the identification of the various meanings with which the term can be legitimately used, particularly in the domain of scientific research and innovation. These two steps are fundamental and indispensable for a forthcoming attempt at systematizing the requirements of "scientific integrity".
NASA Astrophysics Data System (ADS)
Xu, Jinyang; El Mansori, Mohamed
2016-10-01
This paper studied the machinability of hybrid CFRP/Ti stacks via a numerical approach. To this aim, an original FE model consisting of three fundamental physical constituents, i.e., the CFRP phase, the interface, and the Ti phase, was established in the Abaqus/Explicit code to capture the machining behavior of the composite-to-metal stack. The CFRP phase was modeled as an equivalent homogeneous material (EHM) by considering its anisotropic behavior relative to the fiber orientation (θ), while the Ti alloy phase was assumed to exhibit isotropic elastic-plastic behavior. The interface linking the CFRP-to-Ti contact boundary was physically modeled as an intermediate transition region using the concept of a cohesive zone (CZ). Different constitutive laws and damage criteria were implemented to simulate the chip separation process of the bi-material system. The key cutting responses, including specific cutting energy consumption, induced subsurface damage, and interface delamination, were precisely addressed via comprehensive FE analyses, and several key conclusions were drawn from this study.
Method and computer program product for maintenance and modernization backlogging
Mattimore, Bernard G; Reynolds, Paul E; Farrell, Jill M
2013-02-19
According to one embodiment, a computer program product for determining future facility conditions includes a computer readable medium having computer readable program code stored therein. The computer readable program code includes computer readable program code for calculating a time period specific maintenance cost, for calculating a time period specific modernization factor, and for calculating a time period specific backlog factor. Future facility conditions equal the time period specific maintenance cost plus the time period specific modernization factor plus the time period specific backlog factor. In another embodiment, a computer-implemented method for calculating future facility conditions includes calculating a time period specific maintenance cost, calculating a time period specific modernization factor, and calculating a time period specific backlog factor. Future facility conditions equal the time period specific maintenance cost plus the time period specific modernization factor plus the time period specific backlog factor. Other embodiments are also presented.
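A minimal sketch of the claimed calculation; the three inputs are taken as given, since the patent abstract does not specify how each term is derived:

```python
def future_facility_conditions(maintenance_cost: float,
                               modernization_factor: float,
                               backlog_factor: float) -> float:
    """Sum of the three time-period-specific terms, as stated in the abstract."""
    return maintenance_cost + modernization_factor + backlog_factor

# Illustrative numbers only
print(future_facility_conditions(120_000.0, 35_000.0, 18_500.0))  # 173500.0
```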
NASA. Lewis Research Center Advanced Modulation and Coding Project: Introduction and overview
NASA Technical Reports Server (NTRS)
Budinger, James M.
1992-01-01
The Advanced Modulation and Coding Project at LeRC is sponsored by the Office of Space Science and Applications, Communications Division, Code EC, at NASA Headquarters and conducted by the Digital Systems Technology Branch of the Space Electronics Division. Advanced Modulation and Coding is one of three focused technology development projects within the branch's overall Processing and Switching Program. The program consists of industry contracts for developing proof-of-concept (POC) and demonstration model hardware, university grants for analyzing advanced techniques, and in-house integration and testing of performance verification and systems evaluation. The Advanced Modulation and Coding Project is broken into five elements: (1) bandwidth- and power-efficient modems; (2) high-speed codecs; (3) digital modems; (4) multichannel demodulators; and (5) very high-data-rate modems. At least one contract and one grant were awarded for each element.
NASA Astrophysics Data System (ADS)
Nofriansyah, Dicky; Defit, Sarjon; Nurcahyo, Gunadi W.; Ganefri, G.; Ridwan, R.; Saleh Ahmar, Ansari; Rahim, Robbi
2018-01-01
Cybercrime is one of the most serious threats. One effort to reduce cybercrime is to find new techniques for securing data, such as a combination of cryptography, steganography, and watermarking. Cryptography and steganography are growing fields of data security, and combining them is one way to improve data integrity. New techniques are created by combining several algorithms; one such combination incorporates the Hill cipher method and Morse code. Morse code is one of the communication codes used in the Scouting field and consists of dots and dashes. This combination of classic and modern concepts is intended to maintain data integrity. The combination of these three methods is expected to yield new algorithms that improve data security, especially for images.
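The abstract does not spell out how the three methods are chained, so the sketch below shows only the two classical building blocks it names: a standard 2x2 Hill cipher step (ciphertext block = key matrix times plaintext block, mod 26), followed by a hypothetical Morse re-encoding of the ciphertext.

```python
import numpy as np

# Truncated Morse table, enough for this demo's output letters
MORSE = {"D": "-..", "E": ".", "G": "--.", "K": "-.-", "T": "-"}

def hill_encrypt(plaintext: str, key: np.ndarray) -> str:
    """Standard Hill cipher on A-Z: each 2-letter block c = K @ p (mod 26)."""
    nums = [ord(c) - ord("A") for c in plaintext.upper() if c.isalpha()]
    if len(nums) % 2:
        nums.append(ord("X") - ord("A"))       # pad to the block size
    out = []
    for i in range(0, len(nums), 2):
        out.extend((key @ np.array(nums[i:i + 2])) % 26)
    return "".join(chr(int(n) + ord("A")) for n in out)

key = np.array([[3, 3], [2, 5]])               # classic invertible-mod-26 key
cipher = hill_encrypt("ACE", key)              # -> "GKDT"
morse = " ".join(MORSE.get(c, "?") for c in cipher)  # "--. -.- -.. -"
```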
Identifying Pediatric Severe Sepsis and Septic Shock: Accuracy of Diagnosis Codes.
Balamuth, Fran; Weiss, Scott L; Hall, Matt; Neuman, Mark I; Scott, Halden; Brady, Patrick W; Paul, Raina; Farris, Reid W D; McClead, Richard; Centkowski, Sierra; Baumer-Mouradian, Shannon; Weiser, Jason; Hayes, Katie; Shah, Samir S; Alpern, Elizabeth R
2015-12-01
To evaluate the accuracy of 2 established administrative methods of identifying children with sepsis using a medical record review reference standard. Multicenter retrospective study at 6 US children's hospitals. Subjects were children >60 days to <19 years of age and identified in 4 groups based on International Classification of Diseases, Ninth Revision, Clinical Modification codes: (1) severe sepsis/septic shock (sepsis codes); (2) infection plus organ dysfunction (combination codes); (3) subjects without codes for infection, organ dysfunction, or severe sepsis; and (4) infection but not severe sepsis or organ dysfunction. Combination codes were allowed, but not required, within the sepsis codes group. We determined the presence of reference standard severe sepsis according to consensus criteria. Logistic regression was performed to determine whether the addition of codes for sepsis therapies improved case identification. A total of 130 out of 432 subjects met the reference standard for severe sepsis. Sepsis codes had sensitivity 73% (95% CI 70-86), specificity 92% (95% CI 87-95), and positive predictive value 79% (95% CI 70-86). Combination codes had sensitivity 15% (95% CI 9-22), specificity 71% (95% CI 65-76), and positive predictive value 18% (95% CI 11-27). Slight improvements in model characteristics were observed when codes for vasoactive medications and endotracheal intubation were added to sepsis codes (c-statistic 0.83 vs 0.87, P = .008). Sepsis-specific International Classification of Diseases, Ninth Revision, Clinical Modification codes identify pediatric patients with severe sepsis in administrative data more accurately than a combination of codes for infection plus organ dysfunction. Copyright © 2015 Elsevier Inc. All rights reserved.
Specific and Modular Binding Code for Cytosine Recognition in Pumilio/FBF (PUF) RNA-binding Domains
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dong, Shuyun; Wang, Yang; Cassidy-Amstutz, Caleb
2011-10-28
Pumilio/fem-3 mRNA-binding factor (PUF) proteins possess a recognition code for bases A, U, and G, allowing designed RNA sequence specificity of their modular Pumilio (PUM) repeats. However, recognition side chains in a PUM repeat for cytosine are unknown. Here we report identification of a cytosine-recognition code by screening random amino acid combinations at conserved RNA recognition positions using a yeast three-hybrid system. This C-recognition code is specific and modular as specificity can be transferred to different positions in the RNA recognition sequence. A crystal structure of a modified PUF domain reveals specific contacts between an arginine side chain and the cytosine base. We applied the C-recognition code to design PUF domains that recognize targets with multiple cytosines and to generate engineered splicing factors that modulate alternative splicing. Finally, we identified a divergent yeast PUF protein, Nop9p, that may recognize natural target RNAs with cytosine. This work deepens our understanding of natural PUF protein target recognition and expands the ability to engineer PUF domains to recognize any RNA sequence.
Low Cost Large Core Vehicle Structures Assessment
NASA Technical Reports Server (NTRS)
Hahn, Steven E.
1998-01-01
Boeing Information, Space, and Defense Systems executed a Low Cost Large Core Vehicle Structures Assessment (LCLCVSA) under contract to NASA Marshall Space Flight Center (MSFC) between November 1997 and March 1998. NASA is interested in a low-cost launch vehicle, code named Magnum, to place heavy payloads into low earth orbit for missions such as a manned mission to Mars, a Next Generation Space Telescope, a lunar-based telescope, the Air Force's proposed space based laser, and large commercial satellites. In this study, structural concepts with the potential to reduce fabrication costs were evaluated in application to the Magnum Launch Vehicle (MLV) and the Liquid Fly Back Booster (LFBB) shuttle upgrade program. Seventeen concepts were qualitatively evaluated to select four concepts for more in-depth study. The four structural concepts selected were: an aluminum-lithium monocoque structure, an aluminum-lithium machined isogrid structure, a unitized composite sandwich structure, and a unitized composite grid structure. These were compared against a baseline concept based on the Space Shuttle External Tank (ET) construction. It was found that unitized composite structures offer significant cost and weight benefits to MLV structures. The limited study of application to LFBB structures indicated lower, but still significant benefits. Technology and facilities development roadmaps to prepare the approaches studied for application to MLV and LFBB were constructed. It was found that the cost and schedule to develop these approaches were in line with both MLV and LFBB development schedules. Current Government and Boeing programs which address elements of the development of the technologies identified are underway. It is recommended that NASA devote resources in a timely fashion to address the specific elements related to MLV and LFBB structures.
Moyson, T; Roeyers, H
2012-01-01
The concept of family quality of life is becoming increasingly important in family support programmes. This concept describes the quality of life of all family members and the family system as a whole, but only the opinion of the parents has been included. The opinion of the siblings has been incorporated in the opinions of the parents, although research has shown that there is discordance between parents' and siblings' reports. The principal goal of this study is to investigate how young siblings of children with intellectual disability define their quality of life as a sibling. As we were more concerned with understanding the experience of being a sibling from the siblings' own frame of reference, we opted for a qualitative research design and more specifically used in-depth, phenomenology-based interviews. Data were sorted through a process of constant comparison of codes, according to the principles of grounded theory. Siblings described the following nine domains as domains of sibling quality of life: joint activities, mutual understanding, private time, acceptance, forbearance, trust in well-being, exchanging experiences, social support and dealing with the outside world. This study shows not only that siblings can define their quality of life, but also that this definition of sibling quality of life differs from the family quality of life concept. Therefore, it may be not only a valuable addition to the family quality of life concept but also an appropriate concept to describe siblings' experience. © 2011 The Authors. Journal of Intellectual Disability Research © 2011 Blackwell Publishing Ltd.
Adapting a Clinical Data Repository to ICD-10-CM through the use of a Terminology Repository
Cimino, James J.; Remennick, Lyubov
2014-01-01
Clinical data repositories frequently contain patient diagnoses coded with the International Classification of Diseases, Ninth Revision (ICD-9-CM). These repositories now need to accommodate data coded with the Tenth Revision (ICD-10-CM). Database users wish to retrieve relevant data regardless of the system by which they are coded. We demonstrate how a terminology repository (the Research Entities Dictionary or RED) serves as an ontology relating terms of both ICD versions to each other to support seamless version-independent retrieval from the Biomedical Translational Research Information System (BTRIS) at the National Institutes of Health. We make use of the Centers for Medicare and Medicaid Services’ General Equivalence Mappings (GEMs) to reduce the modeling effort required to determine whether ICD-10-CM terms should be added to the RED as new concepts or as synonyms of existing concepts. A divide-and-conquer approach is used to develop integration heuristics that offer a satisfactory interim solution and facilitate additional refinement of the integration as time and resources allow. PMID:25954344
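A minimal sketch of a GEMs-driven heuristic of the kind the paper describes; the mapping entries and the decision rule are illustrative assumptions, not the authors' actual integration rules.

```python
# Hypothetical GEMs-style forward mapping: ICD-10-CM code -> candidate ICD-9-CM codes
gems_i10_to_i9 = {
    "E11.9": ["250.00"],               # single target
    "S52.501A": ["813.41", "813.42"],  # multiple candidate targets
}

def classify_icd10_term(i10_code: str, red_has_concept) -> str:
    """A one-to-one GEMs map whose ICD-9 target is already a RED concept
    suggests adding the ICD-10 term as a synonym; anything else is queued
    for review as a possible new concept."""
    targets = gems_i10_to_i9.get(i10_code, [])
    if len(targets) == 1 and red_has_concept(targets[0]):
        return "synonym-of-existing-concept"
    return "candidate-new-concept"

print(classify_icd10_term("E11.9", red_has_concept=lambda c: True))
```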
Optical information encryption based on incoherent superposition with the help of the QR code
NASA Astrophysics Data System (ADS)
Qin, Yi; Gong, Qiong
2014-01-01
In this paper, a novel optical information encryption approach is proposed with the help of the QR code. The method is based on the concept of incoherent superposition, which we introduce for the first time. The information to be encrypted is first transformed into the corresponding QR code, and thereafter the QR code is further encrypted into two phase-only masks analytically, by use of the intensity superposition of two diffraction wave fields. The proposed method has several advantages over the previous interference-based method, such as a higher security level, greater robustness against noise attacks, and more relaxed working conditions. Numerical simulation results and actual smartphone-collected results are shown to validate our proposal.
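As a toy numeric illustration of incoherent superposition only: the paper derives the two phase-only masks analytically, whereas this sketch skips the optics entirely and merely shows that two non-negative intensity patterns add back to the target pattern.

```python
import numpy as np

rng = np.random.default_rng(1)
qr = rng.integers(0, 2, size=(8, 8)).astype(float)  # stand-in for a QR bitmap

# Split the target intensity into two non-negative intensity shares
share1 = rng.uniform(0, 1, qr.shape) * qr
share2 = qr - share1

# Incoherent superposition: intensities (not complex fields) simply add
recovered = share1 + share2
assert np.allclose(recovered, qr)
```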
Djordjevic, Ivan B
2007-08-06
We describe a coded power-efficient transmission scheme based on the repetition MIMO principle suitable for communication over the atmospheric turbulence channel, and determine its channel capacity. The proposed scheme employs Q-ary pulse-position modulation. We further study how to approach the channel capacity limits using low-density parity-check (LDPC) codes. Component LDPC codes are designed using the concept of pairwise-balanced designs. Contrary to several recent publications, bit-error rates and channel capacities are reported assuming non-ideal photodetection. The atmospheric turbulence channel is modeled using the Gamma-Gamma distribution function of Al-Habash et al. Excellent bit-error rate performance improvement over the uncoded case is found.
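The Gamma-Gamma irradiance model referenced above is commonly sampled as the product of two independent unit-mean Gamma variates representing large- and small-scale turbulent eddies. A short sketch, with alpha/beta values that are illustrative rather than taken from the paper:

```python
import numpy as np

def gamma_gamma_irradiance(alpha: float, beta: float, n: int, seed: int = 0):
    """Unit-mean Gamma-Gamma irradiance as a product of two Gamma variates."""
    rng = np.random.default_rng(seed)
    x = rng.gamma(shape=alpha, scale=1.0 / alpha, size=n)  # large-scale eddies
    y = rng.gamma(shape=beta, scale=1.0 / beta, size=n)    # small-scale eddies
    return x * y

samples = gamma_gamma_irradiance(alpha=4.0, beta=1.9, n=100_000)
print(samples.mean())  # close to 1.0 by construction
```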
2011-01-01
Background The identification of patients who pose an epidemic hazard when they are admitted to a health facility plays a role in preventing the risk of hospital acquired infection. An automated clinical decision support system to detect suspected cases, based on the principle of syndromic surveillance, is being developed at the University of Lyon's Hôpital de la Croix-Rousse. This tool will analyse structured data and narrative reports from computerized emergency department (ED) medical records. The first step consists of developing an application (UrgIndex) which automatically extracts and encodes information found in narrative reports. The purpose of the present article is to describe and evaluate this natural language processing system. Methods Narrative reports have to be pre-processed before utilizing the French-language medical multi-terminology indexer (ECMT) for standardized encoding. UrgIndex identifies and excludes syntagmas containing a negation and replaces non-standard terms (abbreviations, acronyms, spelling errors...). Then, the phrases are sent to the ECMT through an Internet connection. The indexer's reply, based on Extensible Markup Language, returns codes and literals corresponding to the concepts found in phrases. UrgIndex filters codes corresponding to suspected infections. Recall is defined as the number of relevant processed medical concepts divided by the number of concepts evaluated (coded manually by the medical epidemiologist). Precision is defined as the number of relevant processed concepts divided by the number of concepts proposed by UrgIndex. Recall and precision were assessed for respiratory and cutaneous syndromes. Results Evaluation of 1,674 processed medical concepts contained in 100 ED medical records (50 for respiratory syndromes and 50 for cutaneous syndromes) showed an overall recall of 85.8% (95% CI: 84.1-87.3). Recall varied from 84.5% for respiratory syndromes to 87.0% for cutaneous syndromes. The most frequent cause of lack of processing was non-recognition of the term by UrgIndex (9.7%). Overall precision was 79.1% (95% CI: 77.3-80.8). It varied from 81.4% for respiratory syndromes to 77.0% for cutaneous syndromes. Conclusions This study demonstrates the feasibility of and interest in developing an automated method for extracting and encoding medical concepts from ED narrative reports, the first step required for the detection of potentially infectious patients at epidemic risk. PMID:21798029
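The recall and precision definitions above translate directly into code. In this sketch the 1,674 evaluated concepts come from the abstract, while the other counts are back-computed from the reported 85.8% recall and 79.1% precision and are therefore only illustrative.

```python
def recall_precision(n_relevant_processed: int,
                     n_evaluated: int,
                     n_proposed: int) -> tuple:
    """Recall and precision exactly as defined in the abstract."""
    return (n_relevant_processed / n_evaluated,
            n_relevant_processed / n_proposed)

recall, precision = recall_precision(1436, 1674, 1816)
print(f"recall={recall:.1%}, precision={precision:.1%}")  # ~85.8%, ~79.1%
```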
Researcher Perceptions of Ethical Guidelines and Codes of Conduct
Giorgini, Vincent; Mecca, Jensen T.; Gibson, Carter; Medeiros, Kelsey; Mumford, Michael D.; Connelly, Shane; Devenport, Lynn D.
2014-01-01
Ethical codes of conduct exist in almost every profession. Field-specific codes of conduct have been around for decades, each articulating specific ethical and professional guidelines. However, there has been little empirical research on researchers’ perceptions of these codes of conduct. In the present study, we interviewed faculty members in six research disciplines and identified five themes bearing on the circumstances under which they use ethical guidelines and the underlying reasons for not adhering to such guidelines. We then identify problems with the manner in which codes of conduct in academia are constructed and offer solutions for overcoming these problems. PMID:25635845
From 2D to 3D modelling in long term tectonics: Modelling challenges and HPC solutions (Invited)
NASA Astrophysics Data System (ADS)
Le Pourhiet, L.; May, D.
2013-12-01
Over the last decades, 3D thermo-mechanical codes have been made available to the long-term tectonics community, either as open source (Underworld, Gale) or with more limited access (Fantom, Elvis3D, Douar, LaMem, etc.). However, to date, few published results using these codes have included the coupling between crustal and lithospheric dynamics at large strain. The fact that these computations are computationally expensive is not the primary reason for the relatively slow development of 3D modeling in the long-term tectonics community, as compared to the rapid development observed within the mantle dynamics community or in the short-term tectonics field. Long-term tectonics problems have specific issues not found in either of these two fields, including large strain (not an issue for short-term tectonics), the inclusion of a free surface, and the occurrence of large viscosity contrasts. The first issue is typically eliminated by using a combined marker-ALE method instead of a fully Lagrangian method; however, the marker-ALE approach can pose some algorithmic challenges in a massively parallel environment. The last two issues are more problematic because they affect the convergence of the linear/non-linear solver and the memory cost. Two options have been tested so far: using low-order elements and solving with a sparse direct solver, or using higher-order stable elements together with a multigrid solver. The first option is simpler to code and to use but reaches its limit at around 80^3 low-order elements. The second option requires more operations but allows the use of iterative solvers on extremely large computers. In this presentation, I will describe the design philosophy and highlight results obtained using a code of the second class. The presentation will be oriented from an end-user point of view, using an application from 3D continental break-up to illustrate key concepts. The description will proceed point by point, from implementing physics into the code to dealing with specific issues related to solving the discrete system of non-linear equations.
Plant Habitat Telemetry / Command Interface and E-MIST
NASA Technical Reports Server (NTRS)
Walker, Uriae M.
2013-01-01
Plant Habitat (PH) is an experiment to be taken to the International Space Station (ISS) in 2016. It is critical that ground support computers have the ability to uplink commands to control PH, and that ISS computers have the ability to downlink PH telemetry data to ground support. This necessitates communication software that can send, receive, and process PH-specific commands and telemetry. The objective of the Plant Habitat Telemetry/Command Interface is to provide this communication software and to couple it with an intuitive Graphical User Interface (GUI). Initial investigation of the project objective led to the decision that the code be written in C++ because of its compatibility with existing source code infrastructures and its robustness. Further investigation led to the determination that multiple Ethernet packet structures would need to be created to transmit data effectively. Setting a standard for packet structures allows these packets to be distinguished, ranging from command-type packets to subcategories of telemetry packets. In order to handle this range of packet types, we took an object-oriented programming approach, which complemented our decision to use the C++ programming language. In addition, extensive use of port programming concepts was required to implement the core functionality of the communication software. A concrete understanding of packet processing software was also required in order to put all the components of ISS-to-Ground Support Equipment (GSE) communication together and complete the objective. A second project discussed in this paper is Exposing Microbes to the Stratosphere (E-MIST). This project exposes microbes to the stratosphere to observe how they are affected by atmospheric conditions. This paper focuses on the electrical and software aspects of the project, specifically drafting the printed circuit board and programming the on-board sensors. The Eagle Computer-Aided Drafting (CAD) software was used to draft the E-MIST circuit, which required several component libraries to be created. Coding the sensors and obtaining sensor data involved using the Arduino Uno development board and coding language, and properly wiring peripheral sensors to the microcontroller (the central control unit of the experiment).
Towards a Consolidated Approach for the Assessment of Evaluation Models of Nuclear Power Reactors
Epiney, A.; Canepa, S.; Zerkak, O.; ...
2016-11-02
The STARS project at the Paul Scherrer Institut (PSI) has adopted the TRACE thermal-hydraulic (T-H) code for best-estimate system transient simulations of the Swiss Light Water Reactors (LWRs). For analyses involving interactions between system and core, a coupling of TRACE with the SIMULATE-3K (S3K) LWR core simulator has also been developed. In this configuration, the TRACE code and associated nuclear power reactor simulation models play a central role to achieve a comprehensive safety analysis capability. Thus, efforts have now been undertaken to consolidate the validation strategy by implementing a more rigorous and structured assessment approach for TRACE applications involving either only system T-H evaluations or requiring interfaces to e.g. detailed core or fuel behavior models. The first part of this paper presents the preliminary concepts of this validation strategy. The principle is to systematically track the evolution of a given set of predicted physical Quantities of Interest (QoIs) over a multidimensional parametric space where each of the dimensions represents the evolution of specific analysis aspects, including e.g. code version, transient-specific simulation methodology and model "nodalisation". If properly set up, such an environment should provide code developers and code users with persistent (less affected by user effect) and quantified information (sensitivity of QoIs) on the applicability of a simulation scheme (codes, input models, methodology) for steady state and transient analysis of full LWR systems. Through this, for each given transient/accident, critical paths of the validation process can be identified that could then translate into defining reference schemes to be applied for downstream predictive simulations. In order to illustrate this approach, the second part of this paper presents a first application of this validation strategy to an inadvertent blowdown event that occurred in a Swiss BWR/6. The transient was initiated by the spurious actuation of the Automatic Depressurization System (ADS). The validation approach progresses through a number of dimensions here: First, the same BWR system simulation model is assessed for different versions of the TRACE code, up to the most recent one. The second dimension is the "nodalisation" dimension, where changes to the input model are assessed. The third dimension is the "methodology" dimension; in this case imposed power and an updated TRACE core model are investigated. For each step in each validation dimension, a common set of QoIs is investigated. For the steady-state results, these include fuel temperature distributions. For the transient part of the present study, the evaluated QoIs include the system pressure evolution and water carry-over into the steam line.
Tang, G.; Andre, B.; Hoffman, F. M.; Painter, S. L.; Thornton, P. E.; Yuan, F.; Bisht, G.; Hammond, G. E.; Lichtner, P. C.; Kumar, J.; Mills, R. T.; Xu, X.
2016-04-19
This Modeling Archive is in support of an NGEE Arctic discussion paper under review and available at doi:10.5194/gmd-9-927-2016. The purpose is to document the simulations to allow verification, reproducibility, and follow-up studies. This dataset contains shell scripts to create the CLM-PFLOTRAN cases, specific input files for PFLOTRAN and CLM, outputs, and Python scripts to make the figures using the outputs in the publication. Through these results, we demonstrate that CLM-PFLOTRAN can approximately reproduce CLM results in selected cases for Arctic, temperate, and tropical sites. In addition, the new framework facilitates mechanistic representations of soil biogeochemistry processes in the land surface model.
NASA Technical Reports Server (NTRS)
Metcalf, David
1995-01-01
Multimedia Information eXchange (MIX) is a multimedia information system that accommodates multiple data types and provides consistency across platforms. Information from all over the world can be accessed quickly and efficiently with the Internet-based system. I-NET's MIX uses the World Wide Web and Mosaic graphical user interface. Mosaic is available on all platforms used at I-NET's Kennedy Space Center (KSC) facilities. Key information system design concepts and benefits are reviewed. The MIX system also defines specific configuration and helper application parameters to ensure consistent operations across the entire organization. Guidelines and procedures for other areas of importance in information systems design are also addressed. Areas include: code of ethics, content, copyright, security, system administration, and support.
Pinto, R M; Rahman, R; Williams, A
2014-12-01
There is limited knowledge on re-entry initiatives for formerly incarcerated women, specifically on building women's advocacy and leadership skills. Our research presents an empowerment evaluation of ReConnect, a 12-session, innovative advocacy and leadership development program rooted in an integrated framework of empowerment and transformational leadership theories. Using thematic analysis, we coded three focus groups with 24 graduates for themes that matched our framework's concepts. ReConnect graduates reported being empowered by the information they received on parental rights, housing, and employment. Participants agreed that ReConnect improved their communication skills, preparing them to advocate for themselves and community members. Copyright © 2014 Elsevier Ltd. All rights reserved.
Aminoacyl-tRNA synthetases: versatile players in the changing theater of translation.
Francklyn, Christopher; Perona, John J; Puetz, Joern; Hou, Ya-Ming
2002-01-01
Aminoacyl-tRNA synthetases attach amino acids to the 3' termini of cognate tRNAs to establish the specificity of protein synthesis. A recent Asilomar conference (California, January 13-18, 2002) discussed new research into the structure-function relationship of these crucial enzymes, as well as a multitude of novel functions, including participation in amino acid biosynthesis, cell cycle control, RNA splicing, and export of tRNAs from nucleus to cytoplasm in eukaryotic cells. Together with the discovery of their role in the cellular synthesis of proteins to incorporate selenocysteine and pyrrolysine, these diverse functions of aminoacyl-tRNA synthetases underscore the flexibility and adaptability of these ancient enzymes and stimulate the development of new concepts and methods for expanding the genetic code. PMID:12458790
NASA Lewis Research Center Workshop on Forced Response in Turbomachinery
NASA Technical Reports Server (NTRS)
Stefko, George L. (Compiler); Murthy, Durbha V. (Compiler); Morel, Michael (Compiler); Hoyniak, Dan (Compiler); Gauntner, Jim W. (Compiler)
1994-01-01
A summary of the NASA Lewis Research Center (LeRC) Workshop on Forced Response in Turbomachinery, held in August 1993, is presented. It was sponsored by the following NASA organizations: the Structures, Space Propulsion Technology, and Propulsion Systems Divisions of NASA LeRC, and the Aeronautics and Advanced Concepts & Technology Offices of NASA Headquarters. In addition, the workshop was held in conjunction with the GUIde (Government/Industry/Universities) Consortium on Forced Response. The workshop was specifically designed to receive suggestions and comments from industry on current research at NASA LeRC in the area of forced vibratory response of turbomachinery blades, which includes both computational and experimental approaches. There were eight presentations and a code demonstration. Major areas of research included aeroelastic response, steady and unsteady fluid dynamics, mistuning, and corresponding experimental work.
NASA Technical Reports Server (NTRS)
Cross, James H., II; Morrison, Kelly I.; May, Charles H., Jr.; Waddel, Kathryn C.
1989-01-01
The first phase of a three-phase effort to develop a new graphically oriented specification language which will facilitate the reverse engineering of Ada source code into graphical representations (GRs) as well as the automatic generation of Ada source code is described. A simplified view of the three phases of Graphical Representations for Algorithms, Structure, and Processes for Ada (GRASP/Ada) with respect to three basic classes of GRs is presented. Phase 1 concentrated on the derivation of an algorithmic diagram, the control structure diagram (CSD) (CRO88a) from Ada source code or Ada PDL. Phase 2 includes the generation of architectural and system level diagrams such as structure charts and data flow diagrams and should result in a requirements specification for a graphically oriented language able to support automatic code generation. Phase 3 will concentrate on the development of a prototype to demonstrate the feasibility of this new specification language.
CONFERENCE REPORT: Summary of the 8th IAEA Technical Meeting on Fusion Power Plant Safety
NASA Astrophysics Data System (ADS)
Girard, J. Ph.; Gulden, W.; Kolbasov, B.; Louzeiro-Malaquias, A.-J.; Petti, D.; Rodriguez-Rodrigo, L.
2008-01-01
Reports were presented covering a selection of topics on the safety of fusion power plants. These included a review of licensing studies developed for ITER site preparation, surveying common and non-common (i.e. site-dependent) issues as lessons for a broader approach to fusion power plant safety. Several fusion power plant models, spanning from accessible technology to concepts based on more advanced materials, were discussed. On topics related to fusion-specific technology, safety studies were reported on different concepts of breeding blanket modules, tritium handling, and auxiliary systems under normal and accident scenarios. The testing of power-plant-relevant technology in ITER was also assessed in terms of normal operation and accident scenarios, and occupational doses and radioactive releases under these tests were determined. Other safety issues specific to fusion were also discussed, such as the availability and reliability of fusion power plants, dust and tritium inventories, and component failure databases. This study reveals that the environmental impact of fusion power plants can be minimized through a proper selection of low-activation materials and the use of recycling technology, helping to reduce waste volume and potentially opening the route for its reutilization in the nuclear sector or even its clearance into the commercial circuit. Computational codes for fusion safety were presented in support of the many studies reported. The ongoing work on establishing validation approaches aimed at improving the prediction capability of fusion codes was supported by experimental results, and new directions for development were identified. Fusion standards are not available, and fission experience is mostly used as the framework basis for licensing and target design for safe operation and for occupational and environmental constraints. It has been argued that fusion can benefit if a fusion-specific approach is implemented, in particular for materials selection, which will have a large impact on waste disposal and recycling, and in setting realistic limits on radiation releases if indexed to the real impact on individuals and the environment, given the differences between the types of radiation emitted by tritium and those of fission products. Round table sessions resulted in some common recommendations. The discussions also created awareness of the need for a larger involvement of the IAEA in support of the development of fusion safety standards.
Geometric Processing and Its Relational Graphics
1976-10-01
20, If different from Report) f3. SUPPLEMENTARY NOTES 9. KEY WORDS (Cbnttnue on reverse aide if neceaaary .mdldentlfy by bfock number) Graphics GIFT ...are typified by defining an object as a series of adjacent triangular or rectangular patches or surfaces (ruled surfaces may also be used). The GIFT ...code embodies the Patch code concept in one of its solids, the ARS; however, processing of a many-faceted GIFT solid takes longer to process than its
High Performance Object-Oriented Scientific Programming in Fortran 90
NASA Technical Reports Server (NTRS)
Norton, Charles D.; Decyk, Viktor K.; Szymanski, Boleslaw K.
1997-01-01
We illustrate how Fortran 90 supports object-oriented concepts by the example of plasma particle computations on the IBM SP. Our experience shows that Fortran 90 and object-oriented methodology give high performance while providing a bridge from Fortran 77 legacy codes to modern programming principles. All of our object-oriented Fortran 90 codes execute more quickly than the equivalent C++ versions, yet the abstraction modelling capabilities used for scientific programming are comparably powerful.
Thin family: a new barcode concept
NASA Astrophysics Data System (ADS)
Allais, David C.
1991-02-01
This paper describes a new space-efficient family of thin bar code symbologies which are appropriate for representing small amounts of information. The proposed structure is 30 to 50 percent more compact than the narrowest existing bar code when 12 or fewer bits of information are to be encoded in each symbol. Potential applications for these symbologies include menus, catalogs, automated test and survey scoring, and biological research such as the tracking of honey bees.
Coding Gains for Rank Decoding
1990-02-01
Approved for public release; distribution unlimited. U.S. Army Laboratory Command, Ballistic Research Laboratory, Aberdeen Proving Ground, Maryland. Contents: 1. Soft Decision Concepts; 2. Coding Gain; …
2008-12-01
multiconductor transmission line theory. The per-unit capacitance, inductance, and characteristic impedance matrices generated from the companion LAPLACE...code based on the Method of Moments application, by meshing different sections of the multiconductor cable for capacitance and inductance matrices [21...conductors held together in four pairs and resided in the cable jacket. Each of the eight conductors was also designed with the per-unit-length resistance
Spotted star mapping by light curve inversion: Tests and application to HD 12545
NASA Astrophysics Data System (ADS)
Kolbin, A. I.; Shimansky, V. V.
2013-06-01
A code for mapping the surfaces of spotted stars is developed. The concept of the code is to analyze rotationally modulated light curves. We simulate the reconstruction of the stellar surface and present the results of the simulation. The reconstruction artifacts caused by the ill-posed nature of the problem are identified. The surface of the spotted component of the system HD 12545 is mapped using this procedure.
Juul, Malene; Bertl, Johanna; Guo, Qianyun; Nielsen, Morten Muhlig; Świtnicki, Michał; Hornshøj, Henrik; Madsen, Tobias; Hobolth, Asger; Pedersen, Jakob Skou
2017-01-01
Non-coding mutations may drive cancer development. Statistical detection of non-coding driver regions is challenged by a varying mutation rate and uncertainty of functional impact. Here, we develop a statistically founded non-coding driver-detection method, ncdDetect, which includes sample-specific mutational signatures, long-range mutation rate variation, and position-specific impact measures. Using ncdDetect, we screened non-coding regulatory regions of protein-coding genes across a pan-cancer set of whole-genomes (n = 505), which top-ranked known drivers and identified new candidates. For individual candidates, presence of non-coding mutations associates with altered expression or decreased patient survival across an independent pan-cancer sample set (n = 5454). This includes an antigen-presenting gene (CD1A), where 5’UTR mutations correlate significantly with decreased survival in melanoma. Additionally, mutations in a base-excision-repair gene (SMUG1) correlate with a C-to-T mutational-signature. Overall, we find that a rich model of mutational heterogeneity facilitates non-coding driver identification and integrative analysis points to candidates of potential clinical relevance. DOI: http://dx.doi.org/10.7554/eLife.21778.001 PMID:28362259
Improving the sensitivity and specificity of the abbreviated injury scale coding system.
Kramer, C F; Barancik, J I; Thode, H C
1990-01-01
The Abbreviated Injury Scale with Epidemiologic Modifications (AIS 85-EM) was developed to make it possible to code information about anatomic injury types and locations that, although generally available from medical records, is not codable under the standard Abbreviated Injury Scale, published by the American Association for Automotive Medicine in 1985 (AIS 85). In a population-based sample of 3,223 motor vehicle trauma cases, 68 percent of the patients had one or more injuries that were coded to the body-region-nonspecific AIS 85 category "external". When the same patients' injuries were coded using the AIS 85-EM coding procedure, only 15 percent of the patients had injuries that could not be coded to a specific body region. With AIS 85-EM, the proportion of codable head injury cases increased from 16 percent to 37 percent, thereby improving the potential for identifying cases with head and threshold brain injury. The data suggest that body region coding of all injuries is necessary to draw valid and reliable conclusions about changes in injury patterns and their sequelae. The increased specificity of body region coding improves assessments of the efficacy of injury intervention strategies and countermeasure programs using epidemiologic methodology. PMID:2116633
Clinical wisdom: the essential foundation of "good" nursing care.
Haggerty, Lois A; Grace, Pamela
2008-01-01
Clinical wisdom, an essential foundation of nursing care that provides for the "good" of individual patients while taking into account the common good, is a concept that is difficult to define and comprehend. However, understanding what constitutes clinical wisdom is essential for the education of the types of nurses who are most likely to provide leadership that is consistent with the goals of nursing as outlined in the 2005 Code of Ethics for Nurses of the International Council of Nurses and the 2001 Code of Ethics for Nurses With Interpretive Statements of the American Nurses Association. The three key elements of wisdom, derived from the psychology and philosophy literature, are (1) balancing and providing for the good of another and the common good, (2) the use of intellect and affect in problem solving, and (3) the demonstration of experience-based tacit knowing in problematic situations. We conceptualized clinical wisdom as a more specific variant of general wisdom by examining how the core elements described can be linked to wisdom for nursing practice. In doing so, the nature of clinical wisdom is clarified and strategies are suggested to assist nurse educators in developing wise nurses.
Baker, Robert
2009-01-01
Although bioethics societies are developing standards for clinical ethicists and a code of ethics, they have been castigated in this journal as "a moral, if not an ethics, disaster" for not having completed this task. Compared with the development of codes of ethics and educational standards in law and medicine, however, the pace of professionalization in bioethics appears appropriate. Assessed by this metric, none of the charges leveled against bioethics are justified. The specific charges leveled against the American Society for Bioethics and Humanities (ASBH) and its Core Competencies report are analyzed and rejected as artifacts of an ahistoric conception of the stages by which organizations professionalize. For example, the charge that the ASBH should provide definitive criteria for what counts as "medical ethics consultation" antecedent to further progress towards professionalization is assessed by comparing it with the American Medical Association's decades-long struggle to define who can legitimately claim the title "medical doctor." Historically, clarity about who is legitimately a doctor, a lawyer - or a "clinical ethicist"- is a byproduct of, and never antecedent to, the decades-long process by which a field professionalizes. The charges leveled against ASBH thus appear to be a function of impatient, ahistoric perfectionism.
ERIC Educational Resources Information Center
Adkins, Megan; Wajciechowski, Misti R.; Scantling, Ed
2013-01-01
Quick response codes, better known as QR codes, are small barcodes scanned to receive information about a specific topic. This article explains QR code technology and the utility of QR codes in the delivery of physical education instruction. Consideration is given to how QR codes can be used to accommodate learners of varying ability levels as…
Summary of Pressure Gain Combustion Research at NASA
NASA Technical Reports Server (NTRS)
Perkins, H. Douglas; Paxson, Daniel E.
2018-01-01
NASA has undertaken a systematic exploration of many different facets of pressure gain combustion over the last 25 years in an effort to exploit the inherent thermodynamic advantage of pressure gain combustion over the constant pressure combustion process used in most aerospace propulsion systems. Applications as varied as small-scale UAV's, rotorcraft, subsonic transports, hypersonics and launch vehicles have been considered. In addition to studying pressure gain combustor concepts such as wave rotors, pulse detonation engines, pulsejets, and rotating detonation engines, NASA has studied inlets, nozzles, ejectors and turbines which must also process unsteady flow in an integrated propulsion system. Other design considerations such as acoustic signature, combustor material life and heat transfer that are unique to pressure gain combustors have also been addressed in NASA research projects. In addition to a wide range of experimental studies, a number of computer codes, from 0-D up through 3-D, have been developed or modified to specifically address the analysis of unsteady flow fields. Loss models have also been developed and incorporated into these codes that improve the accuracy of performance predictions and decrease computational time. These codes have been validated numerous times across a broad range of operating conditions, and it has been found that once validated for one particular pressure gain combustion configuration, these codes are readily adaptable to the others. All in all, the documentation of this work has encompassed approximately 170 NASA technical reports, conference papers and journal articles to date. These publications are very briefly summarized herein, providing a single point of reference for all of NASA's pressure gain combustion research efforts. This documentation does not include the significant contributions made by NASA research staff to the programs of other agencies, universities, industrial partners and professional society committees through serving as technical advisors, technical reviewers and research consultants.
NASA Technical Reports Server (NTRS)
Hinds, Erold W. (Principal Investigator)
1996-01-01
This report describes the progress made towards the completion of a specific task on error-correcting coding. The proposed research consisted of investigating the use of modulation block codes as the inner code of a concatenated coding system in order to improve the overall space link communications performance. The study proposed to identify and analyze candidate codes that will complement the performance of the overall coding system which uses the interleaved RS (255,223) code as the outer code.
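A minimal sketch of such a concatenated scheme, with an RS(255,223) outer code and a trivial rate-1/3 repetition code standing in for the inner modulation block code. Interleaving is omitted, and the third-party reedsolo package is an assumption (any RS(255,223) codec would do); recent versions of its decode return a tuple.

```python
from reedsolo import RSCodec

rs = RSCodec(32)  # 32 parity bytes per block -> RS(255, 223)

def inner_encode(data: bytes) -> bytes:
    return bytes(b for b in data for _ in range(3))   # repeat each byte 3x

def inner_decode(data: bytes) -> bytes:
    out = bytearray()
    for i in range(0, len(data), 3):                  # majority vote per trio
        trio = data[i:i + 3]
        out.append(max(set(trio), key=trio.count))
    return bytes(out)

msg = b"space link test frame"
tx = inner_encode(bytes(rs.encode(msg)))              # outer RS, then inner code
decoded = rs.decode(inner_decode(tx))
rx = decoded[0] if isinstance(decoded, tuple) else decoded
assert bytes(rx) == msg
```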
May, Folasade P; Whitman, Cynthia B; Varlyguina, Ksenia; Bromley, Erica G; Spiegel, Brennan M R
2016-09-01
African Americans have the highest burden of colorectal cancer (CRC) in the United States of America (USA) yet lower CRC screening rates than whites. Although poor screening has prompted efforts to increase screening uptake, there is a persistent need to develop public health interventions in partnership with the African American community. The aim of this study was to conduct focus groups with African Americans to determine preferences for the content and mode of dissemination of culturally tailored CRC screening interventions. In June 2013, 45-75-year-old African Americans were recruited through online advertisements and from an urban Veterans Affairs system to create four focus groups. A semi-structured interview script employing open-ended elicitation was used, and transcripts were analyzed using ATLAS.ti software to code and group data into a concept network. A total of 38 participants (mean age = 54) were enrolled, and 59 ATLAS.ti codes were generated. Commonly reported barriers to screening included perceived invasiveness of colonoscopy, fear of pain, and financial concerns. Facilitators included poor diet/health and desire to prevent CRC. Common sources of health information included media and medical providers. CRC screening information was commonly obtained from medical personnel or media. Participants suggested dissemination of CRC screening education through commercials, billboards, influential African American public figures, Internet, and radio. Participants suggested future interventions include culturally specific information, including details about increased risk, accessing care, and dispelling of myths. Public health interventions to improve CRC screening among African Americans should employ media outlets, emphasize increased risk among African Americans, and address race-specific barriers. Specific recommendations are presented for developing future interventions.
Zhu, Xun; Xie, Shangbo; Armengaud, Jean; Xie, Wen; Guo, Zhaojiang; Kang, Shi; Wu, Qingjun; Wang, Shaoli; Xia, Jixing; He, Rongjun; Zhang, Youjun
2016-01-01
The diamondback moth, Plutella xylostella (L.), is the major cosmopolitan pest of brassica and other cruciferous crops. Its larval midgut is a dynamic tissue that interfaces with a wide variety of toxicological and physiological processes. The draft sequence of the P. xylostella genome was recently released, but its annotation remains challenging because of the low sequence coverage of this branch of life and the poor description of exon/intron splicing rules for these insects. Peptide sequencing by computational assignment of tandem mass spectra to genome sequence information provides an independent experimental approach for confirming or refuting protein predictions, a concept that has been termed proteogenomics. In this study, we carried out an in-depth proteogenomic analysis to complement genome annotation of P. xylostella larval midgut based on shotgun HPLC-ESI-MS/MS data by means of a multialgorithm pipeline. A total of 876,341 tandem mass spectra were searched against the predicted P. xylostella protein sequences and a whole-genome six-frame translation database. Based on a data set comprising 2694 novel genome search specific peptides, we discovered 439 novel protein-coding genes and corrected 128 existing gene models. To get the most accurate data to seed further insect genome annotation, more than half of the novel protein-coding genes, i.e., 235 of 439, were further validated after RT-PCR amplification and sequencing of the corresponding transcripts. Furthermore, we validated 53 novel alternative splicings. Finally, a total of 6764 proteins were identified, resulting in one of the most comprehensive proteogenomic studies of a nonmodel animal. As the first tissue-specific proteogenomics analysis of P. xylostella, this study provides the fundamental basis for high-throughput proteomics and functional genomics approaches aimed at deciphering the molecular mechanisms of resistance and controlling this pest. PMID:26902207
Re-designing Orem's Self-care Theory for Patients with Chronic Hepatitis.
Hasanpour-Dehkordi, Ali; Mohammadi, Nooredin; Nikbakht-Nasrabadi, Alireza
2016-01-01
Hepatitis is an inflammatory disease which has many adverse effects on patients' lives because of its chronic nature. Since Orem's theory of self-care is a grounded theory, the concepts and applications of this theory in patients with chronic hepatitis, who have special needs, may lead to some challenges. The purpose of this study was to explore self-care in patients with chronic hepatitis. A directed content analysis was used in this qualitative study. Participants were recruited from a metropolitan area. Data were collected through semi-structured interviews. The verbatim transcripts of the participants' interviews were analyzed according to directed content analysis. In this study, four themes, suggested by Orem, were drawn from the data according to directed content analysis. The codes generated from the data were classified into concepts and then the concepts were assigned into these four themes. These themes were needs in the matrix of time and place, self-care agency, need for change in self-care and consequences of hepatitis. The use of Orem's self-care theory cannot meet the need for self-care in hepatitis patients because these patients have vital sexual, respect and belonging, physical, economic, and psychological-behavioral needs, and lack adequate knowledge about self-care. Consequently, the specific self-care model developed in this study helps health professionals identify self-care activities in patients with chronic hepatitis.
Standardized verification of fuel cycle modeling
Feng, B.; Dixon, B.; Sunny, E.; ...
2016-04-05
A nuclear fuel cycle systems modeling and code-to-code comparison effort was coordinated across multiple national laboratories to verify the tools needed to perform fuel cycle analyses of the transition from a once-through nuclear fuel cycle to a sustainable potential future fuel cycle. For this verification study, a simplified example transition scenario was developed to serve as a test case for the four systems codes involved (DYMOND, VISION, ORION, and MARKAL), each used by a different laboratory participant. In addition, all participants produced spreadsheet solutions for the test case to check all the mass flows and reactor/facility profiles on a year-by-year basis throughout the simulation period. The test case specifications describe a transition from the current US fleet of light water reactors to a future fleet of sodium-cooled fast reactors that continuously recycle transuranic elements as fuel. After several initial coordinated modeling and calculation attempts, it was revealed that most of the differences in code results were not due to different code algorithms or calculation approaches, but due to different interpretations of the input specifications among the analysts. Therefore, the specifications for the test case itself were iteratively updated to remove ambiguity and to help calibrate interpretations. In addition, a few corrections and modifications were made to the codes as well, which led to excellent agreement between all codes and spreadsheets for this test case. Although no fuel cycle transition analysis codes matched the spreadsheet results exactly, all remaining differences in the results were due to fundamental differences in code structure and/or were thoroughly explained. As a result, the specifications and example results are provided so that they can be used to verify additional codes in the future for such fuel cycle transition scenarios.
A Ceramic Fracture Model for High Velocity Impact
1993-05-01
employ damage concepts appear more relevant than crack growth models for this application. This research adopts existing fracture model concepts and...extends them through applications in an existing finite element continuum mechanics code (hydrocode) to the prediction of the damage and fracture processes...to be accurate in the lower velocity range of this work. Mescall and Tracy [15] investigated the selection of ceramic material for application in armors
Developing Trustworthy Commissioned Officers: Transcending the Honor Codes and Concept
2012-10-01
extracurricular activities). This developmental concept recognizes that individuals...tangible activities within the developmental programs at each SOC must be designed and implemented...develop simultaneously across and within all domains as they complete the activities
A study of concept-based similarity approaches for recommending program examples
NASA Astrophysics Data System (ADS)
Hosseini, Roya; Brusilovsky, Peter
2017-07-01
This paper investigates a range of concept-based example recommendation approaches that we developed to provide example-based problem-solving support in the domain of programming. The goal of these approaches is to offer students a set of most relevant remedial examples when they have trouble solving a code comprehension problem where students examine a program code to determine its output or the final value of a variable. In this paper, we use the ideas of semantic-level similarity-based linking developed in the area of intelligent hypertext to generate examples for the given problem. To determine the best-performing approach, we explored two groups of similarity approaches for selecting examples: non-structural approaches focusing on examples that are similar to the problem in terms of concept coverage and structural approaches focusing on examples that are similar to the problem by the structure of the content. We also explored the value of personalized example recommendation based on student's knowledge levels and learning goal of the exercise. The paper presents concept-based similarity approaches that we developed, explains the data collection studies and reports the result of comparative analysis. The results of our analysis showed better ranking performance of the personalized structural variant of cosine similarity approach.
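To make the non-structural variant concrete, here is a minimal sketch (assuming problems and examples are reduced to bags of concepts; the concept names are hypothetical, and this is not the authors' exact feature set):

from collections import Counter
from math import sqrt

def cosine_similarity(a: Counter, b: Counter) -> float:
    """Cosine similarity between two concept-count vectors."""
    dot = sum(a[c] * b[c] for c in a.keys() & b.keys())
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

problem = Counter({"for-loop": 2, "array": 1, "if-else": 1})
examples = {
    "ex1": Counter({"for-loop": 1, "array": 2}),
    "ex2": Counter({"recursion": 2, "if-else": 1}),
}
# Rank remedial examples by concept-coverage similarity to the problem.
ranked = sorted(examples, key=lambda e: cosine_similarity(problem, examples[e]), reverse=True)
print(ranked)  # ['ex1', 'ex2'] for this toy data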
Script, code, information: how to differentiate analogies in the "prehistory" of molecular biology.
Kogge, Werner
2012-01-01
The remarkable fact that twentieth-century molecular biology developed its conceptual system on the basis of sign-like terms has been the object of numerous studies and debates. Throughout these, the assumption is made that this vocabulary's emergence should be seen in the historical context of mathematical communication theory and cybernetics. This paper, in contrast, sets out the need for a more differentiated view: whereas the success of the terms "code" and "information" would probably be unthinkable outside that historical context, general semiotic and especially scriptural concepts arose far earlier in the "prehistory" of molecular biology, and in close association with biological research and phenomena. This distinction, established through a reconstruction of conceptual developments between 1870 and 1950, makes it possible to separate off a critique of the reductive implications of particular information-based concepts from the use of semiotic and scriptural concepts, which is fundamental to molecular biology. Gene-centrism and determinism are not implications of semiotic and scriptural analogies, but arose only when the vocabulary of information was superimposed upon them.
RB-ARD: A proof of concept rule-based abort
NASA Technical Reports Server (NTRS)
Smith, Richard; Marinuzzi, John
1987-01-01
The Abort Region Determinator (ARD) is a console program in the space shuttle mission control center. During shuttle ascent, the Flight Dynamics Officer (FDO) uses the ARD to determine the possible abort modes and make abort calls for the crew. The goal of the Rule-Based Abort Region Determinator (RB-ARD) project was to test the concept of providing an onboard ARD for the shuttle or an automated ARD for the mission control center (MCC). A proof of concept rule-based system was developed on a LMI Lambda computer using PICON, a knowledge-based system shell. Knowledge derived from documented flight rules and ARD operation procedures was coded in PICON rules. These rules, in conjunction with modules of conventional code, enable the RB-ARD to carry out key parts of the ARD task. Current capabilities of the RB-ARD include: continuous updating of the available abort modes, recognition of a limited number of main engine faults and recommendation of safing actions. Safing actions recommended by the RB-ARD concern the Space Shuttle Main Engine (SSME) limit shutdown system and powerdown of the SSME AC buses.
Discovering Related Clinical Concepts Using Large Amounts of Clinical Notes
Ganesan, Kavita; Lloyd, Shane; Sarkar, Vikren
2016-01-01
The ability to find highly related clinical concepts is essential for many applications such as for hypothesis generation, query expansion for medical literature search, search results filtering, ICD-10 code filtering and many other applications. While manually constructed medical terminologies such as SNOMED CT can surface certain related concepts, these terminologies are inadequate as they depend on expertise of several subject matter experts making the terminology curation process open to geographic and language bias. In addition, these terminologies also provide no quantifiable evidence on how related the concepts are. In this work, we explore an unsupervised graphical approach to mine related concepts by leveraging the volume within large amounts of clinical notes. Our evaluation shows that we are able to use a data driven approach to discovering highly related concepts for various search terms including medications, symptoms and diseases. PMID:27656096
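One simple way to realize such a data-driven, graph-flavored approach (a sketch only, not necessarily the authors' construction) is to score relatedness by how often two concepts co-occur in the same note, for example with pointwise mutual information:

from collections import Counter
from itertools import combinations
from math import log

# Toy corpus: each note reduced to the set of clinical concepts it mentions.
notes = [
    {"metformin", "diabetes", "neuropathy"},
    {"metformin", "diabetes"},
    {"insulin", "diabetes"},
    {"insulin", "hypoglycemia"},
]

n = len(notes)
single = Counter(c for note in notes for c in note)
pair = Counter(frozenset(p) for note in notes for p in combinations(sorted(note), 2))

def pmi(a: str, b: str) -> float:
    """Pointwise mutual information (natural log) of two concepts."""
    p_ab = pair[frozenset((a, b))] / n
    return log(p_ab / ((single[a] / n) * (single[b] / n))) if p_ab else float("-inf")

print(pmi("metformin", "diabetes"))  # positive: strongly related in this corpus
print(pmi("metformin", "insulin"))   # -inf: never co-occur here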
A 3D-CFD code for accurate prediction of fluid flows and fluid forces in seals
NASA Technical Reports Server (NTRS)
Athavale, M. M.; Przekwas, A. J.; Hendricks, R. C.
1994-01-01
Current and future turbomachinery requires advanced seal configurations to control leakage, inhibit mixing of incompatible fluids and to control the rotodynamic response. In recognition of a deficiency in the existing predictive methodology for seals, a seven-year effort was established in 1990 by NASA's Office of Aeronautics, Exploration and Technology, under the Earth-to-Orbit Propulsion program, to develop validated Computational Fluid Dynamics (CFD) concepts, codes and analyses for seals. The effort will provide NASA and the U.S. Aerospace Industry with advanced CFD scientific codes and industrial codes for analyzing and designing turbomachinery seals. An advanced 3D CFD cylindrical seal code has been developed, incorporating state-of-the-art computational methodology for flow analysis in straight, tapered and stepped seals. Relevant computational features of the code include: stationary/rotating coordinates, cylindrical and general Body Fitted Coordinates (BFC) systems, high order differencing schemes, colocated variable arrangement, advanced turbulence models, incompressible/compressible flows, and moving grids. This paper presents the current status of code development, code demonstration for predicting rotordynamic coefficients, numerical parametric study of entrance loss coefficients for generic annular seals, and plans for code extensions to labyrinth, damping, and other seal configurations.
Standard Hydrologic Exchange Format (SHEF) is a documented set of rules for coding data in a form suitable for both visual inspection and automated processing, together with information to describe the data.
ERIC Educational Resources Information Center
Moral, Cristian; de Antonio, Angelica; Ferre, Xavier; Lara, Graciela
2015-01-01
Introduction: In this article we propose a qualitative analysis tool--a coding system--that can support the formalisation of the information-seeking process in a specific field: research in computer science. Method: In order to elaborate the coding system, we have conducted a set of qualitative studies, more specifically a focus group and some…
NASA Technical Reports Server (NTRS)
Iscoe, Neil; Liu, Zheng-Yang; Feng, Guohui; Yenne, Britt; Vansickle, Larry; Ballantyne, Michael
1992-01-01
Domain-specific knowledge is required to create specifications, generate code, and understand existing systems. Our approach to automating software design is based on instantiating an application domain model with industry-specific knowledge and then using that model to achieve the operational goals of specification elicitation and verification, reverse engineering, and code generation. Although many different specification models can be created from any particular domain model, each specification model is consistent and correct with respect to the domain model.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Evans, Thomas; Hamilton, Steven; Slattery, Stuart
Profugus is an open-source mini-application (mini-app) for radiation transport and reactor applications. It contains the fundamental computational kernels used in the Exnihilo code suite from Oak Ridge National Laboratory. However, Exnihilo is production code with a substantial user base. Furthermore, Exnihilo is export controlled. This makes collaboration with computer scientists and computer engineers difficult. Profugus is designed to bridge that gap. By encapsulating the core numerical algorithms in an abbreviated code base that is open-source, computer scientists can analyze the algorithms and easily make code-architectural changes to test performance without compromising the production code values of Exnihilo. Profugus is not meant to be production software with respect to problem analysis. The computational kernels in Profugus are designed to analyze performance, not correctness. Nonetheless, users of Profugus can set up and run problems with enough real-world features to be useful as proof-of-concept for actual production work.
NASA Technical Reports Server (NTRS)
Rosen, Bruce S.
1991-01-01
An upwind three-dimensional volume Navier-Stokes code is modified to facilitate modeling of complex geometries and flow fields represented by proposed National Aerospace Plane concepts. Code enhancements include an equilibrium air model, a generalized equilibrium gas model and several schemes to simplify treatment of complex geometric configurations. The code is also restructured for inclusion of an arbitrary number of independent and dependent variables. This latter capability is intended for eventual use to incorporate nonequilibrium/chemistry gas models, more sophisticated turbulence and transition models, or other physical phenomena which will require inclusion of additional variables and/or governing equations. Comparisons of computed results with experimental data and results obtained using other methods are presented for code validation purposes. Good correlation is obtained for all of the test cases considered, indicating the success of the current effort.
Development of structured ICD-10 and its application to computer-assisted ICD coding.
Imai, Takeshi; Kajino, Masayuki; Sato, Megumi; Ohe, Kazuhiko
2010-01-01
This paper presents: (1) a framework of formal representation of ICD-10, which functions as a bridge between ontological information and natural language expressions; and (2) a methodology to use formally described ICD-10 for computer-assisted ICD coding. First, we analyzed and structured the meanings of categories in 15 chapters of ICD-10. Then we expanded the structured ICD-10 (S-ICD10) by adding subordinate concepts and labels derived from Japanese Standard Disease Names. The information model to describe formal representation was refined repeatedly. The resultant model includes 74 types of semantic links. We also developed an ICD coding module based on S-ICD10 and a 'Coding Principle,' which achieved high accuracy (>70%) for four chapters. These results not only demonstrate the basic feasibility of our coding framework but might also inform the development of the information model for the formal description framework in the ICD-11 revision.
NASA Astrophysics Data System (ADS)
Sinha, Gautam
2018-02-01
A concept is presented to design magnets using cylindrical-shaped permanent-magnet blocks, where various types of magnetic fields can be produced by either rotating or varying the size of the magnetic blocks within a given mechanical structure. A general method is introduced to calculate the 3D magnetic field produced by a set of permanent magnets. An analytical expression of the 2D field and the condition to generate various magnetic fields like dipole, quadrupole, and sextupole are derived. Using the 2D result as a starting point, a computer code is developed to get the optimum orientation of the magnets to obtain the user-specific target field profile over a given volume in 3D. Designs of two quadrupole magnets are presented, one using 12 and the other using 24 permanent-magnet blocks. Variation of the quadrupole strength is achieved using tuning coils of a suitable current density and specially designed end tubes. A new concept is introduced to reduce the integrated quadrupole field strength by inserting two hollow cylindrical tubes made of iron, one at each end. This will not affect the field gradient at the center but reduce the integrated field strength by shielding the magnetic field near the ends where the tubes are inserted. The advantages of this scheme are that it is easy to implement, the magnetic axis will not shift, and it will prevent interference with nearby devices. Around 40% integrated field variation is achieved using this method in the present example. To get a realistic estimation of the field quality, a complete 3D model using a nonlinear B-H curve is also studied using a finite-element-based computer code. An example to generate around an 80 T/m quadrupole field gradient is also presented.
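For orientation, the ideal 2D fields involved can be written compactly; the segmented-Halbach orientation rule below is the standard textbook relation for such rotated-block designs, quoted as context rather than as this paper's derivation:

% Ideal 2D quadrupole field with gradient g:
\[ B_x = g\,y, \qquad B_y = g\,x . \]
% In a segmented Halbach-type array, a block centred at azimuth \theta
% with its magnetization rotated by
\[ \phi(\theta) = (k+1)\,\theta \]
% contributes to a pure 2k-pole (k = 1 dipole, k = 2 quadrupole, k = 3 sextupole).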
An evolutionary framework for cultural change: Selectionism versus communal exchange
NASA Astrophysics Data System (ADS)
Gabora, Liane
2013-06-01
Dawkins' replicator-based conception of evolution has led to widespread mis-application of selectionism across the social sciences because it does not address the paradox that necessitated the theory of natural selection in the first place: how do organisms accumulate change when traits acquired over their lifetime are obliterated? This is addressed by von Neumann's concept of a self-replicating automaton (SRA). A SRA consists of a self-assembly code that is used in two distinct ways: (1) actively deciphered during development to construct a self-similar replicant, and (2) passively copied to the replicant to ensure that it can reproduce. Information that is acquired over a lifetime is not transmitted to offspring, whereas information that is inherited during copying is transmitted. In cultural evolution there is no mechanism for discarding acquired change. Acquired change can accumulate orders of magnitude faster than, and quickly overwhelm, inherited change due to differential replication of variants in response to selection. This prohibits a selectionist but not an evolutionary framework for culture and the creative processes that fuel it. The importance of non-Darwinian processes in biological evolution is increasingly recognized. Recent work on the origin of life suggests that early life evolved through a non-Darwinian process referred to as communal exchange that does not involve a self-assembly code, and that natural selection emerged from this more haphazard, ancestral evolutionary process. It is proposed that communal exchange provides an evolutionary framework for culture that enables specification of the cognitive features necessary for a (real or artificial) society to evolve culture. This is supported by a computational model of cultural evolution and a conceptual-network-based program for documenting material cultural history, and it is consistent with high levels of human cooperation.
Gitai go: the art of deepening everyday life through exceeding codes.
Traversa, Rosa
2010-06-01
The present commentary is focused on exploring holistic ways to approach sense-making processes by following the usage of specific Japanese mimetic words, Gitai go, and describing how their functioning cannot be disengaged from an embodied lens to approach language-in-use. In fact, according to Komatsu's (2010) discussion about the extension of meaning derived from Gitai go and its intrinsically flexible characteristics, it is possible--in terms of semiotics--to inquire into vaguely coded systems of mutual understanding, trying to make sense of the general functioning of signs through their peculiar ambiguity as well as their potential to evoke a vivid negotiation of meaning. This seems to show the openness of meaning highlighted by Gitai go, as it refers to a logic of multiplicity deeply linked with the actors' feelings in the setting, which could in general terms be labeled carnal knowledge. Furthermore, the complexity of daily-life experience and its close relation to a concept of "ordinary art" are discussed: the active involvement people show in imagining, changing and creating their personal experience of the world is always performed within their day-by-day frameworks, suggesting a unique striving for appropriating, negotiating and contesting networks of meanings. This is to be approached as an artistic mode of experiencing, since art too is embedded in this ever-emerging ambivalence arising from the complex we call "ordinary life" and relating to our deep feelings about facing our futures. Along these lines I suggest that a particular role exists in communicative messages for what is labeled "redundant" or "superfluous"--since the ambivalence of those messages explicates the dialogical frame of sense-making, in everyday life as a concept of art.
Extension of analog network coding in wireless information exchange
NASA Astrophysics Data System (ADS)
Chen, Cheng; Huang, Jiaqing
2012-01-01
Ever since the concept of analog network coding (ANC) was put forward by S. Katti, much attention has been focused on how to utilize analog network coding to take advantage of wireless interference, which used to be considered generally harmful, to improve throughput performance. Previously, only the case of two nodes that need to exchange information has been fully discussed, while the issue of extending analog network coding to three or more nodes remains undeveloped. In this paper, we propose a practical transmission scheme to extend analog network coding to more than two nodes that need to exchange information among themselves. We start with the case of three nodes that need to exchange information and demonstrate that, by utilizing our algorithm, throughput achieves 33% and 20% increases compared with that of traditional transmission scheduling and digital network coding, respectively. Then, we generalize the algorithm so that it fits occasions with any number of nodes. We also discuss some technical issues and throughput analysis as well as the bit error rate.
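To see where such gains come from, consider the classic two-node relay exchange (a toy accounting; the slot counts below are the well-known two-node figures, not the paper's three-node derivation). Throughput scales inversely with the slots needed for one full exchange:

def throughput_gain(slots_baseline: float, slots_scheme: float) -> float:
    """Percentage throughput increase of a scheme over a baseline,
    assuming throughput is inversely proportional to slots used."""
    return (slots_baseline / slots_scheme - 1.0) * 100.0

# Two nodes exchanging packets via a relay: plain scheduling takes 4 slots,
# digital network coding 3, and analog network coding (ANC) only 2.
print(throughput_gain(4, 2))  # ANC vs. traditional scheduling -> 100.0
print(throughput_gain(3, 2))  # ANC vs. digital network coding -> 50.0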
Realizing the child's perspective: An exploration of sixth-graders' ideas about land use
NASA Astrophysics Data System (ADS)
Wee, Bryan Shao-Chang
Given the rapid rate of urbanization in the U.S., it is important to explore children's conceptions of land use and to understand children's relationships to the environment. In addition, the school is an important source of environmental information where curriculum and instruction play critical roles in shaping children's ideas. The purpose of this study, therefore, was to investigate children's conceptions of land use in the context of an environmental science class. This was a naturalistic study conducted with 13 sixth-graders and their teacher in West-central Indiana. A social constructivist framework was utilized to steer data collection and to guide interpretation. Qualitative methods such as interviews, drawings and photograph journals were used to elicit children's ideas and field notes provided a rich description of the learning environment. Data were analyzed inductively and coded using case-specific criteria to organize and interpret data on an emergent basis. It was found that children in this study did not view humans as part of the environment. Land use was conceptualized as a human activity for human benefit, that is, children's conceptions of land use were framed by an anthropocentric worldview. Furthermore, children's conceptions of land use-related outcomes were negative and limited to large-scale, visible forms of environmental impacts. Environmental science instruction did not change these ideas; in fact, they were reinforced by the school curriculum. These findings suggest that exploring and applying the fundamental nature of children's ideas in environmental education and research is essential to the development of a land ethic as well as an environmentally literate citizenry.
Soapy: an adaptive optics simulation written purely in Python for rapid concept development
NASA Astrophysics Data System (ADS)
Reeves, Andrew
2016-07-01
Soapy is a newly developed Adaptive Optics (AO) simulation which aims to be a flexible and fast-to-use tool-kit for many applications in the field of AO. It is written purely in the Python language, adding to and taking advantage of the already rich ecosystem of scientific libraries and programs. The simulation has been designed to be extremely modular, such that each component can be used stand-alone for projects which do not require a full end-to-end simulation. Ease of use, modularity and code clarity have been prioritised at the expense of computational performance. Though this means the code is not yet suitable for large studies of Extremely Large Telescope AO systems, it is well suited to education, exploration of new AO concepts and investigations of current generation telescopes.
Labview Interface Concepts Used in NASA Scientific Investigations and Virtual Instruments
NASA Technical Reports Server (NTRS)
Roth, Don J.; Parker, Bradford H.; Rapchun, David A.; Jones, Hollis H.; Cao, Wei
2001-01-01
This article provides an overview of several software control applications developed for NASA using LabVIEW. The applications covered here include (1) an Ultrasonic Measurement System for nondestructive evaluation of advanced structural materials, (2) an X-ray Spectral Mapping System for characterizing the quality and uniformity of developing photon detector materials, (3) a Life Testing System for these same materials, and (4) the instrument panel for an aircraft-mounted Cloud Absorption Radiometer that measures the light scattered by clouds in multiple spectral bands. Many of the software interface concepts employed are explained. Panel layout and block diagram (code) strategies for each application are described. In particular, some of the more unique features of the applications' interfaces and source code are highlighted. This article assumes that the reader has a beginner-to-intermediate understanding of LabVIEW methods.
Barriers Against Implementing Blunt Abdominal Trauma Guidelines in a Hospital: A Qualitative Study
Zaboli, Rouhollah; Tofighi, Shahram; Aghighi, Ali; Shokouh, Seyyed Javad Hosaini; Naraghi, Nader; Goodarzi, Hassan
2016-01-01
Introduction Clinical practice guidelines are structured recommendations that help physicians and patients to make proper decisions when dealing with a specific clinical condition. Because blunt abdominal trauma causes a wide range of mild, single-system, and multisystem injuries, early detection will help to reduce mortality and resulting disability. Emergency treatment should be initiated based on CPGs. This study aimed to determine the variables affecting implementation of blunt abdominal trauma CPGs in an Iranian hospital. Methods This study was conducted as a qualitative, phenomenological study in the Family Hospital in Tehran (Iran) in 2015. The research population included eight experts and key people in the area of blunt abdominal trauma clinical practice guidelines. Sampling was based on purposive and nonrandom methods. Semistructured interviews were used for data collection. A framework method was applied for the data analysis using Atlas.ti software. Results After framework analysis and repeated review, deletion, and combination of the 251 codes obtained, 15 families and five superfamilies were extracted, including technical knowledge barriers, economic barriers, barriers related to deployment and monitoring, political will barriers, and management barriers. Conclusion Structural reform is needed to eliminate the defects present in the healthcare system. Because most of the codes, subconcepts, and concepts fall within the field of human resources, it seems that education and knowledge will be more important than other resources such as capital and equipment. PMID:27757191
Advances in Geologic Disposal System Modeling and Application to Crystalline Rock
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mariner, Paul E.; Stein, Emily R.; Frederick, Jennifer M.
The Used Fuel Disposition Campaign (UFDC) of the U.S. Department of Energy (DOE) Office of Nuclear Energy (NE), Office of Fuel Cycle Technology (OFCT) is conducting research and development (R&D) on geologic disposal of used nuclear fuel (UNF) and high-level nuclear waste (HLW). Two of the high priorities for UFDC disposal R&D are design concept development and disposal system modeling (DOE 2011). These priorities are directly addressed in the UFDC Generic Disposal Systems Analysis (GDSA) work package, which is charged with developing a disposal system modeling and analysis capability for evaluating disposal system performance for nuclear waste in geologic media (e.g., salt, granite, clay, and deep borehole disposal). This report describes specific GDSA activities in fiscal year 2016 (FY 2016) toward the development of the enhanced disposal system modeling and analysis capability for geologic disposal of nuclear waste. The GDSA framework employs the PFLOTRAN thermal-hydrologic-chemical multi-physics code and the Dakota uncertainty sampling and propagation code. Each code is designed for massively-parallel processing in a high-performance computing (HPC) environment. Multi-physics representations in PFLOTRAN are used to simulate various coupled processes including heat flow, fluid flow, waste dissolution, radionuclide release, radionuclide decay and ingrowth, precipitation and dissolution of secondary phases, and radionuclide transport through engineered barriers and natural geologic barriers to the biosphere. Dakota is used to generate sets of representative realizations and to analyze parameter sensitivity.
Implications of Information Theory for Computational Modeling of Schizophrenia.
Silverstein, Steven M; Wibral, Michael; Phillips, William A
2017-10-01
Information theory provides a formal framework within which information processing and its disorders can be described. However, information theory has rarely been applied to modeling aspects of the cognitive neuroscience of schizophrenia. The goal of this article is to highlight the benefits of an approach based on information theory, including its recent extensions, for understanding several disrupted neural goal functions as well as related cognitive and symptomatic phenomena in schizophrenia. We begin by demonstrating that foundational concepts from information theory-such as Shannon information, entropy, data compression, block coding, and strategies to increase the signal-to-noise ratio-can be used to provide novel understandings of cognitive impairments in schizophrenia and metrics to evaluate their integrity. We then describe more recent developments in information theory, including the concepts of infomax, coherent infomax, and coding with synergy, to demonstrate how these can be used to develop computational models of schizophrenia-related failures in the tuning of sensory neurons, gain control, perceptual organization, thought organization, selective attention, context processing, predictive coding, and cognitive control. Throughout, we demonstrate how disordered mechanisms may explain both perceptual/cognitive changes and symptom emergence in schizophrenia. Finally, we demonstrate that there is consistency between some information-theoretic concepts and recent discoveries in neurobiology, especially involving the existence of distinct sites for the accumulation of driving input and contextual information prior to their interaction. This convergence can be used to guide future theory, experiment, and treatment development.
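As a small concrete anchor for the foundational quantities named above (a generic sketch, not a model from the article): Shannon entropy quantifies the uncertainty of a response distribution, and its reduction under sharpened tuning is one simple integrity metric of the kind the authors describe.

from math import log2

def entropy(probs: list[float]) -> float:
    """Shannon entropy H = -sum(p * log2(p)), in bits."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A neuron whose four responses are equally likely is maximally uncertain...
print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits
# ...whereas contextual input that sharpens the response distribution
# lowers the entropy, i.e., the response now carries usable information.
print(entropy([0.7, 0.1, 0.1, 0.1]))      # about 1.36 bits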
ERIC Educational Resources Information Center
Davis, Colin J.; Bowers, Jeffrey S.
2006-01-01
Five theories of how letter position is coded are contrasted: position-specific slot-coding, Wickelcoding, open-bigram coding (discrete and continuous), and spatial coding. These theories make different predictions regarding the relative similarity of three different types of pairs of letter strings: substitution neighbors,…
Design of neurophysiologically motivated structures of time-pulse coded neurons
NASA Astrophysics Data System (ADS)
Krasilenko, Vladimir G.; Nikolsky, Alexander I.; Lazarev, Alexander A.; Lobodzinska, Raisa F.
2009-04-01
A general methodology for the biologically motivated concept of building sensor processing systems with parallel input, picture-operand processing, and time-pulse coding is described in this paper. Advantages of such coding for creating parallel programmable 2D-array structures for next-generation digital computers, which require untraditional numerical systems for processing analog, digital, hybrid and neuro-fuzzy operands, are shown. Simulation and implementation results for optoelectronic time-pulse coded intelligent neural elements (OETPCINE) realizing a wide set of neuro-fuzzy logic operations are considered. The simulation results confirm the engineering advantages, intelligence, and circuit flexibility of OETPCINE for creating advanced 2D structures. The developed equivalentor-nonequivalentor neural element has a power consumption of 10 mW and a processing time of about 10-100 µs.
C code generation from Petri-net-based logic controller specification
NASA Astrophysics Data System (ADS)
Grobelny, Michał; Grobelna, Iwona; Karatkevich, Andrei
2017-08-01
The article focuses on the programming of logic controllers. It is important that the program code of a logic controller executes flawlessly according to the primary specification. In the presented approach we generate C code for an AVR microcontroller from a rule-based logical model of a control process derived from a control interpreted Petri net. The same logical model is also used for formal verification of the specification by means of the model checking technique. The proposed rule-based logical model and formal rules of transformation ensure that the obtained implementation is consistent with the already verified specification. The approach is validated by practical experiments.
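The core idea, turning each rule of the logical model into guarded code, can be sketched in a few lines (a hypothetical rule format and a much-simplified generator, not the transformation defined in the article):

# Hypothetical rule format: (guard_expression, output_assignments).
# Each rule becomes one guarded block in the generated C step function.
rules = [
    ("p1 && start_button", "p1 = 0; p2 = 1; motor_on = 1;"),
    ("p2 && limit_switch", "p2 = 0; p1 = 1; motor_on = 0;"),
]

def emit_c(rules) -> str:
    """Emit a C function that fires every enabled rule once per scan cycle."""
    body = "\n".join(f"    if ({guard}) {{ {action} }}" for guard, action in rules)
    return f"void controller_step(void) {{\n{body}\n}}\n"

print(emit_c(rules))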
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dartevelle, Sebastian
2007-10-01
Large-scale volcanic eruptions are hazardous events that cannot be described by detailed and accurate in situ measurement: hence, little to no real-time data exists to rigorously validate current computer models of these events. In addition, such phenomenology involves highly complex, nonlinear, and unsteady physical behaviors upon many spatial and time scales. As a result, volcanic explosive phenomenology is poorly understood in terms of its physics, and inadequately constrained in terms of initial, boundary, and inflow conditions. Nevertheless, code verification and validation become even more critical because more and more volcanologists use numerical data for assessment and mitigation of volcanic hazards. In this report, we evaluate the process of model and code development in the context of geophysical multiphase flows. We describe: (1) the conception of a theoretical, multiphase, Navier-Stokes model, (2) its implementation into a numerical code, (3) the verification of the code, and (4) the validation of such a model within the context of turbulent and underexpanded jet physics. Within the validation framework, we suggest focusing on the key physics that control the volcanic clouds—namely, momentum-driven supersonic jet and buoyancy-driven turbulent plume. For instance, we propose to compare numerical results against a set of simple and well-constrained analog experiments, which uniquely and unambiguously represent each of the key phenomenologies.
An open-source textbook for teaching climate-related risk analysis using the R computing environment
NASA Astrophysics Data System (ADS)
Applegate, P. J.; Keller, K.
2015-12-01
Greenhouse gas emissions lead to increased surface air temperatures and sea level rise. In turn, sea level rise increases the risks of flooding for people living near the world's coastlines. Our own research on assessing sea level rise-related risks emphasizes both Earth science and statistics. At the same time, the free, open-source computing environment R is growing in popularity among statisticians and scientists due to its flexibility and graphics capabilities, as well as its large library of existing functions. We have developed a set of laboratory exercises that introduce students to the Earth science and statistical concepts needed for assessing the risks presented by climate change, particularly sea-level rise. These exercises will be published as a free, open-source textbook on the Web. Each exercise begins with a description of the Earth science and/or statistical concepts that the exercise teaches, with references to key journal articles where appropriate. Next, students are asked to examine in detail a piece of existing R code, and the exercise text provides a clear explanation of how the code works. Finally, students are asked to modify the existing code to produce a well-defined outcome. We discuss our experiences in developing the exercises over two separate semesters at Penn State, plus using R Markdown to interweave explanatory text with sample code and figures in the textbook.
Amoroso, P J; Smith, G S; Bell, N S
2000-04-01
Accurate injury cause data are essential for injury prevention research. U.S. military hospitals, unlike civilian hospitals, use the NATO STANAG system for cause-of-injury coding. Reported deficiencies in civilian injury cause data suggested a need to specifically evaluate the STANAG. The Total Army Injury and Health Outcomes Database (TAIHOD) was used to evaluate worldwide Army injury hospitalizations, especially STANAG Trauma, Injury, and Place of Occurrence coding. We conducted a review of hospital procedures at Tripler Army Medical Center (TAMC) including injury cause and intent coding, potential crossover between acute injuries and musculoskeletal conditions, and data for certain hospital patients who are not true admissions. We also evaluated the use of free-text injury comment fields in three hospitals. Army-wide review of injury records coding revealed full compliance with cause coding, although nonspecific codes appeared to be overused. A small but intensive single hospital records review revealed relatively poor intent coding but good activity and cause coding. Data on specific injury history were present on most acute injury records and 75% of musculoskeletal conditions. Place of Occurrence coding, although inherently nonspecific, was over 80% accurate. Review of text fields produced additional details of the injuries in over 80% of cases. STANAG intent coding specificity was poor, while coding of cause of injury was at least comparable to civilian systems. The strengths of military hospital data systems are an exceptionally high compliance with injury cause coding, the availability of free text, and capture of all population hospital records without regard to work-relatedness. Simple changes in procedures could greatly improve data quality.
NASA Astrophysics Data System (ADS)
Jöckel, P.; Kerkweg, A.; Buchholz-Dietsch, J.; Tost, H.; Sander, R.; Pozzer, A.
2008-03-01
The implementation of processes related to chemistry into Earth System Models and their coupling within such systems requires the consistent description of the chemical species involved. We provide a tool (written in Fortran95) to structure and manage information about constituents, hereinafter referred to as tracers, namely the Modular Earth Submodel System (MESSy) generic (i.e., infrastructure) submodel TRACER. With TRACER it is possible to define a multitude of tracer sets, depending on the spatio-temporal representation (i.e., the grid structure) of the model. The required information about a specific chemical species is split into the static meta-information about the characteristics of the species, and its (generally in time and space variable) abundance in the corresponding representation. TRACER moreover includes two submodels. One is TRACER_FAMILY, an implementation of the tracer family concept. It distinguishes between two types: type-1 families are usually applied to handle strongly related tracers (e.g., fast equilibrating species) for a specific process (e.g., advection). In contrast to this, type-2 families are applied for tagging techniques. Tagging means the artificial decomposition of one or more species into parts, which are additionally labelled (e.g., by the region of their primary emission) and then processed as the species itself. The type-2 family concept is designed to conserve the linear relationship between the family and its members. The second submodel is TRACER_PDEF, which corrects and budgets numerical negative overshoots that arise in many process implementations due to numerical limitations (e.g., rounding errors). The submodel therefore guarantees the positive definiteness of the tracers and stabilises the integration scheme. As a by-product, it further provides a global tracer mass diagnostic. Last but not least, we present the submodel PTRAC, which allows the definition of tracers via a Fortran95 namelist, as a complement to the standard tracer definition by application of the TRACER interface routines in the code. TRACER with its submodels and PTRAC can readily be applied to a variety of models without further requirements. The code and documentation are included in the electronic supplement.
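The type-2 (tagging) bookkeeping reduces to keeping the tagged members linearly consistent with their family total. A minimal sketch of that constraint follows (hypothetical tracer names; TRACER itself is Fortran95, this illustration is Python):

# Type-2 tracer family: tagged members partition the family total.
# After a process updates the family, rescale the members so that the
# linear relationship family = sum(members) is conserved.

def rescale_members(members: dict[str, float], family_total: float) -> dict[str, float]:
    s = sum(members.values())
    if s == 0.0:
        return members  # nothing to distribute
    factor = family_total / s
    return {name: value * factor for name, value in members.items()}

co = {"CO_europe": 40.0, "CO_asia": 55.0, "CO_rest": 25.0}  # tagged by emission region
co = rescale_members(co, family_total=110.0)  # family total after a process step
print(co, sum(co.values()))  # members again sum exactly to the family total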
Integrated Aerodynamic/Structural/Dynamic Analyses of Aircraft with Large Shape Changes
NASA Technical Reports Server (NTRS)
Samareh, Jamshid A.; Chwalowski, Pawel; Horta, Lucas G.; Piatak, David J.; McGowan, Anna-Maria R.
2007-01-01
The conceptual and preliminary design processes for aircraft with large shape changes are generally difficult and time-consuming, and the processes are often customized for a specific shape change concept to streamline the vehicle design effort. Accordingly, several existing reports show excellent results of assessing a particular shape change concept or perturbations of a concept. The goal of the current effort was to develop a multidisciplinary analysis tool and process that would enable an aircraft designer to assess several very different morphing concepts early in the design phase and yet obtain second-order performance results so that design decisions can be made with better confidence. The approach uses an efficient parametric model formulation that allows automatic model generation for systems undergoing radical shape changes as a function of aerodynamic parameters, geometry parameters, and shape change parameters. In contrast to other more self-contained approaches, the approach utilizes off-the-shelf analysis modules to reduce development time and to make it accessible to many users. Because the analysis is loosely coupled, discipline modules like a multibody code can be easily swapped for other modules with similar capabilities. One of the advantages of this loosely coupled system is the ability to use the medium- to high-fidelity tools early in the design stages when the information can significantly influence and improve overall vehicle design. Data transfer among the analysis modules is based on an accurate and automated general purpose data transfer tool. In general, setup time for the integrated system presented in this paper is 2-4 days for simple shape change concepts and 1-2 weeks for more mechanically complicated concepts. Some of the key elements briefly described in the paper include parametric model development, aerodynamic database generation, multibody analysis, and the required software modules as well as examples for a telescoping wing, a folding wing, and a bat-like wing.
Color associations among designers and non-designers for common warning and operation concepts.
Ng, Annie W Y; Chan, Alan H S
2018-07-01
This study examined color-concept associations among designers and non-designers with commonly used warning and operation concepts. This study required 199 designers and 175 non-designers to indicate their choice among nine colors to associate with each of the 38 concepts in a color-concept table. The results showed that the designers and non-designers had the same color associations and similar strengths of stereotypes for 17 concepts. The strongest color-concept stereotypes for both groups were red-danger, red-fire, and red-hot. However, the designers and non-designers had different color associations for the concepts of escape (green, red), increase (green, red), potential hazard (red, orange), fatal (black, red), and normal (white, green), while the strengths of the 16 remaining associations for both groups were not at equivalent levels. These findings provide ergonomists and design practitioners with a better understanding of population stereotypes for color coding, and consequently to effectively use colors in their user-centered designs. Copyright © 2018 Elsevier Ltd. All rights reserved.
The commerce of professional psychology and the new ethics code.
Koocher, G P
1994-11-01
The 1992 version of the American Psychological Association's Ethical Principles of Psychologists and Code of Conduct brings some changes in requirements and new specificity to the practice of psychology. The impact of the new code on therapeutic contracts, informed consent to psychological services, advertising, financial aspects of psychological practice, and other topics related to the commerce of professional psychology are discussed. The genesis of many new thrusts in the code is reviewed from the perspective of psychological service provider. Specific recommendations for improved attention to ethical matters in professional practice are made.
Chumney, Elinor C G; Biddle, Andrea K; Simpson, Kit N; Weinberger, Morris; Magruder, Kathryn M; Zelman, William N
2004-01-01
As cost-effectiveness analyses (CEAs) are increasingly used to inform policy decisions, there is a need for more information on how different cost determination methods affect cost estimates and the degree to which the resulting cost-effectiveness ratios (CERs) may be affected. The lack of specificity of diagnosis-related groups (DRGs) could mean that they are ill-suited for costing applications in CEAs. Yet, the implications of using International Classification of Diseases-9th edition (ICD-9) codes or a form of disease-specific risk group stratification instead of DRGs have yet to be clearly documented. To demonstrate the implications of different disease coding mechanisms on costs and the magnitude of error that could be introduced in head-to-head comparisons of resulting CERs, we based our analyses on a previously published Markov model for HIV/AIDS therapies. We used the Healthcare Cost and Utilisation Project Nationwide Inpatient Sample (HCUP-NIS) data release 6, which contains all-payer data on hospital inpatient stays from selected states. We added costs for the mean number of hospitalisations, derived from analyses based on either DRG or ICD-9 codes or risk group stratification cost weights, to the standard outpatient and prescription drug costs to yield an estimate of total charges for each AIDS-defining illness (ADI). Finally, we estimated the Markov model three times with the appropriate ADI cost weights to obtain CERs specific to the use of either DRG or ICD-9 codes or risk group. Contrary to expectations, we found that the choice of coding/grouping assumptions that are disease-specific by either DRG codes, ICD-9 codes or risk group resulted in very similar CER estimates for highly active antiretroviral therapy. The large variations in the specific ADI cost weights across the three different coding approaches were especially interesting. However, because no one approach produced consistently higher estimates than the others, the Markov model's weighted cost per event and resulting CERs were remarkably close in value to one another. Although DRG codes are based on broader categories and contain less information than ICD-9 codes, in practice the choice of whether to use DRGs or ICD-9 codes may have little effect on the CEA results in heterogeneous conditions such as HIV/AIDS.
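The two quantities at the heart of such a comparison are simple to state; a generic sketch follows (placeholder numbers, not the study's values): the weighted cost per event feeds the model, and the CER is incremental cost over incremental effect.

def weighted_cost(probabilities: list[float], costs: list[float]) -> float:
    """Expected cost per event: sum of p_i * cost_i over ADI categories."""
    return sum(p * c for p, c in zip(probabilities, costs))

def cer(cost_new: float, cost_old: float, effect_new: float, effect_old: float) -> float:
    """Incremental cost-effectiveness ratio."""
    return (cost_new - cost_old) / (effect_new - effect_old)

# Placeholder inputs: three ADI categories costed under one coding scheme.
cost_per_event = weighted_cost([0.5, 0.3, 0.2], [12_000.0, 25_000.0, 40_000.0])
print(cost_per_event)                 # expected cost per ADI event
print(cer(55_000, 40_000, 6.2, 5.0))  # cost per additional life-year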
Multi-level bandwidth efficient block modulation codes
NASA Technical Reports Server (NTRS)
Lin, Shu
1989-01-01
The multilevel technique is investigated for combining block coding and modulation. There are four parts. In the first part, a formulation is presented for signal sets on which modulation codes are to be constructed. Distance measures on a signal set are defined and their properties are developed. In the second part, a general formulation is presented for multilevel modulation codes in terms of component codes with appropriate Euclidean distances. The distance properties, Euclidean weight distribution and linear structure of multilevel modulation codes are investigated. In the third part, several specific methods for constructing multilevel block modulation codes with interdependency among component codes are proposed. Given a multilevel block modulation code C with no interdependency among the binary component codes, the proposed methods give a multilevel block modulation code C' which has the same rate as C, a minimum squared Euclidean distance not less than that of code C, a trellis diagram with the same number of states as that of C and a smaller number of nearest neighbor codewords than that of C. In the last part, error performance of block modulation codes is analyzed for an AWGN channel based on soft-decision maximum likelihood decoding. Error probabilities of some specific codes are evaluated based on their Euclidean weight distributions and simulation results.
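For context, the standard multilevel construction obeys a simple distance bound, sketched below (a textbook relation with illustrative numbers, not the specific codes of this work): with level-i component Hamming distance d_i and intra-subset squared distance delta_i^2, the minimum squared Euclidean distance is at least min_i d_i * delta_i^2.

# Textbook lower bound for a multilevel modulation code:
#   d_E^2 >= min_i ( d_i * delta_i^2 )

def multilevel_distance_bound(hamming_distances, subset_distances):
    return min(d * d2 for d, d2 in zip(hamming_distances, subset_distances))

# Unit-energy 8-PSK set partitioning gives delta^2 = 0.586, 2.0, 4.0 per level;
# component-code Hamming distances 8, 2, 1 are illustrative assumptions.
print(multilevel_distance_bound([8, 2, 1], [0.586, 2.0, 4.0]))  # -> 4.0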
Center of Gravity in the Asymmetric Environment: Applicable or Not
2006-06-01
The military concept of a Center of Gravity (COG) in...changed a great deal since the introduction of COG. And in today's asymmetric environment, in which non-state actors use unconventional tactics, it is...becoming extremely difficult to apply the COG concept. The primary reason for this difficulty is that non-state actors do not operate as a unitary
Sharing Resources In Mobile/Satellite Communications
NASA Technical Reports Server (NTRS)
Yan, Tsun-Yee; Sue, Miles K.
1992-01-01
This report presents a preliminary theoretical analysis of several alternative schemes for allocation of satellite resources among terrestrial subscribers of a land-mobile/satellite communication system. Demand-access and random-access approaches under code-division and frequency-division concepts are compared.
Megawatt Electromagnetic Plasma Propulsion
NASA Technical Reports Server (NTRS)
Gilland, James; Lapointe, Michael; Mikellides, Pavlos
2003-01-01
The NASA Glenn Research Center program in megawatt level electric propulsion is centered on electromagnetic acceleration of quasi-neutral plasmas. Specific concepts currently being examined are the Magnetoplasmadynamic (MPD) thruster and the Pulsed Inductive Thruster (PIT). In the case of the MPD thruster, a multifaceted approach of experiments, computational modeling, and systems-level models of self-field MPD thrusters is underway. The MPD thruster experimental research consists of a 1-10 MWe, 2 ms pulse-forming-network, a vacuum chamber with two 32-inch diffusion pumps, and voltage, current, mass flow rate, and thrust stand diagnostics. Current focus is on obtaining repeatable thrust measurements of a Princeton Benchmark-type self-field thruster operating at 0.5-1 g/s of argon. Operation with hydrogen is the ultimate goal to realize the increased efficiency anticipated using the lighter gas. Computational modeling is done using the MACH2 MHD code, which can include real gas effects for propellants of interest to MPD operation. The MACH2 code has been benchmarked against other MPD thruster data, and has been used to create a point design for a 3000 second specific impulse (Isp) MPD thruster. This design is awaiting testing in the experimental facility. For the PIT, a computational investigation using MACH2 has been initiated, with experiments awaiting further funding. Although the calculated results have been found to be sensitive to the initial ionization assumptions, recent results have agreed well with experimental data. Finally, a systems-level self-field MPD thruster model has been developed that allows a mission planner or system designer to input Isp and power level into the model equations and obtain values for efficiency, mass flow rate, and input current and voltage. This model emphasizes algebraic simplicity to allow its incorporation into larger trajectory or system optimization codes. The systems-level approach will be extended to the pulsed inductive thruster and other electrodeless thrusters at a future date.
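The kind of algebraic systems relation described in the closing sentences can be illustrated with the standard rocket identities (a generic sketch with an assumed thrust efficiency, not the Glenn model itself):

G0 = 9.80665  # standard gravity, m/s^2

def mpd_operating_point(isp_s: float, power_w: float, efficiency: float):
    """Mass flow and thrust from Isp and input power, via the identities
    u_e = Isp * g0, eta * P = 0.5 * mdot * u_e**2, and T = mdot * u_e."""
    u_e = isp_s * G0
    mdot = 2.0 * efficiency * power_w / u_e**2
    thrust = mdot * u_e
    return mdot, thrust

# A 3000 s Isp thruster at 1 MW with an assumed 40% thrust efficiency:
mdot, thrust = mpd_operating_point(3000.0, 1.0e6, 0.40)
print(f"mdot = {mdot * 1000:.2f} g/s, thrust = {thrust:.1f} N")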
Performance Measures of Diagnostic Codes for Detecting Opioid Overdose in the Emergency Department.
Rowe, Christopher; Vittinghoff, Eric; Santos, Glenn-Milo; Behar, Emily; Turner, Caitlin; Coffin, Phillip O
2017-04-01
Opioid overdose mortality has tripled in the United States since 2000, and opioids are responsible for more than half of all drug overdose deaths, which reached an all-time high in 2014. Opioid overdoses resulting in death, however, represent only a small fraction of all opioid overdose events, and efforts to improve surveillance of this public health problem should include tracking nonfatal overdose events. International Classification of Disease (ICD) diagnosis codes, increasingly used for the surveillance of nonfatal drug overdose events, have not been rigorously assessed for validity in capturing overdose events. The present study aimed to validate the use of ICD, 9th revision, Clinical Modification (ICD-9-CM) codes in identifying opioid overdose events in the emergency department (ED) by examining multiple performance measures, including sensitivity and specificity. Data on ED visits from January 1, 2012, to December 31, 2014, including clinical determination of whether the visit constituted an opioid overdose event, were abstracted from electronic medical records for patients prescribed long-term opioids for pain from any of six safety-net primary care clinics in San Francisco, California. Combinations of ICD-9-CM codes were validated in the detection of overdose events as determined by medical chart review. Both sensitivity and specificity of different combinations of ICD-9-CM codes were calculated. Unadjusted logistic regression models with robust standard errors and accounting for clustering by patient were used to explore whether overdose ED visits with certain characteristics were more or less likely to be assigned an opioid poisoning ICD-9-CM code by the documenting physician. Forty-four (1.4%) of 3,203 ED visits among 804 patients were determined to be opioid overdose events. Opioid-poisoning ICD-9-CM codes (E850.0-E850.2, 965.00-965.09) identified overdose ED visits with a sensitivity of 25.0% (95% confidence interval [CI] = 13.6% to 37.8%) and a specificity of 99.9% (95% CI = 99.8% to 100.0%). Expanding the ICD-9-CM codes to include both nonspecified and general (i.e., without a decimal modifier) drug poisoning and drug abuse codes identified overdose ED visits with a sensitivity of 56.8% (95% CI = 43.6% to 72.7%) and a specificity of 96.2% (95% CI = 94.8% to 97.2%). Additional ICD-9-CM codes not explicitly relevant to opioid overdose were necessary to further enhance sensitivity. Among the 44 overdose ED visits, neither naloxone administration during the visit, nor whether the patient responded to the naloxone, nor the specific opioids involved was associated with the assignment of an opioid poisoning ICD-9-CM code (p ≥ 0.05). Tracking opioid overdose ED visits by diagnostic coding is fairly specific but insensitive, and coding was not influenced by administration of naloxone or the specific opioids involved. The reason for the high rate of missed cases is uncertain, although these results suggest that a more clearly defined case definition for overdose may be necessary to ensure effective opioid overdose surveillance. Changes in coding practices under ICD-10 might help to address these deficiencies. © 2016 by the Society for Academic Emergency Medicine.
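For readers who want to reproduce the arithmetic, the sketch below computes sensitivity and specificity with Wilson score intervals from a 2x2 table. The cell counts are approximations back-calculated from the reported 25.0% sensitivity and 99.9% specificity (44 events among 3,203 visits), and the paper's exact interval method may differ, so the bounds will not match the published CIs exactly.

    import math

    def prop_ci(k, n, z=1.96):
        """Wilson score interval for a binomial proportion k/n."""
        p = k / n
        denom = 1 + z * z / n
        centre = (p + z * z / (2 * n)) / denom
        half = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
        return p, centre - half, centre + half

    # Illustrative counts: 11 true positives among 44 events, 3 false
    # positives among ~3,159 non-events (approximate, for demonstration).
    tp, fn, fp, tn = 11, 33, 3, 3156

    sens = prop_ci(tp, tp + fn)
    spec = prop_ci(tn, tn + fp)
    print("sensitivity %.1f%% (%.1f-%.1f)" % tuple(100 * x for x in sens))
    print("specificity %.2f%% (%.2f-%.2f)" % tuple(100 * x for x in spec))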
Technical Support Document for Version 3.4.0 of the COMcheck Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bartlett, Rosemarie; Connell, Linda M.; Gowri, Krishnan
2007-09-14
COMcheck provides an optional way to demonstrate compliance with commercial and high-rise residential building energy codes. Commercial buildings include all use groups except single family and multifamily not over three stories in height. COMcheck was originally based on ANSI/ASHRAE/IES Standard 90.1-1989 (Standard 90.1-1989) requirements and is intended for use with various codes based on Standard 90.1, including the Codification of ASHRAE/IES Standard 90.1-1989 (90.1-1989 Code) (ASHRAE 1989a, 1993b) and ASHRAE/IESNA Standard 90.1-1999 (Standard 90.1-1999). This includes jurisdictions that have adopted the 90.1-1989 Code, Standard 90.1-1989, Standard 90.1-1999, or their own code based on one of these. We view Standard 90.1-1989 and the 90.1-1989 Code as having equivalent technical content and have used both as source documents in developing COMcheck. This technical support document (TSD) is designed to explain the technical basis for the COMcheck software as originally developed based on the ANSI/ASHRAE/IES Standard 90.1-1989 (Standard 90.1-1989). Documentation for other national model codes and standards and specific state energy codes supported in COMcheck has been added to this report as appendices. These appendices are intended to provide technical documentation for features specific to the supported codes and for any changes made for state-specific codes that differ from the standard features that support compliance with the national model codes and standards.
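As a rough illustration of the kind of whole-building trade-off such compliance software performs, the Python sketch below checks an envelope design by comparing its area-weighted heat-loss rate (UA) against a "budget" design built from code-maximum U-factors. The component areas and U-factor limits are placeholders, not Standard 90.1 requirements, and COMcheck's actual procedure is documented in the report itself.

    # Compliance passes if the proposed design's UA does not exceed the UA of
    # a minimally code-compliant design; worse walls can be traded against
    # better windows. All numbers below are invented for illustration.
    components = [  # (name, area_ft2, proposed_U, code_max_U)
        ("roof",    10000, 0.045, 0.050),
        ("walls",   24000, 0.090, 0.080),   # worse than code...
        ("windows",  6000, 0.450, 0.550),   # ...offset by better windows
    ]

    ua_proposed = sum(a * u for _, a, u, _ in components)
    ua_budget = sum(a * u for _, a, _, u in components)
    print(f"proposed UA = {ua_proposed:.0f}, budget UA = {ua_budget:.0f}")
    print("PASSES" if ua_proposed <= ua_budget else "FAILS")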
Extracting and standardizing medication information in clinical text – the MedEx-UIMA system
Jiang, Min; Wu, Yonghui; Shah, Anushi; Priyanka, Priyanka; Denny, Joshua C.; Xu, Hua
2014-01-01
Extraction of medication information embedded in clinical text is important for research using electronic health records (EHRs). However, most current medication information extraction systems identify drug and signature entities without mapping them to a standard representation. In this study, we introduced the open-source Java implementation of MedEx, an existing high-performance medication information extraction system, based on the Unstructured Information Management Architecture (UIMA) framework. In addition, we developed new encoding modules in the MedEx-UIMA system, which mapped an extracted drug name/dose/form to both generalized and specific RxNorm concepts and translated drug frequency information to the ISO standard. We processed 826 documents with both systems and verified that MedEx-UIMA and MedEx (the Python version) performed similarly by comparing their results. Using two manually annotated test sets that contained 300 drug entries from medication lists and 300 drug entries from narrative reports, the MedEx-UIMA system achieved F-measures of 98.5% and 97.5%, respectively, for encoding drug names to corresponding RxNorm generic drug ingredients, and F-measures of 85.4% and 88.1%, respectively, for mapping drug names/dose/form to the most specific RxNorm concepts. It also achieved an F-measure of 90.4% for normalizing frequency information to the ISO standard. The open-source MedEx-UIMA system is freely available online at http://code.google.com/p/medex-uima/. PMID:25954575
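The frequency-normalization step can be pictured with a small sketch: a lookup table plus one pattern rule that rewrites common Latin dosing abbreviations as ISO 8601 repeating-interval durations. The table and the target representation are illustrative guesses, not the MedEx-UIMA encoding module's actual mapping.

    import re

    # Hypothetical normalization table: Latin frequency abbreviations to
    # ISO 8601 repeating intervals (illustrative only).
    FREQ_MAP = {
        "qd": "R/P1D",    "daily": "R/P1D",
        "bid": "R/PT12H", "b.i.d.": "R/PT12H",
        "tid": "R/PT8H",  "t.i.d.": "R/PT8H",
        "qid": "R/PT6H",  "q.i.d.": "R/PT6H",
    }

    def normalize_frequency(text):
        t = text.strip().lower()
        if t in FREQ_MAP:
            return FREQ_MAP[t]
        m = re.fullmatch(r"q(\d+)\s*h", t)   # e.g. "q8h" -> every 8 hours
        if m:
            return "R/PT%sH" % m.group(1)
        return None                          # leave unmapped strings alone

    for s in ["b.i.d.", "q8h", "tid"]:
        print(s, "->", normalize_frequency(s))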
E-Standards For Mass Properties Engineering
NASA Technical Reports Server (NTRS)
Cerro, Jeffrey A.
2008-01-01
A proposal is put forth to promote the concept of a voluntary consensus standard for mass properties engineering developed by the Society of Allied Weight Engineers (SAWE). This standard would be an e-standard and would encompass data, data manipulation, and reporting functionality. The standard would be implemented via an open-source SAWE distribution site with full SAWE member body access. Engineering societies and global standards initiatives are progressing toward modern engineering standards, which become functioning deliverable data sets. These data sets, if properly standardized, will integrate easily between supplier and customer, enabling technically precise mass properties data exchange. The concepts of object-oriented programming support all of these requirements, and the use of a Java(TM)-based open-source development initiative is proposed. Results are reported for activity sponsored by the NASA Langley Research Center Innovation Institute to scope out requirements for developing a mass properties engineering e-standard. An initial software distribution is proposed. Upon completion, an open-source application programming interface will be available to SAWE members for the development of more specific programming requirements that are tailored to company and project requirements. A fully functioning application programming interface will permit code extension via company proprietary techniques, as well as through continued open-source initiatives.
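A natural core for such an object-oriented mass properties data set is a routine that rolls component properties up to an assembly. The Python sketch below (a hypothetical API, not a SAWE artifact) combines component masses, centers of gravity, and inertia tensors about the composite CG using the parallel-axis theorem.

    import numpy as np

    def combine(parts):
        """Roll up component mass properties: total mass, composite CG, and
        inertia tensor about the composite CG via the parallel-axis theorem.
        Each part: (mass_kg, cg_xyz_m, 3x3 inertia about its own CG)."""
        m_tot = sum(m for m, _, _ in parts)
        cg = sum(m * np.asarray(c, float) for m, c, _ in parts) / m_tot
        inertia = np.zeros((3, 3))
        for m, c, i_cg in parts:
            d = np.asarray(c, float) - cg
            shift = m * (np.dot(d, d) * np.eye(3) - np.outer(d, d))
            inertia += np.asarray(i_cg, float) + shift
        return m_tot, cg, inertia

    # Hypothetical two-component assembly.
    parts = [
        (10.0, [0.0, 0.0, 0.0], np.eye(3) * 0.2),
        ( 5.0, [1.0, 0.0, 0.0], np.eye(3) * 0.1),
    ]
    m, cg, inertia = combine(parts)
    print(m, cg)        # 15.0 kg, CG at x = 1/3 m
    print(inertia)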
From Data to Knowledge through Concept-oriented Terminologies
Cimino, James J.
2000-01-01
Knowledge representation involves enumeration of conceptual symbols and arrangement of these symbols into some meaningful structure. Medical knowledge representation has traditionally focused more on the structure than the symbols. Several significant efforts are under way, at local, national, and international levels, to address the representation of the symbols through the creation of high-quality terminologies that are themselves knowledge based. This paper reviews these efforts, including the Medical Entities Dictionary (MED) in use at Columbia University and the New York Presbyterian Hospital. A decade's experience with the MED is summarized to serve as a proof-of-concept that knowledge-based terminologies can support the use of coded patient data for a variety of knowledge-based activities, including the improved understanding of patient data, the access of information sources relevant to specific patient care problems, the application of expert systems directly to the care of patients, and the discovery of new medical knowledge. The terminological knowledge in the MED has also been used successfully to support clinical application development and maintenance, including that of the MED itself. On the basis of this experience, current efforts to create standard knowledge-based terminologies appear to be justified. PMID:10833166
Quality Assurance of Cancer Study Common Data Elements Using A Post-Coordination Approach
Jiang, Guoqian; Solbrig, Harold R.; Prud’hommeaux, Eric; Tao, Cui; Weng, Chunhua; Chute, Christopher G.
2015-01-01
Domain-specific common data elements (CDEs) are emerging as an effective approach to standards-based clinical research data storage and retrieval. A limiting factor, however, is the lack of robust automated quality assurance (QA) tools for the CDEs in clinical study domains. The objectives of the present study are to prototype and evaluate a QA tool for the study of cancer CDEs using a post-coordination approach. The study starts by integrating the NCI caDSR CDEs and The Cancer Genome Atlas (TCGA) data dictionaries in a single Resource Description Framework (RDF) data store. We designed a compositional expression pattern based on the Data Element Concept model structure informed by ISO/IEC 11179, and developed a transformation tool that converts the pattern-based compositional expressions into the Web Ontology Language (OWL) syntax. Invoking reasoning and explanation services, we tested the system utilizing the CDEs extracted from two TCGA clinical cancer study domains. The system could automatically identify duplicate CDEs, and detect CDE modeling errors. In conclusion, compositional expressions not only enable reuse of existing ontology codes to define new domain concepts, but also provide an automated mechanism for QA of terminological annotations for CDEs. PMID:26958201
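The duplicate-detection idea can be sketched compactly: build a canonical compositional (Manchester-syntax-style) class expression for each data element concept from existing ontology codes, and string-identical expressions flag duplicate CDEs. The property and class names below are illustrative, not actual caDSR or NCI Thesaurus identifiers, and a real system would invoke an OWL reasoner rather than compare strings.

    # Post-coordination sketch: compose a class expression from an object
    # class plus property restrictions; sorting the restrictions gives a
    # canonical form so that equivalent compositions compare equal.
    def compose(object_class, properties):
        parts = [object_class] + [
            f"({prop} some {filler})"
            for prop, filler in sorted(properties.items())
        ]
        return " and ".join(parts)

    cde_a = compose("LymphNode", {"hasProperty": "Count",
                                  "hasStage": "Pathologic"})
    cde_b = compose("LymphNode", {"hasStage": "Pathologic",
                                  "hasProperty": "Count"})

    print(cde_a)
    print("duplicates?", cde_a == cde_b)   # canonical order exposes dupes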
Current multiple sclerosis treatments have improved our understanding of MS autoimmune pathogenesis.
Martin, Roland; Sospedra, Mireia; Rosito, Maria; Engelhardt, Britta
2016-09-01
Multiple sclerosis (MS) is the most common inflammatory disorder of the central nervous system (CNS) in young adults. When MS is not treated, it leads to irreversible and severe disability. The etiology of MS and its pathogenesis are not fully understood. The recent discovery that MS-associated genetic variants code for molecules related to the function of specific immune cell subsets is consistent with the concept of MS as a prototypic, T-cell-mediated autoimmune disease targeting the CNS. While the therapeutic efficacy of the currently available immunomodulatory therapies further strengthens this concept, differences observed in responses to MS treatment as well as additional clinical and imaging observations have also shown that the autoimmune pathogenesis underlying MS is much more complex than previously thought. There is therefore an unmet need for continued detailed phenotypic and functional analysis of disease-relevant adaptive immune cells and tissues directly derived from MS patients to unravel the immune etiology of MS in its entire complexity. In this review, we will discuss the currently available MS treatment options and approved drugs, including how they have contributed to the understanding of the immune pathology of this autoimmune disease. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
High-Power Hall Propulsion Development at NASA Glenn Research Center
NASA Technical Reports Server (NTRS)
Kamhawi, Hani; Manzella, David H.; Smith, Timothy D.; Schmidt, George R.
2014-01-01
The NASA Office of the Chief Technologist Game Changing Division is sponsoring the development and testing of enabling technologies to achieve efficient and reliable human space exploration. High-power solar electric propulsion has been proposed by NASA's Human Exploration Framework Team as an option to achieve these ambitious missions to near Earth objects. NASA Glenn Research Center (NASA Glenn) is leading the development of mission concepts for a solar electric propulsion Technical Demonstration Mission. The mission concepts are highlighted in this paper but are detailed in a companion paper. There are also multiple projects that are developing technologies to support a demonstration mission and are also extensible to NASA's goals of human space exploration. Specifically, the In-Space Propulsion technology development project at NASA Glenn has a number of tasks related to high-power Hall thrusters, including performance evaluation of existing Hall thrusters; performing detailed internal discharge chamber, near-field, and far-field plasma measurements; performing detailed physics-based modeling with the NASA Jet Propulsion Laboratory's Hall2De code; performing thermal and structural modeling; and developing high-power efficient discharge modules for power processing. This paper summarizes the various technology development tasks and the progress made to date.
Questioning the evidence for a claim in a socio-scientific issue: an aspect of scientific literacy
NASA Astrophysics Data System (ADS)
Roberts, Ros; Gott, Richard
2010-11-01
Understanding the science in a 'socio-scientific issue' is at the heart of the varied definitions of 'scientific literacy'. Many consider that understanding evidence is necessary to participate in decision making and to challenge the science that affects people's lives. A model is described that links practical work, argumentation and scientific literacy, which is used as the basis of this research. If students are explicitly taught about evidence, does this transfer to their asking questions in the context of a local socio-scientific issue? What do they ask questions about? Sixty-five primary teacher training students were given the pre-test before being taught the 'concepts of evidence' and applying them in an open-ended investigation, and were tested again 15 weeks later. Data were coded using Toulmin's argument pattern (TAP) and the 'concepts of evidence'. After the intervention it was found that, in relation to a socio-scientific issue, they raised significantly more questions specifically about the evidence that led to the scientists' claims, although questions explicitly targeting the quality of the data were still rare. This has implications for curricula that aim for scientific literacy.
Liang, Mingyu; Cowley, Allen W.; Mattson, David L.; Kotchen, Theodore A.; Liu, Yong
2013-01-01
Multiple genes and pathways are involved in the pathogenesis of hypertension. Epigenomic studies of hypertension are beginning to emerge and hold great promise of providing novel insights into the mechanisms underlying hypertension. Epigenetic marks or mediators including DNA methylation, histone modifications, and non-coding RNA can be studied at a genome or near-genome scale using epigenomic approaches. At the single gene level, several studies have identified changes in epigenetic modifications in genes expressed in the kidney that correlate with the development of hypertension. Systematic analysis and integration of epigenetic marks at the genome scale, demonstration of cellular and physiological roles of specific epigenetic modifications, and investigation of inheritance are among the major challenges and opportunities for future epigenomic and epigenetic studies of hypertension. Essential hypertension is a multifactorial disease involving multiple genetic and environmental factors and mediated by alterations in multiple biological pathways. Because the non-genetic mechanisms may involve epigenetic modifications, epigenomics is one of the latest concepts and approaches brought to bear on hypertension research. In this article, we summarize briefly the concepts and techniques for epigenomics, discuss the rationale for applying epigenomic approaches to study hypertension, and review the current state of this research area. PMID:24011581
How does creating a concept map affect item-specific encoding?
Grimaldi, Phillip J; Poston, Laurel; Karpicke, Jeffrey D
2015-07-01
Concept mapping has become a popular learning tool. However, the processes underlying the task are poorly understood. In the present study, we examined the effect of creating a concept map on the processing of item-specific information. In 2 experiments, subjects learned categorized or ad hoc word lists by making pleasantness ratings, sorting words into categories, or creating a concept map. Memory was tested using a free recall test and a recognition memory test, the latter of which is considered to be especially sensitive to item-specific processing. Typically, tasks that promote item-specific processing enhance free recall of categorized lists, relative to category sorting. Concept mapping resulted in lower recall performance than both the pleasantness rating and category sorting conditions for categorized words. Moreover, concept mapping resulted in lower recognition memory performance than the other 2 tasks. These results converge on the conclusion that creating a concept map disrupts the processing of item-specific information. (c) 2015 APA, all rights reserved.
Investigation of Near Shannon Limit Coding Schemes
NASA Technical Reports Server (NTRS)
Kwatra, S. C.; Kim, J.; Mo, Fan
1999-01-01
Turbo codes can deliver performance that is very close to the Shannon limit. This report investigates algorithms for convolutional turbo codes and block turbo codes, both of which can achieve performance near the Shannon limit. The performance of the schemes is obtained using computer simulations. There are three sections in this report. The first section is the introduction, which reviews fundamental concepts of coding, block coding, and convolutional coding. In the second section, the basic concepts of convolutional turbo codes are introduced, and the performance of turbo codes, especially high-rate turbo codes, is presented from the simulation results. After introducing all the parameters that help turbo codes achieve such good performance, it is concluded that the output weight distribution should be the main consideration in designing turbo codes. Based on the output weight distribution, performance bounds for turbo codes are given. Then, the relationships between the output weight distribution and factors such as the generator polynomial, the interleaver, and the puncturing pattern are examined. A criterion for the best selection of system components is provided. The puncturing pattern algorithm is discussed in detail, and different puncturing patterns are compared for each high rate. For most of the high-rate codes, the puncturing pattern does not show any significant effect on code performance if a pseudo-random interleaver is used in the system. For some special-rate codes with poor performance, an alternative puncturing algorithm is designed that restores their performance close to the Shannon limit. Finally, in section three, for iterative decoding of block codes, the method of building a trellis for block codes, the structure of the iterative decoding system, and the calculation of extrinsic values are discussed.
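The role of a puncturing pattern is easy to show in miniature. The Python sketch below applies a period-2 pattern to the three output streams of a rate-1/3 systematic turbo encoder to obtain rate 1/2; the parity bits are made up for illustration, and the pattern is a generic textbook example rather than one from the report.

    # A rate-1/3 turbo encoder emits (systematic, parity1, parity2) per input
    # bit. Keeping all systematic bits and alternating parities gives rate 1/2.
    PATTERN_R12 = [   # rows = streams, columns = time mod 2
        [1, 1],       # systematic: always kept
        [1, 0],       # parity 1: kept at even positions
        [0, 1],       # parity 2: kept at odd positions
    ]

    def puncture(streams, pattern):
        period = len(pattern[0])
        out = []
        for t in range(len(streams[0])):
            for s, row in enumerate(pattern):
                if row[t % period]:
                    out.append(streams[s][t])
        return out

    # Toy example: 4 input bits with invented parity streams.
    sys_, p1, p2 = [1, 0, 1, 1], [0, 1, 1, 0], [1, 1, 0, 0]
    tx = puncture([sys_, p1, p2], PATTERN_R12)
    print(tx, "rate =", len(sys_), "/", len(tx))   # 4 info bits in 8 sent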
Poly(A) code analyses reveal key determinants for tissue-specific mRNA alternative polyadenylation
Weng, Lingjie; Li, Yi; Xie, Xiaohui; Shi, Yongsheng
2016-01-01
mRNA alternative polyadenylation (APA) is a critical mechanism for post-transcriptional gene regulation and is often regulated in a tissue- and/or developmental stage-specific manner. An ultimate goal for the APA field has been to be able to computationally predict APA profiles under different physiological or pathological conditions. As a first step toward this goal, we have assembled a poly(A) code for predicting tissue-specific poly(A) sites (PASs). Based on a compendium of over 600 features that have known or potential roles in PAS selection, we have generated and refined a machine-learning algorithm using multiple high-throughput sequencing-based data sets of tissue-specific and constitutive PASs. This code can predict tissue-specific PASs with >85% accuracy. Importantly, by analyzing the prediction performance based on different RNA features, we found that PAS context, including the distance between alternative PASs and the relative position of a PAS within the gene, is a key feature for determining the susceptibility of a PAS to tissue-specific regulation. Our poly(A) code provides a useful tool for not only predicting tissue-specific APA regulation, but also for studying its underlying molecular mechanisms. PMID:27095026
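A minimal stand-in for such a poly(A) code is sketched below: a classifier trained on per-site feature vectors, with feature importances used to ask which context features drive predictions. The data are synthetic random numbers, and the model choice (a random forest via scikit-learn) is an assumption for illustration, not the published algorithm.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    # Synthetic stand-in for the feature compendium: each candidate PAS is a
    # vector of ~600 sequence/context features (cis-element counts, relative
    # gene position, distance to the nearest alternative PAS, ...). Labels
    # mark tissue-specific (1) vs constitutive (0) sites. Random values here
    # only illustrate the training loop, not the published model.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(1000, 600))
    y = (X[:, 0] + 0.5 * X[:, 1]
         + rng.normal(scale=0.5, size=1000) > 0).astype(int)

    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())

    # Ranking feature importances is one way to probe which context features
    # (e.g. inter-PAS distance, position within the gene) matter most.
    clf.fit(X, y)
    print("top features:", np.argsort(clf.feature_importances_)[::-1][:5])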
MODFLOW-2005 : the U.S. Geological Survey modular ground-water model--the ground-water flow process
Harbaugh, Arlen W.
2005-01-01
This report presents MODFLOW-2005, which is a new version of the finite-difference ground-water model commonly called MODFLOW. Ground-water flow is simulated using a block-centered finite-difference approach. Layers can be simulated as confined or unconfined. Flow associated with external stresses, such as wells, areal recharge, evapotranspiration, drains, and rivers, also can be simulated. The report includes detailed explanations of physical and mathematical concepts on which the model is based, an explanation of how those concepts are incorporated in the modular structure of the computer program, instructions for using the model, and details of the computer code. The modular structure consists of a MAIN Program and a series of highly independent subroutines. The subroutines are grouped into 'packages.' Each package deals with a specific feature of the hydrologic system that is to be simulated, such as flow from rivers or flow into drains, or with a specific method of solving the set of simultaneous equations resulting from the finite-difference method. Several solution methods are incorporated, including the Preconditioned Conjugate-Gradient method. The division of the program into packages permits the user to examine specific hydrologic features of the model independently. This also facilitates development of additional capabilities because new packages can be added to the program without modifying the existing packages. The input and output systems of the computer program also are designed to permit maximum flexibility. The program is designed to allow other capabilities, such as transport and optimization, to be incorporated, but this report is limited to describing the ground-water flow capability. The program is written in Fortran 90 and will run without modification on most computers that have a Fortran 90 compiler.
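The block-centered finite-difference idea reduces, in one dimension, to a few lines. The sketch below assembles and solves the steady confined-flow equations for a row of cells with fixed-head ends and a pumping well; it mirrors the per-cell structure MODFLOW builds, not its actual Fortran code, and all parameter values are made up.

    import numpy as np

    # Steady 1-D confined flow: T*(h[i-1] - 2h[i] + h[i+1])/dx^2 + q[i] = 0,
    # with fixed heads at both ends (illustrative units and values).
    n, dx, T = 20, 100.0, 500.0       # cells, cell size (m), transmissivity
    q = np.zeros(n); q[10] = -0.2     # extraction flux at the well cell
    h_left, h_right = 50.0, 45.0      # boundary heads (m)

    A = np.zeros((n, n))
    b = -q * dx ** 2 / T              # move sources to the right-hand side
    for i in range(n):
        A[i, i] = -2.0
        if i > 0:
            A[i, i - 1] = 1.0
        if i < n - 1:
            A[i, i + 1] = 1.0
    b[0] -= h_left                    # fold fixed boundary heads into b
    b[-1] -= h_right
    h = np.linalg.solve(A, b)
    print(np.round(h, 2))             # drawdown centred on the well cell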
Operational rate-distortion performance for joint source and channel coding of images.
Ruf, M J; Modestino, J W
1999-01-01
This paper describes a methodology for evaluating the operational rate-distortion behavior of combined source and channel coding schemes with particular application to images. In particular, we demonstrate use of the operational rate-distortion function to obtain the optimum tradeoff between source coding accuracy and channel error protection under the constraint of a fixed transmission bandwidth for the investigated transmission schemes. Furthermore, we develop information-theoretic bounds on performance for specific source and channel coding systems and demonstrate that our combined source-channel coding methodology applied to different schemes results in operational rate-distortion performance that closely approaches these theoretical limits. We concentrate specifically on a wavelet-based subband source coding scheme and the use of binary rate-compatible punctured convolutional (RCPC) codes for transmission over the additive white Gaussian noise (AWGN) channel. Explicit results for real-world images demonstrate the efficacy of this approach.
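The flavor of the operational rate-distortion search can be conveyed with a toy enumeration: under a fixed channel-bit budget, each candidate channel code rate trades source coding accuracy against residual channel errors, and the scheme with the lowest expected distortion wins. The coding gains, distortion model, and block length below are invented placeholders, not the paper's RCPC figures.

    import math

    def q_func(x):
        return 0.5 * math.erfc(x / math.sqrt(2))

    R_TOTAL = 2.0        # channel bits per source sample (fixed bandwidth)
    ESN0_DB = 4.0        # channel Es/N0 in dB
    SIGMA2, D_ERR = 1.0, 1.0   # source variance; distortion if a block fails
    # Assumed coding gains standing in for RCPC rates (illustrative only).
    CODING_GAIN_DB = {8/9: 2.0, 2/3: 4.0, 1/2: 5.5, 1/3: 6.5}

    best = None
    for r, gain_db in CODING_GAIN_DB.items():
        rs = r * R_TOTAL                       # source bits per sample
        snr = 10 ** ((ESN0_DB + gain_db) / 10)
        ber = q_func(math.sqrt(2 * snr))
        p_block = 1 - (1 - ber) ** 1000        # ~1000 bits per coded block
        # Toy model: quantizer distortion when intact, D_ERR when broken.
        d = SIGMA2 * 2 ** (-2 * rs) * (1 - p_block) + D_ERR * p_block
        best = min(best or (d, r), (d, r))
        print(f"rate {r:.2f}: Rs={rs:.2f} b/sample, BER={ber:.1e}, D={d:.4f}")
    print("best channel code rate:", best[1])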
Evaluation in industry of a draft code of practice for manual handling.
Ashby, Liz; Tappin, David; Bentley, Tim
2004-05-01
This paper reports findings from a study which evaluated the draft New Zealand Code of Practice for Manual Handling. The evaluation assessed the ease of use, applicability and validity of the Code and in particular the associated manual handling hazard assessment tools, within New Zealand industry. The Code was studied in a sample of eight companies from four sectors of industry. Subjective feedback and objective findings indicated that the Code was useful, applicable and informative. The manual handling hazard assessment tools incorporated in the Code could be adequately applied by most users, with risk assessment outcomes largely consistent with the findings of researchers using more specific ergonomics methodologies. However, some changes were recommended to the risk assessment tools to improve usability and validity. The evaluation concluded that both the Code and the tools within it would benefit from simplification, improved typography and layout, and industry-specific information on manual handling hazards.
Regulation of mammalian cell differentiation by long non-coding RNAs
Hu, Wenqian; Alvarez-Dominguez, Juan R; Lodish, Harvey F
2012-01-01
Differentiation of specialized cell types from stem and progenitor cells is tightly regulated at several levels, both during development and during somatic tissue homeostasis. Many long non-coding RNAs have been recognized as an additional layer of regulation in the specification of cellular identities; these non-coding species can modulate gene-expression programmes in various biological contexts through diverse mechanisms at the transcriptional, translational or messenger RNA stability levels. Here, we summarize findings that implicate long non-coding RNAs in the control of mammalian cell differentiation. We focus on several representative differentiation systems and discuss how specific long non-coding RNAs contribute to the regulation of mammalian development. PMID:23070366
Optimized bit extraction using distortion modeling in the scalable extension of H.264/AVC.
Maani, Ehsan; Katsaggelos, Aggelos K
2009-09-01
The newly adopted scalable extension of H.264/AVC video coding standard (SVC) demonstrates significant improvements in coding efficiency in addition to an increased degree of supported scalability relative to the scalable profiles of prior video coding standards. Due to the complicated hierarchical prediction structure of the SVC and the concept of key pictures, content-aware rate adaptation of SVC bit streams to intermediate bit rates is a nontrivial task. The concept of quality layers has been introduced in the design of the SVC to allow for fast content-aware prioritized rate adaptation. However, existing quality layer assignment methods are suboptimal and do not consider all network abstraction layer (NAL) units from different layers for the optimization. In this paper, we first propose a technique to accurately and efficiently estimate the quality degradation resulting from discarding an arbitrary number of NAL units from multiple layers of a bitstream by properly taking drift into account. Then, we utilize this distortion estimation technique to assign quality layers to NAL units for a more efficient extraction. Experimental results show that a significant gain can be achieved by the proposed scheme.
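The prioritized extraction idea can be sketched as follows: give each NAL unit an estimated distortion reduction (which a drift-aware model would supply), rank units by benefit per bit to assign quality layers, and truncate at the target size. The numbers are made up, and real extraction must also respect decoding dependencies, which this toy ranking ignores.

    # Toy prioritized rate adaptation over NAL units.
    nals = [  # (id, size_bytes, estimated MSE reduction) -- invented values
        ("base0", 800, 50.0), ("enh0a", 400, 9.0), ("enh0b", 400, 4.0),
        ("base1", 700, 45.0), ("enh1a", 500, 12.0), ("enh1b", 300, 2.0),
    ]

    # Rank by distortion reduction per byte; the rank is the quality layer.
    ranked = sorted(nals, key=lambda n: n[2] / n[1], reverse=True)
    layers = {nal[0]: layer for layer, nal in enumerate(ranked)}

    def extract(budget_bytes):
        kept, used = [], 0
        for nal in ranked:              # take units in priority order
            if used + nal[1] <= budget_bytes:
                kept.append(nal[0])
                used += nal[1]
        return kept, used

    print(layers)
    print(extract(2000))   # which units survive a 2000-byte budget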
Microgravity Materials Research and Code U ISRU
NASA Technical Reports Server (NTRS)
Curreri, Peter A.; Sibille, Laurent
2004-01-01
The NASA microgravity research program, simply put, has the goal of doing science (which is essentially finding out something previously unknown about nature) utilizing the unique long-term microgravity environment in Earth orbit. Since 1997, Code U has additionally funded scientific basic research that enables safe and economical capabilities for humans to live, work, and do science beyond Earth orbit. This research has been integrated with the larger NASA missions (Code M and S). These new exploration research focus areas include Radiation Shielding Materials, Macromolecular Research on Bone and Muscle Loss, In-Space Fabrication and Repair, and Low-Gravity ISRU. The latter two focus on enabling materials processing in space for use in space. The goal of this program is to provide scientific and technical research resulting in proof-of-concept experiments feeding into the larger NASA program, to provide humans in space with an energy-rich, resource-rich, self-sustaining infrastructure at the earliest possible time and with minimum risk, launch mass, and program cost. President Bush's Exploration Vision (1/14/04) gives a new urgency to the development of ISRU concepts in the exploration architecture. This will require an accelerated One NASA approach utilizing NASA's partners in academia and industry.